I am confused. Here, do you mean RNN as recurrent neural networks rather than recursive neural networks? If so, this paper is different from the similarly titled one published at the 2015 AAAI Spring Symposium.



Compositional Vector Space Models for Knowledge Base Completion

Knowledge base (KB) completion adds new facts to a KB by making inferences from existing facts, for example by inferring with high likelihood nationality(X,Y) from bornIn(X,Y). Most previous methods infer simple one-hop relational synonyms like this, or use as evidence a multi-hop relational path treated as an atomic feature, like bornIn(X,Z) -> containedIn(Z,Y). This paper presents an approach that reasons about conjunctions of multi-hop relations non-atomically, composing the implications of a path using a recursive neural network (RNN) that takes as input vector embeddings of the binary relations in the path. Not only does this allow us to generalize to paths unseen at training time, but also, with a single high-capacity RNN, to predict new relation types not seen when the compositional model was trained (zero-shot learning). We assemble a new dataset of over 52M relational triples, and show that our method improves over a traditional classifier by 11%, and over a method leveraging pre-trained embeddings by 7%.
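To make the composition step concrete, here is a minimal sketch of the core idea: relation embeddings along a path are folded together by an RNN, and the resulting path vector is scored against a target relation's embedding. All names, dimensions, and parameters below are illustrative assumptions (the paper's embeddings and weights are learned from data, not random):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

# Hypothetical relation vocabulary; in the paper these embeddings are learned.
relation_embeddings = {
    "bornIn": rng.standard_normal(d),
    "containedIn": rng.standard_normal(d),
    "nationality": rng.standard_normal(d),
}

# RNN composition parameters (random here; learned in practice).
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

def compose_path(relations):
    """Fold relation embeddings along a path into a single vector:
    h_t = tanh(W @ [h_{t-1}; r_t] + b), starting from the first relation."""
    h = relation_embeddings[relations[0]]
    for rel in relations[1:]:
        h = np.tanh(W @ np.concatenate([h, relation_embeddings[rel]]) + b)
    return h

def score(path, target_relation):
    """Score how well a composed path vector predicts a target relation,
    via a dot product with the target relation's embedding."""
    return float(compose_path(path) @ relation_embeddings[target_relation])

# e.g. score the path bornIn -> containedIn against nationality
s = score(["bornIn", "containedIn"], "nationality")
```

Because the composition function is shared across paths, the same parameters can score paths never seen during training; this is what the abstract means by generalizing to unseen paths.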

