Relational recurrent neural networks (arXiv)


Not to be confused with Recurrent relational networks (thank God for descriptive paper titles). Santoro et al. propose to model the interactions between different memory cells in a memory-based neural network (in contrast to previous work, which only models the interaction between the current state and each memory). They do this using multi-head dot-product attention (as introduced in the Transformer paper) and evaluate on several tasks, including language modelling, where they show improvements.
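
To make the mechanism concrete, here is a minimal sketch (in NumPy, not the authors' code) of multi-head dot-product attention applied across a set of memory slots, which is the core operation that lets each memory interact with every other memory. The slot count, sizes, and randomly initialised projection matrices are illustrative assumptions; the paper's relational memory core additionally folds the input into the attention and wraps the update in a gated recurrent cell.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(memory, num_heads, rng):
    """Let every memory slot attend over all other slots.

    memory: array of shape (num_slots, slot_size)
    Returns an updated memory of the same shape.
    """
    num_slots, slot_size = memory.shape
    assert slot_size % num_heads == 0
    head_size = slot_size // num_heads

    # Random projection weights stand in for learned parameters.
    w_q = rng.standard_normal((slot_size, slot_size)) / np.sqrt(slot_size)
    w_k = rng.standard_normal((slot_size, slot_size)) / np.sqrt(slot_size)
    w_v = rng.standard_normal((slot_size, slot_size)) / np.sqrt(slot_size)

    q, k, v = memory @ w_q, memory @ w_k, memory @ w_v

    # Split into heads: (num_heads, num_slots, head_size)
    def split(x):
        return x.reshape(num_slots, num_heads, head_size).transpose(1, 0, 2)

    q, k, v = split(q), split(k), split(v)

    # Scaled dot-product attention: each slot attends over all slots.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(head_size)  # (heads, slots, slots)
    weights = softmax(scores, axis=-1)
    attended = weights @ v  # (heads, slots, head_size)

    # Merge heads back into (num_slots, slot_size).
    return attended.transpose(1, 0, 2).reshape(num_slots, slot_size)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memory = rng.standard_normal((4, 8))   # 4 memory slots of size 8
    updated = multi_head_attention(memory, num_heads=2, rng=rng)
    print(updated.shape)  # (4, 8)
```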

