A Simple Neural Network Module for Relational Reasoning


This is cool. The authors of this (very accessible) paper propose a new neural network architecture that natively understands the way that various entities relate to one another:

...the capacity to compute relations is baked into the RN architecture without needing to be learned, just as the capacity to reason about spatial, translation invariant properties is built-in to CNNs, and the capacity to reason about sequential dependencies is built into recurrent neural networks.

Relational reasoning is a core reasoning skill, and one at which neural networks have not performed well to date. The authors go on to show that their network design performs at a superhuman level on certain challenging relational reasoning tasks like "What size is the cylinder that is left of the brown metal thing that is left of the big sphere?" The chart below contains their results. Impressive. There is also a blog post if you prefer.
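The paper's core idea is compact: a Relation Network computes RN(O) = f_φ(Σ_{i,j} g_θ(o_i, o_j)), applying a shared function g to every pair of objects and summing the results before a final function f. Here is a minimal numpy sketch of that structure; the random-weight layers below are stand-ins for the learned MLPs g_θ and f_φ, and all names are illustrative rather than taken from the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_layer(in_dim, out_dim):
    # Stand-in for a learned MLP: one random linear layer + ReLU.
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    return lambda x: np.maximum(x @ W, 0.0)

def relation_network(objects, g, f):
    # RN(O) = f( sum over all ordered pairs (o_i, o_j) of g([o_i; o_j]) )
    # g is applied to every pair, so pairwise relations need not be learned
    # from scratch -- they are baked into the architecture.
    n = len(objects)
    pair_sum = sum(
        g(np.concatenate([objects[i], objects[j]]))
        for i in range(n) for j in range(n)
    )
    return f(pair_sum)

obj_dim, hidden_dim, out_dim = 4, 8, 2
g = random_layer(2 * obj_dim, hidden_dim)
f = random_layer(hidden_dim, out_dim)
objects = rng.standard_normal((5, obj_dim))  # 5 toy "objects"
print(relation_network(objects, g, f).shape)  # (2,)
```

Because the sum over pairs is order-invariant and g is shared across all pairs, the same small function sees every object pairing, which is what makes the relational capacity architectural rather than learned.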
