DeepMind: A new model and dataset for long-range memory

deepmind.com

Throughout our lives, we build up memories that are retained over a diverse array of timescales, from minutes to months to years to decades. When reading a book, we can recall characters who were introduced many chapters ago, or in an earlier book in a series, and reason about their motivations and likely actions in the current context. We can even put the book down during a busy week, and pick up from where we left off without forgetting the plotline. 
We do not achieve such feats by storing every detail of sensory input we receive about the world throughout our lifetimes. Our brains select, filter, and integrate input stimuli based on factors such as relevance, surprise, perceived danger, and repetition. In other words, we compress lifelong experience to a set of salient memories which help us understand the past and better anticipate the future. A major goal for AI researchers is to discover ways of implementing such abilities in computational systems, and to design benchmarks that require complex reasoning over long time-spans.

Memory is an extremely deep area of research, perhaps one of the deepest in all of AI. DeepMind's announcement here of their Compressive Transformer architecture is an interesting step along that path. Good overview of the topic and a digestible review of their research.
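The core idea behind the Compressive Transformer is easy to sketch: instead of discarding the oldest hidden activations once its short-term memory is full (as a Transformer-XL does), the model compresses them at a fixed rate into a secondary, coarser memory that attention can still reach back into. Below is a minimal, illustrative sketch of that memory update using mean pooling as the compression function; the function names, shapes, and the choice of mean pooling are assumptions for illustration, not DeepMind's actual implementation (the paper also considers max pooling, convolutional, and learned compression functions).

```python
import numpy as np

def compress(old_activations: np.ndarray, rate: int = 3) -> np.ndarray:
    """Map a block of old memory vectors [n, d] to [n // rate, d] by
    mean-pooling groups of `rate` adjacent slots (one possible compression
    function; alternatives include max pooling and 1D convolutions)."""
    n, d = old_activations.shape
    usable = (n // rate) * rate            # drop any remainder that doesn't fill a group
    return old_activations[:usable].reshape(-1, rate, d).mean(axis=1)

def update_memories(memory, compressed_memory, new_activations, rate=3):
    """FIFO memory update for one layer: the oldest activations evicted from
    the short-term memory are compressed and appended to the compressed
    memory instead of being thrown away."""
    n_new = new_activations.shape[0]
    evicted = memory[:n_new]                               # oldest slots fall out first
    memory = np.concatenate([memory[n_new:], new_activations])
    compressed_memory = np.concatenate([compressed_memory, compress(evicted, rate)])
    # In the full model the compressed memory is also a fixed-size FIFO,
    # so its own oldest entries would eventually be dropped as well.
    return memory, compressed_memory

# Example: a 12-slot memory, an empty compressed memory, 64-d activations.
d_model = 64
memory = np.zeros((12, d_model))
compressed_memory = np.zeros((0, d_model))
new_activations = np.random.randn(6, d_model)
memory, compressed_memory = update_memories(memory, compressed_memory, new_activations)
print(memory.shape, compressed_memory.shape)   # (12, 64) (2, 64)
```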

Read more...
