The Fall of RNN and LSTM Models

towardsdatascience.com

Attention-based networks are used more and more by Google, Facebook, and Salesforce, to name a few. All of these companies have replaced RNNs and their variants with attention-based models, and this is just the beginning. RNNs' days are numbered in every application, because they require more resources to train and run than attention-based models.

This post on Towards Data Science made the rounds this week. I hadn't heard of attention-based networks before, but this post on WildML does a good job of explaining them:

Attention Mechanisms in Neural Networks are (very) loosely based on the visual attention mechanism found in humans. Human visual attention is well-studied and while there exist different models, all of them essentially come down to being able to focus on a certain region of an image with “high resolution” while perceiving the surrounding image in “low resolution”, and then adjusting the focal point over time.
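To make that "focus on a certain region" idea concrete, here is a minimal sketch of the scaled dot-product formulation that attention-based networks typically use. This is my own illustration, not code from either post; all the names (`attention`, `query`, `keys`, `values`) are just illustrative. A query is compared against every key, the softmaxed scores act as the "high resolution" focus weights, and the output is a weighted blend of the values.

```python
# Illustrative sketch of scaled dot-product attention (not from the linked posts).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    """Blend the values, weighted by how well each key matches the query.

    query:  (d,)    what the model is currently looking for
    keys:   (n, d)  one key per input position
    values: (n, m)  one value per input position
    """
    scores = keys @ query / np.sqrt(query.shape[-1])  # similarity of the query to each key
    weights = softmax(scores)                         # focus: most weight on the best matches
    return weights @ values, weights

# Toy example: the query lines up with the second key, so the output
# is dominated by the second value.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 1.0])

context, weights = attention(query, keys, values)
print(weights)   # most of the mass lands on position 1
print(context)   # close to 20.0
```

The "adjusting the focal point over time" part of the analogy corresponds to computing a fresh set of weights for each query, so the focus moves as the model processes the sequence.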

This is an interesting topic. I'll keep a lookout for future posts; send anything my way that you come across.

Read more...
