Nuts and Bolts of Building Deep Learning Applications

Andrew Ng just delivered a master class on applied deep learning at NIPS 2016.

Ng highlighted that while NIPS is a research conference, many of the newly generated ideas are just that: ideas, not yet battle-tested vehicles for converting mathematical acumen into dollars. The bread and butter of money-making deep learning is supervised learning, with recurrent neural networks such as LSTMs in second place. Research areas such as Generative Adversarial Networks (GANs), Deep Reinforcement Learning (Deep RL), and just about anything branding itself as unsupervised learning are still Research, with a capital R.

The talk then gives concrete recommendations on how to tune supervised deep learning models, with a focus on fundamentals.
