Population Based Training of Neural Networks

deepmind.com

DeepMind presented Population Based Training (PBT), an interesting technique for hyperparameter optimization. The algorithm trains a population of models with randomly sampled hyperparameter configurations in parallel; periodically, poorly performing members copy the weights of better-performing ones (exploit) and perturb the copied hyperparameters (explore), so compute is continually shifted toward more promising configurations. Although this requires a substantial number of machines, DeepMind found configurations that exceeded previous results. For a more in-depth look, you may want to check out the paper as well.
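
To make the exploit-and-explore loop concrete, here is a minimal Python sketch of PBT on a toy objective. The Member class, the quadratic loss, the 20% cutoff, and the perturbation factors (0.8/1.2) are illustrative assumptions for this sketch, not the paper's implementation.

```python
import random

POPULATION_SIZE = 8
STEPS_PER_INTERVAL = 20
INTERVALS = 10

class Member:
    """One population member: a toy scalar model plus one hyperparameter."""
    def __init__(self):
        self.lr = 10 ** random.uniform(-4, -1)  # randomly sampled hyperparameter
        self.theta = random.uniform(-5.0, 5.0)  # model "weights"

    def train(self, steps):
        # Gradient descent on a toy quadratic loss: L(theta) = theta^2
        for _ in range(steps):
            grad = 2 * self.theta
            self.theta -= self.lr * grad

    def score(self):
        # Higher is better, so return the negative loss
        return -(self.theta ** 2)

def exploit_and_explore(member, best):
    # Exploit: copy weights and hyperparameters from a better member
    member.theta = best.theta
    member.lr = best.lr
    # Explore: perturb the copied hyperparameter by a random factor
    member.lr *= random.choice([0.8, 1.2])

population = [Member() for _ in range(POPULATION_SIZE)]
for interval in range(INTERVALS):
    for m in population:
        m.train(STEPS_PER_INTERVAL)
    ranked = sorted(population, key=Member.score, reverse=True)
    # Bottom 20% copy from the top 20%, then perturb
    cutoff = max(1, POPULATION_SIZE // 5)
    for loser in ranked[-cutoff:]:
        winner = random.choice(ranked[:cutoff])
        exploit_and_explore(loser, winner)

best = max(population, key=Member.score)
print(f"best lr={best.lr:.4g}, loss={best.theta ** 2:.4g}")
```

In the real setting, `train` would run a training step budget on a neural network and `score` would evaluate on a validation set; the population-level loop stays the same.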

Read more...