Weight Agnostic Neural Networks

weightagnostic.github.io

Not all neural network architectures are created equal; some perform much better than others for certain tasks. But how important are the weight parameters of a neural network compared to its architecture? In this work, we question to what extent neural network architectures alone, without learning any weight parameters, can encode solutions for a given task.

What a silly thing to want to do! Why would anyone want to build neural networks whose connection weights are all identical (and selected at random!)?

It turns out that eliminating weight training, a major source of deep learning complexity, allows for more effective and efficient neural architecture search: each candidate architecture is evaluated with a single shared weight value applied to every connection, sampled over a small range, rather than with trained weights. The authors come to some useful conclusions at the end.
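To make the trick concrete, here is a minimal sketch of the core evaluation idea in Python. The masks, network shape, and sweep values are illustrative assumptions, not the authors' code; the point is that only the connectivity pattern is fixed, while every active connection reuses one shared weight.

```python
import numpy as np

def forward(x, mask, shared_w):
    """Two-layer net where every active connection uses the
    same shared weight value; only the masks (the architecture)
    distinguish one candidate network from another."""
    h = np.tanh((mask["in"] * shared_w) @ x)
    return np.tanh((mask["out"] * shared_w) @ h)

rng = np.random.default_rng(0)
# A fixed "architecture": binary masks marking which connections exist.
# (Sizes here are arbitrary, chosen for the example.)
mask = {"in": rng.integers(0, 2, size=(8, 4)),
        "out": rng.integers(0, 2, size=(2, 8))}

x = rng.normal(size=4)
# Score the architecture across several shared weight values instead
# of training the weights; a weight-agnostic architecture should do
# reasonably well for all of them.
for w in (-2.0, -1.0, -0.5, 0.5, 1.0, 2.0):
    print(w, forward(x, mask, w))
```

An architecture's fitness is then an aggregate of its performance over these shared-weight samples, which is what lets the search reward structure rather than tuned parameters.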

Long, unusual, and interesting.
