The Future of Machine Learning is Tiny

I’m convinced that machine learning can run on tiny, low-power chips, and that this combination will solve a massive number of problems we have no solutions for right now.

This is a fascinating read. To give you a quick sense of how small we're talking: 

...the MobileNetV2 image classification network takes 22 million ops (each multiply-add is two ops) in its smallest configuration. If I know that a particular system takes 5 picojoules to execute a single op, then it will take (5 picojoules * 22,000,000) = 110 microjoules of energy to execute. If we’re analyzing one frame per second, then that’s only 110 microwatts, which a coin battery could sustain continuously for nearly a year.
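The quoted arithmetic is easy to check yourself. Here's a minimal sketch in Python; the 5 pJ/op figure comes from the quote, while the coin-cell capacity (a CR2032 at roughly 225 mAh and 3 V) is my own assumption for illustration:

```python
# Back-of-the-envelope check of the quoted numbers.
OPS_PER_INFERENCE = 22_000_000   # MobileNetV2, smallest configuration
ENERGY_PER_OP_J = 5e-12          # 5 picojoules per op (from the quote)
FRAMES_PER_SECOND = 1

energy_per_frame_j = OPS_PER_INFERENCE * ENERGY_PER_OP_J
avg_power_w = energy_per_frame_j * FRAMES_PER_SECOND

# Assumed coin cell: 225 mAh at 3 V is about 2430 joules of stored energy.
battery_j = 0.225 * 3 * 3600
runtime_days = battery_j / avg_power_w / 86400

print(f"{energy_per_frame_j * 1e6:.0f} microjoules per frame")  # 110
print(f"{avg_power_w * 1e6:.0f} microwatts average")            # 110
print(f"{runtime_days:.0f} days on one coin cell")              # ~256
```

Under those assumptions the battery lasts around 256 days, which matches the "nearly a year" claim.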

I learned a lot about neural network power consumption from this post—it's the only place I've seen the topic discussed in depth. Most of us spend our time running jobs on huge clusters in far-away data centers, so power consumption is not a common consideration.

The impacts on future product design could be significant. 

