Deep Learning Scaling is Predictable, Empirically

Deep learning performance can be improved in three main ways:

  1. We can search for improved model architectures.
  2. We can scale computation.
  3. We can create larger training data sets.

This (very accessible!) summary of a recent paper looks at the empirical results of how deep learning has scaled across different problem domains. It shows that scaling behaviour is consistent from domain to domain, leading to the conclusion that we can make predictions about how future scaling will play out.
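The paper's central empirical finding is that generalization error tends to follow a power law in training set size. As a rough sketch (the data values below are hypothetical, not from the paper), fitting that power law in log-log space lets you extrapolate error to data set sizes you haven't yet collected:

```python
import numpy as np

# Hypothetical learning-curve measurements: (training set size, validation error).
sizes = np.array([1e4, 1e5, 1e6, 1e7])
errors = np.array([0.42, 0.27, 0.17, 0.11])

# A power law eps(m) = alpha * m**beta is linear in log-log space:
#   log eps = log alpha + beta * log m
beta, log_alpha = np.polyfit(np.log(sizes), np.log(errors), 1)
alpha = np.exp(log_alpha)

# Extrapolate one order of magnitude beyond the measured sizes.
predicted = alpha * (1e8 ** beta)
print(f"fitted exponent beta = {beta:.3f}")
print(f"predicted error at m=1e8: {predicted:.3f}")
```

The fitted exponent is the "steepness" of the learning curve; the paper's observation is that this exponent stays stable within a domain, which is what makes the extrapolation step meaningful.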
