Deep learning performance scales in three specific ways:
- We can search for improved model architectures.
- We can scale computation.
- We can create larger training data sets.
This (very accessible!) summary of a recent paper looks at the empirical results of how deep learning has scaled across different problem domains. It finds consistent scaling properties across those domains, leading to the conclusion that we can predict how future scaling will play out.
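The kind of scaling relationship described here is typically a power law: generalization error falls as a power of training set size, so measurements plotted on log-log axes fall on a straight line, and the fitted line can be extrapolated to predict future scaling. As a rough sketch (the exponent and data below are synthetic illustrations, not figures from the paper), such a curve can be fit with an ordinary linear regression in log-log space:

```python
import numpy as np

# Synthetic learning-curve measurements: generalization error vs.
# training set size m, following an assumed power law eps = a * m**b.
sizes = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
errors = 2.0 * sizes ** -0.35  # illustrative exponent, not from the paper

# A power law is a straight line in log-log space:
#   log(eps) = log(a) + b * log(m)
b, log_a = np.polyfit(np.log(sizes), np.log(errors), 1)
a = np.exp(log_a)
print(f"fitted exponent b = {b:.3f}, prefactor a = {a:.3f}")

# Extrapolate the fitted line to predict error at a 10x larger data set.
m_future = 1e8
predicted = a * m_future ** b
print(f"predicted error at m = 1e8: {predicted:.5f}")
```

The same fit-then-extrapolate pattern works for compute or model size on the x-axis; what the paper shows is that, empirically, the exponent stays stable enough within a domain for this extrapolation to be meaningful.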