Implicit Generation and Generalization Methods for Energy-Based Models

openai.com

We’ve made progress towards stable and scalable training of energy-based models (EBMs), resulting in better sample quality and generalization ability than existing models. Generation in EBMs spends extra compute to continually refine its answers; doing so can produce samples competitive with GANs at low temperatures, while retaining the mode-coverage guarantees of likelihood-based models. We hope these findings stimulate further research into this promising class of models.
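For intuition on what "spending more compute to refine answers" means here: sampling from an EBM is typically done with Langevin dynamics, starting from noise and repeatedly taking a noisy gradient step downhill on the learned energy. Below is a minimal PyTorch-style sketch; the function name, hyperparameters, and the pixel-range clamp are illustrative assumptions, not the exact procedure from the paper.

```python
import torch

def langevin_sample(energy_fn, x, steps=60, step_size=10.0, noise_scale=0.005):
    """Refine samples by repeatedly stepping downhill on the learned energy
    with added Gaussian noise (Langevin dynamics). More steps spend more
    compute and yield progressively refined samples."""
    for _ in range(steps):
        x = x.detach().requires_grad_(True)
        energy = energy_fn(x).sum()          # scalar total energy of the batch
        (grad,) = torch.autograd.grad(energy, x)
        # Gradient step toward lower energy, plus noise so this remains a
        # sampler rather than pure energy minimization
        x = x - step_size * grad + noise_scale * torch.randn_like(x)
        x = x.clamp(0.0, 1.0)                # keep pixel values in range
    return x.detach()

# Hypothetical usage: start from uniform noise, refine under a trained model.
# x0 = torch.rand(16, 3, 32, 32)
# samples = langevin_sample(energy_model, x0)
```

Running more Langevin steps trades compute for sample quality, which is the "continual refinement" the blurb describes.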

Energy-based models were brand new to me before reading this post. This article kicked off a long rabbit hole of reading, so beware! Of course, Yann LeCun wrote the original tutorial paper back in 2006.

Read more...