A Flexible Approach to Automated RNN Architecture Generation

Neural architecture search (NAS) has emerged as a useful technique for discovering novel deep neural network architectures that achieve excellent results. Salesforce researchers adapt NAS to RNN generation by introducing a domain-specific language (DSL) that can produce novel RNNs of arbitrary depth and width. The DSL can define standard cells such as LSTMs and GRUs, but also admits non-standard components such as layer normalization and trigonometric curves. Candidate architectures are generated in two ways: by random search guided by a ranking function (learned with a recursive neural network), and by reinforcement learning. The resulting architectures do not follow human intuitions, yet perform well on their target tasks.
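To make the DSL idea concrete, here is a minimal sketch of how an RNN cell could be expressed as an operator tree and evaluated. The operator set, node names, and weight-handling below are illustrative assumptions for this sketch, not the paper's exact grammar.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 4  # hidden size (illustrative)

# Each DSL expression is a nested tuple: (op, *children).
# Leaves are the strings "x" (current input) and "h" (previous hidden state).
# This candidate cell gates a transformed hidden state, loosely GRU-flavored.
candidate = ("tanh",
             ("add",
              ("mm", "x"),
              ("mult",
               ("sigmoid", ("mm", "h")),
               ("mm", "h"))))

def evaluate(expr, env, params):
    """Recursively evaluate a DSL expression tree on numpy vectors."""
    if isinstance(expr, str):          # leaf: look up input or hidden state
        return env[expr]
    op, *children = expr
    args = [evaluate(c, env, params) for c in children]
    if op == "mm":                     # learned linear map, one matrix per call site
        key = id(expr)
        if key not in params:
            params[key] = rng.standard_normal((H, H)) * 0.1
        return params[key] @ args[0]
    if op == "add":
        return args[0] + args[1]
    if op == "mult":                   # elementwise (Hadamard) product
        return args[0] * args[1]
    if op == "sigmoid":
        return 1.0 / (1.0 + np.exp(-args[0]))
    if op == "tanh":
        return np.tanh(args[0])
    raise ValueError(f"unknown op: {op}")

params = {}
x, h = rng.standard_normal(H), np.zeros(H)
for _ in range(3):                     # unroll the candidate cell for a few steps
    h = evaluate(candidate, {"x": x, "h": h}, params)
print(h.shape)
```

A search procedure would then sample or mutate such trees and score each candidate by training the resulting cell on the target task; the ranking function mentioned above serves to prioritize which trees are worth that (expensive) evaluation.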
