[1706.09733] Stronger Baselines for Trustable Results in Neural Machine Translation


Strong baselines are one of the most important prerequisites for conducting reliable research. Many new methods for NMT, however, are only compared against vanilla implementations. Denkowski & Neubig propose three baselines that are easy to implement and yield significant gains over standard baselines: 1. training with Adam using multiple restarts and learning-rate annealing; 2. sub-word translation via byte pair encoding; 3. decoding with ensembles of independently trained models.
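The first baseline, Adam with restarts and learning-rate annealing, can be sketched as a simple schedule: whenever the dev score stops improving, halve the learning rate and restart the optimizer from the best checkpoint. The function below is a toy illustration of that logic only; the names, the base learning rate, and the restart limit are illustrative assumptions, not values from the paper.

```python
def anneal_schedule(dev_scores, base_lr=0.0002, factor=0.5, max_restarts=2):
    """Toy sketch of Adam restarts with learning-rate annealing.

    Whenever the dev score (lower is better, e.g. perplexity) fails to
    improve, halve the learning rate and record a restart event; a real
    trainer would reload the best checkpoint and re-initialize Adam here.
    All names and default values are illustrative, not from the paper.
    """
    lr = base_lr
    best = float("inf")
    restarts = 0
    events = []  # (step, new_lr) pairs where a restart was triggered
    for step, score in enumerate(dev_scores):
        if score < best:
            best = score  # dev score improved; keep training as-is
        elif restarts < max_restarts:
            lr *= factor  # anneal the learning rate
            restarts += 1
            events.append((step, lr))  # restart Adam from best checkpoint
    return events
```

For example, on the dev-score trace `[10, 9, 9.5, 8, 8.2, 8.1]` the schedule triggers restarts at the two points where the score regresses, each time halving the learning rate.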
