Self-organized Hierarchical Softmax

The authors propose a new self-organizing hierarchical softmax formulation for neural-network-based language models over large vocabularies. Instead of relying on a predefined hierarchical structure, their approach learns word clusters with clear syntactic and semantic meaning during language model training.
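For intuition, below is a minimal sketch of the classic two-level, class-based hierarchical softmax that this line of work builds on, where P(w | h) factorizes as P(c(w) | h) * P(w | c(w), h). It assumes a fixed word-to-cluster assignment (the paper's contribution is learning that assignment during training, which is not reproduced here); the module name, arguments, and PyTorch framing are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLevelHierarchicalSoftmax(nn.Module):
    """Two-level (class-based) hierarchical softmax sketch.

    Each word w belongs to a cluster c(w), and
    P(w | h) = P(c(w) | h) * P(w | c(w), h).
    The word-to-cluster mapping is fixed here; a self-organizing variant
    would update this assignment as training progresses.
    """

    def __init__(self, hidden_dim, vocab_size, num_clusters, word_to_cluster):
        super().__init__()
        self.cluster_proj = nn.Linear(hidden_dim, num_clusters)  # scores for P(c | h)
        self.word_proj = nn.Linear(hidden_dim, vocab_size)       # scores for words
        # word_to_cluster: LongTensor of shape (vocab_size,) mapping word id -> cluster id
        self.register_buffer("word_to_cluster", word_to_cluster)

    def forward(self, hidden, target):
        # hidden: (batch, hidden_dim), target: (batch,) word ids
        # log P(c(target) | h)
        cluster_logp = F.log_softmax(self.cluster_proj(hidden), dim=-1)
        target_cluster = self.word_to_cluster[target]
        cluster_term = cluster_logp.gather(-1, target_cluster.unsqueeze(-1)).squeeze(-1)

        # log P(target | c(target), h): softmax restricted to words in the target's cluster
        word_scores = self.word_proj(hidden)
        same_cluster = self.word_to_cluster.unsqueeze(0) == target_cluster.unsqueeze(1)
        word_scores = word_scores.masked_fill(~same_cluster, float("-inf"))
        word_term = F.log_softmax(word_scores, dim=-1).gather(
            -1, target.unsqueeze(-1)
        ).squeeze(-1)

        # negative log-likelihood of the factorized distribution
        return -(cluster_term + word_term).mean()
```

Note that this sketch still projects onto the full vocabulary for clarity; a real implementation gains its speed-up by scoring only the words inside the target's cluster.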
