Beyond Word Importance: Contextual Decomposition to Extract Interactions from LSTMs (ICLR 2018)

openreview.net

The authors propose contextual decomposition (CD), an interpretation algorithm for analysing the predictions of standard LSTMs. CD decomposes the model's output into a contribution from a chosen word or phrase and a contribution from the rest of the sentence, which lets it identify words and phrases of contrasting sentiment and extract instances of positive and negative negation from a model trained on phrase-level annotations. The key insight is that the gating dynamics of LSTMs are the vehicle through which the model captures interactions between variables.

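The decomposition bottoms out at the classifier: CD splits the final hidden state h_T into a phrase contribution beta_T and a remainder gamma_T, so the pre-softmax logits W h_T split additively into W beta_T + W gamma_T, and W beta_T can be read as the phrase's score. The snippet below is a minimal numpy sketch of that final additive step only; beta_T and gamma_T are random placeholders standing in for what the full CD recursion through the LSTM gates would produce, and the bias is grouped with the remainder purely for illustration.

```python
# Minimal sketch of the additivity that contextual decomposition exploits at
# the output layer. This is NOT the CD recursion itself: beta_T and gamma_T
# are random placeholders for the phrase / non-phrase contributions that the
# full algorithm would compute from the LSTM's gates.
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, num_classes = 64, 2

# Hypothetical CD split of the final hidden state: h_T = beta_T + gamma_T.
beta_T = rng.normal(size=hidden_dim)   # contribution of the chosen phrase (placeholder)
gamma_T = rng.normal(size=hidden_dim)  # contribution of everything else (placeholder)
h_T = beta_T + gamma_T

# Softmax classifier on top of the LSTM: p = softmax(W h_T + b).
W = rng.normal(size=(num_classes, hidden_dim))
b = rng.normal(size=num_classes)

# The layer is linear before the softmax, so the logits split additively and
# W @ beta_T can be read as the phrase's contribution to the class scores.
logits = W @ h_T + b
phrase_score = W @ beta_T
rest_score = W @ gamma_T + b

assert np.allclose(logits, phrase_score + rest_score)
print("phrase contribution to logits:", phrase_score)
```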