Supplementary material for our article

Self-Attentional Models for Lattice Inputs

Matthias Sperber, Graham Neubig, Ngoc Quan Pham, Alex Waibel
Association for Computational Linguistics (ACL), 2019

Code

Our model is implemented in XNMT. The version used for this paper, including the extensions written for it, can be downloaded here. The relevant XNMT extension is located at xnmt/custom/struct_selfatt.py. How to convert PLF lattices into the required input format is documented in misc/test/config/lattice.yaml.
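
For reference, PLF (the Python Lattice Format used by Moses) encodes each lattice as one line of nested Python-style tuples: the lattice is a tuple of nodes, each node is a tuple of outgoing edges, and each edge is a triple of (word, score, number of nodes the edge jumps ahead). The snippet below is a small made-up example, not taken from the paper's data:

(((('die', 0.6, 1), ('der', 0.4, 2)),
  (('alte', 1.0, 1),),
  (('katze', 1.0, 1),),))

Here the edge labeled 'der' jumps two nodes ahead, bypassing 'alte', so the lattice encodes both "die alte katze" and "der katze" as alternative paths.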

Example configuration files
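
For orientation, XNMT experiments are specified in YAML files whose tags map to Python classes. The skeleton below is only an illustrative sketch, not one of the released configuration files: the lattice reader tag is an assumed stand-in for the components defined in xnmt/custom/struct_selfatt.py, and all file paths are placeholders.

# Illustrative skeleton only; the lattice-specific reader and all paths
# are assumptions, not taken from the paper's released configurations.
lattice-selfatt: !Experiment
  exp_global: !ExpGlobal
    default_layer_dim: 512
  model: !DefaultTranslator
    src_reader: !LatticeReader              # assumed reader for converted PLF lattices
      vocab: !Vocab {vocab_file: vocab.src} # placeholder path
    trg_reader: !PlainTextReader
      vocab: !Vocab {vocab_file: vocab.trg} # placeholder path
  train: !SimpleTrainingRegimen
    run_for_epochs: 20
    src_file: train.src.lattices            # placeholder path
    trg_file: train.trg                     # placeholder path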