Self-Attentional Models for Lattice Inputs

Publication
Association for Computational Linguistics (ACL)

Code

Our model is implemented in XNMT. The version used for this paper, including the extensions written for it, can be downloaded here. The relevant XNMT extension is xnmt/custom/struct_selfatt.py. How to convert PLF lattices into the required format is documented in misc/test/config/lattice.yaml.
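As background on the input side, a minimal sketch of reading a PLF lattice is shown below. It assumes the standard Moses-style PLF convention, where each line is a Python tuple literal of nodes, each node a tuple of outgoing edges (token, score, distance to the target node); the exact format expected by XNMT is documented in the lattice.yaml file linked above, and the `read_plf` helper here is purely illustrative.

```python
import ast

def read_plf(line):
    """Parse one PLF lattice line (a Python tuple literal) into nested tuples.

    Each node is a tuple of outgoing edges; each edge is a
    (token, score, distance) triple, where distance gives the offset of
    the edge's target node. This follows the common Moses PLF convention;
    it is a sketch, not the XNMT conversion code itself.
    """
    return ast.literal_eval(line.strip())

# A toy lattice with two nodes; the first node has two alternative edges.
plf = "((('ein',0.7,1),('einen',0.3,1),),(('haus',1.0,1),),)"
lattice = read_plf(plf)
for node_id, edges in enumerate(lattice):
    for token, score, dist in edges:
        print(node_id, token, score, node_id + dist)
```

Each printed line lists an edge as source node, token, score, and target node, which is the kind of flat edge list a converter would then map into XNMT's lattice representation.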

Example configuration files