Self-Attentional Models for Lattice Inputs

Association for Computational Linguistics (ACL)


Our model is implemented in XNMT. The version used for this paper, including the extensions written for it, can be downloaded here. The relevant XNMT extension is located under xnmt/custom/. How to convert PLF lattices into the required format is documented here: misc/test/config/lattice.yaml
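For readers unfamiliar with the PLF (Python Lattice Format) mentioned above: a PLF lattice is a single line of Python-evaluable nested tuples, one tuple per lattice node, where each edge is a `(token, score, span)` triple and `span` counts how many nodes the edge jumps ahead. The following is a minimal sketch of reading such a line into explicit edges; `parse_plf` is a hypothetical helper for illustration, not part of XNMT or the conversion script referenced above.

```python
import ast

def parse_plf(line):
    """Parse one PLF lattice line into (source_node, target_node, token, score) edges.

    PLF lines are valid Python tuple literals, so ast.literal_eval can
    read them safely without executing arbitrary code.
    """
    lattice = ast.literal_eval(line.strip())
    edges = []
    for node_id, node in enumerate(lattice):
        for token, score, span in node:
            # 'span' is the number of nodes this edge skips ahead.
            edges.append((node_id, node_id + span, token, score))
    return edges

# A small lattice: either "a" then "c", or "b" directly to the final node.
line = "((('a', 0.5, 1), ('b', 0.5, 2),), (('c', 1.0, 1),),)"
```

Calling `parse_plf(line)` on the example yields the edge list `[(0, 1, 'a', 0.5), (0, 2, 'b', 0.5), (1, 2, 'c', 1.0)]`, which makes the two alternative paths through the lattice explicit.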

Example configuration files