Supplementary material for our article

Attention-Passing Models for Robust and Data-Efficient End-to-End Speech Translation

Matthias Sperber, Graham Neubig, Jan Niehues, Alex Waibel
Transactions of the Association for Computational Linguistics (TACL); 2019


Our model is implemented in XNMT. The version used for this paper, including the extensions we wrote for it, can be downloaded here. The relevant XNMT extensions are located under xnmt/custom/.

Example configuration files