
FBK’s Multilingual Neural Machine Translation System for IWSLT 2017

Surafel M. Lakew; Quintino F. Lotito; Marco Turchi; Matteo Negri; Marcello Federico
2017-01-01

Abstract

Neural Machine Translation has been shown to enable inference and cross-lingual knowledge transfer across multiple language directions using a single multilingual model. Focusing on this multilingual translation scenario, this work summarizes FBK's participation in the IWSLT 2017 shared task. Our submissions rely on two multilingual systems trained on five languages (English, Dutch, German, Italian, and Romanian). The first one is a 20 language-direction model, which handles all possible combinations of the five languages. The second multilingual system is trained only on 16 directions, leaving the others as zero-shot translation directions (i.e., representing a more complex inference task on language pairs not seen at training time). More specifically, our zero-shot directions are Dutch↔German and Italian↔Romanian (resulting in four language combinations). Despite the small amount of parallel data used for training these systems, the resulting multilingual models are effective, even in comparison with models trained separately for every language pair (i.e., in more favorable conditions). We report the results of the two multilingual models and compare them against baseline single-language-pair systems. In particular, we focus on the four zero-shot directions and show how a multilingual model trained with small data can provide reasonable results. Furthermore, we investigate how pivoting (i.e., using a bridge/pivot language for inference in source→pivot→target translations) with a multilingual model can be an alternative way to enable zero-shot translation in a low-resource setting.
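For concreteness, the two mechanisms the abstract refers to — steering a single multilingual model toward a target language, and pivoting through a bridge language for directions unseen at training time — can be sketched as follows. This is a minimal illustration, not the paper's actual code: the `<2xx>` tag convention and the `model.decode` interface are assumptions in the style of common multilingual NMT systems.

```python
# Minimal sketch (not the paper's implementation) of:
#   1) target-language tagging, which lets one model serve many directions;
#   2) pivot translation source -> pivot -> target for zero-shot directions.
# The <2xx> tag convention and the model interface are illustrative assumptions.

LANGS = {"en", "nl", "de", "it", "ro"}  # the five languages in the paper

def tag_source(sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so the decoder knows which language to emit."""
    assert tgt_lang in LANGS
    return f"<2{tgt_lang}> {sentence}"

def translate(model, sentence: str, tgt_lang: str) -> str:
    """One decoding pass of the multilingual model toward tgt_lang.

    `model.decode` is a hypothetical inference call standing in for any
    seq2seq toolkit's translate/decode function.
    """
    return model.decode(tag_source(sentence, tgt_lang))

def pivot_translate(model, sentence: str, pivot_lang: str, tgt_lang: str) -> str:
    """Bridge through pivot_lang: two passes through the same model."""
    intermediate = translate(model, sentence, pivot_lang)
    return translate(model, intermediate, tgt_lang)

# Example: Dutch -> German via English, when nl->de was left out of training
# (one of the paper's four zero-shot directions):
#   pivot_translate(model, nl_sentence, pivot_lang="en", tgt_lang="de")
```

Note that pivoting trades one decoding pass for two, and errors from the first pass propagate into the second; this is the trade-off the paper weighs against direct zero-shot decoding in the low-resource setting.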
Files in this record:

File: iwslt17-system.pdf (open access)
Type: Post-print document
License: Public domain
Format: Adobe PDF
Size: 255.92 kB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11582/313114