Supervised Project
Generation of simplified texts
Loria - Synalp
Claire Gardent
Category
URL
Don't put volumes
Summary
NLG Task:
Describes the findings of the Third Workshop on Neural Generation and Translation (2019). Participants were tasked with creating neural machine translation (NMT) systems that were both accurate and efficient, and with developing document-level generation and translation (DGT) systems that generate summaries from structured data, potentially with assistance from text in another language.
Training Data:
A subset of the RotoWire dataset is used as the training data, accompanied by professional German translations that are sentence-aligned to the original English articles.
The resulting parallel dataset is called the RotoWire English-German dataset.
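To make the data layout concrete, here is a minimal sketch of what one sentence-aligned record in such a parallel dataset might look like. The field names (`box_score`, `en_sentences`, `de_sentences`) and the sample values are illustrative assumptions, not the official RotoWire English-German schema.

```python
# Hypothetical record: structured box-score data plus sentence-aligned
# English and German text. All field names and values are illustrative.
record = {
    "box_score": {"team": "Raptors", "points": 122, "rebounds": 45},
    "en_sentences": [
        "The Toronto Raptors defeated the visitors 122-110.",
        "They out-rebounded their opponents 45-38.",
    ],
    "de_sentences": [
        "Die Toronto Raptors besiegten die Gaeste mit 122:110.",
        "Sie holten 45:38 mehr Rebounds als ihre Gegner.",
    ],
}

def aligned_pairs(rec):
    """Zip the sentence-aligned English and German sides into (en, de) pairs."""
    assert len(rec["en_sentences"]) == len(rec["de_sentences"])
    return list(zip(rec["en_sentences"], rec["de_sentences"]))

pairs = aligned_pairs(record)
```

Because the translations are sentence-aligned, pairing the two sides is a simple `zip`; each pair can then serve as a parallel training example for translation, while the structured `box_score` serves as input for data-to-text generation.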
Model Description:
The workshop provided a forum for research on applications of neural models to machine translation and other language generation tasks, including summarization (Rush et al., 2015), NLG from structured data (Wen et al., 2015), and dialog response generation (Vinyals and Le, 2015), among others.
Key Contribution:
In the NLG track, we observed a clear difference between the constrained and unconstrained settings. Team NLE's approach showed that pre-training the document-level generation model on news corpora is effective even when the source input differs. Among the constrained systems, it is worth noting that all systems except Team EdiNLG used the Transformer, yet the results showed no noticeable improvement over EdiNLG.
Results:
This paper summarized the results of the Third Workshop on Neural Generation and Translation, where we saw a number of research advances. In particular, this year introduced a new document generation and translation task that tested the efficacy of systems for both translation and generation in a single testbed.
Update