Natural Language Processing through Neural Machine Translation


This post highlights my class project from the Natural Language Processing course I took at McGill in Fall 2017. The project was done in collaboration with Lucas Pagé-Caccia.

The full report detailing our work is available at this link, and all source code is in this GitHub repo. The abstract of our report is reproduced below.


As a project for the Natural Language Processing (COMP550) class, we propose to explore the translation of natural language descriptions of images from the WMT17 Multimodal Machine Translation task. This being a shared task with published results from the community, we show how neural machine translation can perform well with known architectures without any over-engineering specific to this task. We further explore how adding knowledge such as image features and pre-trained embeddings affects performance, as well as jointly training the encoder for multiple languages. All source code, data and best model outputs are available at the following repository:
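For readers unfamiliar with the "known architectures" the abstract refers to, the sketch below shows the data flow of a basic encoder-decoder (seq2seq) translator: an RNN encodes the source token ids into a hidden state, and a second RNN decodes target tokens greedily from that state. This is a hypothetical toy with untrained random weights and made-up dimensions, meant only to illustrate the structure, not our actual models from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

V_SRC, V_TGT, D, H = 20, 20, 8, 16   # toy vocab sizes, embedding dim, hidden dim
BOS, EOS = 0, 1                      # assumed special target token ids

E_src = rng.normal(size=(V_SRC, D))          # source embedding table
E_tgt = rng.normal(size=(V_TGT, D))          # target embedding table
W_enc = rng.normal(size=(D + H, H)) * 0.1    # encoder RNN weights
W_dec = rng.normal(size=(D + H, H)) * 0.1    # decoder RNN weights
W_out = rng.normal(size=(H, V_TGT)) * 0.1    # hidden-to-vocab projection

def encode(src_ids):
    """Run a simple tanh RNN over the source; return the final hidden state."""
    h = np.zeros(H)
    for t in src_ids:
        h = np.tanh(np.concatenate([E_src[t], h]) @ W_enc)
    return h

def decode(h, max_len=10):
    """Greedy decoding: feed the argmax token back in until EOS or max_len."""
    out, tok = [], BOS
    for _ in range(max_len):
        h = np.tanh(np.concatenate([E_tgt[tok], h]) @ W_dec)
        tok = int(np.argmax(h @ W_out))
        if tok == EOS:
            break
        out.append(tok)
    return out

# With random weights the output token ids are arbitrary; only the shape
# of the computation (encode source, then decode target) is meaningful.
translation = decode(encode([3, 7, 2]))
print(translation)
```

Our actual models add the extensions the abstract mentions (image features concatenated into the encoder state, pre-trained embeddings in place of the random tables, and a shared encoder across target languages), but they follow this same encode-then-decode pattern.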
