Home

Melanie Tosik

14 December 2018

NLP with representation learning

This fall semester, I finally got into “Natural Language Processing with Representation Learning”, a popular course taught by Kyunghyun Cho at NYU. I remember reading the lecture notes a few years ago and feeling really inspired to learn, so this was a particularly exciting class for me.

If you’re unfamiliar with the topic, I highly recommend taking a look at the course materials yourself. The lecture notes are available on GitHub, and the syllabus is open source as well.


Throughout the course, I completed two large programming assignments:

View sentiment project on GitHub ☺︎

View inference project on GitHub ☺︎


For our final project, we had to build a series of neural machine translation systems for two language pairs, Vietnamese (Vi) → English (En) and Chinese (Zh) → English (En). Specifically, we implemented and evaluated the following model architectures (there’s a toy sketch of the attention idea after the list):

  • RNN-based encoder-decoder without attention
  • RNN-based encoder-decoder with attention
  • CNN-RNN encoder-decoder with attention
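
If you haven’t seen attention before, here’s a minimal sketch of the idea behind the second architecture: a GRU encoder-decoder with dot-product attention, written in PyTorch. This is not our project code — the class name, layer sizes, and toy inputs are all made up for illustration.

```python
import torch
import torch.nn as nn

class EncoderDecoderWithAttention(nn.Module):
    """Toy GRU encoder-decoder with dot-product attention (illustrative only)."""

    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Combine each decoder state with its attention context before
        # projecting to the target vocabulary
        self.out = nn.Linear(hid_dim * 2, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence into a sequence of hidden states
        enc_out, hidden = self.encoder(self.src_emb(src))      # (B, S, H)
        # Decode the target prefix, initialized with the encoder state
        dec_out, _ = self.decoder(self.tgt_emb(tgt), hidden)   # (B, T, H)
        # Dot-product attention: score every source state against every
        # decoder state, then average the source states by those weights
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                  # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1)) # (B, T, V)

# Toy usage with random token ids
model = EncoderDecoderWithAttention(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 7))  # batch of 2 source sentences, length 7
tgt = torch.randint(0, 1000, (2, 5))  # batch of 2 target prefixes, length 5
logits = model(src, tgt)              # (2, 5, 1000)
```

The key difference from the no-attention baseline is that the decoder gets to look back at every source hidden state at each step, instead of squeezing the whole sentence through a single fixed-size vector.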

Our results were promising despite the limited amount of training data! You can read all about our work in our project report on “RNN/CNN-based Neural Machine Translation for Vietnamese and Chinese to English”.


View translation project on GitHub ☺︎


’Til next time,
Melanie
