NLP with representation learning
This fall semester, I finally got into “Natural Language Processing with Representation Learning”, a popular course taught by Kyunghyun Cho at NYU. I remember reading the lecture notes a few years ago and feeling really inspired to learn, so this was a particularly exciting class for me.
Throughout the course, I completed two large programming assignments.
For our final project, we had to build a series of neural machine translation systems for two language pairs, Vietnamese (Vi) → English (En) and Chinese (Zh) → English (En). Specifically, we implemented and evaluated the following model architectures (see the sketch after this list):
- RNN-based encoder-decoder without attention
- RNN-based encoder-decoder with attention
- CNN-RNN encoder-decoder with attention
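
To make the attention variant concrete, here is a minimal PyTorch sketch of an RNN encoder-decoder with dot-product (Luong-style) attention. This is not our project code; the GRU layers, hidden sizes, and toy batch below are all illustrative assumptions.

```python
# A minimal sketch (not our actual project code) of an RNN encoder-decoder
# with dot-product attention. All dimensions here are illustrative.
import torch
import torch.nn as nn


class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Combine decoder state with attention context before projecting to vocab.
        self.out = nn.Linear(hid_dim * 2, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence into one annotation per token.
        enc_out, hidden = self.encoder(self.src_emb(src))      # (B, S, H)
        # Decode with teacher forcing, seeded by the final encoder state.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), hidden)   # (B, T, H)
        # Dot-product attention: score every source position per target step.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                  # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1))  # (B, T, V)


if __name__ == "__main__":
    model = Seq2SeqAttention(src_vocab=1000, tgt_vocab=1000)
    src = torch.randint(0, 1000, (2, 7))  # toy batch: 2 source sentences
    tgt = torch.randint(0, 1000, (2, 5))  # teacher-forced target prefixes
    print(model(src, tgt).shape)          # torch.Size([2, 5, 1000])
```

The variant without attention simply drops the context vector and predicts from the decoder state alone, which is what makes the side-by-side comparison in the project interesting.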
Our results were promising despite the limited amount of training data! You can read all about our work in our project report on “RNN/CNN-based Neural Machine Translation for Vietnamese and Chinese to English”.
Til next time,