Hackerman Hall, Room B17
In this talk, I will describe three research problems I have recently worked on and found worth further discussion and investigation in the context of neural machine translation. First, I will discuss whether the standard autoregressive sequence model could be replaced with a non-autoregressive one and, if so, how we might do so by introducing the idea of iterative refinement for sequence generation. Second, I will introduce one particular meta-learning algorithm, MAML [Finn et al., 2017], and discuss why it is well-suited for multilingual translation and, in particular, low-resource translation. Lastly, I will briefly discuss slightly older work on real-time translation. All of these works are highly experimental but at the same time extremely fun to think about and discuss.
Kyunghyun Cho is an assistant professor of computer science and data science at New York University and a research scientist at Facebook AI Research. He was a postdoctoral fellow at the University of Montreal until summer 2015 under the supervision of Prof. Yoshua Bengio, and received his PhD and MSc degrees from Aalto University in early 2014 under the supervision of Prof. Juha Karhunen, Dr. Tapani Raiko, and Dr. Alexander Ilin. He tries his best to find a balance among machine learning, natural language processing, and life, but almost always fails to do so.