Mohammad Azam Khan

Postdoc


Curriculum vitae




Meta-Learning for Low-Resource Unsupervised Neural Machine Translation


Preprint


Yunwon Tae, Cheonbok Park, Taehee Kim, Soyoung Yang, Mohammad Azam Khan, Eunjeong Park, Tao Qin, Jaegul Choo
arXiv.org, 2020

Cite

APA
Tae, Y., Park, C., Kim, T., Yang, S., Khan, M. A., Park, E., … Choo, J. (2020). Meta-Learning for Low-Resource Unsupervised Neural Machine Translation. arXiv.org.


Chicago/Turabian
Tae, Yunwon, Cheonbok Park, Taehee Kim, Soyoung Yang, Mohammad Azam Khan, Eunjeong Park, Tao Qin, and Jaegul Choo. “Meta-Learning for Low-Resource Unsupervised Neural Machine Translation.” arXiv.org (2020).


MLA
Tae, Yunwon, et al. “Meta-Learning for Low-Resource Unsupervised Neural Machine Translation.” arXiv.org, 2020.


BibTeX

@article{tae2020meta,
  title = {Meta-Learning for Low-Resource Unsupervised Neural Machine Translation},
  year = {2020},
  journal = {arXiv.org},
  author = {Tae, Yunwon and Park, Cheonbok and Kim, Taehee and Yang, Soyoung and Khan, Mohammad Azam and Park, Eunjeong and Qin, Tao and Choo, Jaegul}
}

Abstract

Unsupervised machine translation, which uses unpaired monolingual corpora as training data, has achieved performance comparable to supervised machine translation. However, it still performs poorly in data-scarce domains. To address this issue, this paper presents a meta-learning algorithm for unsupervised neural machine translation (UNMT) that trains the model to adapt to a new domain using only a small amount of training data. We assume that domain-general knowledge is a significant factor in handling data-scarce domains, and therefore extend the meta-learning algorithm to exploit knowledge learned from high-resource domains to boost the performance of low-resource UNMT. Our model surpasses a transfer-learning-based approach by 2-4 BLEU points. Extensive experiments show that the proposed algorithm enables fast adaptation and consistently outperforms baseline models.
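
At a high level, the abstract describes a meta-learning loop: adapt a shared initialization to each high-resource domain on a small support set, then update that initialization from the post-adaptation query losses, so the model can later adapt quickly to a low-resource domain. The sketch below is a minimal first-order MAML-style version of that loop in PyTorch; it is illustrative only. The toy linear model, the loss function, the synthetic domain batches, and the names inner_adapt and meta_train are placeholders, not the paper's actual UNMT implementation, and the paper's exact meta-update may differ (e.g., second-order gradients).

import copy
import torch

def inner_adapt(model, support_batch, loss_fn, inner_lr=1e-2, steps=1):
    # Clone the shared initialization and take a few gradient steps on one
    # domain's small support set (the "fast adaptation" phase).
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(steps):
        x, y = support_batch
        loss = loss_fn(adapted(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return adapted

def meta_train(model, domains, loss_fn, meta_lr=1e-3, meta_steps=100):
    # Outer loop: accumulate post-adaptation query-set gradients across the
    # high-resource domains, then update the shared initialization.
    meta_opt = torch.optim.SGD(model.parameters(), lr=meta_lr)
    for _ in range(meta_steps):
        meta_opt.zero_grad()
        for support, query in domains:   # one (support, query) pair per domain
            adapted = inner_adapt(model, support, loss_fn)
            adapted.zero_grad()          # drop leftover inner-loop gradients
            x, y = query
            loss_fn(adapted(x), y).backward()
            # First-order approximation: treat the adapted parameters'
            # gradients as gradients of the shared initialization.
            for p, ap in zip(model.parameters(), adapted.parameters()):
                p.grad = ap.grad.clone() if p.grad is None else p.grad + ap.grad
        meta_opt.step()
    return model

# Toy usage: two synthetic "domains", each a (support, query) batch pair.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
def make_domain():
    return ((torch.randn(8, 4), torch.randn(8, 1)),
            (torch.randn(8, 4), torch.randn(8, 1)))
meta_train(model, [make_domain(), make_domain()],
           torch.nn.functional.mse_loss, meta_steps=5)

At deployment, the meta-learned initialization would be adapted to the low-resource target domain with the same inner loop, using its small in-domain corpus.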

