Recurrent Neural Network Techniques: Emphasis on Use in Neural Machine Translation

Dima Suleiman, Wael Etaiwi, Arafat Awajan


Natural Language Processing (NLP) is the processing and representation of human language in a way that accommodates its use in modern computer technology. Several techniques, including deep learning, graph-based, rule-based, and word embedding approaches, can be used in a variety of NLP applications such as text summarization, question answering, and sentiment analysis. In this paper, machine translation techniques based on recurrent neural networks are analyzed and discussed. The techniques are divided into three categories: recurrent neural networks, recurrent neural networks with phrase-based models, and recurrent neural networks with graph-based models. Several experiments are performed on a range of datasets to translate between different languages. In addition, most of the techniques use BLEU to evaluate the performance of their translation models.
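Since BLEU is the evaluation measure used by most of the surveyed models, the sketch below shows how a sentence-level BLEU score can be computed. This is a minimal illustration of clipped n-gram precision with a brevity penalty, following the general scheme of Papineni et al. (2002); the function names are illustrative, and production work would normally use an established implementation rather than this sketch.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU of one candidate against one reference.

    Uses clipped (modified) n-gram precision for n = 1..max_n with
    uniform weights, multiplied by the standard brevity penalty.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference,
        # so repeating a correct word cannot inflate the score.
        clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(clipped / total)
    if min(precisions) == 0:
        return 0.0
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A candidate identical to the reference scores 1.0, while small word substitutions lower the geometric mean of the n-gram precisions and hence the score.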



This work is licensed under a Creative Commons Attribution 3.0 License.