Informatika i Ee Primeneniya [Informatics and its Applications], 2019, Volume 13, Issue 3, Pages 90–96
DOI: https://doi.org/10.14357/19922264190313
(Mi ia614)
 

This article is cited in 2 scientific papers.

Architecture of a machine translation system

V. A. Nuriev

Institute of Informatics Problems, Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, 44-2 Vavilov Str., Moscow 119333, Russian Federation
Abstract: The paper describes the architecture of a neural machine translation (NMT) system. The topic is relevant because NMT, i.e., translation by means of artificial neural networks, is now the leading machine translation paradigm. NMT systems deliver markedly better output quality than the machine translators of the previous generation (statistical translation systems). Still, the translations they produce may contain various errors and remain less accurate than human translations. To improve this quality, it is therefore important to understand more clearly how an NMT system is built and how it works. Its architecture commonly consists of two recurrent neural networks: one encodes the input text sequence, and the other generates the translated output sequence. An NMT system often also includes an attention mechanism that helps it cope with long input sequences. Google's NMT system is taken as an example, since the Google Translate service is among the most highly demanded today: it processes around 143 billion words in more than 100 languages per day. The paper concludes with some perspectives for future research.
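The attention mechanism mentioned in the abstract can be illustrated with a minimal sketch. The code below is not from the paper; it is a generic dot-product attention step, assuming a list of encoder hidden states and a single decoder hidden state (all names and toy values are hypothetical): each encoder state is scored against the decoder state, the scores are normalized with a softmax, and the resulting weights form a context vector that lets the decoder focus on the relevant parts of a long input sequence.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    # Inner product of two equal-length vectors
    return sum(a * b for a, b in zip(u, v))

def attention(encoder_states, decoder_state):
    """Dot-product attention over the encoder's hidden states.

    encoder_states: list of T hidden vectors (one per input token)
    decoder_state:  the decoder's current hidden vector
    Returns (weights, context): weights sum to 1; context is the
    weighted sum of encoder states, fed to the decoder at this step.
    """
    scores = [dot(h, decoder_state) for h in encoder_states]
    weights = softmax(scores)
    hidden = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(hidden)]
    return weights, context

# Toy example: 4 input tokens, hidden size 3
enc = [[0.1, 0.0, 0.2],
       [0.9, 0.1, 0.3],
       [0.0, 0.5, 0.1],
       [0.2, 0.2, 0.2]]
dec = [1.0, 0.0, 0.0]
weights, context = attention(enc, dec)
```

In a full NMT system the scoring function is usually learned (e.g., an additive or multiplicative form with trainable parameters) rather than a raw dot product, but the normalize-and-average pattern shown here is the same.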
Keywords: neural machine translation, artificial neural networks, recurrent neural networks, attention mechanism, architecture of a machine translation system, Google's Neural Machine Translation system.
Received: 30.06.2019
Document Type: Article
Language: Russian
Citation: V. A. Nuriev, “Architecture of a machine translation system”, Inform. Primen., 13:3 (2019), 90–96
Citation in format AMSBIB
\Bibitem{Nur19}
\by V.~A.~Nuriev
\paper Architecture of~a~machine translation system
\jour Inform. Primen.
\yr 2019
\vol 13
\issue 3
\pages 90--96
\mathnet{http://mi.mathnet.ru/ia614}
\crossref{https://doi.org/10.14357/19922264190313}
Linking options:
  • https://www.mathnet.ru/eng/ia614
  • https://www.mathnet.ru/eng/ia/v13/i3/p90