Informatika i Ee Primeneniya [Informatics and its Applications], 2018, Volume 12, Issue 4, Pages 63–69
DOI: https://doi.org/10.14357/19922264180409
(Mi ia564)
 

This article is cited in 1 scientific paper.

Optimal recurrent neural network model in paraphrase detection

A. N. Smerdov (a), O. Yu. Bakhteev (a), V. V. Strijov (a,b)

(a) Moscow Institute of Physics and Technology, 9 Institutskiy Per., Dolgoprudny, Moscow Region 141700, Russian Federation
(b) A. A. Dorodnicyn Computing Center, Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, 40 Vavilov Str., Moscow 119333, Russian Federation
Full-text PDF (380 kB) Citations (1)
Abstract: This paper addresses the problem of optimal recurrent neural network selection. The neural network evidence lower bound is taken as the model selection criterion. Variational inference methods are investigated to approximate the posterior distribution of the network parameters. As a particular case, a normal distribution of the parameters with different types of covariance matrix is investigated. The authors propose a method of pruning the parameters with the highest posterior probability density at zero in order to increase the model marginal likelihood. As an illustrative example, a computational experiment on multiclass classification with the SemEval 2015 dataset was carried out.
Keywords: deep learning, recurrent neural network, neural network pruning, variational approach.
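For reference, the evidence lower bound (ELBO) named in the abstract as the selection criterion has, in its standard variational-inference form (the paper's exact parametrization of the variational distribution is not reproduced here), the shape

    \log p(\mathbf{y} \mid \mathbf{X}) \;\ge\; \mathbb{E}_{q(\mathbf{w})}\bigl[\log p(\mathbf{y} \mid \mathbf{X}, \mathbf{w})\bigr] - \mathrm{KL}\bigl(q(\mathbf{w}) \,\|\, p(\mathbf{w})\bigr),

where \mathbf{w} denotes the network parameters, q(\mathbf{w}) the variational approximation to their posterior, and p(\mathbf{w}) the prior.

Below is a minimal sketch of the pruning rule described in the abstract, assuming a mean-field Gaussian posterior N(mu_i, sigma_i^2) over each parameter: the parameters whose posterior density at zero is highest are removed. The function name, the prune_fraction argument, and the use of NumPy/SciPy are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.stats import norm

    def prune_by_density_at_zero(mu, sigma, prune_fraction=0.2):
        """Zero out the fraction of parameters whose variational posterior
        N(mu_i, sigma_i^2) has the highest density at w_i = 0 (illustrative sketch)."""
        mu = np.asarray(mu, dtype=float)
        sigma = np.asarray(sigma, dtype=float)
        density_at_zero = norm.pdf(0.0, loc=mu, scale=sigma)  # q_i(0) for each parameter
        n_prune = int(round(prune_fraction * mu.size))
        mask = np.ones_like(mu)
        if n_prune > 0:
            # Indices of the parameters whose posterior is most concentrated around zero.
            prune_idx = np.argsort(density_at_zero)[-n_prune:]
            mask[prune_idx] = 0.0
        return mask

    # Example: the second parameter is tightly concentrated near zero
    # and is therefore the first candidate for pruning.
    mask = prune_by_density_at_zero(mu=[0.8, 0.01, -1.2],
                                    sigma=[0.1, 0.05, 0.3],
                                    prune_fraction=1 / 3)
    print(mask)  # expected: [1. 0. 1.]

For a Gaussian posterior the density at zero equals exp(-mu_i^2 / (2 sigma_i^2)) / (sigma_i sqrt(2 pi)), so the rule removes parameters that the approximate posterior concentrates near zero, the intuition being that setting them to zero changes the likelihood term of the ELBO little while simplifying the model.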
Funding agency / Grant number:
  • Russian Foundation for Basic Research: 16-07-01160_а
  • Ministry of Education and Science of the Russian Federation: 05.Y09.21.0018
This research was supported by the Russian Foundation for Basic Research, project 16-07-01160, and by the Government of the Russian Federation, agreement 05.Y09.21.0018.
Received: 05.05.2018
Document Type: Article
Language: Russian
Citation: A. N. Smerdov, O. Yu. Bakhteev, V. V. Strijov, “Optimal recurrent neural network model in paraphrase detection”, Inform. Primen., 12:4 (2018), 63–69
Citation in format AMSBIB
\Bibitem{SmeBakStr18}
\by A.~N.~Smerdov, O.~Yu.~Bakhteev, V.~V.~Strijov
\paper Optimal recurrent neural network model in~paraphrase detection
\jour Inform. Primen.
\yr 2018
\vol 12
\issue 4
\pages 63--69
\mathnet{http://mi.mathnet.ru/ia564}
\crossref{https://doi.org/10.14357/19922264180409}
\elib{https://elibrary.ru/item.asp?id=36574077}
Linking options:
  • https://www.mathnet.ru/eng/ia564
  • https://www.mathnet.ru/eng/ia/v12/i4/p63
     