Computer Optics

Computer Optics, 2021, Volume 45, Issue 4 (paper published in the English-language version of the journal)
DOI: https://doi.org/10.18287/2412-6179-CO-856
(Mi co946)
 

This article is cited in 13 scientific papers.

IMAGE PROCESSING, PATTERN RECOGNITION

One-shot learning with triplet loss for vegetation classification tasks

A. V. Uzhinskiy a,b, G. A. Ososkov a, P. V. Goncharov a, A. V. Nechaevskiy a,b, A. A. Smetanin c

a Russian State Agrarian University - Moscow Timiryazev Agricultural Academy, Timiryazevskaya st. 49, Moscow, Russia
b Russian State Agrarian University - Moscow Agricultural Academy named after K.A. Timiryazev
c National Research University ITMO, Kronverkskiy pr. 49, 197101, Saint Petersburg, Russia
Abstract: The triplet loss function is one of the options that can significantly improve the accuracy of one-shot learning tasks. Since 2015, many projects have used Siamese networks with this kind of loss for face recognition and object classification. In our research, we focused on two tasks related to vegetation. The first is plant disease detection on 25 classes of five crops (grape, cotton, wheat, cucumbers, and corn). This task is motivated by the fact that harvest losses due to diseases are a serious problem for both large farming structures and rural families. The second task is the identification of moss species (5 classes). Mosses are natural bioaccumulators of pollutants; therefore, they are used in environmental monitoring programs. The identification of moss species is an important step in sample preprocessing. In both tasks, we used self-collected image databases. We tried several deep learning architectures and approaches. Our Siamese network architecture with a triplet loss function and MobileNetV2 as a base network showed the most impressive results in both of the above-mentioned tasks. The average accuracy amounted to over 97.8% for plant disease detection and 97.6% for moss species classification.
Keywords: deep neural networks; Siamese networks; triplet loss; plant disease detection; moss species classification.
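The approach described in the abstract (a Siamese embedding network with a MobileNetV2 backbone trained with a triplet loss) can be illustrated with a minimal sketch, assuming a TensorFlow/Keras implementation. The input resolution, embedding size, margin, and optimizer settings below are illustrative assumptions, not the authors' exact configuration.

import tensorflow as tf
from tensorflow.keras import layers, Model

IMG_SIZE = 224          # assumed input resolution
EMBEDDING_DIM = 128     # assumed embedding size
MARGIN = 0.5            # assumed triplet margin

def build_embedding_net():
    # MobileNetV2 backbone (ImageNet weights) followed by a small projection head.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(IMG_SIZE, IMG_SIZE, 3), include_top=False, weights="imagenet")
    inputs = layers.Input((IMG_SIZE, IMG_SIZE, 3))
    x = base(inputs)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dense(EMBEDDING_DIM)(x)
    # L2-normalise embeddings so that distances are comparable across samples.
    outputs = layers.Lambda(lambda t: tf.math.l2_normalize(t, axis=1))(x)
    return Model(inputs, outputs, name="embedding")

def triplet_loss(anchor, positive, negative, margin=MARGIN):
    # Pull the anchor towards the positive example and push it away from the negative one.
    pos_dist = tf.reduce_sum(tf.square(anchor - positive), axis=1)
    neg_dist = tf.reduce_sum(tf.square(anchor - negative), axis=1)
    return tf.reduce_mean(tf.maximum(pos_dist - neg_dist + margin, 0.0))

embedding_net = build_embedding_net()
optimizer = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(a_img, p_img, n_img):
    # One training step on a batch of (anchor, positive, negative) image triplets.
    with tf.GradientTape() as tape:
        loss = triplet_loss(embedding_net(a_img, training=True),
                            embedding_net(p_img, training=True),
                            embedding_net(n_img, training=True))
    grads = tape.gradient(loss, embedding_net.trainable_variables)
    optimizer.apply_gradients(zip(grads, embedding_net.trainable_variables))
    return loss

At inference time in the one-shot setting, a query image would be classified by comparing its embedding with stored per-class reference embeddings (for example, by nearest neighbour in the embedding space); the exact training and evaluation protocol used by the authors is described in the full text.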
Funding agency: Ministry of Science and Higher Education of the Russian Federation (grant No 075-15-2020-905); Russian Foundation for Basic Research (grant No 18-07-00829).
A.V.U. and A.V.N. gratefully acknowledge financial support from the Ministry of Science and Higher Education of the Russian Federation under agreement No 075-15-2020-905 dated November 16, 2020, providing a grant in the form of subsidies from the federal budget of the Russian Federation. The grant was provided for state support of the creation and development of the World-class Scientific Center "Agrotechnologies for the Future". The database creation part of the reported study was funded by RFBR according to research project No 18-07-00829.
Received: 28.12.2020
Accepted: 24.02.2021
Document Type: Article
Language: English
Citation: A. V. Uzhinskiy, G. A. Ososkov, P. V. Goncharov, A. V. Nechaevskiy, A. A. Smetanin, "One-shot learning with triplet loss for vegetation classification tasks", Computer Optics, 45:4 (2021)
Citation in format AMSBIB
\Bibitem{UzhOsoGon21}
\by A.~V.~Uzhinskiy, G.~A.~Ososkov, P.~V.~Goncharov, A.~V.~Nechaevskiy, A.~A.~Smetanin
\paper One-shot learning with triplet loss for vegetation classification tasks
\jour Computer Optics
\yr 2021
\vol 45
\issue 4
\mathnet{http://mi.mathnet.ru/co946}
\crossref{https://doi.org/10.18287/2412-6179-CO-856}
Linking options:
  • https://www.mathnet.ru/eng/co946