Uchenyye zapiski UlGU. Seriya "Matematika i informatsionnyye tekhnologii", 2020, Issue 2, Pages 30–34
(Mi ulsu10)
Mobile application for real-time sign language translation
A. V. Kopylov, M. A. Volkov
Ulyanovsk State University, Ulyanovsk, Russia
Abstract:
The paper describes a software product that translates American Sign Language fingerspelling (the dactyl alphabet) into letters of the English alphabet. The multithreaded Android application recognizes gestures using a trained convolutional neural network with the ResNet18 architecture, with tuned model hyperparameters and data-augmentation parameters. An important advantage of the product is that it runs in real time on a mobile device without an Internet connection. The developed mobile application prototype is universal: to switch to another language, it is enough to replace the file with the neural network model (provided the architectures are similar) and the file with the letter classes.
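The "universal" design described in the abstract — a neural network model file paired with a separate letter-classes file — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the file contents, function names, and the truncated alphabet are assumptions, and loading/running the actual ResNet18 model is omitted.

```python
import json

# Hypothetical contents of the "file with the letter classes": a JSON list
# mapping class index -> letter. To support another language, only this
# file and the model file would be swapped.
CLASSES_JSON = '["A", "B", "C", "D", "E"]'  # truncated illustrative alphabet

def predict_letter(logits: list[float], classes: list[str]) -> str:
    """Map the network's output scores to a letter via argmax.

    `logits` stands in for the output of the ResNet18 classifier over one
    camera frame; the model inference step itself is not shown here.
    """
    best = max(range(len(logits)), key=lambda i: logits[i])
    return classes[best]

classes = json.loads(CLASSES_JSON)
# Simulated scores for one recognized gesture; index 2 ("C") is highest.
print(predict_letter([0.1, 0.3, 2.4, 0.2, 0.7], classes))  # prints "C"
```

Because the class mapping lives outside the model, the recognition code stays language-agnostic, matching the substitution scheme the abstract describes.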
Keywords:
mobile app, sign language, fingerspelling, convolutional neural networks, Android, Python, Kotlin.
Received: 27.05.2020; Revised: 12.07.2020; Accepted: 15.07.2020
Citation:
A. V. Kopylov, M. A. Volkov, “Mobile application for real-time sign language translation”, Uchenyye zapiski UlGU. Seriya “Matematika i informatsionnyye tekhnologii”, 2020, no. 2, 30–34
Linking options:
https://www.mathnet.ru/eng/ulsu10
https://www.mathnet.ru/eng/ulsu/y2020/i2/p30