This article is cited in 1 scientific paper (total in 1 paper)
Speech enhancement method based on modified encoder-decoder pyramid transformer
A. A. Lependin, R. S. Nasretdinov, I. D. Ilyashenko (Altai State University)
Abstract:
The development of new voice communication technologies has created a need to improve speech enhancement methods. Modern users of information systems place high demands on both the intelligibility of the voice signal and its perceptual quality. In this work we propose a new approach to the speech enhancement problem: a modified pyramidal transformer neural network with an encoder-decoder structure. The encoder compresses the spectrum of the voice signal into a pyramidal series of internal embeddings. From these embeddings, the decoder, built on self-attention transformations, reconstructs the complex ratio mask between the clean and noisy signals. Two possible loss functions were considered for training the proposed neural network model. We show that mixing a frequency encoding into the input data improves the performance of the proposed approach. The neural network was trained and tested on the DNS Challenge 2021 dataset and showed high performance compared to modern speech enhancement methods. We also provide a qualitative analysis of the training process of the implemented neural network: the network gradually moves from simple noise masking in the early training epochs to restoring the missing formant components of the speaker's voice in later epochs. This leads to high performance metrics and high subjective quality of the enhanced speech.
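The core operation the abstract describes, applying a predicted complex ratio mask to the noisy spectrum, amounts to an element-wise complex multiplication in the STFT domain. A minimal sketch follows; the function name and the idea of passing the mask as separate real and imaginary components are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def apply_complex_ratio_mask(noisy_stft, mask_real, mask_imag):
    """Apply a complex ratio mask (CRM) to a noisy STFT.

    noisy_stft: complex array of shape (freq, time), STFT of the noisy signal.
    mask_real, mask_imag: real and imaginary mask components, same shape,
        standing in here for the decoder's output.
    Returns the enhanced STFT, to be inverted back to a waveform.
    """
    mask = mask_real + 1j * mask_imag
    # Element-wise complex multiplication scales magnitude and
    # rotates phase of each time-frequency bin.
    return mask * noisy_stft
```

An identity mask (real part 1, imaginary part 0) leaves the signal unchanged, which is a convenient sanity check when wiring such a masking stage into a training loop.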
Keywords:
speech enhancement, noise reduction, noise masking, deep neural network, deep learning, encoder-decoder architecture, pyramid transformer, self-attention
Citation:
A. A. Lependin, R. S. Nasretdinov, I. D. Ilyashenko, “Speech enhancement method based on modified encoder-decoder pyramid transformer”, Proceedings of ISP RAS, 34:4 (2022), 135–152
Linking options:
https://www.mathnet.ru/eng/tisp710
https://www.mathnet.ru/eng/tisp/v34/i4/p135