Language models application in sentiment attitude extraction task
N. L. Rusnachenko, Bauman Moscow State Technical University
Abstract:
Large texts can convey various forms of sentiment information, including the author's position, positive or negative effects of certain events, and attitudes of mentioned entities towards each other. In this paper, we experiment with BERT-based language models for extracting sentiment attitudes between named entities. Given a mass media article and a list of mentioned named entities, the task is to extract positive or negative attitudes between them. The efficiency of language model methods depends on the amount of training data. To enrich the training data, we adopt the distant supervision method, which provides automatic annotation of unlabeled texts using an additional lexical resource. The proposed approach is subdivided into two stages: (1) sentiment pair list completion (PAIR-BASED), (2) document annotation using PAIR-BASED and FRAME-BASED factors. Applied to a large news collection, the method generates the automatically annotated collection RuAttitudes2017. We evaluate the approach on RuSentRel-1.0, which consists of mass media articles written in Russian. Adopting RuAttitudes2017 in the training process results in a 10-13% quality improvement by F1-measure over supervised learning, and a 25% improvement over the top neural network based model results.
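To make the two-stage distant supervision idea concrete, below is a minimal, purely illustrative Python sketch. The seed pair lexicon, the frame lexicon, the entity handling, and the rule of keeping only attitudes supported by both PAIR-BASED and FRAME-BASED evidence are all assumptions for illustration, not the paper's actual RuAttitudes2017 pipeline.

```python
from dataclasses import dataclass
from itertools import permutations

# Hypothetical seed lexicon of sentiment pairs: (source, target) -> label (+1 / -1).
SEED_PAIRS = {
    ("country_a", "country_b"): -1,
}

# Hypothetical frame lexicon: polarity-bearing connectives standing between two
# entity mentions (a strong simplification of frame-based factors).
FRAMES = {
    "accuses": -1,
    "supports": +1,
}

@dataclass
class Sentence:
    tokens: list    # lower-cased tokens
    entities: list  # entity identifiers present in the sentence

def pair_based(sentence, pairs):
    """PAIR-BASED style factor: label an entity pair if it occurs in the seed list."""
    found = []
    for a, b in permutations(sentence.entities, 2):
        if (a, b) in pairs:
            found.append((a, b, pairs[(a, b)]))
    return found

def frame_based(sentence):
    """FRAME-BASED style factor: infer a label from a polarity-bearing frame word
    located between two entity mentions."""
    found = []
    positions = {e: sentence.tokens.index(e)
                 for e in sentence.entities if e in sentence.tokens}
    for (a, ia), (b, ib) in permutations(positions.items(), 2):
        if ia >= ib:
            continue
        for tok in sentence.tokens[ia + 1:ib]:
            if tok in FRAMES:
                found.append((a, b, FRAMES[tok]))
                break
    return found

def annotate(document, pairs):
    """Keep attitudes supported by both factors (an assumed combination rule)."""
    attitudes = set()
    for sent in document:
        attitudes |= set(pair_based(sent, pairs)) & set(frame_based(sent))
    return attitudes

if __name__ == "__main__":
    doc = [Sentence(tokens=["country_a", "openly", "accuses", "country_b"],
                    entities=["country_a", "country_b"])]
    print(annotate(doc, SEED_PAIRS))  # {('country_a', 'country_b', -1)}
```

In this toy setting, documents whose extracted attitudes pass both factors would form the automatically annotated training data that is then mixed with the labeled corpus when training the language model.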
Keywords:
sentiment analysis, relation extraction, distant supervision, neural networks, language models.
Citation:
N. L. Rusnachenko, “Language models application in sentiment attitude extraction task”, Proceedings of ISP RAS, 33:3 (2021), 199–222
Linking options:
https://www.mathnet.ru/eng/tisp608 https://www.mathnet.ru/eng/tisp/v33/i3/p199