MATHEMATICAL MODELING AND NUMERICAL SIMULATION
Regularization, robustness and sparsity of probabilistic topic models
K. V. Vorontsov^a, A. A. Potapenko^b
^a RUKONT-PhysTech Laboratory, CMAM department, MIPT, 9 Institutskii per., Dolgoprudny, Moscow Region, 141700, Russia
^b CMC department, Moscow State University, Leninskie gory, Moscow, 119991, Russia
Abstract:
We propose a generalized probabilistic topic model of text corpora which can incorporate heuristics of Bayesian regularization, sampling, frequent parameter updates, and robustness in any combination. Well-known models such as PLSA, LDA, CVB0, SWB, and many others can be considered special cases of the proposed broad family of models. We propose the robust PLSA model and show that it is sparser and performs better than regularized models such as LDA.
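Since the abstract refers to PLSA, the EM-algorithm, and perplexity, a minimal sketch in Python may help fix the notation. This is an illustrative assumption, not the authors' implementation: the function names, matrix shapes, iteration count, and smoothing constant are invented for this sketch. It implements the standard multiplicative EM updates for PLSA and the usual corpus perplexity.

import numpy as np

def plsa_em(counts, num_topics, num_iters=50, seed=0):
    """Standard PLSA EM. counts: (D, W) matrix of term frequencies n_dw."""
    rng = np.random.default_rng(seed)
    D, W = counts.shape
    phi = rng.random((W, num_topics))            # p(w|t), columns sum to 1
    phi /= phi.sum(axis=0, keepdims=True)
    theta = rng.random((num_topics, D))          # p(t|d), columns sum to 1
    theta /= theta.sum(axis=0, keepdims=True)
    for _ in range(num_iters):
        # E-step folded into the M-step via the ratio n_dw / p(w|d),
        # where p(w|d) = sum_t phi[w,t] * theta[t,d]
        pwd = np.maximum(phi @ theta, 1e-12)     # (W, D)
        ratio = counts.T / pwd                   # (W, D)
        # M-step: unnormalized counts n_wt and n_td
        n_wt = phi * (ratio @ theta.T)           # (W, T)
        n_td = theta * (phi.T @ ratio)           # (T, D)
        phi = n_wt / n_wt.sum(axis=0, keepdims=True)
        theta = n_td / n_td.sum(axis=0, keepdims=True)
    return phi, theta

def perplexity(counts, phi, theta):
    """exp(-(1/n) sum_dw n_dw ln p(w|d)), n = total token count."""
    pwd = np.maximum(phi @ theta, 1e-12)
    log_lik = (counts.T * np.log(pwd)).sum()
    return np.exp(-log_lik / counts.sum())

# Hypothetical usage on a random corpus:
# counts = np.random.default_rng(1).poisson(0.05, size=(100, 500)).astype(float)
# phi, theta = plsa_em(counts, num_topics=10)
# print(perplexity(counts, phi, theta))

The regularized and robust variants studied in the paper modify the M-step counts n_wt, n_td; the plain updates above are only the unregularized baseline.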
Keywords:
text analysis, topic modeling, probabilistic latent semantic analysis, EM-algorithm, latent Dirichlet allocation, Gibbs sampling, Bayesian regularization, perplexity, robustness.
Received: 06.09.2012
Citation:
K. V. Vorontsov, A. A. Potapenko, “Regularization, robustness and sparsity of probabilistic topic models”, Computer Research and Modeling, 4:4 (2012), 693–706
Linking options:
https://www.mathnet.ru/eng/crm522
https://www.mathnet.ru/eng/crm/v4/i4/p693