This article is cited in 3 scientific papers.
Computer science
Tradeoff relation between mutual information and error probability in data classification problem
A. M. Lange, M. M. Lange, S. V. Paramonov
Federal Research Center “Computer Science and Control,” Russian Academy of Sciences, 119333, Moscow, Russia
Abstract:
A data classification model is investigated in which the average mutual information between source objects and the decisions made is a function of the error probability. Optimization of the model consists in finding a tradeoff “mutual information–error probability” (MIEP) relation between the minimal average mutual information and the error probability, which is analogous to the well-known rate distortion function for source coding with a given fidelity in the case of a noisy observation channel. A lower bound for the MIEP relation is constructed; it yields a lower bound on the classification error probability over a given set of objects for any fixed value of the average mutual information. The MIEP relation and its lower bound are generalized to ensembles of sources. The obtained bounds are useful for estimating the error probability redundancy of decision algorithms with given sets of discriminant functions.
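To make the two quantities in the tradeoff concrete, the sketch below computes the average mutual information I(X;Y) and the error probability for a toy two-class classifier described by a joint distribution over (source class, decision). The numbers and the distribution are illustrative assumptions, not data from the paper, and this is not the paper's MIEP optimization, only a minimal evaluation of the two quantities it relates.

```python
import math

# Hypothetical joint probabilities P(source class x, decision y)
# for a noisy two-class classifier (illustrative values only).
joint = {
    (0, 0): 0.45, (0, 1): 0.05,
    (1, 0): 0.10, (1, 1): 0.40,
}

def mutual_information(p_xy):
    """Average mutual information I(X;Y) in bits from a joint distribution."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

def error_probability(p_xy):
    """Probability that the decision differs from the source class."""
    return sum(p for (x, y), p in p_xy.items() if x != y)

I = mutual_information(joint)    # about 0.40 bits for these values
Pe = error_probability(joint)    # 0.15 for these values
```

Varying the off-diagonal mass of the joint distribution traces points in the (mutual information, error probability) plane; the MIEP relation of the paper characterizes the lower envelope of such points.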
Key words:
data classification, ensemble of sources, error probability, mutual information, mutual information–error probability relation, lower bound, discriminant function, decision algorithm, error probability redundancy.
Received: 26.11.2020. Revised: 26.11.2020. Accepted: 11.03.2021.
Citation:
A. M. Lange, M. M. Lange, S. V. Paramonov, “Tradeoff relation between mutual information and error probability in data classification problem”, Zh. Vychisl. Mat. Mat. Fiz., 61:7 (2021), 1192–1205; Comput. Math. Math. Phys., 61:7 (2021), 1181–1193
Linking options:
https://www.mathnet.ru/eng/zvmmf11269
https://www.mathnet.ru/eng/zvmmf/v61/i7/p1192