6 citations to https://www.mathnet.ru/rus/eps1
- A. V. Bulinski, “On relevant features selection based on information theory”, Theory Probab. Appl., 68:3 (2023), 392–410 (Russian original: Теория вероятн. и ее примен., 68:3 (2023), 483–508)
- Mehmet Siddik Cadirci, Dafydd Evans, Nikolai Leonenko, Vitalii Makogin, “Entropy-based test for generalised Gaussian distributions”, Computational Statistics & Data Analysis, 173 (2022), 107502
- Alexander Bulinski, Denis Dimitrov, “Statistical estimation of the Kullback–Leibler divergence”, Mathematics, 9:5 (2021), 544, 36 pp.
- “Abstracts of talks given at the 4th International Conference on Stochastic Methods”, Theory Probab. Appl., 65:1 (2020), 121–172 (Russian original: Теория вероятн. и ее примен., 65:1 (2020), 151–210)
- Josu Ircio, Aizea Lojo, Usue Mori, Jose A. Lozano, “Mutual information based feature subset selection in multivariate time series classification”, Pattern Recognition, 108 (2020), 107525
- Sergio Verdú, “Empirical Estimation of Information Measures: A Literature Guide”, Entropy, 21:8 (2019), 720