This article is cited in 3 scientific papers.
IMAGE PROCESSING, PATTERN RECOGNITION
Formation of an informative index for recognizing specified objects in hyperspectral data
R. A. Paringer (a), A. V. Mukhin (a), A. V. Kupriyanov (a,b)
(a) Samara National Research University, Samara, Russia
(b) Image Processing Systems Institute of the RAS - Branch of the FSRC "Crystallography and Photonics" RAS, Samara, Russia
Abstract:
This paper develops an approach able to create rules for distinguishing between specified objects in hyperspectral data using a small number of observations. Such an approach would contribute to the development of methods and algorithms for the operational analysis of hyperspectral data; these methods can be used for hyperspectral data preprocessing and labeling. The proposed approach is implemented using a technology that harnesses both discriminative criteria and the general formulas of spectral indices. In implementing the proposed technology, the index was defined with the normalized difference formula, and its informativeness was estimated using the separability criteria of discriminant analysis. The results show that the implemented algorithm can recognize areas of hyperspectral data with different vegetation. The index formed by the algorithm is similar to the Normalized Difference Vegetation Index (NDVI). The proposed technology generalizes the approach of forming recognition rules from a small number of features. It has been shown that the technology can form informative indices for specified tasks of hyperspectral data analysis.
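The abstract describes combining the general normalized difference index formula with a discriminant-analysis separability criterion to select informative band pairs. The sketch below illustrates that idea only; the function names, the use of the Fisher ratio as the separability criterion, and the exhaustive pair search are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def normalized_difference(bi, bj):
    # General normalized-difference formula (the NDVI form): (b_i - b_j) / (b_i + b_j).
    # Small epsilon guards against division by zero.
    return (bi - bj) / (bi + bj + 1e-12)

def fisher_ratio(values, labels):
    # Assumed separability criterion: between-class variance over within-class variance.
    classes = np.unique(labels)
    overall = values.mean()
    between = sum(np.sum(labels == c) * (values[labels == c].mean() - overall) ** 2
                  for c in classes)
    within = sum(np.sum((values[labels == c] - values[labels == c].mean()) ** 2)
                 for c in classes)
    return between / (within + 1e-12)

def best_index(spectra, labels):
    # Exhaustive search over ordered band pairs for the normalized-difference
    # index with the highest class separability.
    # spectra: (n_pixels, n_bands) array; labels: (n_pixels,) class labels.
    n_bands = spectra.shape[1]
    best_score, best_pair = -np.inf, None
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            score = fisher_ratio(normalized_difference(spectra[:, i], spectra[:, j]),
                                 labels)
            if score > best_score:
                best_score, best_pair = score, (i, j)
    return best_score, best_pair
```

On synthetic two-class spectra where one class has a strong reflectance contrast between two bands (as vegetation does between red and near-infrared), the search recovers that band pair, yielding an NDVI-like index.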
Keywords:
classification, hyperspectral data, NDVI, discriminant analysis
Received: 27.05.2021. Accepted: 08.09.2021.
Citation:
R. A. Paringer, A. V. Mukhin, A. V. Kupriyanov, “Formation of an informative index for recognizing specified objects in hyperspectral data”, Computer Optics, 45:6 (2021), 873–878
Linking options:
https://www.mathnet.ru/eng/co978 https://www.mathnet.ru/eng/co/v45/i6/p873