Programming & Computer Software
Maximal coordinate discrepancy as accuracy criterion of image projective normalization for optical recognition of documents
I. A. Konovalenko^{a,b}, V. V. Kokhan^{a,b}, D. P. Nikolaev^{a,b}
^a Institute for Information Transmission Problems of the RAS, Moscow
^b Smart Engines Service LLC, Moscow, Russian Federation
Abstract:
Projective normalization (a special case of orthocorrection and perspective correction) of document photographs prior to their optical recognition is a generally accepted practice. Inaccurate normalization, however, can lead to recognition errors. A number of normalization accuracy criteria have been proposed in the literature to date, but their relation to recognition quality has not been investigated. In this paper, for the case of a fixed structured document, we justify a uniform probabilistic model of recognition errors, according to which the probability of correct recognition of a character drops abruptly to zero as the coordinate discrepancy of that character increases. For this model, we prove that the image normalization accuracy criterion equal to the maximal coordinate discrepancy over the text fields of a document is monotonically related to the probability of correct recognition of the entire document. We also show that the problem of computing the maximal coordinate discrepancy does not reduce to the closest known problem, linear-fractional programming. Finally, we obtain, for the first time, an analytical solution for the maximal coordinate discrepancy over a union of polygons.
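To illustrate the criterion discussed in the abstract, the following is a hypothetical numerical sketch, not the paper's analytical solution: it approximates the maximal coordinate discrepancy between an estimated and an ideal homography by densely sampling the boundaries of the text-field polygons. The function names and the boundary-sampling heuristic are assumptions of this sketch; the true maximum may lie in a polygon's interior, which the paper's analytical method accounts for.

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of points."""
    ph = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]                    # perspective divide

def max_coordinate_discrepancy(H_est, H_true, polygons, samples_per_edge=50):
    """Approximate max over a union of polygons of ||H_est(x) - H_true(x)||
    by densely sampling each polygon's edges (heuristic, not exact)."""
    worst = 0.0
    for poly in polygons:
        poly = np.asarray(poly, dtype=float)
        pts = []
        for i in range(len(poly)):
            a, b = poly[i], poly[(i + 1) % len(poly)]
            t = np.linspace(0.0, 1.0, samples_per_edge, endpoint=False)[:, None]
            pts.append(a + t * (b - a))            # points along edge a->b
        pts = np.vstack(pts)
        d = np.linalg.norm(apply_homography(H_est, pts)
                           - apply_homography(H_true, pts), axis=1)
        worst = max(worst, float(d.max()))
    return worst
```

For a pure unit translation relative to the ideal homography, the discrepancy is 1 everywhere, so the sampled maximum equals 1 regardless of the polygon.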
Keywords:
orthocorrection, perspective correction, image projective normalization, optical character recognition, accuracy criteria, coordinate discrepancy, nonlinear programming.
Received: 21.11.2019
Citation:
I. A. Konovalenko, V. V. Kokhan, D. P. Nikolaev, “Maximal coordinate discrepancy as accuracy criterion of image projective normalization for optical recognition of documents”, Vestnik YuUrGU. Ser. Mat. Model. Progr., 13:3 (2020), 43–58
Linking options:
https://www.mathnet.ru/eng/vyuru556
https://www.mathnet.ru/eng/vyuru/v13/i3/p43