Information-theoretic bounds for accuracy of letter encoding and pattern recognition via ensembles of datasets
Authors: Lange M.M., Lange A.M.
Journal: Computer Optics
Section: Numerical methods and data analysis
Article in issue: Vol. 48, No. 3, 2024.
Open access
In this paper, we study stochastic models for discrete letter encoding and object classification via ensembles of datasets of different modalities. For these models, the minimal values of the average mutual information between a given ensemble of datasets and the corresponding set of possible decisions are constructed as monotonically decreasing functions of a given admissible error probability. We present examples of such functions constructed for a scheme of coding independent letters represented by pairs of observation values with possible errors, as well as for a scheme of classifying composite objects given by pairs of face and signature images. Inverting the obtained functions yields lower bounds on the error probability for any amount of processed information. Thus, these functions can be regarded as bifactor fidelity criteria for source coding and object classification decisions. Moreover, the obtained functions are similar to the rate distortion function known in information theory.
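The abstract compares the obtained bounds to the classical rate distortion function. As a hedged illustration of that analogy (not the paper's own construction), the sketch below computes the well-known rate distortion function R(D) = h(p) - h(D) of a Bernoulli(p) source under Hamming distortion, and numerically inverts it to get the minimal achievable error probability for a given information rate, mirroring how the paper's functions are inverted into error-probability lower bounds.

```python
import math

def h2(x):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion(p, d):
    """R(D) = h2(p) - h2(D) for a Bernoulli(p) source with
    Hamming distortion, valid for 0 <= D < min(p, 1 - p)."""
    if d >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(d)

def min_error(p, rate, tol=1e-10):
    """Invert R(D) by bisection: the smallest achievable error
    probability when at most `rate` bits per letter are available.
    R(D) is decreasing in D, so we search on [0, min(p, 1 - p)]."""
    if rate >= h2(p):
        return 0.0  # enough rate for (near-)lossless coding
    lo, hi = 0.0, min(p, 1 - p)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if rate_distortion(p, mid) > rate:
            lo = mid  # still need more distortion to reach this rate
        else:
            hi = mid
    return (lo + hi) / 2
```

For a uniform binary source (p = 0.5), `rate_distortion(0.5, 0.0)` equals 1 bit, and `min_error(0.5, 0.5)` returns roughly 0.11: halving the rate forces an error probability of about 11%, which is the same "error probability as a function of information amount" trade-off the paper generalizes to ensembles of datasets.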
Source coding, ensemble of datasets, entropy, object classification, error probability, mutual information, rate distortion function
Короткий адрес: https://sciup.org/140308614
IDR: 140308614 | DOI: 10.18287/2412-6179-co-1362