Algorithms for multi-frame image super-resolution under applicative noise based on deep neural networks
Authors: Savvin Sergey Viktorovich, Sirota Alexander Anatolievich
Journal: Компьютерная оптика (Computer Optics) @computer-optics
Section: Image processing, pattern recognition
Published in: Vol. 46, No. 1, 2022.
The article describes algorithms for multi-frame image super-resolution, which recover a high-resolution image from a sequence of low-resolution images of the same scene under applicative noise. Applicative noise generates local regions of outlying observations in each image and degrades the image resolution; so far, little attention has been paid to this problem. At the same time, deep neural networks are considered a promising tool for image processing, including multi-frame super-resolution. The article reviews existing solutions to the problem and proposes a new approach based on several pre-trained convolutional neural networks and directed acyclic graph neural networks trained by the authors. The developed approach and the algorithms based on it involve iterative processing of the input sequence of low-resolution images, using different neural networks at different processing stages. The stages include registration of the low-resolution images, their segmentation to locate regions damaged by applicative noise, and transformation to increase the resolution. The approach combines the strengths of the existing solutions while avoiding the drawbacks that result from the approximate mathematical data models required to synthesize image processing algorithms within the framework of statistical decision theory. Experimental studies demonstrated that the proposed algorithm is fully functional and recovers high-resolution images more accurately than the existing analogues.
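To make the staged pipeline concrete, below is a minimal, hypothetical sketch of the processing loop described in the abstract: registration of the low-resolution frames, segmentation of regions damaged by applicative noise, and a resolution-increasing transformation. It does not reproduce the authors' networks; the identity registration, the median-based outlier mask, the 0.2 threshold, and the bicubic upsampling are placeholder assumptions standing in for the trained neural networks at each stage.

```python
import torch
import torch.nn.functional as F

def register(frames):
    # Placeholder: the paper performs registration with a neural network;
    # here the frames are returned unchanged (identity registration).
    return frames

def segment_outliers(frames):
    # Placeholder segmentation: flag pixels that deviate strongly from the
    # per-pixel median across frames as damaged by applicative noise.
    median = frames.median(dim=0).values
    damaged = (frames - median).abs() > 0.2   # hypothetical threshold
    return (~damaged).float()                 # 1 = valid pixel, 0 = damaged

def super_resolve(frames, masks, scale=2):
    # Weighted fusion of the registered frames, ignoring damaged regions,
    # followed by upsampling; a trained CNN would replace this step.
    fused = (frames * masks).sum(dim=0) / masks.sum(dim=0).clamp(min=1.0)
    return F.interpolate(fused.unsqueeze(0), scale_factor=scale,
                         mode="bicubic", align_corners=False)

def multi_frame_sr(frames, n_iter=3, scale=2):
    # Iteratively refine registration and the applicative-noise masks,
    # then produce the high-resolution estimate.
    for _ in range(n_iter):
        frames = register(frames)
        masks = segment_outliers(frames)
    return super_resolve(frames, masks, scale)

if __name__ == "__main__":
    lr_frames = torch.rand(5, 1, 64, 64)   # five low-resolution grayscale frames
    hr = multi_frame_sr(lr_frames)
    print(hr.shape)                        # torch.Size([1, 1, 128, 128])
```

In the authors' approach each of these stages is realized by a separate (pre-trained or purpose-trained) network rather than by the hand-crafted operations used in this sketch.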
Keywords: digital image processing, multi-frame super-resolution, convolutional neural networks, deep learning, applicative noise
Short URL: https://sciup.org/140290695
IDR: 140290695 | DOI: 10.18287/2412-6179-CO-904