Toward image super-resolution based on local regression and nonlocal means


Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 81, no. 16, pp. 23473–23492
Main Authors: Bastanfard, Azam; Amirkhani, Dariush; Mohammadi, Mohammad
Format: Journal Article
Language: English
Published: New York: Springer US, 01.07.2022 (Springer Nature B.V.)
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-12584-x

Summary: Super-resolution algorithms often suffer from low output resolution and quality, weaknesses that stem from the limitations and high cost of hardware implementation. Software methods are therefore used for their high accuracy and low cost. Super-resolution is applied in a variety of fields, including satellite imagery, medicine, and CCTV cameras. It falls into two main categories: super-resolution from a single input image and super-resolution from multiple input images. Given the subject's wide applicability, this paper provides a method for increasing resolution from a single input image. The proposed method improves image super-resolution based on local regression and nonlocal means, and consists of two stages: a learning phase and a reconstruction phase. After the initial implementation, the results were compared with a quick and accurate method and the differences were found; using a genetic algorithm, these differences were then applied to the initial image. Results showed that the proposed method performed 2.4% and 2.2% better than the RAISR and LANR-NLM methods, respectively.
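The nonlocal-means component named in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation (the record does not give its details); it is the standard nonlocal-means estimate for a single pixel, where the pixel value is replaced by a weighted average of pixels whose surrounding patches are similar to its own, with weights decaying exponentially in patch distance. The function name and parameter values are illustrative assumptions.

```python
import numpy as np

def nonlocal_means_pixel(img, i, j, patch=3, search=7, h=10.0):
    """Estimate pixel (i, j) of a 2-D grayscale image as a weighted
    average over a search window, weighting each candidate pixel by
    the similarity of its surrounding patch to the patch at (i, j).

    patch  : odd patch side length used for similarity comparison
    search : half-width of the square search window
    h      : filtering strength (larger -> smoother result)
    """
    r = patch // 2
    # Reflect-pad so every patch and search offset stays in bounds.
    pad = np.pad(img.astype(float), r + search, mode="reflect")
    ci, cj = i + r + search, j + r + search  # center in padded coords
    ref = pad[ci - r:ci + r + 1, cj - r:cj + r + 1]

    num = den = 0.0
    for di in range(-search, search + 1):
        for dj in range(-search, search + 1):
            ni, nj = ci + di, cj + dj
            cand = pad[ni - r:ni + r + 1, nj - r:nj + r + 1]
            # Weight falls off with squared patch distance.
            w = np.exp(-np.sum((ref - cand) ** 2) / (h * h))
            num += w * pad[ni, nj]
            den += w
    return num / den
```

In practice the whole image is filtered this way (vectorized, or via a library routine such as scikit-image's `restoration.denoise_nl_means`); the per-pixel form above only shows the weighting scheme that "nonlocal means" refers to.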