VHCFormer: Vignetting Removal Based on Hybrid Channel Transformer
| Published in | International Conference on Control, Automation and Robotics: Proceedings, pp. 452 - 457 |
|---|---|
| Main Authors | , , , , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 18.04.2025 |
| ISSN | 2251-2454 |
| DOI | 10.1109/ICCAR64901.2025.11072940 |
| Summary: | Vignetting is a common phenomenon in photography and imaging, characterized by a brightness degradation from the center toward the edges. It can be caused by optical, mechanical, and natural factors. Currently, most methods for vignetting removal are traditional approaches; their inefficiency, stemming from single-image inputs and the manual adjustment of numerous parameters, hinders effective vignetting removal. To address this issue, we propose VHCFormer, a neural network designed to remove vignetting through multi-dimensional analysis of channel, spatial, and pixel-level features. VHCFormer consists of the Hybrid Spatial-Channel Transformer and the Pixel Channel Fusion Transformer. The Pixel Channel Fusion Transformer leverages a sliding-window mechanism to handle edge information in the image for vignetting detection. The Hybrid Spatial-Channel Transformer captures global information at the pixel level and adaptively adjusts the weights, particularly in high-channel scenarios. Quantitative and qualitative experimental results validate that our proposed network outperforms state-of-the-art approaches in terms of vignetting removal. |
|---|---|
| ISSN: | 2251-2454 | 
| DOI: | 10.1109/ICCAR64901.2025.11072940 |
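
The summary mentions two generic mechanisms: adaptive channel reweighting and a sliding-window scheme for local/edge information. The sketch below is only an illustration of those two ideas, not the authors' implementation; the class names `ChannelWeighting` and `WindowSelfAttention`, the window size, and the head count are all assumptions, and the window attention here uses non-overlapping windows as one possible reading of "sliding window".

```python
# Minimal PyTorch sketch of the two mechanisms named in the abstract.
# All names and hyperparameters are hypothetical, not from the paper.
import torch
import torch.nn as nn


class ChannelWeighting(nn.Module):
    """Adaptive per-channel reweighting (squeeze-and-excitation style)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # global spatial average per channel
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                          # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)                      # rescale channels adaptively


class WindowSelfAttention(nn.Module):
    """Self-attention restricted to local windows, one reading of a
    'sliding window mechanism' for handling edge information."""

    def __init__(self, channels: int, window: int = 8, heads: int = 4):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        ws = self.window
        # partition the feature map into (ws x ws) windows
        x = x.view(b, c, h // ws, ws, w // ws, ws)
        x = x.permute(0, 2, 4, 3, 5, 1).reshape(-1, ws * ws, c)
        x, _ = self.attn(x, x, x)                  # attention within each window
        # merge the windows back to the original layout
        x = x.view(b, h // ws, w // ws, ws, ws, c)
        x = x.permute(0, 5, 1, 3, 2, 4).reshape(b, c, h, w)
        return x


if __name__ == "__main__":
    feat = torch.randn(1, 32, 64, 64)              # toy feature map; H, W divisible by window
    out = WindowSelfAttention(32)(ChannelWeighting(32)(feat))
    print(out.shape)                               # torch.Size([1, 32, 64, 64])
```

In this sketch the channel weights are computed from a global spatial average, while the attention step only mixes information inside each local window; how (or whether) VHCFormer combines these operations at the pixel level is described only at a high level in the abstract above.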