
Neural-based spatial-spectral sensitivity correction for push-broom hyperspectral cameras
A.R. Makarov 1, A.V. Nikonorov 1

Samara National Research University,
Moskovskoye Shosse 34, Samara, 443086, Russia


DOI: 10.18287/COJ1812

Pages: 947-960.

Full text of article: Russian language.

Abstract:
A neural network-based method for trainable spatial-spectral sensitivity correction of a push-broom hyperspectral sensor is proposed. Unlike traditional calibration approaches that equalize the recorded signal from a uniformly illuminated Lambertian reference target along the slit using precomputed correction coefficients, the proposed calibration is implemented as neural network layers whose parameters are jointly optimized with the classification model during training. Three types of trainable calibration layers have been developed, based on a learnable matrix of correction coefficients, its partial polynomial approximation, and vector factorization. Experimental evaluation was conducted on hyperspectral images acquired under similar but not identical capture conditions. The calibration layers were integrated into a 3D convolutional neural network and a spatial-spectral transformer. The results demonstrate a consistent improvement in classification quality compared to baseline models without calibration: the accuracy increased by 0.59% to 14.27%, and the F1-score increased by 0.17% to 10.98%, which confirms the effectiveness of the proposed layers.
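The simplest of the calibration variants described above can be illustrated with a brief sketch. This is not the authors' implementation; it is a minimal NumPy illustration under our own assumptions about layout: a push-broom frame is a matrix of shape (slit position, spectral band), the full calibration layer multiplies it elementwise by a learnable coefficient matrix, and the vector-factorized variant approximates that matrix as an outer product of a spatial profile and a spectral profile, cutting the parameter count from S·W to S+W.

```python
import numpy as np

rng = np.random.default_rng(0)
S, W = 64, 100                 # slit positions x spectral bands (illustrative sizes)
frame = rng.random((S, W))     # one push-broom frame

# Variant 1: full correction matrix, one coefficient per (position, band) pair.
# In a trainable setting this would be a learnable parameter, initialized to ones
# (identity correction) and optimized jointly with the classifier.
C_full = np.ones((S, W))
corrected_full = frame * C_full

# Variant 3: vector factorization, C ~= outer(a, b) with S + W parameters.
a = np.ones(S)                 # along-slit (spatial) sensitivity profile
b = np.ones(W)                 # spectral sensitivity profile
C_fact = np.outer(a, b)
corrected_fact = frame * C_fact

# At identity initialization both variants leave the frame unchanged.
assert np.allclose(corrected_full, corrected_fact)
```

In a deep learning framework, `C_full` (or `a` and `b`) would be registered as trainable parameters so that gradients from the classification loss flow into the calibration coefficients; the polynomial variant would analogously parameterize each coefficient by a low-order polynomial in the slit coordinate.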

Keywords:
hyperspectrometer, parametric correction layers, deep learning, classification, position-dependent hyperspectrometer sensitivity correction.

Citation:
Makarov AR, Nikonorov AV. Neural-based spatial-spectral sensitivity correction for push-broom hyperspectral cameras. Computer Optics 2025; 49(6): 947-960. DOI: 10.18287/COJ1812.

Acknowledgements:
This work was supported by the Ministry of Science and Higher Education of the Russian Federation, grant No. 075-15-2025-610.

References:

  1. Lu B, Dao PD, Liu J, He Y, Shang J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens 2020; 12(16): 2659. DOI: 10.3390/rs12162659.
  2. Bhargava A, Ijaz MF, Do DT. Hyperspectral imaging and its applications: A review. Heliyon 2024; 10(11): e33208. DOI: 10.1016/j.heliyon.2024.e33208.
  3. Yu K. A critical review on applications of hyperspectral remote sensing in crop monitoring. Exp Agric 2022; 58(5): 607-628. DOI: 10.1017/S0014479722000278.
  4. Stuart MB, McGonigle AJS, Willmott JR. Hyperspectral imaging in environmental monitoring: A review of recent developments and technological advances in compact field deployable systems. Sensors 2019; 19(14): 3071. DOI: 10.3390/s19143071.
  5. Alanazi H, Almotiri S, Alqahtani H, Alharthi A, Alsubaie N. A hybrid graph–spatial spectral transformer framework for hyperspectral image analysis. J Phys Conf Ser 2024; 2906(1): 012025. DOI: 10.1088/1742-6596/2906/1/012025.
  6. Gewali UB, Monteiro ST, Saber E. Machine learning based hyperspectral image analysis: A survey. arXiv:1802.08701 [cs.CV]; 2018. Source: <https://arxiv.org/abs/1802.08701>.
  7. Jia B, Wang W, Ni X, Lawrence KC, Zhuang H, Yoon S-C, Gao Z. Essential processing methods of hyperspectral images of agricultural and food products. Chemom Intell Lab Syst 2020; 198: 103936. DOI: 10.1016/j.chemolab.2020.103936.
  8. Al-Hourani A, Balendhran S, Walia S, Hourani T. Line scan hyperspectral imaging framework for open source low-cost platforms. Remote Sens 2023; 15(11): 2787. DOI: 10.3390/rs15112787.
  9. Liu X, Jiang Z, Wang T, Cai F, Wang D. Fast hyperspectral imager driven by a low-cost and compact galvo-mirror. Opt Laser Technol 2021; 140: 106987. DOI: 10.1016/j.ijleo.2020.165716.
  10. Gao L, Smith RT. Optical hyperspectral imaging in microscopy and spectroscopy – a review of data acquisition. J Biophotonics 2015; 8(6): 441-456. DOI: 10.1002/jbio.201400051.
  11. Høye G, Løke T, Fridman A. Method for quantifying image quality in push-broom hyperspectral cameras. Opt Eng 2015; 54(5): 053102. DOI: 10.1117/1.OE.54.5.053102.
  12. Jablonski J, Durell C, Slonecker ET, Wong KKA, Simon B, Eichelberger A, Osterberg J. Best practices in passive remote sensing VNIR hyperspectral system hardware calibrations. Proc SPIE 2016; 9860: 98600D. DOI: 10.1117/12.2223012.
  13. Leung MCH, Chen S, Jurgenson C. Accurately measuring hyperspectral imaging distortion in grating spectrographs using a clustering algorithm. Proc SPIE 2022; 12188: 121883W. DOI: 10.1117/12.2630442.
  14. Yokoya N, Miyamura N, Iwasaki A. Preprocessing of hyperspectral imagery with consideration of smile and keystone properties. In: Larar AM, Chung H-S, Suzuki M, eds. Multispectral, Hyperspectral, and Ultraspectral Remote Sensing Technology, Techniques, and Applications III. Proc SPIE 2010; 7857: 73-81. DOI: 10.1117/12.870437.
  15. Bakker W, van der Werff HMA, van der Meer FD. Determining smile and keystone of hyperspectral lab cameras. In: Proc 10th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS) 2019: 1-5. DOI: 10.1109/WHISPERS.2019.8921045.
  16. Riihiaho KA, Eskelinen MA, Pölönen I. A do-it-yourself hyperspectral imager brought to practice with open-source Python. Sensors 2021; 21(4): 1072. DOI: 10.3390/s21041072.
  17. Morales A, Horstrand P, Guerra R, Leon R, Ortega S, Díaz M, Melián JM, López S, López JF, Callico GM, Martel E, Sarmiento R. Laboratory hyperspectral image acquisition system setup and validation. Sensors 2022; 22(6): 2159. DOI: 10.3390/s22062159.
  18. Specim, Spectral Imaging Ltd. FX17 Reference Manual. 2016. Source: <https://ftp.stemmer-imaging.com/webdavs/docmanager/152613-Specim-FX17-Reference-Manual.pdf>.
  19. Aasen H, Kirchgessner N, Walter A, et al. Specim IQ: Evaluation of a new, miniaturized handheld hyperspectral camera and its application for plant phenotyping and disease detection. Sensors 2018; 18(2): 441. DOI: 10.3390/s18020441.
  20. Zhang X, Liu F, He Y, Li X. Application of hyperspectral imaging and chemometric calibrations for variety discrimination of maize seeds. Sensors 2012; 12(12): 17234-17246. DOI: 10.3390/s121217234.
  21. Hu X, Ma Y, Zhao J, Li W, Sun L, Zhang Y. A novel scene-based non-uniformity correction method for short-wavelength infrared push-broom hyperspectral sensors. ISPRS J Photogramm Remote Sens 2017; 131: 160-169. DOI: 10.1016/j.isprsjprs.2017.08.006.
  22. Shaikh MS, Jaferzadeh K, Thörnberg B, Casselgren J. Calibration of a hyper-spectral imaging system using a low-cost reference. Sensors 2021; 21(11): 3738. DOI: 10.3390/s21113738.
  23. Kosec M, Bürmen M, Tomaževič D, Pernuš F, Likar B. Characterization of a spectrograph-based hyperspectral imaging system. Opt Express 2013; 21(10): 12085-12099. DOI: 10.1364/OE.21.012085.
  24. Høye G, Fridman A. Spatial misregistration in hyperspectral cameras: lab characterization and impact on data quality in real-world images. Opt Eng 2020; 59(8): 084103. DOI: 10.1117/1.OE.59.8.084103.
  25. Wang A, Wang Y, Chen Y. Hyperspectral image classification based on convolutional neural network and random forest. Remote Sens Lett 2019; 10(11): 1086-1094. DOI: 10.1080/2150704X.2019.1649736.
  26. Jablonski J, Durell C, Slonecker ET, Wong KKA, Simon B, Eichelberger A, Osterberg J. Best practices in passive remote sensing VNIR hyperspectral system hardware calibrations. Proc SPIE 2016; 9860: 98600D. DOI: 10.1117/12.2224022.
  27. Aasen H, Honkavaara E, Lucieer A, Zarco-Tejada PJ. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens 2018; 10(7): 1091. DOI: 10.3390/rs10071091.
  28. Gómez-Chova L, Alonso L, Guanter L, Camps-Valls G, Calpe J, Moreno J. Correction of systematic spatial noise in push-broom hyperspectral sensors: Application to CHRIS/PROBA images. Appl Opt 2008; 47(28): F46-F60. DOI: 10.1364/AO.47.000F46.
  29. Markelin L, Honkavaara E, Takala T, Pellikka P. Calibration and validation of hyperspectral imagery using a permanent test field. In: Proc 5th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS) 2013: 1-4. DOI: 10.1109/WHISPERS.2013.8080708.
  30. Noël S, Bramstedt K, Bovensmann H, Gerilowski K, Burrows JP, Standfuss C, Dufour E, Veihelmann B. Quantification and mitigation of the impact of scene inhomogeneity on Sentinel-4 UVN UV-VIS retrievals. Atmos Meas Tech 2012; 5(6): 1319-1331. DOI: 10.5194/amt-5-1319-2012.
  31. Firsov N, Podlipnov V, Ivliev N, Nikolaev P, Mashkov S, Ishkin P, Skidanov R, Nikonorov A. Neural network–aided classification of hyperspectral vegetation images with a training sample generated using an adaptive vegetation index. Computer Optics 2021; 45(6): 887-896. DOI: 10.18287/2412-6179-CO-890.
  32. Fırat H, Asker ME, Hanbay D. Classification of hyperspectral remote sensing images using different dimension reduction methods with 3D/2D CNN. Remote Sens Appl Soc Environ 2022; 25: 100694. DOI: 10.1016/j.rsase.2022.100694.
  33. Zhao J, Hu L, Huang L, Wang C, Liang D. MSRA-G: Combination of multi-scale residual attention network and generative adversarial networks for hyperspectral image classification. Eng Appl Artif Intell 2023; 121: 106017. DOI: 10.1016/j.engappai.2023.106017.
  34. Hong D, Han Z, Yao J, Gao L, Zhang B, Plaza A, Chanussot J. SpectralFormer: Rethinking hyperspectral image classification with transformers. IEEE Trans Geosci Remote Sens 2022; 60: 5518615. DOI: 10.1109/TGRS.2021.3130716.
  35. Sun L, Zhao G, Zheng Y, Wu Z. Spectral-spatial feature tokenization transformer for hyperspectral image classification. IEEE Trans Geosci Remote Sens 2022; 60: 5522214. DOI: 10.1109/TGRS.2022.3144158
  36. Makarov A, Mirpulatov I, Firsov N, Lobanov V, Illarionova S, Podlipnov V, Vybornova Y, Shadrin D, Rastorguev A, Skidanov R, Burnaev E, Nikonorov A. Deep spectral-spatial transformer for robust hyperspectral image segmentation in varying field conditions. IEEE Access 2025; 13: 97453-97467. DOI: 10.1109/ACCESS.2025.3575699.
  37. Shajkofci A, Liebling M. Spatially-variant CNN-based point spread function estimation for blind deconvolution and depth estimation in optical microscopy. IEEE Trans Image Process 2020; 29: 5848-5861. DOI: 10.1109/TIP.2020.2986880
  38. Xu J, Yin Q, Guo P, Zheng X. Two-dimensional multi-fiber spectrum image correction based on machine learning techniques. arXiv preprint arXiv:2002.06600 [astro-ph.IM]; 2020. Source: <https://doi.org/10.48550/arXiv.2002.06600>.
  39. Gashnikov MV, Soifer VA, eds. Promising information technologies for Earth remote sensing [In Russian]. Samara: Samara State Aerospace University named after acad. S.P. Korolev (National Research University); 2015. 255 p. ISBN 978-5-88940-138-4.
  40. Ligan B, Jbilou K, Kalloubi F, Ratnani A. Parameter-efficient fine-tuning of multispectral foundation models for hyperspectral image classification. arXiv preprint arXiv:2505.15334 [cs.CV]; 2025. Source: <https://arxiv.org/abs/2505.15334>. DOI: 10.48550/arXiv.2505.15334.
  41. Zidi FAA, Bouchari JE, Sellam AZ, Wafi A, Distante C, Bekhouche SE, Taleb-Ahmed A. LoLA-SpecViT: Local attention SwiGLU Vision Transformer with LoRA for hyperspectral visualization. arXiv preprint arXiv:2506.17759 [cs.CV]; 2025. Source: <https://arxiv.org/abs/2506.17759>. DOI: 10.48550/arXiv.2506.17759.
  42. Duan Y, Wang N, Zhang Y, Song C. Tensor-based sparse representation for hyperspectral image reconstruction using RGB inputs. Mathematics 2024; 12(5): 708. DOI: 10.3390/math12050708.
  43. Ren Y, Liu L, Yang H, Zhang J, Yang X. Hyperspectral image spectral-spatial feature extraction via tensor principal component analysis. arXiv preprint arXiv:2412.06075 [cs.CV]; 2024. Source: <https://arxiv.org/abs/2412.06075>.
  44. Ozdemir A, Iwen MA, Aviyente S. A multiscale approach for tensor denoising. In: Proc IEEE Statistical Signal Processing Workshop (SSP) 2016: 1-5. Source: <https://users.math.msu.edu/users/iwenmark/Papers/SSP16_alp.pdf>.
  45. Rastorguev AA, Kharitonov SI, Kazanskiy NL. Modeling of image formation with a space-borne Offner hyperspectrometer. Computer Optics 2020; 44(1): 12-21. DOI: 10.18287/2412-6179-CO-644.
  46. Wanajaroen W, Lepine T, Chartsiriwattana P, Wannawichian S, Rujopakarn W, Poshyachinda S, Soonthornthum B. TSC-1 Offner spectrometer prototype characterization. Photonics 2024; 11(7): 644. DOI: 10.3390/photonics11070644.
  47. Mouroulis P, Green RO, Chrien TG. Design of pushbroom imaging spectrometers for optimum recovery of spectroscopic and spatial information. Appl Opt 2000; 39(13): 2210-2220. DOI: 10.1364/AO.39.002210.
  48. Stergar J, Ravbar M, Pernuš F, Likar B, Majaron H. Design of a laboratory setup for the calibration of hyperspectral imaging systems. Sensors 2022; 22(18): 6811. DOI: 10.3390/s22186811.
  49. Shaikh S, Lohumi S, Lee H, Mo C, Cho B-K. Performance evaluation of a low-cost diffuse reflectance standard for hyperspectral imaging system calibration. Sensors 2021; 21(2): 697.
  50. Gaidel AV, Podlipnov VV, Ivliev NA, Paringer RA, Ishkin PA, Mashkov SV, Skidanov RV. Agricultural plant hyperspectral imaging dataset. Computer Optics 2023; 47(3): 442-450. DOI: 10.18287/2412-6179-CO-1226.
  51. Benelli A, Cevoli C, Fabbri A. In-field hyperspectral imaging: An overview on the ground-based applications in agriculture. J Agric Eng 2020; 51(3): 129-139. DOI: 10.4081/jae.2020.1030.
  52. Galeazzi C, Carpentiero R, De Cosmo V, Garramone L, Longo F, Lopinto E, Varacalli G. The PRISMA system and PAN/HYP instrument. In: Proc 4th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS) 2009: 10. Source: <https://old.earsel.org/workshops/IS_Tel-Aviv_2009/PDF/earsel-PROCEEDINGS/3020%20Galeazzi.pdf>.
  53. Nansen C, Mishra S, Prabhakar M, Roberts S, Andow D, Paris J. Radiometric calibration and repeatability assessment of hyperspectral imaging from UAV platforms. Front Plant Sci 2023; 14: 1051410. DOI: 10.3389/fpls.2023.1051410.
  54. Suomalainen J, Näsi R, Hakala T, Viljanen N, Peltoniemi JI, Kaasalainen S. Direct reflectance transformation methodology for drone-based hyperspectral imaging. Remote Sens Environ 2021; 266: 112691. DOI: 10.1016/j.rse.2021.112691.
  55. Shen S, Yao Z, Gholami A, Mahoney MW, Keutzer K. PowerNorm: Rethinking batch normalization in transformers. In: Proc Int Conf on Machine Learning (ICML); 2020. PMLR. arXiv preprint arXiv:2003.07845. Source: <https://arxiv.org/abs/2003.07845>. DOI: 10.48550/arXiv.2003.07845.
  56. Firsov NA, Podlipnov VV, Ivliev NA, Ryskova DD, Pirogov AV, Muzyka AA, Makarov AR, Lobanov VE, Platonov VI, Babichev AN, Monastyrskiy VA, Olgarenko VI, Nikolaev DP, Skidanov RV, Nikonorov AV, Kazanskiy NL, Soyfer VA. Ensembles of spectral-spatial convolutional neural network models for classifying soil types in hyperspectral images. Computer Optics 2023; 47(5): 795-805. DOI: 10.18287/2412-6179-CO-1260.
  57. Chai T, Draxler RR. Root mean square error (RMSE) or mean absolute error (MAE)? Arguments against avoiding RMSE in the literature. Geosci Model Dev 2014; 7(3): 1247-1250. DOI: 10.5194/gmd-7-1247-2014.

© 2009, IPSI RAS
151, Molodogvardeiskaya str., Samara, 443001, Russia; E-mail: journal@computeroptics.ru ; Tel: +7 (846) 242-41-24 (Executive secretary), +7 (846) 332-56-22 (Issuing editor), Fax: +7 (846) 332-56-20