Vegetation type recognition in hyperspectral images using a conjugacy indicator
Bibikov S.A., Kazanskiy N.L., Fursov V.A.
  
IPSI RAS – Branch of the FSRC “Crystallography and Photonics” RAS, Molodogvardeyskaya 151, 443001, Samara, Russia
Samara National Research University, 34, Moskovskoye shosse, 443086, Samara, Russia
Abstract:
This paper considers a vegetation type recognition algorithm in which the conjugacy indicator with a subspace spanned by endmember vectors is taken as a proximity measure. We show that with proper data preprocessing, including weighting of vector components and partitioning of classes into subclasses, the proposed method offers higher recognition quality than a support vector machine (SVM) method implemented in MATLAB. That SVM implementation itself performs well on a fairly difficult classification test using the Indian Pines dataset, which comprises 16 classes containing similar vegetation types; the difficulty of the test is caused by high correlation between the classes. The results thus demonstrate the possibility of recognizing a large variety of vegetation types, including narcotic plants.
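For intuition, the sketch below illustrates a projection-based conjugacy measure of the kind described in the abstract. It assumes the indicator is the squared norm of the orthogonal projection of a pixel's spectral vector onto the subspace spanned by a class's endmember vectors, normalized by the squared norm of the spectrum; the function names and the NumPy-based implementation are illustrative assumptions, not the authors' code.

import numpy as np

def conjugacy_indicator(x, E):
    # Proximity of a nonzero pixel spectrum x (1-D array of band values)
    # to the subspace spanned by the columns of E (endmember vectors of
    # one class or subclass). Returns a value in [0, 1]; 1 means x lies
    # entirely in the subspace.
    Q, _ = np.linalg.qr(E)        # orthonormal basis of the endmember subspace
    p = Q @ (Q.T @ x)             # orthogonal projection of x onto the subspace
    return float(p @ p) / float(x @ x)

def classify_pixel(x, class_subspaces):
    # Assign x to the class (or subclass) whose endmember subspace
    # yields the largest conjugacy indicator.
    scores = [conjugacy_indicator(x, E) for E in class_subspaces]
    return int(np.argmax(scores))

Under this reading, the preprocessing mentioned in the abstract fits in naturally: weighting vector components amounts to multiplying x and the columns of each E componentwise by a weight vector before the projection, and partitioning a class into subclasses amounts to keeping several endmember subspaces per class and taking the best-scoring one.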
Keywords:
hyperspectral images, thematic classification, support vector machine, conjugacy indicator.
Citation:
Bibikov SA, Kazanskiy NL, Fursov VA. Vegetation type recognition in hyperspectral images using a conjugacy indicator. Computer Optics 2018; 42(5): 846-854. DOI: 10.18287/2412-6179-2018-42-5-846-854.
  
  