An overview on linear hyperspectral unmixing
Author: YUAN Jing, ZHANG Yu-Jin, GAO Fang-Ping
Affiliation: Tsinghua University

Abstract:

Hyperspectral remote sensing technology contributes significantly to Earth observation. In hyperspectral images (HSIs), the spectral vector of each pixel contains hundreds or even thousands of elements, providing rich spectral information with which to identify and distinguish different types of land cover. However, owing to limited spatial resolution and the complexity of surface features, mixed pixels are common in HSIs. The presence of numerous mixed pixels conflicts with the demand for accurate recognition and interpretation of the material properties of each pixel. Hyperspectral unmixing (HU), which decomposes mixed pixels into a set of constituent materials called “endmembers” and the corresponding mixture coefficients referred to as “abundances”, was developed to alleviate the mixed-pixel problem. Linear unmixing, as the basis of HU, has received wide attention owing to its physical interpretability and mathematical tractability. Many mathematical models have been proposed to handle linear hyperspectral unmixing (LHU); however, observation noise, environmental conditions, endmember variability, and dataset size pose substantial challenges to this ill-posed inverse problem. This paper surveys the literature of the past five years from four aspects: matrix decomposition, archetypal analysis, Bayesian methods, and sparse regression, to present the state-of-the-art models and open problems of linear unmixing.
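As background for the linear mixing model underlying LHU, the sketch below illustrates the basic formulation Y ≈ EA, where the columns of E are endmember spectra and the columns of A are per-pixel abundances constrained to be nonnegative. This is a minimal, illustrative example and not a method from the surveyed literature; the dimensions, the synthetic data, and the use of SciPy's nonnegative least squares for abundance estimation are assumptions for illustration (the sum-to-one constraint is not enforced here).

import numpy as np
from scipy.optimize import nnls

# Illustrative sizes (assumptions): L spectral bands, P endmembers, N pixels.
L, P, N = 200, 4, 1000
rng = np.random.default_rng(0)

# Synthetic endmember matrix E (columns are endmember spectra) and
# ground-truth abundances A_true (nonnegative, summing to one per pixel).
E = rng.random((L, P))
A_true = rng.dirichlet(np.ones(P), size=N).T

# Observed mixed pixels under the linear mixing model: Y = E A + noise.
Y = E @ A_true + 0.01 * rng.standard_normal((L, N))

# Abundance estimation by nonnegative least squares, pixel by pixel.
# The sum-to-one constraint is omitted; fully constrained least squares adds it.
A_est = np.column_stack([nnls(E, Y[:, n])[0] for n in range(N)])

print("mean absolute abundance error:", float(np.abs(A_est - A_true).mean()))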

Get Citation

YUAN Jing, ZHANG Yu-Jin, GAO Fang-Ping. An overview on linear hyperspectral unmixing[J]. Journal of Infrared and Millimeter Waves, 2018, 37(5): 553-571.

History
  • Received: January 05, 2018
  • Revised: March 30, 2018
  • Accepted: April 03, 2018
  • Online: October 31, 2018