Radiometric Normalization

Remote sensing applications often use image time series or mosaics assembled from multiple images or image parts. Radiometric normalization generally refers to empirically reducing the radiometric differences between images in a time series or mosaic that arise from differing acquisition times or dates. These differences can degrade the accuracy of image interpretation. In satellite-based optical remote sensing, their causes include differing atmospheric conditions, target illumination, sensor calibration, and surface phenology. Radiometric normalization is applied both to images that have undergone atmospheric correction and to those that have not.
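A common empirical approach is relative radiometric normalization, in which a per-band linear model (gain and offset) maps a target image onto a reference image, fitted over radiometrically stable (pseudo-invariant) pixels sampled from both images. The sketch below illustrates the idea with an ordinary least-squares fit; the function name, array layout, and use of NumPy are assumptions for illustration, not a prescribed implementation.

```python
import numpy as np

def linear_normalization(reference, target):
    """Fit per-band gains and offsets mapping target radiometry onto
    the reference via ordinary least squares.

    reference, target: arrays of shape (n_pixels, n_bands), sampled
    from pseudo-invariant (radiometrically stable) areas common to
    both images. (Illustrative sketch, not a standard API.)
    """
    n_bands = reference.shape[1]
    gains = np.empty(n_bands)
    offsets = np.empty(n_bands)
    for b in range(n_bands):
        # Design matrix: [target band values, constant term]
        A = np.column_stack([target[:, b], np.ones(target.shape[0])])
        gains[b], offsets[b] = np.linalg.lstsq(A, reference[:, b], rcond=None)[0]
    return gains, offsets

# The normalized image is then: normalized = target * gains + offsets
```

Applying the fitted gains and offsets to the full target image brings its radiometry into approximate agreement with the reference, after which the two can be classified or differenced together.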

Image radiometric normalization is warranted when

  • the spectral signatures from one image will be used to classify other images without further normalization,
  • the particular approach to time-series analysis requires that the radiometric differences between images be minimal, or
  • a single linear model ...