
Image fusion is a technique for combining multi-source images, such as a high-spatial-resolution panchromatic or radar image and a lower-spatial-resolution multispectral image, to produce a high-spatial-resolution multispectral image. Ideally, the fused image inherits high-resolution spatial information from the panchromatic or radar image and preserves the original spectral characteristics of the multispectral image. Most Earth observation satellite systems provide two types of image data: a panchromatic image with high spatial resolution and a multispectral image with lower spatial resolution but higher spectral resolution. To effectively use such images, image fusion techniques can be employed to combine the high-resolution panchromatic and low-resolution multispectral images into one color image. Such techniques can extend the application potential of remote-sensing image data.
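Whatever fusion method is used, the multispectral bands must first be resampled to the panchromatic grid so the two images can be combined pixel by pixel. A minimal sketch of that preliminary step is given below; the array names, sizes, and the nearest-neighbour choice are illustrative assumptions, not part of any particular fusion algorithm.

```python
import numpy as np

def upsample_nearest(band, factor):
    """Nearest-neighbour upsampling of one multispectral band so it
    matches the panchromatic grid (factor = pan resolution / MS resolution).
    Real workflows would also co-register the images and may use
    bilinear or cubic resampling instead."""
    return np.repeat(np.repeat(band, factor, axis=0), factor, axis=1)

# Hypothetical 2x2 multispectral band upsampled to a 4x4 panchromatic grid
ms_band = np.array([[10.0, 20.0],
                    [30.0, 40.0]])
up = upsample_nearest(ms_band, 2)
```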

A variety of methods for fusing panchromatic images of high spatial resolution with multispectral images of lower spatial resolution have been developed. Pohl and Van Genderen grouped the existing image fusion techniques into two classes: (1) color-related techniques such as intensity-hue-saturation (IHS) and hue-saturation-value (HSV) fusion methods and (2) statistical/numerical methods such as principal components analysis (PCA), high-pass filtering (HPF), Brovey transform (BT), regression variable substitution (RVS), and wavelet methods. Ranchin and Wald distinguished three groups of fusion methods: (1) the projection and substitution methods, (2) the relative spectral contribution methods, and (3) the methods relevant to the ARSIS (the French acronym for “spatial resolution enhancement by injection of structures”) concept. There are also hybrid methods that combine techniques from more than one group. Among the existing fusion methods, the IHS transform and PCA methods are the algorithms most commonly used by the remote-sensing community.
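Of the statistical/numerical methods listed above, the Brovey transform is the simplest to illustrate: each multispectral band is scaled by the ratio of the panchromatic image to the summed multispectral intensity. The sketch below assumes the multispectral bands have already been resampled to the panchromatic grid; the array shapes and sample values are illustrative.

```python
import numpy as np

def brovey_fuse(ms, pan, eps=1e-6):
    """Brovey transform fusion.
    ms:  (bands, H, W) multispectral array, resampled to the pan grid.
    pan: (H, W) panchromatic array.
    Each band is multiplied by pan / intensity, where intensity is the
    band sum; eps guards against division by zero."""
    intensity = ms.sum(axis=0)
    return ms * pan / (intensity + eps)

# Toy 3-band, 1x2-pixel example
ms = np.array([[[0.2, 0.4]],
               [[0.3, 0.3]],
               [[0.5, 0.3]]])
pan = np.array([[1.0, 2.0]])
fused = brovey_fuse(ms, pan)
```

Because every band is rescaled by the same ratio, the Brovey transform sharpens spatial detail but can alter the absolute spectral values, which is one source of the color distortion discussed below.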

Problems and limitations associated with the available fusion techniques have been reported in many studies. The most significant problem is that the fused image usually shows a notable deviation in visual appearance and in spectral values from the original image. These deviations, called color distortions, affect further interpretation, especially when the wavelength range of the panchromatic image does not correspond to that of the employed multispectral image. Another potential problem arises from the fact that panchromatic and multispectral images are often taken in different seasons of the year. In image fusion, it is desirable to minimize the color distortion, since this ensures that features separable in the original multispectral image remain separable in the fused image.
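The spectral deviation described above can be quantified in several ways; a simple one is the per-band root-mean-square difference between the fused image and the original (resampled) multispectral image. The sketch below is one such measure under that assumption; the function name and array shapes are illustrative, not a standard from the literature.

```python
import numpy as np

def per_band_rmse(original_ms, fused_ms):
    """Per-band root-mean-square deviation between the original
    (resampled) multispectral image and the fused result, as a rough
    proxy for color distortion. Both arrays are (bands, H, W)."""
    diff = fused_ms.astype(float) - original_ms.astype(float)
    return np.sqrt((diff ** 2).mean(axis=(1, 2)))

# Toy example: every pixel in both bands deviates by exactly 1.0
orig = np.zeros((2, 2, 2))
fused = np.ones((2, 2, 2))
err = per_band_rmse(orig, fused)
```

A lower deviation suggests that spectral characteristics, and hence class separability, are better preserved in the fused image.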

Other terms with similar meanings to image fusion that can be found in the literature include image pan-sharpening, image merging, image combination, image synergy, and image integration. Data fusion is another term related to image fusion, but it has a broader meaning than image fusion.

Yangrong Ling

Further Readings

Chavez, P., Sides, S., & Anderson, J. (1991). Comparison of three different methods to merge multiresolution and multispectral data: TM & SPOT pan. Photogrammetric Engineering and Remote Sensing, 57(3), 295–303.
Pohl, C., & Van Genderen, J. (1998). Multisensor image fusion in remote sensing: Concepts, methods and applications. International Journal of Remote Sensing, 19(5), 823–854. http://dx.doi.org/10.1080/014311698215748
Ranchin, T., & Wald, L. (2000). Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogrammetric Engineering and Remote Sensing, 66(1), 49–61.