Data fusion for image analysis in remote sensing

01 January 2015 → 31 December 2020
Research Foundation - Flanders (FWO)
Research disciplines
  • Natural sciences
    • Applied mathematics in specific fields
    • Computer architecture and networks
    • Distributed computing
    • Information sciences
    • Information systems
    • Programming languages
    • Scientific computing
    • Theoretical computer science
    • Visual computing
    • Other information and computing sciences
Keywords: data fusion, remote sensing
Project description

Progress in sensor technology has advanced the remote sensing domain, giving access to a multitude of satellite and airborne data products, each with its own spatial and spectral resolution, coverage and revisit time. While each of these modalities provides valuable information on its own, their combination can deliver even richer knowledge. In this project, the term “fusion” is used as the general term for combining information from more than one image source.

One difficulty is that images of similar land cover display dissimilar radiometric properties owing to changes in illumination, atmospheric or other observational conditions between acquisitions. This causes so-called spectral variability, i.e. variation in the spectral appearance of a given material within and between images. Another problem is that most hyperspectral sensors have wavelength-dependent characteristics.

Our overall objective is to develop data fusion techniques that account for these spectral aspects and thereby enhance the spectral properties of a fused image or of a derived product such as a land cover classification map. Specific objectives are to develop techniques for removing wavelength-dependent blur and noise, and for adapting images and classifiers to changing observational circumstances. Furthermore, we will apply the developed techniques to improve state-of-the-art methods for dimensionality reduction, change detection and the reuse of ground reference data.
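To make the idea of wavelength-dependent blur removal concrete, below is a minimal sketch of band-wise Wiener deconvolution of a hyperspectral cube. It assumes a known Gaussian point-spread function whose width varies per band; the PSF model, the regularisation constant `k` and all function names are illustrative assumptions, not the project’s actual method.

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centred 2-D Gaussian PSF, normalised to unit sum (assumed blur model)."""
    h, w = shape
    y = np.arange(h) - h // 2
    x = np.arange(w) - w // 2
    yy, xx = np.meshgrid(y, x, indexing="ij")
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def wiener_deblur(band, sigma, k=1e-2):
    """Wiener deconvolution of one band, given the PSF width sigma."""
    # Shift the PSF so its centre sits at the origin before taking the FFT.
    H = np.fft.fft2(np.fft.ifftshift(gaussian_psf(band.shape, sigma)))
    G = np.fft.fft2(band)
    # Wiener filter: conj(H) / (|H|^2 + k); k regularises against noise.
    F = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F))

def deblur_cube(cube, sigmas, k=1e-2):
    """Apply band-wise Wiener deconvolution to an (H, W, B) cube,
    with one PSF width per spectral band (wavelength-dependent blur)."""
    return np.stack(
        [wiener_deblur(cube[..., b], sigmas[b], k) for b in range(cube.shape[-1])],
        axis=-1,
    )

# Small synthetic check: blur a random cube with per-band PSFs, then restore it.
rng = np.random.default_rng(0)
truth = rng.random((32, 32, 4))
sigmas = np.linspace(0.8, 2.0, 4)  # blur width grows with wavelength
blurred = np.stack(
    [
        np.real(np.fft.ifft2(
            np.fft.fft2(truth[..., b])
            * np.fft.fft2(np.fft.ifftshift(gaussian_psf((32, 32), sigmas[b])))
        ))
        for b in range(4)
    ],
    axis=-1,
)
restored = deblur_cube(blurred, sigmas)
```

In a more realistic setting the PSF would be estimated from sensor calibration data and the regulariser tuned per band, but the band-wise structure of the computation stays the same.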