1 Introduction
This chapter introduces the main concepts in remote sensing, with emphasis on the differences and complementarity of multiple sensors. Remote sensing image fusion is defined in the context of data fusion, and the types of imagery commonly fused are described. The chapter also explains the purpose and objectives of image fusion, lists some of the main benefits and limitations of remote sensing image fusion, and introduces the main factors that need to be taken into account when fusing different types of remote sensing imagery. Particular attention is given to active and passive remote sensing, but other types of sensors, such as thermal, hyperspectral, and light detection and ranging, are also explained.
1.1 Outline of the Book
This book deals with remote sensing image and data fusion. It is assumed that the reader is familiar with remote sensing principles, techniques, and applications. However, as many readers of this book may come from other disciplines such as medical image processing, computer graphics, security, and defence, in Section 1.2, we give a brief introduction to remote sensing multi-sensor data. For more details on remote sensing principles, techniques, and applications, the textbooks listed in Table 1.1 serve as a good introduction to this field.
In this introductory chapter, we introduce the topic of remote sensing image and data fusion, outlining the objectives of image fusion, describing the types of remote sensing image fusion available today, and discussing the limitations of such fusion approaches. The benefits and the many applications of remote sensing image fusion methods are also briefly described. The chapter ends with some definitions and terminology used in this book and provides a list of literature references.
Chapter 2 discusses the various levels of remote sensing image fusion. After a brief introduction, the chapter explains and describes the various levels at which images and data may be fused. These include pixel level, feature level, advanced decision level fusion, as well as a section on data fusion. Again, as for Chapter 1 and for all subsequent chapters, this chapter ends with a summary and a list of relevant literature.
TABLE 1.1
Textbooks on Remote Sensing
Title | Reference | Publisher | ISBN |
Remote Sensing Handbook | Thenkabail (2015) | CRC Press | 9781482218015 |
Remote Sensing and Image Interpretation | Lillesand et al. (2015) | Wiley | 9781118343289 |
The Core of GIScience: A Systems-Based Approach | Dopheide et al. (2013) | ITC | 9789036537193 |
Introduction to Remote Sensing | Campbell and Wynne (2011) | Guilford Press | 9781609181765 |
Fundamentals of Satellite Remote Sensing | Chuvieco and Huete (2009) | CRC Press | 9780415310840 |
Principles of Remote Sensing | Tempfli et al. (2009) | ITC | 9789061641837 |
Remote Sensing—Models and Methods for Image Processing | Schowengerdt (2007) | Academic Press | 9780123694072 |
In Chapter 3, the authors deal with the many preprocessing steps required prior to fusing different data sets. Key aspects treated in this chapter include the issues involved in selecting the appropriate data sources to be fused, the sensor-specific corrections that need to be made, such as geometric adjustments, and an explanation of several image enhancement techniques commonly used in remote sensing image fusion.
One of the key chapters of this book is Chapter 4 on the actual remote sensing image fusion techniques. This chapter commences with a novel categorization of image fusion techniques. As hundreds of image fusion algorithms have been developed over the past 25 years, we have grouped them into several logical categories and then describe in detail each fusion algorithm in each of these categories. The categories described include (a) component substitution, (b) numerical methods, (c) statistical image fusion, (d) modulation-based techniques, (e) multiresolution approaches (MRA), (f) hybrid techniques, and (g) others. In all cases, we have tried to provide the original algorithm for the described fusion technique. After this detailed analysis of the many fusion techniques used today, the chapter concludes with a section giving guidelines on selecting an approach and discusses the commonalities and contradictions of the remote sensing image fusion algorithms.
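To give a flavour of the component-substitution category before it is treated in Chapter 4, the following is a minimal sketch of a generic intensity-substitution pansharpening step. The function name, the simple mean-based intensity component, and the input arrays are illustrative assumptions for this sketch, not the specific algorithms described later in the book.

```python
# Minimal sketch of a component-substitution (intensity-substitution) fusion step.
# Assumed inputs: "ms" is a co-registered multispectral image already resampled to
# the panchromatic grid (H x W x B), "pan" is the panchromatic band (H x W); both
# are float arrays in comparable radiometric ranges.
import numpy as np

def component_substitution_fusion(ms: np.ndarray, pan: np.ndarray) -> np.ndarray:
    """Replace the intensity component of the multispectral image with a
    statistically matched panchromatic band and inject the spatial detail."""
    intensity = ms.mean(axis=2)  # simple intensity component (band average)
    # Match the panchromatic band to the intensity statistics (gain/offset matching)
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    detail = pan_matched - intensity                # spatial detail to inject
    fused = ms + detail[:, :, np.newaxis]           # add detail to every band
    return np.clip(fused, 0.0, None)

# Usage with synthetic data:
# ms = np.random.rand(256, 256, 4); pan = np.random.rand(256, 256)
# fused = component_substitution_fusion(ms, pan)
```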
Before going on to deal with the many applications of remote sensing image and data fusion, Chapter 5 deals with the important topic of quality assessment. It explains the various image quality parameters used to evaluate image fusion products and presents and discusses the main existing, established indices. A section is also devoted to the requirements and procedures for visual, subjective evaluation of fused image and data products.
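As a simple illustration of the kind of quantitative assessment discussed in Chapter 5, the sketch below computes two commonly used measures for a fused product against a reference image: per-band correlation and the ERGAS index. The function names, array shapes, and the resolution ratio are assumptions made for this sketch rather than reference implementations of the established indices.

```python
# Minimal sketch of two common fused-image quality measures.
# Assumed inputs: "fused" and "reference" are co-registered (H x W x B) float arrays;
# "ratio" is the ratio of panchromatic to multispectral pixel size (e.g., 0.25 for 1:4).
import numpy as np

def band_correlation(fused: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Pearson correlation between each fused band and the corresponding reference band."""
    return np.array([
        np.corrcoef(fused[..., b].ravel(), reference[..., b].ravel())[0, 1]
        for b in range(reference.shape[-1])
    ])

def ergas(fused: np.ndarray, reference: np.ndarray, ratio: float = 0.25) -> float:
    """ERGAS index: lower values indicate better overall radiometric fidelity."""
    rmse = np.sqrt(((fused - reference) ** 2).mean(axis=(0, 1)))   # per-band RMSE
    means = reference.mean(axis=(0, 1))                            # per-band means
    return 100.0 * ratio * np.sqrt(np.mean((rmse / means) ** 2))
```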
Chapter 6 is another major chapter in which we present the many applications of remote sensing image and data fusion. This chapter provides numerous case studies, showing the features, benefits, and results of fusing different types of data sets for an improved interpretation of the area under consideration. As for all chapters, this chapter also ends with some conclusions about the actual and potential applications of image fusion, and provides many references for the reader to consult for more details about any of the applications considered in this chapter.
The final chapter in this book gives an insight into the future. It presents some of the main trends and developments in remote sensing image and data fusion and introduces some new key technologies that will influence this field over the coming years. These include topics such as the remaining challenges and trends, data mining, cloud computing, Big Data analytics, and the Internet of Things.
Further reading is possible using the references provided at the end of each chapter. The references refer to the topics discussed in the respective chapter. The background for the information provided in this book is threefold. The foundation is built from the authors’ many years of expertise in the practical use of remote sensing image fusion (RSIF). A second information source is an extensive and dedicated database of journal and conference papers built over the years, covering material published until 2015. This database enabled the categorization of RSIF as a research field, splitting the subject into various domains of interest, that is, techniques, sensors, applications, areas of achievement, and journal/conference. The latter led to an overview of which journals published the most RSIF papers and helps the reader explore a subject of interest in more depth (see Figure 1.1). The purpose of RSIF and its research focus are the subject of Section 1.4.3. Other information retrieved from the database, including an overview of the most popular techniques and sensors, is presented in Chapter 4. The most common applications are discussed in Chapter 6.
1.2 Remote Sensing
“Remote sensing is the art, science, and technology of observing an object, scene, or phenomenon … without” actually being in “physical contact with the object of interest” (Tempfli et al. 2009). Sensors are designed to provide information on a received signal. The signal depends on the materials on the ground, with their unique molecular composition and shape. Electromagnetic radiation is reflected, absorbed, and emitted differently, depending on the surface composition. Different sensors therefore deliver different “views” of Earth’s surface. The differences arise from spatial, spectral, and temporal resolution, view angle, polarization, wavelength, interaction with the objects, and atmospheric influence on the signal. The advantage of spaceborne remote sensing is the ability to acquire data over large areas in a multi-temporal fashion, providing a synoptic view of Earth and allowing change detection methods to model complex Earth processes. Remote sensing distinguishes between passive sensors, for example, visible and near infrared (VIR) or thermal infrared (TIR), and active sensors, for example, synthetic aperture radar (SAR) and light detection and ranging (...