An Autonomous Remote Sensing System

(This is only a draft version of this page)

 

John F. Bolton


ABSTRACT

This paper describes an end-to-end remote sensing system that can identify the nature and location of targets without human intervention. The central feature of the autonomous remote sensing system is a temporal knowledge base of the spectral and spatial characteristics of known targets. Such a knowledge base is, fortunately, already available in the form of the Landsat Archive.

The system incorporates the concept of Full Spectral Imaging (FSI), introduced by the author several years ago. It also incorporates the concept of Empirical Reflectance Retrieval (ERR).

The goal of the autonomous remote sensing (ARS) system is to provide remotely sensed information to people who do not have the capability to prepare raw remotely sensed data and perform the analysis that is required to make use of the data available today.

Autonomous remote sensing as discussed in this paper applies to passive optical systems.

Keywords: Remote sensing, hyperspectral imaging, data processing, data compression, information theory, calibration, characterization, artificial intelligence, neural networks

INTRODUCTION

Remote sensing is, fundamentally, identifying targets at a distance (remotely). Targets may be identified by various features, such as size, shape, and color. In addition to the identification of the target, the position of the target is also important. The final parameter in the identification process is the time at which the target was observed. Essentially all remote sensing applications reduce to these fundamental characteristics, which can be summarized as “What”, “Where”, and “When”.
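
As a concrete illustration (not part of the paper itself), these three questions can be captured in a minimal observation record; the field names and example values in the Python sketch below are hypothetical.

    # A minimal sketch of the "What", "Where", "When" record that any remote
    # sensing observation reduces to. Names and values are illustrative only.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Observation:
        what: str        # "What": target identification, e.g. a land-cover class
        lat: float       # "Where": latitude in decimal degrees
        lon: float       # "Where": longitude in decimal degrees
        when: datetime   # "When": acquisition time, in UTC

    obs = Observation(what="deciduous forest",
                      lat=44.97, lon=-93.26,
                      when=datetime(2015, 11, 13, 16, 30, tzinfo=timezone.utc))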

Current Status

The practice of spaceborne remote sensing has developed over approximately the past 30 years. These practices were established with the first remote sensing systems and have not changed much since. There have been improvements in data processing techniques and in the capability to handle large amounts of data, but the basic practices remain much the same. To do the “What” part of remote sensing, researchers have relied on spatial resolution and on selected color bands. Color bands were selected to represent the spectral reflectance of the target. Ideally, researchers would like to obtain the spectral reflectance across the entire solar illumination spectrum. This is what researchers do on the ground and in the laboratory, though in the laboratory artificial light sources are normally used.
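
To illustrate the difference between a few selected color bands and the full solar illumination spectrum, the sketch below (not taken from the paper) averages a hypothetical reflectance spectrum over a handful of broad bands; the band edges and spectrum values are placeholders, not those of any particular sensor.

    import numpy as np

    # Hypothetical reflectance spectrum, sampled every 10 nm from 400 to 2500 nm.
    wavelengths = np.arange(400, 2501, 10)
    reflectance = np.random.default_rng(0).random(wavelengths.size)  # placeholder values

    # A handful of broad "color bands"; the band edges (nm) are illustrative only.
    bands = {"blue": (450, 520), "green": (520, 600), "red": (630, 690), "nir": (760, 900)}

    # Each band reduces the full spectrum to a single averaged value.
    band_values = {name: float(reflectance[(wavelengths >= lo) & (wavelengths <= hi)].mean())
                   for name, (lo, hi) in bands.items()}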

A serious problem arose immediately: though researchers wanted to measure reflectance at the target, what they actually got was radiance at the top of the atmosphere, in other words, at the sensor. The color of a target appears quite different when viewed through many kilometers of atmosphere than when viewed through a few millimeters on the ground or in the laboratory. To figure out the actual spectral reflectance, the fundamental remote sensing practice of ‘reflectance retrieval’ is employed. Reflectance retrieval is a science in itself. Many techniques have been developed to estimate the atmospheric characteristics and then to correct the top-of-atmosphere measurement to give the reflectance at the target.
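
For concreteness, the sketch below (not drawn from the paper) shows the standard conversion of at-sensor radiance to top-of-atmosphere reflectance, followed by a dark-object subtraction, one of the simplest reflectance retrieval techniques; the function names are placeholders introduced here and no sensor-specific constants are implied.

    import math
    import numpy as np

    def toa_reflectance(radiance, esun, sun_elevation_deg, earth_sun_dist_au):
        """Convert at-sensor spectral radiance [W / (m^2 sr um)] to top-of-atmosphere
        reflectance: rho = pi * L * d^2 / (ESUN * cos(solar zenith angle))."""
        solar_zenith = math.radians(90.0 - sun_elevation_deg)
        return (math.pi * np.asarray(radiance, dtype=float) * earth_sun_dist_au ** 2
                ) / (esun * math.cos(solar_zenith))

    def dark_object_subtraction(radiance_band):
        """Very crude reflectance retrieval: treat the darkest pixel in the band as
        pure atmospheric path radiance and subtract it from every pixel."""
        band = np.asarray(radiance_band, dtype=float)
        path_radiance = band.min()
        return np.clip(band - path_radiance, 0.0, None)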

The success of reflectance retrieval depends on the capability to measure the radiance at the top of the atmosphere accurately. This requires careful calibration of the sensor, and maintenance of that calibration over time. Sensor calibration, like reflectance retrieval, has become a science in itself.
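
As a small illustration (not from the paper) of what sensor calibration supplies, the sketch below applies a linear radiometric calibration that converts raw digital numbers (DN) to at-sensor radiance; the gain and offset values are hypothetical.

    import numpy as np

    def dn_to_radiance(dn, gain, offset):
        """Apply a linear radiometric calibration: at-sensor radiance is modelled
        as gain * DN + offset, with gain and offset taken from the sensor's
        calibration record."""
        return gain * np.asarray(dn, dtype=float) + offset

    # Hypothetical calibration coefficients for a single band; real values come
    # from pre-launch characterization and must be maintained over the sensor's life.
    radiance = dn_to_radiance(dn=[52, 118, 201], gain=0.7756, offset=-6.2)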

So, as it stands now, we have to rely on two sophisticated sciences, reflectance retrieval and calibration, to do remote sensing. To make matters worse, the quality of remote sensing instruments (sensors) is not always the best. Specifically, the signal-to-noise ratio (SNR) is not always as good as researchers would like. Researchers always seem to want more signal, or more bits to represent the digital signal. One is always faced with a trade-off between SNR and spectral and spatial resolution.
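
As a rough illustration (again, not from the paper) of how this trade-off is usually quantified, the sketch below defines SNR as the mean over the standard deviation of a uniform measurement and shows the ideal dynamic range added by each digitization bit; the numbers are generic, not those of any particular sensor.

    import numpy as np

    def snr_db(flat_field):
        """Signal-to-noise ratio of a uniform (flat-field) measurement, defined
        here simply as mean / standard deviation, expressed in decibels."""
        pixels = np.asarray(flat_field, dtype=float)
        return 20.0 * np.log10(pixels.mean() / pixels.std())

    # Ideal quantization limit: each extra bit of digitization adds about 6 dB
    # of dynamic range (20 * log10(2) per bit), which is why "more bits" and
    # "more signal" are traded against spectral and spatial resolution.
    for bits in (8, 10, 12):
        print(bits, "bits ->", round(20.0 * np.log10(2.0 ** bits), 1), "dB")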

Currently, researchers have to do a lot of work “fixing” their remotely sensed data before they can use it in their research. Reflectance retrieval is only one of the ‘fixes’ they have to apply. Other fixes include geo-locating the data (the “Where” part), correcting distortion, and stitching together separate scenes. While there are people and processes to help researchers fix their data, the fixing process still has to be understood in order to use remotely sensed data properly. The goal of autonomous remote sensing is to eliminate this fixing process and provide information to the researcher that is of high quality and that is ready (and easy) to use.

Vision Analogy

Human vision is a form of remote sensing. We look at things and determine what they are and where they are. After many years of looking at (and interacting with) things, we can readily identify them, and even if we have never seen a specific thing before, we can draw some conclusions about it. For example, if we see a chair, which could be anything from a La-Z-Boy to a folding chair, we immediately recognize it as a chair. We know the basic features of a chair, its shape, its size, etc., that give it the characteristics of a chair. Even if some feature is grossly exaggerated, as in tiny model chairs or enormous sculptural chairs, we can still recognize a chair. In addition to identifying the thing as a chair, we can draw additional conclusions. For example, we can determine the color, the approximate weight, the surface texture, etc. The capability to draw these conclusions comes from many years of observing things, not only chairs. The capability to draw conclusions is also aided by the context of the situation. For example, one is more likely to find a La-Z-Boy than a folding chair in front of a television. In other words, it is possible to draw conclusions about what one does not know from what one does know.


An application of autonomous remote sensing is described in Landsat for the 21st Century.

(to be continued....)

 


