About Us

 

(John Bolton: Earth Science Systems and Remote Sensing Instrument developer, recently retired from the Earth Science Program Office at NASA’s Goddard Space Flight Center)

 

See also: A Global, Real-Time Disaster & Environmental Monitoring System

 

And: The Development of a New System for Remote Sensing

Landsat for the 21st Century: A spin-off of the Full Spectral Imaging project

 

Related Links:

John Bolton's Experience Chronology

John Bolton's Initiatives 1985 - Present

jfbolton@fullspectralimaging.net

 

Introduction


This page describes the path that was taken to develop a new end-to-end system for high spectral and spatial resolution, passive optical remote sensing. The applications for this system are similar to those for the U.S. Landsat and the French SPOT systems.

Background

My educational background is in Physics and Physical Chemistry, and my practical experience is in electro-optical, or photonic, systems. For more than 25 years I worked at the NASA Goddard Space Flight Center, first in Space Sciences and then with the Earth Observing System (EOS) Program. Among my many responsibilities was the development of advanced technologies for remote sensing.
The first opportunity I had for advanced technology development came with the MODIS-T Project. MODIS-T (MODerate resolution Imaging Spectrometer-Tiltable) was to be a Hyperspectral Transfer Radiometer (HTR). MODIS-T would cover a narrow swath, have high spectral and spatial resolution, and be a pointable (tiltable) instrument. The function of MODIS-T was to ‘transfer’ calibration measurements between ground truth sites and unknown sites that were in the field of view of the primary MODIS instrument, which at that time was called MODIS-N (MODIS-Nadir). When the initial work on MODIS-T began, the concept of hyperspectral imaging was quite new. Nevertheless, in collaboration with subsystem experts at Goddard and with several contractors, we came up with a simple instrument design, using the state-of-the-art technology then available, that would have met the science team’s requirements. This design and the trade-offs considered were detailed in a NASA technical report.

If the MODIS-T instrument had been built, it would have been the first operational hyperspectral instrument in orbit, and it would have provided very useful data for researchers developing algorithms for deriving products from hyperspectral data. Unfortunately, the conceptual design was passed on to a Phase B design team who proceeded to ‘enhance’ the design, even adding a calibration subsystem (this, for an instrument that was intended to provide calibration information by transferring ground truth). When the instrument design began to run over budget and behind schedule, MODIS-T was cancelled.

Hoping to utilize some of the experience gained during the ill-fated MODIS-T exercise, I proposed to take advantage of the Goddard Research and Study Fellowship Program with the Remote Sensing Group at VTT (The Technical Research Center of Finland) in Otaniemi, Finland. The Head of the Remote Sensing Group had told me that they were interested in developing hyperspectral technology. They were one of the few organizations interested in the technology at that time.

During that year in Finland, we developed the plan, wrote the proposal, secured funding (which involved a commercial partnership), found the right people to do the work, and built, tested, and flew the prototype airborne hyperspectral instrument, obtaining good results. All of this was done in a single year.

From this experience I learned a lot about what it takes to actually get something done. Fortunately, before coming to Goddard I had worked in both academia and industry, so I already had some idea of how to get a job done on time and within budget. One critical thing I learned in Finland was that, in order to get something done in the remote sensing business, it may be necessary to involve people who know nothing whatsoever about remote sensing. The engineers who did the “heavy lifting” on the Finnish project were from the Opto-Electronics Division of VTT and knew nothing about remote sensing before we started the project. Fortunately, because of my background, I was able to ‘speak their language’.

After the successful completion of this first phase of the project, all of the VTT engineers involved decided to leave their positions at VTT and start a small company to build the AISA instrument that we had developed. Their company, SPECIM, is now quite successful, having expanded their product line well beyond the original AISA instrument.
When I returned to Goddard after the Fellowship, others at Goddard and I had hopes that the AISA instrument we had developed might be brought to Goddard. The developers and their commercial sponsor visited Goddard and presented the AISA instrument. While there was considerable enthusiasm among the Goddard technical people, Earth Science management decided that we would not follow up with this technology. Thus we missed, in my opinion, an excellent opportunity to get into the business of hyperspectral imaging at the very beginning.

Within two years of returning from my Goddard Research Fellowship I had accumulated a lot of practical experience in the remote sensing business. After the AISA instrument was rejected by Goddard management, I decided to see what I could do in collaboration with some of the NASA people who had been looking forward to exploiting the technology that had been developed. Several of these people were in the Oceans Branch at the Wallops Flight Facility (a branch of Goddard). Coincidentally, I had also met the president of a small GIS and remote sensing company located in Easton, Maryland (halfway between Goddard and Wallops), who was interested in expanding his business. We quickly developed a collaboration among the new Finnish company (SPECIM), the Easton company (3-D Imaging), and Wallops. At that time it was easy to arrange “piggyback” flights for experimental instruments at Wallops. The collaboration also involved the USDA in Beltsville and two major AgriBusiness companies. It evolved into a very successful (for the commercial interests, at least) development of hyperspectral technology and applications. It also led to the development of new techniques for the calibration and characterization of hyperspectral systems, techniques that took advantage of the unique properties of hyperspectral data.

During this period I wrote “And Now for Something Completely Different: A Proposal for an Alternative Method for the Development of Earth Observation Science and Technology” to open a discussion of an alternative way of doing NASA’s Earth observing science. It is based on my experience working for the EOS Project, on what I learned from the remote sensing scientists associated with the Project, and on several years of instrument development experience. It also draws on the many conversations I have had about alternative methods for developing technology and applications in connection with my proposal to establish a Center for Airborne Remote Sensing and Technology and Applications Development (CARSTAD). This was my first effort to look at new ways of doing remote sensing.

Development Part I

After doing all of this practical work on hyperspectral systems, and seeing what the problems were, I started to think about how the system could be improved. In my experience with remote sensing researchers, I noticed that they spent most of their time “fixing” their data before they could do any meaningful science. In my view, the data had to be fixed because it was of poor quality. My goal became to provide researchers with high-quality data that did not need to be fixed.
Several issues arose immediately in my quest to provide good hyperspectral data. They were:
· Too much data
· Hyperspectral is just multispectral with a lot of bands

Even with the small AISA airborne system, data storage and processing were a major concern. The primary objection to hyperspectral systems always seemed to be that they produced too much data. Like every other remote sensing system, a hyperspectral system would capture every byte of data that the instrument produced and send it to the researcher. All of this data contained a great deal of redundant information; the actual information could have been represented with far fewer bytes. As the clever reader may immediately recognize, this is called “data compression”. Data compression is simply extracting the information from the data. Unfortunately, when data compression is mentioned to the typical remote sensing researcher, the reaction is negative. The reason for this goes back to the researchers’ problem with their data: if their data always needs to be fixed, they want to be sure they have every last bit of it to work with. If their data were good and did not need to be fixed, they would have no objection to data compression. Data compression should be merely a step in the information delivery process, transparent to the researchers.
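
A rough illustration of how much spectral redundancy a hyperspectral cube can contain is sketched below. The cube is synthetic (two “materials” mixed per pixel plus a little noise, not data from any actual instrument), and principal component analysis is used only as a generic stand-in for whatever compression scheme a real system would employ:

    # Illustration only: spectral redundancy in a synthetic hyperspectral cube.
    import numpy as np

    rows, cols, bands = 50, 50, 200
    rng = np.random.default_rng(0)
    endmembers = rng.random((2, bands))                    # two material spectra
    abundances = rng.random((rows * cols, 2))              # per-pixel mixing
    cube = abundances @ endmembers + 0.01 * rng.standard_normal((rows * cols, bands))

    # Principal components along the spectral dimension
    centered = cube - cube.mean(axis=0)
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("variance carried by the first 3 of 200 components:",
          round(float(explained[:3].sum()), 4))
    # Nearly all of the information in the 200 bands is carried by a few
    # components; this is the redundancy that a compression step can remove.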

To address the issues described above, the concept of Full Spectral Imaging (FSI) was developed. When one measures spectral reflectance in the laboratory, one does not use bands; one measures a continuous reflectance spectrum. The information to be derived is contained in the features of the spectrum. If the technology to measure continuous spectra had been available for the first remote sensing satellites, it is not likely that the use of bands would ever have been developed. Bands are simply the best representation of spectral features that could be achieved with the limited technology of the time. Unfortunately, when hyperspectral technology became available, the same techniques that had been used for nearly 30 years were applied to it. Hyperspectral technology has the capability to produce continuous spectra. The spectral resolution is determined by the instrument characteristics and is typically selected as a trade-off among signal-to-noise ratio, data rate, and the science requirements. When continuous spectra are available, the features of the spectra, the parts that contain the information, may be extracted.
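
As one illustration of what “extracting the features” can mean, the sketch below isolates an absorption feature in a continuous spectrum using continuum removal. The spectrum is synthetic, and continuum removal is simply a common textbook technique, not necessarily the feature-extraction method used in Full Spectral Imaging:

    # Sketch: locating an absorption feature in a continuous spectrum.
    import numpy as np

    wavelengths = np.linspace(400, 1000, 301)              # nm, synthetic grid
    # Smooth background plus one absorption feature near 680 nm
    reflectance = 0.4 + 0.0003 * (wavelengths - 400)
    reflectance -= 0.15 * np.exp(-((wavelengths - 680) / 30) ** 2)

    # Straight-line continuum between the spectrum endpoints
    continuum = np.interp(wavelengths,
                          [wavelengths[0], wavelengths[-1]],
                          [reflectance[0], reflectance[-1]])
    removed = reflectance / continuum

    depth = 1.0 - removed.min()                            # feature depth
    center = wavelengths[removed.argmin()]                 # feature position
    print(f"absorption feature near {center:.0f} nm, depth {depth:.2f}")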

To successfully extract the features of the spectra, a “good” instrument is required. The instrument must have, among other things, a good signal-to-noise ratio, and there must be little crosstalk (stray light). Fortunately, a lot of very clever people have been working for many years on the technology to do just this sort of thing. We now have virtually distortionless optical systems with high throughput, area array detectors with high sensitivity and good readout characteristics, and excellent technologies for real-time data processing and compression. Just about all of these technologies have been developed, and are constantly being improved, for applications other than remote sensing.

Development Part II

One of the biggest problems with any remote sensing system is calibration. Calibration is critical in a traditional remote sensing system because absolute radiances must be measured, and absolute radiances must be measured because the whole science of passive optical remote sensing from satellites has developed around the measurement of absolute radiance at the top of the atmosphere. What you really want, however, is the reflectance at the target. To get reflectance at the target from radiance at the top of the atmosphere, one has to model the processes that occur between the target and the sensor. This is very difficult, and all of these models require data that is radiometrically accurate.
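
To make the point concrete, the sketch below inverts a deliberately simplified at-sensor radiance relation for a Lambertian surface (adjacency and multiple-scattering terms are ignored, and all of the numbers are made up). Every “modeled” quantity in it has to come from an atmospheric model, which is exactly the dependence the approach described next tries to remove:

    # Simplified illustration of the traditional route from at-sensor radiance
    # to surface reflectance. For a Lambertian surface, ignoring adjacency and
    # multiple scattering:  L_sensor = L_path + rho * E_down * T_up / pi
    import math

    L_sensor = 85.0     # measured at-sensor radiance, W m^-2 sr^-1 um^-1
    L_path = 20.0       # modeled path radiance
    E_down = 1200.0     # modeled downwelling irradiance at the surface
    T_up = 0.85         # modeled surface-to-sensor transmittance

    rho = math.pi * (L_sensor - L_path) / (E_down * T_up)
    print(f"retrieved surface reflectance: {rho:.3f}")     # about 0.20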
During the course of the work with the AISA system, alternative methods for calibration and reflectance retrieval were developed. One of the best methods of calibration is the use of ground truth sites, or “vicarious calibration”. In the earliest work, we used features that appeared during the normal course of data collection, such as the roofs of airport buildings and runways, and there were many other features for which we knew the spectral reflectivity. By using this information, the sensor could be calibrated with data acquired during normal operation. This method is particularly effective with a hyperspectral or Full Spectral Imaging system.

Once this technique had been established, I thought that it might be possible to derive the atmospheric characteristics from the hyperspectral data itself. If we knew the spectral reflectance of certain targets, we could determine the spectral contribution of the atmosphere by looking at the difference between what the instrument saw and what we knew to be the actual spectral reflectance. The validity of this approach was confirmed by colleagues at the National Center for Atmospheric Research (NCAR). The logical extension of this procedure was to determine the reflectance of any unknown target based on the reflectance of known targets. This is how the concept of Empirical Reflectance Retrieval was developed.
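
The sketch below illustrates the general idea, in the spirit of the classical empirical line method rather than as a description of the specific Empirical Reflectance Retrieval algorithm. All of the numbers are synthetic, with an arbitrary per-band gain and offset standing in for the combined sensor and atmospheric effects:

    # Sketch: retrieving reflectance from known targets, per band.
    import numpy as np

    bands = 50
    rng = np.random.default_rng(1)
    true_gain = rng.uniform(80, 120, bands)                # unknown to the user
    true_offset = rng.uniform(5, 15, bands)

    # Two "known" targets: laboratory reflectance and the signal the sensor sees
    known_reflectance = np.array([np.full(bands, 0.05),    # dark target
                                  np.full(bands, 0.45)])   # bright target
    known_signal = known_reflectance * true_gain + true_offset

    # Fit the per-band linear relation: signal = gain * reflectance + offset
    gain = (known_signal[1] - known_signal[0]) / (known_reflectance[1] - known_reflectance[0])
    offset = known_signal[0] - gain * known_reflectance[0]

    # Invert the relation to retrieve the reflectance of an "unknown" pixel
    unknown_signal = 0.25 * true_gain + true_offset        # what the sensor saw
    retrieved = (unknown_signal - offset) / gain
    print("max retrieval error:", float(np.abs(retrieved - 0.25).max()))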

Development Part III

The third step in the development process is the concept of Autonomous Remote Sensing. The goal of an autonomous remote sensing system is to provide remotely sensed information to people who do not have the capability to perform the analysis that is currently required to make use of the available data. The autonomous system would have the capability to determine what it does not know based on what it does know, and it would be able to merge data from multiple sources.

Several years ago, in collaboration with a commercial provider of remote sensing software, I established the Remote Sensing On-Line (RSOL) web site. RSOL was intended to be a technology demonstration of the capability to take remotely sensed information and make it available in a form usable by anybody with an Internet connection and a web browser. The demonstration project used the resources of the Remote Sensing Education and Outreach Laboratory (RSEOL) and the data available through the EOS Direct Broadcast system. The initial demonstration would have made MODIS Direct Broadcast data available via an interactive web interface remarkably similar to the current Google Earth. In addition to access to the MODIS data, the user would have been able to reach standard MODIS data products and a tutorial (linked to the RSOL functions) providing instruction in remote sensing.

In addition to the RSOL experience, I drew on the work I had done several years earlier on neural networks as applied to remote sensing systems. Quite a lot of work has been done in ‘non-traditional’ remote sensing to apply neural networks, primarily for real-time military target identification. Recent developments in the Semantic Web, or Web 3.0, are also encouraging.

By taking advantage of the ideas I developed in preparing a conceptual design for next-generation Landsat/SPOT instruments, one can readily develop an autonomous system that incorporates neural network features and provides information to users via an interactive web interface. Once again, the technology is not the problem. The communication infrastructure exists, grid and collaborative computing principles are well established, neural network algorithms have been developed, and the auxiliary information from multiple sources that would be needed is available. Efforts are currently being made to figure out how to bring validated information into such a system.

The key to the autonomous system is the knowledgebase, which consists of all information that is relevant to the remotely acquired information: ground truth, verified remotely sensed information, and auxiliary information. The auxiliary information can include a wide range of items, from maps to rainfall totals. This idea was first explored when working with the AgriBusiness companies on the development of hyperspectral applications for AISA. In that project, auxiliary information pertaining to crops was included in the knowledgebase to provide a better product for the farmers.
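
A minimal sketch of one way a knowledgebase entry could be organized is shown below. The record layout and field names are hypothetical, not a description of any existing system:

    # Hypothetical structure for one knowledgebase entry (one site).
    from dataclasses import dataclass, field

    @dataclass
    class KnowledgebaseEntry:
        site_id: str
        ground_truth: dict = field(default_factory=dict)           # e.g. measured reflectance spectra
        verified_observations: list = field(default_factory=list)  # previously validated retrievals
        auxiliary: dict = field(default_factory=dict)               # maps, rainfall totals, crop records, ...

    entry = KnowledgebaseEntry(
        site_id="field_017",
        ground_truth={"reflectance_spectrum": [0.05, 0.07, 0.21]},
        auxiliary={"rainfall_mm_last_30_days": 42, "crop": "winter wheat"},
    )
    print(entry.site_id, entry.auxiliary["crop"])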

The Big Picture

To develop an end-to-end system for remote sensing, all aspects must be considered, from the instrument fore-optics to the data acquisition, processing, storage, and distribution. We have seen that the first thing that is needed is a good instrument that produces data that does not need to be fixed. This instrument should also provide the data to the researcher in a form that is easy to use and that does not contain a lot of redundant information.

The next thing that is needed is to take the requirement for atmospheric modeling out of the process. This eliminates the current stringent calibration requirements, and the need for modeling, which can be very difficult.

Finally, the system can be automated and the information made available to users who are not experts in remote sensing, giving them information that they can actually use. This would make remote sensing very much like GIS, which is a well-developed (and profitable) business accessible to anybody without the need for advanced training.

To do all of these things, one needs only to make innovative use of technology and infrastructure that is currently available. Some examples of these technologies and infrastructure are:
· Wide field-of-view all reflective optics
· The Offner imaging spectrograph
· CMOS area array technology with on-chip processing
· Real-time 3-D wavelet data compression
· Image based pointing stabilization systems
· Standardized spacecraft busses
· The EOS Direct Broadcast system
· Numerous auxiliary data sources
· Distributed and collaborative computing and data storage
· Neural networks and Artificial Intelligence (AI)
· The Internet and, potentially, the Semantic Web (Web 3.0)

Conclusion

Innovative application of currently available technologies and infrastructure would make possible a new end-to-end system for passive optical remote sensing. Some of the concepts developed for the New System could be applied to non-remote-sensing systems, and technologies used in non-remote-sensing systems may be applied to the New System; this is demonstrated by the New System’s extensive use of technologies developed outside traditional remote sensing. It is expected that a well-designed New System would perform significantly better than current systems, would cost less to build and operate, and would provide remotely sensed information in real time to users who are not remote sensing specialists.

None of this work would have been possible without the encouragement and advice of many people. One of the first things I learned when I went to work was that if you want to get a job done right, you should talk to, and get help from, people who are specialists and who really know what they are doing. When working on innovative new ideas, it may also be necessary to get advice from people who are not directly involved with the area in which you are working.

 

This page was last modified on 19 January 2015
