
The advent of Charge-Coupled Devices and the Pushbroom technique for retrieving the detector charges to create signals are discussed, establishing a basis for building AVIRIS. A diagram relates the resulting signal sequence to a sampled pixel; a second illustration shows, in an image-cube format, the final reconstructed image and the variations in signal strengths along sample rows and columns. Several other hyperspectral instruments are mentioned.


AVIRIS and other Imaging Spectrometers

As hinted at before, the advantage of a spectrometer operating in space over a multispectral scanner, which samples only a few bands of extended spectral width, is that it samples essentially the full continuum of wavelengths (actually, with present technology, very narrow individual bands that are wavelength-contiguous). This illustration helps to clarify the point just stated:

Multispectral vs Hyperspectral Image Comparison

A stationary reflectance spectrometer, looking through a collimating lens at a ground scene, breaks the light emanating from the fixed view into its component wavelengths. When such a spectrometer is flown on an aircraft or spacecraft, a problem arises in recording the light, because the scene moves past the lens (or the spectrometer swings across the scene) at high speed. Older types of detectors didn't have enough time to record the ever-changing field of view; that is, the electronics couldn't sample the dispersed light fast enough to resolve it into the closely spaced wavelengths needed to construct a spectral curve. Instead, they recorded the light as broad bands, in the manner of Landsat-type scanners.

The technology for a scanning spectrometer that could sweep across moving terrain, while sampling at narrow wavelength intervals, had to await a breakthrough. This breakthrough came with the advent of Charge-Coupled Devices (CCDs). A CCD is a microelectronic semiconducting chip that detects light. Radiation produces an electron charge on the chip in proportion to the number of photons received, which is governed by the intensity of the light and the exposure time. The charge must be rapidly removable, resetting the CCD for the next influx of photons, such as those coming from the next part of the scene.
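That proportionality can be sketched in a few lines of code. The following is a minimal model, assuming illustrative values for quantum efficiency and full-well capacity (neither is a specification of any real detector):

```python
# Minimal sketch of CCD charge accumulation; all values are illustrative.

def ccd_charge(photon_rate, dwell_time, quantum_efficiency=0.6,
               full_well=100_000):
    """Electrons accumulated by one CCD element during one dwell period.

    photon_rate        -- photons arriving per second (scene intensity)
    dwell_time         -- exposure time in seconds
    quantum_efficiency -- fraction of photons converted to electrons
    full_well          -- saturation limit of the element (electrons)
    """
    electrons = photon_rate * dwell_time * quantum_efficiency
    return min(electrons, full_well)   # the element saturates at full well

# A brighter scene yields a larger charge, until the well saturates.
print(ccd_charge(photon_rate=5e7, dwell_time=1e-3))   # 30000.0
print(ccd_charge(photon_rate=5e9, dwell_time=1e-3))   # 100000 (saturated)
```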

A chip is extremely small (tens of micrometers across). It is made of a light-sensitive material, such as the silicon used in computer microchips. Normally, we mount hundreds of them on an insulating backing, in a line or in a two-dimensional array. Consider a linear array of, say, 2,000 chips per inch. If we allow light to enter through lenses and strike the array simultaneously from one end to the other, then each chip will receive light from a small part of the scene. Adjacent chips will get their light from adjacent ground or air locations. Thus, the instrument samples a line of finite width and length on the ground or in the air.

Each chip, accruing a charge representative of the arriving photon batch, is a pixel that defines a spatial resolution, which depends on the chip size and the height (distance) of the chip array above (from) the scanned scene. After an instant of dwell time over the scene, all of the chips discharge sequentially (producing a signal stream for a recording device) and reset for the next batch. The chips have a high signal-to-noise (S/N) ratio, which allows, in the brief exposure, enough buildup of electrons, coupled with gain amplification, to yield usable signals. After each exposure, the spacecraft moves ahead a small distance and the process repeats for the next line of the ground scene. The advance of the linear array, similar to the forward motion of a wide "pushbroom", generates a succession of varying electronic signals that the instrument converts into an image, in which the range of gray levels relates to the signal strengths.
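The dependence of pixel footprint on chip size and platform height follows the standard optical relation: ground sample distance ≈ altitude × detector pitch ÷ focal length. Here is a minimal sketch with hypothetical numbers (none taken from a real instrument):

```python
def ground_sample_distance(altitude_m, detector_pitch_m, focal_length_m):
    """Approximate ground footprint of one detector element.

    A chip of pitch p behind optics of focal length f, flown at
    altitude H, projects onto a ground cell of roughly H * p / f.
    """
    return altitude_m * detector_pitch_m / focal_length_m

# Hypothetical example: 20 km altitude, 10-micrometer chips, 20 cm focal length.
gsd = ground_sample_distance(20_000, 10e-6, 0.20)
print(f"pixel footprint ~ {gsd:.0f} m")   # ~1 m per pixel
```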

 

Simple sketch indicating the Pushbroom technique for using CCDs in a multilinear array to sample the scene.

A variant of the linear array is the two-dimensional array, which receives and records light from rectangular ground scenes on multiple lines of chips. Another mode uses a rocking or rotating mirror to sweep the scene across the array.

A sensor that uses CCD elements can be multispectral if it uses several arrays, each dedicated to a wavelength band, for which a bandpass filter determines the bandwidth. SPOT's sensor uses CCD arrays and green, red, and near-infrared filters to create its multispectral data set.

But these broad-band sensors do not provide hyperspectral data; that is, they do not sample the spectrum in narrow intervals. To accomplish this, the sensor detector must consist of many parallel rows of chips, each dedicated to a narrow wavelength interval, that the recorder can sample (the CCDs discharged) extremely rapidly. Imagine a two-dimensional array several hundred chips wide and 200 or so long. Let the light enter the sensor through a telescope or focusing lens, impinge upon a moving mirrored surface, and then pass through a diffraction grating, which disperses the light over a spectral range in the direction of the array length (also the direction of the sensor's forward motion). At one instantaneous position of the mirror, the light from the ground activates the first pixel chip in the array width-wise and, at the same time, does so for the other wavelength arrays (the spectral dimension). The recorder electronically samples lengthwise the first chip in each line. Next, the mirror moves widthwise to the next ground spot, whose light spreads its spectral dispersion lengthwise. The mirror continues progressively to complete the equivalent of one ground sweep. While this happens, the sensor moves on to look at the next ground position, and the whole scanning-dispersing process repeats.
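The scanning-and-dispersing sequence just described can be condensed into a short sketch. Here the cross-track and spectral dimensions borrow the AVIRIS figures given below (614 samples, 224 channels), while the line count and the read_spectrum stub are hypothetical stand-ins for the real detector readout:

```python
import numpy as np

N_SAMPLES = 614   # cross-track mirror positions (image width)
N_BANDS   = 224   # spectral channels dispersed along the array length
N_LINES   = 100   # along-track lines supplied by the platform's motion

def read_spectrum(line, sample):
    """Hypothetical stand-in for one readout: the N_BANDS signals
    recorded for a single ground spot at one mirror position."""
    return np.random.rand(N_BANDS)

# Assemble the data cube: the mirror sweeps across track while the
# platform's forward motion supplies successive lines.
cube = np.empty((N_LINES, N_SAMPLES, N_BANDS))
for line in range(N_LINES):
    for sample in range(N_SAMPLES):
        cube[line, sample, :] = read_spectrum(line, sample)
```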

As the instrument proceeds along its flight path or orbit, the final result is a vast collection of data that has both spatial and hyperspectral content. From the data set, we can construct images using individual narrow spectral bands associated with small plots on the ground. Or, we can tie the spectral data for any pixel position across the width to the wavelengths sampled lengthwise, to plot a spectral curve for that piece of the surface. With special modifications, we can image the atmosphere, if that is the target.
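Both products are simple slices of such a cube (lines × samples × bands). A minimal NumPy sketch, using a random stand-in cube with AVIRIS-like dimensions:

```python
import numpy as np

cube = np.random.rand(100, 614, 224)   # stand-in cube: lines x samples x bands

band_image = cube[:, :, 30]            # grayscale image from one narrow band
spectrum   = cube[40, 250, :]          # 224-point spectral curve of one pixel

# Pairing each value with its wavelength (AVIRIS covers 380-2500 nm,
# as noted below) gives the abscissa for plotting the spectral curve.
wavelengths_nm = np.linspace(380, 2500, cube.shape[2])
```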

This, in a general way, describes how AVIRIS and other hyperspectral imaging spectrometers operate. JPL's AVIRIS uses diffraction gratings with two sets of CCD arrays, one with silicon chips to sense in the visible range and the other with indium antimonide (InSb) chips for wavelengths in the Near-IR to Short-Wave IR range. A refrigeration unit cools the detectors with liquid nitrogen for optimum performance. There are 224 detectors (channels) in the spectral dimension, extending over a range of 0.38 to 2.50 µm. This arrangement leads to a spectral resolution for each chip of about 0.01 µm. By convention, the unit that the remote sensing industry adopted for reporting hyperspectral data is the nanometer (nm); 1000 nm = 1 µm. Stated this way, the resolution is 10 nm and the range of coverage is 380 to 2500 nm (0.38 - 2.5 µm). AVIRIS gathers its light through a 30° field of view, sending diffracted light to 614 individual CCDs in the width (across-flight) direction. An oscillating mirror scans the scene and sends the incoming radiation sweeping across the array. The spatial resolution derived from this depends on the platform height. A typical mission, mounting AVIRIS on a NASA aircraft (the ER-2), produces a spatial resolution of about 20 meters, but that can be improved to five meters by flying at lower altitudes, which, of course, narrows the width of the ground coverage.
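A few lines of arithmetic confirm that these figures are mutually consistent (the 20 km ER-2 cruising altitude used here is an assumed input, not stated above):

```python
import math

n_channels      = 224
low_nm, high_nm = 380, 2500
fov_deg         = 30.0
n_samples       = 614
altitude_m      = 20_000   # assumed ER-2 cruising altitude

spectral_res = (high_nm - low_nm) / n_channels
swath_m      = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
cross_gsd    = swath_m / n_samples

print(f"spectral resolution ~ {spectral_res:.1f} nm")   # ~9.5 nm, i.e. ~10 nm
print(f"swath width         ~ {swath_m / 1000:.1f} km") # ~10.7 km
print(f"pixel size          ~ {cross_gsd:.0f} m")       # ~17 m, i.e. ~20 m
```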

This is a picture of AVIRIS:

 

Drawing showing the general appearance and design of AVIRIS.

And this is an example of an AVIRIS spectral curve for a single pixel:

Example of an AVIRIS spectral curve for a single pixel.

The stacking under the pixel at the top is meant to denote the 224 10-nm channels whose values were plotted to produce this curve; abscissa increments = 0.15 µm.

Another way to visualize the relationship between an image (developed in the usual color-compositing manner, but using just three 10-nm data values at different wavelengths) and the spectral variations over the sampled interval is to depict the data as a Hyperspectral Cube:

A Hyperspectral Cube display of a scene; three narrow bands have been used to create the image; the front and right side show a generalized color representation of variations in reflectance for these bands along a single row (front) and column (right).

Here, the front face of the cube is a color image made from the reflectances associated with three narrow spectral bands in the visible region. On the top and right front sides are the external edges of the thin planes representing each narrow band. The top corresponds to the low end of the spectrum and the bottom to the high end. The reflectances for those pixels located along the top and right lines of the cube have been color-coded to indicate various intensity ranges: black through purple and blue are assigned to low reflectances; yellow through red and then white denote high reflectances.
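A front face like this can be reproduced by stacking three narrow bands as the red, green, and blue display channels. A minimal sketch follows; the band indices and the contrast stretch are illustrative choices, not those used for the figure:

```python
import numpy as np

cube = np.random.rand(100, 614, 224)   # stand-in cube: lines x samples x bands

def stretch(band):
    """Linearly stretch one band to the 0-1 display range."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo)

# Hypothetical picks for three narrow visible bands (red, green, blue).
rgb = np.dstack([stretch(cube[:, :, b]) for b in (29, 19, 9)])
```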

Here is another hyperspectral cube, with a twist on the information presented:

Another hyperspectral cube; see text below.

At the top is the spectral curve for a single pixel. On the side are four more spectral curves, derived from the cube data for pixels in which the stated class name dominates. The top pixel is a mixed pixel, composed of weighted contributions from each of the classes present in the ground area corresponding to the pixel.
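This weighted-contribution idea is commonly formalized as the linear mixing model: the observed spectrum of a mixed pixel is the abundance-weighted sum of the spectra of its constituent classes (endmembers). A minimal sketch, with made-up class names and random stand-in spectra:

```python
import numpy as np

n_bands = 224

# Hypothetical endmember spectra for the classes present in the pixel.
endmembers = {
    "water":      np.random.rand(n_bands),
    "vegetation": np.random.rand(n_bands),
    "soil":       np.random.rand(n_bands),
}
abundances = {"water": 0.2, "vegetation": 0.5, "soil": 0.3}   # sum to 1

# Mixed-pixel spectrum = weighted sum of the class spectra.
mixed = sum(abundances[name] * spectrum
            for name, spectrum in endmembers.items())
```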

We will show some selected AVIRIS results on the next page. Since AVIRIS, many other imaging spectrometers have been constructed and put into service. A list of most of these is available in tabular form.

One of these is HYDICE, developed by the Navy for aerial use. It has 210 channels, each with a spectral resolution of about 10 nm, extending from 413 to 2,504 nm. It uses a prism as the spectral-dispersion device. The spatial dimension is defined by a row of 320 pixels. When flown at low altitudes, HYDICE yields images with resolutions approaching one meter.

Another instrument, developed in Europe, is DAIS, the Digital Airborne Imaging Spectrometer, which uses a diffraction grating to obtain spectral coverage between 450 and 2,500 nm. Its 72 channels collect radiation in three spectral intervals: 400-1,200 nm (bandwidth 15-30 nm); 1,500-1,800 nm (bandwidth 45 nm); and 2,000-2,500 nm (bandwidth 20 nm). The gaps coincide with atmospheric absorption bands. In addition, as separate detectors sharing the same optics, it has a single broad-band channel between 3,000 and 5,000 nm and six channels that operate within parts of the 8,000-14,000 nm interval. These bands provide important information on thermal emissions from ground objects. Flown low, it provides one-meter-resolution imagery for strips a few kilometers wide.

For most systems, a diffraction grating or a prism accomplishes the dispersion, but other techniques for separating the spectrum include interference filters, acousto-optical filters, liquid crystal tunable filters, Michelson interferometers, Fourier Transform interferometers, and multi-order etalons. Chips made of mercury-cadmium-telluride (MCT) or platinum silicide are sensitive to certain usable wavelength intervals.

Hyperspectral sensors are beginning to make their way into space on unmanned satellites, part of the trend toward operating these powerful imagers, which can yield much more information than those now on Landsat, SPOT, and others. As introduced in the Overview (see page I-2), EO-1, launched in November 2000, carried both a standard 9-band sensor (ALI) and a 220-band hyperspectral sensor (Hyperion). We show a Hyperion flight strip across San Jose, CA, below on the left, and on the right the larger ALI image that includes the strip area (on its left):

Hyperion image of a strip that runs north through San Jose, CA.

ALI image that covers part of San Jose, the southern San Francisco Bay, and a segment of the East Bay.

The Canadian Centre for Remote Sensing has developed an airborne hyperspectral instrument called Probe-1. Here is the rock-unit map produced from several of its bands for an area in northern (Arctic) Canada:

Rock unit discrimination using CCRS's Probe-1; courtesy Noranda, Inc.

In 2001, ESA launched its experimental Proba satellite, which includes CHRIS (the Compact High Resolution Imaging Spectrometer); it images 18 by 18 km targets at 18-meter resolution. CHRIS can use its CCD sensors to produce 200 narrow bands. An example of a CHRIS image, with bands chosen to produce a false-color composite, is this view of the Ardennes in Belgium:

CHRIS hyperspectral image of the Ardennes area in Belgium.

More information on Hyperion and Proba is found in the Introduction, page 24.



Primary Author: Nicholas M. Short, Sr.