SP-424 The Voyage of Mariner 10

 

Appendix B

Processing the TV Images

 

[177] Data returned from the spacecraft are recorded on the ground in what are termed data records. The original data record (ODR) is a local-computer-processed record on magnetic tape of the data as received at the ground station. This record can be replayed from the station only after the pass has been completed, i.e., when the station has ended its communication with the spacecraft.

A system data record (SDR) is made at the Mission Control and Computing Center from the data transmitted in real-time by the receiving stations over ground data links to the Jet Propulsion Laboratory. This record always contains more errors than the original data record; but whereas the original data record takes time to transport physically to Pasadena, the system data record is available in real-time. The production of these records is shown in Fig. B-1. Also shown on the figure is an analog record made at each Deep Space Network station, recording the received signals in their raw form before any computer processing takes place.

An experimenter data record (EDR) was later produced from a merging of the original and system data records to eliminate some erroneous data. From this experimenter data record the science data were supplied to experimenters, including the television experimenters.
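
To make the merging idea concrete, the hedged sketch below (in Python, not part of the original mission software) treats each record as an array of sample values with a flag marking known transmission errors and prefers whichever record has a good value; the record structure and flags are assumptions for illustration only.

    # Illustrative sketch only: a much-simplified merge of a real-time system data
    # record (SDR) with the station's original data record (ODR) to form an
    # experimenter data record (EDR). The real merge operated on full telemetry
    # frames with sync and quality information; here each record is just an array
    # of 8-bit sample values plus a boolean mask marking known transmission errors.
    import numpy as np

    def merge_records(sdr_values, sdr_bad, odr_values, odr_bad):
        """Prefer a good value from either record; keep the SDR value if both are bad."""
        edr = np.where(sdr_bad & ~odr_bad, odr_values, sdr_values)
        still_bad = sdr_bad & odr_bad          # neither record recovered this sample
        return edr, still_bad

    # Tiny example: the SDR lost two samples that the ODR preserved.
    sdr = np.array([120, 0, 130, 255, 140], dtype=np.uint8)
    sdr_bad = np.array([False, True, False, True, False])
    odr = np.array([120, 125, 130, 135, 140], dtype=np.uint8)
    odr_bad = np.zeros(5, dtype=bool)
    print(merge_records(sdr, sdr_bad, odr, odr_bad)[0])    # -> [120 125 130 135 140]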

The experimenter data record tapes were physically transferred to the Video Analysis Facility at the Laboratory, where a library of tapes is maintained. These tapes can be processed into photographic copies in the Image Processing Laboratory on a variety of machines that convert the digital information on the tapes into small gray squares (pixels) on a photoemulsion. These squares of different shades of gray, arranged side by side in adjacent lines, build up the complete picture, just as photographs reproduced in a newspaper are built up of many small black dots of different sizes. Because there are so many of these pixels, their shades of gray are fused by the eye into a continuous, smooth-looking picture (Fig. B-2).
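
The reconstruction step can be pictured with the short sketch below: it lays a stream of 8-bit brightness values out line by line and writes them as a simple grayscale file. The frame dimensions and file format are illustrative choices, not the actual Mariner 10 format or the equipment the Laboratory used.

    # Minimal sketch of the reconstruction idea: each transmitted sample is an
    # 8-bit binary number (0-255) giving the brightness of one pixel; laying the
    # samples out line by line rebuilds the picture.
    import numpy as np

    lines, samples = 100, 120                       # placeholder frame size
    dn = np.random.randint(0, 256, size=lines * samples, dtype=np.uint8)
    frame = dn.reshape(lines, samples)              # samples side by side, line by line

    # Write a plain PGM file so any image viewer shows the gray squares.
    with open("frame.pgm", "wb") as f:
        f.write(f"P5 {samples} {lines} 255\n".encode())
        f.write(frame.tobytes())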

As each picture became available from Mercury or Venus it was displayed in a raw version on screens in the Mission Control and Computing Center. Because of the high data rate associated with real-time transmission of the pictures, not every picture could be displayed in real-time. However, those that were shown enabled the experimenters to gain a good idea of the quality of the imaging and to make sure that the imaging sequence was proceeding according to the plan laid down before the encounter.

A Digifax system allowed close-to-real-time (30-min delay) production of hard-copy pictures of the TV images that were displayed on the real-time screens. An improved photoimage of each TV frame was available some time later through the mission test computer. This was known as the MTC version of each picture. Later still an even more refined version was available, known as the IPL version, since it was produced by the Image Processing Laboratory. Figure B-3 compares these three versions of a single picture of Mercury.

The MTC photoimages were sent, in both raw and detail-enhanced filtered versions, to the National Space Science Data Center at NASA-Goddard Space Flight Center, where scientists worldwide can obtain access to them. Versions of all pictures on 70-mm film were also sent to the Science Data Center.

The most sophisticated pictures result from the operations of the Image Processing Laboratory. Here the analysts can change contrasts, rectify images, correct bit errors, and accentuate details in the terminator regions. Figure B-4 compares an A-camera Mariner 10 image of Mercury, as produced from the experimenter data record at the Image Processing Laboratory, with the same image after correction in the Laboratory by what is termed a convolutional filter, which compensates for modulation characteristics in the electronics of the spacecraft's A-camera. This filtering considerably sharpened the features shown [178] on the image. Figure B-5 shows a similar pair of pictures processed in this way for a B-camera image. Note particularly the increased fine detail in the floor of the large crater.
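
As an illustration of the principle only, the sketch below applies a small sharpening kernel by direct two-dimensional convolution. The actual restoration kernel was derived from the camera's measured characteristics; the 3 x 3 kernel here is a generic stand-in that boosts the high spatial frequencies the vidicon chain attenuated.

    # Sketch of the idea behind the convolutional (modulation-compensation) filter.
    # The kernel below is symmetric, so correlation and convolution coincide.
    import numpy as np

    def convolve2d(image, kernel):
        """Direct 2-D convolution with edge replication (no library dependence)."""
        kh, kw = kernel.shape
        pad_y, pad_x = kh // 2, kw // 2
        padded = np.pad(image.astype(float), ((pad_y, pad_y), (pad_x, pad_x)), mode="edge")
        out = np.zeros_like(image, dtype=float)
        for dy in range(kh):
            for dx in range(kw):
                out += kernel[dy, dx] * padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
        return out

    sharpen = np.array([[ 0, -1,  0],
                        [-1,  5, -1],
                        [ 0, -1,  0]], dtype=float)   # illustrative kernel only

    image = np.random.randint(0, 256, size=(64, 64)).astype(float)
    restored = np.clip(convolve2d(image, sharpen), 0, 255)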

Figure B-6 illustrates how the Image Processing Laboratory can clean up errors in real-time pictures. The first picture (a) shows a Mercury I encounter picture as received at Canberra in real-time with a bit error rate of one error in 33 bits of data. This same picture is shown (b) after the computer at the Image Processing Laboratory detected and replaced 57,000 pixels that were in error. It did this by averaging the correct neighboring pixels that surrounded each pixel in error. However, out-of-tolerance departures were processed differently: one of the five most significant bits was reset to bring the pixel value closest to the neighbor average. This is a "smart" despiker algorithm in that it does not simply replace an erroneous pixel with the neighbor average.
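
The following sketch illustrates this despiking logic under stated assumptions: a pixel is flagged when it differs from the average of its four neighbors by more than an assumed tolerance (the real tolerance is not given in the text), and the repair tries setting or clearing each of the five most significant bits, keeping the single-bit change that lands closest to the neighbor average.

    # Hedged sketch of the "smart" despiker described above; tolerance and
    # bit-selection details are assumptions, not the actual IPL algorithm.
    import numpy as np

    TOLERANCE = 40                                   # assumed, not from the source

    def despike(frame):
        out = frame.astype(np.int32).copy()
        padded = np.pad(out, 1, mode="edge")
        neighbor_avg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                        padded[1:-1, :-2] + padded[1:-1, 2:]) // 4
        bad = np.abs(out - neighbor_avg) > TOLERANCE
        for y, x in zip(*np.nonzero(bad)):
            target, value = neighbor_avg[y, x], out[y, x]
            candidates = []
            for bit in range(3, 8):                  # the five most significant bits of 8
                candidates.append(value | (1 << bit))    # bit set
                candidates.append(value & ~(1 << bit))   # bit cleared
            out[y, x] = min(candidates, key=lambda v: abs(v - target))
        return out.astype(np.uint8), bad

    frame = np.full((32, 32), 128, dtype=np.uint8)
    frame[10, 10] = 3                                # a bit-error "spike"
    cleaned, flagged = despike(frame)
    print(cleaned[10, 10], int(flagged.sum()))       # repaired value, one pixel flagged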

The next two pictures, (c) and (d), show the same camera frame as received in real-time at Goldstone with an even greater error rate of 1 bit in 14, and as corrected by the removal and replacement of 128,000 pixels. Because of the higher error rate, the quality is not as good as the corrected picture from the Canberra station. Finally, a virtually error-free version of the picture that was recorded on the tape recorder of the spacecraft and later played back at a slow data rate is shown for comparison (e). While this picture is obviously much better than the corrected high-error-rate image from Goldstone, it is not noticeably different from the corrected lower-error-rate Canberra picture. Indeed, a great deal of work went into a hard examination of the maximum allowable bit error rate so as to be able to use 117.6 kbits/s; the previous Mariner video threshold was only 5 kbits/s.

The next series of pictures (Fig. B-7) shows examples of another process of image improvement used by the Image Processing Laboratory. The Mercury I real-time image of Mercury as received by Canberra (a) has typical pixel errors, shown as dark spots all over the picture. The picture was processed (b) to remove these errors and also to correct for some photometric distortion in the spacecraft camera. Next (c) a two-dimensional, high-pass filter was used to retain 25% of the low-frequency brightness components, and the resultant image had its contrast increased about 2 times. Visually, the picture is more pleasing than the previous pictures and allows a better interpretation of the surface features.
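
A minimal sketch of that filtering step follows, assuming that retaining 25% of the low-frequency components means subtracting 75% of a local mean (approximated here by a simple box average) and that the contrast increase is a linear stretch of about 2 about the image mean; the window size and clipping are illustrative choices.

    # Sketch of a two-dimensional high-pass filter followed by a contrast stretch.
    import numpy as np

    def box_lowpass(image, half):
        """Local mean over a (2*half+1)^2 window, edges replicated."""
        padded = np.pad(image.astype(float), half, mode="edge")
        out = np.zeros_like(image, dtype=float)
        size = 2 * half + 1
        for dy in range(size):
            for dx in range(size):
                out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
        return out / (size * size)

    def highpass_enhance(image, half=15, low_fraction=0.25, gain=2.0):
        low = box_lowpass(image, half)
        filtered = image - (1.0 - low_fraction) * low          # keep 25% of the lows
        stretched = gain * (filtered - filtered.mean()) + 128.0 # ~2x contrast stretch
        return np.clip(stretched, 0, 255).astype(np.uint8)

    frame = np.random.randint(0, 256, size=(128, 128)).astype(float)
    enhanced = highpass_enhance(frame)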

The picture was then further enhanced (d) by correcting for the modulation characteristics of the camera's electronic circuits. Finally, because the spacecraft was looking at the planet at an angle, and the individual picture frames have to be assembled into large-scale mosaics of the planet's surface, the projection of the picture has to be changed. This, too, is done by the Image Processing Laboratory. The final picture in the series (e) shows the result of orthographic projection correction to the image frame.
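
The sketch below shows only the forward orthographic projection formula that such a reprojection relies on: it maps a latitude and longitude on a spherical planet to plane coordinates as seen from a chosen sub-observer point. The full correction also required the spacecraft trajectory and camera pointing data, which are omitted here.

    # Greatly simplified orthographic projection used when resampling a frame
    # into map geometry; angles in degrees, planet treated as a unit sphere.
    import numpy as np

    def orthographic(lat, lon, lat0, lon0, radius=1.0):
        """Project (lat, lon) onto the plane tangent at the sub-observer point (lat0, lon0)."""
        lat, lon, lat0, lon0 = np.radians([lat, lon, lat0, lon0])
        x = radius * np.cos(lat) * np.sin(lon - lon0)
        y = radius * (np.cos(lat0) * np.sin(lat) -
                      np.sin(lat0) * np.cos(lat) * np.cos(lon - lon0))
        visible = (np.sin(lat0) * np.sin(lat) +
                   np.cos(lat0) * np.cos(lat) * np.cos(lon - lon0)) > 0
        return x, y, visible

    print(orthographic(30.0, 190.0, 0.0, 180.0))   # a point 30 N, 10 deg east of center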

On an airless planet such as Mercury, the amount of light reflected from the surface close to the terminator boundary between light and darkness is very much less than from the rest of the visible disc. Thus details of the images in the terminator regions are difficult to see. When a mosaic is made up of frames that are processed with normal contrast through a conventional high-pass filter, there is relatively poor contrast near the terminator, as illustrated in this Mercury I picture of the Caloris Basin (Fig. B-8). By putting these TV images through a spatially dependent filtering operation on the computer, the contrast and visibility of the features near the terminator are considerably improved (Fig. B-9).
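
One simple way to express a spatially dependent filter of this general kind, sketched below with illustrative parameters, is to scale the contrast gain by the inverse of the local mean brightness, so that the dim terminator region receives a larger boost than the brightly lit disc; this is not necessarily the exact operation the Laboratory used.

    # Sketch of a spatially dependent contrast boost; block size and gain limit
    # are illustrative, and the frame size is assumed to be a multiple of the block.
    import numpy as np

    def block_mean(image, block):
        """Coarse local-mean estimate: average over block x block tiles, then upsample."""
        h, w = image.shape
        tiles = image.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
        return np.repeat(np.repeat(tiles, block, axis=0), block, axis=1)

    def terminator_enhance(image, block=16, max_gain=6.0):
        local_mean = block_mean(image.astype(float), block)
        gain = np.clip(128.0 / np.maximum(local_mean, 1.0), 1.0, max_gain)
        detail = image - local_mean
        return np.clip(local_mean + gain * detail, 0, 255).astype(np.uint8)

    frame = np.random.randint(0, 256, size=(128, 128)).astype(float)
    out = terminator_enhance(frame)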

The Image Processing Laboratory can also operate on pictures of a cloud-covered planet such as Venus to enhance the details of cloud structure. The series of images in Fig. B-10 shows first the uncorrected raw image of Venus (a), followed by an image on which the spatial variation and nonlinearity in response of the spacecraft camera have been removed (b). The contrast was also increased 1.5 times on this photograph. Next (c) the contrast was increased still further, to 3.5 times, making additional features of the clouds stand out. Then a two-dimensional, high-pass filter was applied to the pixels for about half of the planet, thereby removing global shading and making a more even illumination (d). The contrast on this image was increased 4 times. The next image (e) shows an even further increase in contrast, to 8 times. Finally, a smaller high-pass filter was applied to emphasize the details of small-scale clouds in the atmosphere at two contrasts (f) and (g).
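
The first of those corrections, removal of the camera's spatial response variation and nonlinearity, can be pictured as a flat-field division plus a lookup-table linearization, as in the hedged sketch below; the flat field and lookup table shown are placeholders standing in for the preflight calibration data.

    # Sketch of a photometric (shading and nonlinearity) correction with made-up
    # calibration data; in practice both came from preflight vidicon calibration.
    import numpy as np

    def photometric_correct(raw, flat_field, linearity_lut):
        linearized = linearity_lut[raw]              # undo response nonlinearity
        corrected = linearized / flat_field          # remove spatial (shading) variation
        return np.clip(corrected, 0, 255).astype(np.uint8)

    raw = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
    flat_field = np.ones((64, 64)) * 0.9                        # placeholder shading pattern
    linearity_lut = 255.0 * (np.arange(256) / 255.0) ** 0.95    # placeholder response curve
    corrected = photometric_correct(raw, flat_field, linearity_lut)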

For Mariner 10 pictures to be used in mapping Mercury, the precise locations of craters and other objects are needed. The coordinates of control points for cartography are measured by counting pixels on versions of the photographs made especially for this purpose. Before launch, the coordinates of 111 positions on the vidicon tube of each camera were measured to high precision. Reseau marks appear on each image to relate it to the vidicon tube coordinates. A computer control program allows the pixel measurements of control points on an image frame to be related to image coordinates at the vidicon faceplate at the time the picture was taken. Since the position of the spacecraft relative to the planet is known at this same time, the precise location of the control points on the surface of Mercury can be determined relative to a latitude and longitude system on the planet.
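
The step from measured pixel positions to faceplate coordinates can be illustrated by fitting a transform between the reseau marks measured on an image and their known faceplate positions, as in the sketch below. The affine model, the fabricated reseau coordinates, and the millimeter scale are assumptions for illustration; the real calibration used 111 measured positions per camera, and the final conversion to latitude and longitude, which also needs the spacecraft position and pointing, is not shown.

    # Sketch: least-squares fit of an affine transform from measured reseau pixel
    # positions to known faceplate coordinates, then applied to a control point.
    import numpy as np

    def fit_affine(pixel_xy, faceplate_xy):
        """Fit faceplate = [x, y, 1] @ coeffs for each reseau mark, by least squares."""
        ones = np.ones((len(pixel_xy), 1))
        design = np.hstack([pixel_xy, ones])                  # shape (n, 3)
        coeffs, *_ = np.linalg.lstsq(design, faceplate_xy, rcond=None)
        return coeffs                                         # shape (3, 2)

    def to_faceplate(pixel_xy, coeffs):
        ones = np.ones((len(pixel_xy), 1))
        return np.hstack([pixel_xy, ones]) @ coeffs

    # Fabricated reseau marks: a simple scale and offset for illustration.
    reseau_pixels = np.array([[100, 100], [700, 100], [100, 600], [700, 600]], dtype=float)
    reseau_faceplate = reseau_pixels * 0.015 + np.array([0.2, -0.1])   # mm, placeholder
    coeffs = fit_affine(reseau_pixels, reseau_faceplate)
    print(to_faceplate(np.array([[400.0, 350.0]]), coeffs))   # control point in faceplate mm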

To aid in counting pixels, some photographs are reproduced (Fig. B-11) with an orthogonal-type grid. Black and white dashes show every 25 pixels. Pixel measurements can be made to within one-tenth of a pixel, and by averaging the counts made by several people for the same control point, a suitably accurate measurement is obtained.
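
As a small worked illustration with made-up readings, the sketch below averages several analysts' measurements of the same control point and reports the standard error of the mean, which shrinks as more independent readings are combined.

    # Averaging hypothetical pixel counts of one control point by several analysts.
    import numpy as np

    readings = np.array([412.3, 412.1, 412.4, 412.2, 412.3])   # hypothetical counts
    adopted = readings.mean()
    std_error = readings.std(ddof=1) / np.sqrt(len(readings))
    print(f"adopted = {adopted:.2f} pixels, standard error = {std_error:.3f}")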

 


[179]

Fig. B-1. Several types of data records were produced for the Mariner Venus/Mercury mission.


 


[180] Fig. B-2. The small area circled on the picture of Mercury (Caloris Basin) is enlarged to show the individual picture elements (pixels) that make up the picture. Each of these elements is transmitted from the spacecraft as a binary number which the ground computer processes through phototerminals into the reconstructed picture to duplicate the picture originally recorded on the vidicon aboard the spacecraft.



[181] Fig. B-3. Three versions of a single picture of Mercury are compared: (a) the Digifax version, (b) the improved MTC version, and (c) the IPL version.



[182-183] Fig. B-4. Comparison of images produced (a) from the experimenter data record and (b) from compensated data that correct for modulation characteristics of the electronics of the A-camera from which the image data were received.



[184-185] Fig. B-5. A similar pair of images from the B-camera are shown improved in the same way by convolutional filtering.



[186-190] Fig. B-6. This series of pictures shows how the Image Processing Laboratory cleans up errors in real-time pictures: (a) a real-time picture with a bit error rate of 1 in 33; (b) the same picture after cleanup of 57,000 pixels in error; (c) the same frame received with an error rate of 1 in 14, and (d) after cleanup of 128,000 pixels in error; (e) a virtually error-free picture received later by tape playback from the spacecraft.



[191-195] Fig. B-7. This series shows another method of removing errors from real-time pictures: (a) a picture received from Canberra with typical pixel errors showing as black dots; (b) the picture processed to remove the errors and to correct for some distortion in the camera; (c) the effect of a high-pass filter and a twofold increase in contrast; (d) a further correction to eliminate a camera electronic distortion; (e) the corrected picture reprojected for assembly into a large-scale mosaic of the type shown in Appendix A.




[196] Fig. B-8. This mosaic has been assembled from frames that were processed with normal contrast. The result is loss of detail in the terminator region.



[197] Fig. B-9. This same region is here reproduced from frames that were processed through spatially dependent filtering to improve the visibility of features in the region of the terminator.



[198-204] Fig. B-10. This series of images shows how cloud structure can be enhanced by computer processing: (a) the uncorrected raw image of Venus; (b) the errors introduced by the camera of the spacecraft have been taken out and the contrast of the image has been increased 1.5 times; (c) the contrast is further increased 3.5 times; (d) global shading has been removed to present more even illumination over the whole of the planet, and the contrast has also been further increased; (e) a further step in increasing contrast; (f) and (g) images processed through high-pass filters to emphasize small-size cloud details.



[205] Fig. B-11. Special versions of pictures were also made on which black and white dots are introduced every 25 pixels in the form of a grid on the picture as shown here. By counting pixels on pictures such as this, the coordinates of control points on Mercury's surface were established as part of the map-making process.


