Digital Image Processing of Remote-Sensed Data
Digital Image Enhancement
Enhancement is the modification of an image to alter its impact on the viewer, improving the visual appearance and quality of the image for better interpretation and analysis. Enhancement is attempted only after the image has been corrected for geometric and radiometric distortions, and enhancement methods are applied separately to each band of a multispectral image. Enhancement can be used to highlight or emphasize particular features, contrast, or details in the image. Some of the common methods for image enhancement include: (1) contrast stretching, (2) intensity, hue, and saturation transformations, (3) density slicing, and (4) spatial filtering.
Contrast Stretching
Contrast stretching is the process of adjusting the brightness and darkness levels of an image to enhance the differences between pixel values: the density values in a scene are pulled farther apart, that is, expanded over a greater range.
Linear Contrast Stretch
This is the simplest contrast-stretch algorithm. The grey values in the original image and the modified image follow a linear relation. A density number in the low range of the original histogram is assigned to black, a value at the high end is assigned to white, and the intermediate values are interpolated linearly between them.
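A linear stretch can be sketched as follows in Python with NumPy; the function name, output range, and sample band values are illustrative, not from the text:

```python
import numpy as np

def linear_stretch(band, out_min=0, out_max=255):
    """Linearly map the band's min..max range onto out_min..out_max."""
    band = band.astype(np.float64)
    lo, hi = band.min(), band.max()
    if hi == lo:                      # flat band: nothing to stretch
        return np.full(band.shape, out_min, dtype=np.uint8)
    stretched = (band - lo) / (hi - lo) * (out_max - out_min) + out_min
    return stretched.round().astype(np.uint8)

# A narrow-range band (values 60..120) is spread over the full 0..255 range.
band = np.array([[60, 80], [100, 120]])
print(linear_stretch(band))   # -> [[0, 85], [170, 255]]
```

Because the mapping is linear, equal steps in the input (here, 20 grey levels) become equal, larger steps in the output (85 grey levels).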
Non-Linear Contrast Stretch
In these methods, the input and output data values follow a non-linear transformation. The general form of non-linear contrast enhancement is y = f(x), where x is the input data value, y is the output data value, and f is a non-linear function such as a logarithmic or power-law curve.
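As one concrete choice of f, a power-law (gamma) stretch can be sketched as below; the function name and the gamma value are illustrative assumptions, not from the text:

```python
import numpy as np

def gamma_stretch(band, gamma=0.5):
    """Non-linear stretch y = 255 * (x/255)**gamma.
    gamma < 1 brightens dark tones; gamma > 1 darkens them."""
    x = band.astype(np.float64) / 255.0
    return (255.0 * x ** gamma).round().astype(np.uint8)

band = np.array([0, 64, 128, 255])
print(gamma_stretch(band))   # -> [  0 128 181 255]
```

Unlike the linear stretch, equal input steps no longer map to equal output steps: the low end of the histogram is expanded more than the high end.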
Intensity, Hue, and Saturation Transformations
The additive system of primary colors (RGB) is well established. An alternate approach to color is the intensity, hue, and saturation system (IHS), which is useful because it presents colors more nearly as the human observer perceives them. The IHS system is based on the color sphere in which the vertical axis represents intensity, the radius is saturation, and the circumference is hue. The intensity (I) axis represents brightness variations and ranges from black (0) to white (255); no color is associated with this axis.
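Exact IHS formulations vary between authors; as a rough illustration, the standard-library HSV model can stand in for the transform, with intensity as the value axis, hue in degrees around the circumference, and saturation as the radius. The function name and scaling choices here are assumptions:

```python
import colorsys

def rgb_to_ihs(r, g, b):
    """Approximate IHS via the stdlib HSV model (inputs 0..255).
    Returns intensity 0..255, hue 0..360 degrees, saturation 0..1."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return v * 255.0, h * 360.0, s

# Pure red: full intensity, hue at 0 degrees, full saturation.
print(rgb_to_ihs(255, 0, 0))   # -> (255.0, 0.0, 1.0)
```

A grey pixel illustrates the point made in the text that no color is associated with the intensity axis: its saturation is zero, so its hue is undefined (conventionally 0).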
Density Slicing
Digital images have high radiometric resolution; images in some wavelength bands contain 128 distinct grey levels, but a human interpreter can reliably detect and consistently differentiate only between 15 and 25 shades of grey. The human eye is, however, far more sensitive to color than to shades between black and white. Density slicing exploits this by converting the continuous grey tones of an image into a series of density intervals, or slices, each corresponding to a specified range of digital numbers, and displaying each slice in a separate color.
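Density slicing amounts to binning each pixel's grey level into one of a few intervals, each of which can then be displayed in its own color. A minimal sketch with NumPy, where the boundary values are arbitrary illustrations:

```python
import numpy as np

def density_slice(band, bounds):
    """Assign each pixel a slice index from its grey level.
    bounds lists the boundaries between consecutive slices,
    so n boundaries yield n + 1 slices (colors)."""
    return np.digitize(band, bounds)

band = np.array([[10, 60], [130, 250]])
# Three boundaries -> four slices, e.g. for display in four colors.
print(density_slice(band, bounds=[50, 100, 200]))   # -> [[0 1] [2 3]]
```

Each resulting slice index would then be looked up in a color table, trading the many indistinguishable grey shades for a few easily separated colors.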
Spatial Filtering
Spatial filtering encompasses another set of digital processing functions used to enhance the appearance of an image. Spatial filters highlight or suppress specific features based on their spatial frequency: the image is separated into its constituent spatial frequencies, and selected frequencies are altered to emphasize some image features. This technique increases the analyst's ability to discriminate detail. The three types of spatial filters used in remote sensor data processing are low-pass filters, high-pass filters, and directional or edge-detection filters.
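In practice these filters are applied as kernel convolutions over the image. The sketch below, with illustrative kernel weights and a deliberately naive border treatment, shows a low-pass (mean) kernel and one common high-pass kernel:

```python
import numpy as np

def convolve2d(image, kernel):
    """Naive 3x3 convolution; edge pixels are left unchanged (a
    simplification -- real software pads or mirrors the borders)."""
    out = image.astype(np.float64).copy()
    k = np.asarray(kernel, dtype=np.float64)
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            window = image[i - 1:i + 2, j - 1:j + 2]
            out[i, j] = np.sum(window * k)
    return out

low_pass = np.full((3, 3), 1 / 9)        # mean filter: smooths detail
high_pass = np.array([[-1, -1, -1],
                      [-1,  9, -1],
                      [-1, -1, -1]])      # emphasizes edges and fine detail

img = np.zeros((5, 5))
img[2, 2] = 9.0
print(convolve2d(img, low_pass))   # the central spike is smeared over its neighbours
```

A low-pass filter suppresses high spatial frequencies (fine detail, noise), while a high-pass filter suppresses the low frequencies and leaves edges; both leave a perfectly uniform image unchanged, since the low-pass weights sum to 1 and the high-pass weights sum to 1 as well.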