Like across-track scanners, along-track scanners often also use a dispersing element to split the incoming beam of electromagnetic energy into distinct portions of the electromagnetic spectrum, enabling the collection of multispectral imagery. Developed about 30 years ago, along-track scanners are a more recent technology than across-track scanners. Many multispectral satellite systems (e.g., WorldView-3, Landsat 8) rely on along-track sensors, as do the Leica Airborne Digital Sensors.

      Active sensors send out their own pulses of electromagnetic energy, and the sensor measures the echoes or returns of the energy as they are reflected by objects in the path of the pulse. For example, consumer cameras with flash attachments are active systems. Active remote sensors include lidar (light detection and ranging) systems, which generate laser pulses and sense electromagnetic energy in the ultraviolet to near-infrared regions of the spectrum, and radar (radio detection and ranging) systems, which generate and sense energy in the microwave range. An advantage of active systems is that they do not rely on the sun, so acquisitions can be made at times when the sun angle is low or at night. An additional advantage of radar systems is that the long wavelengths of microwaves can penetrate clouds, haze, and even light rain.
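      To make the pulse-and-echo idea concrete, the sketch below (an illustration, not from the text) shows the basic arithmetic every active ranging sensor performs: the round-trip travel time of a pulse, multiplied by the speed of light and halved, gives the distance to the target.

```python
# Illustrative sketch: how an active sensor such as lidar or radar
# converts a pulse's round-trip travel time into a range.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to the target, given the round-trip time of the pulse."""
    # The pulse covers the sensor-to-target distance twice (out and back),
    # so the one-way distance is half the total path length.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a lidar return arriving 6.67 microseconds after emission
# implies a target roughly 1,000 m from the sensor.
print(f"{range_from_echo(6.67e-6):.1f} m")  # ~999.8 m
```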

       Wavelengths Sensed

      Passive Sensors

      Most images are collected by panchromatic or multispectral passive sensors that sense electromagnetic energy in the visible through infrared portions of the electromagnetic spectrum. To separate different optical and midinfrared wavelengths from one another, passive remote sensors place filters or dispersing elements between the aperture and the imaging surface, splitting the incoming energy into distinct wavelengths, or “bands,” of the electromagnetic spectrum. Filters are usually used with framing cameras and include the following approaches:

       Employing a Bayer filter over the digital array, which restricts each pixel to one portion of the electromagnetic spectrum but alternates pixels in the array to collect at different wavelengths. The computer then interpolates the values of the unsensed bands from the surrounding pixels, estimating a full set of band values at each pixel. This is how consumer cameras and many high-resolution small satellite constellations (e.g., Planet Doves) collect multispectral imagery.

       Placing separate filters on multiple cameras, each accepting energy from a distinct portion of the electromagnetic spectrum, which allows each focal plane to be optimized for that portion of the spectrum. Many four-band (red, green, blue, and infrared) airborne image sensors (e.g., the Microsoft UltraCam and Leica DMC sensors) use this approach, which requires that the images simultaneously captured by the separate cameras be coregistered to one another after capture.

       Placing a spinning filter wheel in front of one camera so that each exposure of the imaging surface captures one portion of the electromagnetic spectrum. This approach is very useful for fixed platforms; however, it requires very complex postcollection registration for moving platforms and is rarely used in remote sensing systems.

      Alternatively, a dispersing/splitting element can be placed between the lens and a series of CCD arrays to split the incoming energy into its discrete portions of the electromagnetic spectrum. Many multispectral and most hyperspectral sensors employ dispersing/splitting elements (e.g., Leica Airborne Digital Sensors, NASA AVIRIS).

      Figures 3.6 to 3.8 illustrate how Bayer filters, framing cameras, and dispersing elements are typically used to create multispectral images. In general, because two of the three band values at each pixel are interpolated rather than directly sensed, Bayer filters will always have lower spectral resolution than multiheaded frame cameras or systems using dispersing elements.
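      As an illustration of the interpolation step described above, the following sketch (assuming an RGGB Bayer pattern; real cameras use more sophisticated, edge-aware demosaicing) fills in the two unsensed band values at each pixel by averaging the sensed neighbors:

```python
import numpy as np

def _box3(a: np.ndarray) -> np.ndarray:
    """Sum over each pixel's 3x3 neighborhood (zero-padded at the edges)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bilinear_demosaic(raw: np.ndarray) -> np.ndarray:
    """Estimate a full 3-band image from a single-channel RGGB Bayer mosaic.

    At each pixel, the sensed band keeps its measured value; the two
    unsensed bands are interpolated from surrounding sensed pixels."""
    h, w = raw.shape
    rows, cols = np.indices((h, w))
    # RGGB layout: R at (even, even), G where row/column parity differs,
    # B at (odd, odd).
    masks = {
        "R": (rows % 2 == 0) & (cols % 2 == 0),
        "G": (rows % 2) != (cols % 2),
        "B": (rows % 2 == 1) & (cols % 2 == 1),
    }
    out = np.zeros((h, w, 3))
    for band, name in enumerate("RGB"):
        sensed = masks[name]
        vals = np.where(sensed, raw.astype(float), 0.0)
        # Average of sensed neighbors = neighbor sum / neighbor count.
        avg = _box3(vals) / np.maximum(_box3(sensed.astype(float)), 1e-9)
        out[..., band] = np.where(sensed, raw, avg)
    return out
```

      This neighbor averaging is exactly why Bayer imagery carries less independent spectral information per pixel than a multiheaded frame camera, whose band values are all directly sensed.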


      Figure 3.6. How a Bayer filter framing camera system works. While the figure shows a true color image, Bayer filters can also be used to collect in the near-infrared portions of the electromagnetic spectrum, resulting in infrared imagery.


      Figure 3.7. How a multilens multispectral framing camera system works


      Figure 3.8. How a push broom multispectral scanner works with a dispersing element

      Active Sensors

      The most common active remote sensors are lidar and radar systems. As mentioned earlier, all active instruments work similarly by transmitting electromagnetic energy that is bounced back to the sensor from the surface of the earth. Because active sensors generate their own energy, they can capture imagery at any time of the day or night.

      Radar imagery is often used to create digital surface and digital elevation models over large regions, and to map sea or land cover in perpetually cloudy areas where optical imagery cannot be effectively collected. Figure 3.9 shows an example of a radar image of Los Angeles, California. Radar imagery is collected over a variety of microwave bands, which are denoted by letters and measured in centimeters as follows:

       Ka: 0.75 to 1.1 cm
       K: 1.1 to 1.67 cm
       Ku: 1.67 to 2.4 cm
       X: 2.4 to 3.75 cm
       C: 3.75 to 7.5 cm
       S: 7.5 to 15 cm
       L: 15 to 30 cm
       P: 30 to 100 cm

      Usually, radar imagery is collected in just one band, resulting in a single-band image. The X, C, and L bands are the most commonly used in remote sensing. Some radar systems are able to collect imagery in several bands, resulting in multispectral radar imagery.
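      As a quick illustration (the helper function here is ours, not a standard API), the boundaries above can be captured in a small lookup that classifies a radar wavelength into its band letter:

```python
# Hedged sketch: classify a microwave wavelength (in cm) into the standard
# radar band letters, using the boundaries listed above.
RADAR_BANDS = [  # (name, min_cm, max_cm)
    ("Ka", 0.75, 1.1), ("K", 1.1, 1.67), ("Ku", 1.67, 2.4),
    ("X", 2.4, 3.75), ("C", 3.75, 7.5), ("S", 7.5, 15.0),
    ("L", 15.0, 30.0), ("P", 30.0, 100.0),
]

def radar_band(wavelength_cm: float) -> str:
    for name, lo, hi in RADAR_BANDS:
        if lo <= wavelength_cm < hi:
            return name
    raise ValueError("wavelength outside the Ka-to-P range")

print(radar_band(5.6))   # "C" -- e.g., the ~5.6 cm wavelength of Sentinel-1
print(radar_band(23.5))  # "L" -- e.g., the ~23.5 cm wavelength of ALOS PALSAR
```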

      Varying antenna lengths are required to create the radar signal at these different wavelengths. Because it is often not viable to mount a long antenna on a platform moving through the air or space, the effective length of the antenna is extended electronically through a process called synthetic aperture radar (SAR).
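      The payoff of the synthetic aperture can be seen in the standard textbook approximations: a real aperture's azimuth resolution is roughly wavelength times range divided by antenna length, while a focused SAR achieves roughly half the antenna length, independent of range. The numbers below are illustrative, not from the text:

```python
# Illustrative comparison of real-aperture vs. synthetic-aperture azimuth
# resolution, using standard textbook approximations.
wavelength = 0.056        # m: a C-band radar (~5.6 cm)
slant_range = 700_000.0   # m: a typical satellite slant range (assumed)
antenna = 10.0            # m: physical antenna length (assumed)

real_aperture_res = wavelength * slant_range / antenna  # ~3,920 m
sar_res = antenna / 2.0                                 # ~5 m
print(f"real aperture: {real_aperture_res:,.0f} m; "
      f"synthetic aperture: {sar_res:.0f} m")
```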

      Radar signals can also be transmitted and received in either horizontal or vertical polarizations or a combination of both. HH imagery is both transmitted and received in a horizontal polarization, and VV imagery is both transmitted and received in a vertical polarization (i.e., like-polarized). HV imagery is transmitted horizontally and received vertically, and VH imagery is transmitted vertically and received horizontally (i.e., cross-polarized). The different polarizations can be combined to create a multipolarized image, which is similar to a multispectral image in that each polarization collects different information about the ground.
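      A minimal sketch of such a composite (assuming three coregistered polarization channels held as numpy arrays; the rescaling choice is ours) simply stacks the channels into a three-band image, just as multispectral bands are stacked:

```python
import numpy as np

def multipol_composite(hh: np.ndarray, hv: np.ndarray,
                       vv: np.ndarray) -> np.ndarray:
    """Stack HH, HV, and VV channels into one three-band image, rescaling
    each channel to the 0-1 range so the bands display comparably."""
    bands = []
    for channel in (hh, hv, vv):
        c = channel.astype(float)
        lo, hi = c.min(), c.max()
        bands.append((c - lo) / (hi - lo + 1e-9))
    return np.stack(bands, axis=-1)  # shape: (rows, cols, 3)
```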


      Figure 3.9. An example radar image captured over Los Angeles, California (esriurl.com/IG39). Source: NASA

      Over the last 20 years in much of the world, airborne lidar has surpassed photogrammetric methods for measuring the 3-dimensional world. Lidar imagery is used to develop digital elevation models (DEMs), digital terrain models (DTMs), digital surface models (DSMs), digital height models (DHMs), elevation contours, and other derived datasets (chapter 8 provides more detail on the creation of DEMs). Additionally, NASA uses low-spatial-resolution satellite lidar to monitor ice sheet mass balance and aerosol heights and has recently initiated the Global Ecosystem Dynamics Investigation (GEDI) mission, which will result in the first global, moderate-spatial-resolution, spaceborne topographic lidar (http://science.nasa.gov/missions/gedi/).
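      As a simple example of a derived dataset, a digital height model can be computed by differencing a lidar-derived surface model and terrain model; the sketch below assumes the two are coregistered numpy arrays on the same grid:

```python
import numpy as np

def digital_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Above-ground height (buildings, tree canopy, etc.), derived by
    subtracting the bare-earth terrain model (DTM) from the surface
    model (DSM)."""
    dhm = dsm - dtm
    # Small negative differences are usually noise; clamp them to zero.
    return np.clip(dhm, 0.0, None)
```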

      Lidar
