Chapter 3
Imagery Fundamentals
Introduction
Imagery is collected by remote sensing systems managed by either public or private organizations. It is characterized by a complex set of variables, including
collection characteristics: image spectral, radiometric, and spatial resolutions, viewing angle, temporal resolution, and extent; and
organizational characteristics: image price, licensing, and accessibility.
The choice of which imagery to use in a project will be determined by matching the project’s requirements, budget, and schedule to the characteristics of available imagery. Making this choice requires understanding what factors influence image characteristics. This chapter provides the fundamentals of imagery by first introducing the components and features of remote sensing systems, and then showing how they combine to influence imagery collection characteristics. The chapter ends with a review of the organizational factors that also characterize imagery. The focus of this chapter is to provide an understanding of imagery that will allow the reader to 1) rigorously evaluate different types of imagery within the context of any geospatial application, and 2) derive the most value from the imagery chosen.
Collection Characteristics
Image collection characteristics are affected by the remote sensing system used to collect the imagery. Remote sensing systems comprise sensors that capture data about objects from a distance, and platforms that support and transport sensors. For example, humans are remote sensing systems because our bodies, which are platforms, support and transport our sensors—our eyes, ears, and noses—which detect visual, audio, and olfactory data about objects from a distance. Our brains then identify/classify this remotely sensed data into information about the objects. This section explores sensors first, and then platforms. It concludes by discussing how sensors and platforms combine to determine imagery collection characteristics.
A platform is defined by the Glossary of the Mapping Sciences (ASCE, 1994) as “A vehicle holding a sensor.” Platforms include satellites, piloted helicopters and fixed-wing aircraft, unmanned aerial systems (UASs), kites and balloons, and earth-based platforms such as traffic-light poles and boats. Sensors are defined as devices or organisms that respond to stimuli. Remote sensors reside on platforms and respond “to a stimulus without being in contact with the source of the stimulus” (ASCE, 1994). Examples of remote sensing systems include our eyes, ears, and noses; the camera in your phone; a video camera recording traffic or ATM activity; sensors on satellites; and cameras on UASs, helicopters, or airplanes.
Imagery is acquired from terrestrial, aircraft, marine, and satellite platforms equipped with either analog (film) or digital sensors that measure and record electromagnetic energy.1 Because humans rely overwhelmingly on our eyes to perceive and understand our surroundings, most remote sensing systems capture imagery that extends our ability to see by measuring the electromagnetic energy reflected or emitted from an object. Electromagnetic energy is of interest because different types of objects reflect and emit different intensities and wavelengths of electromagnetic energy, as shown in figure 3.1. Therefore, measurements of electromagnetic energy can be used to identify features on the imagery and to differentiate diverse classes of objects from one another to make a map.
Figure 3.1. Comparison of example percent reflectance of different types of objects across the electromagnetic spectrum (esriurl.com/IG31)
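The contrast between these reflectance curves is what makes classification possible. As a simple, hypothetical illustration, the sketch below computes one widely used index, the normalized difference vegetation index (NDVI), from red and near-infrared reflectance; the values are invented for the example and are not read from figure 3.1.

```python
# Hypothetical percent-reflectance values in the red and near-infrared bands.
# These numbers are illustrative only; they are not taken from figure 3.1.
samples = {
    "healthy vegetation": {"red": 8.0, "nir": 45.0},
    "dry bare soil":      {"red": 30.0, "nir": 38.0},
    "clear water":        {"red": 4.0, "nir": 1.0},
}

for name, bands in samples.items():
    red, nir = bands["red"], bands["nir"]
    # NDVI exploits the strong red/near-infrared contrast of vegetation.
    ndvi = (nir - red) / (nir + red)
    print(f"{name:20s} NDVI = {ndvi:+.2f}")
```

Vegetation yields a strongly positive value, soil a small one, and water a negative one, which is exactly the separability the reflectance curves suggest.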
The type of sensor used to capture energy determines which portions of the electromagnetic spectrum the sensor can measure (the imagery’s spectral resolution) and how finely it can discriminate between different levels of energy (its radiometric resolution). The type of platform employed influences where the sensor can travel, which will affect the temporal resolution of the imagery. The remote sensing system—the combination of the sensor and the platform—impacts the detail perceivable by the system, the imagery’s spatial resolution, the viewing angle of the imagery, and the extent of landscape viewable in each image.
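These resolution terms reduce to simple arithmetic. The sketch below uses made-up sensor parameters, not the specifications of any real system, to show how bit depth sets the number of distinguishable energy levels and how pixel size and image width set the ground extent of a single image.

```python
# Hypothetical sensor parameters, chosen only to illustrate the arithmetic.
radiometric_bits = 11        # radiometric resolution: bits recorded per band
ground_sample_m = 2.0        # spatial resolution: ground sample distance in metres
swath_pixels = 12_000        # pixels across one image line
revisit_days = 5             # temporal resolution: days between repeat collections

gray_levels = 2 ** radiometric_bits                        # distinguishable energy levels
swath_width_km = swath_pixels * ground_sample_m / 1000.0   # across-track extent

print(f"{gray_levels} gray levels per band")
print(f"{swath_width_km:.1f} km swath width")
print(f"repeat coverage every {revisit_days} days")
```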
Sensors
This section provides an understanding of remote sensors by examining their components and explaining how different sensors work. As mentioned in chapter 1, a wide variety of remote sensors have been developed over the last century. Starting with glass-plate cameras and evolving into complex active and passive digital systems, remote sensors have allowed us to “see” the world from a superior viewpoint.
All remote sensors are composed of the following components, as shown in figure 3.2:
Devices that capture electromagnetic energy or sound, whether chemically, electronically, or biologically. The devices may be imaging surfaces (used mostly in electro-optical imaging) or antennas (used in the creation of radar and sonar images).
Lenses that focus the electromagnetic energy onto the imaging surface.
Openings that manage the amount of electromagnetic energy reaching the imaging surface.
Bodies that hold the other components relative to one another.
Figure 3.2. The similar components of the human eye and a remote sensor
Our eyes, cameras, and the most advanced passive and active digital sensors fundamentally all work the same way. Electromagnetic energy passes through the opening of the sensor body where it reaches a lens that focuses the energy onto the imaging surface. Our brains turn the data captured by our retinas into information. Similarly, we convert remotely sensed image data into information through either manual interpretation or semi-automated image classification.
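As a minimal sketch of what semi-automated classification can look like (not a workflow prescribed here), the example below clusters the pixels of a synthetic two-band image into spectral classes with k-means; the image size, band values, and number of classes are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic two-band "image" (100 x 100 pixels); the values are random
# and purely illustrative.
rng = np.random.default_rng(0)
red = rng.uniform(0, 50, size=(100, 100))
nir = rng.uniform(0, 60, size=(100, 100))

# Stack the bands so each pixel becomes a two-element spectral vector.
pixels = np.dstack([red, nir]).reshape(-1, 2)

# Semi-automated classification: group pixels into three spectral classes.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
class_image = labels.reshape(100, 100)

print("pixels per class:", np.bincount(labels))
```

An analyst would then assign meaning (for example, water, vegetation, or bare ground) to each spectral class, which is where interpretation re-enters the process.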
Imaging Surfaces
Imaging surfaces measure the electromagnetic energy captured by digital sensors such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) array. The wavelengths of energy measured are determined by either filters or dispersing elements placed between the sensor opening and the imaging surface. The energy is generated either passively by a source other than the sensor (such as the sun), or actively by the sensor itself.
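As a rough illustration of how a filter determines what the imaging surface records, the sketch below weights a hypothetical incoming spectrum by a made-up band response and sums the result; every number in it is invented for the example.

```python
import numpy as np

# Hypothetical incoming energy sampled at a few wavelengths (nanometres).
wavelengths_nm = np.array([450, 500, 550, 600, 650, 700])
incoming_energy = np.array([0.20, 0.35, 0.50, 0.45, 0.30, 0.15])

# A made-up "green band" filter whose relative response peaks near 550 nm.
filter_response = np.array([0.05, 0.40, 1.00, 0.40, 0.05, 0.00])

# The imaging surface effectively records the energy weighted by the filter.
band_signal = float(np.sum(incoming_energy * filter_response))
print(f"recorded green-band signal: {band_signal:.3f}")
```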
The Electromagnetic Spectrum
Most remote sensing imaging surfaces work by responding to photons of electromagnetic energy. They exploit the phenomenon of photons freeing electrons from atoms. Termed the photoelectric effect, this phenomenon was explained by Albert Einstein, earning him the Nobel Prize in Physics in 1921.
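For a sense of the energies involved, the Planck-Einstein relation gives a photon's energy as E = hc/λ; an electron is freed only when that energy exceeds the detector material's work function. The sketch below evaluates this for green light, with the work function value chosen purely as an illustrative assumption.

```python
# Planck-Einstein relation: photon energy E = h * c / wavelength.
PLANCK_H = 6.626e-34      # Planck constant, joule-seconds
LIGHT_C = 2.998e8         # speed of light, metres per second
EV = 1.602e-19            # one electron volt in joules

wavelength_m = 550e-9     # example: green light at 550 nanometres
work_function_ev = 2.0    # hypothetical work function of a detector material

energy_ev = PLANCK_H * LIGHT_C / wavelength_m / EV
print(f"550 nm photon energy: {energy_ev:.2f} eV")
print(f"frees an electron (energy > work function): {energy_ev > work_function_ev}")
```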
Electromagnetic