Position, Navigation, and Timing Technologies in the 21st Century. Group of Authors


         feature‐based laser navigation using line f...
         Figure 49.10 Two‐dimensional (2D) feature‐based laser navigation using line ...
         Figure 49.11 Two‐dimensional (2D) feature‐based laser navigation using line ...
         Figure 49.12 Extraction of the two‐dimensional (2D) algorithm to three dimen...
         Figure 49.13 Feature‐based laser/inertial integration.
         Figure 49.14 Complementary Kalman filter (CKF) for inertial error estimation...
         Figure 49.15 Basic principle of feature‐based SLAM.
         Figure 49.16 Feature‐based EKF_SLAM algorithm.
         Figure 49.17 Feature‐based SLAM data association (Bailey [18]).
         Figure 49.18 EKF‐SLAM (yellow) versus GPS (blue).
         Figure 49.19 FastSLAM mechanization.
         Figure 49.20 Front‐end and back‐end processing for graph‐based SLAM methods....
         Figure 49.21 Example of a factor graph used for offline processing of data u...
         Figure 49.22 Example of using iterative closest point (ICP) on actual point ...
         Figure 49.23 Complementary Kalman filter (CKF) for inertial error estimation...
         Figure 49.24 Map lookup function (Vadlamani and Uijt de Haag [44]).
         Figure 49.25 Gradient‐based search method to find the lateral error offset (...
         Figure 49.26 Airborne laser‐scanner system (ALS)‐based terrain navigator usi...
         Figure 49.27 Dual ALS (DALS)‐based terrain navigator without a known terrain...
         Figure 49.28 (a) Feedforward and (b) feedback coupled dual airborne laser‐sc...
         Figure 49.29 Simulation results for the dual airborne laser‐scanner system (...
         Figure 49.30 Dual airborne laser‐scanner system/inertial navigation system (...
         Figure 49.31 Basic principle of forming an occupancy grid (gray: misses, bla...
         Figure 49.32 Example of an occupancy grid with a superimposed aerial robot t...
         Figure 49.33 Pose estimation based on matching the laser scan against availa...
         Figure 49.34 FastSLAM method using occupancy grids instead of features.
         Figure 49.35 (a) GridSLAM map and trajectory results, (b) trajectories of al...
         Figure 49.36 Small unmanned aircraft system (sUAS) mapping results for Ohio ...
         Figure 49.37 sUAS mapping results for Ohio University Stocker Center third f...

      18 Chapter 50
         Figure 50.1 Simple imaging system model. The imaging system transforms the s...
         Figure 50.2 Camera frame definition.
         Figure 50.3 Commonly used camera pinhole model.
         Figure 50.4 Mapping from 3D camera coordinates to 2D normalized coordinates,...
         Figure 50.5 Image plane for a nx × ny image, showing the relationship betwee...
         Figure 50.6 Relationship between the camera frame (and virtual image plane),...
         Figure 50.7 Sample feature extraction. In this image, notional features are ...
         Figure 50.8 Harris corner extraction example image.
         Figure 50.9 Harris corner edge response function.
         Figure 50.10 Harris corner metric sample results.
         Figure 50.11 Sample line extraction. In this image, lines are detected using...
         Figure 50.12 Frequency response of the Gaussian blur filter for varying blur...
         Figure 50.13 Impulse response of the difference of the Gaussian filter.
         Figure 50.14 Frequency response of the difference of the Gaussian filter. Th...
         Figure 50.15 Sample image of airfield.
         Figure 50.16 Sample image scale decomposition. As the filter center frequenc...
         Figure 50.17 Sample 12‐Segment FAST Feature Detection Nucleus. The center pi...
         Figure 50.18 Sample feature matching exercise. A feature descriptor from Fra...
         Figure 50.19 Sample unconstrained correspondence. In this case, a correspond...
         Figure 50.20 Stochastic feature prediction. Optical features of interest are...
         Figure 50.21 Epipolar geometry.
         Figure 50.22 Epipolar geometry for a landmark of interest.
         Figure 50.23 Two‐view geometry navigation processing example.
         Figure 50.24 Comparison of PnP error distribution between 6DOF (position and...
         Figure 50.25 Example of monocular imaging scale ambiguity. In this figure, t...
         Figure 50.26 Example of stereoscopic ranging. The depth of landmarks “A” and...
         Figure 50.27 Example of forced perspective imaging. In this photograph, the ...
         Figure 50.28 Example of automated attitude stabilization by tracking paralle...
         Figure 50.29 Overview of image‐aided inertial algorithm. Inertial measuremen...
         Figure 50.30 Comparison of image‐aided inertial navigation solutions for ind...

      19 Chapter 51
         Figure 51.1 Evolution of photogrammetric equipment: (a) early large‐format a...
         Figure 51.2 Georeferencing/navigation concepts.
         Figure 51.3 High‐resolution CCD sensor with main parameters
         Figure 51.4 Linear‐sensor‐based high‐resolution multispectral camera by Leic...
         Figure 51.5 Examples of image degradation.
         Figure 51.6 Geometric model of the pinhole camera.
         Figure 51.7 Coordinate systems in photogrammetry.
         Figure 51.8 Pixel and photo‐coordinate system.
         Figure 51.9 Interior parameters.
         Figure 51.10 Barrel (a) and pincushion (b) distortions.
         Figure 51.11 Image (a) taken with wide‐angle optics and (b) after distortion...
         Figure 51.12 Relationship between the image and object space.
         Figure 51.13 Classical airborne case of stereo photogrammetry.
         Figure 51.14 Epipolar constraints.
         Figure 51.15 Overview of the typical photogrammetric processing workflow.
         Figure 51.16 Tie and ground control points (GCPs) in aerial photogrammetry....
         Figure 51.17 Calibration targets.
         Figure 51.18 Generated tie points from a UAS image.
         Figure 51.19 Simple bundle adjustment example (Triggs et al. [32]).
         Figure 51.20 Result of the bundle adjustment: georeferenced image planes and...
         Figure 51.21 Epipolar resampling.
         Figure 51.22 Left image, right image and disparity.
         Figure 51.23 Orthorectification.
         Figure 51.24 Examples of close‐range and indoor photogrammetric applications...
         Figure 51.25 Parameters of the aerial flight planning.
         Figure 51.26 Rotary‐ and fixed‐wing UAVs: (a) DJI Phantom, (b) Bergen custom...
         Figure 51.27 Spectral bands of three satellite systems (UB – Ultra Blue, B –...

      20 Chapter 52
         Figure 52.1 Plaque attached to the face of the Pioneer 10 (1972) and 11 (197...
         Figure 52.2 The Unconventional Stellar Aspect (USA) instrument located on th...
         Figure 52.3 Example X‐ray photon intensity profile of Crab Nebula pulsar, PS...
         Figure 52.4 Example X‐ray photon intensity profile of pulsar PSR B1509‐58 wi...
         Figure 52.5 Neutron star with separate rotation and magnetic axes (Sheikh [2...
         Figure 52.6 Crab Nebula and Pulsar (PSR B0531+21) in the X‐ray band as obser...
         Figure 52.7 Several types of X‐ray celestial sources plotted along the Galac...
         Figure 52.8 Comparisons of two GRB measurements for GRB080727B using two sep...
         Figure 52.9 Pulse arrivals from distant individual pulsars as they arrive at...
         Figure 52.10 Range vectors from a single pulsar to Earth and spacecraft loca...
         Figure 52.11 Doppler frequency tracking of Crab Pulsar in RXTE orbit (Golsha...
         Figure 52.12 Position of spacecraft as pulses enter the solar system from a ...
         Figure 52.13 Relative navigation between two spacecraft observing the same v...
         Figure 52.14 Observation of GRB by cooperating base station and remote space...
         Figure 52.15 ROSAT Bright catalog source plots in right ascension and declin...

      21 Chapter 53
         Figure 53.1 Illustration of two basic spatial reference frames (Proulx et al...
         Figure 53.2 Illustration of path integration, or “dead reckoning” (Chiswick ...
         Figure 53.3 Overview of single unit recording from place cells in the rat hi...
         Figure 53.4 Firing patterns for four different spatial cells. For the head d...
         Figure 53.5 Basic Morris Water Maze experimental setup (Samueljohn.de (own w...
         Figure 53.6 Mid‐line sagittal view of the human cerebral cortex illustrating...
         Figure 53.7 Overview of three different proposed network models of allocentr...
         Figure 53.8 Place field in the 3D environment of a flying bat

      22 Chapter 54
         Figure 54.1 Examples illustrating the fascinating diversity and impressive s...
         Figure 54.2 A Schematic of the path of the sun through the sky. In the Northe...
         Figure 54.3 A Birds use the rotational center of the stars to infer a polewar...
         Figure 54.4 Schematic of Earth’s geomagnetic field, which in essence resembl...
         Figure 54.5 Birds use a magnetic inclination compass. Depicted is the scenar...

      23 Chapter 55
         Figure 55.1 Caterpillar excavator equipped by Trimble’s GNSS‐based guidance ...
         Figure 55.2 Mansoura Bridge, Egypt: (a) view, (b) GNSS monitoring system’s b...
         Figure 55.3 Measured and smoothed relative time series of the movements of M...
         Figure 55.4 Tide gauge station at Waikelo (Sumba, Indonesia) with a GNSS ant...
         Figure 55.5 Relationships between the sensors, MMS, and mapping coordinate f...
         Figure 55.6 Commercial UAV mapping systems: (a) Leica Aibot X6 and (b) Trimb...
         Figure 55.7 Components of a Pegasus backpack MMS.
         Figure 55.8 V10 imaging rover: (a) rover components, (b) application in an i...

      24 Chapter 56
         Figure 56.1 Straight rows of soybeans planted with an RTK‐GPS auto‐steered t...
         Figure 56.2 Manual soil sampling is labor intensive; however, the capital co...
         Figure 56.3 Machine‐aided soil sampling system. This is part of the Soil Inf...
         Figure 56.4 A soil sampling for a field will result in many maps, one map fo...
         Figure 56.5 Prescription nitrogen map. This map is converted to a rate contr...
         Figure 56.6 VR phosphorous map. Each nutrient as determined by the informati...
         Figure 56.7 Seeds are planted at their optimum plant density to maximize pro...
         Figure 56.8 VR planter with electric motor drives. This type of planter can ...
         Figure
