CP-06 - Graphics Processing Units (GPUs)

Graphics Processing Units (GPUs) are a state-of-the-art acceleration technology for general-purpose computation. GPUs are based on a many-core architecture that can deliver computing performance much higher than that of desktop computers based on Central Processing Units (CPUs). A typical GPU device may have hundreds or thousands of processing cores that work together for massively parallel computing. The basic hardware architecture and the software standards that support the use of GPUs for general-purpose computation are illustrated here with a focus on Nvidia GPUs and their software framework, CUDA. Many-core GPUs can be leveraged to accelerate spatial problem solving.

DC-19 - Ground Verification and Accuracy Assessment

Spatial products such as maps of land cover, soil type, wildfire, glaciers, and surface water have become increasingly available and are used in science and policy decisions. These maps are not without error, and it is critical that a description of quality accompany each product. In the case of a thematic map, one aspect of quality is obtained by conducting a spatially explicit accuracy assessment in which the map class and reference class are compared on a per-spatial-unit basis (e.g., per 30 m × 30 m pixel). The outcome of an accuracy assessment is a description of the quality of the end-product map, in contrast to an evaluation of map quality conducted as part of the map production process. The accuracy results can be used to decide whether the map is of adequate quality for an intended application, as input to uncertainty analyses, and as information to improve future map products.
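The per-unit comparison described above is typically summarized in a confusion (error) matrix, from which overall, user's, and producer's accuracies are derived. A minimal sketch in Python (the function name and label values are hypothetical):

```python
def accuracy_assessment(map_labels, ref_labels):
    """Per-unit comparison of map vs. reference class labels."""
    classes = sorted(set(map_labels) | set(ref_labels))
    # Confusion matrix: rows = map class, columns = reference class.
    cm = {m: {r: 0 for r in classes} for m in classes}
    for m, r in zip(map_labels, ref_labels):
        cm[m][r] += 1
    n = len(map_labels)
    # Overall accuracy: fraction of units where map and reference agree.
    overall = sum(cm[c][c] for c in classes) / n
    # User's accuracy: of units mapped as c, the fraction that are truly c.
    users = {c: cm[c][c] / sum(cm[c].values())
             for c in classes if sum(cm[c].values())}
    # Producer's accuracy: of reference units of class c, the fraction mapped as c.
    producers = {c: cm[c][c] / sum(cm[m][c] for m in classes)
                 for c in classes if sum(cm[m][c] for m in classes)}
    return cm, overall, users, producers
```

In a real assessment the spatial units would be selected under a probability sampling design, and the accuracy estimates weighted accordingly.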

DC-36 - Historical Maps in GIS

The use of historical maps in coordination with GIS aids scholars who are undertaking a geographical study that requires a historical approach, or who are interested in the geographical relationships between different historical representations of the landscape in cartographic documents. Historical maps allow the comparison of the spatial relationships of past phenomena and their evolution over time, and permit both qualitative and quantitative diachronic analysis. In this chapter, an explanation of the use of historical maps in GIS for the study of landscape and environment is offered. After a short theoretical introduction on the meaning of the term “historical map,” the reader will find the key steps in using historical maps in a GIS, a brief overview of the challenges in interpreting historical maps, and some example applications.

PD-32 - JavaScript for GIS

JavaScript (which has no connection to the Java programming language) is a popular high-level programming language used to develop user interfaces in web pages. The principal goal of using JavaScript for programming web and mobile GIS applications is to build front-end applications that make use of spatial data and GIS principles and, in many cases, have embedded, interactive maps. It is considered much easier to program than Java or C for adding automation, animation, and interactivity to web pages and applications. JavaScript uses the leading browsers as runtime environments (RTEs) and thus benefits from rapid and continuously evolving browser support for all web and mobile applications.

AM-08 - Kernels and Density Estimation

Kernel density estimation is an important nonparametric technique for estimating density from point-based or line-based data. It has been widely used for various purposes, such as point or line data smoothing, risk mapping, and hot spot detection. It applies a kernel function to each observation (point or line) and spreads the observation over the kernel window. The kernel density estimate at a location is the sum of the fractions of all observations at that location. In a GIS environment, kernel density estimation usually results in a density surface where each cell is rendered based on the kernel density estimated at the cell center. The result of kernel density estimation can vary substantially depending on the choice of kernel function or kernel bandwidth, with the latter having a greater impact. When a fixed kernel bandwidth is applied to all observations, undersmoothing of density may occur in areas with only sparse observations while oversmoothing may be found in other areas. To solve this issue, adaptive or variable bandwidth approaches have been suggested.
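The summation of kernel contributions described above can be sketched as follows for point data, here with a Gaussian kernel and a fixed bandwidth (the function name and parameters are illustrative):

```python
import math

def kde_at(x, y, points, bandwidth):
    """Gaussian kernel density estimate at (x, y) from 2-D point observations.

    Each observation spreads its unit weight over a kernel window of the
    given bandwidth; the density at a location is the sum of the kernel
    contributions of all observations at that location.
    """
    h2 = bandwidth ** 2
    # Normalizing constant of the 2-D Gaussian kernel, averaged over n points.
    norm = 1.0 / (2.0 * math.pi * h2 * len(points))
    total = 0.0
    for px, py in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        total += math.exp(-d2 / (2.0 * h2))
    return norm * total
```

In a raster GIS setting this function would be evaluated once per cell center to build the density surface; an adaptive-bandwidth variant would replace the single `bandwidth` with a per-observation value.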

AM-29 - Kriging Interpolation

Kriging is an interpolation method that makes predictions at unsampled locations using a linear combination of observations at nearby sampled locations. The influence of each observation on the kriging prediction is based on several factors: 1) its geographical proximity to the unsampled location, 2) the spatial arrangement of all observations (i.e., data configuration, such as clustering of observations in oversampled areas), and 3) the pattern of spatial correlation of the data. The development of kriging models is meaningful only when data are spatially correlated. Kriging has several advantages over traditional interpolation techniques, such as inverse distance weighting (IDW) or nearest neighbor: 1) it provides a measure of uncertainty attached to the results (i.e., kriging variance); 2) it accounts for direction-dependent relationships (i.e., spatial anisotropy); 3) weights are assigned to observations based on the spatial correlation of the data instead of assumptions made by the analyst, as in IDW; 4) kriging predictions are not constrained to the range of observations used for interpolation; and 5) data measured over different spatial supports can be combined and change of support, such as downscaling or upscaling, can be conducted.
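As a sketch of these ideas, the ordinary kriging weights can be obtained by solving a small linear system built from a covariance model of the data; the exponential model and the sill and range values used here are hypothetical choices, and a real analysis would fit them to an empirical variogram:

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng=10.0):
    """Ordinary kriging prediction and kriging variance at `target`.

    Weights reflect both proximity to the target and the configuration of
    the data; a Lagrange multiplier enforces that the weights sum to 1.
    """
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(coords)
    cov = lambda h: sill * np.exp(-h / rng)     # exponential covariance model
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = cov(d)                          # data-to-data covariances
    A[n, :n] = A[:n, n] = 1.0                   # unbiasedness constraint
    A[n, n] = 0.0
    b = np.empty(n + 1)
    b[:n] = cov(np.linalg.norm(coords - np.asarray(target, float), axis=1))
    b[n] = 1.0
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    pred = w @ values                           # weighted sum of observations
    var = sill - w @ b[:n] - mu                 # kriging variance
    return pred, var
```

Note that at a sampled location the prediction reproduces the observed value exactly and the kriging variance is zero, one of the properties that distinguishes kriging from IDW.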

AM-54 - Landscape Metrics

Landscape metrics are algorithms that quantify the spatial structure of patterns (primarily composition and configuration) within a geographic area. The term "landscape metrics" has historically referred to indices for categorical land cover maps, but with emerging datasets, tools, and software programs, the field is growing to include other types of landscape pattern analyses such as graph-based metrics, surface metrics, and three-dimensional metrics. The choice of which metrics to use requires careful consideration by the analyst, taking into account the data and application. Selecting the best metric for the problem at hand is not a trivial task given the large number of metrics that have been developed and the software programs that implement them.
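As a minimal illustration of composition metrics, class proportions and Shannon diversity can be computed directly from a categorical grid of class codes (the grid values and function name are hypothetical); configuration metrics would additionally require connectivity analysis to delineate patches:

```python
import math
from collections import Counter

def composition_metrics(grid):
    """Composition metrics for a categorical land-cover grid.

    Returns the proportional abundance of each class and the Shannon
    diversity index, two of the simplest landscape-level metrics.
    """
    counts = Counter(cell for row in grid for cell in row)
    n = sum(counts.values())
    prop = {c: k / n for c, k in counts.items()}
    # Shannon diversity: higher when more classes are present in more
    # even proportions; zero for a single-class landscape.
    shannon = -sum(p * math.log(p) for p in prop.values())
    return prop, shannon
```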

CV-28 - Lesson Design in Cartography Education

This entry describes six general variables of lesson design in cartography education and offers some practical advice for the development of materials for teaching cartography. First, a lesson’s scope concerns the set of ideas included in a lesson and helps identify different types of lessons based on the kinds of knowledge that they contain. Second, learning objectives concern the things that students should be able to do following a lesson and relate to different cognitive processes of learning. Third, a lesson’s scheme deals with the organizational framework for delivering content. Fourth, a lesson’s guidance concerns the amount and quality of supportive information provided. Fifth, a lesson’s sequence may involve one or more strategies for ordering content. Sixth, a lesson’s activity concerns what students do during a lesson and is often associated with different learning outcomes. These six variables help differentiate traditions for teaching cartography, elucidate some of the recurring challenges in cartography education, and offer strategies for designing lessons to foster meaningful learning outcomes.

DC-27 - Light Detection and Ranging (LiDAR)

LiDAR (Light Detection and Ranging) is a remote sensing technology that collects information from laser pulses reflected off the Earth’s surface. The instrumentation that collects LiDAR data can be mounted on drones, airplanes, helicopters, or satellites, and consists of a laser scanner that transmits pulses of light. These transmitted pulses reflect off objects on the Earth’s surface or off the surface itself, and the time delay of each return is recorded. Knowing the travel time and the speed of light, the distance from the sensor to each reflecting surface can be determined, and from it an elevation. From the pulse data collected, the user can determine the topography and landscape features of the Earth or whatever surface has received the pulses. The evolution of software that displays and analyzes LiDAR data and the development of new, more compact file formats have allowed the use of LiDAR to grow dramatically in recent years.
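The range calculation described above is simple: the recorded time delay covers the round trip, so the one-way distance is half the travel time multiplied by the speed of light. A sketch for a nadir-pointing pulse, ignoring scan angle and atmospheric effects (the function names are illustrative):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def pulse_range(two_way_time_s):
    """One-way distance from sensor to target: the pulse travels out
    and back, so range = c * t / 2."""
    return C * two_way_time_s / 2.0

def surface_elevation(sensor_altitude_m, two_way_time_s):
    """Elevation of the reflecting surface below a nadir-pointing sensor
    (simplified; real processing accounts for scan geometry and GNSS/IMU
    data on the sensor's position and orientation)."""
    return sensor_altitude_m - pulse_range(two_way_time_s)
```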

PD-01 - Linear Programming and GIS

Linear programming is a set of methods for finding optimal solutions to mathematical models composed of a set of linear functions. Many spatial location problems can be structured as linear programs. However, even modest-sized problem instances can be very difficult to solve due to the combinatorial complexity of the problems and the associated computational expense they incur. Geographic Information Systems (GIS) software does not typically incorporate formal linear programming functionality; instead it commonly uses heuristic solution procedures to generate near-optimal solutions quickly. There is growing interest in integrating the spatial analytic tools of GIS with the solution power of linear programming software to generate guaranteed optimal solutions to spatial location problems.
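As an illustration of casting a spatial location problem as a linear program, a small transportation problem (with hypothetical costs, supplies, and demands) can be solved to guaranteed optimality with an off-the-shelf LP solver, here assuming SciPy is available:

```python
from scipy.optimize import linprog

# Ship goods from two facilities to two demand points at minimum total
# distance-weighted cost. Decision variables x = [x11, x12, x21, x22],
# where xij is the amount shipped from facility i to demand point j.
cost = [4, 6, 5, 3]            # hypothetical unit shipping costs
A_eq = [
    [1, 1, 0, 0],              # all supply at facility 1 is shipped
    [0, 0, 1, 1],              # all supply at facility 2 is shipped
    [1, 0, 1, 0],              # demand point 1 is exactly satisfied
    [0, 1, 0, 1],              # demand point 2 is exactly satisfied
]
b_eq = [30, 40, 25, 45]        # supplies (30, 40) and demands (25, 45)

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
# res.x holds the optimal shipments, res.fun the minimum total cost.
```

This is the guaranteed-optimal counterpart to the heuristic procedures mentioned above; note, though, that many siting problems (e.g., p-median) require integer variables and a mixed-integer solver rather than a pure LP.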