CV-01 - Cartography and Science

"Science" is used both to describe a general, systematic approach to understanding the world and to refer to that approach as it is applied to a specific phenomenon of interest, for example, "geographic information science." The scientific method is used to develop theories that explain phenomena and processes. It consists of an iterative cycle of several steps: proposing a hypothesis, devising a way to make empirical observations that test that hypothesis, and finally, refining the hypothesis based on the empirical observations. "Scientific cartography" became a dominant mode of cartographic research and inquiry after World War II, when there was increased focus on the efficacy of particular design decisions and how particular maps were understood by end users. This entry begins with a brief history of the development of scientific cartographic approaches, including how they are deployed in map design research today. Next it discusses how maps have been used by scientists to support scientific thinking. Finally, it concludes with a discussion of how maps are used to communicate the results of scientific thinking.

DM-70 - Problems of Large Spatial Databases

Large spatial databases, often labeled geospatial big data, exceed the capacity of commonly used computing systems as a result of data volume, variety, velocity, and veracity. Additional problems, also labeled with V’s, are cited, but these four primary ones are the most problematic and are the focus of this chapter (Li et al., 2016; Panimalar et al., 2017). Sources include satellites, aircraft and drone platforms, vehicles, geosocial networking services, mobile devices, and cameras. The problems in processing these data to extract useful information include query, analysis, and visualization. Data mining techniques and machine learning algorithms, such as deep convolutional neural networks, are often used with geospatial big data. The most obvious problem is handling the large data volume, particularly for input and output operations, which requires parallel reading and writing of the data as well as high-speed computers, disk services, and network transfer speeds. Additional problems of large spatial databases include the variety and heterogeneity of the data, which require advanced algorithms to handle different data types and characteristics and to integrate them with other data. The velocity at which the data are acquired is also a challenge, especially with today’s advanced sensors and the Internet of Things, which comprises millions of devices creating data on short temporal scales of microseconds to minutes. Finally, the veracity, or truthfulness, of large spatial databases is difficult to establish and validate, particularly for all data elements in the database.
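
To make the volume problem concrete, the sketch below (not part of the entry itself) partitions a point dataset into chunks and aggregates it to a one-degree grid in parallel; the synthetic data, chunk size, and grid resolution are hypothetical stand-ins for a real large spatial database.

```python
# Minimal sketch: parallel aggregation of a large point dataset into a
# one-degree grid, illustrating chunked processing of data volume.
# The synthetic points, chunk size, and grid resolution are illustrative only.
import math
import random
from collections import Counter
from multiprocessing import Pool


def bin_points(points):
    """Count points per one-degree longitude/latitude cell for one chunk."""
    counts = Counter()
    for lon, lat in points:
        counts[(math.floor(lon), math.floor(lat))] += 1
    return counts


def chunks(seq, size):
    """Split a sequence into fixed-size chunks for parallel workers."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]


if __name__ == "__main__":
    # Stand-in for a large spatial database: one million random points.
    random.seed(0)
    points = [(random.uniform(-180, 180), random.uniform(-90, 90))
              for _ in range(1_000_000)]

    # Each worker bins one chunk; the partial grids are merged afterwards.
    with Pool() as pool:
        partial_counts = pool.map(bin_points, chunks(points, 100_000))

    grid = Counter()
    for partial in partial_counts:
        grid.update(partial)

    print(f"{len(grid)} occupied cells, busiest cell: {grid.most_common(1)}")
```

On a real geospatial big data workload, the same map-and-merge pattern would typically run on a distributed framework rather than a single machine’s process pool, but the decomposition of the volume problem is the same.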
