DA-23 - GIS&T and Marine Science

Image courtesy of the National Academy of Sciences Ocean Studies Board

GIS&T has traditionally provided effective technological solutions to the integration, visualization, and analysis of heterogeneous, georeferenced data on land. In recent years, our ability to measure change in the ocean is increasing, not only because of improved measuring devices and scientific techniques, but also because new GIS&T is aiding us in better understanding this dynamic environment. The domain has progressed from applications that merely collect and display data to complex simulation, modeling, and the development of new research methods and concepts.

AM-81 - GIS-Based Computational Modeling

This entry explores GIS-based computational models. While models vary immensely across disciplines and specialties, the focus is on models that simulate and forecast geographical systems and processes in time and space. The degree and means of integration of the many different models with GIS are covered, and the critical phases of modeling (design, implementation, calibration, sensitivity analysis, validation, and error analysis) are introduced. The use of models in simulations, an important purpose for implementing models within or outside of GIS, is discussed, and the context of scenario-based planning is explained. To conclude, a survey of model types is presented, with their application methods and some examples, and the goals of modeling are discussed.
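The kind of spatial process simulation described here can be illustrated with a minimal raster diffusion model. This is a hypothetical sketch, not an example from any particular GIS package; the function and parameter names are invented for illustration:

```python
import numpy as np

def simulate_spread(grid, steps, rate):
    """Toy grid-based simulation: at each time step, a fraction `rate`
    of each cell's quantity flows out and is split equally among its
    four neighbors (toroidal edges via np.roll, for simplicity)."""
    g = grid.astype(float).copy()
    for _ in range(steps):
        flow = rate * g
        g = g - flow + 0.25 * (
            np.roll(flow, 1, axis=0) + np.roll(flow, -1, axis=0)
            + np.roll(flow, 1, axis=1) + np.roll(flow, -1, axis=1))
    return g

# Scenario: a single source cell; rerunning with different rates or
# step counts supports simple scenario comparison
start = np.zeros((9, 9))
start[4, 4] = 100.0
result = simulate_spread(start, steps=10, rate=0.5)
```

Calibration and sensitivity analysis in this toy setting would amount to varying `rate` and comparing the resulting surfaces against observations.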

DA-11 - GIS&T and the Digital Humanities

This entry reviews the use of GIS&T in the digital humanities and in the spatial humanities, highlighting opportunities for interdisciplinary collaborations between GIScientists and humanities scholars, including in history, archeology, and literary studies. Challenges are highlighted as well, including epistemological and ontological differences between the spatial, abstract, and quantitative view of the world of GIS&T and GIScience and the humanities emphasis on place and qualitative methods. The potential of mixed methods to bring together different epistemological perspectives is discussed in this context. Scale is identified as a promising geographical framework for humanities research, both in its metaphorical aspects and as intended in cartography. Examples of the use of GIS&T and GIScience in the humanities are provided, including historical GIS, geohistorical gazetteers, archeology and GIS, and GIS in literary studies. The entry is framed historically, with reference to the work of Bakhtin, Braudel, and Hägerstrand, who are early influencers of the spatial turn in the humanities. Among the research directions briefly explored are the GIS of place, deep maps, and qualitative GIS, which exemplify how the collaboration between GIScience and the humanities can be strengthened.

AM-08 - Kernels and Density Estimation

Kernel density estimation is an important nonparametric technique to estimate density from point-based or line-based data. It has been widely used for various purposes, such as point or line data smoothing, risk mapping, and hot spot detection. It applies a kernel function on each observation (point or line) and spreads the observation over the kernel window. The kernel density estimate at a location will be the sum of the fractions of all observations at that location. In a GIS environment, kernel density estimation usually results in a density surface where each cell is rendered based on the kernel density estimated at the cell center. The result of kernel density estimation can vary substantially depending on the choice of kernel function or kernel bandwidth, with the latter having a greater impact. When applying a fixed kernel bandwidth over all of the observations, undersmoothing of density may occur in areas with only sparse observations while oversmoothing may be found in other areas. To solve this issue, adaptive or variable bandwidth approaches have been suggested.
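As a concrete illustration, a fixed-bandwidth Gaussian kernel density surface can be computed directly over a grid of cell centers. This is a minimal sketch with NumPy; the function and parameter names are hypothetical:

```python
import numpy as np

def kernel_density(points, grid_x, grid_y, bandwidth):
    """Estimate a density surface from 2-D observation points.

    Uses a Gaussian kernel with a fixed bandwidth; the density at each
    cell center is the sum of the kernel fractions contributed by
    every observation."""
    xx, yy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(xx, dtype=float)
    for px, py in points:
        d2 = (xx - px) ** 2 + (yy - py) ** 2
        # Each observation is spread over the kernel window
        density += np.exp(-d2 / (2 * bandwidth ** 2))
    # Normalize so each observation's kernel integrates to 1
    return density / (2 * np.pi * bandwidth ** 2)

# Two clustered points and one isolated point: the surface peaks
# at the cluster, illustrating hot spot detection
pts = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0)]
gx = np.linspace(-1, 6, 71)
gy = np.linspace(-1, 6, 71)
surface = kernel_density(pts, gx, gy, bandwidth=0.5)
```

Shrinking `bandwidth` sharpens the peaks (risking undersmoothing in sparse areas); enlarging it flattens them (oversmoothing), which is the trade-off that adaptive bandwidths address.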

DC-10 - Aerial Photography: History and Georeferencing

In 1903, Julius Neubranner, a photography enthusiast, designed and patented a breast-mounted aerial camera for carrier pigeons. Weighing only 70 grams, the camera took automatic exposures at 30-second intervals along the flight line flown by the bird. Although faster than balloons, the pigeons were not always reliable in following their flight paths. Today the pigeon corps has been replaced by unmanned aerial vehicles, but aerial photography continues to be an important source of data for use in a wide range of geospatial applications. Processing of the imagery to remove various types of distortion is a necessary step before the images can be georeferenced and used for mapping purposes.
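Once distortion has been removed, georeferencing a rectified image commonly reduces to a six-parameter affine transform from pixel coordinates to map coordinates (the style used by world files). The sketch below is illustrative; the parameter ordering and values are assumptions, not a specific file format:

```python
def pixel_to_world(col, row, transform):
    """Map an image pixel (col, row) to map coordinates (x, y) with a
    six-parameter affine transform (world-file style)."""
    a, b, c, d, e, f = transform
    # a, e: pixel size in x and y (e is negative because image rows
    # increase downward while map y increases upward)
    # b, d: rotation/shear terms; c, f: map coordinates of the origin
    x = a * col + b * row + c
    y = d * col + e * row + f
    return x, y

# Hypothetical transform: 0.5 m pixels, no rotation,
# image origin at easting 500000, northing 4100000
t = (0.5, 0.0, 500000.0, 0.0, -0.5, 4100000.0)
x, y = pixel_to_world(100, 200, t)
```

For raw (unrectified) aerial frames, a simple affine is not sufficient; relief displacement and camera tilt require orthorectification first.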

AM-46 - Location-Allocation Modeling

Location-allocation models involve two principal elements: 1) multiple facility location; and 2) the allocation of the services or products provided by those facilities to places of demand. Such models are used in the design of logistic systems like supply chains, especially warehouse and factory location, as well as in the location of public services. Public service location models involve objectives that often maximize access and levels of service, while private sector applications usually attempt to minimize cost. Such models are often hard to solve and involve the use of integer-linear programming software or sophisticated heuristics. Some models can be solved with functionality provided in GIS packages and other models are applied, loosely coupled, with GIS. We provide a short description of formulating two different models as well as discuss how they are solved.
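One classic formulation, the p-median problem (locate p facilities so that total demand-weighted distance is minimized, with each demand point allocated to its nearest open facility), can be illustrated by exhaustive search on a tiny instance. This is a sketch only; as noted above, realistic instances need integer-linear programming or heuristics:

```python
from itertools import combinations

def p_median(demand, candidates, p):
    """Brute-force p-median: choose p facility sites from `candidates`
    minimizing total demand-weighted distance; each demand point is
    allocated to its nearest open facility."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best_sites, best_cost = None, float("inf")
    for sites in combinations(candidates, p):
        # Allocation step: each demand point uses its nearest site
        cost = sum(w * min(dist(d, s) for s in sites)
                   for d, w in demand)
        if cost < best_cost:
            best_sites, best_cost = sites, cost
    return best_sites, best_cost

# Demand points with weights, and candidate facility sites (made up)
demand = [((0, 0), 10), ((1, 0), 5), ((10, 10), 8)]
candidates = [(0, 0), (5, 5), (10, 10)]
sites, cost = p_median(demand, candidates, p=2)
```

Swapping the objective (e.g., maximizing demand covered within a distance standard instead of minimizing weighted distance) yields the public-sector coverage models mentioned above.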

FC-04 - Perception and Cognitive Processing of Geographic Phenomena: a Choropleth Map Case Study

The near ubiquity of maps has created a population that is adept at reading and understanding maps. But while maps are familiar, how the human brain processes that information is less well understood. Discussing the processing of geographic phenomena could take different avenues: specific geospatial thinking skills, general perception and cognition processes, or even the different parts of the human brain that are invoked when thinking geographically. This entry focuses on tracing the processing of geographic phenomena using a choropleth map case study, beginning from perception, the moment the phenomena enter the human brain via our senses, through to cognition, how meaning and understanding are generated.

FC-24 - Conceptual Models of Error and Uncertainty

Uncertainty and error are integral parts of science and technology, including GIS&T, as they are of most human endeavors. They are important characteristics of knowledge, which is very seldom perfect. Error and uncertainty both affect our understanding of the present and the past, and our expectations from the future. ‘Uncertainty’ is sometimes used as the umbrella term for a number of related concepts, of which ‘error’ is the most important in GIS and in most other data-intensive fields. Very often, uncertainty is the result of error (or suspected error).  As concepts, both uncertainty and error are complex, each having several different versions, interpretations, and kinds of impacts on the quality of GIS products, and on the uses and decisions that users may make on their basis. This section provides an overview of the kinds of uncertainty and common sources of error in GIS&T, the role of a number of additional related concepts in refining our understanding of different forms of imperfect knowledge, the problems of uncertainty and error in the context of decision-making, especially regarding actions with important future consequences, and some standard as well as more exploratory approaches to handling uncertainties about the future. While uncertainty and error are in general undesirable, they may also point to unsuspected aspects of an issue and thus help generate new insights.
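One standard approach to handling such uncertainty, Monte Carlo simulation, can be sketched for propagating positional error into a derived quantity. The example below is hypothetical: a parcel's area is recomputed many times under Gaussian vertex error, and the spread of the results expresses the uncertainty in the derived area:

```python
import random

def monte_carlo_area(corners, sigma, n=5000, seed=42):
    """Monte Carlo error propagation sketch: perturb polygon vertices
    with Gaussian positional error and report the mean and standard
    deviation of the computed (shoelace) area."""
    rng = random.Random(seed)

    def area(pts):
        # Shoelace formula over the closed ring
        s = 0.0
        for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
            s += x1 * y2 - x2 * y1
        return abs(s) / 2.0

    samples = []
    for _ in range(n):
        noisy = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                 for x, y in corners]
        samples.append(area(noisy))
    mean = sum(samples) / n
    sd = (sum((a - mean) ** 2 for a in samples) / n) ** 0.5
    return mean, sd

# A 10 x 10 parcel with 0.1-unit positional error at each vertex
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
mean_area, sd_area = monte_carlo_area(square, sigma=0.1)
```

The resulting standard deviation is a quantitative statement of how positional error propagates into the derived area, the kind of input that decision-makers can weigh directly.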

CP-16 - On the Origins of Computing and GIS&T: Part I, A Computer Systems Perspective

This paper describes the evolutionary path of hardware systems and the hardware-software interfaces that were used for GIS&T development during its "childhood", the era from approximately the late 1960s to the mid-1980s. The article is structured using a conceptualization that developments occurred during this period in three overlapping epochs that have distinctive modes of interactivity and user control: mainframes, minicomputers, and workstations. The earliest GIS&T applications were developed using expensive mainframe computer systems, usually manufactured by IBM. These mainframes typically had memory measured in kilobytes and operated in batch mode with jobs submitted using punched cards as input. Many such systems used an obscure job control language with a rigid syntax. FORTRAN was the predominant language used for GIS&T software development. Technological developments, and associated cost reductions, led to the diffusion of minicomputers and a shift away from IBM. Further developments led to the widespread adoption of single-user workstations that initially used commodity processors and later switched to reduced instruction set computing (RISC) chips. Many minicomputers and workstations ran some variant of the UNIX operating system, which substantially improved user interactivity.

DM-01 - Spatial Database Management Systems

A spatial database management system (SDBMS) is an extension, some might say a specialization, of a conventional database management system (DBMS). Every DBMS (and hence every SDBMS) uses a data model specification as a formalism for software design and for establishing rigor in data management. A data model comprises three components: 1) constructs, built from data types, that form the data structures which describe data; 2) operations that process those data structures and thereby manipulate data; and 3) rules that establish the veracity of the structures and/or operations, validating the data. Basic data types such as integers and real numbers are extended into spatial data types such as points, polylines, and polygons in spatial data structures. Operations are the capabilities that manipulate the data structures; when sequenced into operational workflows in specific ways, they generate information from data, in the sense that the newly derived relationships constitute that information. Different data model designs result in different combinations of structures, operations, and rules, which combine into various SDBMS products. The products differ based upon the underlying data model, and these data models enable and constrain the ability to store and manipulate data. Different SDBMS implementations support configurations for different user environments, including single-user and multi-user environments.
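The extension of basic types into spatial types, together with operations and validity rules, can be sketched in a few lines. This is illustrative Python, not any particular SDBMS's implementation; the type and method names are invented:

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Point:
    """Spatial data type built from two basic reals."""
    x: float
    y: float

@dataclass
class Polygon:
    """Spatial data type: a closed ring of points."""
    ring: List[Point]  # last vertex repeats the first

    def is_valid(self) -> bool:
        # Rule: a ring needs at least 4 vertices and must close
        return len(self.ring) >= 4 and self.ring[0] == self.ring[-1]

    def contains(self, p: Point) -> bool:
        # Operation: ray-casting point-in-polygon test
        inside = False
        for a, b in zip(self.ring, self.ring[1:]):
            if (a.y > p.y) != (b.y > p.y):
                x_cross = a.x + (p.y - a.y) * (b.x - a.x) / (b.y - a.y)
                if p.x < x_cross:
                    inside = not inside
        return inside

square = Polygon([Point(0, 0), Point(4, 0), Point(4, 4),
                  Point(0, 4), Point(0, 0)])
```

Chaining such operations (e.g., filtering points by `contains`, then summarizing) is what the entry describes as an operational workflow that generates information from data.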
