All Topics

CP-16 - On the Origins of Computing and GIS&T: Part I, A Computer Systems Perspective

This paper describes the evolutionary path of hardware systems and the hardware-software interfaces that were used for GIS&T development during its “childhood”, the era from approximately the late 1960s to the mid-1980s. The article is structured using a conceptualization that developments occurred during this period in three overlapping epochs that have distinctive modes of interactivity and user control: mainframes, minicomputers, and workstations. The earliest GIS&T applications were developed using expensive mainframe computer systems, usually manufactured by IBM. These mainframes typically had memory measured in kilobytes and operated in batch mode with jobs submitted using punched cards as input. Many such systems used an obscure job control language with a rigid syntax. FORTRAN was the predominant language used for GIS&T software development. Technological developments, and associated cost reductions, led to the diffusion of minicomputers and a shift away from IBM. Further developments led to the widespread adoption of single-user workstations that initially used commodity processors and later switched to reduced instruction set chips. Many minicomputers and workstations ran some variant of the UNIX operating system, which substantially improved user interactivity.

CP-32 - On the Origins of Computing and GIS&T: Part 2, A Perspective on the Role of Peripheral Devices

GIS implementations in the late 1960s to mid-1980s required the use of exotic peripheral devices to encode and display geospatial information. Data encoding was normally performed in one of two modes: automated raster scanning and manual (vector) coordinate recording. Raster scanning systems in this era were extremely expensive, operated in batch mode, and were located at a limited number of centralized facilities, such as federal mapping agencies. Coordinate digitizers were more widely distributed and were often configured with dedicated minicomputers to handle editing and formatting tasks. Data display devices produced hardcopy and softcopy output. Two commonly encountered hardcopy devices were line printers and pen plotters. Softcopy display consisted of cathode ray tube devices that operated using frame buffer and storage tube technologies. Each device was driven by specialized software provided by device manufacturers, leading to widespread hardware-software incompatibility. This problem led to the emergence of device independence as a design principle to promote increased levels of interoperability among disparate input and output devices.

DM-80 - Ontology for Geospatial Semantic Interoperability

Geospatial data heterogeneity makes it difficult to share and reuse geospatial data and to retrieve geospatial information. Lack of semantic interoperability is one of the major problems facing GIS (Geographic Information Science/Systems) applications today. To solve geospatial data heterogeneity problems and support geospatial information retrieval and semantic interoperability over the Web, the use of an ontology is proposed because it is a formal, explicit description of concepts or meanings of words in a well-defined and unambiguous manner. Geospatial ontologies represent geospatial concepts and properties for use over the Web. OWL (Web Ontology Language) is an emerging language for defining and instantiating ontologies. OWL builds on RDF (Resource Description Framework) but adds more vocabulary for describing properties and classes. The downside of representing structured geospatial data in OWL and RDF is that it can result in inefficient data access. SPARQL (SPARQL Protocol and RDF Query Language) is recommended for general RDF queries, while GeoSPARQL is proposed as an extension of SPARQL for querying geospatial data. However, the runtime cost of GeoSPARQL queries can be high due to the fine-grained nature of RDF data models. There are several challenges to using ontologies for geospatial semantic interoperability, but these can be overcome through collaboration.
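The fine-grained nature of the RDF data model described above can be sketched in plain Python: every fact is one (subject, predicate, object) triple, and a query is a pattern match over the triple store. This is a minimal illustration, not a real triple store; all URIs, prefixes, and values below are hypothetical examples.

```python
# A minimal sketch of RDF-style triples and a naive pattern query.
# Each fact is one (subject, predicate, object) triple, so even a simple
# geographic feature expands into several rows -- the granularity that
# contributes to the runtime cost of GeoSPARQL-style queries.
triples = [
    ("ex:Lake1", "rdf:type",  "ex:Lake"),
    ("ex:Lake1", "ex:name",   "Clear Lake"),
    ("ex:Lake1", "geo:asWKT", "POINT(-122.6 39.0)"),
    ("ex:City1", "rdf:type",  "ex:City"),
    ("ex:City1", "ex:name",   "Lakeport"),
]

def match(pattern, store):
    """Return triples matching an (s, p, o) pattern.
    None acts as a wildcard, like a SPARQL variable."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Rough analogue of: SELECT ?s WHERE { ?s rdf:type ex:Lake }
lakes = [s for s, _, _ in match((None, "rdf:type", "ex:Lake"), triples)]
print(lakes)  # ['ex:Lake1']
```

A real SPARQL engine joins many such pattern matches, which is why reassembling one feature from its scattered triples can be expensive.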

PD-37 - Open Source Software Development

Open source geospatial software is now ubiquitous – it is used and supported across industry, in government agencies, and in research institutions and academia. This entry describes general principles of open source software development and provides an overview of the development platforms and tools. Specific focus is on the Open Source Geospatial Foundation’s software stack, its development principles, practices, and initiatives. Several additional major open source software systems with geospatial support are also briefly discussed, with examples of open source applications developed by integrating multiple libraries and packages.

FC-35 - Openness

The philosophy of Openness and its use in diverse areas is attracting increasing attention from users, developers, businesses, governments, educators, and researchers around the world. The technological, socio-cultural, economic, legal, institutional, and philosophical issues related to its principles, applications, benefits, and barriers to its use are growing areas of research. The word “Open” is commonly used to denote adherence to the principles of Openness. Several fields are incorporating Openness into their activities; some, such as Open Data, Free and Open Source Software, and Open Standards for geospatial data, information, and technologies, are of particular relevance to GIS&T (Geographic Information Science and Technology). This entry presents a definition of Openness and introduces its importance to GIS&T through a list of its benefits in the fields of Open Data, Open Source Software, and Open Standards. Some of the barriers, myths, and inhibitors to Openness are then presented using the case of Free and Open Source Software (FOSS) and FOSS for Geospatial Applications (FOSS4G).

KE-33 - Organizational Models for GIS Management

Organizational structures and management practices for GIS programs are numerous and complex. This topic begins with an explanation of organizational and management concepts and context that are particularly relevant to GIS program and project management, including strategic planning and stakeholders. Specific types of organizations that typically use GIS technology are described and organizational structure types are explained. For GIS Program management, organizational placement, organizational components, and management control and policies are covered in depth. Multi-organizational GIS Programs are also discussed. Additional topics include management roles and technology trends that affect organizational structure. It concludes with a general description of GIS Project management. 

AM-04 - Overlay

The overlay operation is a critical and powerful tool in GIS that superimposes spatial and attribute information from multiple thematic map layers to produce new information. Overlay operations facilitate spatial analysis and modeling processes when used in combination with other spatial operations (e.g., buffer, dissolve, merge) to solve real-world problems. For both vector and raster data models, the input layers must be spatially aligned precisely with each other to ensure a correct overlay operation. In general, vector overlay is geometrically and computationally complex. Among the most widely used vector overlay operations are intersection, union, erase, and clip. Raster overlay combines multiple raster layers cell by cell through Boolean, arithmetic, or comparison operators. This article provides an overview of the fundamentals of overlay operations, how they are implemented for vector and raster data, and how suitability analysis is conducted.
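The cell-by-cell raster overlay described above can be sketched in a few lines of Python. This is a minimal illustration assuming two already-aligned grids of equal dimensions; the layer names and values are hypothetical.

```python
# A minimal sketch of raster overlay on two spatially aligned grids.
# 1 = criterion satisfied, 0 = not satisfied (hypothetical layers).
slope_ok   = [[1, 1, 0],
              [1, 0, 0],
              [1, 1, 1]]   # e.g., slope below some threshold
landuse_ok = [[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1]]   # e.g., a suitable land-use class

def overlay(a, b, op):
    """Combine two aligned rasters cell by cell with a binary operator."""
    assert len(a) == len(b) and all(len(r1) == len(r2) for r1, r2 in zip(a, b))
    return [[op(x, y) for x, y in zip(r1, r2)] for r1, r2 in zip(a, b)]

# Boolean AND overlay: a cell is suitable only where both criteria hold --
# the core of a simple raster suitability analysis. Arithmetic or
# comparison operators can be passed in the same way.
suitable = overlay(slope_ok, landuse_ok, lambda x, y: x & y)
print(suitable)  # [[1, 0, 0], [1, 0, 0], [0, 1, 1]]
```

Weighted suitability would use an arithmetic operator instead, e.g. `lambda x, y: 0.6 * x + 0.4 * y`.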

FC-04 - Perception and Cognitive Processing of Geographic Phenomena: a Choropleth Map Case Study

The near ubiquity of maps has created a population that is adept at reading and understanding maps. But while maps are familiar, how the human brain processes the information they convey is less well understood. Discussing the processing of geographic phenomena could take different avenues: specific geospatial thinking skills, general perception and cognition processes, or even the different parts of the human brain that are invoked when thinking geographically. This entry traces the processing of geographic phenomena using a choropleth map case study, beginning with perception — the moment the phenomena enter the human brain via our senses — and ending with cognition — how meaning and understanding are generated.

FC-03 - Philosophical Perspectives

This entry follows in the footsteps of Anselin’s famous 1989 NCGIA working paper entitled “What is special about spatial?” (a report that is very timely again in an age when non-spatial data scientists are ignorant of the special characteristics of spatial data), where he outlines three unrelated but fundamental characteristics of spatial data. In a similar vein, I am going to discuss some philosophical perspectives that are internally unrelated to each other and could warrant individual entries in this Body of Knowledge. The first one is the notions of space and time and how they have evolved in philosophical discourse over the past three millennia. Related to these are aspects of absolute versus relative conceptions of these two fundamental constructs. The second is a brief introduction to key philosophical approaches and how they impact geospatial science and technology use today. The third is a discussion of which of the promises of the Quantitative Revolution in Geography and neighboring disciplines have been fulfilled by GIScience (and what is still missing). The fourth and final one is an introduction to the role that GIScience may play in what has recently been formalized as theory-guided data science.

DM-36 - Physical Data Models

Constructs within a particular implementation of database management software guide the development of a physical data model, which is the product of a physical database design process. A physical data model documents how data are to be stored and accessed on the storage media of computer hardware. A physical data model depends on the specific data types and indexing mechanisms used within database management system software. Data types such as integers, reals, and character strings, among many others, lead to different storage structures. Indexing mechanisms such as R-trees and hash functions lead to differences in access performance. Physical data modeling choices about data types and indexing mechanisms thus refine the details of a physical database design. The data types associated with field, record, and file storage structures, together with the access mechanisms to those structures, enable and constrain the performance of a database design. Since all software runs on an operating system, field, record, and file storage structures must be translated into operating system constructs to be implemented. As such, all storage structures are contingent on the operating system and particular hardware that host the data management software.
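How data type choices determine a fixed-length record storage structure can be sketched with Python's standard `struct` module. The record layout below (an integer identifier, two real coordinates, and a fixed-width character name) is a hypothetical example, not any particular DBMS's format.

```python
# A minimal sketch of a fixed-length record storage structure,
# assuming a hypothetical layout: integer id, two reals, 16-byte name.
import struct

# "<idd16s": little-endian 4-byte integer, two 8-byte reals (doubles),
# and a 16-byte character string. Each data type choice fixes how many
# bytes the field occupies on the storage medium.
RECORD = struct.Struct("<idd16s")

packed = RECORD.pack(42, -122.6, 39.0, b"Clear Lake")
print(RECORD.size)  # 36 -- every record occupies exactly 36 bytes

# Fixed-length records allow direct access: record i starts at byte
# offset i * RECORD.size in the file, with no scanning required.
fid, x, y, name = RECORD.unpack(packed)
print(fid, name.rstrip(b"\x00"))  # 42 b'Clear Lake'
```

Choosing a 4-byte integer instead of an 8-byte one, or a variable-length string instead of a fixed-width one, would change both the record size and the access arithmetic, which is exactly the kind of trade-off a physical design documents.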
