GS-11 - Professional and Practical Ethics of GIS&T


Geospatial technologies are often and rightly described as “powerful.” With power comes the ability to cause harm – intentionally or unintentionally – as well as to do good. In the context of GIS&T, Practical Ethics is the set of knowledge, skills, and abilities needed to make reasoned decisions in light of the risks posed by geospatial technologies and methods in a wide variety of use cases. Ethics have been considered from different viewpoints in the GIS&T field. A practitioner's perspective may be based on a combination of "ordinary morality," institutional ethics policies, and professional ethics codes. By contrast, an academic scholar's perspective may be grounded in social or critical theory. What these perspectives have in common is reliance on reason to respond with integrity to ethical challenges. This entry focuses on the special obligations of GIS professionals, and on a method that educators can use to help students develop the moral reasoning skills that GIS professionals need. The important related issues of Critical GIS and Spatial Law and Policy are considered elsewhere.

Author and Citation Info: 
The latest version of the entry "Professional and Practical Ethics of GIS&T" may be cited as:
 
DiBiase, D. (2017). Professional and Practical Ethics of GIS&T. The Geographic Information Science & Technology Body of Knowledge (2nd Quarter 2017 Edition), John P. Wilson (ed.). doi: 10.22224/gistbok/2017.2.2
 
This entry was published on May 5, 2017. 
 
Older versions of "Professional and Practical Ethics of GIS" are available in the following editions: Quarter 2, 2016 (first archived) [author: UCGIS].
Topic Description: 

1. Definitions

2. Professional and practical ethics

3. Emergence of a practical ethics of GIS

4. Critical perspectives on GIS ethics

5. Moral reasoning

6. The case method

7. Implications for teaching and learning

 

1. Definitions

Case method: A pedagogical technique for strengthening the moral reasoning skills of students by analyzing ethical scenarios. 

Dialectic: A method of examining and discussing opposing ideas to find the truth.

Moral reasoning: The process of determining the difference between right and wrong in a rational way.

Practical ethics: The discipline that bridges the gap between moral philosophy and the ethical challenges that confront people and institutions in day-to-day life.

Professional ethics: An aspect of practical ethics that concerns the special moral obligations that bear upon persons in certain occupations.

 

2. Professional and practical ethics

Dennis Thompson, founding director of the Ethics Center at Harvard University, observes that “philosophical principles cannot be applied in any straightforward way to particular problems and policies” (2007). Practical ethics, he points out, is the discipline that bridges the gap between moral philosophy and the ethical challenges that confront people and institutions in day-to-day life. Professional ethics is an aspect of practical ethics that concerns the special moral obligations that bear upon persons in certain occupations. In fact, a practitioner's commitment to honor these occupation-specific obligations, above and beyond ordinary morality, is a defining characteristic of professionalism (Davis 2014). As the practice of GIS has coalesced as a profession, and as geospatial technologies have matured, so too have the special moral obligations of GIS professionals crystalized. 

 

3. Emergence of a practical ethics of GIS

Among the earliest considerations of professional ethics in cartography and GIS was an “ethics roundtable” published in 1990 (McHaffie, Andrews, Dobson, & others 1990). Contributors identified implications of inaccurate maps and data, intellectual property issues, and conflicts of interest as important ethical issues. Soon thereafter, Monmonier (1991, 1996) pointed out ways in which maps can be used to mislead decision-makers and the public, and proposed design guidelines to foster ethical practice by cartographers.

By 1993, Will Craig had laid the groundwork for a GIS Code of Ethics (Craig, 1993). After a study of existing codes in comparable organizations, and in consultation with a community of scholars and practitioners, Craig adopted a deontological approach that emphasizes treating people with respect, not as means to an end. In 2004, the newly-founded GIS Certification Institute included affirmation of the GIS Code of Ethics (written primarily by Craig) and Rules of Conduct (developed later by GISCI) among the requirements for certification as a GIS Professional (GISP). As of this writing, over 8,000 GISPs have earned certification.

Both the Code and Rules specify "Obligations to Society," "Obligations to Employers and Funders," "Obligations to Colleagues and the Profession," and "Obligations to Individuals in Society” as categories of duties to which GIS professionals are bound. Virtues enshrined in the Code and Rules include honesty ("admit when a mistake has been made"), forthrightness ("provide full, clear, and accurate information"), integrity ("avoid [or disclose] all conflicts of interest"), good citizenship ("donate services to the community"), objectivity ("not distort or alter the facts"), fairness ("treat all individuals equally"), respect and lawfulness ("honor intellectual property rights"), and responsibility ("hold paramount the safety, health, and welfare of the public" and "keep current in the field").

None of the provisions of the Code or Rules are solely applicable to GIS work. A few pertain specifically to information technology, such as "all data shall have appropriate metadata," and "allow individuals to withhold consent from being added to a database." Ethical problems uniquely posed by geospatial technologies, such as location tracking and geographic profiling, are implied but not explicitly mentioned. Dictums such as "Be aware of consequences, good and bad," "Strive to do what is right, not just what is legal," and "Define alternative strategies to reach employer/funder goals..." reflect an appreciation of the fact that ethical behavior requires ethical awareness and moral reasoning skills, not just following rigid rules about right and wrong actions. The very first Rule provides an uncomfortable example of the kind of moral ambiguity that many professionals experience at some points in their careers: "Some applications of GIS ... may harm individuals ... while advancing government policies that some citizens regard as morally objectionable. GIS professionals’ participation in such applications is a matter of individual conscience." (GISCI, 2004)

No assessment of the Code's and Rules' conformance with a "moral consensus of GIS practice," as recommended by Onsrud (1995), has been attempted. Absent that, the Code itself stands as a de facto consensus of the special obligations of GIS professionals. Though a true consensus may never be achieved, critical engagement with the Code and Rules is warranted (O'Sullivan 2008).

 

4. Critical perspectives on GIS ethics

In 1995, Jeremy Crampton critiqued the practical ethical concerns raised in the earlier “ethics roundtable” and related discussions. Long before the GIS Code and Rules were developed, he predicted that “’nailing down’ an ethical code ... is not the solution” (1995, p. 88). Following Michael Curry, Crampton expected that a code of ethics “may create the illusion of order, but is more likely merely to promote rule-following behaviour...” (Curry 1991, 144).  Crampton argued for “an ethical analysis that goes beyond ‘internalist’ judgments of good behavior … to a contextualized ‘externalist’ one.” His critique, and others like it, later coalesced as the “intellectual stance” known as Critical GIS (O'Sullivan 2008, 7). 

Critical or “externalist” perspectives on GIS appeared in the late 1980s and early 90s. As geospatial technologies matured and their applications became widespread, scholars as well as practitioners began to express concerns about the ethical implications of their use. Brian Harley (1988) was in the vanguard of scholars who questioned the assumption that maps are impartial and value-neutral depictions. By 1991, he challenged map makers to consider whether there could be “an ethically informed cartography, and if so, what should be its agenda?” (Harley 1991, p. 13).

At about the same time, Pickles (1991) highlighted the use of GIS as a surveillance technology, while Smith (1992) alleged that the makers and users of geospatial technologies were complicit in the killings associated with what he considered to be a morally questionable Gulf War. By 1995, a substantial literature focused on ethical and epistemological critiques of GIS and related technologies had appeared (e.g., Pickles 1995), and a widening gulf of misunderstanding and mistrust had separated critical scholars from proponents and practitioners of GIS and related technologies (Schuurman 2000). Critique verged on alarmism when Dobson and Fisher (2003) warned of “a new form of slavery characterized by location control” (p. 47), arguing that “...the countless benefits of [location-based services, including human tracking] are countered by social hazards unparalleled in human history” (p. 47).

Skeptical about the sufficiency of an ethics code to help GIS practitioners to mitigate such hazards, Crampton argued instead for “approaching ethical issues from an internalist and externalist dialectic” that “modifies both internal and external perspectives.” “Dialectic” is a method of examining and discussing opposing ideas in order to find the truth. In practical ethics, one such method is known as moral reasoning. 

 

5. Moral reasoning

Moral reasoning is the process of determining the difference between right and wrong in a rational way. Moral reasoning is needed to resolve ethical problems that do not present alternatives that are obviously right or wrong.

For example, imagine yourself an independent GIS services contractor who's been offered a lucrative contract by a municipal government to map Muslim neighborhoods in a major city. City police intend the project to identify enclaves of Muslims who are susceptible to radicalization, and to target outreach activities designed to mitigate the risk of domestic terrorist attacks. On the other hand, community leaders and others oppose the plan as geographic profiling. Should you accept the contract, decline it, or respond in some other way? Unless you're willing to “go with your gut,” the right decision may not be obvious to you. Several provisions of the GIS Code of Ethics and GISCI Rules of Conduct pertain, but none provides decisive guidance. In such a non-trivial ethical case, moral reasoning is more likely than intuition to lead to a good decision.

Using geospatial technologies, as in the “mapping Muslim neighborhoods” scenario described above (National Public Radio 2007), sometimes gives rise to ethical concerns. However, the technologies themselves also cause concern as they evolve and proliferate. For example, camera-equipped drones and associated image processing software provide efficient means to monitor and map relatively small areas. But they also fuel worries about privacy and safety, as well as intended and unintended consequences of police or military surveillance. Elsewhere, however, they enable deliveries of life-saving medicines and supplies to remote settlements, and monitoring of species threatened by poachers. Autonomous drones pose potentials for good and ill that outstrip governments’ ability to regulate their use (Pomfret 2017).

Self-driving cars and trucks are chockablock with geospatial technologies, including mobile lidar, radar, video, GPS, inertial measurement, high-resolution "HD" digital maps, and network analysis capabilities like routing. While the potential benefits of fewer accidents and injuries, and savings of time and money, assure the continuing development and proliferation of autonomous trucks and cars, some observers worry about the ethical issues that such vehicles raise. Consider the scenario posed by philosopher Eric Schwitzgebel:

"You and your daughter are riding in a driverless car along the Pacific Coast Highway. The autonomous vehicle rounds a corner and detects a crosswalk full of children. It brakes, but your lane is unexpectedly full of sand from a recent rock slide. It can't get traction. Your car does some calculations: If it continues braking, there's a 90% chance that it will kill at least three children, Should it save them by steering you and your daughter off the cliff?" (Schwitzgebel 2015) 

The Internet of Things – the burgeoning billions of location-aware sensors that occupy our products and devices, and that monitor our environments like unsleeping sentries – poses perhaps the greatest ethical challenges. The machine learning techniques that are needed to make sense of the really big data that the IoT generates each moment may lead to the decoupling of intelligence from consciousness (Harari 2016). Who then will be prepared to code ethical algorithms for an autonomous spatial decision (support) system?
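
The driverless-car scenario above hints at why "coding ethical algorithms" is so difficult: any implementation must reduce competing moral claims to explicit, contestable numbers. The sketch below is a deliberately naive illustration, not a proposal; it uses Python and entirely hypothetical probabilities and harm counts loosely drawn from Schwitzgebel's scenario, and every constant in it embeds a moral judgment that reasonable people could dispute.

# A deliberately naive sketch of an "ethical" choice rule for the
# driverless-car scenario above. All numbers are hypothetical; the point
# is that each one encodes a contestable moral judgment, not engineering fact.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    p_fatality: float     # estimated probability that someone is killed
    deaths_at_risk: float  # number of people likely to die if that happens

def expected_harm(option: Option) -> float:
    """Expected fatalities under this option (a crude utilitarian measure)."""
    return option.p_fatality * option.deaths_at_risk

# Schwitzgebel's scenario, roughly quantified (hypothetical figures):
brake_only   = Option("continue braking", p_fatality=0.9, deaths_at_risk=3)  # children in crosswalk
swerve_cliff = Option("steer off cliff",  p_fatality=1.0, deaths_at_risk=2)  # you and your daughter

choice = min([brake_only, swerve_cliff], key=expected_harm)
print(f"Naive utilitarian choice: {choice.name}")

# Note what this code silently decides: that harms are commensurable,
# that passengers and pedestrians count equally, and that the probability
# estimates are trustworthy. None of those assumptions is ethically neutral.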

Helping students develop stronger moral reasoning skills is an overarching goal of ethics education (Dark and Winstead 2005). Developing moral reasoning skills requires “a framework for evaluating ethical dilemmas and making decisions. In accepting the premise that technology is value-laden, we stress the need to teach a methodology of explicit ethical analysis in all decision-making related to technology” (Martin & Holz 2010). One such framework is Davis’ (1999) “seven-step guide to ethical decision making,” outlined below; similar models have been suggested by Keefer and Ashley (2001) and others. The guide, applied to non-trivial ethics case studies, helps educators coach students to suspend judgment until they've identified, considered, and rigorously tested a range of possible options.

Step 1: State problem. For example, “there's something about this decision that makes me uncomfortable” or “do I have a conflict of interest?”

Step 2: Check facts. Many problems disappear upon closer examination of the situation, while others change radically.

Step 3: Identify relevant factors. For example, persons involved, laws, professional code, other practical constraints.

Step 4: Develop list of options. Be imaginative, try to avoid “dilemma”; not “yes” or “no” but whom to go to, what to say.

Step 5: Test options. Use such tests as the following:
  • Harm test: does this option do less harm than alternatives?
  • Publicity test: would I want my choice of this option published in the newspaper?
  • Defensibility test: could I defend choice of option before a Congressional committee or committee of peers?
  • Reversibility test: would I still think choice of this option good if I were adversely affected by it?
  • Colleague test: what do my colleagues say when I describe my problem and suggest this option as my solution?
  • Professional test: what might my profession's governing body or ethics committee say about this option?
  • Organization test: what does the company's ethics officer or legal counsel say about this?

Step 6: Make a choice based on steps 1-5.

Step 7: Review steps 1-6. What could you do to make it less likely that you would have to make such a decision again? Are there any precautions you can take as an individual (announce your policy on the question, change jobs, etc.)? Is there any way to have more support next time? Is there any way to change the organization (for example, suggest a policy change at the next departmental meeting)?

Michael Davis’ (1999) Seven-step guide to ethical decision-making. This is one of several frameworks recommended by ethicists to help students strengthen moral reasoning skills by methodically analyzing ethics case studies. 
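
For classroom use, the seven steps can also be treated as a simple checklist. The sketch below is a hypothetical illustration only – it is not part of Davis' published guide or of the GIS Professional Ethics project materials – showing one way an instructor might encode the steps so that students record their analysis of a case in a structured, reviewable form. The CaseAnalysis class and its fields are invented for this example.

# A minimal sketch of Davis' seven-step guide as a structured worksheet.
# The step names follow the guide above; the CaseAnalysis class and its
# fields are hypothetical conveniences for classroom record-keeping.

from dataclasses import dataclass, field
from typing import Dict, List

SEVEN_STEPS = [
    "State the problem",
    "Check the facts",
    "Identify relevant factors",
    "Develop a list of options",
    "Test options (harm, publicity, defensibility, reversibility, colleague, professional, organization)",
    "Make a choice based on steps 1-5",
    "Review steps 1-6 and identify precautions for the future",
]

@dataclass
class CaseAnalysis:
    case_title: str
    responses: Dict[int, str] = field(default_factory=dict)  # step number -> student response

    def record(self, step: int, response: str) -> None:
        """Record the student's response to one step of the guide."""
        if step not in range(1, len(SEVEN_STEPS) + 1):
            raise ValueError(f"step must be between 1 and {len(SEVEN_STEPS)}")
        self.responses[step] = response

    def unfinished_steps(self) -> List[str]:
        """Return the steps the student has not yet addressed, in order."""
        return [f"{i}. {name}" for i, name in enumerate(SEVEN_STEPS, start=1)
                if i not in self.responses]

# Example use with the contractor scenario from Section 5:
analysis = CaseAnalysis("Mapping Muslim neighborhoods")
analysis.record(1, "Accepting the contract may amount to geographic profiling.")
analysis.record(3, "GIS Code of Ethics: Obligations to Society and to Individuals in Society.")
print(analysis.unfinished_steps())   # remaining steps 2, 4, 5, 6, and 7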

 

6. The case method

Onsrud (1995) recommended a study of the moral reasoning of GIS professionals in response to a set of “ethical conflict scenarios” to determine whether a moral consensus about GIS practice exists. The case method is a common pedagogical technique for strengthening the moral reasoning skills of students in business, medicine, law, engineering, and computer and information science (Davis 1999, Keefer and Ashley 2001, Quinn 2006c). In the context of professional ethics, case studies are realistic workplace scenarios that challenge students to analyze ethical problems rationally and to identify and evaluate options.

In 2007, the National Science Foundation funded a project to create a set of ethics case studies for use in GIS education. Fifteen cases are available at http://gisprofessionalethics.org, along with instructor resources on request. (The scenario described in the preceding section is a synopsis of one of the cases.) Other case studies include:

  • Mobile phone tracking: Researchers track mobile phone users' movements to derive predictive models of human mobility.
  • Public access to government map data: A governmental agency's need to recoup user fees conflicts with a public records law.
  • E-911 Contract case: A municipal GIS manager troubled by what appears to be a conflict of interest considers filing a formal ethics complaint.
  • Tidal wetland mapping case: A scope of work statement and established mapping procedures prevent a GIS analyst from adding wetlands to a conservation database.
  • Collateral damage case: A geospatial intelligence analyst predicts the civilian casualties likely to be caused by a pre-emptive missile attack.

At Penn State University, the cases, along with Davis’ seven-step guide, have been used by nearly 300 graduate students since 2009 in a required “Responsible Scholarship and Professional Practice” workshop. Pre- and post-workshop questionnaires suggest that the case method has increased students’ awareness of ethical problems associated with GIS, enabled them to demonstrate moral reasoning abilities, and strengthened their belief that ethics education should be a required part of the preparation of geospatial professionals. 

 

7. Implications for teaching and learning

To the extent that GIS&T education is meant to prepare students to become, or advance as, GIS professionals, it ought to help students develop ethical awareness and moral reasoning abilities. Indeed, preparing students for successful and meaningful futures is arguably one of the special moral obligations that bear upon GIS education professionals. But even GIS courses, certificates, and degree programs that do not emphasize pre-service or in-service professional development should prepare students to challenge the status quo and question assumptions about right and wrong actions, whatever their walk of life. The case method is a tried and true way of accomplishing that.

However, a recent survey of 312 GIS course syllabi at U.S. colleges and universities provides no evidence that professional ethics, practical ethics, or critical perspectives are included (Wikle and Fagin 2014). Although "exclusion of a topic may not imply that a topic was [not] covered" (Ibid., 583), it is certain that no disciplinary mandate for ethics training yet exists for GIS&T, as it does for the Information Systems, Computer Science, and Engineering disciplines. Davis (2006, 717) observes that engineering faculty members resisted adding ethics to their crowded curricula because "there wasn't room." He suggests "micro-insertions" – concise considerations of ethical problems that can be slipped into existing lessons – as an effective alternative to entire ethics lessons or classes. The ethics case studies produced by the GIS Professional Ethics project are designed to facilitate the micro-insertion of professional ethics into GIS&T education.

References: 

Portions of this entry were adapted from DiBiase, David, Francis Harvey, Christopher Goranson, and Dawn Wright (2012). The GIS Professional Ethics Project: Practical Ethics for GIS Professionals. In Unwin, David, Ken Foote, Nick Tate, and David DiBiase, Eds., Teaching Geographic Information Science and Technology in Higher Education. London: Wiley and Sons.

 

Craig, William J. (1993). A GIS Code of Ethics: What can we learn from other organizations? Journal of the Urban and Regional Information Systems Association, 5:2, 13-16.

Crampton, Jeremy (1995). The Ethics of GIS. Cartography and Geographic Information Systems, 22:1, 84-89.

Curry, Michael R. (1991). On the possibility of ethics in geography: Writing, citing, and the construction of intellectual property. Progress in Human Geography, 15:2, 125-147.

Dark, Melissa J., and Winstead, Jeanne. (2005). Using educational theory and moral psychology to inform the teaching of ethics in computing. Information Security Curriculum Development Conference ’05, September 23-24, Kennesaw, GA. Association for Computing Machinery.

Davis, Michael (1999). Ethics and the University. London: Routledge.

Davis, Michael (2006). Integrating ethics in technical courses: Micro-insertion. Science and Engineering Ethics 12:717-730. DOI: 10.1007/s11948-006-0066-z.

Davis, Michael (2014). What to consider when preparing a model core curriculum for GIS ethics: objectives, methods, and a sketch of content. Journal of Geography in Higher Education, 38:4, 471–480. DOI: 10.1080/03098265.2014.956298.

Dobson, Jerome E. & Fisher, Peter. F. (2003). Geoslavery. IEEE Technology and Society Magazine, Spring, 47-52.

Harari, Yuval Noah (2016). Homo Deus: A Brief History of Tomorrow. HarperCollins.

GIS Certification Institute (2014). Rules of Conduct. https://www.gisci.org/Ethics/RulesofConduct.aspx

Harley, J. B. (1988). Maps, Knowledge, and Power. In D. Cosgrove and S. Daniels, Eds., The Iconography of Landscape, pp. 277-312. Cambridge: Cambridge University Press.

Harley, J. B. (1991). Can there be a cartographic ethics? Cartographic Perspectives, 10, 9-16. DOI: 10.14714/CP10.1053.

Keefer, Michael and Ashley, K. D. (2001). Case-based approaches to professional ethics: A systematic comparison of students’ and ethicists’ moral reasoning. Journal of Moral Education, 30(4), 377-398. DOI: 10.1080/03057240120094869.
 
Martin, C. Diane and Holz, H. J. (2010). Non-apologetic computer ethics education: A strategy for integrating social impact and ethics into the computer science curriculum. Teaching Computer Ethics, http://rccs.southernct.edu/teaching-computer-ethics/#non-apologetic
 
McHaffie, P., Andrews, S., Dobson, M., and “Two anonymous employees of a federal mapping agency” (1990) Ethical problems in cartography: A roundtable commentary. Cartographic Perspectives, 7, 3-13. DOI: 10.14714/CP07.1095.
 
Monmonier, Mark S. (1991). Ethics and map design: Six strategies for confronting the traditional one-map solution. Cartographic Perspectives, 10, 3-8. DOI: 10.14714/CP10.1052.
 
Monmonier, Mark S. (1996). How to Lie with Maps. Chicago: University of Chicago Press. 
 
National Public Radio (2007). Plan to Map L.A.’s Muslims Sparks Outrage. http://www.npr.org/templates/story/story.php?storyId=16162012
 
Onsrud, Harlan J. (1995). Identifying unethical conduct in the use of GIS. Cartography and Geographic Information Systems, 22(1), 90-97. 
 
O’Sullivan, David (2008). What’s critical about critical GIS? In Wilson, Matthew W. and Barbara S. Poore, Theory, practice, and history in critical GIS: Reports on an AAG panel session. Cartographica 44:1, 5-16. DOI:10.3138/carto.44.1.5
 
Pickles, John. (1991). Geography, GIS, and the surveillant society. Papers and Proceedings of Applied Geography Conferences, 14: 80-91. 
 
Pickles, John, Ed. (1995). Ground Truth: The Social Implications of Geographic Information Systems. New York: Guilford.
 
Pomfret, Kevin (2017). Centre for Spatial Law and Policy. http://spatiallaw.com
 
Schuurman, Nadine (2000). Trouble in the heartland: GIS and its critics in the 1990s. Progress in Human Geography, 24(4), 569-590.
 
Schwitzgebel, Eric (2015). Will your driverless car kill you so others may live? Los Angeles Times. http://www.latimes.com/opinion/op-ed/la-oe-1206-schwitzgebel-driverless-car-safety-algorithm-20151206-story.html
 
Smith, Neil (1992). History and philosophy of geography: real wars, theory wars. Progress in Human Geography, 16, 257-271. 
 
Thompson, Dennis F. (2007). What is Practical Ethics? In Ethics at Harvard 1987-2007. Edmond J. Safra Center for Ethics, Harvard University. http://ethics.harvard.edu/what-practical-ethics
 
Wikle, Thomas A. and Fagin, Todd D. (2014). GIS course planning: A comparison of syllabi at U.S. colleges and universities. Transactions in GIS, 18:4, 574-585. DOI: 10.1111/tgis.12048

 

 

Learning Objectives: 
  • Demonstrate the ability to reason about an ethical challenge in the professional practice of GIS by methodically analyzing an ethics case study.
  • Compare and contrast professional and practical (“internalist”) perspectives and critical (“externalist”) perspectives on the ethics of GIS&T.
  • Demonstrate ethical creativity by posing multiple possible solutions to an ethical challenge. Resist the temptation to reduce such challenges to simplistic dilemmas.
  • Identify provisions of the GIS Code of Ethics that are relevant to particular ethical challenges, especially provisions that appear to conflict with one another.
Instructional Assessment Questions: 

1. What should GIS professionals know about ethics?

2. What should GIS educators teach about ethics?

3. What's the best way to teach GIS ethics?

4. Outline a multi-step process for reasoning about ethical challenges.

5. Describe a scenario that poses a non-trivial ethical challenge related to geospatial technologies.