
Bias in computer systems

Published: 01 July 1996

Abstract

From an analysis of actual cases, three categories of bias in computer systems have been developed: preexisting, technical, and emergent. Preexisting bias has its roots in social institutions, practices, and attitudes. Technical bias arises from technical constraints or considerations. Emergent bias arises in a context of use. Although others have pointed to bias in particular computer systems and have noted the general problem, we know of no comparable work that examines this phenomenon comprehensively and which offers a framework for understanding and remedying it. We conclude by suggesting that freedom from bias should be counted among the select set of criteria—including reliability, accuracy, and efficiency—according to which the quality of systems in use in society should be judged.

References

[1]
Berlins, M. and Hodges, L. 1981. Nationality Bill sets out three new citizenship categories. The London Times (Jan. 15), 1, 15.
[2]
Corbató, F. J., Merwin-Daggett, M., and Daley, R. C. 1962. An experimental time-sharing system. In Proceedings of the Spring Joint Computer Conference. Spartan Books, 335-344.
[3]
Fishlock, T. 1981. Delhi press detect racism in Nationality Bill. The London Times (Jan. 20).
[4]
Fotos, C. P. 1988. British Airways assails U.S. decision to void CRS agreement with American. Aviat. Week Space Tech. (Oct. 24), 78.
[5]
GAO. 1992. Patriot Missile defense: Software problem led to system failure at Dhahran, Saudi Arabia. GAO/IMTEC-92-26, U.S. General Accounting Office, Washington, D.C.
[6]
Graettinger, J. S. and Peranson, E. 1981a. The matching program. New Engl. J. Med. 304, 1163-1165.
[7]
Graettinger, J. S. and Peranson, E. 1981b. National resident matching program. New Engl. J. Med. 305, 526.
[8]
Huff, C. and Cooper, J. 1987. Sex bias in educational software: The effect of designers' stereotypes on the software they design. J. Appl. Soc. Psychol. 17, 519-532.
[9]
Johnson, D. G. and Mulvey, J. M. 1993. Computer decisions: Ethical issues of responsibility and bias. Statistics and Operations Res. Series SOR-93-11, Dept. of Civil Engineering and Operations Research, Princeton Univ., Princeton, N.J.
[10]
Leith, P. 1986. Fundamental errors in legal logic programming. Comput. J. 29, 225-232.
[11]
Moor, J. 1985. What is computer ethics? Metaphilosophy 16, 266-275.
[12]
Ott, J. 1988. American Airlines settles CRS dispute with British Airways. Aviat. Week Space Tech. (July 18).
[13]
Roth, A. E. 1984. The evolution of the labor market for medical interns and residents: A case study in game theory. J. Pol. Econ. 92, 6, 991-1016.
[14]
Roth, A. E. 1990. New physicians: A natural experiment in market organization. Science 250 (Dec. 14), 1524-1528.
[15]
Sergot, M. J., Sadri, F., Kowalski, R. A., Kriwaczek, F., Hammond, P., and Cory, H. T. 1986. The British Nationality Act as a logic program. Commun. ACM 29, 370-386.
[16]
Shifrin, C. A. 1985. Justice will weigh suit challenging airlines' computer reservations. Aviat. Week Space Tech. (Mar. 25), 105-111.
[17]
Sudarshan, A. and Zisook, S. 1981. National resident matching program. New Engl. J. Med. 305, 525-526.
[18]
Taib, I. M. 1990. Loophole allows bias in displays on computer reservations systems. Aviat. Week Space Tech. (Feb.), 137.
[19]
Williams, K. J., Werth, V. P., and Wolff, J. A. 1981. An analysis of the resident match. New Engl. J. Med. 304, 19, 1165-1166.



Reviews

Darin Chardin Savage

Friedman and Nissenbaum present a fascinating overview of bias within computer systems. The variety of systems surveyed—banking, commerce, computer science, education, medicine, and law—allows for both a broad-ranging and poignant discussion of bias, which, if undetected, may have serious and unfair consequences. The rapid dissemination and ready acceptance of computer algorithms mean that biases may easily affect a large number of unsuspecting people. The difficulty, the authors observe, is in identifying and describing the nature of the bias, for this issue has not been comprehensively addressed in the computer literature. Most experts on social bias and discrimination approach the issue from a legal or philosophical background and may not be equipped to interpret and translate issues from a technological standpoint. This places the onus of responsibility on the technological professions themselves, and this paper provides a very thoughtful and thorough framework for meeting this responsibility. From the variety of cases, the authors are able to identify three forms of bias—preexisting, technical, and emergent. The descriptions of how these biases are integral to the cases seem right on target. My only wish as a reader was for a greater quantification of the actual impact of the bias, whether in economic, institutional, or social terms. The authors do give one example that alludes to a quantification of impact by noting that 90 percent of airline reservations are pulled from the first screen of the database. Therefore, those airlines listed only on subsequent screens would be at a great disadvantage. I wanted to know, however, to what extent the businesses are economically disadvantaged by this arrangement and how many people were affected by the BNAP immigration program or the National Resident Match Program.
Perhaps the data are not available or are difficult to determine, but the issues raised in the paper evoke an expectation for this kind of data, which, if presented, would highlight the significance of bias. The authors do show unequivocally that bias exists in the case examples, and where and how it exists within the systems. Their rigorous definition of what constitutes bias and their call for standards of unbiased programming that meet clearly defined criteria offer a useful platform from which to address the complex issues of ethics and equity in the information age.


Published In

ACM Transactions on Information Systems  Volume 14, Issue 3
July 1996
111 pages
ISSN:1046-8188
EISSN:1558-2868
DOI:10.1145/230538

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. bias
  2. computer ethics
  3. computers and society
  4. design methods
  5. ethics
  6. human values
  7. social computing
  8. social impact
  9. standards
  10. system design
  11. universal design
  12. values



Cited By

  • (2025) Fairness for machine learning software in education: A systematic mapping study. Journal of Systems and Software 219, 112244 (Jan. 2025). DOI: 10.1016/j.jss.2024.112244
  • (2024) The Need for Emotional Intelligence in Human-Computer Interactions. In Harnessing Artificial Emotional Intelligence for Improved Human-Computer Interactions, 82-106 (June 2024). DOI: 10.4018/979-8-3693-2794-4.ch006
  • (2024) Responsible innovation and digital platforms: The case of online food delivery. Journal of Innovation Economics & Management 43, 1, 215-246 (Jan. 2024). DOI: 10.3917/jie.pr1.0155
  • (2024) Bias and Fairness in AI Models: How can Bias in AI Models be Identified, Mitigated, and Prevented in Data Science Practices? International Journal of Innovative Science and Research Technology, 868-872 (Sept. 2024). DOI: 10.38124/ijisrt/IJISRT24SEP789
  • (2024) Measuring and Mitigating Racial Bias in Large Language Model Mortgage Underwriting. SSRN Electronic Journal (2024). DOI: 10.2139/ssrn.4812158
  • (2024) When Computers Say No: Towards a Legal Response to Algorithmic Discrimination in Europe. SSRN Electronic Journal (2024). DOI: 10.2139/ssrn.4735345
  • (2024) Chameleon: Foundation Models for Fairness-Aware Multi-Modal Data Augmentation to Enhance Coverage of Minorities. Proceedings of the VLDB Endowment 17, 11, 3470-3483 (Aug. 2024). DOI: 10.14778/3681954.3682014
  • (2024) Preparedness and Response in the Century of Disasters. Information Systems Research 35, 2, 460-468 (June 2024). DOI: 10.1287/isre.2024.intro.v35.n2
  • (2024) Strong or thin digital democracy? The democratic implications of Taiwan's open government data policy in the 2010s. Big Data & Society 11, 4 (Nov. 2024). DOI: 10.1177/20539517241296038
  • (2024) No recognised ethical standards, no broad consent: navigating the quandary in computational social science research. Research Ethics 20, 3, 433-452 (Apr. 2024). DOI: 10.1177/17470161241247686
