The Case for Anticipating Undesirable Consequences of Computing Innovations Early, Often, and Across Computer Science

Authors: Rock Yuren Pang, Dan Grossman, Tadayoshi Kohno, Katharina Reinecke

Published: 2023-09-08 17:32:22+00:00

AI Summary

This paper argues for proactively addressing the unintended negative consequences of computing innovations. It proposes a framework, the PEACE Project, to encourage researchers to consider ethical implications early and often throughout the research lifecycle, rather than as an afterthought.

Abstract

From smart sensors that infringe on our privacy to neural nets that portray realistic imposter deepfakes, our society increasingly bears the burden of negative, if unintended, consequences of computing innovations. As the experts in the technology we create, Computer Science (CS) researchers must do better at anticipating and addressing these undesirable consequences proactively. Our prior work showed that many of us recognize the value of thinking preemptively about the perils our research can pose, yet we tend to address them only in hindsight. How can we change a culture in which considering the undesirable consequences of digital technology is deemed important but not commonly practiced?

Key findings

The paper highlights that current methods for addressing ethical concerns in computing research are insufficient. The PEACE Project is proposed as a solution, but its effectiveness in changing researcher behavior and institutional culture remains to be evaluated.

Approach

The PEACE Project aims to foster a culture change by providing resources and support for researchers to anticipate and mitigate undesirable consequences. This includes case studies, brainstorming tools, access to an ethics board, and a PEACE Report to document ethical considerations throughout the project.

Datasets

UNKNOWN

Model(s)

UNKNOWN

Author countries

USA