🔬 Research Summary by Rock Yuren Pang, whose focus is on using HCI methods, crowdsourcing, and large language models to support researchers in anticipating the social impact of their work.
[Original paper by Rock Yuren Pang, Dan Grossman, Tadayoshi Kohno, and Katharina Reinecke]
Overview: Computer science research has led to many breakthrough innovations but has also grappled with unintended negative societal repercussions. Prior work showed that CS researchers recognize the value of thinking preemptively about the perils of CS research, but we tend to address them only in hindsight. This paper builds on prior work to propose our vision of facilitating a shift in institutional culture for CS researchers to anticipate the social impact of CS research.
Introduction
From smart sensors that infringe on our privacy to neural networks that generate convincingly realistic deepfakes, our society increasingly bears the burden of negative, if unintended, consequences of computing innovations. The recent flurry of negative media coverage of the adverse effects of technologies is spurring researchers across computer science disciplines to examine the ethical implications of their work more regularly. However, a critical challenge persists: how can we support CS researchers across many subfields in anticipating these social impacts before they harm society? This paper proposes a vision for reshaping academic institutional cultures to navigate and anticipate the social impacts of computing research. This vision has implications for academic and industry researchers alike.
Key Insights
“That’s important, but…”
Our vision builds on prior work that investigated researchers’ current attitudes, practices, and barriers when it comes to considering social impacts in advance. Barriers include the usual (and often real) lack of time, as a move-fast mentality and deadlines take precedence over all else; the deflection of responsibility to others (e.g., the IRB, other academic fields, those commercializing the research ideas, and other team members); the lack of formal processes and guidelines for thinking through potential negative effects; and the difficulty of accessing the diverse perspectives that are crucial to identifying potential impacts. CS researchers usually think about potential undesirable outcomes only in hindsight, e.g., after a publication venue requires an ethics statement or after a research innovation raises concerns. By that point, it is, of course, too late to pivot. Sometimes, the damage cannot be undone.
Some CS communities have attempted to alleviate these challenges over the past decades, for example by requiring researchers to submit an ethics statement with paper submissions or to undergo an Ethics and Society Review before submitting grant proposals. Such efforts raise awareness of the issue. Still, our prior findings show that these blunt mandates remain insufficient to make meaningful headway on an intractable problem: computing researchers rarely anticipate and address undesirable consequences systematically in advance.
Our Proposals
Support research subfields across CS: Recent ethics efforts in technology development have often focused on artificial intelligence (AI) rather than encouraging all computing researchers to consider potential societal effects. This focus persists even though many CS subcommunities have seen their share of sometimes severe unintended consequences (see more examples in our paper). Pointing fingers at AI risks making researchers in other subfields feel that ethical considerations are “someone else’s problem.” It may also lead us to overlook opportunities for holistic improvement across the broader CS field. Instead, we believe that all computing researchers, regardless of subdiscipline, should be supported in learning what ethics in computing means and how to proactively consider unintended consequences of our work.
Encourage early consideration: Consideration of undesirable consequences should start when formulating the research problem, when it is still possible to pivot. Waiting until the submission or publication stage to reflect on potential social impacts leaves too little time to substantially address problems and make changes to a research project. This is both because people become invested in an idea once they have put work into it and because modifying an existing innovation is considerably more time-consuming than adjusting it early on.
Encourage regular consideration: Just as research is a constantly evolving endeavor, considering undesirable consequences can surface unforeseen challenges that require ongoing feedback and re-evaluation. Rather than treating it as a one-time exercise, researchers should routinely think about the potential societal implications of their work. Achieving this will require changing institutional cultures and support systems so that researchers are incentivized to consider societal implications regularly and learn how to do so efficiently.
Support CS researchers at all levels: Many prior approaches place most of the responsibility on a single person, namely, the person submitting a paper or the PI submitting a proposal. This can lead other research team members, such as undergraduates, graduate students, postdoctoral researchers, or other collaborators, to overly rely on this one person. In fact, our prior work suggests that some faculty rely on the experience of their “more ethics-educated” students, while students, in turn, defer to senior researchers and faculty, pointing to their experience. To break this cycle of deferred responsibility, everyone on a research team should play a role and be supported in addressing this issue.
Between the lines
While these action items may appear ambitious, we are actively putting these ideas into practice as part of the ethics effort at the Allen School of Computer Science & Engineering. Over the coming years, we aim to design, refine, and assess our strategies, iteratively sharing our learnings with the wider CS community. We warmly invite ideas, feedback, and collaboration. We hope the insights from our endeavors will advance institutional practices across many academic environments and in industry at large.