You know the stereotype. Soulless Dr. Strangeloves inventing next-gen weapons technologies, from bionic warriors to robot drones, and creating frightening new technologies of coercion and control, such as uploading memories or know-how from chips to brains. All of it is supposedly driven solely by the sweetness of the problem and by the need to stay two steps ahead of our adversaries. It’s science and engineering running out of control, heedless of the human cost and the potential for misuse. Ethics? Ethics is for philosophers, priests, and the faint of heart. We don’t do ethics. Ethics would only get in the way, stifling the scientific imagination.
Not true. On the contrary, DARPA now wants to own the ethics; DARPA wants to bring the ethics right into the heart of the R&D process. And DARPA has a plan. You can read it in the just-released National Research Council/National Academy of Engineering committee report, “Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues.” Funded by DARPA, the report presents the work of a distinguished committee of experts from both the civilian and military sectors, representing a wide array of technical fields and including lawyers, ethicists, and policy experts.
Wordy and ponderous like any such committee report, it is nonetheless a remarkable document, both for what it says about DARPA’s determination to be ahead of the curve on the ethics as well as the technology, and for its specific recommendations. Though the report is tailored mainly to a funding agency like DARPA, many of the ideas it puts forward will apply across the entire spectrum of government weapons R&D agencies and labs, as well as corresponding private-sector entities. And the model can easily be adapted for use in other technical areas.
The report’s first recommendation calls bluntly and forcefully for an open, public declaration by the leaders of “interested agencies” that they are themselves committed to ongoing engagement with ethical, legal, and societal issues (ELSI), and stipulates that this declaration include a clear “designation of functional accountability” for such issues within the agency. It goes on to specify a five-step process that includes initial ELSI screening of projects, public engagement, and ongoing review to spot new issues that might arise in the course of R&D. The report further recommends that managers receive regular ELSI training and that agencies build external ELSI expertise.
Late last month, the University of Notre Dame’s Reilly Center for Science, Technology, and Values hosted a two-day workshop on the report, focused on the question of how to operationalize the recommended institutional engagement with ELSI in weapons R&D. Participants included DARPA Deputy Director Dr. Steven Walker and the committee chair, Dr. Herbert S. Lin of the National Research Council, along with committee members and experts on technology ethics. The participants agreed on the urgency of foregrounding ethics in the weapons R&D process, not just to prevent future mistakes, such as those that may have been made in the run-up to the launch of the Total Information Awareness program in 2002, but also to enhance the quality and impact of our nation’s weapons development efforts. The general view seemed to be that doing it right is doing it well.
There was also a clear understanding that getting the ethics right can be just as hard as getting the science and engineering right. Ethics is a notoriously complicated and contested arena. Moreover, forecasting future ethics challenges even from existing technologies verges on the impossible.
These are important problems. They are also hard problems. But this is DARPA, and DARPA likes hard problems. DARPA hard.