A few weeks before the Sixth EMBO/EMBL Joint Conference on ‘Science and Security’ in Heidelberg, Germany, the scientific journals Nature and Science published articles describing, respectively, the genome sequence of the 1918 Spanish flu virus (Ghedin et al, 2005) and the reconstruction of the virus from this sequence (Tumpey et al, 2005). The reconstruction research, conducted at the US Centers for Disease Control and Prevention (Atlanta, GA, USA), provided important insights linking the epidemiology of the pandemic, which killed between 20 and 50 million people worldwide in 1918–1919, and the genetic mutability and virulence of the virus. This information is highly relevant today in light of the avian‐flu threat, according to the senior author of the Nature paper, Steven Salzberg. But is this enough to justify depositing the sequence data in a publicly accessible database and reawakening the virus? Two speakers at the conference had radically different views on that point, and thus set the tone for the presentations and discussions on the security implications of biological research.
The nub of the question is how to safeguard dual‐use research from nefarious applications. Some might even ask whether such research should be done at all. But upstream of that debate is the very definition of ‘dual use’ itself. Could not all scientific discoveries, from those of the Ancient Greeks to those of modern‐day researchers, be classified as more or less dual use? In analysing the problem at the laboratory level, society risks devaluing curiosity‐driven research that produces unforeseen benefits as well as risks. To place bans on certain types of research seems an anti‐scientific approach to the problem. But some would argue that forsaking unforeseen benefits is an acceptable trade‐off for avoiding the risk of a catastrophe, which takes us into the thorny domain of value judgements and risk assessment. Moreover, the dual‐use nature of some research emerges only once the research has already been done. Publication of the research and, if applicable, of related information such as genome sequences is a point at which some control can be exercised to minimize the risk of dangerous research. However, as demonstrated by at least one recent case in the USA (Alberts, 2005), the mechanisms for such control are far from clear, both on the side of the authorities and within the scientific community.
Whatever the potential risks today, it cannot be ignored that scientists actively participated in biological‐weapons research programmes and even in the use of such weapons in wartime. Whether as political pawns or as patriots, scientists have been crucial to the development of military technologies. Some national research programmes still focus on bioweapons, even though their development and stockpiling are prohibited by the 1972 Biological and Toxin Weapons Convention and their use in war by the 1925 Geneva Protocol. It is often difficult to draw a line between research on pathogens for defence purposes and research on developing pathogens for biological warfare itself; much of the information produced in the two endeavours is similar. Furthermore, scientists have frequently trodden a narrow line between advocating the development of certain research findings, and denouncing the wartime, political or societal use of the resulting technology.
On the second day of the EMBO/EMBL Joint Conference, the reflections turned to the science and technology of identifying individuals using biometrics and DNA profiles, and their advantages and dangers for individuals and society. The USA, for example, has passed laws requiring that any passport issued after October 2007 contain biometric information, such as a fingerprint or an iris scan, in addition to the holder's photograph. In fact, biometrics, such as iris‐ or face‐scanning technology or fingerprinting, is rapidly becoming commonplace for controlling access to public and private locations. Meanwhile, the UK is expanding its National DNA Database to include not only the profiles of convicted criminals, but also those of dismissed suspects, which causes much concern among advocates of civil liberties.
Whereas the probability of misidentifying someone has diminished, the pressure to assure people that they are safe has never been greater. Pressure leads to mistakes, and even scientifically based methods for increasing security have limitations that can unwittingly, or even wittingly, lead to wrongful convictions. Of more general concern, however, is the notion that our biological identity and everything we do can be recorded and stored digitally for some, as yet undefined, use. How should society proceed? Clearly, in democracies, the views of all sides must be heard, and scientists should certainly be represented and present their view of the future and of how their research contributes to it. But should scientists remain dispassionate, communicating only scientific facts and probabilities?
Today the moral convictions of scientists are increasingly in the public eye. The pressures of funding and publication are prone to create dilemmas, the solutions to which are often very personal decisions. Scientists are, after all, people like anyone else. Although talk of a Hippocratic‐style oath for scientists continues, it must be said that the scientific community in general has been very aware of ethical and safety questions, addressing them through initiatives from the Pugwash Conferences (first held in 1957) and the 1975 Asilomar Conference, to the establishment of Scientists for Global Responsibility in 1992. Perhaps the most successful approach is simply discussing such concerns. The involvement of scientists in debates on the use of technologies is an important symbol in itself, as important as the involvement of the general public.
Whether it is by examining research practice in the laboratory, or assessing the risk to public safety from publication or application of research findings, it is clear that scientists must engage themselves in these discussions. If they stand back from the process, the results will inevitably be rules, regulations and even legislation with which they might not agree, which do not reflect scientific realities, and which hinder research to the detriment of mankind. We hope that the articles in this special issue of EMBO reports, arising from the EMBO/EMBL Joint Conference on ‘Science and Security’, stimulate creative thought and reflection. Perhaps through this publication, a single event will become an expanding ripple.
- Copyright © 2006 European Molecular Biology Organization