Sunday, August 1, 2010

A higher degree of ethics in science?


"Why must scientists become more ethically sensitive than they used to be?"

by

John Ziman

[John Ziman was brought up in New Zealand, studied at Oxford, and lectured at Cambridge, before becoming professor of theoretical physics at the University of Bristol in 1964. He was chairman of the Council for Science and Society from 1976 to 1990, and has written extensively on various aspects of the social relations of science and technology.]

Fifty years ago, when I came into science, we rarely talked about ethical issues. I don't mean that there were no such issues, or that scientists were not, individually or in unofficial groups, speaking and acting about them. But ethics as such did not figure regularly in public discourse about science, in or beyond the scientific world.

And yet nowadays, the ethics of science not only occupies media slots and Sunday supplements but also energizes scholarly books, journals, conferences and curricula. Having spent most of my life urging my colleagues to be more "socially responsible," I am not unhappy about this. But how did this abrupt change of attitude come about? Why are scientists now expected to be so much more ethically sensitive than they used to be?

Some would see this as no more than a natural consequence of the increasing influence of science on society, magnified, perhaps, by media frenzy. Others see it as the latest battle front in the perennial "science wars." But I go further and interpret it as symptomatic of the transformation of science into a new type of social institution. As their products become more tightly woven into the social fabric, scientists are having to perform new roles in which ethical considerations can no longer be swept aside.

Fifty years ago the world of science was divided into two types of institutions. In universities and in many publicly funded research organizations people practiced "academic science"; in industrial and governmental research and development laboratories they practiced "industrial science." These were two distinct cultures, closely linked in many ways, but dealing with ethical issues quite differently.

Academic science was intensely individualistic. People held personal appointments earned by published contributions to knowledge. Universities and research institutes had little direct influence on their research. Academic employees decided for themselves what they would investigate and how they would go about it. The only constraint--an immensely powerful one in practice--was that the results of their research would be closely scrutinized by other members of one of the innumerable specialized research communities that partition the scientific world.

Academic scientists belonged to a worldwide institutional web. The production of reliable public knowledge was so loosely organized that it almost seemed like the anarchist's dream: an active, orderly republic of free-born citizens with no central government. It functioned through a number of well-established practices such as peer review, respect for priority of discovery, comprehensive citation of the literature, meritocratic preferment on the basis of research performance, and so on. Although these practices were never formally codified or systematically enforced, they geared smoothly together. In 1942 Robert Merton argued that this was because they satisfied a set of "norms" that together constitute an "ethos" for science. Merton's analysis was highly idealized, and is rejected by most present-day sociologists. Nevertheless, I believe that it still provides the best theoretical framework for an understanding of how these practices interact to produce the sort of knowledge that we recognize as peculiarly "scientific."

Paradoxically, however, this "ethos" has practically no conventional "ethical" dimension. At most, it defines a basic structure for a perfectly democratic, universal "speech community." While this is an essential prerequisite for ethical debate, such debate is banished from academic science itself by Merton's norm of "disinterestedness." In pursuit of complete "objectivity"--admittedly a major virtue--the norm rules that all research should be conducted, and its results presented and discussed, quite impersonally, as if produced by androids or angels.

But ethical issues always involve human "interests." Ethics is not just an abstract intellectual discipline. It is about the conflicts that arise in trying to meet real human needs and values. The official ethos of academic science systematically shuts out all such considerations.

Actually, this norm is not invoked against one major human interest--the quest for knowledge. Scientists are certainly not supposed to be "disinterested" about the promotion of their own discoveries or the advancement of knowledge in general. In fact, this interest is often given priority over other, less exalted, concerns, such as the welfare of experimental animals, and even over wider human interests such as the long-term consequences of publishing research that might be used for evil.

The important point is that this "no ethics" principle is not just an obsolete module that can be uninstalled with a keystroke. It is an integral part of a complex cultural form. Merton's norms combine in various ways to motivate and license a wide range of practices and processes. There is no space between them for any other values or virtues than supposedly objective, disinterested truth. Academic scientists have always, of course, brought ethical considerations into their scientific work. But they have had to smuggle them in from private life, from politics, from religion, or from sheer humanitarian sympathy. And even now, many fine scientists instinctively resent the intrusion of this troublesome element into their orderly, committed way of life.

Now take industrial science. This has essentially the same knowledge base as academic science, but is sociologically quite distinct. Its structural principles are not uncodified norms, since they are explicitly enforced by the corporate bodies, private and public, that pay scientists to work for them. I am not saying that these principles are completely antithetical to the academic ethos, but there are certainly many contrasts. One is that industrial scientists do not, in general, "own" their research in the sense of undertaking projects of their own choosing and being free to publish their results entirely on their own initiative.

Industrial science is not just subsidiary to academic science. It is a parallel culture in which talented persons use good science to produce valuable knowledge. But notice, once again, that there is no ethical term in its social algorithm. It is true that a specialized group of industrial scientists may come together to formulate a professional code covering various aspects of their work, and such a code may have strong indirect ethical implications, such as explicit concern for public safety and human welfare. Yet such a code is not intrinsic to the research culture, and it remains subject to the scientists' contractual obligations as hired hands and brains.

Yet industrial science--from agriculture, through mental medicine and missile manufacture, to zookeeping--is intimately involved in the business of daily life. The personal values and needs of customers, patients, and other users have to be taken into account. Supposedly technical problems almost always have ethical aspects. Industrial scientists are thus much more likely to encounter ethical dilemmas than their academic counterparts, and they are not screened from them by any doctrine of "objectivity."

The trouble is that industrial scientists do not actually have a direct say in how these dilemmas are resolved. This responsibility legally rests with their corporate employers, who are seldom scientists themselves. Indeed, for most industrial scientists, an active concern about ethical issues is just asking for trouble. Better to treat the welfare of their firm or country as the supreme good. Like academic scientists, they too feel emotionally more secure if they can keep "ethics" out of their scientific work.

Of course industrial scientists should not take jobs with firms or government agencies whose policies and practices are ethically unacceptable. Of course they should resign, or even blow the whistle, if required to do unethical work. Of course, like other subordinates, they cannot escape personal blame for crimes committed on the orders of higher authorities. But these moral dilemmas are not specific to science or scientists as such.

This division of science into two distinct cultural traditions, located in different types of institution, is highly schematic. Nevertheless, it shows that science has, as a whole, been insulated from ethics for two quite distinct reasons. On the one hand, academic scientists are supposed to be indifferent to the potential consequences of their work. On the other hand, industrial scientists do work whose consequences are considered too serious to be left in their hands.

In recent years, however, these two cultures have begun to merge. This is a complex, pervasive, irreversible process, driven by forces that are not yet well understood. The hybrid research culture that is now emerging has been called by some scholars "Mode 2," to differentiate it from the more traditional style of "Mode 1." I prefer to call it "post-academic," to show that it outwardly preserves many academic practices and is still partially located in "academia."

My point is that post-academic science has features that make nonsense of the traditional barriers between science and ethics. As we have seen, the two distinct reasons for keeping ethical considerations out of the two scientific traditions are essentially inconsistent. Applied simultaneously to this new hybrid culture, they do not reinforce each other but tend to cancel each other out.

For example, post-academic research is usually undertaken as a succession of "projects," each justified in advance to a funding body whose members are usually not scientists. As the competition for funds intensifies, project proposals are forced to become more and more specific about the expected outcomes of the research, including its wider economic and social impact. This is no longer a matter for individual researchers to determine for themselves. Universities and research institutes are no longer deemed to be devoted entirely to the pursuit of knowledge "for its own sake." They are encouraged to seek industrial funding for commissioned research, and to exploit to the full any patentable discoveries made by their academic staffs--especially when there is a smell of commercial profit in the wind.

Indeed, it is argued that all Mode 2 research stems from problems "arising in the context of application." This does not mean that basic science will disappear. The path to the solution of many urgent and practical problems, such as finding a cure for AIDS, surely lies through many remote and apparently irrelevant domains of fundamental research. But the mere fact that such paths can be traced back into past human needs, and forward into a future where these needs might be met, gives them an explicit ethical dimension. Even the "purest," "most basic" research is thus endowed with potential human consequences, so that researchers are bound to ask themselves whether all the goals of the activity in which they are engaged are consistent with their other personal values.

For most industrial scientists the situation has probably not much changed. But the typical post-academic role of the independent scientific entrepreneur compounds moral risks with financial risks, and does not permit ethical problems to be pushed upstairs to non-scientific corporate managers. Should such scientists remain bound by the academic ethos that they tacitly acknowledged when they earned their Ph.D.'s?

Another feature of post-academic science is that it is largely the work of teams of scientists, often networked over a number of different institutions. Where, then, do the ethical responsibilities lie? Should the nominal leader be blamed for dishonest work by a junior member? What ethical code should apply to a team that includes scientists from both academia and industry? And to further complicate the problem, teams are often temporary. How will ethical considerations operate in such heterogeneous and evanescent settings?

These are only some examples of the way that the transition to post-academic science is forcing scientists to become more sensitive to ethical issues. One of the virtues of the new mode of knowledge production is that it cannot brush its ethical problems under the carpet. Science can no longer be "in denial" of matters that many of us have long tried to bring to the fore.
