ON BEING A SCIENTIST
RESPONSIBLE CONDUCT IN RESEARCH

SECOND EDITION

COMMITTEE ON SCIENCE, ENGINEERING, AND PUBLIC POLICY
NATIONAL ACADEMY OF SCIENCES
NATIONAL ACADEMY OF ENGINEERING
INSTITUTE OF MEDICINE

NATIONAL ACADEMY PRESS
Washington, D.C. 1995



NOTICE: This volume was produced as part of a project approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. It is a result of work done by the Committee on Science, Engineering, and Public Policy (COSEPUP) which has authorized its release to the public. This report has been reviewed by a group other than the authors according to procedures approved by COSEPUP and the Report Review Committee.

FINANCIAL SUPPORT: The development of this document was supported by grants from the Howard Hughes Medical Institute and the Alfred P. Sloan Foundation. Support for dissemination of this document was provided by the following corporations and disciplinary societies: Bristol Myers Squibb Company, Glaxo Research Institute, SmithKline Beecham Corp., Sigma Xi, the Federation of American Societies for Experimental Biology, the American Society for Microbiology, the American Chemical Society, the American Institute for Biological Sciences, the American Sociological Association, the American Statistical Association, the Association of American Medical Colleges, the American Institute of Physics, and the American Physical Society. Additional support was provided by the Basic Science Fund of the National Academy of Sciences, whose contributors include the AT&T Foundation, Atlantic Richfield Foundation, BP America, Dow Chemical Company, E.I. du Pont de Nemours & Co., IBM Corporation, Merck and Company, Inc., Monsanto Company, and Shell Oil Companies Foundation.


Copyright © 1995 by the National Academy of Sciences. All rights reserved. This document may be reproduced solely for educational purposes without the written permission of the National Academy of Sciences.


INTERNET ACCESS: This report is available on the National Academy of Sciences' Internet host. It may be accessed via World Wide Web at http://www.nas.edu, via Gopher at gopher.nas.edu, or via FTP at ftp.nas.edu.


Printed Copies of "On Being a Scientist" are available as follows:

Quantity        Price
1               $5.00 each
2-9             $4.00 each
10 or more      $2.50 each

Order from: National Academy Press, 2101 Constitution Ave., N.W. Washington, D.C. 20418. All orders must be prepaid with delivery to a single address. No additional discounts apply. Prices are subject to change without notice. To order by credit card, call 1-800-624-6242.


ON THE COVER: The cover depicts the names of some of the scientists who have been awarded the Nobel Prize. The design of the cover and the report was done by Isely &/or Clark Design.

PHOTOGRAPH CREDITS: Calar Alto Observatory (GIF Image 8); Ira Wexler/College of Engineering/University of Maryland (GIF Image 6); National Library of Medicine/National Institutes of Health (GIF Image 10); U.S. Department of Agriculture (GIF Images 1, 2, 3, 4, 5, 6, 7, 9).

International Standard Book Number 0-309-05196-7

Printed in the United States of America


COMMITTEE ON SCIENCE, ENGINEERING, AND PUBLIC POLICY

Phillip A. Griffiths
(Chair), Director, Institute for Advanced Study

Robert McCormick Adams
Secretary Emeritus, Smithsonian Institution

Bruce M. Alberts
President, National Academy of Sciences

Elkan R. Blout
Harkness Professor, Department of Biological Chemistry and Molecular Pharmacology, Harvard Medical School

Felix E. Browder
University Professor, Department of Mathematics, Rutgers University

David R. Challoner, M.D.
Vice President of Health Affairs, University of Florida

Albert F. Cotton
Distinguished Professor of Chemistry (term ending 6/94)

Ellis B. Cowling
Director, Southern Oxidants Study, School of Forest Resources, North Carolina State University

Bernard N. Fields, M.D.
Adele Lehman Professor; Chairman, Department of Microbiology and Molecular Genetics, Harvard Medical School

Alexander H. Flax
Senior Fellow, National Academy of Engineering

Ralph E. Gomory
President, Alfred P. Sloan Foundation

Thomas D. Larson
Consultant

Mary J. Osborn
Head, Department of Microbiology, University of Connecticut Health Center

C. Kumar N. Patel
Vice Chancellor, Research Programs, University of California, Los Angeles (term ending 6/94)

Phillip A. Sharp
Head, Department of Biology, Center for Cancer Research, Massachusetts Institute of Technology

Kenneth I. Shine
President, Institute of Medicine

Robert M. Solow
Institute Professor, Department of Economics, Massachusetts Institute of Technology (term ending 6/94)

H. Guyford Stever
Member, Carnegie Commission on Science and Technology (term ending 6/94)

Morris Tanenbaum
Vice President, National Academy of Engineering

Robert M. White
President, National Academy of Engineering


Lawrence E. McCray
Executive Director


PRINCIPAL PROJECT STAFF

Steve Olson, Consultant/Writer

Deborah D. Stine, Project Director


The Committee on Science, Engineering, and Public Policy (COSEPUP) is a joint committee of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. It includes members of the councils of all three bodies.

The National Academy of Sciences (NAS) is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Bruce M. Alberts is the president of the NAS.

The National Academy of Engineering (NAE) was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. Robert M. White is president of the NAE.

The Institute of Medicine (IOM) was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions for the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences in 1863 by its charter to be an advisor to the federal government and, upon its own initiative, to study problems of medical care, research, and education. Dr. Kenneth I. Shine is president of the IOM.


PREFACE

The scientific research enterprise, like other human activities, is built on a foundation of trust. Scientists trust that the results reported by others are valid. Society trusts that the results of research reflect an honest attempt by scientists to describe the world accurately and without bias. The level of trust that has characterized science and its relationship with society has contributed to a period of unparalleled scientific productivity. But this trust will endure only if the scientific community devotes itself to exemplifying and transmitting the values associated with ethical scientific conduct.

In the past, young scientists learned the ethics of research largely through informal means-by working with senior scientists and watching how they dealt with ethical questions. That tradition is still vitally important. But science has become so complex and so closely intertwined with society's needs that a more formal introduction to research ethics and the responsibilities that these commitments imply is also needed-an introduction that can supplement the informal lessons provided by research supervisors and mentors.

The original "On Being a Scientist," published by the National Academy of Sciences in 1989, was designed to meet that need. Written for beginning researchers, it sought to describe the ethical foundations of scientific practices and some of the personal and professional issues that researchers encounter in their work. It was meant to apply to all forms of research-whether in academic, industrial, or governmental settings-and to all scientific disciplines. Over 200,000 copies of the booklet were distributed to graduate and undergraduate science students. It continues to be used today in courses, seminars, and informal discussions.

Much has happened in the six years since "On Being a Scientist" first appeared. Research institutions and federal agencies have developed important new policies for dealing with behaviors that violate the ethical standards of science. A distinguished panel convened by the National Academies of Sciences and Engineering and the Institute of Medicine issued a major report on research conduct entitled Responsible Science: Ensuring the Integrity of the Research Process. Continued questions have reemphasized the importance of the ethical decisions that researchers must make.

To reflect the developments of the last six years, the National Academy complex is issuing this new version of "On Being a Scientist." This version incorporates new material from Responsible Science and other recent reports. It reflects suggestions from readers of the original booklet, from instructors who used the original booklet in their classes and seminars, and from graduate students and professors who critiqued drafts of the revision. This version of "On Being a Scientist" also includes a number of hypothetical scenarios, which have proved in recent years to provide an effective means of presenting research ethics. An appendix at the end of the booklet offers guidance in thinking about and discussing these scenarios, but the scenarios remain essentially open-ended. As is the case for the entire document, input from readers is welcomed.

Though "On Being a Scientist" is aimed primarily at graduate students and beginning researchers, its lessons apply to all scientists at all stages of their scientific careers. In particular, senior scientists have a special responsibility in upholding the highest standards for conduct, serving as role models for students and young scientists, designing educational programs, and responding to alleged violations of ethical norms. Senior scientists can themselves gain a new appreciation for the importance of ethical issues by discussing with their students what had previously been largely tacit knowledge. In the process, they help provide the leadership that is essential for high standards of conduct to be maintained.

The original "On Being a Scientist" was produced under the auspices of the National Academy of Sciences by the Committee on the Conduct of Science, which consisted of Robert McCormick Adams, Francisco Ayala (chairman), Mary-Dell Chilton, Gerald Holton, David Hull, Kumar Patel, Frank Press, Michael Ruse, and Phillip Sharp. Several members of that committee were involved directly in the revision of the booklet, and the others were consulted during the revision and reviewed the resulting document.

This new version of the booklet was prepared under the auspices of the Committee on Science, Engineering, and Public Policy, which is a joint committee of the National Academies of Sciences and Engineering and the Institute of Medicine. The revision was overseen by a guidance group consisting of Robert McCormick Adams, David Challoner, Bernard Fields, Kumar Patel, Frank Press, and Phillip Sharp (group chairman).

The future of science depends on attracting outstanding young people to research-not only people of enormous energy and talent but people of strong character who will be tomorrow's leaders. It is incumbent on all scientists and all administrators of science to help provide a research environment that, through its adherence to high ethical standards and creative productivity, will attract and retain individuals of outstanding intellect and character to one of society's most important professions.

Bruce Alberts
President, National Academy of Sciences

Kenneth Shine
President, Institute of Medicine

Robert White
President, National Academy of Engineering


ACKNOWLEDGMENTS

The committee thanks the graduate students of Boston University, the Massachusetts Institute of Technology, and the University of California, Irvine, who participated in focus group sessions which provided invaluable feedback on earlier drafts of the document, as well as Charles Cantor, Frank Solomon, and F. Sherwood Rowland, who sponsored those sessions at the respective institutions.

In addition, the committee thanks a number of individuals who teach research ethics and provided guidance on earlier drafts as to the "teachability" of the document, especially: Joan Steitz, Caroline Whitbeck, Penny Gilmer, Michael Zigmond, Frank Solomon, and Indira Nair.

Finally, the committee thanks its able staff: Steve Olson, science writer, whose help in drafting this revision was invaluable; Deborah Stine, who managed the project and ran the focus groups on the document; and Jeffrey Peck and Patrick Sevcik, who provided administrative support at various stages.


A NOTE ON USING THIS BOOKLET

This booklet makes the point that scientific knowledge is defined collectively through discussion and debate. Collective deliberation is also the best procedure to apply in using this booklet. Group discussion-whether in seminars, orientations, research settings, or informal settings-can demonstrate how different individuals would react in specific situations, often leading to conclusions that no one would have arrived at individually.

These observations apply with particular force to the hypothetical scenarios in this booklet. Each scenario concludes with a series of questions, but these questions have many answers-some better, some worse-rather than a single right answer. An appendix at the end of this booklet examines specific issues involved in several of the scenarios as a way of suggesting possible topics for consideration and discussion.

This booklet has been prepared for use in many different settings, including:

- Classes on research ethics

- Classes on research methods or statistics

- Classes on the history, sociology, or philosophy of science

- Seminars to discuss research practices or results

- Meetings sponsored by scientific societies on a local, regional, or national level

- Meetings held to develop ethics policies or guidelines for a specific laboratory or institution

- Orientation sessions

- Journal clubs

A useful format in any of these situations is to have a panel discussion involving three or four researchers who are at different stages of their careers-for example, a graduate student, a postdoctoral fellow, a junior faculty member, and a senior faculty member. Such panels can identify the ambiguities in a problem situation, devise ways to get the information needed to resolve the ambiguities, and demonstrate the full range of perspectives that are involved in ethical deliberations. They can also show how institutional policies and resources can influence an individual's response to a given situation, which underscores how important it is for all researchers to know what those institutional policies and resources are.

Finally, discussion of these issues with a broad range of researchers can demonstrate that research ethics is not a complete and finalized body of knowledge. These issues are still being discussed, explored, and debated, and all researchers have a responsibility to move the discussion forward.


INTRODUCTION

The geneticist Barbara McClintock once said of her research, "I was just so interested in what I was doing I could hardly wait to get up in the morning and get at it. One of my friends, a geneticist, said I was a child, because only children can't wait to get up in the morning to get at what they want to do."

Anyone who has experienced the childlike wonder evoked by observing or understanding something that no one has ever observed or understood before will recognize McClintock's enthusiasm. The pursuit of that experience is one of the forces that keep researchers rooted to their laboratory benches, climbing through the undergrowth of a sweltering jungle, or following the threads of a difficult theoretical problem. To succeed in research is a personal triumph that earns and deserves individual recognition. But it is also a communal achievement, for in learning something new the discoverer both draws on and contributes to the body of knowledge held in common by all scientists.

Scientific research offers many other satisfactions in addition to the exhilaration of discovery. Researchers have the opportunity to associate with colleagues who have made important contributions to human knowledge, with peers who think deeply and care passionately about subjects of common interest, and with students who can be counted on to challenge assumptions. With many important developments occurring in areas where disciplines overlap, scientists have many opportunities to work with different people, explore new fields, and broaden their expertise. Researchers often have considerable freedom both in choosing what to investigate and in deciding how to organize their professional and personal lives. They are part of a community based on ideals of trust and freedom, where hard work and achievement are recognized as deserving the highest rewards. And their work can have a direct and immediate impact on society, which ensures that the public will have an interest in the findings and implications of research.

Research can entail frustrations and disappointments as well as satisfactions. An experiment may fail because of poor design, technical complications, or the sheer intractability of nature. A favored hypothesis may turn out to be incorrect after consuming months of effort. Colleagues may disagree over the validity of experimental data, the interpretation of results, or credit for work done. Difficulties such as these are virtually impossible to avoid in science. They can strain the composure of the beginning and senior scientist alike. Yet struggling with them can also be a spur to important progress.

Scientific progress and changes in the relationship between science and society are creating new challenges for the scientific community. The numbers of trained researchers and exciting research opportunities have grown faster than have available financial resources, which has increased the pressure on the research system and on individual scientists. Research endeavors are becoming larger, more complex, and more expensive, creating new kinds of situations and relationships among researchers. The conduct of research is more closely monitored and regulated than it was in the past. The part played by science in society has become more prominent and more complex, with consequences that are both invigorating and stressful.

To nonscientists, the rich interplay of competition, elation, frustration, and cooperation at the frontiers of scientific research seems paradoxical. Science results in knowledge that is often presented as being fixed and universal. Yet scientific knowledge obviously emerges from a process that is intensely human, a process indelibly shaped by human virtues, values, and limitations and by societal contexts. How is the limited, sometimes fallible, work of individual scientists converted into the enduring edifice of scientific knowledge?

The answer lies partly in the relationship between human knowledge and the physical world. Science has progressed through a uniquely productive marriage of human creativity and hard-nosed skepticism, of openness to new scientific contributions and persistent questioning of those contributions and the existing scientific consensus. Based on their observations and their ideas about the world, researchers make new observations and develop new ideas that seem to describe the physical, biological, or social world more accurately or completely. Scientists engaged in applied research may have more utilitarian aims, such as improving the reliability of a semiconductor chip. But the ultimate effect of their work is the same: they are able to make claims about the world that are subject to empirical tests.

The empirical objectivity of scientific claims is not the whole story, however. As will be described in a moment, the reliability of scientific knowledge also derives partly from the interactions among scientists themselves. In engaging in these social interactions, researchers must call on much more than just their scientific understanding of the world. They must also be able to convince a community of peers of the correctness of their concepts, which requires a fine understanding of the methods, techniques, and social conventions of science.

By considering many of the hard decisions that researchers make in the course of their work, this booklet examines both the epistemological and social dimensions of scientific research. It looks at such questions as: How should anomalous data be treated? How do values influence research? How should credit for scientific accomplishments be allocated? What are the borderlines between honest error, negligent error, and misconduct in science?

These questions are of interest to more than just the scientific community. As the influence of scientific knowledge has grown throughout society, nonscientists have acquired a greater interest in assessing the validity of the claims of science. With science becoming an increasingly important social institution, scientists have become more accountable to the broader society that expects to benefit from their work.


THE SOCIAL FOUNDATIONS OF SCIENCE

Throughout the history of science, philosophers and scientists have sought to describe a single systematic procedure that can be used to generate scientific knowledge, but they have never been completely successful. The practice of science is too multifaceted and its practitioners are too diverse to be captured in a single overarching description. Researchers collect and analyze data, develop hypotheses, replicate and extend earlier work, communicate their results to others, review and critique the results of their peers, train and supervise associates and students, and otherwise engage in the life of the scientific community.

Science is also far from a self-contained or self-sufficient enterprise. Technological developments critically influence science, as when a new device, such as a telescope, microscope, rocket, or computer, opens up whole new areas of inquiry. Societal forces also affect the directions of research, greatly complicating descriptions of scientific progress.

Another factor that confounds analyses of the scientific process is the tangled relationship between individual knowledge and social knowledge in science. At the heart of the scientific experience is individual insight into the workings of nature. Many of the outstanding achievements in the history of science grew out of the struggles and successes of individual scientists who were seeking to make sense of the world.

At the same time, science is inherently a social enterprise-in sharp contrast to a popular stereotype of science as a lonely, isolated search for the truth. With few exceptions, scientific research cannot be done without drawing on the work of others or collaborating with others. It inevitably takes place within a broad social and historical context, which gives substance, direction, and ultimately meaning to the work of individual scientists.

The object of research is to extend human knowledge of the physical, biological, or social world beyond what is already known. But an individual's knowledge properly enters the domain of science only after it is presented to others in such a fashion that they can independently judge its validity. This process occurs in many different ways. Researchers talk to their colleagues and supervisors in laboratories, in hallways, and over the telephone. They trade data and speculations over computer networks. They give presentations at seminars and conferences. They write up their results and send them to scientific journals, which in turn send the papers to be scrutinized by reviewers. After a paper is published or a finding is presented, it is judged by other scientists in the context of what they already know from other sources. Throughout this continuum of discussion and deliberation the ideas of individuals are collectively judged, sorted, and selectively incorporated into the consensual but ever evolving scientific worldview. In the process, individual knowledge is gradually converted into generally accepted knowledge.

This ongoing process of review and revision is critically important. It minimizes the influence of individual subjectivity by requiring that research results be accepted by other scientists. It also is a powerful inducement for researchers to be critical of their own conclusions because they know that their objective must be to try to convince their ablest colleagues.

The social mechanisms of science do more than validate what comes to be known as scientific knowledge. They also help generate and sustain the body of experimental techniques, social conventions, and other "methods" that scientists use in doing and reporting research. Some of these methods are permanent features of science; others evolve over time or vary from discipline to discipline. Because they reflect socially accepted standards in science, their application is a key element of responsible scientific practice.


"Scientists are people of very dissimilar temperaments doing different things in very different ways. Among scientists are collectors, classifiers and compulsive tidiers-up; many are detectives by temperament and many are explorers; some are artists and others artisans. There are poet-scientists and philosopher-scientists and even a few mystics."

- Peter Medawar. Pluto's Republic, Oxford University Press, New York, 1982, p. 116.


EXPERIMENTAL TECHNIQUES AND THE TREATMENT OF DATA

One goal of methods is to facilitate the independent verification of scientific observations. Thus, many experimental techniques-such as statistical tests of significance, double-blind trials, or proper phrasing of questions on surveys-have been designed to minimize the influence of individual bias in research. By adhering to these techniques, researchers produce results that others can more easily reproduce, which promotes the acceptance of those results into the scientific consensus.
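To make one such bias-minimizing technique concrete, the short sketch below, written in Python and assuming the widely used SciPy library is available, applies a standard two-sample test of significance to invented measurements from a hypothetical double-blind trial. The group labels, the numerical values, and the conventional 0.05 threshold mentioned in the comments are illustrative assumptions, not data from any study discussed in this booklet.

# A minimal sketch of a two-sample test of significance (Python).
# The numbers are invented for illustration; in a double-blind trial,
# neither the participants nor the people recording these outcomes
# would know which group received the treatment.
from scipy import stats

control = [12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 11.9, 12.2]
treated = [12.9, 13.1, 12.4, 13.0, 12.8, 13.3, 12.6, 12.7]

# How likely is a difference this large if both groups were in fact
# drawn from the same population?
t_statistic, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")

# A small p-value (conventionally below 0.05) indicates that the observed
# difference is unlikely to arise from chance alone. The threshold itself
# is a convention adopted by the research community, which is one reason
# such tests reduce, but do not eliminate, the role of individual judgment.

Such a test does not make the underlying data any more accurate; it simply gives other researchers a common, explicit standard against which a claimed effect can be judged and independently checked.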

If research in a given area does not use generally accepted methods, other scientists will be less likely to accept the results. This was one of several reasons why many scientists reacted negatively to the initial reports of cold fusion in the late 1980s. The claims were so physically implausible that they required extraordinary proof. But the experiments were not initially presented in such a way that other investigators could corroborate or disprove them. When the experimental techniques became widely known and other investigators were unable to reproduce the claimed results, belief in cold fusion quickly faded.

In some cases the methods used to arrive at scientific knowledge are not very well defined. Consider the problem of distinguishing the "facts" at the forefront of a given area of science. In such circumstances experimental techniques are often pushed to the limit, the signal is difficult to separate from the noise, unknown sources of error abound, and even the question to be answered is not well defined. In such an uncertain and fluid situation, picking out reliable data from a mass of confusing and sometimes contradictory observations can be extremely difficult.

In this stage of an investigation, researchers have to be extremely clear, both to themselves and to others, about the methods being used to gather and analyze data. Other scientists will be judging not only the validity of the data but also the validity and accuracy of the methods used to derive those data. The development of new methods can be a controversial process, as scientists seek to determine whether a given method can serve as a reliable source of new information. If someone is not forthcoming about the procedures used to derive a new result, the validation of that result by others will be hampered.

Methods are important in science, but like scientific knowledge itself, they are not infallible. As they evolve over time, better methods supersede less powerful or less acceptable ones. Methods and scientific knowledge thus progress in parallel, with each area of knowledge contributing to the other.

A good example of the fallibility of methods occurred in astronomy in the early part of the twentieth century. One of the most ardent debates in astronomy at that time concerned the nature of what were then known as spiral nebulae-diffuse pinwheels of light that powerful telescopes revealed to be quite common in the night sky. Some astronomers thought that these nebulae were spiral galaxies like the Milky Way at such great distances from the earth that individual stars could not be distinguished. Others believed that they were clouds of gas within our own galaxy.

One astronomer who thought that spiral nebulae were within the Milky Way, Adriaan van Maanen of the Mount Wilson Observatory, sought to resolve the issue by comparing photographs of the nebulae taken several years apart. After making a series of painstaking measurements, van Maanen announced that he had found roughly consistent unwinding motions in the nebulae. The detection of such motions indicated that the spirals had to be within the Milky Way, since motions would be impossible to detect in distant objects.

Van Maanen's reputation caused many astronomers to accept a galactic location for the nebulae. A few years later, however, van Maanen's colleague Edwin Hubble, using the new 100-inch telescope at Mount Wilson, conclusively demonstrated that the nebulae were in fact distant galaxies; van Maanen's observations had to be wrong. Studies of van Maanen's procedures have not revealed any intentional misrepresentation or sources of systematic error. Rather, he was working at the limits of observational accuracy, and his expectations influenced his measurements.

Though van Maanen turned out to be wrong, he was not ethically at fault. He was using methods that were accepted by the astronomical community as the best available at the time, and his results were accepted by most astronomers. But in hindsight he relied on a technique so susceptible to observer effects that even a careful investigator could be misled.

The fallibility of methods is a valuable reminder of the importance of skepticism in science. Scientific knowledge and scientific methods, whether old or new, must be continually scrutinized for possible errors. Such skepticism can conflict with other important features of science, such as the need for creativity and for conviction in arguing a given position. But organized and searching skepticism as well as an openness to new ideas are essential to guard against the intrusion of dogma or collective bias into scientific results.


THE SELECTION OF DATA

Deborah, a third-year graduate student, and Kathleen, a postdoc, have made a series of measurements on a new experimental semiconductor material using an expensive neutron source at a national laboratory. When they get back to their own laboratory and examine the data, they find the following data points (see GIF Figure). A newly proposed theory predicts results indicated by the curve.

During the measurements at the national laboratory, Deborah and Kathleen observed that there were power fluctuations they could not control or predict. Furthermore, they discussed their work with another group doing similar experiments, and they knew that the other group had gotten results confirming the theoretical prediction and was writing a manuscript describing their results.

In writing up their own results for publication, Kathleen suggests dropping the two anomalous data points near the abscissa (the solid squares) from the published graph and from a statistical analysis. She proposes that the existence of the data points be mentioned in the paper as possibly due to power fluctuations, noting that they fall outside the expected standard deviation calculated from the remaining data points. "These two runs," she argues to Deborah, "were obviously wrong."

1. How should the data from the two suspected runs be handled?

2. Should the data be included in tests of statistical significance and why?

3. What other sources of information, in addition to their faculty advisor, can Deborah and Kathleen use to help decide?
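One way to ground the discussion of question 2 is to see what a simple outlier criterion actually does. The sketch below, written in Python and assuming the NumPy library is available, uses invented numbers in place of the data shown in the GIF figure: it compares each measurement with the theoretical prediction and flags points lying more than three standard deviations from the curve, estimating the spread once from all of the points and once with the two suspect runs excluded. The predicted values, the measurements, the indices of the suspect runs, and the three-sigma cutoff are all hypothetical assumptions chosen for illustration.

# A minimal sketch of an outlier check; it can inform, but cannot replace,
# the judgment the scenario asks about. All numbers are invented stand-ins
# for the data in the figure.
import numpy as np

predicted = np.array([1.0, 1.4, 1.9, 2.5, 3.2, 4.0, 4.9, 5.9])  # theoretical curve
measured  = np.array([1.1, 1.3, 2.0, 2.4, 0.4, 0.5, 5.0, 6.0])  # two anomalously low points

residuals = measured - predicted

# Estimate the spread twice: from every point, and from every point
# except the two suspected runs (indices 4 and 5 in this made-up example).
keep = np.ones(len(residuals), dtype=bool)
keep[[4, 5]] = False
for label, sigma in [("all points", residuals.std(ddof=1)),
                     ("suspect runs excluded", residuals[keep].std(ddof=1))]:
    flagged = np.abs(residuals) > 3 * sigma
    print(f"{label}: sigma = {sigma:.2f}, flagged indices = {np.where(flagged)[0].tolist()}")

# The verdict depends on which points are used to estimate the spread:
# excluding the suspect runs shrinks sigma and makes those same runs look
# far more anomalous. A statistical criterion can therefore inform the
# decision, but the power fluctuations, the other group's results, and the
# obligation to disclose what was dropped still have to be weighed.

Whatever Deborah and Kathleen decide, the calculation illustrates why the criterion used to exclude data, and the fact that any data were excluded, need to be reported along with the results.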


VALUES IN SCIENCE

Scientists bring more than just a toolbox of techniques to their work. Scientists must also make complex decisions about the interpretation of data, about which problems to pursue, and about when to conclude an experiment. They have to decide the best ways to work with others and exchange information. Taken together, these matters of judgment contribute greatly to the craft of science, and the character of a person's individual decisions helps determine that person's scientific style (as well as, on occasion, the impact of that person's work).

Much of the knowledge and skill needed to make good decisions in science is learned through personal experience and interactions with other scientists. But some of this ability is hard to teach or even describe. Many of the intangible influences on scientific discovery-curiosity, intuition, creativity-largely defy rational analysis, yet they are among the tools that scientists bring to their work.

When judgment is recognized as a scientific tool, it is easier to see how science can be influenced by values. Consider, for example, the way people judge between competing hypotheses. In a given area of science, several different explanations may account for the available facts equally well, with each suggesting an alternate route for further research. How do researchers pick among them?

Scientists and philosophers have proposed several criteria by which promising scientific hypotheses can be distinguished from less fruitful ones. Hypotheses should be internally consistent so that they do not generate contradictory conclusions. Their ability to provide accurate experimental predictions, sometimes in areas far removed from the original domain of the hypothesis, is viewed with great favor. In disciplines in which experimentation is less straightforward, such as geology, astronomy, or many of the social sciences, good hypotheses should be able to unify disparate observations. Also highly prized are simplicity and its more refined cousin, elegance.

Other kinds of values also come into play in science. Historians, sociologists, and other students of science have shown that social and personal beliefs-including philosophical, thematic, religious, cultural, political, and economic beliefs-can shape scientific judgment in fundamental ways. For example, Einstein's rejection of quantum mechanics as an irreducible description of nature-summarized in his insistence that "God does not play dice"-seems to have been based largely on an aesthetic conviction that the physical universe could not contain such an inherent component of randomness. The nineteenth-century geologist Charles Lyell, who championed the idea that geological change occurs incrementally rather than catastrophically, may have been influenced as much by his religious views as by his geological observations. He favored the notion of a God who is an unmoved mover and does not intervene in His creation. Such a God, thought Lyell, would produce a world in which the same causes and effects keep cycling eternally, producing a uniform geological history.

Does holding such values harm a person's science? In some cases the answer has to be "yes." The history of science offers a number of episodes in which social or personal beliefs distorted the work of researchers. The field of eugenics used the techniques of science to try to demonstrate the inferiority of certain races. The ideological rejection of Mendelian genetics in the Soviet Union beginning in the 1930s crippled Soviet biology for decades.

Despite such cautionary episodes, it is clear that values cannot-and should not-be separated from science. The desire to do good work is a human value. So is the conviction that standards of honesty and objectivity need to be maintained. The belief that the universe is simple and coherent has led to great advances in science. If researchers did not believe that the world can be described in terms of a relatively small number of fundamental principles, science would amount to no more than organized observation. Religious convictions about the nature of the universe have also led to important scientific insights, as in the case of Lyell discussed above.

The empirical link between scientific knowledge and the physical, biological, and social world constrains the influence of values in science. Researchers are continually testing their theories about the world against observations. If hypotheses do not accord with observations, they will eventually fall from favor (though scientists may hold on to a hypothesis even in the face of some conflicting evidence since sometimes it is the evidence rather than the hypothesis that is mistaken).

The social mechanisms of science also help eliminate distorting effects that personal values might have. They subject scientific claims to the process of collective validation, applying different perspectives to the same body of observations and hypotheses.

The challenge for individual scientists is to acknowledge and try to understand the suppositions and beliefs that lie behind their own work so that they can use that self-knowledge to advance their work. Such self-examination can be informed by study in many areas outside of science, including history, philosophy, sociology, literature, art, religion, and ethics. If narrow specialization and a single-minded focus on a single activity keep a researcher from developing the perspective and fine sense of discrimination needed to apply values in science, that person's work can suffer.


POLYWATER AND THE ROLE OF SKEPTICISM

The case of polywater demonstrates how the desire to believe in a new phenomenon can sometimes overpower the demand for solid, well-controlled evidence. In 1966 the Soviet scientist Boris Vladimirovich Derjaguin lectured in England on a new form of water that he claimed had been discovered by another Soviet scientist, N. N. Fedyakin. Formed by heating water and letting it condense in quartz capillaries, this "anomalous water," as it was originally called, had a density higher than that of normal water, a viscosity 15 times that of normal water, a boiling point higher than 100 degrees Centigrade, and a freezing point lower than zero degrees.

Over the next several years, hundreds of papers appeared in the scientific literature describing the properties of what soon came to be known as polywater. Theorists developed models, supported by some experimental measurements, in which strong hydrogen bonds were causing water to polymerize. Some even warned that if polywater escaped from the laboratory, it could autocatalytically polymerize all of the world's water.

Then the case for polywater began to crumble. Because polywater could only be formed in minuscule capillaries, very little was available for analysis. When small samples were analyzed, polywater proved to be contaminated with a variety of other substances, from silicon to phospholipids. Electron microscopy revealed that polywater actually consisted of finely divided particulate matter suspended in ordinary water.

Gradually, the scientists who had described the properties of polywater admitted that it did not exist. They had been misled by poorly controlled experiments and problems with experimental procedures. As the problems were resolved and experiments gained better controls, evidence for the existence of polywater disappeared.


CONFLICTS OF INTEREST

Sometimes values conflict. For example, a particular circumstance might compromise-or appear to compromise-professional judgments. Maybe a researcher has a financial interest in a particular company, which might create a bias in scientific decisions affecting the future of that company (as might be the case if a researcher with stock in a company were paid to determine the usefulness of a new device produced by the company). Or a scientist might receive a manuscript or proposal to review that discusses work similar to but a step ahead of that being done by the reviewer. These are difficult situations that require trade-offs and hard choices, and the scientific community is still debating what is and is not proper when many of these situations arise.

Virtually all institutions that conduct research now have policies and procedures for managing conflicts of interest. In addition, many editors of scientific journals have established explicit policies regarding conflicts of interest. These policies and procedures are designed to protect the integrity of the scientific process, the missions of the institutions, the investment of stakeholders in institutions (including the investments of parents and students in universities), and public confidence in the integrity of research.

Disclosure of conflicts of interest subjects these concerns to the same social mechanisms that are so effective elsewhere in society. In some cases it may only be necessary for a researcher to inform a journal editor of a potential conflict of interest, leaving it for the editor to decide what action is necessary. In other cases careful monitoring of research activities can allow important research with a potential conflict of interest to go forward while protecting the integrity of the institution and of science. In any of these cases the intent is to involve outside monitors or otherwise create checks to reduce the possibility that bias will enter into science.


A CONFLICT OF INTEREST

John, a third-year graduate student, is participating in a department-wide seminar where students, postdocs, and faculty members discuss work in progress. An assistant professor prefaces her comments by saying that the work she is about to discuss is sponsored by both a federal grant and a biotechnology firm for which she consults. In the course of the talk John realizes that he has been working on a technique that could make a major contribution to the work being discussed. But his faculty advisor consults for a different, and competing, biotechnology firm.

1. How should John participate in this seminar?

2. What, if anything, should he say to his advisor-and when?

3. What implications does this case raise for the traditional openness and sharing of data, materials, and findings that have characterized modern science?


INDUSTRIAL SPONSORSHIP OF ACADEMIC RESEARCH

Sandra was excited about being accepted as a graduate student in the laboratory of Dr. Frederick, a leading scholar in the field, and she embarked on her assigned research project eagerly. But after a few months she began to have misgivings. Though part of Dr. Frederick's work was supported by federal grants, the project on which she was working was totally supported by a grant from a single company. She had known this before coming to the lab and had not thought it would be a problem. But she had not known that Dr. Frederick also had a major consulting agreement with the company. She also heard from other graduate students that when it came time to publish her work, any paper would be subject to review by the company to determine if any of her work was patentable.

1. What are the advantages and disadvantages of Sandra doing research sponsored entirely by a single company?

2. How can she address the specific misgivings she has about her research?

3. If Sandra wishes to discuss her qualms with someone at her university, to whom should she turn?


PUBLICATION AND OPENNESS

Science is not an individual experience. It is shared knowledge based on a common understanding of some aspect of the physical or social world. For that reason, the social conventions of science play an important role in establishing the reliability of scientific knowledge. If these conventions are disrupted, the quality of science can suffer.

Many of the social conventions that have proven so effective in science arose during the birth of modern science in the latter half of the seventeenth century. At that time, many scientists sought to keep their work secret so that others could not claim it as their own. Prominent figures of the time, including Isaac Newton, were loath to convey news of their discoveries for fear that someone else would claim priority-a fear that was frequently realized.

The solution to the problem of making new discoveries public while assuring their author's credit was worked out by Henry Oldenburg, the secretary of the Royal Society of London. He won over scientists by guaranteeing rapid publication in the society's Philosophical Transactions as well as the official support of the society if the author's priority was brought into question. Oldenburg also pioneered the practice of sending submitted manuscripts to experts who could judge their quality. Out of these innovations rose both the modern scientific journal and the practice of peer review.

The continued importance of publication in learned journals accounts for the convention that the first to publish a view or finding, not the first to discover it, tends to get most of the credit for the discovery. Once results are published, they can be freely used by other researchers to extend knowledge. But until the results become common knowledge, people who use them are obliged to recognize the discoverer through citations. In this way scientists are rewarded through peer recognition for making results public.

Before publication, different considerations apply. If someone else exploits unpublished material that is seen in a privileged grant application or manuscript, that person is essentially stealing intellectual property. In industry the commercial rights to scientific work belong more to the employer than the employee, but similar provisions apply: research results are privileged until they are published or otherwise publicly disseminated.

Many scientists are generous in discussing their preliminary theories or results with colleagues, and some even provide copies of raw data to others prior to public disclosure to facilitate related work. But scientists are not expected to make their data and thinking available to others at all times. During the initial stages of research, a scientist deserves a period of privacy in which data are not subject to disclosure. This privacy allows individuals to advance their work to the point at which they have confidence both in its accuracy and its meaning.

After publication, scientists expect that data and other research materials will be shared with qualified colleagues upon request. Indeed, a number of federal agencies, journals, and professional societies have established policies requiring the sharing of research materials. Sometimes these materials are too voluminous, unwieldy, or costly to share freely and quickly. But in those fields in which sharing is possible, a scientist who is unwilling to share research materials with qualified colleagues runs the risk of not being trusted or respected. In a profession where so much depends on interpersonal interactions, the professional isolation that can follow a loss of trust can damage a scientist's work.

Publication in a peer-reviewed journal remains the standard means of disseminating scientific results, but other methods of communication are subtly altering how scientists divulge and receive information. Posters, abstracts, lectures at professional gatherings, and proceedings volumes are being used more often to present preliminary results before full review. Preprints and computer networks are increasing the ease and speed of scientific communications. These new methods of communication are in many cases just elaborations of the informal exchanges that pervade science. To the extent that they speed and improve communication and revision, they will strengthen science. But if publication practices, either new or traditional, bypass quality control mechanisms, they risk weakening conventions that have served science well.

An example is the scientist who releases important and controversial results directly to the public before submitting them to the scrutiny of peers. If the researcher has made a mistake or the findings are misinterpreted by the media or the public, the scientific community and the public may react adversely. When such news is to be released to the press, it should be done when peer review is complete-normally at the time of publication in a scientific journal.

Sometimes researchers and the institutions sponsoring research have different interests in making results public. For example, a scientist doing research sponsored by industry may want to publish results quickly, while the industrial sponsor may want to keep results private-at least temporarily-to establish intellectual property rights prior to disclosure. Research institutions and government agencies have started to adopt explicit policies to reduce conflicts over such issues of ownership and access.

In research that has the potential of being financially profitable, openness can be maintained by the granting of patents. Patents enable an individual or institution to profit from a scientific discovery in return for making the results public. Scientists who may be doing patentable work have special obligations to the sponsors of that work. For example, they may need to have their laboratory notebooks validated and dated by others. They may also have to disclose potentially valuable discoveries promptly to the patent official of the organization sponsoring the research.

In some situations, such as proprietary research sponsored by industry or militarily sensitive research, openness in disseminating research results may not be possible. Scientists working under such conditions may need to find other ways of exposing their work to professional scrutiny. Unclassified summaries of classified work can compensate for the lack of open scrutiny that allows the validation of results elsewhere in science. Properly structured visiting committees can examine proprietary or classified research while maintaining confidentiality.


THE SHARING OF RESEARCH MATERIALS

Ed, a fourth-year graduate student, was still several months away from finishing an ongoing research project when a new postdoc arrived from a laboratory doing similar work. After the two were introduced, Ed automatically asked about the work going on in the other lab and was surprised to hear that researchers there had successfully developed a reagent that he was still struggling to perfect. Knowing that both labs had policies requiring the sharing of research materials, Ed wrote a letter to the head of the other lab asking if the laboratory could share some of the reagent with him. He didn't expect there to be a problem, because his project was not in competition with the work of the other lab, but a couple of weeks later he got a letter from the lab director saying that the reagent could not be shared because it was still "poorly developed and characterized."

The new postdoc, upon hearing the story, said, "That's ridiculous. They just don't want to give you a break."

1. Where can Ed go for help in obtaining the materials?

2. Are there risks in involving other people in this situation?

3. What kinds of information is it appropriate for researchers to share with their colleagues when they change laboratories?


"We thus begin to see that the institutionalized practice of citations and references in the sphere of learning is not a trivial matter. While many a general reader - that is, the lay reader located outside the domain of science and scholarship - may regard the lowly footnote or the remote endnote or the bibliographic parenthesis as a dispensable nuisance, it can be argued that these are in truth central to the incentive system and an underlying sense of distributive justice that do much to energize the advancement of knowledge."

- Robert K. Merton, "The Matthew Effect in Science, II: Cumulative Advantage and the Symbolism of Intellectual Property," Isis, 79: 621, 1988.


THE ALLOCATION OF CREDIT

The principle of fairness and the role of personal recognition within the reward system of science account for the emphasis given to the proper allocation of credit. In the standard scientific paper, credit is explicitly acknowledged in three places: in the list of authors, in the acknowledgments of contributions from others, and in the list of references or citations. Conflicts over proper attribution can arise in any of these places.

Citations serve many purposes in a scientific paper. They acknowledge the work of other scientists, direct the reader toward additional sources of information, acknowledge conflicts with other results, and provide support for the views expressed in the paper. More broadly, citations place a paper within its scientific context, relating it to the present state of scientific knowledge.

Failure to cite the work of others can give rise to more than just hard feelings. Citations are part of the reward system of science. They are connected to funding decisions and to the future careers of researchers. More generally, the misallocation of credit undermines the incentive system for publication.

In addition, scientists who routinely fail to cite the work of others may find themselves excluded from the fellowship of their peers. This consideration is particularly important in one of the more intangible aspects of a scientific career-that of building a reputation. Published papers document a person's approach to science, which is why it is important that they be clear, verifiable, and honest. In addition, a researcher who is open, helpful, and full of ideas becomes known to colleagues and will benefit much more than someone who is secretive or uncooperative.

Some people succeed in science despite their reputations. Many more succeed at least in part because of their reputations.


CREDIT WHERE CREDIT IS DUE

Ben, a third-year graduate student, had been working on a research project that involved an important new experimental technique. For a national meeting in his discipline, Ben wrote an abstract and gave a brief presentation that mentioned the new technique. After his presentation, he was surprised and pleased when Dr. Freeman, a leading researcher from another university, engaged him in an extended conversation. Dr. Freeman asked Ben extensively about the new technique, and Ben described it fully. Ben's own faculty advisor often encouraged his students not to keep secrets from other researchers, and Ben was flattered that Dr. Freeman would be so interested in his work.

Six months later Ben was leafing through a journal when he noticed an article by Dr. Freeman. The article described an experiment that clearly depended on the technique that Ben had developed. He didn't mind; in fact, he was again somewhat flattered that his technique had so strongly influenced Dr. Freeman's work. But when he turned to the citations, expecting to see a reference to his abstract or presentation, his name was nowhere to be found.

1. Does Ben have any way of receiving credit for his work?

2. Should he contact Dr. Freeman in an effort to have his work recognized?

3. Is Ben's faculty advisor mistaken in encouraging his students to be so open about their work?


AUTHORSHIP PRACTICES

The allocation of credit can also become an issue in the listing of authors' names. Science has become a much more collaborative enterprise than it was in the past. The average number of authors for articles in the New England Journal of Medicine, for example, has risen from slightly more than one in 1925 to more than six today. In some areas, such as high-energy physics or genome sequencing, the number of authors can rise into the hundreds. This increased collaboration has produced many new opportunities for researchers to work with colleagues at different stages in their careers, in different disciplines, or even in widely separated locations. It has also increased the possibility for differences to arise over questions of authorship.

In many fields, the earlier a name appears in the list of authors, the greater the implied contribution, but conventions differ greatly among disciplines and among research groups. Sometimes the scientist with the greatest name recognition is listed first, whereas in other fields the research leader's name is always last. In some disciplines supervisors' names rarely appear on papers, while in others the professor's name appears on almost every paper that comes out of the lab. Some research groups and journals avoid these decisions by simply listing authors alphabetically.

Frank and open discussion of the division of credit within research groups-as early in the research process as possible and preferably at the very beginning, especially for research leading to a published paper-can prevent later difficulties. The best practice is for authorship criteria to be explicit among all collaborators. In addition, collaborators should be familiar with the conventions in a particular field to understand their rights and obligations. Group meetings provide an occasion to discuss ethical and policy issues in research.

The allocation of credit can be particularly sensitive when it involves researchers at different stages of their careers-for example, postdocs and graduate students, or senior faculty and student researchers. In such situations, differences in roles and status compound the difficulties of according credit.

Several considerations must be weighed in determining the proper division of credit between a student or research assistant and a senior scientist, and a range of practices are acceptable. If a senior researcher has defined and put a project into motion and a junior researcher is invited to join in, major credit may go to the senior researcher, even if at the moment of discovery the senior researcher is not present. By the same token, when a student or research assistant is making an intellectual contribution to a research project, that contribution deserves to be recognized. Senior scientists are well aware of the importance of credit in science and are expected to give junior researchers credit where warranted. In such cases, junior researchers may be listed as coauthors or even senior authors, depending on the work, traditions within the field, and arrangements within the team.

Occasionally a name is included in a list of authors even though that person had little or nothing to do with the content of a paper. Such "honorary authors" dilute the credit due the people who actually did the work, inflate the credentials of those so "honored," and make the proper attribution of credit more difficult. Several scientific journals now state that a person should be listed as the author of a paper only if that person made a direct and substantial contribution to the paper. Some journals require all named authors to sign the letter that accompanies submission of the original article and all subsequent revisions to ensure that no author is named without consent and that all authors agree with the final version.

As with citations, author listings establish accountability as well as credit. When a paper is found to contain errors, whether caused by mistakes or deceit, authors might wish to disavow responsibility, saying that they were not involved in the part of the paper containing the errors or that they had very little to do with the paper in general. However, an author who is willing to take credit for a paper must also bear responsibility for its contents. Thus, unless a footnote or the text of the paper explicitly assigns responsibility for different parts of the paper to different authors, the authors whose names appear on a paper must share responsibility for all of it.


WHO SHOULD GET CREDIT FOR THE DISCOVERY OF PULSARS?

A much-discussed example of the difficulties associated with allocating credit between junior and senior researchers was the 1967 discovery of pulsars by Jocelyn Bell, then a 24-year-old graduate student. Over the previous two years, Bell and several other students, under the supervision of Bell's thesis advisor, Antony Hewish, had built a 4.5-acre radio telescope to investigate scintillating radio sources in the sky. After the telescope began functioning, Bell was in charge of operating it and analyzing its data under Hewish's direction. One day Bell noticed "a bit of scruff" on the data chart. She remembered seeing the same signal earlier and, by measuring the period of its recurrence, determined that it had to be coming from an extraterrestrial source. Together Bell and Hewish analyzed the signal and found several similar examples elsewhere in the sky. After discarding the idea that the signals were coming from an extraterrestrial intelligence, Hewish, Bell, and three other people involved in the project published a paper announcing the discovery, which was given the name "pulsar" by a British science reporter.

Many argued that Bell should have shared the Nobel Prize awarded to Hewish for the discovery, saying that her recognition of the signal was the crucial act of discovery. Others, including Bell herself, said that she received adequate recognition in other ways and should not have been so lavishly rewarded for doing what a graduate student is expected to do in a project conceived and set up by others.


ERROR AND NEGLIGENCE IN SCIENCE

Scientific results are inherently provisional. Scientists can never prove conclusively that they have described some aspect of the natural or physical world with complete accuracy. In that sense all scientific results must be treated as susceptible to error.

Errors arising from human fallibility also occur in science. Scientists do not have limitless working time or access to unlimited resources. Even the most responsible scientist can make an honest mistake. When such errors are discovered, they should be acknowledged, preferably in the same journal in which the mistaken information was published. Scientists who make such acknowledgments promptly and openly are rarely condemned by colleagues.

Mistakes made through negligent work are treated more harshly. Haste, carelessness, inattention-any of a number of faults can lead to work that does not meet the standards demanded in science. If scientists cut corners for whatever reason, they are placing their reputation, the work of their colleagues, and the public's confidence in science at risk.

Some researchers may feel that the pressures on them are an inducement to haste at the expense of care. For example, they may believe that they have to do substandard work to compile a long list of publications and that this practice is acceptable. Or they may be tempted to publish virtually the same research results in two different places or publish their results in "least publishable units"-papers that are just detailed enough to be published but do not give the full story of the research project described.

Sacrificing quality to such pressures can easily backfire. A lengthy list of publications cannot outweigh a reputation for shoddy research. Scientists with a reputation for publishing work of dubious quality will generally find that all of their publications are viewed with skepticism by their colleagues. Reflecting the importance of quality, some institutions and federal agencies have recently adopted policies that limit the number of papers considered when an individual is evaluated for appointment, promotion, or funding.

By introducing preventable errors into science, sloppy or negligent research can do great damage-even if the error is eventually uncovered and corrected. Though science is built on the idea of peer validation and acceptance, actual replication is selective. It is not practical (or necessary) to reconstruct all the observations and theoretical constructs that go into an investigation. Researchers have to trust that previous investigators performed the work as reported.

If that trust is misplaced and the previous results are inaccurate, the truth will likely emerge as problems arise in the ongoing investigation. But researchers can waste months or years of effort because of erroneous results, and public confidence in the integrity of science can be seriously undermined.


PUBLICATION PRACTICES

Paula, a young assistant professor, and two graduate students have been working on a series of related experiments for the past several years. During that time, the experiments have been written up in various posters, abstracts, and meeting presentations. Now it is time to write up the experiments for publication, but the students and Paula must first make an important decision. They could write a single paper with one first author that would describe the experiments in a comprehensive manner, or they could write a series of shorter, less complete papers so that each student could be a first author.

Paula favors the first option, arguing that a single publication in a more visible journal would better suit all of their purposes. Paula's students, on the other hand, strongly suggest that a series of papers be prepared. They argue that one paper encompassing all the results would be too long and complex and might damage their career opportunities because they would not be able to point to a paper on which they were first authors.

1. If the experiments are part of a series, are Paula and her students justified in not publishing them together?

2. If they decided to publish a single paper, how should the listing of authors be handled?

3. If a single paper is published, how can they emphasize to the review committees and funding agencies their various roles and the importance of the paper?


"Of all the traits which quality a scientist for citizenship in the republic of science, I would put a sense of responsibility as a scientist at the very top. A scientist can be brilliant, imaginative, clever with his hands, profound, broad, narrow - but he is not much as a scientist unless he is responsible."

- Alvin Weinberg, "The Obligations of Citizenship in the Republic of Science," Minerva, 16:1-3, 1978.


MISCONDUCT IN SCIENCE

Beyond honest errors and errors caused through negligence lies a third category of errors: those that involve deception. Making up data or results (fabrication), changing or misreporting data or results (falsification), and using the ideas or words of another person without giving appropriate credit (plagiarism)-all strike at the heart of the values on which science is based. These acts of scientific misconduct undermine not only scientific progress but the entire set of values on which the scientific enterprise rests. Anyone who engages in any of these practices is putting his or her scientific career at risk. Even infractions that may seem minor at the time can end up being severely punished.

The ethical transgressions discussed in earlier sections-such as misallocation of credit or errors arising from negligence-are matters that generally remain internal to the scientific community. Usually they are dealt with locally through the mechanisms of peer review, administrative action, and the system of appointments and evaluations in the research environment. But misconduct in science is unlikely to remain internal to the scientific community. Its consequences are too extreme: it can harm individuals outside of science (as when falsified results become the basis of a medical treatment), it squanders public funds, and it attracts the attention of those who would seek to criticize science. As a result, federal agencies, Congress, the media, and the courts can all get involved.

Within the scientific community, the effects of misconduct-in terms of lost time, forfeited recognition to others, and feelings of personal betrayal-can be devastating. Individuals, institutions, and even entire research fields can suffer grievous setbacks from instances of fabrication, falsification, or plagiarism even if they are only tangentially associated with the case.

When individuals have been accused of scientific misconduct in the past, the institutions responsible for responding to those accusations have taken a number of different approaches. In general, the most successful responses are those that clearly separate a preliminary investigation to gather information from a subsequent adjudication to judge guilt or innocence and issue sanctions if necessary. During the adjudication stage, the individual accused of misconduct has the right to various due process protections, such as reviewing the evidence gathered during the investigation and cross-examining witnesses.

In addition to falsification, fabrication, and plagiarism, other ethical transgressions directly associated with research can cause serious harm to individuals and institutions. Examples include cover-ups of misconduct in science, reprisals against whistleblowers, malicious allegations of misconduct in science, and violations of due process in handling complaints of misconduct in science. Policymakers and scientists have not decided whether such actions should be considered misconduct in science-and therefore subject to the same procedures and sanctions as falsification, fabrication, and plagiarism-or whether they should be investigated and adjudicated through different channels. Regulations adopted by the National Science Foundation and the Public Health Service define misconduct to include "other serious deviations from accepted research practices," in addition to falsification, fabrication, and plagiarism, leaving open the possibility that other actions could be considered misconduct in science. The problem with such language is that it could allow a scientist to be accused of misconduct for using novel or unorthodox research methods, even though such methods are sometimes needed to proceed in science. Federal officials respond by saying that this language is needed to prosecute ethical breaches that do not strictly fall into the categories of falsification, fabrication, or plagiarism and that no scientist has been accused of misconduct on the basis of using unorthodox research methods. This area of science policy is still evolving.

Another category of behaviors-including sexual or other forms of harassment, misuse of funds, gross negligence in a person's professional activities, tampering with the experiments of others or with instrumentation, and violations of government research regulations-is not necessarily associated with scientific conduct. Institutions need to discourage and respond to such behaviors, but these behaviors are subject to generally applicable legal and social penalties and should be dealt with using the same procedures that would be applied to anyone.


FABRICATION IN A GRANT APPLICATION

Don is a first-year graduate student applying to the National Science Foundation for a predoctoral fellowship. His work in a lab where he did a rotation project was later carried on successfully by others, and it appears that a manuscript will be prepared for publication by the end of the summer. However, the fellowship application deadline is June 1, and Don decides it would be advantageous to list a publication as "submitted." Without consulting the faculty member or other colleagues involved, Don makes up a title and author list for a "submitted" paper and cites it in his application.

After the application has been mailed, a lab member sees it and goes to the faculty member to ask about the "submitted" manuscript. Don admits to fabricating the submission of the paper but explains his actions by saying that he thought the practice was not uncommon in science.

The faculty members in Don's department demand that he withdraw his grant application, and they dismiss him from the graduate program. After leaving the university, Don applies for a master's degree, since he has fulfilled the course requirements. Although the department votes not to grant him a degree, the university administration does so because the university graduate bulletin does not state that a student in Don's department must be in "good standing" to receive a degree. Administrators fear that Don will bring suit against the university if the degree is denied. Likewise, nothing regarding his dismissal will appear in Don's university transcript.

1. Do you agree with Don that scientists often exaggerate the publication status of their work in written materials?

2. Do you think the department acted too harshly in dismissing Don from the graduate program?

3. Do you believe that being in "good standing" should be a prerequisite for obtaining an advanced degree in science? If Don later applied to a graduate program at another institution, does that institution have the right to know what happened?


A CASE OF PLAGIARISM

May is a second-year graduate student preparing the written portion of her qualifying exam. She incorporates whole sentences and paragraphs verbatim from several published papers. She does not use quotation marks, but the sources are suggested by statements like "(see . . . for more details)." The faculty on the qualifying exam committee note inconsistencies in the writing styles of different paragraphs of the text and check the sources, uncovering May's plagiarism.

After discussion with the faculty, May's plagiarism is brought to the attention of the dean of the graduate school, whose responsibility it is to review such incidents. The graduate school regulations state that "plagiarism, that is, the failure in a dissertation, essay, or other written exercise to acknowledge ideas, research or language taken from others" is specifically prohibited. The dean expels May from the program with the stipulation that she can reapply for the next academic year.

1. Is plagiarism like this a common practice?

2. Are there circumstances that should have led to May's being forgiven for plagiarizing?

3. Should May be allowed to reapply to the program?


RESPONDING TO VIOLATIONS OF ETHICAL STANDARDS

One of the most difficult situations that a researcher can encounter is to see or suspect that a colleague has violated the ethical standards of the research community. It is easy to find excuses to do nothing, but someone who has witnessed misconduct has an unmistakable obligation to act. At the most immediate level, misconduct can seriously obstruct or damage one's own research or the research of colleagues. More broadly, even a single case of misconduct can damage the reputations of scientists and their institutions, result in the imposition of counterproductive regulations, and shake public confidence in the integrity of science.

To be sure, raising a concern about unethical conduct is rarely an easy thing to do. In some cases, anonymity is possible-but not always. Reprisals by the accused person and by skeptical colleagues have occurred in the past and have had serious consequences. Any allegation of misconduct is a very important charge that needs to be taken seriously. If mishandled, an allegation can gravely damage the person charged, the one who makes the charge, the institutions involved, and science in general.

Someone who is confronting a problem involving research ethics usually has more options than are immediately apparent. In most cases the best thing to do is to discuss the situation with a trusted friend or advisor. In universities, faculty advisors, department chairs, and other senior faculty can be invaluable sources of advice in deciding whether to go forward with a complaint.

An important consideration is deciding when to put a complaint in writing. Once a complaint is in writing, universities are obligated to deal with it more formally than if it is raised verbally. Because putting a complaint in writing can have serious consequences for a scientist's career, it should be done only after thorough consideration.

The National Science Foundation and Public Health Service require all research institutions that receive public funds to have procedures in place to deal with allegations of unethical practice. These procedures take into account fairness for the accused, protection for the accuser, coordination with funding agencies, and requirements for confidentiality and disclosure.

In addition, many universities and other research institutions have designated an ombudsman, ethics officer, or other official who is available to discuss situations involving research ethics. Such discussions are carried out in strictest confidence whenever possible. Some institutions provide for multiple entry points, so that complainants can go to a person with whom they feel comfortable.

Government agencies, including the National Science Foundation and Public Health Service, enforce laws and regulations that deal with misconduct in science. At the Public Health Service in Washington, D.C., complaints can be referred to the appropriate office through the Office of Research Integrity. At the National Science Foundation in Arlington, Virginia, complaints can be directed to the Office of the Inspector General. Within universities, research grant officials can provide guidance on whether federal rules may be involved in filing a complaint.

Many institutions have prepared written materials that offer guidance in situations involving professional ethics. Volume II of Responsible Science: Ensuring the Integrity of the Research Process (National Academy Press, Washington, D.C., 1993) reprints a number of these documents. Sigma Xi, a national society of research scientists headquartered in Research Triangle Park, North Carolina, the American Association for the Advancement of Science in Washington, D.C., and other scientific and engineering professional organizations also are prepared to advise scientists who encounter cases of possible misconduct.

The research system exerts many pressures on beginning and experienced researchers alike. Principal investigators need to raise funds and attract students. Faculty members must balance the time spent on research with the time spent teaching undergraduates. Industrial sponsorship of research introduces the possibility of conflicts of interest.

All parts of the research system have a responsibility to recognize and respond to these pressures. Institutions must review their own policies, foster awareness of research ethics, and ensure that researchers are aware of the policies that are in place. And researchers should constantly be aware of the extent to which ethically based decisions will influence their success as scientists.


A CAREER IN THE BALANCE

Francine was just months away from finishing her Ph.D. dissertation when she realized that something was seriously amiss with the work of a fellow graduate student, Sylvia. Francine was convinced that Sylvia was not actually making the measurements she claimed to be making. They shared the same lab, but Sylvia rarely seemed to be there. Sometimes Francine saw research materials thrown away unopened. The results Sylvia was turning in to their common thesis advisor seemed too clean to be real.

Francine knew that she would soon need to ask her thesis advisor for a letter of recommendation for faculty and postdoc positions. If she raised the issue with her advisor now, she was sure that it would affect the letter of recommendation. Sylvia was a favorite of her advisor, who had often helped Sylvia before when her project ran into problems. Yet Francine also knew that if she waited to raise the issue the question would inevitably arise as to when she first suspected problems. Both Francine and her thesis advisor were using Sylvia's results in their own research. If Sylvia's results were inaccurate, they both needed to know as soon as possible.

1. Should Francine first try to talk with Sylvia, with her thesis advisor, or with someone else entirely?

2. Does she know enough to be able to raise concerns?

3. Where else can Francine go for information that could help her decide what to do?


THE SCIENTIST IN SOCIETY

This booklet has concentrated on the responsibilities of scientists for the advancement of science, but scientists have additional responsibilities to society. Even scientists conducting the most fundamental research need to be aware that their work can ultimately have a great impact on society. Construction of the atomic bomb and the development of recombinant DNA-events that grew out of basic research on the nucleus of the atom and investigations of certain bacterial enzymes, respectively-are two examples of how seemingly arcane areas of science can have tremendous societal consequences.

The occurrence and consequences of discoveries in basic research are virtually impossible to foresee. Nevertheless, the scientific community must recognize the potential for such discoveries and be prepared to address the questions that they raise. If scientists do find that their discoveries have implications for some important aspect of public affairs, they have a responsibility to call attention to the public issues involved. They might set up a suitable public forum involving experts with different perspectives on the issue at hand. They could then seek to develop a consensus of informed judgment that can be disseminated to the public. A good example is the response of biologists to the development of recombinant DNA technologies-first calling for a temporary moratorium on the research and then helping to set up a regulatory mechanism to ensure its safety.

This document cannot describe the many responsibilities incumbent upon researchers because of science's function in modern society. The bibliography lists several volumes that examine the social roles of scientists in detail. The important point is that science and technology have become such integral parts of society that scientists can no longer isolate themselves from societal concerns. Nearly half of the bills that come before Congress have a significant scientific or technological component. Scientists are increasingly called upon to contribute to public policy and to the public understanding of science. They play an important role in educating nonscientists about the content and processes of science.

In fulfilling these responsibilities scientists must take the time to relate scientific knowledge to society in such a way that members of the public can make an informed decision about the relevance of research. Sometimes researchers reserve this right to themselves, considering nonexperts unqualified to make such judgments. But science offers only one window on human experience. While upholding the honor of their profession, scientists must seek to avoid putting scientific knowledge on a pedestal above knowledge obtained through other means.

Many scientists enjoy working with the public. Others see this obligation as a distraction from the work they would like to be doing. But concern and involvement with the broader uses of scientific knowledge are essential if scientists are to retain the public's trust.

The research enterprise has itself been changing as science has become increasingly integrated into everyday life. But the core values on which the enterprise is based-honesty, skepticism, fairness, collegiality, openness-remain unchanged. These values have helped produce a research enterprise of unparalleled productivity and creativity. So long as they remain strong, science-and the society it serves-will prosper.


"Any research organization requires generous measures of the following:

- social space for personal initiative and creativity;

- time for ideas to grow to maturity;

- openness to debate and criticism;

- hospitality toward novelty; and

- respect for specialized expertise.

[These] may sound too soft and old-fashioned to stand up against the cruel modern realities of administrative accountability and economic stringency. On the contrary, I believe that they are fundamental requirements for the continued advancement of scientific knowledge-and, of course, for its eventual social benefits."

- John Ziman, Prometheus Bound: Science in a Dynamic Steady State, Cambridge University Press, New York, 1994, p. 276.


THE NATIONAL RESEARCH COUNCIL AND SERVICE TO SOCIETY

One way in which scientists serve the needs of the broader society is by participating in the activities of the National Research Council, which is administered by the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The National Research Council brings together leaders from academe, industry, government, and other sectors to address critical national issues and provide advice to the U.S. government and its citizens. Over the course of a typical year, about 650 committees involving approximately 6,400 individuals study societally important issues that involve science and technology. All of these experts volunteer their time to serve on study committees, plan and participate in seminars, review documents, and otherwise assist in the work of the institution. Study committees work independently of government, sponsors, and special-interest groups. Continuous oversight and formal anonymous review of the results of the studies enhance objectivity and quality.


BIBLIOGRAPHY

Volume I of Responsible Science: Ensuring the Integrity of the Research Process (National Academy Press, Washington, D.C., 1992) presents a thorough analysis of scientific misconduct made by the Panel on Scientific Responsibility and the Conduct of Research under the Committee on Science, Engineering, and Public Policy of the National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. Volume II of Responsible Science (National Academy Press, Washington, D.C., 1993) contains a number of background papers, a selection of guidelines for the conduct of research, and examples of specific research policies and procedures for handling allegations of misconduct in science.

In The Responsible Conduct of Research in the Health Sciences (National Academy Press, Washington, D.C., 1989), the Institute of Medicine's Committee on the Responsible Conduct of Research examines institutional policies and procedures designed to strengthen the professional standards of academic research. Sharing Research Data, edited by Stephen E. Fienberg, Margaret E. Martin, and Miron L. Straf (National Academy Press, Washington, D.C., 1985), lays out general principles to govern the sharing of research results and the materials used in research.

An early but still excellent book on experimental and statistical methods for data reduction is E. Bright Wilson's An Introduction to Scientific Research (McGraw-Hill, New York, 1952). A more general book from the same period that remains useful today is The Art of Scientific Investigation by W. I. B. Beveridge (Third Edition, Vintage Books, New York, 1957).

A broad overview of the philosophy, sociology, politics, and psychology of science can be found in John Ziman's An Introduction to Science Studies: The Philosophical and Social Aspects of Science and Technology (Cambridge University Press, New York, 1984). Ziman analyzes many of the changes going on in contemporary science in Prometheus Bound: Science in a Dynamic Steady State (Cambridge University Press, New York, 1994).

Many pioneering essays by Robert K. Merton have been collected in The Sociology of Science (University of Chicago Press, Chicago, 1973). Stephen Cole analyzes and critiques some of the more modern work in the sociology of science in Making Science: Between Nature and Society (Harvard University Press, Cambridge, Mass., 1992).

Gerald Holton discusses the thematic presuppositions of scientists and the integrity of science in chapters 1 and 12 of his book Thematic Origins of Scientific Thought: Kepler to Einstein (Revised Edition, Harvard University Press, Cambridge, Mass., 1988). Holton elaborates on the historical context of research ethics in "On Doing One's Damnedest: The Evolution of Trust in Scientific Findings," which is chapter 7 in Einstein, History, and Other Passions (American Institute of Physics, New York, 1994). The roles of recognition and credit in science are discussed in chapters 8-10 of David Hull's Science as Process: An Evolutionary Account of the Social and Conceptual Development of Science (University of Chicago Press, Chicago, 1988).

Peter B. Medawar addresses the concerns of beginning researchers in his book Advice to a Young Scientist (Harper & Row, New York, 1979). "Honor in Science," by C. Ian Jackson, is a booklet offering "practical advice to those entering careers in scientific research" (Sigma Xi, The Scientific Research Society, Research Triangle Park, N.C., 1992). Ethics, Values, and the Promise of Science (Sigma Xi, The Scientific Research Society, Research Triangle Park, N.C., 1993), the proceedings of a 1992 forum held by Sigma Xi, contains a number of interesting papers on ethical scientific conduct.

Several insightful books offer advice for researchers about succeeding in a scientific career, including A Ph.D. Is Not Enough: A Guide to Survival in Science by Peter J. Feibelman (Addison-Wesley, Reading, Mass., 1993), The Incomplete Guide to the Art of Discovery by Jack E. Oliver (Columbia University Press, New York, 1991), and The Joy of Science by Carl J. Sindermann (Plenum Publishers, New York, 1985).

Alexander Kohn presents a number of case studies of misconduct and self-deception from the history of science and medicine in False Prophets: Fraud and Error in Science and Medicine (Basil Blackwell, New York, 1988). A lively book that discusses several historic cases of self-deception in science is Diamond Dealers and Feather Merchants: Tales from the Sciences by Irving M. Klotz (Birkhauser, Boston, 1986). The story of cold fusion is well told in Cold Fusion: The Scientific Fiasco of the Century by John R. Huizenga (Oxford University Press, New York, 1993) and in Gary Taubes' Bad Science: The Short Life & Hard Times of Cold Fusion (Random House, New York, 1993).

Harriet Zuckerman gives a thorough, scholarly analysis of scientific misconduct in "Deviant Behavior and Social Control in Science" (pp. 87-138 in Deviance and Social Change, Sage Publications, Beverly Hills, Calif., 1977). Frederick Grinnell has a chapter on scientific misconduct in the second edition of The Scientific Attitude (Guilford Press, New York, 1992).

The Association of American Medical Colleges has gathered a large number of case studies in Teaching the Responsible Conduct of Research Through a Case Study Approach (Association of American Medical Colleges, Washington, D.C., 1994). Research Ethics: Cases and Materials, edited by Robin Levin Penslar (Indiana University Press, Bloomington, 1994), contains a number of extended case studies as well as essays on various aspects of research ethics. In Understanding Ethical Problems in Engineering Practice and Research (Cambridge University Press, New York, 1995), Caroline Whitbeck examines issues of professional ethics (such as the engineer's or chemist's responsibility for safety) and research ethics. The American Association for the Advancement of Science and the American Bar Association have jointly issued several publications on issues of scientific ethics, including Good Science and Responsible Scientists: Meeting the Challenge of Fraud and Misconduct in Science, by Albert H. Teich and Mark S. Frankel (American Association for the Advancement of Science, Washington, D.C., 1991).

The report Scientific Freedom and Responsibility, prepared by John T. Edsall (American Association for the Advancement of Science, Washington, D.C., 1975), remains an important statement on the social obligations of scientists in the modern world. Rosemary Chalk has compiled a series of papers from Science magazine on ethics, scientific freedom, social responsibility, and a number of other topics in Science, Technology, and Society: Emerging Relationships (American Association for the Advancement of Science, Washington, D.C., 1988).

The Barbara McClintock quotation on the first page of the document came from A Feeling for the Organism: The Life and Work of Barbara McClintock by Evelyn Fox Keller (W.H. Freeman, San Francisco, 1983).

Among audiovisual materials, the NOVA program "Do Scientists Cheat?" stands out as a balanced treatment of ethical issues in the conduct of research.


APPENDIX: DISCUSSION OF CASE STUDIES

The hypothetical scenarios included in this booklet raise many different issues that can be discussed and debated. The observations and questions given below suggest just some of the areas that can be explored.

THE SELECTION OF DATA

Deborah and Kathleen's principal obligation, in writing up their results for publication, is to describe what they have done and give the basis for their actions. They must therefore examine how they can meet this obligation within the context of the experiment they have done. Questions that need to be answered include: If the authors state in the paper that data have been rejected because of problems with the power supply, should the data points still be included in the published chart? Should statistical analyses be done that both include and exclude the questionable data? If conventions within their discipline allow for the use of statistical devices to eliminate outlying data points, how explicit do Deborah and Kathleen need to be in the published paper about the procedures they have followed?
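
One of the questions above asks whether analyses should be reported both with and without the questionable points. The short sketch below, in Python, illustrates what such a side-by-side summary might look like; the measurements, the flags, and all names are hypothetical and are offered only as an illustration, not as part of the committee's guidance.

    # Hypothetical illustration: summarize a data set both with and without
    # points flagged for a suspected instrument problem, so that readers can
    # see the effect of the exclusion for themselves. All values are invented.
    import statistics

    measurements = [9.8, 10.1, 10.0, 9.9, 14.2, 10.2, 13.9]
    flagged = [False, False, False, False, True, False, True]  # e.g., runs made with the faulty power supply

    def summarize(values):
        # Mean and sample standard deviation of the values.
        return statistics.mean(values), statistics.stdev(values)

    retained = [m for m, f in zip(measurements, flagged) if not f]

    mean_all, sd_all = summarize(measurements)
    mean_kept, sd_kept = summarize(retained)

    print("All %d points:        mean = %.2f, s.d. = %.2f" % (len(measurements), mean_all, sd_all))
    print("Excluding %d flagged: mean = %.2f, s.d. = %.2f" % (sum(flagged), mean_kept, sd_kept))

Reporting both summaries, together with the stated reason for flagging the points, keeps the basis for any exclusion visible to readers and reviewers.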

A CONFLICT OF INTEREST

Science thrives in an atmosphere of open communication. When communication is limited, progress is limited for everyone. John therefore needs to weigh the advantages of keeping quiet - if in fact there are any - against the damage that accrues to science if he keeps his suggestion to himself. He might also ask himself how keeping quiet might affect his own life in science. Does he want to appear to his advisor and his peers as someone who is less than forthcoming with his ideas? Will he enjoy science as much if he purposefully limits communication with others?

INDUSTRIAL SPONSORSHIP OF ACADEMIC RESEARCH

Sandra has enrolled in the university to receive an education, not to work for industry. But working on industrially sponsored research is not necessarily incompatible with getting a good education. In fact, it can be a valuable way to gain insight into industrially oriented problems and to prepare for future work that has direct applications to societal needs. The question that must be asked is whether the nature of the research is subverting Sandra's education. Sandra's faculty advisor has entered into a relationship that could result in conflicts of interest. That relationship is therefore most likely to be subject to review by third parties. Can Sandra turn to those responsible for overseeing the research for help in resolving her own uncertainties? What would be the possible effects on her career if she did so?

THE SHARING OF RESEARCH MATERIALS

After a research material like a reagent has been described in a publication, sharing that material speeds and in some cases enables the replication of results and therefore contributes to the progress of science. But the reagent in this situation has not yet been described in a published paper, so the provisions for sharing it are different. Ed needs to consider the other laboratory's legitimate interest in developing that material and establishing how it works before publication. He also needs to consider the relationship between the two laboratories. If he turns to his faculty advisor for help in acquiring the reagent, how is his advisor likely to respond? Is there any way he can work with the other laboratory and thereby come a step closer to forming an agreement with them about the use of the reagent?

CREDIT WHERE CREDIT IS DUE

Ben is to be commended for being open and for seeking to involve others in his work. He will benefit from that openness, even if he seems not to have benefited in this situation. At the same time, Ben has to ask himself honestly if his comments were a critical factor in Dr. Freeman's work. If Dr. Freeman had already had the same ideas, he should have told Ben this during their conversation. But could the same ideas have come from elsewhere? If Ben is still convinced that he has not been treated fairly, he will need to work with his research advisor to see if his contributions can be acknowledged. One option would be to see if his advisor would cosign a letter with Ben or write a letter on Ben's behalf addressing this issue. Ben will need to think about the possible implications of this course of action for his own career. What if Dr. Freeman writes back and says that the lack of credit was an oversight and that he will credit Ben in the future? What if he says that Ben's objections are not warranted and gives the reasons why?

PUBLICATION PRACTICES

Contributions to a scientific field are not counted in terms of the number of papers. They are counted in terms of significant differences in how science is understood. With that in mind, Paula and her students need to consider how they are most likely to make a significant contribution to their field. One determinant of impact is the coherence and completeness of a paper. Paula and her students may need to begin writing before they can tell whether one or more papers is needed. In retrospect, Paula and her students might also ask themselves about the process that led to their decision. Should they have discussed publications much earlier in the process? Were the students led to believe that they would be first authors on published papers? If so, should that influence future work in the lab?

FABRICATION IN A GRANT APPLICATION

Even though Don did not introduce spurious results into science, he fabricated the submission of the research paper and therefore engaged in misconduct. Though his treatment by the department might seem harsh, fabrication strikes so directly at the foundations of science that it is not excusable. This scenario also demonstrates that researchers and administrators in an institution may differ on the appropriate course of action to take when research ethics are violated. Sometimes institutions may be unwilling or unable to respond to an ethical transgression in the way the scientific community would desire. Researchers might then have to decide the extent to which they are willing to impose and enforce sanctions themselves.

A CASE OF PLAGIARISM

A broad spectrum of misconduct falls into the category of plagiarism, ranging from obvious theft to uncredited paraphrasing that some might not consider dishonest at all. In a lifetime of reading, theorizing, and experimenting, a person's work will inevitably incorporate and overlap with that of others. However, occasional overlap is one thing; systematic use of the techniques, data, words, or ideas of others without appropriate acknowledgment is another. A person's background can play a role in considering episodes of plagiarism. For example, what if May had never been taught the conventions and institutional policies governing the attribution of others' work? Should she then have been treated more leniently?

A CAREER IN THE BALANCE

Francine's most obvious option is to discuss the situation with her research advisor, but she has to ask herself if this is the best alternative. Her advisor is professionally and emotionally involved in the situation and may not be able to take an impartial stance. In addition, because the advisor is involved in the situation, she may feel the need to turn the inquiry into a formal investigation or to report the inquiry to her supervisors. Francine should also consider whether she can discuss the situation directly with Sylvia. Many suspicions evaporate when others have a chance to explain actions that may have been misinterpreted. If Francine feels that she cannot talk with Sylvia, she needs some way to discuss her concerns confidentially. Maybe she could turn to a trusted friend, another member of the faculty, someone on the university's administrative staff, or an ombudsman designated by the university. That person can help Francine explore such questions as: What is known and what is not known about the situation? What are the options available to her? Should she put her concerns in writing, an action likely to lead to a formal investigation?