Thursday, November 24, 2011

Week 15: Science and Values

So by popular demand, we will turn in our last few meetings of the course to a series of interconnected questions about the role of values in science. It seems to me that we can organize our investigation under the heading of three broad questions:
  1. What special moral issues/problems are raised by science?
  2. Do individual scientists have special responsibilities or duties that go above and beyond the dictates of general morality?
  3. How should societies structure their collective scientific efforts? 
For Tuesday, I've assigned a chapter from Resnik's 1998 book The Ethics of Science (regarded by many as a modern classic in the field), which surveys several of the issues that arise under heading (1). Some of these issues may strike us as relatively straightforward or unproblematic. For instance, the existence in science of certain distinctive professional relationships or power structures raises the potential for immoral behavior within the structure of those relationships. Scientists can abuse their responsibilities as mentors or teachers; they can falsify or misrepresent data (i.e., lie), fail to behave fairly, and so on. No surprise here: scientists are people, after all, and people are known to act in immoral ways. Science may be no different from other human activities (e.g., sport) in introducing novel ways in which to be immoral, but it may not require any novel moral concepts in order for us to evaluate these behaviors. We require scientists to obey the dictates of morality simply because we require everyone to obey those dictates.

Of course, the fact that many issues arising in the practice of science can be treated with general moral concepts we already have doesn't necessarily mean that those issues will be straightforward. Perhaps the questions about human and animal experimentation are like this. History has witnessed some truly disturbing instances of the violation of human rights in the pursuit of science. But even if we agree about the wrongness of, e.g., subjecting unconsenting humans to extreme cold (as the Nazis did; see French pp. 126–7), we might disagree about the morality of using the resulting data to save lives. Are there other issues that cannot be straightforwardly handled by a commonsense, general morality? Granted that scientists should not act immorally, do they incur any further responsibilities in virtue of being scientists? For example, do they have a responsibility to think about the potential outcomes of their research? Is there any research that should be off-limits? At this point, we may want to distinguish, as Resnik does in an earlier chapter, between morality and ethics. Resnik writes:
Morality consists of a society’s most general standards. These standards apply to all people in society regardless of their professional or institutional roles (Pojman 1995). Moral standards distinguish between right and wrong, good and bad, virtue and vice, justice and injustice. Many writers maintain that moral duties and obligations override other ones: if I have a moral duty not to lie, then I should not lie even if my employment requires me to lie. Moral standards include those rules that most people learn in childhood, e.g. “don’t lie, cheat, steal, harm other people, etc.” Ethics are not general standards of conduct but the standards of a particular profession, occupation, institution, or group within society. The word “ethics,” when used in this way, usually serves as a modifier for another word, e.g. business ethics, medical ethics, sports ethics, military ethics, Muslim ethics, etc. Professional ethics are standards of conduct that apply to people who occupy a professional occupation or role (Bayles 1988). A person who enters a profession acquires ethical obligations because society trusts them to provide valuable goods and services that cannot be provided unless their conduct conforms to certain standards. Professionals who fail to live up to their ethical obligations betray this trust. For instance, physicians have a special duty to maintain confidentiality that goes way beyond their moral duties to respect privacy. A physician who breaks confidentiality compromises her ability to provide a valuable service and she betrays society’s (and the patient’s) trust. Professional standards studied by ethicists include medical ethics, legal ethics, mass media ethics, and engineering ethics, to name but a few. . . . (Resnik 1998, 13–14).
So heading (2) could be rephrased as “Is there a distinctive ethics of science?” This question is more controversial. Many people want to see science as a “value-free” enterprise, on which the only moral question is what we do with the results of science. Some will argue that placing limits on what individual scientists study will have negative consequences for science (both at the level of individual motivation and at the epistemic level).

The reading for Thursday, from Philip Kitcher’s important book Science, Truth, and Democracy, addresses these issues. Of course, for research that is publicly funded, the idea that scientists should have free rein to pursue whatever questions interest them is clearly spurious. We clearly do not have an obligation to fund their individual whims! This raises the questions under my heading (3) above: how should we go about ordering our scientific priorities? There is a practical question here about how our democracy should function in this respect. But there is also (arguably) a moral issue that parallels those faced by individual scientists. Do we have any duties to direct our collective resources toward some projects over others?

Reading advice: While both Resnik and Kitcher are much more straightforward writers than Feyerabend, my advice from when we read Against Method still applies. Since we are reading just a chapter or two of a whole book, you’ll come across some references that will be obscure. Don’t let that derail you: focus on the message that is specific to the chapter, and concentrate on its argument and assumptions.

Tuesday (11/29): Ethics in the Lab
• Resnik, “Ethical Issues in the Laboratory” (Chapter 7 of The Ethics of Science) [PDF]
• Pence, “The Tuskegee Study” [PDF]* 

Short Writing Assignment:
Do a little internet research: find and briefly describe an instance of a moral/ethical violation in science that was not mentioned in the text. Is it better thought of as a moral or ethical issue? 

Thursday (12/1): The Myth of Purity and Value of Free Inquiry 
• For background, I recommend reading French, Ch. 9.*
• Kitcher, chapters 7–8 of Science, Truth, and Democracy [PDF]

Short Writing Assignment: (write on two)
  1. What is the relationship between the “Myth of Purity” and the broad questions I’ve identified above? 
  2. Briefly describe one way in which we might deny that there is a clean distinction between “pure” and “applied” research. 
  3. Think of an example of scientific research that, while appearing “pure” at first glance, can be seen to be “impure” on further reflection. 
  4. Try to summarize Kitcher’s argument concerning free inquiry. 
  5. Whether or not you agree with him, what do you take to be the significance/breadth of Kitcher’s conclusion?
