Guest post by Kindred Spirits
So you think science is the antidote to the sloppy emotional thinking shown in the last few posts? Alas, scientists and scientific funding are subject to our non-rational brains too. Richard Feynman was a Nobel Prize-winning physicist who worked on the Manhattan Project, which created the atomic bomb in World War II. In the essay excerpted below, Feynman discusses some of the many ways scientists can fool themselves, all illustrations of his dictum that the “…first principle is that you must not fool yourself — and you are the easiest person to fool.”
Cargo Cult Science, by Richard Feynman:
We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops and got an answer which we now know not to be quite right. It’s a little bit off, because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of the electron, after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.
Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.
But this long history of learning how to not fool ourselves—of having utter scientific integrity—is, I’m sorry to say, something that we haven’t specifically included in any particular course that I know of. We just hope you’ve caught on by osmosis.
The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.
In another essay, Feynman argues that religion has a role in ethics, even though the metaphysics of religion is doubtful. He also investigated various mystical and altered mental states (e.g., in sensory deprivation chambers), and seemingly concluded that while the phenomena were real, they did not prove that any religious metaphysics was true. You can read more of his thoughts in: The Relation of Science and Religion, by Richard Feynman.
Lastly, Feynman worked on creating the atomic bomb. He originally joined the project knowing that the Germans were also working on an atomic bomb. However, after Germany surrendered, the target was switched from the Germans to the Japanese, who were not developing an atomic bomb, and he didn’t even question that change of target at the time. (In an interview I saw, he seemed to regard this as an ethical failure on his part. Alas, I could not find the video clip.)
So the question is: did science help Feynman make this ethical judgment? And did he make the right one?