The Scientists' Dilemma

I recently had an interesting discussion with someone who is interested in science, but has no training or experience as a scientist. The question was, basically: why don’t we (scientists) all just lie to each other? That is, what compels scientists to share their research results truthfully? It’s a fair question — we’re human and competitive to some degree, and at first blush there would seem to be a lot to gain from keeping competitors off-balance by feeding them false clues.

I will draw a distinction here between non-cooperation, i.e. secrecy, and deception. Certainly there are endeavors where information sharing is limited — corporate and military research have their secrets, and I suppose some players might actually try to mislead the competition. Such secrecy is (in my experience) rare in the more open environment of academia. Why is this? There are a couple of factors.

Science is big — really big — and a built-in symbiosis has arisen. Nobody can possibly study every area of science, so we have to be able to trust that the information we get from elsewhere is valid. The so-called scientific method has developed ways to do this: we like to confirm experimental results by replicating the work, doing a similar experiment, or doing a more advanced experiment that builds on the results. This gives a trust-but-verify attitude, and knowing that results are going to be checked is a motivation to be truthful. Reputations are at stake, and even if there were no peer review, you wouldn’t want to be known as an untrustworthy researcher — if you are consistently wrong, whether by sloppiness or deceit, nobody will pay attention to you. So part of the answer is peer pressure.

Science is big in terms of career length as well — decades, vs the time scale of experiments or, more importantly, publications and talks, which is measured in months to a few years. If a scientist is going to engage in deception in order to complete an experiment first, the benefit has to last beyond that one experiment. And it doesn’t. Some level of cooperation serves any scientist’s interests much more than subterfuge.

In science, information is a valuable commodity. You publish and present results at conferences with the expectation that everyone else will do the same, and beyond this, you have discussions in the hallways or at meals, where you trade details that never make it into the papers or talks. Just reading a paper does not give you all of the information necessary to reproduce the work — a paper might describe a certain layout for an experiment, but what isn’t discussed is how to get the experiment to work. The nice graph of data that has been published, for example, is the end result. There is no information about what knobs you have to tweak to get from an ugly, noisy signal to the nice one, and there is usually no discussion of the false paths taken in pursuing the result. That’s what you talk about when you are trading information, and the value is that it’s a huge time-saver. It’s one justification for the graduate school system, where acquiring experience is given some value (overvalued, perhaps, by those paying the salaries). Knowing what not to do is a benefit in terms of both time and money.

Bucking the system, then, threatens to cut you off from that flow of information. Someone more familiar with the math could no doubt go into details, but I’m sure it’s some application of game theory, as the sketch below illustrates. While you might be smarter than everyone else, you are not smarter than everyone else put together — even if you started with a lead, isolation will eventually leave you far behind the pack, as you have to try all of the dead ends yourself while the cooperative researchers share that information and save time. Taking the middle ground of simply not sharing is no better. People will stop talking to you once they realize that the relationship is asymmetric.
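
For the mathematically inclined, here is a toy sketch of that intuition: the standard iterated prisoner’s dilemma, with “share” and “withhold” standing in for cooperation and defection, and the usual textbook payoff values. It’s a cartoon with made-up numbers, not a model of any real collaboration.

```python
# Toy iterated prisoner's dilemma: "share" = pass along your hard-won lab
# tricks, "withhold" = keep them to yourself. Payoffs are the conventional
# textbook values, not anything measured.

PAYOFF = {  # (my move, their move) -> my payoff
    ("share", "share"): 3,        # both labs save time
    ("share", "withhold"): 0,     # I gave away info and got nothing back
    ("withhold", "share"): 5,     # short-term edge from free information
    ("withhold", "withhold"): 1,  # both labs grind through the dead ends alone
}

def tit_for_tat(opponent_history):
    """Share first, then mirror whatever the other lab did last time."""
    return opponent_history[-1] if opponent_history else "share"

def always_withhold(opponent_history):
    """Never share, no matter what."""
    return "withhold"

def play(strategy_a, strategy_b, rounds=50):
    """Run repeated encounters and return each side's total payoff."""
    seen_by_a, seen_by_b = [], []  # what each side has observed the other do
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a)
        move_b = strategy_b(seen_by_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        seen_by_a.append(move_b)
        seen_by_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (150, 150): mutual sharing
print(play(tit_for_tat, always_withhold))  # (49, 54): one free round, then frozen out
```

The habitual withholder squeaks out a head-to-head win (54 to 49), but a pair of sharers each collect 150 per pairing. In a community of sharers, the chronic withholder falls far behind the pack, which is just the point about not being smarter than everyone else put together.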

Since I work for the military, I run into the conflict between sharing and keeping secrets, and the divide is pretty much along the lines of scientists vs military staff. You have to make the case that not sharing has a price — the information-flow asymmetry would quickly shut you off from the juicy details, and cost your program time and money in getting the desired results. That usually works.

There’s another consequence of the value of information sharing, and I think it contributes to the scientific community’s attitude toward fraud. There are some self-policing communities that seem to have an “I have your back if you have mine” attitude and so misdeeds are sometimes not punished severely. But in science, fraud is a death knell. Part of this is because one of the parties to research can’t be convinced to go along: nature itself. If I falsify data, there will come a time when this impacts someone else, and I don’t have any control over the results of someone else’s experiment. And because, as I said, human nature might tempt us, the punishment has to be severe, such as revoking your degree.

8 thoughts on “The Scientists' Dilemma”

  1. The one dilemma I often face is whether or not to share work in progress. Usually, though, people have plenty of their own ideas to work on and are not interested in stealing mine. I find it very useful to get opinions and guidance from experts.

  2. That is a good blog post about an important detail of science outreach. What is described so well above is apparently counter-intuitive to some points of view. Whether it is the fault of the explainer or the listener, a relatively small number of people without science backgrounds can’t give themselves permission to understand science and keep seeing science sharing as something risky. Perhaps those without science backgrounds are carrying over norms from a literary point of view, where the mark of quality is uniqueness. Scientific quality, by contrast, is tied to replication and to the convergence of independently confirmed objective measures.

    An example would be a discussion of a scientific model in which the person without science experience defines the background information. If a scientist joins that discussion and responds with the foreground information, the person without science training will sometimes react with hostility, because they understand science only as explaining “what”, while the scientist’s normal response is to describe “how”, which necessarily includes the ideas and work of others in the form of comparisons between more than one model.

    Discussing the limitations of the model as the person without a science background has framed it is, unfortunately, sometimes seen as public embarrassment or attack, owing to unfamiliarity with how scientists appraise information. The scientist’s reply thus comes across as criticism of their idea. Overcoming this communication challenge can take quite a bit of energy.

  3. I once heard a story that when the first high-Tc superconductors were discovered, the team put an intentional typo in the material’s composition in the paper submitted for refereeing, to keep the material secret prior to publication. (I couldn’t find a reference for the story.)

  4. @Ron
    This story is related in (I believe) the book “Breakthrough: the race for the superconductor”. The error was real: the compound is YBa2Cu3O7, but it was written YbBa…, implying ytterbium instead of yttrium, though the authors claim it was a genuine typo. The material is usually referred to as YBCO – yttrium (or ytterbium…) barium copper oxide – so a single slip was possible, but it did not seem very plausible (an error in the single most important detail of the paper?) to the other scientists, who rushed off to try to make YbBa2Cu3O7.

  5. I think the OP’s feeling for cooperation and sharing in science may also be biased because of his background (I assume) in atomic physics. My impression is that physicists are generally more open/sharing/cooperative with other physicists than say, biologists are with other biologists. Part of this may be culture, but part may be the nature of the experiments done. If a physicist gives you the recipe for how she constructed her vacuum apparatus/laser lock/whatever, it’ll still be something on the order of a year before you can duplicate her apparatus. If a chemist or biologist gives you the recipe for how he synthesizes some chemical/enzyme/whatever, you can now synthesize that exact chemical just as fast as the original chemist can.

    So in physics you can be super “friendly” and still retain a competitive advantage, while in other fields of science that is less true.

    Also, for what it’s worth, I hear that AMO physics is even more “friendly” than other fields of physics. I’ve heard stories that this is partly due to some of the “elder statesmen” in the field taking people aside when they were being jerks and explaining that’s not how things are done in our field. Whether this is true or not, I dunno.

  6. AC,

    It may certainly be true that some disciplines are more open to sharing, but I think that’s in line with my discussion — the willingness to share will bear some relation to the value of the information. But these people still publish their results, and that is still different from deception.

    However, I concede that as the willingness to share decreases, the cost of deception drops as well. Requiring publication ensures that results will continue to be shared.

  7. Remember Velikovsky? He was not a scientist but a psychiatrist, yet he wrote several books, like “Worlds in Collision”, that speculated about astronomy based mainly on the Bible. (I know, the mind boggles.) Carl Sagan wrote about the fact that many scientists he talked to knew that Velikovsky was wrong about their own area of science, yet gave him credit for being right about areas that they did not know about.

    BTW, one prediction of Velikovsky’s that went against scientific thinking of the time, but panned out, was that Venus would be a hot planet. 😉

  8. Carl Sagan wrote about the fact that many scientists that he talked to knew that Velikovsky was wrong about their area of science, yet gave him credit for being right about areas that they did not know about.

    This is also a common feature of scientists who believe that God’s existence can be supported by scientific evidence; the evidence is always in another field. For example, Francis Collins and Ken Miller, who are biologists, find their scientific evidence for God in quantum mechanics and cosmology.
