I recently had an interesting discussion with someone who is interested in science, but without training or experience as a scientist. The question was, basically: Why don’t we (scientists) all just lie to each other? i.e. what compels scientists to truthfully share their research results? It’s a fair question — we’re human and competitive to some degree, and at first blush there would seem to be a lot to gain from keeping competitors off-balance by feeding them false clues.
I will draw a distinction here between non-cooperation, i.e. secrecy, and deception. Certainly there are endeavors where information sharing is limited — corporate and military research have their secrets, and I suppose that some endeavors might actually try to mislead the competition. This secrecy is (in my experience) rare in the more open environment of academia. Why is this? There are a couple of factors.
Science is big — really big — and a built-in symbiosis has arisen. Nobody can possibly study every area of science, so we have to be able to trust that information we get from elsewhere is valid. The so-called scientific method has developed ways to do this: we like to confirm experimental results by replicating the work, doing a similar experiment, or doing a more advanced experiment that uses the results. That gives science a trust-but-verify attitude, and knowing that results are going to be checked is a motivation to be truthful. Reputations are at stake, and even if there were no peer review you wouldn’t want to be known as an untrustworthy researcher — if you are consistently wrong, whether by sloppiness or deceit, nobody will pay attention to you. So part of the answer is peer pressure.
Science is big in terms of career length as well — careers span decades, versus the time scale of experiments or, more importantly, of publications and talks, which is measured in months to a few years. If a scientist is going to engage in deception in order to complete an experiment first, the benefit has to last beyond that one experiment. And it doesn’t. Some level of cooperation serves any scientist’s interests much more than subterfuge.
In science, information is a valuable commodity. You publish and present results at conferences with the expectation that everyone else will do the same, and beyond this, you have discussions in the hallways or at meals, in which you discuss details that never make it into the papers or talks. Just reading a paper does not give you all of the information necessary to reproduce the work — a paper might describe a certain layout for an experiment, but what isn’t discussed is how to get the experiment to work. The nice graph of data that has been published, for example, is the end result. There is no information about what knobs you have to tweak to get from an ugly, noisy signal to the nice one, and there is usually no discussion of the false paths one went down in pursuing the result. That’s what you talk about when you are trading information, and the value is that it’s a huge time-saver. It’s one justification for the graduate school system, where acquiring experience is given some value (overvalued, perhaps, by those paying salaries). Knowing what not to do is a benefit in terms of both time and money.
Bucking the system, then, threatens to cut you off from that flow of information. Someone more familiar with the math could no doubt go into details, but I’m sure that it’s some application of game theory. While you might be smarter than everyone else, you are not smarter than everyone else put together — even if you started with a lead, isolation will eventually leave you far behind the pack, because you have to try all of the dead ends yourself while the cooperative researchers share that information and save time. Taking the middle ground of simply not sharing is no better: people will stop talking to you once they realize that the relationship is asymmetric.
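To put a rough number on that intuition, here is a toy simulation — just an illustrative sketch I made up, not a proper game-theory treatment, and the parameters (50 candidate approaches, one of which works, five researchers) are arbitrary. Each researcher tries one approach per unit of time in a private random order; a group that shares its dead ends skips anything already known to fail.

```python
import random

def search_time(n_researchers, n_approaches, share):
    """Toy model: one of n_approaches works, the rest are dead ends.
    Each researcher tries one approach per time step in a private random
    order.  If share is True, a dead end found by anyone is skipped by
    everyone else from then on."""
    good = random.randrange(n_approaches)
    queues = [random.sample(range(n_approaches), n_approaches)
              for _ in range(n_researchers)]
    known_dead = set()
    for step in range(1, n_approaches + 1):
        for queue in queues:
            # Skip approaches already known (through sharing) to be dead ends.
            while share and queue[0] in known_dead:
                queue.pop(0)
            attempt = queue.pop(0)
            if attempt == good:
                return step
            known_dead.add(attempt)
    return n_approaches  # not reached: someone finds the answer in time

def average_time(n_researchers, share, trials=2000, n_approaches=50):
    return sum(search_time(n_researchers, n_approaches, share)
               for _ in range(trials)) / trials

if __name__ == "__main__":
    print("lone researcher:          ", average_time(1, share=False))
    print("5 researchers, no sharing:", average_time(5, share=False))
    print("5 researchers, sharing:   ", average_time(5, share=True))
```

On a typical run the sharing group finds the working approach fastest, and the lone researcher comes in last by a wide margin — which is the point: the isolated scientist pays for every dead end personally.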
Since I work for the military, I run into this conflict between sharing and keeping secrets, which divides pretty much along the lines of scientists vs. military staff. You have to make the case that not sharing has a price — the asymmetry in information flow would quickly cut you off from the juicy details, and cost your program time and money in getting the desired results. That usually works.
There’s another consequence of the value of information sharing, and I think it contributes to the scientific community’s attitude toward fraud. There are some self-policing communities that seem to have an “I have your back if you have mine” attitude, and so misdeeds are sometimes not punished severely. But in science, fraud is a death knell. Part of this is because one of the parties to the research can’t be convinced to go along: nature itself. If I falsify data, there will come a time when it impacts someone else’s work, and since I don’t have any control over the results of their experiment, the discrepancy will eventually come to light. And because, as I said, human nature might tempt us, the punishment has to be severe, such as revoking your degree.