Not Letting the Facts Get in the Way

A case of never letting the source spoil a good story

Of course, this is a problem that generalises well beyond science. Over and again, you read comment pieces that purport to be responding to an earlier piece, but distort the earlier arguments, or miss out the most important ones: they count on it being inconvenient for you to check. There’s also an interesting difference between different media: most bloggers have no institutional credibility, so they must build it by linking transparently and allowing you to double-check their work easily.

But more than anything, because linking to sources is such an easy thing to do and the motivations for avoiding links are so dubious, I’ve detected myself using a new rule of thumb: if you don’t link to primary sources, I just don’t trust you.

The Kobayashi Maru Quarterly

Spaghettification and the problem of scientific jargon

Can’t use jargon, because people don’t understand it. It sounds like advertising mumbo-jumbo, so they don’t trust you. But if you explain the jargon, it sounds like you’re trying to con them.

Short version: damned if you do, damned if you don’t.

Short, short version: we’re effed.

I think it means we have to attack the problem from another direction — raise the level of scientific literacy. (Or if you’re James T Kirk, rig the system)

Blogging: You're Doing it Wrong! (Episode IV: A New Hope)

A new hope that it’s over, and yes, this is the last installment. (Thoughts from Science Online 2011) (Part I) (Part II) (Part III)

I think the session on using the history of science made a very interesting point (besides the fact that looking at history is interesting, and that quote-mining to misrepresent your opponent has been going on for a looong time) — that one has to view science in the context of its time, because we always judge information through the lens of what we think is right, and it’s too easy to fall into the trap of thinking that what we know today is absolutely right, when it isn’t — we’ll keep learning more, and finding out that some of what we think we know isn’t accurate. It’s easy to scoff at what people thought was true N years ago, but we have the benefit of hindsight. I am reminded of something from James Burke (in The Day the Universe Changed), when someone had made a condescending remark about people thinking the earth was the center of the solar system:

Somebody apparently once went up to the great philosopher Wittgenstein and said “What a lot of morons people back in the Middle Ages must have been to have looked every morning at what’s going on behind me now, the dawn, and to have thought that what they were seeing was the sun going around the Earth, when as every schoolkid knows the Earth goes around the sun and it doesn’t take too many brains to understand that.”

To which Wittgenstein replied, “Yeah, but I wonder what it would have looked like if the Sun had been going around the Earth.”

The point being, of course, that it would have looked exactly the same.

You see what your knowledge tells you you’re seeing.

What you think the universe is, and how you react to that in everything you do, depends on what you know. And when that knowledge changes, for you, the universe changes. And that is as true for the whole of society as it is for the individual. We all are what we all know today. What we knew yesterday was different, and so were we.

(It’s from the first two minutes of this episode. The whole series is wonderful.)

Put another way, as an example, nobody was likely to come up with relativity as we know it as long as the notion of the ether was ensconced in everyone’s brains. The presence of a medium was the way things were interpreted, up until you couldn’t think about it that way anymore. And there could be something better than relativity out there.

Which leads me to think that perhaps it’s better to view what we know today as being less wrong than what we knew yesterday.

——

Another bit of insight came from the “It’s all Geek to me” session, where geekdom and snark were discussed. One of the moderators made the observation that geeks (speaking of geeks versus non-geeks, i.e. without going into the difference between geeks, nerds, dweebs and dorks, which is shown here) place a high value on information. And my observation, as an extension to that, was that geeks aren’t generally offended by being corrected, as opposed to the non-geek world, where pointing out errors is often considered rude. A geek doesn’t take it personally — being in error isn’t a character flaw (an honest mistake is not the same as lying), and in any discussion it’s better to proceed from the truth than from a mistake.

As far as the snark goes, I’ll just say this — I use snark (OMG, stop the internet!). But snark can’t be a substitute for an answer. I say this more in the context of forum discussions than a blog. Scoffing at someone’s ignorance isn’t productive; for any fact or concept, there was a time when each of us didn’t know it, and we all have huge areas of ignorance. So snark — as a first response — kills discussion. It sends inquisitive people away. But for someone who is proudly and willfully ignorant and shows they aren’t interested in honest discussion, I say, “Fire for effect.”

——

I also heard the science cheerleader talk, and learned about an interesting method of outreach that also destroys some misconceptions: having a professional (pro-sports-level) cheerleader point out that she has a technical degree shatters some old but tenacious stereotypes. I also heard about Science for Citizens, which allows average Joes and Janes to contribute to science projects (measuring precipitation where you live, taking samples from streams and ponds, and many other activities, which might be an extension of what you do for work or a hobby anyway).

Blogging: You're Doing it Wrong! (Part III)

(Thoughts from Science Online 2011) (Part I) (Part II)

One of the sessions I was really looking forward to was entitled “The Entertainment Factor – Communicating Science with Humor,” moderated by Brian Malow and Joanne Manaster. Because I am really interested in saving on my long distance — er, incorporating humor into my posts. There was a lot of good advice in the session, stopping just short of telling people how to be funny — which isn’t a shortcoming, because I don’t think you can do that. There are plenty of opportunities for humor, because we tend to use analogies to explain science concepts, and humor is often about exaggerating absurdities when making comparisons, so there’s a natural fit — you can even point out where the analogy fails, as it always will, since it’s a simplification. And telling funny stories about life in the lab is common enough, because that happens to everyone.

I think there are several “don’ts” that go along with this as well. Don’t force the joke — this is a subset of some of the other writing tips I heard in other sessions. You may have this great line, be it a joke or some science tidbit, that you really want to include in your post, but it has to flow naturally from the writing. If you “write around the joke,” it’s usually going to end up as bad writing. It’s like laying tile on a floor — you might have a great pattern in the center of the room, but none of the edges match up with the walls and obstacles, and it ends up being a crappy job because you can’t trim all the tiles to fit. Bite the bullet and let go of the notion that the great pattern has to be exactly centered.

Another limitation is the balance between writing to your audience and not excluding too many potential readers. Brian had asked me during the break preceding the session to say a bit about science cartooning, which I did when discussing this. A cartoon — especially a one-panel cartoon — doesn’t afford the opportunity for background explanation. So if you do a cartoon that relies on the reader knowing the physics “spherical cow” joke, such as this, you are going to exclude potential readers who aren’t familiar with it. And the topic of your post may not lend itself to explaining the joke. I imagine the same applies to science standup comedy as well. You can’t do a couple-minute lecture just to set up a joke and be effective at it. You either have to have an audience that already has the background, or already be in a position where you are explaining the concept. Otherwise it won’t work.

Here’s another perspective on the session, at Observations of a nerd: So this biologist walks into a bar…

——

Completely unrelated to this was a session called “How Can We Maintain High Journalism Standards on the Web,” and it was attended mostly by the professionals. Most of the session focused on ethics standards, disclosure, and avoiding the appearance of bias, which means Pepsigate came up (surprise!), along with other related subjects. I get that most responsible journalists don’t want their work tainted by the appearance that they are endorsing a product or service — an appearance that can be created by links, undisclosed sponsorships, or targeted ads. A lot of their credibility is tied up in their objectivity. But I think there’s more to it. One thing that was mentioned only briefly was knowing what you are writing about, and from the perspective of a scientist who happens to write a blog, that’s where my credibility is. And it wasn’t clear that the writers understood this, or to what extent they understood this, because the conversation never went in that direction. Yes, it’s bad if you have taken money or some kind of favor from the target of your writing, but that really doesn’t come up much when you write a post translating a research result for a wider audience. What’s important there is getting the science right. Because if the part you know and understand is wrong, you lose confidence that the author got any of the rest of the article right.

Coupled to this is the idea that being objective means staying out of the fray and reporting both sides of a story. Not getting involved in the dispute. This comes up in stories that have a political or ideological slant to them, but what bothers me is the effort that goes into making sure both sides of the story are heard, and the lack of effort that goes into pointing out that the two sides are not equal. When the weight of evidence is heavily on one side, and the opposing arguments are weak and full of quote mining and cherry-picking, being fair means pointing this out. It’s not presenting the arguments as being equally valid — that’s giving extra weight to one side in order to balance the see-saw when the situation is inherently unbalanced. The reporter needs to stand on the fulcrum.

But the session didn’t afford me the chance to discuss that. Good thing I have a blog. Both things are important. As one participant put it, your allegiance is to the readers. This means you have to be free from bias, or at least disclose potential sources of bias, but you also have to be correct in what you are writing about and present a story with the least amount of distortion in it. It does no good to write a story on global warming, taking great care to avoid any appearance you’ve been influenced by big oil (or big research, if there is such a thing), only to get the science wrong and give the impression, say, that there has been no warming since 1998. It would be like implying that George Will has any idea what he’s talking about. There’s no scientific integrity there.

I think the lesson here is this: there is more than one thing that props up one’s credibility.

Part IV

So This Virgin Was Explaining Sex to Me …

Horrible Article On Becoming A Physicist

This person by the name of “Timothy Sexton” (BA in English) somehow thought that he could write an article titled “How to Become a Physicist and What to Expect when You Become One”. Now, before we examine his article, tell me something. What are the chances that someone who has never obtained a degree in physics, and has never worked in physics, would know well enough what one needs to do to become a physicist, and then know what to expect when one becomes a physicist?

Making It Sound Worse Than it Really Is

Chevy Volt, Nissan Leaf post small December sales

This was the year General Motors Co. and Nissan made good on their promise to bring mass-produced electric cars to the market. But don’t count on seeing one in traffic soon. Sales so far have been microscopic and they’re likely to stay that way for some time because of limited supplies.

GM sold between 250 and 350 Chevy Volts this month and Nissan’s sales totaled less than 10 Leaf sedans in the past two weeks. Production for both is slowly ramping up.

It will be well into 2012 before both the Volt and Leaf are available nationwide. And if you’re interested in buying one, you’ll need to get behind the 50,000 people already on waiting lists.

One might argue that extremely limited availability doesn’t count as “bring(ing) mass-produced electric cars to the market.” It’s bringing a small number of cars to market. The headline makes it seem like there isn’t much demand, rather than that the companies are selling every electric car they’ve made.

Read This, Even Though My Data are Faked

Did you know that 73.4% of statistics are made up?

I’ve run across several posts all referencing a recent journal article, Retractions in the scientific literature: do authors deliberately commit research fraud?, which parrot (and possibly distort) the information in the abstract — that US researchers are the worst purveyors of fraud — such as this article: US Scientists Significantly More Likely to Publish Fake Research, Study Finds

I wanted to check on that, because it’s not an unknown phenomenon for an article to incorrectly summarize research, and so I looked at the article, linked above, to see the abstract:

All 788 English language research papers retracted from the PubMed database between 2000 and 2010 were evaluated.

Well, that’s a bit of a bias, since people in the US are more likely to publish in English-language journals, which isn’t necessarily true for countries where English is not the native language. It also assumes all fraud is caught and results in a retraction. But beyond that, I wanted numbers to look at, since I know there are a lot of articles published in the US, and if they are simply saying that more fraudulent articles are published in the US, that’s pretty meaningless. While I don’t have access to the journal, it turns out that an analysis has already been done: US scientists “more prone” to fake research? No., with some followup in Rates of Scientific Fraud Retractions.

The likelihood of a given paper being retracted as fraudulent is higher for China and India than for the US, and significantly so. The finding that the fraud rate is higher in higher-impact journals may be due to those journals receiving more scrutiny — we may simply be missing fraud in journals that are not widely read.

I think it’s also important to note (as the paper’s author does) that the overall rate of fraud is low. Using these criteria, it is less than 200 cases out of more than six million papers, or 0.0032%. In other words, for every 31,000 journal articles you read (from all sources), on average one of them will be fraudulent. If you limit yourself to US authors, the number drops to one in only 21,600.
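Just to make the arithmetic explicit, here is a minimal sketch that redoes those rates. The inputs are the rounded figures quoted above (the six-and-a-quarter-million total is my round guess at “more than six million,” chosen only so the percentage works out), not numbers pulled from the paper itself.

```python
# Redo the rates quoted above. Inputs are the post's rounded figures, not
# values taken from the paper; the total is an assumed round number.
fraud_cases = 200        # "less than 200 cases" -- treated as an upper bound
total_papers = 6.25e6    # "more than six million papers" -- assumed figure

rate = fraud_cases / total_papers
print(f"overall fraud rate: {rate:.4%}")         # 0.0032%
print(f"about 1 in {1 / rate:,.0f} papers")      # 1 in 31,250

# The US-only figure of 1 in 21,600 quoted above corresponds to:
print(f"US-only rate: {1 / 21600:.4%}")          # ~0.0046%, above the overall average,
                                                 # though the linked analysis puts
                                                 # China and India higher still
```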

Demonic Journalists Turn Truth Into Fiction

(Apologies if this is overly rant-y. I’ve been suffering through a discussion with a creationist that boils down to “information” can’t increase, therefore evolution is impossible. I’m therefore a little sensitive to the topic of information and thermodynamics.)

Demonic device converts information to energy

The laws of physics say that you can’t get energy for nothing — worse still, you will always get out of a system less energy than you put in. But a nanoscale experiment inspired by a nineteenth-century paradox that seemed to break those laws now shows that you can generate energy from information.

Of course, the key word is seemed. This is my main peeve here: the all-too-common insinuation that some law of nature has been violated.

Of course, later on (not paragraph 19, though), they admit

The experiment does not actually violate the second law of thermodynamics, because in the system as a whole, energy must be consumed by the equipment — and the experimenters — to monitor the bead and switch the voltage as needed. But it does show that information can be used as a medium to transfer energy, says Sano. The bead is driven as a mini-rotor, with an information-to-energy conversion efficiency of 28%.

I’m not sure how they get to “information is a medium to transfer energy,” and from that to “information is converted to energy.”

There’s a somewhat better article, via Dr. SkySkull:

[T]here is energy in information. To store a bit of information a system like a computer memory needs to be put into a defined state, either a ‘1’ or a ‘0’.

That seems more reasonable — storing information (and all information needs to be stored) requires energy, so there is energy in information storage. But what is being presented as “information” is just the state of a system; one could just as easily say there is energy in e.g. an electron’s orientation in a magnetic field (or the location of a polystyrene bead, as in this case), and skip the discussion about information.

It sounds like a version of the Brownian ratchet, where a paddle would spin in only one direction from random collisions, because the ratchet would impede motion in the other direction. That fails because the ratchet, too, is subject to collisions, so the device won’t work if everything is at the same temperature. Here, the mechanical ratchet has been replaced by an electric field, which is not in thermal equilibrium. You expend energy determining when to change the field.
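For a sense of scale — and this is added context, not something spelled out in the post or the quoted article — the standard Szilard/Landauer result is that a feedback “demon” of this sort can extract at most kT ln 2 of work per bit of information it acquires. A rough sketch at room temperature, reading the quoted 28% conversion efficiency as a fraction of that bound (my reading, so treat it as an assumption):

```python
import math

# Maximum work a feedback "demon" can extract per bit of information it gains:
# kT * ln(2) (Szilard/Landauer bound). Temperature is assumed ~room temperature.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # K, assumed ambient temperature

w_max_per_bit = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K: {w_max_per_bit:.2e} J per bit")     # ~2.9e-21 J

# If the quoted 28% "information-to-energy conversion efficiency" is taken as
# a fraction of that bound (my assumption), the bead gains roughly:
print(f"28% of the bound: {0.28 * w_max_per_bit:.2e} J per bit")  # ~8e-22 J
```

Those are tiny amounts of energy, which is part of why it takes a nanoscale bead and careful feedback control to see the effect at all.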

I’m glad I didn’t run into any stories that called it “pure energy.” Oh, crud.

Edit (11/22): Sean does a summary, which makes a lot more sense.

The connection is not that “information carries energy”; if I tell you some information about gas particles in a box, that doesn’t change their total energy. But it does help you extract that energy.