Intellectualism and Scientific Literacy

Mastering complexity

[W]e live in a world where it’s de rigueur to know your Shakespeare, Molière or Goethe, but quite all right to be proudly ignorant of Faraday, Pasteur or Einstein. It hasn’t always been that way, and it doesn’t have to be that way. But right now, there’s a trend in society towards scientific apathy, and even antagonism. This is dangerous for us all and it’s incumbent on the scientific community to address the issue.

I think it’s de rigueur to know your Shakespeare, Molière or Goethe if you want to claim to be an intellectual (which, as I have said, I do not). But I think one must note that literacy is a term we associate with a minimum level of capability. One who is literate can read, but that does not mean that said person will be able to appreciate the works of Shakespeare (or Molière or Goethe). That next level is where we find interactional expertise, and we need to be clear whether we expect this, or simply literacy. But anyone claiming to be an intellectual cannot legitimately exclude math and science from their arsenal.

Somewhat related to this topic, I have to say that Jennifer is right, in Meet Me Halfway:

It’s frustrating. That frustration is often expressed in a renewed cracking of the whip, insisting that scientists just need to do better in communicating via public outreach. While I agree that the scientific community should (and is) working to improve in that area — heck, I do this for a living and still am constantly striving to improve! — what Hasson’s research clearly shows is that genuine communication is a two-way street. Scientists — a.k.a., the speakers — are only half of the equation, and thus they are only half of the problem.

The other half of the equation is the listeners; any type of communication will fail if it doesn’t have a receptive audience. And I’d go one step further. We tend to think of listening as a passive act, but it actually requires some effort to achieve that elusive connection. Particularly when it comes to bridging a gap, as with scientists and the general public, the listeners need to be more actively engaged, more invested in having a true conversation.

This is a view I’ve held for a long time. There are concepts that do require years of college to get a handle on, and reading a pop-sci book is not a substitute. You have to go out and expend some effort to gain interactional expertise if you want to be part of the conversation.

All of which ties in to a session I attended on scientific literacy (“Is encouraging scientific literacy more than telling people what they need to know?”) at ScienceOnline 2012. We agreed that it’s important, because science appears in many places and people need to be able to make informed decisions. But in light of Jennifer’s post, I think one must add that people need to be motivated to want to make informed decisions, and to take steps toward that end.

There was an interesting exercise in which the (Canadian) moderators gave a short dialogue about a curling result, and used the collective sports (il)literacy as an analogy for science (though it’s not the first time one might have thought of this). Since I lived in Canada for 2.5 years and am familiar with curling, though, I didn’t get the full benefit of the exercise.

The FTL neutrino experiment came up as well, and I wish I had better notes because I don’t recall exactly what the objection was. It was something about conflicting information being presented, but that happens because most physicists are not neutrino experts, and there’s a difference between literacy and expertise. I pointed out that in some ways, the issue actually raised scientific literacy, because it was a demonstration of the scientific process.

There was a very interesting example given by one of the moderators (Catherine Anderson, with whom I talked at length about this after the session) about some science-camp exercises modeled on a CSI investigation. Clues were given and the students had to gather evidence and make their case, but one of the driving lessons of the exercise was that there was no right answer, just as in any part of “real” science: you do your experiment and then have to interpret the results. There’s no “right answer” to compare against, which is one of the tougher concepts I had to convey in introductory physics labs back when I was doing that sort of thing. Students come in with the idea that experimental error is simply the difference between what they got and what the textbook said they should get.

I occasionally try to think of ways one could run a lab where the “right” answer isn’t available, so the students would have the chance to do something comparable to a real scientific investigation. Understanding the process of science, and how uncertainty/error fits in, is a big part of scientific literacy.
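To make the uncertainty point concrete, here is a minimal sketch (with made-up sample data, not from any actual lab) of how one might report a pendulum measurement of g as a value with an uncertainty, rather than as an “error” from a textbook answer:

```python
# Toy version of the "no right answer" lab idea: students time a pendulum,
# derive g from T = 2*pi*sqrt(L/g), and report a value with an uncertainty
# instead of a deviation from the textbook value.
# The numbers below are invented sample data for illustration.
import math
import statistics

L_m = 0.750  # pendulum length in meters (treated as exact for simplicity)
periods_s = [1.74, 1.72, 1.75, 1.73, 1.76, 1.74, 1.73, 1.75]  # timed periods

T_mean = statistics.mean(periods_s)
# Standard error of the mean: sample stdev / sqrt(N)
T_sem = statistics.stdev(periods_s) / math.sqrt(len(periods_s))

# g = 4*pi^2 * L / T^2; since g depends on T^-2, the relative
# uncertainty in g is twice the relative uncertainty in T.
g = 4 * math.pi**2 * L_m / T_mean**2
g_unc = g * 2 * (T_sem / T_mean)

print(f"g = {g:.2f} +/- {g_unc:.2f} m/s^2")
```

The point of the exercise is that the result stands on its own: the students quantify how well they know their own number, instead of grading it against an answer key.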