How'd You Get to Be So Good?

I was trying to track down some details of work-related history and ran across this, which just happened to have my search terms in it (though not in close proximity in the text). It’s a Congressional hearing from 2006 on how the recent NIST Nobel laureates view science policy.

This is not today’s Congress, i.e. the hearing was not chaired by Lamar Smith, with all that that entails. The GOP hasn’t been particularly cozy with science for some time, but this dates to a time when things weren’t quite as bad as today. Plus, this hearing wasn’t discussing social science or global warming.

It’s a transcript, so it’s not polished and there’s a lot of fluff, but there are parts that are quite good. I know from experience that Bill Phillips (Nobel in ’97) and Eric Cornell (’01) are good science communicators; I can’t recall ever hearing Jan Hall (’05) give a talk but his testimony is pretty clear as well.

The hearing will address these overarching questions:

1. Why has NIST been so successful at cultivating Nobel Prize winners?

2. What are the implications of the Nobel Prize-winning research at NIST and how can that work get used outside of NIST?

3. What steps are most necessary to improve U.S. performance in math, science and engineering, and U.S. competitiveness?

What directed my attention to the transcript was related to Bill Phillips’ work on laser cooling and trapping:

One application of low-temperature physics is technology to improve the accuracy of atomic clocks. By cooling atoms of cesium, scientists have made atomic clocks that are a billion times more accurate than an ordinary wristwatch.

From Bill’s testimony:

Today, laser-cooled atoms define time. At the naval observatory, they keep time for our military. They synchronize GPS, which guides everything from military jeeps to commercial aircraft. NIST’s standard clock is accurate to less than one second in 60 million years. We like to call this “close enough for government work.”
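As a quick back-of-the-envelope check (my own arithmetic, not part of the testimony), “less than one second in 60 million years” corresponds to a fractional frequency accuracy of a few parts in 10^16:

```python
# Rough arithmetic behind "less than one second in 60 million years"
# (back-of-the-envelope numbers, not from the testimony).
SECONDS_PER_YEAR = 365.25 * 24 * 3600

drift = 1.0                          # seconds of accumulated error
interval = 60e6 * SECONDS_PER_YEAR   # 60 million years, in seconds

fractional_accuracy = drift / interval
print(f"fractional accuracy ~ {fractional_accuracy:.1e}")  # ~5e-16
```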

The naval observatory mention was one part that garnered the hit on Google; apparently he talks us up on pretty much every occasion. We invited him out to visit us last summer when we declared our fountain clock ensemble to be fully operational (and were not subsequently destroyed by the rebel alliance), and got to hang out for a while. One thing we talked about is what he discusses below on government investment in science.

Later on in his prepared statement he describes how he pursued laser cooling: first as a bit of a hobby, with scrounged equipment, but later on as a primary research investigation, with proper funding. And, I might add, with minimal interference from a bureaucracy that might demand immediate commercial application from research (just the normal government bureaucracy to inhibit work). He speaks of realizing the application to clocks, but those clocks and frequency standards didn’t come to fruition for several decades, and even then that was pretty fast for basic research to get going, to make a discovery, and for that discovery to have a significant impact.

Such is the scale of science, and that’s the reason why scaling back on government investment will not be noticed at the commercial level for quite some time. Inertia is the problem here. We’re coasting on older investment, and we won’t be able to quickly (if ever) regain any lead we have should we lose it. You can’t recreate a decade’s worth of research overnight, even if you throw a lot of money at the problem. As the saying goes, it takes a woman nine months to make a baby, but you can’t get nine women together to make a baby in a month. There’s no substitute for continued, deliberate investment in basic research.

Bill Phillips, in his prepared statement:

The invention of the transistor at Bell Telephone Labs set the stage for a booming electronics industry that has sustained much of the U.S. economy. It came from a strong and sustained program in basic research at Bell Labs, one that was mirrored in other industrial labs like RCA, Raytheon, Ford, Xerox, IBM, and so forth. Today, many business analysts seriously contend that AT&T never got a significant return on its research investment and denigrate the value of any long-range, basic research in any industry, focusing instead on very short-term return on investments. Today, Bell Labs is a shadow of its former self in regard to basic research and that sort of far-sighted support of research has virtually disappeared from American industry. I don’t know if we can ever expect to return to the golden age of industrial research, but I strongly believe that we must, as a nation, regain and maintain that level of basic research if we are to remain competitive in a world economy. If industry cannot or will not take its traditional share of this responsibility, I believe that government must compensate.

I think that this is not happening, and things have gotten worse in the last several years as science funding has been cut. We’ll wake up in a decade or two and wonder why so much of the innovation is happening elsewhere and it’s going to be because the government stopped funding science at a level necessary to move forward, mainly because of a powerful few who hated science and blocked its progress. Our “return on investment” can’t be the criterion we use to decide on basic research, because you simply don’t know what you’re going to find.

From Eric Cornell’s statement:

The big question is what is going to be the big new industry of 2020? If I knew the answer, I would not be here in front of you testifying–I’d be off setting up my own high-tech venture capital company instead. No one knows the answer for sure, that is why scientific research and discovery is so important. Without knowing for sure what the next big thing will be, we can remain cautiously optimistic that that big thing will be an American thing.

Remember, this was from 2006. I wonder if his take would be different today, given the trajectory of science funding. But again, note the underlying thought here: it’s research, and you don’t know what you’re going to find until you go out and find it. Any and every interruption can stop research, but it requires time to get it going again. All too often you have to go back to square one and start over from scratch.

Dan Brown Physics

Einstein’s greatest legacy: How demons and angels advanced science

Thought experiments are common in theoretical physics today. Physicists use them to examine the consequences of a theory beyond what is measurable with existing technology, but still within the realm of what is in principle measurable. A thought experiment pushes a theory to its limit and thereby can reveal inconsistencies or novel effects. The rules of the game are that a) only what is measurable is relevant and b) do not fool yourself. This isn’t as easy as it sounds.

Something I run across often is someone with a “great new theory” (at best one of those descriptions is true) or a scenario that supposedly tears down a pillar of physics (usually relativity), who doesn’t realize that a thought experiment doesn’t (dis)prove anything. Absent any comparison with experiment, all a contradiction shows is that your thought process has some problem; it’s pretty easy to assume contradictory things, which wreak havoc on thought experiments. (For the relativity folks, it’s usually a subtle assumption of absolute simultaneity.)
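A small numerical illustration of that last point (my own sketch, with made-up numbers): two events that are simultaneous in one frame are generally not simultaneous in another, which is exactly the assumption these scenarios tend to smuggle in.

```python
# Sketch: two events simultaneous in frame S are not simultaneous in a frame S'
# moving at speed v (Lorentz transformation). Illustrative numbers only.
import math

c = 299_792_458.0    # speed of light, m/s
v = 0.6 * c          # speed of frame S' relative to S
dx = 1000.0          # spatial separation of the two events in S, m
dt = 0.0             # the events are simultaneous in S

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
dt_prime = gamma * (dt - v * dx / c**2)

print(f"time separation in S': {dt_prime:.3e} s")  # nonzero, so not simultaneous in S'
```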

Physics in the Cloud

[Embedded YouTube video]

One thing glossed over: Joe mentions that the muons are undergoing time dilation by some factor, which allows a fraction of them to live long enough to reach Earth, but in their own frame their clocks run normally. How can that work? The missing piece is that the muons see their travel as length contracted by the same factor. They don’t decay because they didn’t travel very far (or for very long), as seen in their frame.
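To put rough numbers on it (my own illustrative values, not taken from the video):

```python
# Rough muon numbers: created ~15 km up, moving at ~0.995c,
# proper lifetime ~2.2 microseconds. Illustrative values only.
import math

c = 3.0e8                 # m/s
v = 0.995 * c
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)   # ~10

altitude = 15e3           # m, in the Earth frame
tau = 2.2e-6              # s, muon proper lifetime

# Earth-frame view: the muon's clock runs slow, so its effective lifetime is gamma * tau.
trip_earth = altitude / v
print(f"Earth frame: trip lasts {trip_earth / (gamma * tau):.1f} dilated lifetimes")

# Muon-frame view: the atmosphere is length-contracted by the same factor gamma,
# so the trip is short; the muon's own clock runs normally.
trip_muon = (altitude / gamma) / v
print(f"Muon frame:  trip lasts {trip_muon / tau:.1f} proper lifetimes")
```

Either way the answer is the same: the trip only costs a couple of lifetimes, so a decent fraction of the muons make it to the ground.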

The link to make a cloud chamber

Here's That Cat Again

Admin note: Been on the road. It’ll be a while before I’m caught up.

Schrödinger’s cat: A thought experiment in quantum mechanics – Chad Orzel

The narrative is good, the art is good, but I think the depiction of electrons with classical trajectories, both in the double slit and orbiting as in Bohr atoms, detracts from this; it arguably sends the wrong message about what’s going on and may reinforce misconceptions. I don’t know if this is simply a problem of illustration, since trajectories are relatively simple to draw and depicting QM properly is trickier. It’s not like I have a simple fix for doing the depiction better, though.

Original Recipe

Relativity and Baseball

Going back to the original idea of relativity: simply looking at the relative motion between objects, and the idea that you are allowed to analyze the physics in the reference frame that makes the analysis easiest.

The other thing [besides analyzing momentum] you can do is to invoke relativity a la Galileo. The problem where both bat and ball are moving is still kind of a pain, mathematically, but if one of the two is stationary, then your life gets a whole lot easier. And we know what happens when a light moving object like a baseball hits a heavier object like a bat that isn’t moving: the ball bounces back at a fair fraction of the speed it came in at, and the bat only moves a little bit.
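Here’s a small sketch of that frame-shifting trick (my own numbers, plus the idealization of a perfectly elastic collision with an effectively heavy bat; none of this is from the original post):

```python
# 1D elastic ball-bat collision, solved in the bat's rest frame and shifted
# back to the ground frame (Galilean relativity). Illustrative numbers only.

m = 0.145        # kg, baseball
M = 5.0          # kg, assumed effective mass of bat plus batter ("much heavier")
u_ball = -40.0   # m/s, pitched ball (negative = toward the batter)
u_bat = 30.0     # m/s, swung bat (positive = toward the pitcher)

# Step 1: move to the bat's rest frame
v_rel = u_ball - u_bat                   # ball's velocity as seen by the bat

# Step 2: standard 1D elastic collision with a stationary target
v_ball_rel = (m - M) / (m + M) * v_rel
v_bat_rel = 2 * m / (m + M) * v_rel

# Step 3: shift back to the ground frame
v_ball = v_ball_rel + u_bat
v_bat = v_bat_rel + u_bat

print(f"ball: {u_ball:+.1f} -> {v_ball:+.1f} m/s")   # bounces back, fast
print(f"bat:  {u_bat:+.1f} -> {v_bat:+.1f} m/s")     # changes only a little
```

The exit speed comes out higher than a real batted ball because the collision is treated as perfectly elastic; the point is just that the bookkeeping gets easy once you work in the frame where the bat is at rest.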

Be Vewwy, Vewwy Quiet

I’m hunting astwonomical objects

Life in the Quiet Zone: West Virginia Town Avoids Electronics for Science

Scientists who use some of the world’s most advanced instruments can’t use a microwave oven to heat their lunch. And then there was the time astronomers were baffled by a mysterious distortion of their data. They had a laugh when they discovered that the errant energy waves were coming from battery-operated fans sold in the facility’s gift shop.

Green Bank was also the target for a facility to house an alternate master clock for the Navy/DoD, because Robert Byrd wanted it, but that was shot down in the early ’90s.

I Got Those Light Emitting Diode Blues

The Nobel prize would cheer someone up. Chad has an excellent summary of this over at Uncertain Principles, which includes many links to more good stuff.

Nobel Prize for Blue LEDs

That [difference between red and blue LEDs] seems like a pretty small change, dude. How is that worth a Nobel?
Well, because it’s really frickin’ hard to do.

It’s not a paradigm shift in terms of the basic physics, but it’s a ton of hard work and new technological development, and richly deserves the Nobel.

Another good piece at Starts With a Bang

It's Not a Proper Cat

Last week there was a splash in the news about a new particle that had been discovered, called a Majorana particle. Sadly, the coverage was disappointing, but since I had spent most of the week standing on my porch and shaking my fist at things (and it’s not really my area of physics), I didn’t blog about it.

In short, this new discovery was a quasi-particle, i.e. a composite system, which was information that was buried in every article I read. To me this is reminiscent of the magnetic monopole coverage from a while back, which also involved a quasi-particle. Interesting, to be sure, but not really what was advertised in the headline.

Turns out, I wasn’t the only one a little miffed at how it was reported. Jon Butterworth was, too.
Majorana particles – Fundamentally confusing

… I was excited to read about the new particle, and somewhat disappointed when I did so to find out that it is not a fundamental Majorana fermion, still less a neutrino. A bit of a let-down for me and my particle-physics colleagues. Nevertheless, the result is interesting for a number of reasons.

What has been seen is a quantum state in a one-atom-thick wire which in a certain energy range behaves like a Majorana fermion. It is not a fundamental particle, it is a composite state, and the behaviour emerges from the interactions of atoms, electrons and photons, described by quantum electrodynamics, in which all fermions are Dirac. The fact that Majorana behaviour has been predicted, and then observed, to emerge as collective behaviour from a “more fundamental” (i.e. higher energy, shorter distance scale) theory is fascinating.

Neat stuff; no need to sex it up with the misleading implication that it’s an actual particle. And a good explanation of what’s what to boot.

Obscure title reference

Have a Little Trouble With Bears, Did Ya?

(You can only go to the “the hard is what makes it great” well so many times.)

Research is Hard

Yes, you have to learn lots of math, physics, programming, and many other related things in order to tackle new and interesting research questions in astronomy. The same is true for many fields. But at the end of the day? We are all banging our heads against walls over a minus sign, or a factor of 2, or mixing up log-10 with natural log, or losing track of which star is which. This kind of “stupid mistake” hurdle is what really makes research hard.

The sheer volume of little details makes it inevitable that at least one will be wrong the first time through any problem, whether theoretical or experimental.

Also note the whole “we first tested this on data where we know the answer” bit. Good protocol.