Even a completely isolated atom senses the temperature of its environment. Just as heat swells the air in a hot-air balloon, so-called “blackbody radiation” (BBR) enlarges the size of the electron clouds within the atom, though to a much lesser degree — by one part in a hundred trillion, a size that poses a severe challenge to precision measurement.
Um, not quite. This is drawing a Bohr-atom-esque connection between orbit size and energy, and implying that it's due to some ideal-gas behavior (hey, things expand when they're hot!). The effect here is called the AC Stark shift, aka the light shift. When you interact with a system, the interaction shifts the location of the energy levels in the system. This is a big problem in any precision experiment where the effect in question depends on the energy difference between two states, and the second is defined in exactly those terms — 9,192,631,770 Hz is the defined difference between the two hyperfine ground states of cesium, in complete theoretical isolation — and this holds true for any transition one might use, including the "you can call me Al+" device in the article. Any interaction with the atoms shifts those energy levels, so you have to know what the interaction is in order to measure that shift. That includes static magnetic and electric fields (the Zeeman and DC Stark shifts); oscillating fields in the form of EM radiation are also a problem. This is why atomic clocks which use lasers have to turn those lasers off while the atoms are "ticking" — the perturbation is huge. Simply accounting for it is not an option, because the shift depends on the intensity, so the correction would be limited by how well you could servo the intensity of the laser light, and the answer is not "nearly well enough to do a part-in-10^18 measurement," by many orders of magnitude.
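To put rough numbers on the intensity problem: the light shift scales linearly with laser intensity, so the residual uncertainty is roughly the size of the shift times the fractional intensity stability of your servo. A back-of-the-envelope sketch (the shift size and servo performance below are illustrative assumptions, not values from the article):

```python
# Illustrative numbers only. The AC Stark shift is proportional to laser
# intensity, so a servo that holds intensity to a fraction f leaves a
# shift uncertainty of about f times the full shift.
nu_clock = 1.12e15        # Hz, approximate Al+ 1S0-3P0 clock frequency
light_shift = 100.0       # Hz, assumed shift with the laser on (illustrative)
servo_fraction = 1e-3     # assumed fractional intensity stability (illustrative)

residual = light_shift * servo_fraction      # Hz of shift left after the servo
fractional_error = residual / nu_clock
print(f"fractional error ~ {fractional_error:.1e}")  # ~9e-17, far above 1e-18
```

Even with these fairly generous assumptions, the leftover shift is tens of times larger than the quoted clock uncertainty — hence turning the lasers off instead of trying to correct for them.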
As the article mentions, blackbody radiation from, well, everything, is present, too. The walls emit radiation, you emit radiation; room-temperature-ish things radiate most strongly near 10 microns, but the peak of the distribution depends on the temperature, which is what's exploited in thermal imaging. There were a few talks on BBR effects at the Frequency/Timing conference I recently attended in San Francisco, including this one, though this result is quoted from its presentation at a different conference. The blackbody radiation shift is one of the larger errors in any frequency standard; while one can measure the temperature of the vacuum system pretty well, what radiation profile the atoms actually see is not known quite as well. Nothing is a true blackbody, and even though you've shut the lasers off, windows in your system can let in thermal radiation from the outside. And then there's the theory, which probably needs to include several orders of effects involving multiple energy states in order to be useful at this level. This was the nail sticking up in the error budgets of the frequency standards, so it's not surprising that it's the one getting hammered down in recent theoretical and experimental work.
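The ~10 micron number comes straight from Wien's displacement law, which gives the peak wavelength of a blackbody spectrum as a function of temperature. A quick check with standard constants:

```python
# Wien's displacement law: lambda_peak = b / T
b = 2.898e-3      # Wien displacement constant, m*K
T_room = 300.0    # K, roughly room temperature

lam_peak = b / T_room
print(f"peak wavelength: {lam_peak * 1e6:.1f} microns")  # ~9.7 microns
```

Run it for 310 K (you) or 77 K (a liquid-nitrogen-cooled shield) and you can see why the radiation environment the atoms actually see depends so strongly on what surrounds them.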
The problem I have with the imagery is twofold. First, the generic "atom gets bigger" picture runs counter to the de Broglie wavelength argument. That atom really isn't hotter, since it's not in thermal equilibrium with the radiation (a single atom can't have a temperature, anyway), but an atom in a cold ensemble is bigger in the de Broglie sense, because it has a smaller momentum, and hotter atoms get smaller in that regard. Second, the AC Stark shift is a tad more complicated than is described here. In an interaction with a two-state system (1 and 2, with 2 having the higher energy), radiation near that resonance will indeed lower |1> and raise |2>. But |2> is a nominally unoccupied state. Even in the Bohr picture, that state isn't what you think of when you look at the size of an orbit; the ground state, which is being pushed down to a lower energy, is what we naively use. In real atoms, with multiple states, the picture is much more complicated (and that's why the theory is as well). The direction of the shift on a state depends on the frequency of the light relative to the transition. If you consider a three-level system, the shift in the |2> state can be in the opposite direction of the shift in |3>, which happens if you tune the laser to a frequency higher than the 1→2 transition. (There is a class of frequency standards using optical lattices where you choose the light frequency so that the shifts of the two clock states are equal, leaving the frequency of the 1→2 transition unaffected.) Saying that BBR makes electron clouds bigger is just wrong.
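The detuning dependence can be sketched with the standard two-level result in the far-detuned limit (the Rabi frequency and detuning values below are made up for illustration):

```python
hbar = 1.054571817e-34  # J*s

def ground_state_light_shift(omega_rabi, detuning):
    """Two-level AC Stark shift of the ground state, far-detuned limit:
    dE1 ~ hbar * Omega^2 / (4 * delta), with delta = omega_laser - omega_atom.
    The excited state |2> shifts by the same amount in the opposite direction."""
    return hbar * omega_rabi**2 / (4 * detuning)

omega = 2 * 3.14159 * 1e6   # illustrative Rabi frequency, rad/s

# Red detuning (delta < 0): ground state pushed down.
# Blue detuning (delta > 0): ground state pushed up.
red = ground_state_light_shift(omega, -2 * 3.14159 * 1e8)
blue = ground_state_light_shift(omega, +2 * 3.14159 * 1e8)
print(red < 0 < blue)  # True
```

The sign flip with detuning is the point: the same light can push a given level up or down, which is why "BBR makes the atom bigger" doesn't survive contact with the actual physics.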
I have another nit, related to the usual “this is a frequency standard, not a clock” disclaimer:
This quantum-logic clock, based on atomic energy levels in the aluminum ion, Al+, has an uncertainty of 1 second per 3.7 billion years, translating to 1 part in 8.6 x 10^-18, due to a number of small effects that shift the actual tick rate of the clock.
This is backwards. The thing you can measure is the short-term stability — the frequency stability at short times (e.g. at one second, or at some later time when the measurement stops integrating down because of the systematic errors) is what your experiment actually determines — and the time stability is extrapolated, Disco-Stu style (if these trends continue…). The reality is that this experiment probably ran for a few hours at best. When it was shut off, the stability of the timing system reverted to whatever the stability of the other clocks was.
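For what it's worth, the "1 second per 3.7 billion years" line is just the fractional frequency uncertainty turned upside down — an extrapolation, not a measurement. The arithmetic:

```python
# Convert a fractional frequency uncertainty into the (extrapolated)
# time needed to accumulate one second of error.
seconds_per_year = 365.25 * 24 * 3600   # ~3.156e7 s
fractional_uncertainty = 8.6e-18        # quoted value

years = 1.0 / (fractional_uncertainty * seconds_per_year)
print(f"{years:.1e} years")  # ~3.7e9, i.e. 3.7 billion years
```

Which is exactly the quoted figure — but it only holds for as long as the clock runs and the trend continues, which is the point of the objection.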