Scientists are such killjoys when it comes to philosophy edition, courtesy of SMBC
(Check the popup by putting your cursor on the red dot)
The story where I saw this calls the phenomenon “ground resonance”. While it looks and acts a lot like an unbalanced washing machine in the spin cycle, which doesn’t seem like a resonance phenomenon, from what I understand this can be caused by a shock to the system from landing, and the helicopter is susceptible at certain rotor speeds. If the compensation is 180º out of phase with the wobble you get positive feedback, so it makes sense that this could happen; you’d either want to speed up or slow down the rotor, were it safe and easy to do so, to change the feedback. But this happened pretty quickly.
In an advanced physics class.
Ice Sliding Off a Bowl: When Does It Leave the Surface?
A small block of ice is placed on the top of an inverted spherical bowl. The ice is then given a slight nudge so that it slides down the side of the bowl. At some point, the ice will speed up enough to leave the bowl. At what angle does this happen?
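The classic frictionless version of this problem has a tidy answer: energy conservation gives the speed at any angle, and the block departs when the normal force drops to zero. A minimal sketch (not the linked post's solution, just the standard textbook result):

```python
import math

def departure_angle():
    """Angle from vertical at which a frictionless block leaves a sphere.

    Energy conservation: v^2 = 2*g*R*(1 - cos(theta))
    Leaving condition (normal force -> 0): m*g*cos(theta) = m*v^2/R
    Combining the two: cos(theta) = 2/3, independent of g and R.
    """
    cos_theta = 2.0 / 3.0
    return math.degrees(math.acos(cos_theta))

print(f"{departure_angle():.1f} degrees")  # about 48.2 degrees
```

The surprise is that the answer is universal: the mass of the block, the size of the bowl, and the strength of gravity all cancel.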
How Big a Battery Would It Take to Power All of the U.S.?
Generating capacity is, however, only one side of the story. Storage systems are rated not only by their power, or how fast they can crank out energy (measured in gigawatts), but also by the total amount of energy they store (measured in gigawatt-hours). A facility that can deliver one gigawatt but only supply electricity for 10 minutes would not be very helpful; in an ideal world it could do so for, say, 100 hours, thus storing 100 gigawatt-hours. Building new pumped-hydro facilities similar to existing ones would probably help in all but the most disastrously long of wind lulls. For those worst-case scenarios, we might still have to brace for rolling blackouts.
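The power/energy distinction is just multiplication, but it's the crux of the argument, so here is the arithmetic spelled out (the 1 GW and 100 h figures are the illustrative ones from the paragraph above, not real facility specs):

```python
# Power tells you how fast energy flows; energy is power times duration.
power_gw = 1.0      # delivery rate, in gigawatts
duration_h = 100.0  # how long that rate can be sustained, in hours

energy_gwh = power_gw * duration_h
print(f"{energy_gwh:.0f} GWh stored")  # 100 GWh

# The same 1 GW facility sustained for only 10 minutes stores far less:
short_run_gwh = power_gw * (10 / 60)
print(f"{short_run_gwh:.2f} GWh stored")  # about 0.17 GWh
```

Same power rating, a factor of ~600 difference in usefulness for riding out a multi-day wind lull.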
Of course, this simple calculation also assumes current consumption levels. How would we power all those electric cars that we’re supposed to be driving in the future?
Quantum entanglement is a topic that often gets mangled in the popular press (much to my torment), so it’s nice when a physicist writes about it.
Tangled Up in Quantum Mechanics
[H]ere’s the problem: the first measurement does not cause anything to happen with the second system: they cannot be in communication in any way, because the distance between them is arbitrary. In other words, they could be separated by several parsecs without changing the outcome, so if they were actually passing information, that would be in violation of relativity. You can’t send signals faster than light using entanglement as a result: the only way you could kinda-sorta communicate is if you had two groups of researchers who agreed in advance on what the settings of their instruments would be before they parted company; no new information would be available, since the real communication takes place at light-speed or slower, before the measurements are even performed.
Fair warning: at the end of the post, under the heading of “What Entanglement Is Not”, the discussion loops back into the “everything is connected” kerfuffle.
Building a Better Clock
I see no progress in this industry. These clocks are no faster than the ones they made a hundred years ago.
– Henry Ford
I really hope he was kidding, but assuming he was it’s pretty funny to a timing geek like me. The way you make clocks better is by making them more precise and accurate, and the levers for this are hidden in the equation for “counting the ticks”. If our ability to count precisely is somehow limited, e.g. if we had an oscillator — like a wheel — and we could measure its angle to a precision of 3.6º, then letting it go for one oscillation represents a 1% measurement, but that same absolute error for 100 oscillations is 0.01%, and we can get there either by integrating longer or by having an oscillator with a higher frequency. So “more ticks” is better … if we don’t have a noisy oscillator. Certain noise processes don’t integrate down, so another lever is to improve the noise, or possibly the noise characterization, of our clock.
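The "more ticks is better" scaling in that example can be made concrete. Assuming, as above, a fixed phase-readout resolution of 3.6º per measurement, the fractional error falls as one over the number of oscillations counted:

```python
# Fractional precision of a frequency/time measurement when the phase of
# each oscillation can only be read to a fixed angular resolution.
angle_resolution_deg = 3.6  # the wheel-angle resolution from the example

def fractional_error(n_oscillations):
    # One oscillation is 360 degrees of phase. The absolute phase error
    # stays fixed, so the fractional error falls as 1/N: integrate longer,
    # or use a higher-frequency oscillator, and precision improves.
    return (angle_resolution_deg / 360.0) / n_oscillations

print(f"{fractional_error(1):.4%}")    # 1.0000% for a single oscillation
print(f"{fractional_error(100):.4%}")  # 0.0100% after 100 oscillations
```

This is exactly why the progression below runs toward higher frequencies: pendulum (~1 Hz), quartz (kHz), cesium (~9.2 GHz). Noise that doesn't integrate down is the catch, which is why the other lever is a quieter oscillator.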
Better clocks came in the form of Harrison’s chronometer, which could be put to sea and which included advances like using multiple kinds of metal to reduce temperature effects and a spring that maintained constant tension. On land, improvements came in the form of better pendulum clocks, culminating in the Riefler and Shortt clocks of the early 1900s, which had temperature-compensated pendula (to keep the length from changing), were kept under moderate vacuum to reduce drag and possible humidity effects, and were capable of performing at a precision of around a millisecond per day. These are examples of going to a higher frequency (a period of a second rather than a day) and minimizing noise effects. Going into the 1930s and 1940s, quartz oscillators, using much higher frequencies (many kHz rather than 1 Hz), became the best clocks.
Up to this point, the length of the second was defined as a fraction of the tropical year 1900, which was close but not identical to 86,400 seconds per day (being off by a few milliseconds). Atomic standards were investigated, and in 1967 a definition of 9,192,631,770 oscillations of the Cs-133 hyperfine transition was adopted; atomic timekeeping defined Coordinated Universal Time (UTC) starting in 1972. This also marked the start of inserting whole leap seconds to match atomic time to earth-rotation time; prior to that, the matching was done by adjusting clock frequencies or inserting fraction-of-a-second steps in time.
Continue reading
Almost all of my colleagues had put very, very low odds on the OPERA experiment’s result being correct — not because the people who did the measurement were considered incompetent or stupid, but because (a) doing experimental physics is very challenging; (b) this particular result was especially implausible; and (as everyone in the field knows) (c) most experiments with a shocking result turn out to be wrong, though it can take months or years to find the mistake.
I haven’t seen the vitriol or scorn the author mentions, but I don’t read everything. Complex experiments are hard, as he notes. In the kinds of experiments I’ve worked on we’ve spent lots of time chasing down subtle vacuum problems and electronic problems like ground loops — if your circuitry has a “loop” configuration it acts as an antenna, converting changes in magnetic fields into voltages (from Faraday’s law) and picking up the noise radiated by all of the electronics in your lab.
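The ground-loop pickup follows directly from Faraday's law: the induced EMF is the rate of change of magnetic flux through the loop. A back-of-the-envelope estimate, with entirely illustrative numbers (these are not from any particular experiment):

```python
import math

# Faraday's law pickup in a ground loop: EMF = -dPhi/dt = -A * dB/dt
# for a uniform field B through a loop of area A. For a sinusoidal
# stray field B(t) = B0*sin(2*pi*f*t), the peak EMF is A * B0 * 2*pi*f.
loop_area_m2 = 0.01    # a 10 cm x 10 cm loop of cabling (assumed)
b_amplitude_t = 1e-7   # stray AC magnetic field amplitude, tesla (assumed)
freq_hz = 60.0         # power-line frequency

peak_emf_v = loop_area_m2 * b_amplitude_t * 2 * math.pi * freq_hz
print(f"peak induced EMF ~ {peak_emf_v * 1e6:.2f} microvolts")
```

Tenths of a microvolt sounds tiny, but it lands right on top of small signals, and every harmonic-rich switching supply in the lab contributes its own terms; hence the effort spent hunting these loops down.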
“Loose cable” was not high on the list of suspects among the betting bystanders, since “Einstein was wrong” and “GPS is fouled up” sounded much sexier, and one might not expect such a mundane-sounding problem to survive unnoticed or cause effects like these, but there is subtlety there. (I can recall a recent problem with an SMA connector with a bad thread — it felt properly tight, but wasn’t. Caused all sorts of weird signals, and took an extra set of eyes to help spot the problem.)
They still need to confirm that this was indeed the source of the anomaly. That’s just good science.
The Most Common Cooking Mistakes
If you want real, true, sweet, creamy caramelized onions to top your burger or pizza, cook them over medium-low to low heat for a long time, maybe up to an hour. If you crank the heat and try to speed up the process, you’ll get a different product―onions that may be crisp-tender and nicely browned but lacking that characteristic translucence and meltingly tender quality you want.
A little more detail on why I think that the idea of every electron affecting every other electron around the universe doesn’t wash.
One thing about scientists that nonscientists sometimes don’t get is that predictions have wide-ranging implications. You may think that something is true because it holds for a specific example, but if that idea is to be generally true, it has to hold up all across physics. (As an aside, this is a common stumbling block for crackpot theories). Claiming that “everything is connected” can be tested and indeed has been tested, even though the experiments were not made for the targeted purpose of falsifying this specific claim.
I’ve already given the example of atomic clocks, though any precision spectroscopy experiment would probably suffice. Brian gives the example of bands in semiconductors, and the Pauli Exclusion Principle is the source of this structure — the electron energy levels cannot be the same, so they “pile up” into bands. But his video takes that one step further, to affecting distant semiconductors. So, I ask, where are these bands in individual atoms? Why don’t we see them? Gather together a reasonable fraction of Avogadro’s number of atoms in a lattice and you get a fairly wide band of energies for a transition, even after you reduce thermal motion. Do the same to a gas, and do Doppler-free spectroscopy, and the transitions can be quite narrow.
Another example, as I mentioned in the comments, has to do with the behavior of composite fermions, such as atoms. The Pauli exclusion principle is based on the behavior of identical particles. If all these electrons are in slightly different states, which differentiates the electrons, then the atoms themselves are not identical anymore. That means that if you were to collect a bunch of them into a cold gas in some confining potential, you should be able to get them all to drop to the ground state (which would be a band). But we don’t see this behavior. You can form what is known as a fermionic condensate, which is the analogue of a Bose-Einstein condensate; but since the fermionic atoms are identical, they are subject to the Pauli exclusion principle and can’t occupy the same energy states in the system. This adds a level of difficulty in forming them (you don’t have the same avenues of exchanging energy in collisions during evaporative cooling, since you are limited to one atom per energy level). But the bottom line is that this kind of system exists, which tells us that the atoms are identical to each other, and falsifies a prediction based on Brian’s conjecture.
More, and in some depth, on the Pauli Exclusion kerfuffle. (I was on the road all day yesterday and got in late, so I am not really caught up with … life, so only time for this quick note.) Please go read it.
Much more interesting to me is getting the physics right.
Amen to that. If my snarky headline/commentary got in the way, I apologize for that.