At the Tone, it Will be ‘Now’ O’Clock

The Problem of Now

I don’t spend much effort thinking about this sort of issue, since I’m much more interested in the experimental aspects of measuring time than the philosophical ones, but I’ve run across some folks who find this problem of “Now” so perplexing they can’t get past it. (Again, because my interests lie elsewhere, this strikes me as more of a dorm-room discussion, or possibly one involving a professor who looks like Donald Sutherland musing about whether atoms can be universes.) My view of its utility is that while “It’s always now” may or may not be deep thinking, it doesn’t help GPS tell you where you are. (Unless “You are here” is an acceptable answer.)

[R]egardless of whether you use an external definition of time (some coordinate system) or an internal definition (such as the length of the curve), every single instant on that curve is just some point in space-time. Which one, then, is “now”?

Later on there’s also an interesting point about memory not needing consciousness.

Udderly Ingenious

These Backpacks For Cows Collect Their Fart Gas And Store It For Energy

Researchers put plastic backpacks on cows, then inserted tubes into their rumens (the largest chamber of their stomachs). They extracted the methane, about 300 liters a day. That’s enough to run a car, or a fridge for 24 hours.
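As a quick sanity check on that fridge claim, here’s the envelope math (the energy density and fridge power draw are my own assumed numbers, not from the article):

```python
# Back-of-envelope check. The energy density (~36 MJ per cubic meter of
# methane) and the fridge's average draw (~100 W) are my assumptions.
methane_per_day_m3 = 0.3      # 300 liters, from the article
energy_density = 36e6         # J per cubic meter of methane (approximate)
fridge_power = 100.0          # watts, typical average draw (assumption)

energy_j = methane_per_day_m3 * energy_density    # ~1.1e7 J per day
fridge_hours = energy_j / fridge_power / 3600     # hours of fridge runtime
print(f"{energy_j/1e6:.1f} MJ per day, ~{fridge_hours:.0f} h of fridge time")
```

That lands right around a day of fridge operation, so the claim checks out.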

There’s More to Physics Than The LHC

Particle Fever is aptly named

[T]his equating of “physics” with “particle physics” not only plays along with the media myth that the only thing worth noting in physics is what is going on at CERN, but also explains outbursts like this one I received from a (non-particle) physicist recently: “Perhaps the poster child for overselling science should be high-energy physics. They oversold the most expensive toys that physicists have ever produced: high-energy particle accelerators… their arrogance when they talk about ‘the god particle’ and ‘the most important problems’ is disappointing.”

Plenty of Science Yet to Do

Science Is Running Out Of Things To Discover?

[H]aven’t we learned anything from the history of science? The last time someone thought that we knew all there was to know about an area of physics, and all that we could do was simply to make incremental understanding of the area, it was pre-1985 before Mother Nature smacked us right in the face with the discovery of high-Tc superconductors.

I have some serious doubts about the original article as well. When I saw the claim that the time between discovery and Nobel award was getting longer, I thought it was a typo, because in my atomic physics corner of the world that trend does not seem to be in place at all. AMO physics has shown a short gap between discovery and Nobel, and looking at that trend makes me doubt the physics graph presented in that paper.

The 1989 Nobel went, in part, to Norman Ramsey for his separated oscillatory fields method used in atomic clocks, developed in 1949. I don’t see a 40-year data point anywhere on the graph.

The 1997 Nobel was awarded for contributions to laser cooling and trapping, with the experimental start in the early-to-mid 1980s. I don’t see any ~15-year data point for 1997.

The first Bose-Einstein condensate was observed in 1995. The Nobel was awarded for that in 2001 – a scant six years. No such data point exists.

The optical frequency comb was demonstrated in 1999, and the Nobel was awarded (again) six years later. That data point is missing as well.

I don’t know what’s going on, but this doesn’t smell right.

If Only it Mattered

British Columbia Enacted The Most Significant Carbon Tax in the Western Hemisphere—What Happened Next Is It Worked

Nice to see that this can work. If only facts mattered in US politics…

Induction in the Hole!

Another video of the Navy’s railgun (I did an analysis of an earlier test a few years back), which is slated to be tested at sea pretty soon.

[Embedded YouTube video]

It’s Been Such a Long Time

Ask Ethan #30: Long-term timekeeping

In this week’s Ask Ethan, we take on perhaps the longest question of them all, and look at how to keep time for arbitrarily long times.

It’s a good post as usual, though there are a few things Ethan glosses over, which is where I step in.

[millisecond pulsars] are also the most accurate clocks we’ve ever discovered. They are so regular that we could watch one, look away for a year, and know — when we look back — whether ten billion pulses have gone by… or whether it’s ten billion-and-one. In fact, we can get down to around microsecond accuracy to their timing over periods of many decades, meaning we can get timing accuracy to around one part in 10^15!

This is bettered only by the most advanced atomic clocks on Earth

In terms of fractional stability that’s true (and I think he means precision rather than accuracy here), but man-made atomic clocks reach this stability in a matter of hours or days, not years. It’s only by having these good clocks that we can measure how well the pulsars are doing.

I have a recollection of a discussion about timing with pulsars from years ago (this isn’t a new idea). Pulsars don’t actually have an inherently stable frequency — they are slowing down, just like other macroscopic spinning objects, so the timing will show a drift. But pulsars do this at a very predictable rate, so you can characterize them and account for the drift. Some pulsars haven’t “settled in” and can undergo a starquake, which changes their rotation abruptly, but I think the ones under discussion are past that age.
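To put a number on that drift, here’s a quick sketch using roughly the published spin parameters of the millisecond pulsar PSR B1937+21 (my approximate values, for illustration only). The accumulated phase error over a decade is huge if you ignore the spin-down, but it’s perfectly predictable, which is why characterizing the pulsar works:

```python
# Spin-down drift for a millisecond pulsar, using roughly the published
# parameters of PSR B1937+21 (approximate values, for illustration).
f0 = 641.9               # Hz, spin frequency
fdot = -4.3e-14          # Hz/s, spin-down rate
T = 10 * 365.25 * 86400  # ten years, in seconds

# Phase error accumulated if you ignored the spin-down entirely:
drift_cycles = 0.5 * abs(fdot) * T**2   # cycles of accumulated drift
drift_seconds = drift_cycles / f0       # the same drift, as a time error
print(f"{drift_cycles:.0f} cycles, ~{drift_seconds:.1f} s of timing error")
```

A few seconds of drift over a decade sounds terrible for a clock, but since it follows a smooth quadratic it can be fitted out almost completely.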

From his followup post

Atomic clocks require a lot of power to continuously stimulate atomic transitions, and a lot of cryogenic fuel to keep the atoms at ultra-low temperatures. Not such a big deal when you’re talking about doing this in a continuously powered laboratory on Earth, but that’s a lot of resources to devote to keeping a simple clock running. The mechanism I gave in the original article — counting atoms or looking at a pulsar — has the advantage that all you have to do is look once at the beginning and once at the end, and requires no devoted power in the intermediate time.

The cryogenics part is a common misconception, since the best clocks are cold-atom clocks. So the assumption is understandable, but these clocks are all offshoots of laser cooling and trapping; no cryogenic fuel is required. Some crystal oscillators in frequency standards are cryogenic, but not in any continuously running clock. That would be a logistical nightmare.

I think our clocks are not getting quite as much respect as they deserve — the pulsars have to be characterized to be useful, and no two pulsars are going to have the same frequency. You could, in principle, make a stable reference by measuring several of them, but to actually tell time you have to tie it back to a standard, which means a man-made device.

The second part, about radioactive decays, stresses the longevity of the clock but ignores the precision of the measurement. Unless you have a huge chunk of the material and can determine the number of atoms precisely, your counting statistics will limit the precision of your measurement.
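To see how counting statistics bite, here’s a toy calculation (the detection rates below are my own made-up numbers): with N counted decays, Poisson statistics limit you to a fractional precision of roughly 1/√N.

```python
import math

# Shot-noise (Poisson) limit for a decay-counting clock. The detection
# rates below are made-up numbers, purely for illustration.
for rate in (1e3, 1e9):            # detected decays per second
    for t in (1.0, 86400.0):       # one second, one day
        n = rate * t               # total counts
        frac = 1 / math.sqrt(n)    # fractional precision ~ 1/sqrt(N)
        print(f"rate {rate:.0e}/s over {t:.0e} s: ~{frac:.0e}")
```

Even a billion counts per second, sustained for a whole day, only gets you to about a part in 10^7 — nowhere near the 10^-15 territory of good atomic clocks.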

The upshot of all this is that timing is a little more subtle than having something that will tick for a long time. Accuracy, precision, and durability aren’t interchangeable attributes; there are often tradeoffs among them, so it depends on which is most important.

How Do You Make an Aluminum Float?

Here are a couple of videos I ran across involving magnetic levitation. A changing magnetic field induces currents in a conductor, and those currents create their own field, which gives you a net repulsion.

[Embedded YouTube video]

So what happens if you let the levitating material continue to heat up? This second video starts a little slow, but be patient. It’s fun at the end.

[Embedded YouTube video]

Obscure Trivia Question

How do I impress a physicist on a date?

The main answer is good, and related to What (not) to Say When You Meet a Physicist (especially the persistent idea that all we want to do is teach people physics), but there’s also this

Don’t – tell that great science joke you heard two years ago. We’ve heard it all. Even the one about the neutron that walks into a bar.

Absolutely

[Embedded YouTube video]

Great job distilling the concepts here — that not everything is relative. It’s just that there are things like length and time (and related concepts like simultaneity) that we once thought were absolute, and it’s hard to wrap one’s head around the fact that they depend on your frame of reference.

WWLD

(Thought I scheduled this to post last week, but apparently I didn’t)

To Save Drowning People, Ask Yourself “What Would Light Do?”

I’ve seen a discussion of the lifeguard dilemma before, but there’s some added value here. Especially the disavowal of the “dog doing calculus” bit

Tim was impressed enough by Elvis’s trick to write a paper called “Do Dogs Know Calculus?” In it he reassures the reader that “Elvis does not know calculus…In fact,” Tim adds, “he has trouble differentiating even simple polynomials. More seriously, although he does not do the calculations, Elvis’s behavior is an example of the uncanny way in which nature often finds optimal solutions.”

I’m glad to see this, as opposed to the “animal of some sort does calculus” headlines I’ve run across a few times.
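The lifeguard problem itself is easy to play with numerically. Here’s a sketch (all positions and speeds are my own made-up numbers): minimize the total run-plus-swim time over the water-entry point, then check that the optimum satisfies Snell’s law, sin θ₁ / sin θ₂ = v₁ / v₂, just as light does at an interface.

```python
import math

# Fermat's-principle toy for the lifeguard problem. All numbers are my
# own assumptions: lifeguard at (0, 10) on the sand, swimmer at
# (30, -10) in the water, shoreline along y = 0.
v_run, v_swim = 7.0, 1.5          # m/s on sand and in water
guard, swimmer = (0.0, 10.0), (30.0, -10.0)

def travel_time(x):
    """Total time if the lifeguard enters the water at (x, 0)."""
    run = math.hypot(x - guard[0], guard[1]) / v_run
    swim = math.hypot(swimmer[0] - x, swimmer[1]) / v_swim
    return run + swim

# Brute-force the optimal entry point on a fine grid along the shoreline.
best_x = min((i * 0.001 for i in range(30001)), key=travel_time)

# Snell's law check: sin(theta_run)/sin(theta_swim) should equal v_run/v_swim.
sin_run = (best_x - guard[0]) / math.hypot(best_x - guard[0], guard[1])
sin_swim = (swimmer[0] - best_x) / math.hypot(swimmer[0] - best_x, swimmer[1])
print(f"entry at x = {best_x:.2f} m; ratio {sin_run/sin_swim:.2f} vs {v_run/v_swim:.2f}")
```

The fast medium gets the shallow angle, exactly as refraction would predict — no calculus required on the lifeguard’s (or the dog’s) part.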

Gimme That Old Style Energy Storage

Energy Storage Hits the Rails Out West

It’s an interesting concept. They’ve already built a test system, so I’m going to assume this was thought out and will actually work. Let’s look at the numbers.

Each car carries 230 tons and the hill is roughly 3000 feet high, so if we convert to metric and grab an envelope, mgh is about 2 billion joules of potential energy per car. The output is designed to be 50 MW, so each car gives 40 seconds of runtime at that power, and that’s assuming 100% efficiency of the regenerative brakes. Obviously we need more than one car.

We can also look at this from a different perspective. In this case the power output is P = Fv (that’s actually a dot product, if you’re scoring at home). The incline is given as 6 to 9 degrees, so one car would have to travel at least 160 m/s for that output (at 9°), but that scales with the number of cars: 160 cars can travel at 1 m/s and give us 50 MW (or 80 cars at 2 m/s, etc.), again modified by the efficiency of the system. Slow speeds would allow the system to “ramp up” by getting to the target speed quickly and operate safely — I doubt anyone wants these trains running down a hill at tens of meters a second.

One last bit of data is that the track is 6.5 km long. I’m assuming that’s the length in addition to the length of the train itself, i.e. it’s how far the train can go.

Let’s assume the efficiency (e) is 75% and we have n cars.

t = (30 s)·n, and also t = L/v, where L is our effective track length, so vn = L/(30 s) ≈ 215 m/s.

We also know that e·n·(mg)(sin 9°)·v = 50 MW, or vn ≈ 200 m/s.

Not bad — the answers are within 10%. (Of course it could mean I made the same underlying error in both estimates, or two that happen to cancel)

The final factor is how long the system runs. More cars going slower extends the time, but runs into a space limit. Still, 50 cars at 4 m/s run for ~25 minutes, which is not bad for a stopgap.
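For anyone who wants to check the envelope, here’s the arithmetic in one place. I’m assuming the 230 “tons” are metric tonnes and a 75% round-trip efficiency; both are my guesses, so treat the outputs as estimates:

```python
import math

# The envelope math from above, in one place. Assumptions (mine): the
# 230 "tons" are metric tonnes, efficiency e = 75%, grade of 9 degrees.
m = 230e3              # kg per car
g = 9.8                # m/s^2
h = 3000 * 0.3048      # hill height: 3000 ft in meters
P = 50e6               # target output, W
e = 0.75               # assumed round-trip efficiency
L = 6500.0             # usable track length, m

E_car = m * g * h                          # potential energy per car (~2e9 J)
t_car = e * E_car / P                      # seconds of 50 MW per car (~30 s)
vn_energy = L / t_car                      # (speed x car count), energy route
F_car = m * g * math.sin(math.radians(9))  # downhill force component per car
vn_power = P / (e * F_car)                 # (speed x car count), P = Fv route
print(f"E = {E_car:.2e} J, t = {t_car:.0f} s, vn = {vn_energy:.0f} vs {vn_power:.0f} m/s")
```

The two routes to vn agree to within about 10%, which is about all you can ask of an envelope calculation.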

