A couple days back, Chris Anderson at Wired posted some junk about large volumes of data making the scientific method obsolete, misapplying George Box’s quote, “All models are wrong, but some are useful.” I was a little too distracted to respond, but it didn’t exactly escape the notice of the science and skeptic blog-o-icosahedron.
But if you can, since it’s 60 LY across, don’t ask me if your butt looks big.
[A] ribbon of gas, compressed and glowing due to a shock wave that slammed into it. The shock came from Supernova 1006, a star that detonated 7000 light years away from us. This was not a massive star that exploded, but a low-mass white dwarf, the dense core left over when a star like the Sun runs out of fuel. Still, the forces are roughly the same, with a titanic explosion ripping the star apart and creating eerie, unearthly beauty even in death.
How many cameras are you wearing? (Chandler, to Monica)
Candid camera, over at Cocktail Party Physics.
Richmond’s main hypothesis, however, was that the effect stems from the fact that the camera only has one “eye” (i.e., the lens), whereas human beings have two eyes, roughly 7 to 8 centimeters apart. The camera, it seems, lacks depth perception. The result is a kind of “flattening” effect that can make objects seem wider in photographs.
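(Not from Richmond or the Cocktail Party Physics post, just a quick back-of-the-envelope sketch of the idea: under a small-angle approximation, the angular disparity between two viewpoints is roughly baseline divided by distance, so a single lens with zero baseline gives zero disparity, and with it no stereo depth cue. The 7 cm eye separation and 3 m subject distance below are my own illustrative numbers.)

```python
import math

def disparity_radians(baseline_m: float, distance_m: float) -> float:
    """Angular disparity between two viewpoints separated by baseline_m,
    looking at a point distance_m away (small-angle approximation)."""
    return baseline_m / distance_m

# Illustrative numbers: ~7 cm eye separation, subject 3 m away.
eyes = disparity_radians(0.07, 3.0)    # two eyes -> nonzero disparity
camera = disparity_radians(0.0, 3.0)   # single lens -> zero disparity

print(f"binocular disparity: {math.degrees(eyes):.2f} degrees")
print(f"single-lens 'disparity': {math.degrees(camera):.2f} degrees")
```

With no disparity to work from, the brain gets none of the depth information it would normally use, which is the "flattening" Richmond is talking about.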
Particle and wave descriptions of light, duking it out in the early 19th century. What a drag: Arago’s Experiment (1810) over at Skulls in the Stars.
Before 1800, most scientists were proponents of the so-called corpuscular theory of light propagation. This view, which was championed and solidified by Isaac Newton in his 1704 book Opticks, held that light consisted of a stream of particles. Newton explicitly argued against the wave theory of light and (seemingly) refuted arguments by early wave theory proponents such as Christiaan Huygens. Newton’s arguments, and his personal gravitas, left his particle theory mostly unchallenged until the early 1800s.