The Future Has Arrived

News from Uncertain Principles: FutureBaby is now in the past tense, and SteelyKid has arrived.

Belated congratulations to Chad and Kate, the proud parents. I’m expecting big things for my Silver Warriors (I’m class of 1980) from whatever sports teams she’s on in 15 years.

(Of course this loses meaning if it was a c-section, but what the heck)

If anybody needs a dose of cute, there are baby pics:

Meet SteelyKid, Babies Are Bosons, FutureBaby Betting Pool Winner

(I was visiting family this past week, and all of my small cousins were pretty much terrified of me. I was crushed. No turning kids upside-down or tickling, or if things are going well, both at once. And certainly no splunks — our term for blowing a raspberry on the belly.)

Chuckles From Above

Catching up on my blog reading. Via Physics and Physicists, an arXiv paper by L. B. Okun, The Einstein Formula: E0 = mc^2 “Isn’t the Lord Laughing?”, detailing some history of “relativistic mass” and the confusion surrounding the term.

The article traces the way Einstein formulated the relation between energy and mass in his work from 1905 to 1955. Einstein emphasized quite often that the mass m of a body is equivalent to its rest energy E0. At the same time he frequently resorted to the less clear-cut statement of equivalence of energy and mass. As a result, Einstein’s formula E0 = mc^2 still remains much less known than its popular form, E = mc^2, in which E is the total energy equal to the sum of the rest energy and the kinetic energy of a freely moving body. One of the consequences of this is the widespread fallacy that the mass of a body increases when its velocity increases and even that this is an experimental fact.

[…]

Why is it that the weed of velocity-dependent mass is so resistant? First and foremost, because it does not lead to immediate mistakes as far as arithmetic or algebra are concerned. One can introduce additional ‘quasi-physical variables’ into any self-consistent theory by multiplying true physical quantities by arbitrary powers of the speed of light. The most striking example of such a ‘quasi-quantity’ is the so-called ‘relativistic mass.’ If calculations are done carefully enough, their results should be the same as in the original theory. In a higher sense, however, after the introduction of such ‘quasi-quantities,’ the theory is mutilated because its symmetry properties are violated. (For example, the relativistic mass is only one component of a 4-vector, while the other three components are not even mentioned.)
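To make the point concrete, here is a quick sketch of the standard relations in LaTeX (my notation, not a quote from Okun). For a body of mass m moving at speed v:

\begin{align*}
  E   &= \gamma m c^2, \qquad \vec{p} = \gamma m \vec{v}, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \\
  E_0 &= m c^2 \quad \text{(rest energy, the $v = 0$ case)}, \\
  E^2 &= (pc)^2 + (m c^2)^2 \quad \text{(invariant relation; $m$ is the same in every frame)}.
\end{align*}

Defining a “relativistic mass” \gamma m simply relabels E/c^2; up to a factor of c it is the time component of the energy-momentum 4-vector (E/c, \vec{p}), which is the sense in which Okun calls it a quasi-quantity rather than a genuine mass. The kinetic energy mentioned in the abstract is just E - E0.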

That's Gonna Leave a Mark

I was once asked, by someone outside of academia, about academic (dis)honesty, and concurred that accusing a researcher of this kind of misconduct is about as serious as accusations get. Using data or results without attribution (plagiarism) or, worse, outright fabricating data are things the scientific community should not (and generally does not) tolerate. Part of the feedback loop keeping things on the straight-and-narrow should be vested self-interest. I can’t imagine researchers wanting to collaborate with someone who has plagiarized, and it’s more difficult to do research alone. One who fabricates data is almost certain to be found out, unless it’s in an area of research so obscure that there is no follow-up. (But then that means the research has little value — it’s like counterfeiting a dollar bill. Why bother?)

I’ve never observed any of this, though I’ve been around long enough to see the type of worker who likes to take credit for others’ work in endeavors outside of research. Fortunately, these cases are peripheral to my own career — I’ve mostly worked with people who were quite insistent on making sure that the credit for work was properly attributed. That’s something that boosts your own credibility, of course, because your audience will believe you when you give an account of your own contribution to the work.

There’s now a study that followed up on some cases of scientific misconduct, and an article summarizing it: Does fraud mean career death?

“People who were found guilty of plagiarism [as opposed to expressly fabricating or falsifying data] get less severe of a punishment, so they were more likely to continue to publish,” Redman noted. Ten of the 28 scientists whose employment information they were able to trace continued to hold academic appointments after the ORI ruling. Originally, 23 out of those 28 had worked in academia.

However, Merz and Redman’s data, as well as interviews they conducted with the seven researchers who agreed to speak with them, indicate that recovering from the misconduct ruling was extremely difficult. Unsurprisingly, the group’s average publication rate was significantly lower after the ruling, dropping from 2.1 to 1.0 publications per year. Twelve of the scientists ceased to publish completely. In interviews with Merz and Redman, researchers described extensive personal and financial hardships due to the ruling.