In Puckett’s case, where there were only five and a half markers available, the San Francisco crime lab put the figure at one in 1.1 million—still remote enough to erase any reasonable doubt of his guilt. The problem is that, according to most scientists, this statistic is only relevant when DNA material is used to link a crime directly to a suspect identified through eyewitness testimony or other evidence. In cases where a suspect is found by searching through large databases, the chances of accidentally hitting on the wrong person are orders of magnitude higher.
The reasons for this aren’t difficult to grasp: consider what happens when you take a DNA profile with a rarity of one in a million and run it through a database containing a million people; you should expect about one coincidental match. Given this, the two leading scientific bodies that have studied the issue—the National Research Council and the FBI’s DNA advisory board—have recommended that law enforcement and prosecutors calculate the probability of a coincidental match differently in cold-hit cases. Specifically, they recommend multiplying the FBI’s rarity statistic by the number of profiles in the database, to arrive at a figure known as the Database Match Probability. When this formula is applied to Puckett’s case (where a profile with a rarity of one in 1.1 million was run through a database of 338,000 offenders), the chances of a coincidental match climb to one in three.
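The arithmetic above is easy to sketch. Here's a quick illustration using the figures reported for the Puckett case (the "independent draws" formula at the end is my simplifying assumption, not part of the NRC/FBI recommendation):

```python
# Figures reported for the Puckett case (illustrative only)
rarity = 1 / 1_100_000      # random match probability: 1 in 1.1 million
database_size = 338_000     # offender profiles searched

# Database Match Probability, per the NRC / FBI advisory board
# recommendation: rarity multiplied by the number of profiles searched.
dmp = rarity * database_size
print(f"Database Match Probability: {dmp:.2f}")  # ≈ 0.31, i.e. about 1 in 3

# A related quantity: the probability of at least one coincidental hit,
# assuming each profile is an independent draw (a simplifying assumption).
p_at_least_one = 1 - (1 - rarity) ** database_size
print(f"P(at least one coincidental match): {p_at_least_one:.2f}")
```

Note that the Database Match Probability is really an expected number of coincidental matches; for small values the two quantities are close, but they are not identical.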
It’s scary that a judge didn’t find these statistics relevant, and scarier still that the FBI is denying database access to scientists who might verify whether its statistics are right.
I wonder, though, whether the defense asked if the police had tracked down the other ~300 people in the US who would have fit the DNA profile, and whether that would have raised reasonable doubt.
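The ~300 figure follows from the same kind of arithmetic, assuming a US population of roughly 300 million at the time (my assumption for illustration):

```python
# Rough check of the "~300 people" estimate above.
us_population = 300_000_000  # assumed, for illustration
rarity = 1 / 1_100_000       # profile rarity from the Puckett case

expected_matches = us_population * rarity
print(round(expected_matches))  # ≈ 273
```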