Wednesday, September 14, 2005

Forensic Scientists: Conspirators? & Charlotte's Web: a Lesson in Bullshittery.


Forensic Scientists: Conspirators?

One of the things that makes me proud of science is that it is self-correcting. (Similarly, I am ashamed of religion when it criticizes science for "always changing" and being "unstable.") But one thing that always annoys me is the accusation that scientists are only "claiming" a rule and are all trying to hide something from us, either for money or because they are afraid of exposure.
Today I found [this link] [or this printable link] concerning faulty fingerprint matching. It begins with one of these major annoyances, "While forensic scientists have long claimed fingerprint evidence is infallible...", as if scientists just made it up. Once again a journalist is going to use the old "it's just a theory" mantra to ridicule the work of scientists. On top of this, no real sources are cited in the story. There's a comment page, for whatever help that is supposed to be. There's a tag at the end: "UC Irvine." Great.
There are several reasons to be skeptical about the importance of this article and of the study. There are a few very inappropriate and misleading punctuation choices. In the first paragraph,
'...but as many as a thousand incorrect fingerprint "matches" could be made each year in the U.S.'
I think any competent reader understands that if a match proves false, it is directly implied that it was not a true match. To put the word in quotes sleazily implies that we should doubt all matches as tenuous, as if any match is a guess. Later,
'...fingerprint examiners have long held that fingerprint identification is “infallible,”...'
Again, this mocking tone is extremely misleading, obviously playing to a made-up feeling of conspiracy. The very next sentence is a rather unscrupulous accusation from the mouth of Simon Cole.
'“Rather than blindly insisting there is zero error in fingerprint matching, we should acknowledge the obvious, study the errors openly and find constructive ways to prevent faulty evidence from being used to convict innocent people,” said Cole, an assistant professor of criminology, law and society.'
At the end of the article, he claims more directly that forensic scientists are conspirators behind their own zero-error claim.
'While we don’t know how many fingerprint errors are caught in the lab and then swept under the rug..." '
Let me correct Cole: scientists do not "blindly insist" on anything. Another inappropriate use of quotation marks appears when the article discusses the safeguards of fingerprint matching procedures.
'Wrongful convictions on the basis of faulty evidence are supposed to be prevented by four safeguards: having print identifications “verified” by additional examiners;...'
Surely, the author(s) and editor(s) of this article could have looked up the word verify and asked exactly what it means in a forensic context. The article does a lot of misleading. It openly admits that
"Cole’s data set represents a small portion of actual fingerprint errors because it includes only those publicly exposed cases of mistaken matches. The majority of the cases discussed in this study were discovered only through extremely fortuitous circumstances, such as a post-conviction DNA test, the intervention of foreign police and even a deadly lab accident that led to the re-evaluation of evidence."
However, there is no indication of the significance of this filtering. How small a portion is this sample? Is it enough to be representative of any population? If so, is the error rate even alarming? It would have been prudent for this article to delve into other questions as well. What really makes those publicly exposed errors special?
In the end, this article has implications with no real foundation. For one, are we to assume after reading this article that we will run into people who have the exact same fingerprints as ourselves? We should not, but that seems to be the air of this 'report.' There is no distinction made here between the procedure of matching fingerprints, the uniqueness of fingerprints, and how the fingerprints were used in a court of law. Were Brandon Mayfield's fingerprints not unique? Surely not. Rather, physical and algorithmic errors were the cause: factors such as the method of lifting the fingerprints, the software used to map them, the specific algorithm used, a faulty computer database, and so on. So, am I supposed to doubt the uniqueness of fingerprints or the methods of forensic scientists? Why the vagueness?
Where are all these mitigating factors in Cole's study? Why did he not care enough about this article on his work to make sure it was not so misleading?
It is dangerous, what Cole is allowing to happen. One sign of bad science is when a person rushes his 'work' to the media prematurely. Competent and honorable scientists value what we call 'peer review.' We want our studies in the hands of as many scientists and journals as possible long before the media gets its dirty hands on them. Simon Cole, it seems, went to only one journal and then straight to the media.
In any case, one sign of bad science journalism is to predict or declare a profound or semi-profound change in the way science is done or trusted based on very few studies, especially if "few" means "one." Cole seems OK with declaring that fingerprint verifications are faulty, but his rush to the media before more scientists could review it gives me the feeling that he is afraid of peer-review verification of his own study. Is this a sign of guilt?
One last statistical quirk. Since Cole's sample is so small and so biased, consisting entirely of publicly exposed errors, how can he project how many errors may be occurring with any accuracy? How did he get his number of 1,900 errors per year? All signs point to BULLSHIT.
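I do not know how Cole actually arrived at his figure. But any projection from only the publicly exposed cases requires assuming what fraction of all errors ever become public, and that assumption is unverifiable. Here is a made-up-numbers Python sketch (the function name and every figure in it are my own illustration, not from the study) showing how wildly the estimate swings with that assumption:

```python
# Naive extrapolation: if a fraction p of all fingerprint errors become
# publicly exposed, then total errors ~= exposed / p. The inputs below are
# invented purely to illustrate the sensitivity to the assumed rate p.

def estimate_total_errors(exposed_per_year, assumed_exposure_rate):
    """Project total yearly errors from the publicly exposed count."""
    return exposed_per_year / assumed_exposure_rate

exposed = 2  # hypothetical: two publicly exposed errors per year
for p in (0.1, 0.01, 0.001):
    # Each tenfold change in the unverifiable assumption changes the
    # "finding" by a factor of ten.
    print(p, estimate_total_errors(exposed, p))
```

The arithmetic is trivial; the point is that the answer is entirely driven by an assumption nobody can check.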
People, you need to be critical of any so-called scientific study when you read it online, see it in the newspaper, or watch it on the news. Journalists are not scientists, and scientists in love with the media are probably not very good at their jobs.

Charlotte's Web: a Lesson in Bullshittery

In other news, [this story] was published about a spider that allegedly wrote the word "lunch" in its web. Looking at the [picture], I just do not see it. These poor people are subject to a phenomenon known as pareidolia. This spider is just [Another Face in the Crowd]. Our brains are pattern-finding machines. Sometimes we see images that are not really there. For example, in the link above, Phil Plait thought he saw Vladimir Lenin in his shower. Many have seen the holy Mother Mary, or Jesus, or some other saintly figure on concrete, on marble, or, in one recently infamous case, on a grilled cheese sandwich (some websites publish links to pranksters who auction off false images to poke fun at such saint-seers).
As I said, our brains are pattern-finding machines. This is so much the case, in fact, that we have trouble being random when asked to. Here is an experiment you can do with a group of at least 10 people, which I saw on a program called [The Mechanical Universe... and Beyond] from the Annenberg corporation. It was showing on cable channel 2 a few days in June.
Every person must write ten to thirty digits in a row, using only the numbers 1, 2, and 3. Then have them construct a table by tallying consecutive pairs: the first digit paired with the second, the second digit paired with the third, and so on. For each pair, the row label is the first digit and the column label is the second. Plot each pair's count in the table.
     1  2  3
   ------------
1 |
2 |             <-- This is the empty table.
3 |

The number 12321232123212321 comes up with this distribution:

     1  2  3
   --------------   for 12321232123212321
1 |  0  4  0
2 |  4  0  4    <-- This table shows an obvious pattern
3 |  0  4  0
If you do this exercise several times using only your own mind to generate a random string of digits, you may be surprised to find that the counts along the down-right diagonal, that is, the pairs (1,1), (2,2), and (3,3), will probably be very small. What this shows is that we think any repeating number is a sign of a pattern, so we avoid repeats. Counts for non-repeating pairs, therefore, will tend to be high.
What does that show? We have a hard time being random when asked to. Repeat this exercise with a die (one of a pair of dice), replacing 4 with 1, 5 with 2, and 6 with 3. Also, do this with a very, very large number of digits. What you will find is that the counts in the table will be nearly identical, with small standard deviations. Perhaps one, maybe even two pairs will be a little high. Such is still expected in randomness; clustering does happen. In the future, try to be aware of this kind of bias. Just because things happen close together chronologically does not mean they are connected. Sometimes it just has to be that way.
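The tallying above can be sketched in a few lines of Python. The function name and the 30,000-digit comparison run are my own illustration, not part of the broadcast; the first call reproduces the table for 12321232123212321, and the second plays the role of the die.

```python
# Tally consecutive digit pairs: rows are the first digit of each pair,
# columns the second, for digits drawn from {1, 2, 3}.

import random

def pair_table(digits):
    """Return a 3x3 count of pairs (digits[i], digits[i+1])."""
    table = [[0] * 3 for _ in range(3)]
    for a, b in zip(digits, digits[1:]):
        table[int(a) - 1][int(b) - 1] += 1
    return table

print(pair_table("12321232123212321"))
# -> [[0, 4, 0], [4, 0, 4], [0, 4, 0]]  (the patterned table above)

random.seed(0)
big = "".join(random.choice("123") for _ in range(30000))
# With a true random source, all nine counts come out nearly equal,
# diagonal (repeat) pairs included.
print(pair_table(big))
```

Humans avoid the diagonal; the pseudorandom generator does not.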
I can think of another test. Often we see patterns because of something called [confirmation bias]. The test is called the [Wason Card Experiment]. Read the mini-lesson for yourself.
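The logic of the classic version of that task can be sketched in Python. The card faces here are the textbook example, not necessarily the ones on the linked page, and the function is my own illustration: the rule is "if a card has a vowel on one side, it has an even number on the other," and you should flip only the cards whose hidden side could falsify the rule.

```python
# Wason selection task (classic textbook version): cards show A, K, 4, 7,
# each with a letter on one side and a number on the other. Which cards
# must be flipped to test "vowel on one side implies even number on the other"?

VOWELS = set("AEIOU")

def must_flip(face):
    """A card matters only if its hidden side might violate the rule:
    a visible vowel (hidden number could be odd) or a visible odd number
    (hidden letter could be a vowel). K and even numbers can never
    falsify the rule, so flipping them tells you nothing."""
    if face.isalpha():
        return face in VOWELS
    return int(face) % 2 == 1

cards = ["A", "K", "4", "7"]
print([c for c in cards if must_flip(c)])
```

Most people pick A and 4, seeking confirmation; the correct answer is A and 7, the only cards that could disprove the rule.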
I can think of a third test. Get yourself a sheet of paper. Using a dark pen, slap it on the paper in random spots to make dots. When you've had enough, find yourself a transparency maker, the kind used in schools for the overhead projector. Make two copies of your dots. Now, put them both on a table or the overhead projector, exactly aligned. Next, use a needle or pencil tip to poke a hole right in the very center. Then, keeping that tip or needle in the center, rotate the top sheet.
What do you notice? All the dots somehow manage to form concentric rings around the center. Why? The problem is that we tend to think random events will be evenly distributed spatially. True randomness still produces clusters in various regions of your copies, and the rings will form again if you repeat the trick around a different center point.
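Part of why the rings appear is plain geometry, which can be checked in a few lines of Python (a sketch of the geometry, not the optical demonstration itself; the point and dot values are my own invented example): rotating the copy about the pin keeps every dot at its original distance from the pin, so each dot and its rotated twin sit on the same circle around that center, and the eye links the close-together pairs into arcs.

```python
# Rotating a copy of a random dot pattern about a pin preserves each dot's
# distance from the pin, so every dot/twin pair lies on one circle about it.

import math
import random

def rotate(point, center, angle):
    """Rotate a 2-D point about a center by `angle` radians."""
    x, y = point[0] - center[0], point[1] - center[1]
    c, s = math.cos(angle), math.sin(angle)
    return (center[0] + c * x - s * y, center[1] + s * x + c * y)

random.seed(1)
center = (0.5, 0.5)
dots = [(random.random(), random.random()) for _ in range(100)]
for d in dots:
    r_before = math.dist(d, center)
    r_after = math.dist(rotate(d, center, math.radians(5)), center)
    # Twin stays on the same circle about the pin, whatever the dot.
    assert abs(r_before - r_after) < 1e-9
print("every dot and its rotated twin share a circle about the center")
```

Pick a different pin and the same argument holds around that point instead, which is why the illusion does not depend on where you poke the hole.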
In the future, do not be fooled by what your eyes tell you. Contrary to the mantra of Creationists, the eye is not a perfect thing that could not have evolved. It is imperfect (many of us need glasses, duh), and it is probably still evolving. The same goes for our brains.
