It Has To Be Believed To Be Seen

I’m reading this and that about biometrics, and I have a quick question – fingerprints, along with every other biometric method of identification I’ve read about (face recognition, retina scanning, and so forth), are declared to be unique identifiers; you, and only you, have fingerprints, retinas or a face that looks like that.

Well, says who?

That’s it, really. I’m hardly versed in the science, but I can’t find the bit where it says “yeah, we took several thousand people’s fingerprints, and several thousand other people’s fingerprints, and we checked partials and stuff, and yeah – it turns out they’re actually unique identifiers.” Because, apparently, that information doesn’t exist.

I realise that I’m not qualified to make broad, sweeping generalizations about an entire field, here, but this appears to be true of the entirety of the forensic biometrics field at the moment.

It seems to me that if we’re going to put people in jail for some length of time and, in some places in the world, to death, we might want to apply some kind of empiricism to the situation.

9 Comments

  1. Posted April 17, 2004 at 6:55 pm | Permalink

    Historically, fingerprints were not empirically validated until recently, and when they were tested they were found wanting. The latest and greatest DNA matching has been checked out more thoroughly, partly because there was an incumbent. I’m not sure about the other body parts. Another issue is that if someone compromises your biometric ID it is hard to revoke it. It’s a fringe issue, but it should be considered.

  2. Mike Hoye
    Posted April 18, 2004 at 1:01 am | Permalink

    See, that’s what I’d like to know – who did the research, what’s the paper called, what journal was it published in.

    You’d think that this would be easier to find.

  3. Posted April 18, 2004 at 1:08 pm | Permalink

    I think it depends on what you’re using them for. Typically the fingerprint is not used as a unique identifier, but as a validation mechanism for a known identity. In a legal setting, it’s 2+ factor identification, and I don’t think anyone’s ever made the claim it’s bulletproof.

    Typically the courts need to establish opportunity and motive as well as the identity of someone. Having a fingerprint or a piece of DNA assists in identifying the individual, and is often a key piece of evidence, but you also have to prove or at least outline a plausible reason behind why it was that person who did it.

    I think a much better analogy for a fingerprint is a hash. It’s a signature that when used against a very specific object is an effective means of validating that object. There is the possibility that you will get the same hash with a completely different object, but the odds are against it. It’s just a piece of the evidence chain, but an important one because with a little extra data the probability of them not being unique to an individual object is pretty slim.

    Using fingerprints with single-factor verification is silly. My digitalPersona scanner does this, and while it’s convenient, I wouldn’t want to use it as “the” authority in determining that it is really my finger (they can be fooled with images/latex). Use it as a piece of multi-factor authentication (i.e. fingerprint plus user password or PIN), and it’s much more effective.

    It would still be possible, but very, very improbable, that you’d get two individuals with the same print in the same place at the same time involved with a crime. It’s a lot more probable that you’d find people looking to defeat an authentication system using prints, which is why you have to make them multi-factor. That’s just me, though.
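    The hash analogy above can be made concrete with a back-of-the-envelope calculation. Treating a print as an idealized n-bit hash (a simplifying assumption – real prints aren’t uniformly random bit strings, and all the numbers below are illustrative, not measured error rates), the standard birthday-bound approximation gives the odds of two objects sharing a signature:

    ```python
    # Birthday-bound sketch: probability of at least one collision
    # among k randomly hashed items in a space of 2**n_bits values.
    # Illustrative only -- real fingerprint matching doesn't behave
    # like an ideal hash, which is rather the point of the post.
    import math

    def collision_probability(n_bits: int, k: int) -> float:
        """Approximate P(at least one collision) among k items,
        via 1 - exp(-k*(k-1) / (2 * 2**n_bits))."""
        space = 2.0 ** n_bits
        return 1.0 - math.exp(-k * (k - 1) / (2.0 * space))

    # A 32-bit "signature" looks unique in a small group, but across
    # a large population a collision becomes the likely outcome:
    print(collision_probability(32, 100))       # tiny
    print(collision_probability(32, 100_000))   # roughly 0.69
    ```

    The takeaway is that “the odds are against it” depends entirely on how many items you compare – which is exactly the number nobody seems to have published for prints.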

  4. Mike Hoye
    Posted April 18, 2004 at 11:30 pm | Permalink

    “It would still be possible, but very, very improbable, that you’d get two individuals with the same print in the same place at the same time involved with a crime.”

    That’s my question. How do we know that? How do we know how improbable it is, how do we know anything at all about the likelihood of two fingerprints matching, or of the more common partial or smudged fingerprints matching a given sample?

    Point me to that paper, to those numbers, and then we can talk about the rest of it. Show me where those probabilities and error rates have been evaluated, and then we can move on from there. I don’t care how they’re being used, or how they can be deliberately falsified. What I care about is whether or not the entire “science” is actually a science at all, or just an institutionally-sanctioned superstition.
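    For what it’s worth, even a published false-match rate wouldn’t settle things on its own; it has to be weighed against the size of the database being searched. A hypothetical Bayes-rule sketch (the error rate and database size here are invented, since, as noted, the real numbers don’t seem to exist):

    ```python
    # Base-rate sketch: given an ASSUMED false-match rate and a
    # database of N prints containing exactly one true source,
    # how often is a reported "match" actually the source?
    # (Assumes the examiner always matches the true source.)

    def p_source_given_match(false_match_rate: float, db_size: int) -> float:
        """P(suspect is the true source | a match is reported)."""
        # One true positive versus the expected number of false
        # positives scattered across the rest of the database.
        expected_false = false_match_rate * (db_size - 1)
        return 1.0 / (1.0 + expected_false)

    # Even a 1-in-100,000 error rate looks weak against a
    # ten-million-print file:
    print(p_source_given_match(1e-5, 10_000_000))  # about 0.01
    ```

    Which is to say: without the error-rate numbers, “a match” tells you almost nothing about how much the match is worth.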

  5. Melanie
    Posted April 19, 2004 at 3:02 pm | Permalink

    My personal knowledge about fingerprint ID is limited to what I know from watching Law & Order, CSI, and other such educational television. From them, I gather that fingerprint ID is more of an art than a science. I know of no numbers associated with it.

    As for DNA identification, I’m pretty sure there’s some serious research about its accuracy and reliability at separating out individuals, but I couldn’t tell you where to find it. I assume you’ve tried the basic forensics journals?

  6. Mike Hoye
    Posted April 19, 2004 at 5:49 pm | Permalink

    From the article:

    [....] fingerprint examiners occupy a privileged position not enjoyed by most experts. They routinely testify that a print left at the scene of a crime is a definite match to a suspect, with no possibility of error. Indeed the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), which approves standards for fingerprint analysis techniques in North America, has stated: “[Fingerprint] identifications are absolute conclusions. Probable, possible or likely identification are outside the acceptable limits of the science.”

    Which means that fingerprint analysis is, using the correct technical term, absolute bullshit.

    Indeed, the SWGFAST website is creepily devoid of actual academic references, citing, among other things, “a century of experience”. Better yet, their Standards For Conclusions says that there’s “…no scientific basis for requiring that a predetermined number of corresponding friction ridge details be present” to uniquely identify a print, but asserts nevertheless that the examiner must be “competent”, which is elsewhere defined as having a high school diploma, four years of experience and a background check.

    The relevant quote from the Register article sums it up nicely: “It would surely be just a little bit embarrassing if a few years down the line governments’ deployment of fingerprints in the war on terror resulted in the near overthrow of the criminal justice system, wouldn’t it?”

  7. Novak
    Posted April 19, 2004 at 10:46 pm | Permalink

    Historically, the first research was done in the late 19th and early 20th centuries, so the research is crufty.

    You can’t really do an empirical comparison of hundreds of thousands of fingerprints against each other, though, until the process has been automated, which hasn’t been possible until very recently. So, it’s ongoing, and thus, fingerprints haven’t been considered ironclad.

    That identical twins (as far as I know) have unique fingerprints is a pretty good layman’s argument, though.

    I know NIST does this sort of work, using the FBI collection as a database.

  8. Mike Hoye
    Posted April 19, 2004 at 11:10 pm | Permalink

    Your assertion that fingerprints haven’t been considered ironclad is contradicted by the aforementioned SWGFAST claim that fingerprint identifications are absolute conclusions.

    Further, from the NIST Biometrics Factsheet:

    In conjunction with the FBI, NIST has developed several databases, including one consisting of 258 latent fingerprints and their matching “rolled” file prints.

    Awesome.

  9. Mike Hoye
    Posted April 20, 2004 at 12:02 am | Permalink

    Wow, there’s lots of stuff here.