A sweep of their tiny fingers

My research into the state of child fingerprinting has led me to this extreme video – you will want to download it.  Then let's look further at the technical issues behind fingerprinting.

Here is a diagram showing how “templates” are created from biometric information in conventional fingerprint systems.  It shows the level of informed discourse emerging on activist sites such as LeaveThemKidsAlone.com, which is dedicated to explaining and opposing child fingerprinting in Britain.

Except in the most invasive systems, the fingerprint is not stored – rather, a “function” of the fingerprint is used.  The function is normally “one-way”, meaning you can create the template from the fingerprint by using the correct algorithm, but cannot reconstitute the fingerprint from the template.
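
To make this concrete, here is a rough sketch in Python of what the one-way step looks like. The feature extraction, quantization steps and data shapes are purely illustrative assumptions of mine, not any vendor's actual algorithm; the point is simply that the template keeps only coarse derived features, so the image cannot be rebuilt from it.

```python
# Toy illustration only. Real systems extract minutiae from the scanned image
# with specialised detectors; here the "minutiae" are supplied directly so the
# sketch stays self-contained. The template keeps only coarse, quantized
# features, so the original fingerprint image cannot be rebuilt from it.

from typing import List, Tuple

Minutia = Tuple[float, float, float]   # (x, y, ridge angle in degrees)
Template = List[Tuple[int, int, int]]  # quantized, order-independent features

def make_template(minutiae: List[Minutia]) -> Template:
    """One-way step: detailed measurements in, coarse template out."""
    quantized = {(int(x // 8), int(y // 8), int(angle // 20))
                 for x, y, angle in minutiae}
    return sorted(quantized)

scan = [(101.0, 55.0, 31.0), (180.0, 92.0, 143.0), (44.0, 201.0, 77.0)]
print(make_template(scan))   # [(5, 25, 3), (12, 6, 1), (22, 11, 7)]
```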

The template is associated with some real-world individual (Criminal?  Student?).  During matching, the fingerprint reader again applies the one-way function to the fingerprint image, producing a blob of data that is compared to the stored template and accepted as a match within some tolerance.  Because of the tolerance issue, in most systems the template doesn't behave like a “key” that can simply be looked up in a table.  Instead, the matching software is run against a series of templates, and calculations are performed in search of a match.
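
Here is an equally rough sketch of why this is not a simple table lookup. The scoring rule and threshold below are invented for illustration; real matchers are far more sophisticated, but the shape is the same: the fresh template is scored against every enrolled template, and a match is declared when the score clears a tolerance.

```python
# Sketch of tolerance-based matching. The scoring rule and threshold are
# invented for illustration, not any vendor's algorithm. Note the linear scan:
# the fresh template is scored against every enrolled template, rather than
# looked up exactly like a key.

from typing import Dict, List, Optional, Tuple

Template = List[Tuple[int, int, int]]

def similarity(a: Template, b: Template) -> float:
    """Fraction of quantized features the two templates share."""
    shared = len(set(a) & set(b))
    return shared / max(len(a), len(b), 1)

def identify(fresh: Template, enrolled: Dict[str, Template],
             threshold: float = 0.6) -> Optional[str]:
    """Return the best-scoring enrolled identity, if any clears the tolerance."""
    best_id, best_score = None, 0.0
    for person, stored in enrolled.items():
        score = similarity(fresh, stored)
        if score > best_score:
            best_id, best_score = person, score
    return best_id if best_score >= threshold else None

enrolled = {
    "student-17": [(5, 25, 3), (12, 6, 1), (22, 11, 7)],
    "student-42": [(3, 14, 2), (9, 20, 5), (30, 4, 6)],
}
fresh_scan = [(5, 25, 3), (12, 6, 1), (22, 11, 8)]   # one feature slightly off
print(identify(fresh_scan, enrolled))                 # -> student-17
```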

If the raw image of the fingerprint were stored rather than a template, and someone were to gain access to the database, the raw image could be harnessed to create a “gummy bear” finger that could potentially leave fake prints at the scene of a crime – or be applied to fingerprint sensors.

Further, authorities with access to the data could also apply new algorithms to the image, and thus locate matches against emerging template systems not in use at the time the database was created.  For both these reasons, it is considered safer to store a template than the actual biometric data.

But matching a print to a person remains possible for as long as the template data is kept and the algorithm is known.  With the negligible cost of storage, this could clearly extend throughout the whole lifetime of a child.  LeaveThemKidsAlone quotes Brian Drury, an IT security consultant who makes a nice point about the potential tyranny of the algorithm:

“If a child has never touched a fingerprint scanner, there is zero probability of being incorrectly investigated for a crime. Once a child has touched a scanner they will be at the mercy of the matching algorithm for the rest of their lives.” (12th March 2007 – read more from Brian Drury)
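
His point can be made concrete with some back-of-the-envelope arithmetic. The numbers below are purely illustrative assumptions, not measured error rates for any real system, but they show the shape of the problem: even a small per-comparison false match rate, multiplied by decades of searches against a retained template, adds up.

```python
# Back-of-the-envelope arithmetic with made-up numbers. The false match rate,
# search volume and retention period are assumptions chosen only to show the
# shape of the argument, not measured figures for any real system.

false_match_rate = 1e-5     # assumed chance one comparison wrongly matches
searches_per_year = 200     # assumed searches that sweep the whole database
years_retained = 60         # template kept for the rest of a child's life

comparisons = searches_per_year * years_retained
p_clean = (1 - false_match_rate) ** comparisons
print(f"Chance of at least one false match over a lifetime: {1 - p_clean:.1%}")
# With these assumed numbers: about 11%. A child never enrolled faces 0%.
```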

So it is disturbing to read statements like the following by Mitch Johns, President and Founder of Food Service Solutions – whose company sells the system featured in the full Fox News video referenced above:

When school lunch biometric systems like FSS’s are numerically-based and discard the actual fingerprint image, they cannot be used for any purpose other than recognizing a student within a registered group of students. Since there’s no stored fingerprint image, the data is useless to law enforcement, which requires actual fingerprint images.

Mitch, this just isn't true.  I hope your statement is the product of not having thought through the potential uses that could be made of templates.  I can understand the mistake – as technologists, we often don't think of the evil uses our systems might be put to.  But I hope you'll start explaining what the risks really are.  Or, better still, consider replacing this product with one based on more mature technology, exposing children and schools to less long-term danger and liability.

Published by Kim Cameron

5 thoughts on “A sweep of their tiny fingers”

  1. Hi,

    Just wanted to thank you for lending some credible weight to the case against the absurdity of what is going on here in the UK. Brian Drury is of course entirely correct and points out exactly what people don't realise.

    So often I hear the argument ‘If you have nothing to hide, you've nothing to fear’. Brian's point shows that this just isn't true. If you've got nothing to hide, the only thing you need to fear is that which is virtually guaranteed – Mistakes.

    Even more than the criminals, every innocent person should fear the possibility that such a mistake will affect them. There is little comfort in the ‘greater good’ of the system when you're in prison for a crime you did not commit.

    R

  2. We, the people, have no clue. Today’s headline in The Netherlands: the help desk of what would be the IRS in the USA has been telling people to upload their tax forms with the DigiD of their neighbor. What?!

    In 2003 the Dutch government created a digital identity provider called DigiD. Essentially, DigiD is a register that links your social security number to a private password. DigiD is to be used for identification and authentication with all digital public services. The taxman is one of them. Starting this year, electronic tax forms must be uploaded using DigiD to authenticate the source of the transfer. Apparently, DigiD is not used to sign the tax form itself. As is to be expected, some people had problems using their DigiD. They lost their password or were late in enrolling – you get papers sent in the physical mail as part of the enrolment process. The tax deadline was looming. People calling in to the taxman's help desk were told to use, indeed, the DigiD of a neighbor or friend.

    This is not about a stupid bureaucracy; it is just another indication that learning to properly deal with digital identities may well take a generation.
