Lazy headmasters versus the Laws of Identity

Ray Corrigan routinely combines legal and technological insight at B2fxxx – Random thoughts on law, the Internet and society, and his book on Digital Decision Making is essential.  His work often leaves me feeling uncharacteristically optimistic – living proof that a new kind of legal thinker is emerging with the technological depth needed to be a modern-day Solomon.

I hadn't noticed the UK's new Protection of Freedoms Bill until I heard cabinet minister Damian Green talk about it as he pulverized the UK's centralized identity database recently.  Naturally I turned to Ray Corrigan for comment, only to discover that the political housecleaning had also swept away the assumptions behind widespread fingerprinting in Britain's schools, reinstating user control and consent. 

According to TES Connect:

The new Protection of Freedoms Bill gives pupils in schools and colleges the right to refuse to give their biometric data and compels schools to make alternative provision for them.  The several thousand schools that already use the technology will also have to ask permission from parents retrospectively, even if their systems have been established for years…

It turns out that Britain's headmasters, apparently now a lazy bunch, have little stomach for trivialities like civil liberties.  And writing about this, Ray's tone seems that of a judge who has had an impetuous and over-the-top barrister try to bend the rules one too many times.  It is satisfying to see Ray send them home to study the Laws of Identity as scientific laws governing identity systems.   I hope they catch up on their homework…

The Association of School and College Leaders (ASCL) is reportedly opposing the controls on school fingerprinting proposed in the UK coalition government's Protection of Freedoms Bill.

I always understood the reason that unions existed was to protect the rights of individuals. That ASCL should give what they perceive to be their own members’ managerial convenience priority over the civil rights of kids should make them thoroughly ashamed of themselves.  Oh dear – now head teachers are going to have to fill in a few forms before they abuse children's fundamental right to privacy – how terrible.

Although headteachers and governors at schools deploying these systems may be typically ‘happy that this does not contravene the Data Protection Act’, a number of leading barristers have stated that the use of such systems in schools may be illegal on several grounds. As far back as 2006 Stephen Groesz, a partner at Bindmans in London, was advising:

“Absent a specific power allowing schools to fingerprint, I'd say they have no power to do it. The notion you can do it because it's a neat way of keeping track of books doesn't cut it as a justification.”

The recent decisions of the European Court of Human Rights in cases like S. and Marper v UK (2008 – retention of DNA and fingerprints) and Gillan and Quinton v UK (2010 – s44 police stop and search) mean schools have to be increasingly careful about the use of such systems anyway. Not that most schools would know that.

Again, the question of whether kids should be fingerprinted to get access to books and school meals is not even a hard one! Such systems flatly violate Kim Cameron's first four Laws of Identity.

1. User control and consent – many schools don't ask for consent, child or parental, and don't provide simple opt out options

2. Minimum disclosure for constrained use – the information collected, children's unique biometrics, is disproportionate for the stated use

3. Justifiable parties – the information is in control of or at least accessible by parties who have absolutely no right to it

4. Directed identity – a unique, irrevocable, omnidirectional identifier is being used when a simple unidirectional identifier (eg lunch ticket or library card) would more than adequately do the job.

It's irrelevant how much schools have invested in such systems, how convenient school administrators find them, or that the Information Commissioner's Office soft-pedalled its advice on the matter (in 2008) in relation to the Data Protection Act.  They should all be scrapped, and if the need for schools to wade through a few more forms before they use these systems causes them to be scrapped, then that's a good outcome from my perspective.

In addition, school fingerprint vendors have conned schools into parting with ridiculous sums of money (in school budget terms) to install these systems, with promises that they are not really storing fingerprints and that the prints cannot be recreated.  In fact, there is no doubt that it is possible to recreate the image of a fingerprint from data stored on such systems. Ross, A. et al, ‘From Template to Image: Reconstructing Fingerprints from Minutiae Points’, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 29, No. 4, April 2007 is just one example of how university researchers have reverse engineered these systems. The warning caveat emptor applies emphatically to digital technology systems that buyers don't understand, especially when it comes to undermining the civil liberties of our younger generation.

Touch2Id Testimonials

Last summer I wrote about the British outfit called touch2id.  They had developed a system that sounded pretty horrible when I first heard about it – a scheme to control underage drinking by using people's fingerprints rather than getting them to present identity cards.  I assumed it would be another of the harebrained biometric schemes I had come across in the past – like this one, or this, or these.

But no.  The approach was completely different.  Not only was the system popular with its early adopters, but its developers had really thought through the privacy issues.   There was no database of fingerprints, no record linking a fingerprint to a natural person.  The system was truly one of “minimal disclosure” and privacy by design:

  • To register, people presented their ID documents and, once verified, a template of their fingerprint was stored on a Touch2Id card that was immediately given to them.  The fingerprint was NOT stored in a database
  • When people with the cards wanted to have a drink, they would wave their card over a machine similar to a credit card reader, and press their finger on the machine.  If their finger matched the template on their card, the light came on indicating they were of drinking age and they could be served.

A single claim:  “Able to drink”.  Here we had well designed technology offering an experience that the people using it liked way better than the current “carding” process – and which was much more protective of their privacy.  “Privacy by design” was delivering tangible benefits.  Merchants didn’t have to worry about making mistakes.  Young people didn’t have to worry about being discriminated against (or being embarrassed) just because they “looked young” or got a haircut.  No identifying information was being released to the merchants.  No name, age or photo was stored on the cards.  The movements of young people were not tracked.  And so on.

Today touch2id published Testimonials – an impressive summary of their project consisting of reviews by individuals involved.  It is clear that those who liked it loved it.  It would be interesting to find out to what extent these rave reviews are typical of those who tried the system.

At any rate, it's instructive to compare the positive outcome of this pilot with all the biometric proposals that have crashed onto the shoals of privacy invasion.

Stephan Engberg on Touch2ID

Stephan Engberg is a member of the Strategic Advisory Board of the EU ICT Security & Dependability Taskforce and an innovator in reconciling security requirements in both ambient and integrated digital networks. I thought readers would benefit from comments he circulated in response to my posting on Touch2Id.

Kim Cameron's comments on Touch2Id – and especially the way PI is used – make me want to see more discussion about the definition of privacy and the approaches that can be taken in creating such a definition.

To me Touch2Id is a disaster – teaching kids to offer their fingerprints to strangers is not compatible with my understanding of democracy or of what constitutes the basis of a free society. The claim that data is “not collected” is absurd and represents outdated legal thinking.  Biometric data gets collected even though it shouldn't be, and such collection is entirely unnecessary given the PET solutions to this problem that exist, e.g. chip-on-card.

In my book, Touch2Id did not do the work to deserve a positive privacy appraisal.

Touch2Id, in using blinded signatures, is a much better solution than, for example, a PKI-based solution would be.  But this does not change the fact that biometrics are getting collected where they shouldn't be.
To me Touch2Id therefore remains a strong invasion of privacy – because it teaches kids to accept biometric interactions that are outside their control. Trusting a reader is not an option.

My concern is not so much in discussing the specific solution as reaching some agreement on the use of words and what is acceptable in terms of use of words and definitions.

We all understand that there are different approaches possible given different levels of pragmatism and focus. In reality we have our different approaches because of a number of variables:  the country we live in, our experiences and especially our core competencies and fields of expertise.

Many do good work from different angles – improving regulation, inventing technologies, debating, pointing out major threats etc. etc.

No criticism – only appraisal

Some try to avoid compromises – often at great cost as it is hard to overcome many legacy and interest barriers.  At the same time the stakes are rising rapidly:  reports of spyware are increasingly universal. Further, some try to avoid compromises out of fear or on the principle that governments are “dangerous”.

Some people think I am rather uncompromising and driven by idealist principles (or whatever words people use to do character assassination of those who speak inconvenient truths).  But those who know me are also surprised – and to some extent find it hard to believe – that this is due largely to considerations of economics and security rather than privacy and principle.

Consider the example of Touch2Id.  The fact that it is NON-INTEROPERABLE is even worse than the fact that biometrics are being collected, since it means you simply cannot create a PET solution using the technology interfaces.  It is not open, but closed to innovations and security upgrades.  There is only external verification of biometrics or nothing – and as such no PET model can be applied.  My criticism of Touch2Id is fully in line with the work on security research roadmapping prior to the EU's large FP7 research programme (see pg. 14 on private biometrics and biometric encryption – both chip-on-card).

Some might remember the discussion at the 2003 EU PET Workshop in Brussels where there were strong objections to the “inflation of terms”.  In particular, there was much agreement that the term Privacy Enhancing Technology should only be applied to non-compromising solutions.  Even within the category of “non-compromising” there are differences.  For example, do we require absolute anonymity, or can PETs be created through specific built-in countermeasures such as anti-counterfeiting through self-incrimination in Digital Cash, or some sort of tightly controlled Escrow (Conditional Identification) in cases such as that of non-payment in an otherwise pseudonymous contract (see here)?

I tried to raise the same issue last year in Brussels.

The main point here is that we need a vocabulary that does not allow for inflation – a vocabulary that is not infected by someone's interest in claiming “trust” or overselling an issue. 

And we first and foremost need to stop – or at least address – the tendency of the bad guys to steal the terms for marketing or propaganda purposes.  Around National Id and Identity Cards this theft has been a constant – for example, the term “User-centric Identity” has been turned upside down and today, in many contexts, means “servers focusing on profiling and managing your identity.”

The latest examples of this are the exclusive and centralist European eID model and the IdP-centric identity models recently proposed by the US, which are neither technologically interoperable, security-enhancing, nor privacy-enhancing. These models represent the latest in democratic and free-market failure.

My point is not so much to define policy, but rather to respect the fact that different policies at different levels cannot happen unless we have a clear vocabulary that avoids inflation of terms.

Strong PETs must be applied to ensure principles such as net neutrality, demand-side controls and semantic interoperability.  If they aren't, I am personally convinced that within 20 or 30 years we will no longer have anything resembling democracy – and economic crises will worsen due to Command & Control inefficiencies and anti-innovation initiatives.

In my view, democracy as a construct is failing due to the rapid deterioration of fundamental rights and requirements of citizen-centric structures.  I see no alternative but to try to get it back on track through strong empowerment of citizens – however non-informed one might think the “masses” are – which depends on propagating the notion that you CAN be in control or “Empowered” in the many possible meanings of the term.

When I began to think about Touch2Id it did of course occur to me that it would be possible for operators of the system to secretly retain a copy of the fingerprints and the information gleaned from the proof-of-age identity documents – in other words, to use the system in a deceptive way.  I saw this as being something that could be mitigated by introducing the requirement for auditing of the system by independent parties who act in the privacy interests of citizens.

It also occurred to me that it would be better, other things being equal, to use an on-card fingerprint sensor.  But is this a practical requirement given that it would still be possible to use the system in a deceptive way?  Let me explain.

Each card could, unbeknownst to anyone, be imprinted with an identifier and the identity documents could be surreptitiously captured and recorded.  Further, a card with the capability of doing fingerprint recognition could easily contain a wireless transmitter.  How would anyone be certain a card wasn't capable of surreptitiously transmitting the fingerprint it senses or the identifier imprinted on it through a passive wireless connection? 

Only through audit of every technical component and all the human processes associated with them.

So we need to ask, what are the respective roles of auditability and technology in providing privacy enhancing solutions?

Does it make sense to kill schemes like Touch2ID even though they are, as Stephan says, better than other alternatives?   Or is it better to put the proper auditing processes in place, show that the technology benefits its users, and continue to evolve the technology based on these successes?

None of this is to dismiss the importance of Stephan's arguments – the discussion he calls for is absolutely required and I certainly welcome it. 

I'm sure he and I agree we need systematic threat analysis combined with analysis of the possible mitigations, and we need to evolve a process for evaluating these things which is rigorous and can withstand deep scrutiny. 

I am also struck by Stephan's explanation of the relationship between interoperability and the ability to upgrade and uplevel privacy through PETs, as well as the interesting references he provides. 

Doing it right: Touch2Id

And now for something refreshingly different:  an innovative company that is doing identity right. 

I'm talking about a British outfit called Touch2Id.  Their concept is really simple.  They offer young people a smart card that can be used to prove they are old enough to drink alcohol.  The technology is now well beyond the “proof of concept” phase – in fact its use in Wiltshire, England is being expanded based on its initial success.

  • To register, people present their ID documents and, once verified, a template of their fingerprint is stored on a Touch2Id card that is immediately given to them. 
  • When they go to a bar, they wave their card over a machine similar to a credit card reader, and press their finger on the machine.  If their finger matches the template on their card, the lights come on and they can walk on in.

   What's great here is:

  • Merchants don't have to worry about making mistakes.  The age vetting process is stringent and fake IDs are weeded out by experts.
  • Young people don't have to worry about being discriminated against (or being embarrassed) just because they “look young”
  • No identifying information is released to the merchant.  No name, age or photo appears on (or is stored on) the card.
  • The movements of the young person are not tracked.
  • There is no central database assembled that contains the fingerprints of innocent people
  • The fingerprint template remains the property of the person with the fingerprint – there is no privacy issue or security honeypot.
  • Kids cannot lend their card to a friend – the friend's finger would not match the fingerprint template.
  • If the card is lost or stolen, it won't work any more
  • The templates on the card are digitally signed and can't be tampered with
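The last property – that the templates on the card are signed and tamper-evident – is worth a small sketch. A real card would carry an asymmetric signature from the issuer; the HMAC below is merely a stand-in to show the shape of the check, and the key and function names are invented for illustration:

```python
# Illustrative tamper-check on a stored template. An HMAC stands in for the
# issuer's digital signature; key and names are hypothetical.

import hashlib
import hmac

ISSUER_KEY = b"issuer-signing-key"  # held by the card issuer, not the merchant

def sign_template(template: bytes) -> bytes:
    """Issuer signs the template when the card is written at enrolment."""
    return hmac.new(ISSUER_KEY, template, hashlib.sha256).digest()

def template_is_genuine(template: bytes, signature: bytes) -> bool:
    """Reader checks the signature before trusting the template on the card."""
    return hmac.compare_digest(sign_template(template), signature)

template = b"ridge-pattern-123"
sig = sign_template(template)
print(template_is_genuine(template, sig))             # untampered card passes
print(template_is_genuine(b"forged-template!!", sig)) # altered template is rejected
```

A swapped-in or edited template fails the check, so a stolen or doctored card cannot be made to vouch for a different finger.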

I met the man behind Touch2Id, Giles Sergant, at the recent EEMA meeting in London.

Being a skeptic versed in the (mis)use of biometrics in identity – especially the fingerprinting of our kids – I was initially more than skeptical.

But Giles has done his homework (even auditing the course given by privacy experts Gus Hosein and Simon Davies at the London School of Economics).  The better I understood the approach he has taken, the more impressed I was.

Eventually I even agreed to enroll so as to get a feeling for what the experience was like.  The verdict:  amazing.  It's a lovely piece of minimalistic engineering, with no unnecessary moving parts or ugly underbelly.  If I look strangely euphoric in the photo that was taken, it is because I was thoroughly surprised by seeing something so good.

Since then, Giles has already added an alternate form factor – an NFC sticker people can put on their mobile phone so they don't actually need to carry around an additional artifact.  It will be fascinating to watch how young people respond to this initiative, which Giles is trying to grow from the bottom up.  More info on the Facebook page.

Fingerprint charade

I got a new Toshiba Portege a few weeks ago, the first machine I've owned that came with a fingerprint sensor.   At first the system seemed to have been designed in a sensible way.  The fingerprint template is encrypted and stays local.  It is never released or stored in a remote database.  I decided to try it out – to experience what it “felt like”.

A couple of days later, I was at a conference and on stage under pretty bright lights.  Glancing down at my shiny new computer, I saw what looked unmistakably like a fingerprint on my laptop's right mouse button.  Then it occurred to me that the fingerprint sensor was only a quarter of an inch from what seemed to be a perfect image of my fingerprint.  How secure is that?

A while later I ran into  Dale Olds from Novell.  Since Dale's an amazing photographer, I asked if he would photograph the laptop to see if the fingerprint was actually usable.  Within a few seconds he took the picture above. 

When Dale actually sent me the photo, he said,

I have attached a slightly edited version of the photo that showed your fingerprint most clearly. In fact, it is so clear I am wondering whether you want to publish it. The original photos were in Olympus raw format. Please let me know if this version works for you.

Eee Gads.  I opened up the photo in Paint and saw something along these lines:

The gold blotch wasn't actually there.  I added it as a kind of fig-leaf before posting it here, since it covers the very clearest part of the fingerprint. 

The net of all of this was to drive home, yet again, just how silly it is to use a “public” secret as a proof of identity.  The fact that I can somehow “demonstrate knowledge” of a given fingerprint means nothing.  Identification is only possible by physically verifying that my finger embodies the fingerprint.  Without physical verification, what kind of a lock does the fingerprint reader provide?  A lock which conveniently offers every thief the key.

At first my mind boggled at the fact that Toshiba would supply mouse buttons that were such excellent fingerprint collection devices.  But then I realized that even if the fingerprint weren't conveniently stored on the mouse button, it would be easy to find it somewhere on the laptop's surface.

It hit me that in the age of digital photography, a properly motivated photographer could probably find fingerprints on all kinds of surfaces, and capture them as expertly as Dale did.  I realized it was no longer necessary to use special powder or inks or tape or whatever.  Fingerprints have become a thing of “sousveillance”.

Chaos computer club gives us the German phish finger

If you missed this article in The Register, you missed the most instructive story to date about applied biometrics:  

A hacker club has published what it says is the fingerprint of Wolfgang Schauble, Germany's interior minister and a staunch supporter of the collection of citizens’ unique physical characteristics as a means of preventing terrorism.

In the most recent issue of Die Datenschleuder, the Chaos Computer Club printed the image on a plastic foil that leaves fingerprints when it is pressed against biometric readers…

Last two pages of magazine issue, showing article and including plastic film containing Schauble's fingerprint

“The whole research has always been inspired by showing how insecure biometrics are, especially a biometric that you leave all over the place,” said Karsten Nohl, a colleague of an amateur researcher going by the moniker Starbug, who engineered the hack. “It's basically like leaving the password to your computer everywhere you go without you being able to control it anymore.” … 

A water glass 

Schauble's fingerprint was captured off a water glass he used last summer while participating in a discussion celebrating the opening of a religious studies department at the University of Humboldt in Berlin. The print came from an index finger, most likely the right one, Starbug believes, because Schauble is right-handed.

The print is included in more than 4,000 copies of the latest issue of the magazine, which is published by the CCC. The image is printed two ways: one using traditional ink on paper, and the other on a film of flexible rubber that contains partially dried glue. The latter medium can be covertly affixed to a person's finger and used to leave an individual's prints on doors, telephones or biometric readers…

Schauble is a big proponent of using fingerprints and other unique characteristics to identify individuals.

“Each individual’s fingerprints are unique,” he is quoted as saying in this official interior department press release announcing a new electronic passport that stores individuals’ fingerprints on an RFID chip. “This technology will help us keep one step ahead of criminals. With the new passport, it is possible to conduct biometric checks, which will also prevent authentic passports from being misused by unauthorized persons who happen to look like the person in the passport photo.”

The magazine is calling on readers to collect the prints of other German officials, including Chancellor Angela Merkel, Bavarian Prime Minister Guenther Beckstein and BKA President Joerg Ziercke.

“The thing I like a lot is the political activism of the hack,” said Bruce Schneier, who is chief security technology officer for BT and an expert on online authentication. Fingerprint readers were long ago shown to be faulty, largely because designers opt to make the devices err on the side of false positives rather than on the side of false negatives…

[Read the full article here]

Dynamite interview with Latanya Sweeney

Scientific American has published a must-read-in-its-entirety interview with Carnegie Mellon computer scientist Latanya Sweeney. She begins by showing that privacy is not a political issue, but an animal need:

“We literally can't live in a society without it. Even in nature animals have to have some kind of secrecy to operate. For example, imagine a lion that sees a deer down at a lake and it can't let the deer know he's there or [the deer] might get a head start on him. And he doesn't want to announce to the other lions [what he has found] because that creates competition. There's a primal need for secrecy so we can achieve our goals.”

Then she ties privacy to human ontogenesis – again, a requirement for the existence of the species: 

Privacy also allows an individual the opportunity to grow and make mistakes and really develop in a way you can't do in the absence of privacy, where there's no forgiving and everyone knows what everyone else is doing. There was a time when you could mess up on the east coast and go to the west coast and start over again. That kind of philosophy was revealed in a lot of things we did. In bankruptcy, for example. The idea was, you screwed up, but you got to start over again. With today's technology, though, you basically get a record from birth to grave and there's no forgiveness. And so as a result we need technology that will preserve our privacy.


The Biometric Dilemma

Vision researcher Terrence E. Boult has identified what he calls the “Biometric dilemma” – the more we use biometrics the more likely they will be compromised and hence become useless for security.   

This is a hugely important observation – the necessary starting point for all thinking about biometrics.  I'd even call it a law.

Terrence was responding to a piece by Sean Convery that picked up on my post about reversing biometric templates.  Terrence went on to call our attention to more recent work, including some that details the reversibility of fingerprint templates.

Paper argues biometric templates can be “reversed”

Every time biometric technology is sold to a school we get assurances that the real fingerprint or other biometric is never stored and can't be retrieved.  Supposedly the system just uses a template, a mere string of zeros and ones (as if, in the digital world, there is much more than that…)

It turns out a Canadian researcher has shown that, in the case of face recognition templates, a fairly high quality image of a person can be automatically regenerated from the templates.  The images calculated using the procedure are of sufficient quality to give a good visual impression of the person's characteristics.  This work reinforces the conclusions drawn earlier by an Australian researcher, who was able to construct fingerprint images from fingerprint templates.
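A toy example can make the general point concrete. Real templates store minutiae rather than pixels, and real reconstruction (as in the research above) is far more sophisticated; the downsample/upsample sketch below, with invented helper names, only illustrates why “we store a template, not the image” is weak reassurance when the template preserves the image's structure:

```python
# Toy illustration: a "template" that merely compresses the biometric still
# lets anyone holding it regenerate an approximation of the original.
# Hypothetical helpers on a tiny binary image, not a real minutiae template.

def make_template(image: list[list[int]], step: int = 2) -> list[list[int]]:
    """Keep every `step`-th pixel -- the 'mere string of zeros and ones'."""
    return [row[::step] for row in image[::step]]

def reconstruct(template: list[list[int]], step: int = 2) -> list[list[int]]:
    """Nearest-neighbour upsampling regenerates a recognisable image."""
    rows = []
    for row in template:
        expanded = [px for px in row for _ in range(step)]
        rows.extend([expanded] * step)
    return rows

# A 4x4 block pattern standing in for ridge structure.
image = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
approx = reconstruct(make_template(image))
print(approx == image)  # the toy template loses nothing essential here
```

The template is smaller than the image, yet everything needed to redraw the pattern survives – which is exactly the property the reversal papers exploit.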