Every time biometric technology is sold to a school, we get assurances that the real fingerprint or other biometric is never stored and cannot be retrieved. Supposedly the system stores only a template, a mere string of zeros and ones (as if, in the digital world, there were much more than that…)
It turns out a Canadian researcher has shown that, in the case of face recognition templates, a fairly high-quality image of a person can be automatically regenerated from the template. The images produced by the procedure are of sufficient quality to give a good visual impression of the person's features. This work reinforces the earlier conclusions of an Australian researcher, who was able to reconstruct fingerprint images from fingerprint templates.
The results are published in a short but intriguing paper by Andy Adler of the University of Ottawa. In part:
A simple algorithm is presented which can regenerate sample images from templates using only match score results.
While results are demonstrated for face recognition algorithms, the conceptual framework should be applicable to any biometric algorithm.
A software application was implemented with the goal of recreating a face image of a target person in a face recognition database. The application has access to a local database of arbitrary face images, and has network access to a face recognition server. The software begins with only the database ID of the target person, and is able to obtain match scores of chosen images compared to the target person.
Three different facial recognition algorithms were studied; all are recent products from well-known commercial vendors of biometric systems. Two of the vendors participated in the Face Recognition Vendor Test 2002.
This algorithm functions as follows: During preprocessing, a local database of face images is obtained, and an eigenface decomposition calculated. Note that there is no requirement that the local database resemble the target image; these results use target images from the Mugshot face database and a local database computed from the University of Aberdeen face database.
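An eigenface decomposition is simply principal component analysis applied to a set of face images. As a rough illustration (not the paper's code, and using random data in place of real face photos), it can be computed with an SVD of the mean-centered image matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated local database: 50 flattened grayscale "face images" of
# 64 pixels each. (Real eigenface work uses actual face photos,
# e.g. the University of Aberdeen database mentioned above.)
faces = rng.normal(size=(50, 64))

# Eigenface decomposition = PCA on the image set: subtract the mean
# face, then take the top right-singular vectors of the centered data.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Rows of Vt are the eigenfaces, ordered by decreasing variance explained.
_, singular_values, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:8]  # keep the 8 principal eigenimages

# Any face can now be approximated as the mean face plus a weighted sum
# of eigenfaces; the weight vector is a compact representation.
weights = (faces[0] - mean_face) @ eigenfaces.T
reconstruction = mean_face + weights @ eigenfaces
```

The eigenfaces form an orthonormal basis, which is what makes them convenient search directions for the iterative attack described next.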
Subsequently, the algorithm determines the match score for a selection of images in the local database against the target. The initial estimate is selected to be the image with the highest match score. The core of the algorithm is the iterative improvement of this estimate to better approximate the target. During each iteration, an eigenface image is selected, and a series of images is produced, each equal to the current image estimate plus a small constant times the eigenimage.
The corresponding match scores between these images and the target are calculated, and the image with the best score is selected for the subsequent iteration. This process is repeated until there is no significant improvement in match score. It was heuristically determined that six different adjustment levels for each eigenimage gave the fastest convergence. Typically, the algorithm reached a maximum match score after about 4000 iterations.
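The procedure described above is a hill-climbing attack against a black-box match-score oracle. The following minimal sketch illustrates the idea in simulation: the "server" is a stand-in function that scores candidates by distance to a hidden target, which the attacker never sees directly. The dimensions, step sizes, and oracle are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 64       # length of a flattened image vector (illustrative)
N_EIGEN = 8    # number of eigenface directions to search along
# Six adjustment levels per eigenimage, as the paper found fastest:
STEPS = [-0.3, -0.1, -0.03, 0.03, 0.1, 0.3]

# Hidden target: in the real attack this image is never seen;
# only match scores against it are available.
target = rng.normal(size=DIM)

def match_score(candidate):
    """Black-box oracle standing in for the face recognition server.
    Higher means a closer match; simulated here as negative
    Euclidean distance to the hidden target."""
    return -np.linalg.norm(candidate - target)

# Arbitrary local database, and orthonormal "eigenfaces" derived
# from it (simulated via a QR decomposition).
local_db = rng.normal(size=(N_EIGEN, DIM))
q, _ = np.linalg.qr(local_db.T)
eigenfaces = q.T  # rows are orthonormal eigenimages

# Initial estimate: the best-scoring image in the local database.
estimate = max(local_db, key=match_score)
best = match_score(estimate)

for _ in range(200):  # iterate until no significant improvement
    improved = False
    for eig in eigenfaces:
        # Perturb the current estimate along this eigenimage at
        # several step sizes, and keep the best-scoring candidate.
        candidates = [estimate + c * eig for c in STEPS]
        scores = [match_score(img) for img in candidates]
        if max(scores) > best + 1e-9:
            best = max(scores)
            estimate = candidates[int(np.argmax(scores))]
            improved = True
    if not improved:
        break

print(f"final match score: {best:.4f}")
```

The key point, as the paper notes, is that nothing here requires access to the template itself: the attacker needs only the ability to repeatedly ask "how well does this image match?" and observe the score.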
[Thanks to David Clouter and Ross Anderson for bringing this to our attention, and to Sean Convery for noticing that the paper actually dates from 2003. T.E. Boult points to more recent discussions of reversibility here.]