In November I coined the term “Identity Chernobyl” for Britain's HMRC fiasco (at least it seems that way when I look at Google).
Cory Doctorow elaborates on this in a nice Guardian piece:
When HM Revenue & Customs haemorrhaged the personal and financial information of 25 million British families in November, wags dubbed it the “Privacy Chernobyl”, a meltdown of global, epic proportions [Hey, Cory, are you calling me a wag? – Kim].
The metaphor is apt: the data collected by corporations and governmental agencies is positively radioactive in its tenacity and longevity. Nuclear accidents leave us wondering just how we're going to warn our descendants away from the resulting wasteland for the next 750,000 years while the radioisotopes decay away. Privacy meltdowns raise a similarly long-lived spectre: will the leaked HMRC data ever actually vanish?
The financial data in question came on two CDs. If you're into downloading movies, this is about the same size as the last couple of Bond movies. That's an incredibly small amount of data – my new phone holds 10 times as much. My camera (six months older than the phone) can only fit four copies of the nation's financial data.
Our capacity to store, copy and distribute information is ascending a curve that is screaming skyward, headed straight into infinity. This fact has not escaped the notice of the entertainment industry, where it has been greeted with savage apoplexy.
Wet Kleenex
But it seems to have entirely escaped the attention of those who regulate the gathering of personal information. The world's toughest privacy measures are as a wet Kleenex against the merciless onslaught of data acquisition. Data is acquired at all times, everywhere.
For example, you now must buy an Oyster Card if you wish to buy a monthly travelcard for London Underground, and you are required to complete a form giving your name, home address, phone number, email and so on in order to do so. This means that Transport for London is amassing a radioactive mountain of data plutonium, personal information whose limited value is far outstripped by the potential risks from retaining it.
Hidden in that toxic pile are a million seams waiting to burst: a woman secretly visits a fertility clinic, a man secretly visits an HIV support group, a boy passes through the turnstiles every day at the same time as a girl whom his parents have forbidden him to see; all that and more.
All these people could potentially be identified, located and contacted through the LU data. We may say we've nothing to hide, but all of us have private details we'd prefer not to see on the cover of tomorrow's paper.
How long does this information need to be kept private? A century is probably a good start, though if it's the kind of information that our immediate descendants would prefer to be kept secret, 150 years is more like it. Call it two centuries, just to be on the safe side.
If we are going to contain every heap of data plutonium for 200 years, that means that every single person who will ever be in a position to see, copy, handle, store, or manipulate that data will have to be vetted and trained every bit as carefully as the folks in the rubber suits down at the local fast-breeder reactor.
Every gram – sorry, byte – of personal information these feckless data-packrats collect on us should be as carefully accounted for as our weapons-grade radioisotopes, because once the seals have cracked, there is no going back. Once the local sandwich shop's CCTV has been violated, once the HMRC has dumped another 25 million records, once London Underground has hiccoughed up a month's worth of travelcard data, there will be no containing it.
And what's worse is that we, as a society, are asked to shoulder the cost of the long-term care of business and government's personal data stockpiles. When a database melts down, we absorb the crime, the personal misery, the chaos and terror.
The best answer is to make businesses and governments responsible for the total cost of their data collection. Today, the PC you buy comes with a surcharge meant to cover the disposal of the e-waste it will become. Tomorrow, perhaps the £200 CCTV you buy will have an added £75 surcharge to pay for the cost of regulating what you do with the footage you take of the public.
We have to do something. A country where every snoop has a plutonium refinery in his garden shed is a country in serious trouble.
The notion of information half-life is a great one. Let's adopt it.
The tendency for “information to merge” is one of the defining transformations of our time. When it comes to understanding what this means, few think forward, or even realize that there “is a forward”.
The “contextual separation” in our lives has been central to our personalities and social structures for many centuries.
Call me conservative, but we need to retain this separation.
The mobility and clonability of digital information, in combination with commercial interest and naiveté, lead us toward a vast sea of personal information intermixed with our most intimate and tentative thoughts.
The essence of free-thinking is to be able to think things you don't believe as part of the process of grasping the truth. If the mind melts into the computer, and the computer melts into a rigid warehouse of indelible data, how easy is it for us to change, and what is left of the mind that is “transcendental” (or even just unfettered…)?
The ramifications of this boggle the mind. The alienation it would cause, and the undermining of institutions it would bring about, concern me as much as any other threat to our civilization.
As ever, Cory is not above bending the truth to make a point that matches his agenda.
You don't need to give personal information to get an Oyster Card to travel on London's public transport; you can get an anonymous one and refill it with cash. He also ignores that Oyster's usage data is held on an 8-week rolling basis (see http://www.coofercat.com/wiki/OysterCardRFI for an interesting response from Transport for London about how they retain and use data).
As for a half-life of data: well, now that's interesting. Enforcement is the problem. Would you trust someone who says “It'll be deleted in X months”? Probably not. Now, if you had issued an expiring certificate to use that data, one with a limited shelf life … well, that would be more interesting (even if it's an implementation nightmare).
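To make the expiring-certificate idea a little more concrete, here is a rough sketch of the sort of thing I have in mind. Everything in it (the grant format, the HMAC signing scheme, the eight-week window) is purely my own illustration, not anything Transport for London or anyone else has built, and it only works if the data holder actually honours the check, which is exactly why it's an implementation nightmare:

```python
# Illustrative sketch: a signed "data grant" with a built-in expiry.
# The issuer signs a record ID plus an expiry timestamp; the data store
# refuses to release the record once the grant has lapsed.
# All names and parameters here are assumptions made for the example.

import hmac
import hashlib
import time

SECRET_KEY = b"issuer-signing-key"        # held by the data controller
RETENTION_SECONDS = 60 * 60 * 24 * 56     # e.g. an eight-week retention window


def issue_grant(record_id, now=None):
    """Create a signed grant that expires after the retention window."""
    issued_at = now if now is not None else time.time()
    expires_at = issued_at + RETENTION_SECONDS
    payload = f"{record_id}:{expires_at}".encode()
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"record_id": record_id, "expires_at": expires_at, "signature": signature}


def check_grant(grant, now=None):
    """Return True only if the grant is authentic and still within its shelf life."""
    payload = f"{grant['record_id']}:{grant['expires_at']}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, grant["signature"]):
        return False                      # tampered or forged grant
    current = now if now is not None else time.time()
    return current < grant["expires_at"]  # lapsed grants release nothing


grant = issue_grant("oyster-journey-12345")
assert check_grant(grant)                                     # fresh grant: data may be released
assert not check_grant(grant, now=grant["expires_at"] + 1)    # expired grant: access denied
```

Of course, nothing in a sketch like this stops whoever holds the raw data from simply reading it without a grant at all; real enforcement would need the data itself to be encrypted under keys the issuer can let lapse, or a legal framework with teeth behind the check.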
Thanks for the OysterCard link – it is very interesting.
Re: “trusting” people to delete the data, what is really needed is to change people's thinking so that expiry becomes the normal practice. I think a legal framework around protection of identity information would help achieve this rebalancing.
I say let's not adopt this idea of information half-life. While we understand the need to safely store radioactive waste and how it gradually becomes less noxious, certain personal information should never be stored in the first place. Certainly not if it breaches the second data protection principle as laid down in our UK laws. Sceptics will argue that such talk is designed to lull us into a false sense of security that our data is safe in the hands of government agencies. In the UK, the Bichard ‘step model’ report similarly tries to reassure us, ‘the public’, that after a few years certain information the UK Police now routinely gathers about innocent citizens and stores in the National Crime Database will be ‘stepped down’, so it's no longer routinely shared among all government agencies. (It's actually the inverse of information half-life, come to think of it.) Problem is that few people I know want to have their DNA stored alongside that of convicted terrorists, murderers and rapists and trust that inferences will never be made.
More on identitySpace: http://www.bloglines.com/blog/Marcus-Lasance?id=15