Scientific American has published a must-read-in-its-entirety interview with Carnegie Mellon computer scientist Latanya Sweeney. She begins by showing that privacy is not a political issue, but an animal need:
“We literally can't live in a society without it. Even in nature animals have to have some kind of secrecy to operate. For example, imagine a lion that sees a deer down at a lake and it can't let the deer know he's there or [the deer] might get a head start on him. And he doesn't want to announce to the other lions [what he has found] because that creates competition. There's a primal need for secrecy so we can achieve our goals.”
Then she ties privacy to human ontogenesis – again, a requirement for the existence of the species:
Privacy also allows an individual the opportunity to grow and make mistakes and really develop in a way you can't do in the absence of privacy, where there's no forgiving and everyone knows what everyone else is doing. There was a time when you could mess up on the east coast and go to the west coast and start over again. That kind of philosophy was revealed in a lot of things we did. In bankruptcy, for example. The idea was, you screwed up, but you got to start over again. With today's technology, though, you basically get a record from birth to grave and there's no forgiveness. And so as a result we need technology that will preserve our privacy.
Her argument that privacy is a phenomenon that precedes consciousness speaks volumes to the fact that there are objective requirements – beyond our individual wills – that must be satisfied before a digital environment will be “compatible enough with people” that it can survive. This is the basic premise of the Laws of Identity – and why I have always seen them as computer science rather than moral philosophy. Of course, we are talking about a new kind of computer science, at a higher level of abstraction than a taxonomy for “sorting algorithms”. We are talking about how computer science matures as computing becomes social.
Asked how we “solve the privacy problem”, she says:
“My answer is that the privacy problems that I've seen are probably best solved by the person who first created the technology. What we really have to do is train engineers and computer scientists to design and build technologies in the right kind of way from the beginning.
“Normally, engineers and computer scientists get ideas for technologies on their own and engage in a kind of circular thinking and develop a prototype of their solution and then do some kind of testing. But we are saying we will give them tools that help them see who are the stakeholders and do a risk assessment, and then see what barriers will come up and deal with the riskiest problems and work to solve them in the technology design…
“There should be privacy technology departments, because there are no technologies for handling privacy problems [proactively]. The best solutions lie in the technology design. So we are targeting the creation of tools for the engineers and computer scientists…”
The work on this blog has been about exactly these issues – starting with a privacy risk assessment rather than tacking privacy on as an afterthought.
Finally, I like the way Latanya ties fingerprints to the problems of SSNs, given that fingerprints are “public” by definition:
Social security numbers are a whole discussion unto themselves — how they've outplayed themselves, do they need to be replaced? Now in law enforcement and the department of justice, they're saying it should be fingerprints. So now we'll see little devices in computers and cars and even refrigerators with very expensive fingerprint readers on them. But that's a problem because fingerprints could become the next social security number. They could give us all the ills of the social security number and worse. I can't get rid of my fingerprint, it goes with me wherever I go. I don't wear my social security number on my head.
We leave [fingerprints] everywhere, which is really good for law enforcement because they know where to find us at all times, but that means that anyone could pick them up. The point is you can see the progression.
[Thanks to Alan Herrell for telling us about this article.]