Last month, I posted a short comment called Attention metadata and subpoenas that was inspired by a recent Justice Department subpoena of search data from Google, MSN and Yahoo relating to child pornography. The more recent story about NSA eavesdropping on the communications of U.S. citizens to and from foreign countries also raises concerns about the privacy of attention metadata (data derived from your actual usage that track what you pay attention to; for example, Amazon’s tracking of what you’ve purchased in the past).
In the days of the old Hoover – J. Edgar Hoover, that is – government eavesdropping was selective. An official wanting to wiretap a person suspected of something nefarious would find a friendly judge to authorize it, someone would install a pair of clip leads on the suspect’s telephone line, and a tape would roll or a technician would sit and monitor communications. Sometimes it was a racketeer who was monitored. But sometimes, as when the then-sainted President Kennedy permitted the wiretapping of the now-sainted Martin Luther King, it was for reasons of politics or public order.
[I promise to bring this back to the web/media realm, but bear with me for a bit.]
As information about the NSA eavesdropping on the overseas communications of U.S. citizens has been revealed, it’s been interesting to watch the assumptions about its nature change in press accounts. When the story first broke, I think that most people imagined it was the same situation that took place in the days of the old Hoover. Indeed, the legislation that established the FISA court seems to have assumed this model of surveillance.
But as the story has unfolded, it’s increasingly obvious that a new kind of Hoover is involved in the NSA surveillance, one similar to the Hoover you use to vacuum your living room. The government has understandably revealed little about what’s going on, but several published analyses and the reason stated for not going to the FISA court for specific permissions – not enough time – reveal that the NSA is vacuuming up all communications leaving the country and running them through various computer algorithms designed to detect speech, verbal or written, that provides the clues they’re seeking. That’s why they don’t have time to authorize each individual case – it’s happening in real or near-real time and often they don’t know the target in advance.
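To make the vacuuming concrete: the kind of dragnet described above amounts to scanning every outbound message in near-real time against a watchlist of patterns, flagging matches and passing everything else by. Here is a minimal sketch of that technique; the watchlist, messages, and function names are hypothetical illustrations, not anything the NSA is known to use.

```python
# Illustrative sketch of dragnet keyword filtering: every message in the
# stream is scanned against a watchlist, and matches are flagged for review.
# The patterns and messages below are invented for illustration only.
import re

WATCHLIST = [re.compile(p, re.IGNORECASE)
             for p in (r"\bdetonator\b", r"\bsafe house\b")]

def flag(message: str) -> bool:
    """Return True if the message matches any watchlist pattern."""
    return any(p.search(message) for p in WATCHLIST)

# The "stream" here stands in for the full flow of outbound communications.
stream = [
    "Are you and the family OK after the storm?",
    "Meet at the safe house at dawn.",
]
flagged = [m for m in stream if flag(m)]  # only the second message matches
```

The point of the sketch is the asymmetry it makes visible: the algorithm reads everything in order to flag almost nothing, which is exactly why no per-target authorization fits the model.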
So when I talked to my brother in Kenya using Skype the other day, an algorithm – not some bored guy in a panel truck with coffee and a doughnut – was listening. When I email friends and relatives in Norway almost daily, an algorithm is reading it. The algorithm is listening when I ask my brother if he and his family are OK; it’s reading when I’m exchanging genealogy information or torturing the Norwegian language by email; it’s listening to tender conversations between military spouses in Idaho and Iraq, adult children asking if a dying parent had a good day, sensitive business matters revealed between colleagues, and thousands of other interactions. Presumably, they have the same search, relational and attention-tracking technologies that exist on the commodity Internet, so they should be able to construct quite powerful profiles of the really bad guys – or of any one of us.
In the vast majority of cases, the algorithm finds nothing of interest and moves on. We are told: “If you’re not doing anything wrong, don’t worry. Trust us. This is about our safety and security.” It probably is this time.
Algorithms are procedures written by people to achieve objectives. The people who are doing this are appointed by people we elect, so we democratically gave up this privacy, right? Did MLK knowingly give up his privacy when he presumably voted for JFK in 1960?
Algorithms can be tweaked or changed at will, so when we elect new officials we should expect new algorithms. Turn the algorithm’s dial to the right and the new Hoover detects things that violate someone’s sense of morality (this is happening in China and elsewhere today). Had the technology existed in the 60s and 70s, we would have found the algorithm tuned to civil rights and Cold War keywords. Turn it to the left and the people who want government to be the Great Nanny of Us All can tell if you’re visiting the Hummer web site more often than the Prius web site and have that trigger a "green" pop-up message.
The new Hoover wants to know your thoughts and to discourage or prevent you from acting on them. The point of surveillance is to keep you from doing what you would otherwise do if you weren’t monitored.
This finally brings us back to web/media applications. Tracking attention is perhaps the most powerful web and media development since the invention of the browser. There are very good reasons why not only providers but consumers should consent to having their attention tracked.
Organizers of the Attention Trust are doing important work, particularly around issues of the ownership of attention data. I happen to agree with them. But I think we also need to extend awareness of the concept from the commercial realm to the Fourth Amendment realm. Attention metadata seem to me an even richer cache of information for reading our thoughts than the metadata the new Hoover collects from our email and conversations. The ratio of useful to non-useful material is much greater.
At a minimum, if we want meaningful Fourth Amendment protection, web and media operations that collect attention metadata should also permit individual users to selectively destroy metadata associated with their attention, just as one can delete his or her browsing history. Ownership of attention data without destruction rights isn't ownership. That destruction will inevitably interfere with the commercial interests of the provider, the surveillance interests of the government, and the consumer interests of the user, but without it, the Fourth Amendment right to privacy will soon sound as quaint as our Third Amendment right to refuse the quartering of soldiers in our homes.
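What would such a destruction right look like in practice? Here is a minimal sketch of an attention store that records what a user paid attention to and lets that user selectively delete matching records, the way one clears entries from a browsing history. Every name here – the store, the users, the sites – is a hypothetical illustration, not any real service’s API.

```python
# Hypothetical attention-metadata store with user-controlled selective
# deletion. All names and data are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class AttentionStore:
    # Each record is a (user, item) pair: who paid attention to what.
    records: list = field(default_factory=list)

    def track(self, user: str, item: str) -> None:
        """Record that a user paid attention to an item."""
        self.records.append((user, item))

    def destroy(self, user: str, predicate) -> int:
        """Delete this user's records matching predicate; return the count removed."""
        before = len(self.records)
        self.records = [r for r in self.records
                        if not (r[0] == user and predicate(r[1]))]
        return before - len(self.records)

store = AttentionStore()
store.track("alice", "hummer.com")
store.track("alice", "prius.com")
# Alice exercises her destruction right over one slice of her history.
removed = store.destroy("alice", lambda item: "hummer" in item)  # removed == 1
```

The design choice worth noting is that deletion is selective and user-initiated: the predicate belongs to the user, not to the provider, which is precisely what makes the ownership claim meaningful.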
Trust me. --Dennis