Fighting Big (Data) Brother

When I was a full-time student of Cold War history, I had to study how constant watch by an unknown yet omnipresent force affected people. In American classrooms that meant asking questions about the legacy of the 1968 Prague Spring, or trying to prove whether Marshal Josip Broz Tito was really as popular as publicly reported.

If you click on the links above you will see that I believe some of the best sources for an answer come from literary and artistic writers. There are classics like The Trial by Kafka and 1984 by Orwell, but we also gain excellent insights from modern works such as the film The Lives of Others by Donnersmarck.

The real story, however, is not just about the situation in someone else’s backyard. Surveillance society is a risk everywhere and is inextricably linked to advances in communication. It thus seems inevitable to find warnings at home about the “double edge” of technology. A good example is the historic criticism by U.S. Senator Frank Church of an American-based system meant to spy on the Soviets.

“…this capability at any time could be turned around on the American people” he said in 1975, “and no American would have any privacy left, such is the capability to monitor everything: telephone conversations, telegrams, it doesn’t matter. There would be no place to hide.”

He added that if a dictator ever took over, the N.S.A. “could enable it to impose total tyranny, and there would be no way to fight back.”

He was wrong. A dictator is not necessary to achieve what many would consider tyranny, at least if you take into account definitions put forward by the likes of Kafka and Orwell. And the N.S.A.’s giant spy center is not the only one to consider.

He was also right: the capability has officially been turned around on the American people.

CNET has learned that the FBI has formed a Domestic Communications Assistance Center, which is tasked with developing new electronic surveillance technologies, including intercepting Internet, wireless, and VoIP communications.

And that brings me back to the study of surveillance in the Cold War. The British classroom forced me to expand the scope of discussion. We spent many hours debating and trying to make sense of policies all over the world where analysts collected data on citizens to enforce laws and inform leaders. One of the things that stood out to me was how citizen behavior changed under some forms of surveillance but not others. The difference appeared to me to be linked to a sense of value and opportunity.

If the collection of information is in any way perceived by individuals as a threat to their success, then countermeasures are a natural reaction. Only when the risk from surveillance was not perceived (e.g. “I have nothing to hide”), or a greater alternative threat was proposed (e.g. surveillance will save you from a worse fate), might someone be expected to comply without question.

What countermeasures? We destroy a trail or hide it. In security terms, people use integrity and confidentiality controls to fight surveillance. The tricky part is that we enjoy and want social approval, which requires exposing something of ourselves. On the other hand, we do not want to feel so monitored that we are boxed in by our every decision (i.e. the dilemma of whether to lead a heavy or light existence, as expressed in The Unbearable Lightness of Being).

Countermeasures might not always be the right term. I am reminded of Manuel Castells’ 1996 book The Rise of the Network Society. He emphasized that a globally interconnected communication system was unlikely to make the work force go completely mobile. He showed that people prefer to keep local social attachments (e.g. owning a house, living near family and friends) intact. However, other social structures such as labor relationships were not as stable. They could evolve because more opportunities would be available, making part-time and temporary work the norm.

Thus, rather than call them countermeasures, it may be more accurate to say we keep trails intact only where we perceive value and limited risk. When more connections are offered at low cost, cycling through them becomes the new norm, which creates problems for anyone interested in data collection and analysis.

With that in mind, I recently noticed big data researchers saying that their subjects are fighting back.

Ms. Boyd has made a specialty of studying young people’s behavior on the Internet. She says they are now often seeking power over their environment through misdirection, such as continually making and destroying Facebook accounts, or steganography, a cryptographic term for hiding things in plain sight by obscuring their true meaning. “Someone writes, ‘I’m sick and tired of all this,’ and it gets ‘liked’ by 32 people,” she said. “When I started doing my fieldwork I could tell you what people were talking about. Now I can’t.”
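The “liked by 32 people” example is social steganography, where the hidden meaning lives in shared context rather than in the data itself. For contrast, here is a toy sketch of the classical, technical sense of the term (my own illustration, not from Boyd’s work): hiding a message in plain sight by appending invisible zero-width Unicode characters to an innocuous status update. Real steganography would also need to resist statistical detection; this only shows the idea.

```python
# Toy steganography sketch: encode a secret as zero-width Unicode
# characters (invisible when rendered) appended to a cover text.
ZERO = "\u200b"  # zero-width space       -> bit 0
ONE = "\u200c"   # zero-width non-joiner  -> bit 1

def hide(cover: str, secret: str) -> str:
    """Append the secret's bits as invisible characters."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    payload = "".join(ONE if b == "1" else ZERO for b in bits)
    return cover + payload

def reveal(stego: str) -> str:
    """Collect the invisible characters and decode the bits."""
    bits = "".join("1" if ch == ONE else "0"
                   for ch in stego if ch in (ZERO, ONE))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = hide("I'm sick and tired of all this.", "meet at noon")
print(reveal(stego))  # meet at noon
```

On screen the stego text looks identical to the cover text; only someone who knows to look for the zero-width suffix recovers the message.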

Now. Just like in history. The behavior she describes sounds like exactly what could be expected in a surveillance society that has a low cost to connection cycles. What does this mean in terms of future behavior? It’s not clear who the best artistic writers will be yet but there may be much more lightness ahead.

For example, in the past there was a desire in America to make phone numbers portable to maintain continuity across providers, but the new trend appears to be closer to what Castells predicted: numbers see short-term or temporary use. Another example is demand for peer-based key management for mobile devices, instead of server-based. A third is demand for sandboxes and hypervisors to create safe havens for communication. Hiding or destroying the trail of an application, a machine, or even an entire data center is more possible than ever through virtualization.

One thought on “Fighting Big (Data) Brother”

  1. It’s fascinating that the RC3 application architecture (http://www.infoq.com/articles/regulatory-compliant-cloud-computing) appears to be the logical countermeasure for fighting Big Brother, as it is for protecting data from attackers in the Cloud.

    The RC3 architecture assumes that everything is being monitored and attacked in the Cloud Zone. Given that fundamental assumption, it protects data by eliminating the need for credentials/identity-management and for cryptography/key-management in the Cloud – the two pathways to data-compromise.

    Keeping credentials and keys, and performing cryptographic operations, only in the Regulated (secure) Zone, using one-way connections from the Regulated Zone to the Cloud Zone, and digitally signing objects stored in the Cloud data-store effectively reduces anyone watching or attacking data in the Cloud to seeing only meaningless gibberish. If one goes further and distributes (shards) the data-schema and application across multiple Clouds, each Cloud now contains even less data for Big Brother or an attacker to make sense of.

    RC3 seems like the logical evolution of application/security architecture, in an environment where Big Brother and attackers have the same goal: getting at your data.
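The zoning pattern the commenter describes can be sketched as follows. This is a hypothetical illustration, not the actual RC3 implementation: keys and cryptographic operations stay in the regulated zone, and only ciphertext plus a signature ever crosses into the cloud zone. The XOR keystream below is a toy stand-in for a real cipher (use AES-GCM or similar in practice), and HMAC stands in for a digital signature.

```python
# Hypothetical sketch of the RC3-style zoning idea: the cloud zone
# stores (nonce, ciphertext, tag) and never sees keys or plaintext.
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def regulated_zone_seal(enc_key: bytes, sig_key: bytes, plaintext: bytes):
    """Encrypt then sign inside the regulated zone."""
    nonce = os.urandom(16)
    ks = _keystream(enc_key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    tag = hmac.new(sig_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag  # only these three values go to the cloud zone

def regulated_zone_open(enc_key: bytes, sig_key: bytes,
                        nonce: bytes, ct: bytes, tag: bytes) -> bytes:
    """Verify the signature, then decrypt, inside the regulated zone."""
    expected = hmac.new(sig_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("cloud object was tampered with")
    ks = _keystream(enc_key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))
```

An observer in the cloud zone sees only the opaque triple; tampering with the stored object fails signature verification back in the regulated zone.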
