TechCrunch Argues Location Tracking Harmless

I see the FUD (Fear, Uncertainty, and Doubt) acronym thrown around a lot these days. If you disagree with someone, you throw a FUD card at them, as if that alone is sufficient to discredit an argument. TechCrunch gives a good example of this as they defend Apple.

They give their perspective on the controversy surrounding mobile location data and conclude there is no need to worry, even though Apple made three mistakes; the critics of Apple are accused of spreading FUD. What is the opposite of FUD? Rose-colored glasses?

It’s a long article, but here is the meat of it: the attack requires someone to already have access to your phone, which, they argue, must mean the attacker also has access to all your other information. TechCrunch concludes that the failure to prevent access negates any further worry, because you probably are already screwed.

Theoretically, someone could steal your phone, hack it, and get access to this data. This could potentially show them where you were up until the point they stole your phone. (Of course, given that they stole/found your phone, they would probably already know that.)

But wait. If they stole/found your phone, couldn’t they also have access to information like your address, the addresses of friends/family, all your phone numbers, perhaps some passwords, maybe monetary information? Yes, but that’s not as sexy of a story.

Here are four simple reasons why they are wrong:

1) Six months is a long time. TechCrunch says “given that they stole/found your phone, they would probably already know [where you were]”. Not so, not at all. As anyone who has worked in digital forensics knows, you can take data from a device without knowing anything about the individual who owns it. I have even engineered a distraction, imaged a device, and left before the owner returned. It is easier today than ever with remote apps and hidden services. The controversy is not about the hours or even days of movement on a stolen phone (knowing where you are when your phone is stolen); it is about the months and even years of movement, as the lawsuit in Germany proved in 2009. A quick look at the stored database shows just how far back that history reaches (see the sketch after this list).

2) Asset value may be related to surveillance. Addresses and phone numbers are sensitive, but everyone knows they tend to be in the public domain (e.g. phone books) and are easily changed. While a list of your addresses and phone numbers (static data) may be worth protecting, it has a much lower surveillance value than a database that shows where you have been every minute of the day for the past six months (active log). You should be able to specify that sharing a phone number does not also mean exposing the far more valuable location data.

3) Opt-in rather than opt-out. You might be able to guess a little about someone from the records they keep for themselves, but a tracking system they do not know about is invaluable for investigators and surveillance. Someone may choose to keep only a few phone numbers, no addresses, no passwords…but how do they choose to reduce the amount of location data stored on their iPhone? Even more to the point, a user may select how long they want to store call and message records. They may have a habit of regularly deleting the records on their phone. But if they want to reduce the location data…?

4) *Someone* could hack your phone? There is no need to hack the phone with so many remote vectors open. What about someone who hacks third-party marketing firms or engineers an app to aid the theft of location data? Who trusts Apple to protect the information from exposure through non-hack channels, when their own terms of service say they may share the information on a phone with whomever they choose? They clearly leave the door open to the old, infamous ChoicePoint attack vector.
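To put point 1 in concrete terms, here is a minimal sketch (Python 3, standard library only) of how little effort it takes to see how far back the stored movement history reaches once someone has a copy of the phone’s location database. The table name (CellLocation), its Timestamp column, and the 2001-01-01 epoch offset are assumptions drawn from public descriptions of the consolidated.db file, not verified against any particular iOS version.

```python
# Sketch: report the span of location history held in a copied consolidated.db.
# Table/column names and the Apple epoch are assumptions (see note above).
import sqlite3
from datetime import datetime, timedelta, timezone

APPLE_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)  # assumed reference date

def location_history_span(db_path: str) -> None:
    """Print the oldest and newest location timestamps and the total row count."""
    conn = sqlite3.connect(db_path)
    try:
        oldest, newest, count = conn.execute(
            "SELECT MIN(Timestamp), MAX(Timestamp), COUNT(*) FROM CellLocation"
        ).fetchone()
    finally:
        conn.close()

    if not count:
        print("No location records found.")
        return

    first = APPLE_EPOCH + timedelta(seconds=oldest)
    last = APPLE_EPOCH + timedelta(seconds=newest)
    print(f"{count} location records spanning {first:%Y-%m-%d} to {last:%Y-%m-%d}")
    print(f"That is roughly {(last - first).days} days of movement history.")

if __name__ == "__main__":
    location_history_span("consolidated.db")  # path to a copied database file
```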

Let me recap how that breach unfolded in 2004. An attacker used stolen identities to set up businesses that looked legitimate to ChoicePoint. The attacker then opened 50 ChoicePoint accounts over twelve months to avoid suspicion while stealing identity information. The District Attorney on the case told me that someone at ChoicePoint only grew suspicious when a Nigerian in Los Angeles named Olatunji Oluwatosin called and said he had a common American name but could not pronounce it. He was then arrested with five phones and three credit cards that he had registered with stolen identity information. As an aside, Oluwatosin’s attack was not the first major breach of this kind. An almost identical one happened two years before, leading to nearly $1 million in fraud. The difference in 2004 was that it happened after the 2003 California breach notification law (SB1386).

I would wager that many inside ChoicePoint before 2004 had a big stack of FUD cards they would throw at anyone who dared to say information was at risk. After 2004, due to California law, they could no longer use a FUD card to avoid dealing with the security of information — too much was available to too many people without the consent of the owners.

I hope that helps clarify why the controversy surrounding the iPhone tracking data reveals something much more serious than just an Apple oops moment. If consumers have to resort to laws or an online fuss to change the privacy settings in an Apple device, it means the privacy advocates and security team inside Apple lack the influence needed to represent consumer concerns.

A privacy screw-up like this probably happened at Apple because no one was vested with the authority or presence to put their foot down and say “location info is too valuable to collect and store for a long time, we have no justification for the risk, keep no more than 7 days” or “users must have an opt-out”. Perhaps even that is being too kind. Maybe no one calculated the risk of collecting location data at all, which would be an even more disappointing possibility. Hopefully they are looking at the root cause of the failure and increasing the scrutiny of products before launch, but so far their public explanations have not been encouraging.
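For what it is worth, the kind of retention rule argued for above would not be complicated. Here is a hedged sketch, using the same assumed table name and timestamp format as the earlier example, of what pruning location records older than seven days could look like. It is an illustration of how small the fix is, not Apple’s actual code.

```python
# Sketch: enforce a seven-day retention window on stored location records.
# CellLocation, the Timestamp column, and the Apple epoch are assumptions
# carried over from the earlier example.
import sqlite3
from datetime import datetime, timedelta, timezone

APPLE_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)
RETENTION = timedelta(days=7)

def prune_old_locations(db_path: str) -> int:
    """Delete location rows older than the retention window; return rows removed."""
    cutoff = (datetime.now(timezone.utc) - RETENTION - APPLE_EPOCH).total_seconds()
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute("DELETE FROM CellLocation WHERE Timestamp < ?", (cutoff,))
        conn.commit()
        return cur.rowcount
    finally:
        conn.close()

if __name__ == "__main__":
    removed = prune_old_locations("consolidated.db")
    print(f"Removed {removed} records older than {RETENTION.days} days")
```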

There are many issues in the TechCrunch analysis, and I could go on about how wrong they are (e.g. even anonymous data can be abused and needs privacy protection), but I want to end with a brief favorite. Compare and contrast these two statements:

I’ve been sitting on panels about location issues for a few years now. The discussion always falls to the same place: privacy and security.

[…]

Let’s be honest: no one is going to be talking about this issue in a few weeks.

It seems that they predict no one will talk about this location issue of privacy and security because the discussion always comes back to this location issue of…privacy and security. Got that?
