Published on December 12th, 2018 | by Kevin McTiernan
Why you should be troubled by access to your location information
The New York Times published a great story on how little we know about the information we’re sharing and, in particular, our location. We all see our location information being used in real time to give directions or restaurant recommendations. What we don’t see (and what the article illustrates) is the downstream marketplace for location information, where we lose control of our private information. As I read the article, I noticed an interesting dichotomy: recent court rulings have placed strong protections on location information when it comes to law enforcement access, while access to your location by mobile apps, and the downstream location economy, operates seemingly unrestrained. How is that possible?
First, let me set the stage and start with the bad: your smartphone is likely sending very precise details on your location (as often as every few seconds) to as many as 70+ companies that use that information to serve up ads, or to store and mine later. The good news is that the information is transmitted with an anonymized identifier, not your name or phone number. Phew! Right? Not so fast. As the article shows, it’s not that hard to figure out who that anonymized identifier really is.
How easy? Just consider how many people go to your gym – maybe a few hundred? How many of them go between 6 and 7 AM on weekdays – maybe fifty? How many of them go to your office building – maybe ten? How many of them go to your house in the evenings – probably just one.
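The funnel above is essentially a set-intersection attack. Here is a minimal sketch of it; the device IDs and places are made up for illustration, while a real location dataset would contain millions of pings keyed by an anonymized advertising ID:

```python
# Each place maps to the set of anonymized device IDs observed there
# during the relevant time window (fabricated sample data).
gym_weekday_6am = {"id-001", "id-002", "id-003", "id-004"}
office_building = {"id-002", "id-003", "id-009"}
home_evenings   = {"id-003", "id-777"}

# Intersecting the sets leaves only the ID that matches every pattern --
# almost certainly you, even though no name was ever attached to it.
candidates = gym_weekday_6am & office_building & home_evenings
print(candidates)  # → {'id-003'}
```

Each additional place-and-time constraint shrinks the candidate set, which is why “anonymized” location trails re-identify so easily.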
It all started so innocently and easily: put location in a smartphone and let apps access it to make people’s lives easier. ‘Use my location to tell me about good Italian food nearby… yes, please’ or ‘use my location to tell me whether it will rain or snow tomorrow… thank you.’ To cover the costs of running the app, the app maker uses the location information to present targeted advertising: ‘look, there’s a sale on cordless drills at the hardware store you’re about to drive past.’ That seemed like a decent tradeoff; the app remains free to use. But then the mobile app makers were presented with other sources of revenue: store the location information and mine it, or sell it to third parties. That’s where it gets a little dirty.
How data aggregators can make it all feel dirty
For example, take a data aggregator. They have purchased multiple bits of your life from various sources; among other things, they have your name and your home and work addresses. They have mined the location information to narrow down the people who spend considerable time at your home and work addresses and associate with you. From the location information, they also know the gym you attend and that you have one or more children in elementary school. They know when you change jobs. It’s possible that none of this troubles you; after all, much of it could be learned from social media posts too. But it can become troubling very fast: a new location at which you’re spending considerable time pops up, an oncology center for your chemotherapy visits. That is something very personal and private, and perhaps something you wanted to share with only a small group of people. And you never authorized that information going to a data aggregator when you checked tomorrow’s weather forecast.
In another data aggregator example, the location information can be mined to count how many phones (i.e., employees) are inside a factory, to spot whether new shifts have been added to production or whether there are fewer people on the floor per shift. That can indicate whether the company is having a good or bad quarter. If you buy or sell on that knowledge, are you insider trading? If you act on material, non-public information, you are committing insider trading fraud. The counterargument would be that the information is available to anyone, and mining it for this purpose is akin to predicting demand for heating oil from weather forecasts. New technologies sometimes require updating the definitions in the law.
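The factory-occupancy inference amounts to counting distinct devices inside a geofence per shift. A hypothetical sketch, using fabricated pings and an assumed bounding box around the factory floor:

```python
from collections import defaultdict

# (device_id, latitude, longitude, shift) -- fabricated sample pings.
pings = [
    ("id-a", 37.40, -121.95, "day"),
    ("id-b", 37.40, -121.94, "day"),
    ("id-a", 37.40, -121.95, "night"),
    ("id-c", 37.10, -121.50, "day"),   # outside the factory
]

# Assumed bounding box around the factory floor.
LAT_MIN, LAT_MAX = 37.39, 37.41
LON_MIN, LON_MAX = -121.96, -121.93

# Collect the distinct devices seen inside the geofence during each shift.
devices_per_shift = defaultdict(set)
for device_id, lat, lon, shift in pings:
    if LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX:
        devices_per_shift[shift].add(device_id)

# Headcount per shift; tracking this quarter over quarter is the signal.
print({shift: len(ids) for shift, ids in devices_per_shift.items()})
# → {'day': 2, 'night': 1}
```

Comparing these headcounts across months is all it takes to turn “anonymous” pings into a production estimate for a specific company.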
What about when police need it?
Just this year, a ruling by the US Supreme Court further curtailed law enforcement access to location information. The logic in the decision echoes the concerns most anyone reading the Times article would have. The case is referred to by some as “The Carpenter Case” but officially is Carpenter v. United States. I’ll summarize the background: after investigating a string of armed robberies in Michigan and Ohio, the FBI arrested four of the robbers. After gaining access to one of the robbers’ cell phones, they performed typical investigations involving call, tower and location analysis. The tower records identified one person near several of the robberies, Carpenter. He was charged and convicted of a slew of crimes and was sentenced to 1,395 months (or 116.25 years) in prison. The conviction was upheld in the first appeal. The second appeal was to ask the Supreme Court for a review. The case was heard in 2017 and the ruling issued in 2018. The Court ruled 5-4 that (I’m paraphrasing here), because we carry mobile phones wherever we go, a “detailed, encyclopedic and effortlessly compiled” location history can be provided to law enforcement; that history is so accurate and detailed, and reveals so much about us personally, that it is protected under the Fourth Amendment, and accessing it requires a search warrant.
This is now the law of the land: your location information is protected under the Fourth Amendment. There are a few obvious differences between the location information from a mobile app and that from a law enforcement investigation. First, when location information is provided today to a mobile app, it carries an anonymized identifier (someone receiving this information is not told who you are). Compare that to when law enforcement requests cell transactions: they are provided the phone numbers, which are associated with you. Second, the information typically available and provided to law enforcement is much less accurate than what your phone provides. But this raises the question: if location information is judged by the US Supreme Court to contain such personal information that a search warrant is now required in a law enforcement investigation, how is it that far more accurate information is pulled from someone’s phone without any regulation and possibly used in ways that the subscriber never intended?
The Times article quotes Senator Ron Wyden (Oregon) as saying, “Location information can reveal some of the most intimate details of a person’s life — whether you’ve visited a psychiatrist, whether you went to an A.A. meeting, who you might date,”. As the article points out, Senator Wyden has proposed bills to limit collection and sale of not just location, but other personal, sensitive information.
Isn’t it all just public information?
You may recall recent stories in the news about how publicly accessible DNA databases were used by law enforcement to close cold cases. The Golden State Killer case is one that received a lot of press this year. In this case, police had 13 murders, 50 rapes and 100 burglaries across California over a 12-year period starting in 1974. DNA from the crime scenes did not produce a match in the state’s criminal DNA database. Similarly, fingerprints and rewards did not result in worthwhile leads. The investigators turned to a different DNA database, GEDmatch, which is used by people to find sperm donors, birth parents or long-lost relatives. The database is populated with raw information uploaded by people who took popular consumer DNA tests (such as 23andMe and AncestryDNA). The Golden State Killer investigators uploaded the raw data from the crime scene DNA and finally got a hit. The hit was not a complete match but a partial one: a third cousin. With this lead, they built a family tree that led them to a suspect. Surveillance and a covert DNA sample from that suspect provided an exact match. They had their man! Additional cold cases were later solved using the same method.
The GEDmatch database was originally created for people to find lost relatives or birth parents, not criminals. And it wasn’t the suspect’s DNA that created a lead; it was a third cousin’s. That third cousin uploaded their DNA data to find long-lost relatives, not to incriminate one. Did the investigators perform an illegal search? These are good questions that actual legal scholars have addressed (here). The interesting part is that if you read the legal opinions on the DNA investigation, many of the same legal concepts come up in both the Golden State Killer and Carpenter cases. Does that portend something new?
Bringing it back full-circle
On the heels of the success of the techniques used in the Golden State Killer investigation, it’s possible that searching what industry makes available to consumers or businesses could become the new norm for law enforcement investigations. Could information gleaned from third parties (like a public DNA database, a data aggregator or a targeted advertiser), possibly obtained under a false pretext, become a new source used alongside lawfully authorized ones (like wiretaps or CDRs)? I think yes, and we’re already there.
At what point would such a general 3rd party search or the information gained from it violate Fourth Amendment protections?
Contact us today to learn how SS8’s nearly twenty-year legacy in the law enforcement and intelligence space can help deliver the right solution for you.
Kevin is responsible for leading the vision, design, and delivery of SS8’s government solutions, including the Xcipio compliance portfolio. His deep knowledge of the telecommunications and network security industries spans 20 years, with extensive experience in the areas of cyber security, network forensics, big data, fraud detection, and network monitoring.