Your Privacy: Lip Service or True Value?

The articles below recount an interesting set of experiences in which the authors did everything they could to cover their digital tracks and avoid all electronic tracking.
After a fair amount of effort and expense, and with varying degrees of success, they discuss the value of privacy and whether people at large are willing to actually do anything about it — or just gripe as usual.
How valuable is YOUR privacy? Are you doing anything about it?
Summary article by Pam Baker in FierceBigData; original pieces by Davey Alba in Popular Mechanics (March 6 and February 4).
Emphasis in red added by me.
Brian Wood, VP Marketing

Most are too lazy to protect themselves

A few brave souls embarked on experiments to see if they could find a way to protect their privacy in an increasingly connected and big data fueled world. They’ve been successful to varying degrees and often at considerable cost. So did they find the effort to protect their privacy worth it in the end? Here are a couple of their stories to help you decide whether to follow the trail they blazed.
Julia Angwin, a Pulitzer Prize-winning journalist at The Wall Street Journal and author of the newly released Dragnet Nation, spent a full year attempting to protect her privacy and thwart data collectors. She details her experiences in the book, and also gave an interview to Davey Alba, published in Popular Mechanics. It’s an interesting read, particularly since Alba did much the same himself for an earlier Popular Mechanics cover story and understood both the privacy issues at stake and the scale of the undertaking.
Interestingly, both of them worried that their reporting might amount to undue fear-mongering in the end, even though both are staunch privacy advocates.
“For three years I had been covering privacy issues for The Wall Street Journal and leading an investigative team,” said Angwin in the Alba interview. “We’d done stories that they’re tracking you here and they’re tracking you there. These were all revelations that people were shocked to find out. I did start to feel like I was contributing to the problem of fearmongering. I do think there are serious consequences to this tracking… but I was worried I was contributing to the feeling of helplessness, where people felt like, ‘Well, they know everything. The cat’s out of the bag, so there’s nothing I can do about it.'”
But neither was advocating giving up on actively protecting your privacy. Alba described seven ways to protect your privacy in his cover story, and Angwin dedicated a significant portion of her book to privacy-protection tactics.
“What I hope is that… everyone will take a little something away from it [her book] and realize, you know what? I could do that. I could get better passwords, or I could block ad tracking. Or they might choose to do the really hard things. I wanted people to realize they do have some options,” said Angwin to Alba.
But what she discovered in the end was that most people didn’t care enough to exercise those options.
“Right now, there’s no metrics that show that anyone really cares,” she explained in that interview. “My husband refuses to do any of this. He’s like, ‘Are you kidding me? This is too much work.’ My kids actually love it, though. They do Silent text and Silent calls with me. They love that it’s secret. Kids love secrets, right? Having the secret is more fun.”
“The hardest part was my sources,” she said. “I didn’t want them to think that talking to me is going to end up in trouble. Obviously, these days it’s getting more dangerous to talk to journalists, particularly about these kinds of topics. None of them–literally none of them–wanted to do it because it was too much work. It’s all this software. That was disappointing but actually understandable.”
So what do you think? Would you go to great lengths to protect your privacy, or do you think it’s more effort and expense than it’s worth? Do companies and governments need only wait for the emotion to pass to get on with their work unimpeded? And if so, does this mean Big Brother is a certainty and ultimately inescapable?
We must decide now because big data is only getting bigger with every passing day and it will never be easier than it is now to confront and tame the issue.

Are You Too Paranoid About Digital Privacy, or Not Paranoid Enough?

After covering online privacy issues for years at The Wall Street Journal, Pulitzer Prize-winning journalist Julia Angwin asked what a lot of us are thinking: Is it possible to be an active participant in our increasingly connected world and avoid leaving a damaging digital trail, or at least keep a little of our privacy? In her new book, Dragnet Nation, Angwin describes how she spent 12 months changing her habits and using technological tools to protect herself from all the data-collecting dragnets that surround us—an experiment I’m all too familiar with, having done much the same for a PopMech cover story. Angwin quit Google, carried a burner phone, and lined her wallet with metallic film. I called her up to see if we could compare notes on our respective quests for privacy, security, and freedom.
Let’s start with one thing you hear over and over again when interviewing people about privacy: “I’ve got nothing to hide.”
Yeah, I hear that a lot, too, right? My feeling about that is, I also have nothing to hide, really. That’s why I think the checking is unfair. I’m not a suspect. I never would’ve had a file in the government, anywhere, because there’s no reason . . . except for maybe my tax returns. Certainly not in any of the intelligence agencies or my local police—even 10 years ago, right? Now I definitely do. That’s not because anything has changed about me. It’s because [it’s become so easy] for them to surveil everyone.
The question I ask is, what are they gonna do with that information? If it’s totally benign, that’s one thing, but I’m not sure we can guarantee that for all time to come. Just think about our changing social norms about drug use, right? Our social norms change over time. We’re willing to accept a certain amount of drug use, but maybe not so much. You don’t really know when you’re gonna get into trouble. Are you willing to trust that everything you do right now is not ever gonna be considered wrong?
When you assume you have nothing to hide, you concede way too much information at the outset—when it’s actually your right not to have that trove of information on you exist in the first place.
Right. I think that’s one thing that’s interesting about our country versus others. Basically, in Europe, privacy is considered a human right. They don’t ask whether or not you have anything to hide because they have this very different approach, which is you should have it as a guaranteed right. We don’t have that approach here, so we have to defend it and argue about it.
How do you find a balance between alerting people to all this digital spying without making them worry more than necessary?
That’s exactly why I wrote the book. For three years I had been covering privacy issues for The Wall Street Journal and leading an investigative team. We’d done stories that they’re tracking you here and they’re tracking you there. These were all revelations that people were shocked to find out. I did start to feel like I was contributing to the problem of fearmongering. I do think there are serious consequences to this tracking . . . but I was worried I was contributing to the feeling of helplessness, where people felt like, “Well, they know everything. The cat’s out of the bag, so there’s nothing I can do about it.”
My book was an attempt to investigate whether we can do anything about it. What I hope is that . . . everyone will take a little something away from it and realize, you know what? I could do that. I could get better passwords, or I could block ad tracking. Or they might choose to do the really hard things. I wanted people to realize they do have some options.
I consider those options to be a little bit like voting. If you care about privacy and you do something about it, then there’s some metrics that people care. Right now, there’s no metrics that show that anyone really cares.
Would you say that technology has opened the door to more possibilities for intrusion, but on the flip side, also gives us tools to fight back and reclaim our privacy?
Yes, and I agree that technology can be part of the solution. I think it might be hard for it to be the whole solution. If you think of it as an arms race between us, the citizens, and the people who are surveilling us, which is companies and the government, we’re gonna be underfunded in that race. [But if] we all just started encrypting our email, and we all started blocking unwanted tracking, and we all decided that we were not going to . . . agree to participate in companies that just say they’re gonna sell your data, we could make some decisions.
What were some of the problems you ran into when you tried to get off the grid? It was hard for me to give up my smartphone and do a burner phone or a flip phone. And I was pretty alone in a lot of the social networks and the tools that I used.
That’s why the section I have on encryption is called “Lonely Codes.” Because it was an entirely lonely existence. I could get myself all encrypted, but then convincing anyone I communicated with to do it was such a high bar. Silent Circle is one of the easier ones. Even that I had a hard time getting people to use—in part because it’s very expensive.
The burner phone was kind of a nightmare, because you think you have a phone with you, but you really can’t use it to call your regular contacts, or else you just basically might as well have your regular phone with you. And then you start giving out that number. Then it becomes your number, right? It was kind of a disaster. I ended up signing up for this service that gave me the equivalent of a Google Voice number, but I didn’t wanna use Google. I got this other number, and then it would forward to the burner phone. That was sort of an okay solution. The thing is, that company then has everything. Nothing really is an answer on the phone.
How much did you try to convince people to join you?
My husband refuses to do any of this. He’s like, “Are you kidding me? This is too much work.” My kids actually love it, though. They do Silent text and Silent calls with me. They love that it’s secret. Kids love secrets, right? Having the secret is more fun.
The hardest part was my sources. I didn’t want them to think that talking to me is going to end up in trouble. Obviously, these days it’s getting more dangerous to talk to journalists, particularly about these kinds of topics. None of them—literally none of them—wanted to do it because it was too much work. It’s all this software. That was disappointing but actually understandable.
A lot of these privacy issues involve information that you volunteer. Is this battle founded on this idea of convenience and sharing versus security now?
I still think that what you share explicitly on a social network is really a small part of what is being collected about you. What’s probably less obvious to people is that so much of what you do online, just what websites you visit and what searches you do, can be much more revealing about you than what you post. Because we post the sanitized, super-clean version. It’s all curated . . . Everybody’s posts are, like, “I’m so awesome.” I’m sure that that is of value to the people who do surveillance, but it seems as though they get much more information about us from watching our behavior. Where do we go? What do we do? What do we buy? Who are we talking to?
It’s the who are we talking to that really bothers me about social networks. The fact that your friends list is exposed, I think, is too revealing. That’s why I quit LinkedIn and I unfriended everyone on Facebook. I didn’t quit Facebook because I wanted to be able to see into Facebook, so I just kept a profile that basically says, “I’m not here. I’m on Twitter.”
A lot of these services you mentioned, especially the most popular ones, are free. But they take our information and offer it to advertisers, which gives them value and makes them profitable. Should we have a choice in the matter?
We have to decide whether we’re willing to pay for stuff. I’m willing to pay. I might not be a big enough market, but I do find it surprising that there aren’t people offering me more services. I can’t find an email service that I can pay for that doesn’t track me. I think that that is an indication of how far we’ve gone down the free road.
You’re exchanging your data and getting a free service. The problem is, you don’t know how much your data is worth. Right now, it’s not worth very much, actually. The Financial Times did an analysis, and my data was worth 28 cents. But what you don’t know is how much it’s worth over time, and when you aggregate it and add other data to it. If it’s being used just for marketing ads, maybe you’re willing to trade it. If it’s gonna be used against you by the government to put you in jail, it’s worth a lot.
How do we get more choices about our privacy?
I’m hopeful that, post-Snowden, that’s changing. There are a little bit more companies offering privacy-protecting services. I do think a market is starting to emerge. However, it’s not gonna be cheap. I pay a lot of money for that Silent Circle thing. I think it’s like $120 a year. I added up everything that I spent on privacy last year, and it was $2200.
Oh, wow.
Yeah, that’s a lot of money, and it doesn’t count my time.
What about regulation? How does the law need to change?
The laws that we operate under right now really envision an outdated mode of privacy, starting with the Fourth Amendment. The Fourth Amendment protects the walls of your home. They can’t come in and get your papers from your home without a search warrant. That is still true, but the thing is, now we don’t store our “papers” in our home. We store them in Google Docs.
Last year there was a Supreme Court decision, U.S. v. Jones, where Justice Sotomayor wrote in her concurrence that we may need to rethink the entire concept of the way the Fourth Amendment works in this world, where we store all of our documents outside of the home. That has not yet been adopted by the Supreme Court, but it was the first time that we’d seen the court acknowledge that there might need to be a rethinking.
The problem is, with the court it’ll take 10 years.
How hard is it to sustain a private life in the long term? Do you still actively use all the tools that you researched for your book?
I really do use them all. The thing that I haven’t solved is the phone. Other than limiting the apps on it and using encryption on the phone, I still haven’t really gotten to the point where I feel comfortable just leaving the house without my phone. I have kids, and I want them to reach me. I still am surprisingly committed to it. I think what happens is, the more you realize how much you’re being monitored, the more worried you get about it. I’m strangely more worried about it.
I haven’t been able to deal with the data brokers. It’s just not workable to never give your name out when you make purchases. Those kind of things are really hard.
Is it beneficial for people to at least get to that level, where maybe they can’t solve problems like the phone and the data brokers, but at least they’re aware?
I think so. Even if you don’t think you care about privacy, you do probably care about being tracked, and about criminals. They all use the same techniques. As a society, we’re all gonna just need to get more tech-literate, whether we want to or not.

7 Ways to Reclaim Your Digital Privacy

The digital spies are watching you—marketers, the NSA, identity thieves, and all kinds of snoops. But the battle’s not over. These are the seven best ways to fight back.
Privacy, we say, is about to come roaring back. No, it’s not too late. Yes, we know that Google monetizes both our emails and our search histories. It’s true that data brokers market our personal dossiers, listing everything from our favorite blogs to our old parking tickets (identity thieves must love it). And NSA leaker Edward Snowden really did prove the paranoids right: The United States government spies on everyone.
Now, we agree that security agencies have a vital responsibility to track terrorists, but that mission can’t require all citizens to live in a surveillance state. Feel you have nothing to hide? That assumes the data will always be used to defeat terrorists, not to monitor activists, let alone to stalk ex-girlfriends—yes, NSA employees have done that. Here’s the other side to the privacy-is-dead argument. You can fight the privacy erosion that technology has enabled using tools that technology provides. And when you protect your data—using encryption and other tools—you incidentally bolster the argument that security is the norm. At least it should be. Privacy is not dead but simply suffering from neglect. It’s your job to revive it.

Web browsers work in two directions: You use them to learn about the world, and snoops use them to learn about you. The sheer number of identifying files, or cookies, downloaded onto our computers can surprise even jaded digital natives. Many cookies are helpful—keeping you logged in to a service, for instance—but others exist purely to help marketers target their sales pitches. An online tool maintained by the Network Advertising Initiative can reveal who is collecting information on you; a browser we tested was being tracked by 82 firms, with names such as AppNexus, Criteo, and Datalogix.
Cookies can be cleared, but new methods for tracking online use will be harder to circumvent. For instance, some companies use browser fingerprinting, which looks for distinctive patterns of computer settings, such as installed fonts and time-zone details, to home in on a user’s identity. Google and Microsoft are also working on a new form of cookie-less identification: unique IDs with tracking that reaches beyond the desktop and into the user’s browsing activities on smartphones and tablets. Google’s system potentially could be used to tie together data across all its products—Gmail, the Chrome browser, and Android phones. In addition to tech firms, the U.S. government can monitor your digital trail through your browser. Among last year’s revelations: The NSA has tapped into the fiber-optic cables that make up the Internet’s backbone, and, through the Marina metadata application, the agency can track an individual’s browsing history, social connections, and, in some cases, physical locations.
Routine Fix: To practice good browser hygiene, regularly clear your cookies and your browser cache. There are a number of browser add-ons that can shrink the deluge as it pours in. For instance, AdBlock Edge blocks ads and third-party trackers. The Disconnect add-on lets you see and prevent otherwise invisible tracking of your browsing history. (Both add-ons work with Firefox and Chrome; Firefox is preferable because it’s an open-source browser.)
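To make the tracking concrete, here is a stdlib-only Python sketch that parses a Set-Cookie header of the kind a third-party ad network sends; the cookie name and tracker domain are made up for the example.

```python
from http.cookies import SimpleCookie

# Parse a Set-Cookie header like one a third-party ad server might send.
# The name ("uid") and domain (".tracker.example") are hypothetical.
header = "uid=abc123; Domain=.tracker.example; Path=/; Max-Age=31536000"
cookie = SimpleCookie()
cookie.load(header)

morsel = cookie["uid"]
print(morsel.value)                     # abc123 -- a persistent unique ID
print(morsel["domain"])                 # .tracker.example -- set by a third party
print(int(morsel["max-age"]) // 86400)  # 365 -- it sticks around for a year
```

A one-year unique ID scoped to the tracker’s own domain is exactly what lets a single ad firm recognize you across every site that embeds its widgets.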
Extreme Fix: Organizing resistance to a totalitarian state and need real anonymity? Download the Tor Browser Bundle. Tor has become famous as a secure way for activists, journalists, and, yes, some criminals to browse the Web. Tor bundles your data into encrypted packets and directs it through a worldwide volunteer network of more than 3000 servers, hiding your location and making your data more difficult to read along the way.
There are two downsides to Tor: First, it’s slow, because your data is sent through at least three relays, with each relay donating different amounts of bandwidth to Tor users. Second, merely downloading it can draw government scrutiny. The NSA has reportedly developed a system called FoxAcid to insert eavesdropping applications into the machines of Tor users. However, the agency admitted in a leaked Snowden document, “We will never be able to de-anonymize all Tor users all the time.” A virtual private network (VPN) adds a different kind of protection by encrypting all outbound computer communications. Combine Tor with a VPN and you’ve got even tighter security.
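The plumbing for routing an application through Tor is simple: point your HTTP client at Tor’s local SOCKS proxy. A minimal Python sketch, assuming the Tor client is running on its default port 9050 and the third-party requests library is installed with SOCKS support (`pip install requests[socks]`):

```python
# Route web requests through Tor's local SOCKS proxy.
# "socks5h" (not "socks5") makes DNS lookups go through Tor as well,
# so your resolver doesn't leak which hosts you visit.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_via_tor(url, timeout=60):
    """Fetch a URL through Tor; expect it to be slower than a direct request."""
    import requests  # imported here so the config above stays stdlib-only
    return requests.get(url, proxies=TOR_PROXIES, timeout=timeout)

# Example (needs a running Tor daemon):
#   r = fetch_via_tor("https://check.torproject.org/")
#   check.torproject.org reports whether the request really arrived via Tor
```

The generous timeout reflects the article’s first downside: every request hops through at least three volunteer relays.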

In 2011 an Austrian law student named Max Schrems asked Facebook to provide all the data it had collected on him, taking advantage of an obscure provision in a European data-protection law passed in 1995. Schrems initially received only a fraction of his data. He protested, and eventually a CD showed up at his door that held a 1222-page PDF, which included employment information, relationship statuses, pokes, old chat conversations, and geotagged photos—most of it information that Schrems thought he had deleted. Such data is being monetized by tech companies in increasingly invasive ways. Google’s Shared Endorsements feature, for instance, allows the company to include a Google Plus user’s name and photo alongside ads being shown to his social contacts, if the original user had indicated some interest in the product. And potentially such data could also be pored over by recruiters, cybercriminals, and stalkers.
Routine Fix: Use strong privacy settings on each of your social networks, placing limits on who can see your posts. To block tracking software associated with the Share buttons on many websites, install Disconnect, an extension that disables such widgets. Also, log out of social networks when you’re finished, and routinely clear cookies.
Extreme Fix: Opt out of social media—invite your friends to a barbecue.

In early 2012 a tinkerer with the Internet alias Puking Monkey hacked a plastic “moo cow” toy to sound an alarm every time his E-ZPass was read. This RFID-enabled device is used to pay bridge and highway tolls throughout much of the East. But during a test drive in July 2013 the cow lit up and wailed in Manhattan, even when the car was nowhere near a toll plaza. The unseen E-ZPass readers had been installed to help monitor traffic flow—but that didn’t pacify the hacker. “If nontoll tracking is benign,” asks Puking Monkey in an email, “why is it not disclosed when you sign up for an E-ZPass?”
There are ways to avoid that kind of tracking. But you can’t do too much about the really big guns of automotive surveillance: the tens of thousands of automatic license-plate scanners deployed across the country. In Grapevine, Texas, to give one example, 14,547 vehicles were photographed in one day, and up to 2 million plates are currently stored in a database. Most law enforcement agencies can still set their own policies on the use and retention of the data (it varies by state); many have no policy at all. In addition to all this, cars are themselves data-sharing devices—electric cars can upload data to their manufacturers, and connected services such as GM’s OnStar and the Ford SYNC infotainment system send information to the cloud. But the most widespread in-car device is the event data recorder (EDR), which tracks seatbelt use, speed, steering, and braking, among other bits of vehicle data. This data comes into play during accident investigations. Ninety-six percent of cars built in 2013 have the devices; they will be required in all new cars starting next September.
Routine Fix: You can store RFID devices such as an E-ZPass in a read-prevention holder until you get to a tollbooth. Or simply pay cash—though that option is going away on some roadways. There’s a lot of chatter about techniques to defeat license-plate cameras, but it’s unclear whether these are legal or even effective.
Extreme Fix: When it comes to black boxes in cars, the best approach is to know your legal rights—or, better yet, just to drive safely. Really hate being watched? Buy an old car that predates black boxes.

Instant messages seem fleeting, but they’re not. The messages are stored, at least briefly, on the IM service provider’s servers, and, unless you delete them, on your machine and your partner’s. And unencrypted messages are vulnerable to interception as they travel from your device through your ISP’s network to your IM service provider (Google, AOL, Yahoo, Microsoft, or whomever) and then out to your friend’s computer. But does anyone actually snoop on IM conversations? Well, the U.S. government does, for one. Snowden leaks reported in July 2013 revealed the existence of XKeyscore, an NSA program run in cooperation with security agencies in New Zealand and Australia that, among other things, lets agents surveil IM correspondence, often in real time.
Routine Fix: Delete your chat records, in case anyone gets hold of your phone or laptop. You can stop recording future chats by changing the settings in your IM client.
Extreme Fix: The gold standard in IM encryption is OTR, or Off The Record (not to be confused with Google’s proprietary Off The Record chat feature, which isn’t secure). OTR uses “perfect forward secrecy,” which means a fresh set of encryption keys is created every time one partner in the chat sends a new batch of messages. Note: Even participants in the chat won’t be able to review old messages. As Ian Goldberg and Nikita Borisov, the designers of the OTR protocol, explained in an email, “The only record of the conversation is your memories.”
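The fresh-keys-per-batch idea can be sketched with a toy Diffie-Hellman exchange in Python. This illustrates forward secrecy only, not the OTR protocol itself, and the parameters are nowhere near production grade:

```python
import secrets

# Toy Diffie-Hellman sketch of "perfect forward secrecy": both sides
# generate a brand-new key pair for each batch of messages, derive a
# shared secret, and throw the old keys away. Real OTR uses vetted
# primitives; the prime below is for illustration only.
P = 2**127 - 1   # a Mersenne prime; fine for a demo, never for real secrecy
G = 3

def ephemeral_keypair():
    """Fresh private/public pair, meant to live for one message batch."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both sides compute the same value without ever transmitting it."""
    return pow(their_pub, my_priv, P)

# Batch 1: each party makes ephemeral keys and agrees on a secret.
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
batch1 = shared_key(a_priv, b_pub)
assert batch1 == shared_key(b_priv, a_pub)

# Batch 2: new keys, new secret; compromising it reveals nothing about batch 1.
a_priv2, a_pub2 = ephemeral_keypair()
b_priv2, b_pub2 = ephemeral_keypair()
batch2 = shared_key(a_priv2, b_pub2)
print(batch1 != batch2)   # True (with overwhelming probability)
```

Because each batch key is derived from throwaway material, seizing a device later yields no key that can unlock earlier conversations, which is why there is no record to review.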

The content of your emails can be less revealing than the metadata—the record of which contacts you correspond with and how often. Through a program called Stellar Wind, the NSA logged metadata on email communications for 10 years, and from 2007 to 2011 the data included bulk information on Americans. In a separate effort, the government agency has been scooping up hundreds of millions of contact lists from around the world, at a rate of 250 million people a year.
One piece of fallout from that spying has been the shuttering of two services that until recently offered a high level of protection—not just against the United States government but also against repressive regimes and criminal organizations. Ladar Levison, the owner of Lavabit, a Texas-based secure email service, closed down operations in August after the FBI asked him to hand over the encryption keys that protected his site, which would have given the government access to all user data. The FBI said it was just interested in Lavabit’s most famous user, Edward Snowden—but refused Levison’s offer to provide access to that account only. A few hours later the encrypted communications company Silent Circle announced that it, too, was closing its email operations because, while the messages sent through its service were encrypted, email protocols—SMTP, POP3, and IMAP—leave user metadata open to spying. “We decided that our email service was too much of a risk for us and our customers,” Silent Circle’s Jon Callas says. “While it might have been a good idea six months before, it wasn’t a good idea in a post-Snowden world.” The companies have since teamed up to develop a new service, called Dark Mail, meant to secure both the content of an email and its metadata—the encryption will only work among Dark Mail users.
Routine Fix: Ordinary email protocols make it impossible to hide metadata information, but there are ways to secure the content of your messages. Check that you’re using the common Internet security protocols, SSL and TLS, when you’re on webmail. (The browser’s address line will start with https, and a small padlock appears.) If you’re using a desktop mail client, make sure you’re connected via SSL/TLS over IMAP or POP; otherwise your emails are being sent in cleartext and can be read by outsiders. Also, turn on two-factor authentication, a security feature offered by the three big email services, Gmail, Yahoo, and Outlook (see “5 Email Myths Debunked,” p. 82, for additional routine email-security measures).
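The SSL/TLS check can also be done programmatically. Here is a short Python sketch of the settings a desktop mail client should insist on; the hostname in the comment is a placeholder for your own provider’s server:

```python
import ssl

# TLS settings for connecting to a mail server. create_default_context()
# already turns on certificate validation and hostname checking; we
# additionally refuse obsolete protocol versions.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # reject SSLv3 / TLS 1.0 / 1.1

print(ctx.verify_mode == ssl.CERT_REQUIRED)    # True: server cert must validate
print(ctx.check_hostname)                      # True: cert must match the host

# To actually connect ("imap.example.com" is a placeholder; port 993 is
# the standard IMAP-over-SSL port):
#   import imaplib
#   mail = imaplib.IMAP4_SSL("imap.example.com", 993, ssl_context=ctx)
```

If a client won’t accept these settings, or the server only offers plain IMAP/POP, your mail is crossing the network in cleartext.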
Extreme Fix: People who truly need to guard their communications use PGP (Pretty Good Privacy) when they email each other. Every user has a pair of cryptographic keys, a public encryption key, and a private decryption one. The public key is widely distributed, while the private key is kept by the owner. A sender encrypts his or her note with the recipient’s public key, transforming it into gibberish. Since only the sender and receiver hold the keys, no one in the middle—including the email service provider—can decode the message. PGP doesn’t hide the metadata, though, and everyone you communicate with has to be using PGP for it to work.
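The public/private key idea behind PGP can be shown with textbook RSA in a few lines of Python. The primes below are deliberately tiny, purely to make the math visible; they offer no real secrecy:

```python
# Textbook RSA with toy numbers, illustrating PGP's key pair concept.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

msg = 65                   # the "note", encoded as a number smaller than n
cipher = pow(msg, e, n)    # anyone can encrypt with the PUBLIC key (e, n)
plain = pow(cipher, d, n)  # only the holder of the PRIVATE key d can decrypt

print(plain == msg)        # True
print(cipher != msg)       # True: without d the ciphertext is gibberish
```

Real PGP wraps this idea in much larger keys plus symmetric encryption of the message body, but the asymmetry is the same: the sender needs only your widely published public key, while decryption requires the private key you never share.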

There’s no need to invent the ultimate citizen-surveillance device: It already exists, and it’s called the smartphone. Police departments have been investing in IMSI catchers (that’s short for International Mobile Subscriber Identity). These devices insert themselves between mobile devices and cell towers—the technology can be used to identify participants at a demonstration and even access their conversations. Hackers can build or buy the devices, as well. Additionally, law enforcement agencies can easily subpoena third-party companies for user data; in 2011 cellphone carriers responded to an astonishing 1.3 million demands for subscriber information. The companies handed over text messages, caller locations, and other information, in most cases without the knowledge of the user. Brick-and-mortar retailers are also making use of cellphone-location data: Some chains have started experimenting with using phones to track individual shoppers as they move through the store. And many mobile phone apps can transmit location data, contact lists, and calendar information back to their developers. Lose an unlocked phone and, of course, you give up access to your contact lists, emails, chats, and everything else that resides on your phone.
Routine Fix: First, delete the apps you don’t use—fewer apps means fewer robotic spies.
Extreme Fix: Silent Phone can encrypt phone calls ($10/month, iOS and Android)—both parties need to be subscribers. There are also secure apps for IM chats and Web browsing. Prepaid, or burner, phones are relatively safe from snooping because they aren’t tied to an account. And if you’re worried about IMSI catchers at your next political rally, just leave your phone at home.

We all know that browsing on an unsecured network is just asking for someone armed with cheap network-analyzing software to tune in by vacuuming the 802.11 data packets flying between your machine and the Wi-Fi router. That can happen in Starbucks, in an airport, or in your home. Last September a federal appeals court ruled that Google could be held liable for civil damages for eavesdropping on homeowners’ Wi-Fi networks while using the company’s camera-carrying Street View cars. Google says it was all a misunderstanding: The Wi-Fi data was being used to pinpoint precise locations where GPS signals were spotty.
Routine Fix: Most wireless Internet access points come with WEP (Wired Equivalent Privacy) or WPA (Wi-Fi Protected Access) to let you encrypt the messages between your computer and your access point. Use WPA if possible; it’s the stronger technology. In addition to protecting your data, turning on encryption gives you legal protection against hackers under the Wiretap Act, which Congress passed in 1968 and last amended in 1986 through the Electronic Communications Privacy Act (ECPA). If you don’t make any attempt to secure your data transmissions, the law assumes that your intention is to run a public network.
Extreme Fix: Combine a virtual private network with the Tor bundle and you’re as safe as you can be—well, almost. Want even better security? Don’t use Wi-Fi at all.