TLDR 142

Too Long; Didn't Read Issue 142

Published: 2018-03-24 • 📧 Newsletter

Welcome to Issue 142. This week we come to terms with the fact that you are the product.

Typically in TL;DR, I strive for some balance in the news of the week, unless there is something big that happens and I think we need to drill down into the storylines. This is one of those weeks. Don't worry...we should get back to a broad spectrum of stories next week...but for now...

This week I was on Spring Break. I presented four different talks over the week. Here are the slide decks in case you're interested:

Please subscribe to make sure this comes to your inbox each week. You can review the newsletter archives or read past issues on Medium.

Say hey with a note at hello@wiobyrne.com or on the socials at wiobyrne.


📺 Watch

This video is a good start to this week's issue. The story about Facebook and Cambridge Analytica is quite complex, and it has raised a lot of tension and questions. As we begin...take a couple of minutes to get the overview.


📚 Read

This week a whistleblower came forward with a scandal involving Cambridge Analytica, a British consulting firm that combines data mining, data brokerage, and data analysis with strategic communication for the electoral process. There is a complicated web of relationships that explains how Cambridge Analytica was able to harvest raw data from 50 million Facebook profiles to direct its messaging. Cambridge Analytica is tangled up in several scandals, and it can be hard to keep track of how all the pieces fit together. This series of diagrams from Vox provides a good overview.

Please be advised that this story is related to the 2016 U.S. Presidential Election. As such, there is a great deal of hysteria and hyperbole involved. In this issue of TL;DR, I'm trying to cut through the mess and help inform you about what is happening with your data, content, and identity in online spaces.

A three-paragraph summary from the post in The Atlantic is below:

In June 2014, a researcher named Aleksandr Kogan developed a personality-quiz app for Facebook. It was heavily influenced by a similar personality-quiz app made by the Psychometrics Centre, a Cambridge University laboratory where Kogan worked. About 270,000 people installed Kogan's app on their Facebook account. But as with any Facebook developer at the time, Kogan could access data about those users or their friends. And when Kogan's app asked for that data, it saved that information into a private database instead of immediately deleting it. Kogan provided that private database, containing information about 50 million Facebook users, to the voter-profiling company Cambridge Analytica. Cambridge Analytica used it to make 30 million "psychographic" profiles about voters.

Cambridge Analytica has significant ties to some of President Trump's most prominent supporters and advisers. Rebekah Mercer, a Republican donor and a co-owner of Breitbart News, sits on the board of Cambridge Analytica. Her father, Robert Mercer, invested $15 million in Cambridge Analytica on the recommendation of his political adviser, Steve Bannon, according to the Times. On Monday, hidden-camera footage appeared to show Alexander Nix, Cambridge Analytica's CEO, offering to bribe and blackmail public officials around the world. If Nix did so, it would violate U.K. law. Cambridge Analytica suspended Nix on Tuesday.

Cambridge Analytica also used its "psychographic" tools to make targeted online ad buys for the Brexit "Leave" campaign, the 2016 presidential campaign of Ted Cruz, and the 2016 Trump campaign. If any British Cambridge Analytica employees without a green card worked on those two U.S. campaigns, they did so in violation of federal law.


IMHO, some of the best thinking about digital spaces comes from Zeynep Tufekci. Her latest post in the NY Times details a common thread that she's been exposing over the last couple of months. Facebook is ultimately an online space where you socialize, and your data is sold to others.

Facebook users go to the site for social interaction, only to be quietly subjected to an enormous level of surveillance. "Facebook makes money, in other words, by profiling us and then selling our attention to advertisers, political actors and others. These are Facebook's true customers, whom it works hard to please."


This post from Andrew Keane Woods on the Lawfare blog is a great legal primer on the moving parts of the debacle. Woods argues that this was not a 'breach' of data, but it was a breach of trust.

Several key takeaways from this piece by Woods:

"[Aleksandr] Kogan did not need to get Facebook data through the back door. He could waltz in through the front door -- the door Facebook built for developers."

This was not a breach of Facebook's network. But it was a breach of users' trust, general expectations and perhaps also Facebook's terms of service.

If you're Kogan, or Cambridge Analytica, expect lawsuits, public hearings and general regulatory hell. Maybe, in the extreme, jail time. If you're Facebook, expect lawsuits, public hearings, and general regulatory hell. Maybe, in the extreme, the end of the firm as we know it.


Facebook CEO Mark Zuckerberg finally stepped out to respond to the questions about Facebook's role in this controversy. On Wednesday evening he sat down for an interview with CNN's Laurie Segall in which he largely repeated his PR department's talking points.

Zuckerberg said that he was "really sorry that this happened." He reiterated that he believed the company made big mistakes, first by allowing app developers far too much access to user data in previous builds of the site, and later by blindly trusting that Cambridge Analytica and the other companies involved would actually delete that data just because Facebook sent them a sternly worded letter. Zuckerberg committed to notifying all impacted users that third parties may have run off with their data.

Let us remember that Facebook has been playing it "fast and loose" with your data and info all along. In recent issues of TL;DR we've talked about the growing cacophony of voices suggesting that social networks, and this platform specifically, may be bad for us.

There was no Facebook data breach. This is not a mistake. Facebook is operating exactly as it was designed to operate. It is doing exactly what it was built to do. Facebook never earned your trust, and now we're all paying the price. You gave them the data, and they gave it away — all according to plan.


Arwa Mahdawi in The Guardian brings us to a close by asking the question that has been on everyone's lips all week. Are the Cambridge Analytica revelations the final nudge we need to turn away from the social network? And this is only the tip of the iceberg when it comes to big tech harvesting private information. Is it time to delete Facebook?

In my opinion, the answer is no. Facebook provides an incredible service that is very much needed. It provides a digital commons around which we all commune. This recent event is one link in a broader chain of the company's regular decisions, actions, and ethos. But I don't think it's worth leaving the network. I do think this shows a need for a broader public dialogue about our rights and freedoms online. We cannot put the genie back in the bottle. We cannot wish for a world without big data and big tech. But we can clearly indicate and enforce the rights, rules, and freedoms we demand online. In short, we need an Internet Bill of Rights for all users. Europe's GDPR may be a first step in that direction.

You may want to delete your private Facebook data without deleting your account. Please be advised...Facebook isn't going to make it easy on you.


🔨 Do

Doug Belshaw shared this excellent post from Buster Benson on living like a hydra. I recommend reading the whole post...multiple times. You won't regret it.

One of the things I really enjoyed from the piece was this list of ten ways to live an antifragile life:

  1. Stick to simple rules
  2. Build in redundancy and layers (no single point of failure)
  3. Resist the urge to suppress randomness
  4. Make sure that you have your soul in the game
  5. Experiment and tinker — take lots of small risks
  6. Avoid risks that, if lost, would wipe you out completely
  7. Don't get consumed by data
  8. Keep your options open
  9. Focus more on avoiding things that don't work than trying to find out what does work
  10. Respect the old — look for habits and rules that have been around for a long time

🤔 Consider

"If you're not paying for something, you're not the customer, you're the product being sold." — Andrew Lewis


Previous: TLDR 141 • Next: TLDR 143 • Archive: 📧 Newsletter

Part of the 📧 Newsletter archive documenting digital literacy and technology.