TLDR 142
Too Long; Didn't Read Issue 142
Published: 2018-03-24 • 📧 Newsletter
Welcome to Issue 142. This week we come to terms with the fact that you are the product.
Typically in TL;DR, I strive for some balance in the news of the week, unless something big happens and I think we need to drill down into the storylines. This is one of those weeks. Don't worry...we should get back to a broad spectrum of stories next week...but for now...
This week I was on Spring Break, and I presented four different talks. Here are the slide decks in case you're interested:
- Digital scholarship & digitally engaged publics. Read my piece in Hybrid Pedagogy if you want more info.
- Scaffolding learners in online & hybrid/blended learning spaces
- Digital Cougars survey: Documenting instructional use of technology in College of Charleston classrooms
- Educating students (& educators) as critical readers & writers in online spaces
Please subscribe to make sure this comes to your inbox each week. You can review the newsletter archives or read it on Medium.
Say hey with a note at hello@wiobyrne.com or on the socials at wiobyrne.
🔖 Key Takeaways
- Harvested 50 Million Profiles: The Cambridge Analytica scandal reveals how Aleksandr Kogan's personality quiz app harvested data from 270,000 users plus their friends, totaling 50 million Facebook profiles, which were used to create psychographic voter profiles for the Trump campaign and Brexit.
- Surveillance Capitalism Business Model: Zeynep Tufekci exposes Facebook's core business model—users come for social interaction but are subjected to enormous surveillance, with their attention sold to advertisers and political actors who are Facebook's true customers.
- Not a Breach But Trust Violation: Legal analysis clarifies this wasn't a network breach but a breach of user trust—Kogan walked through Facebook's front door built for developers, highlighting how platform design enables data exploitation.
- Zuckerberg's Insufficient Response: Mark Zuckerberg's CNN interview largely repeated PR talking points saying he's "really sorry," acknowledging mistakes in allowing developers too much data access and blindly trusting companies would delete data when asked.
- Facebook Operating As Designed: Facebook never earned users' trust and is operating exactly as designed—you gave them data, they gave it away, all according to plan, raising question of whether to delete accounts or demand Internet Bill of Rights.
- Need for Internet Bill of Rights: Rather than deleting Facebook and losing valuable digital commons, we need broader public dialogue enforcing rights, rules, and freedoms online, with Europe's GDPR potentially showing the path forward.
📺 Watch
Cambridge Analytica and Facebook's data collection problem
This video is a good start to this week's issue. The story about Facebook and Cambridge Analytica is quite complex, and it has raised a lot of tension and questions. As we begin...take a couple minutes to get the overview.
📚 Read
The Cambridge Analytica scandal, in 3 paragraphs
This week a whistleblower came forward with revelations about Cambridge Analytica, a British consulting firm that combines data mining, data brokerage, and data analysis with strategic communication for the electoral process. There is a complicated web of relationships that explains how Cambridge Analytica was able to harvest raw data from 50 million Facebook profiles to direct its messaging. Cambridge Analytica is tangled up in several scandals, and it can be hard to keep track of how all the pieces fit together. This series of diagrams from Vox provides a good overview.
Please be advised that this story is related to the 2016 U.S. Presidential Election. As such, there is a great deal of hysteria and hyperbole involved. In this issue of TL;DR, I'm trying to cut through the mess and help inform you about what is happening with your data, content, and identity in online spaces.
The three-paragraph summary from the post in The Atlantic is below:
In June 2014, a researcher named Aleksandr Kogan developed a personality-quiz app for Facebook. It was heavily influenced by a similar personality-quiz app made by the Psychometrics Centre, a Cambridge University laboratory where Kogan worked. About 270,000 people installed Kogan's app on their Facebook account. But as with any Facebook developer at the time, Kogan could access data about those users or their friends. And when Kogan's app asked for that data, it saved that information into a private database instead of immediately deleting it. Kogan provided that private database, containing information about 50 million Facebook users, to the voter-profiling company Cambridge Analytica. Cambridge Analytica used it to make 30 million "psychographic" profiles about voters.
Cambridge Analytica has significant ties to some of President Trump's most prominent supporters and advisers. Rebekah Mercer, a Republican donor and a co-owner of Breitbart News, sits on the board of Cambridge Analytica. Her father, Robert Mercer, invested $15 million in Cambridge Analytica on the recommendation of his political adviser, Steve Bannon, according to the Times. On Monday, hidden-camera footage appeared to show Alexander Nix, Cambridge Analytica's CEO, offering to bribe and blackmail public officials around the world. If Nix did so, it would violate U.K. law. Cambridge Analytica suspended Nix on Tuesday.
Cambridge Analytica also used its "psychographic" tools to make targeted online ad buys for the Brexit "Leave" campaign, the 2016 presidential campaign of Ted Cruz, and the 2016 Trump campaign. If any British Cambridge Analytica employees without a green card worked on those two U.S. campaigns, they did so in violation of federal law.
Facebook's surveillance machine
IMHO, some of the best thinking about digital spaces comes from Zeynep Tufekci. Her latest post in the NY Times details a common thread that she's been exposing over the last couple of months: Facebook is ultimately an online space where you socialize, and your data is sold to others.
Facebook users go to the site for social interaction, only to be quietly subjected to an enormous level of surveillance. "Facebook makes money, in other words, by profiling us and then selling our attention to advertisers, political actors and others. These are Facebook's true customers, whom it works hard to please."
The Cambridge Analytica-Facebook Debacle: A legal primer
This post from Andrew Keane Woods on the Lawfare blog is a great legal primer on the moving parts of the debacle. Woods indicates that this was not a 'breach' of data, but rather a breach of trust.
Several key takeaways from this piece by Woods:
"[Aleksandr] Kogan did not need to get Facebook data through the back door. He could waltz in through the front door -- the door Facebook built for developers."
This was not a breach of Facebook's network. But it was a breach of users' trust, general expectations and perhaps also Facebook's terms of service.
If you're Kogan, or Cambridge Analytica, expect lawsuits, public hearings and general regulatory hell. Maybe, in the extreme, jail time. If you're Facebook, expect lawsuits, public hearings, and general regulatory hell. Maybe, in the extreme, the end of the firm as we know it.
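To make that "front door" concrete, here is a minimal, hypothetical sketch of the kind of access the pre-2015 Graph API (v1.0) granted to any app a user installed. The endpoint paths, permissions, and field names below are my own illustrative assumptions, not a reconstruction of Kogan's actual app.

```python
# Hypothetical sketch: how a Graph API v1.0 app could collect data on the
# installing user *and* their friends. Field names are illustrative only.
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # pre-2015 API version

def collect_user_and_friends(access_token):
    """Fetch the installing user's profile plus basic data about their friends.

    Under Graph API v1.0, an app that requested friend-data permissions
    (e.g. friends_likes) could read limited fields for friends who never
    installed the app themselves.
    """
    me = requests.get(
        GRAPH + "/me",
        params={"access_token": access_token, "fields": "id,name,likes"},
    ).json()
    friends = requests.get(
        GRAPH + "/me/friends",
        params={"access_token": access_token, "fields": "id,name,likes"},
    ).json()
    # The critical step: rather than discarding these responses after scoring
    # a quiz, an app could persist them to its own database.
    return me, friends.get("data", [])
```

Facebook shut down most friend-level permissions when it retired Graph API v1.0 in 2015, so this particular avenue no longer exists in the same form...but the broader point about doors built for developers still stands.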
Mark Zuckerberg: I can barely handle this CNN interview, what makes you think I can handle Congress
Facebook CEO Mark Zuckerberg finally stepped forward to respond to questions about Facebook's role in this controversy. On Wednesday evening he sat down for an interview with CNN's Laurie Segall in which he largely repeated his PR department's talking points.
Zuckerberg clarified that he was "really sorry that this happened." He reiterated that he believed the company made big mistakes, first by allowing app developers far too much access to user data in previous builds of the site, and later by blindly trusting that Cambridge Analytica and the other companies involved would actually delete that data just because Facebook sent them a sternly worded letter. Zuckerberg committed to notifying all impacted users that third parties may have run off with their data.
Let us remember that Facebook has been playing fast and loose with your data and info all along. In recent issues of TL;DR we've talked about the growing cacophony of voices suggesting that social networks, and this platform specifically, may be bad for us.
There was no Facebook data breach. This is not a mistake. Facebook is operating exactly as it was designed to, doing exactly what it was built to do. Facebook never earned your trust, and now we're all paying the price. You gave them the data, and they gave it away, all according to plan.
Facebook: Is it time we all deleted our accounts?
Arwa Mahdawi in The Guardian brings us to a close by asking the question that has been on everyone's lips all week: are the Cambridge Analytica revelations the final nudge we need to turn away from the social network? And this is only the tip of the iceberg when it comes to big tech harvesting private information. Is it time to delete Facebook?
In my opinion, the answer is no. Facebook provides an incredible service that is very much needed: a digital commons around which we all commune. This recent event is one link in a broader chain of decisions, actions, and ethos, but I don't think it's worth leaving the network. I do think it shows the need for a broader public dialogue about our rights and freedoms online. We cannot put the genie back in the bottle. We cannot wish for a world without big data and big tech. But we can clearly indicate and enforce the rights, rules, and freedoms we demand online. In short, we need an Internet Bill of Rights for all users. Europe's GDPR may show the path forward.
You may want to delete your private Facebook data without deleting your account. Please be advised...Facebook isn't going to make it easy on you.
🔨 Do
Living Like a Hydra
Doug Belshaw shared this excellent post from Buster Benson on Living Like a Hydra. I recommend reading the whole post...multiple times. You won't regret it.
One of the things I really enjoyed from the piece was this list of 10 ways to live an antifragile life:
- Stick to simple rules
- Build in redundancy and layers (no single point of failure)
- Resist the urge to suppress randomness
- Make sure that you have your soul in the game
- Experiment and tinker — take lots of small risks
- Avoid risks that, if lost, would wipe you out completely
- Don't get consumed by data
- Keep your options open
- Focus more on avoiding things that don't work than trying to find out what does work
- Respect the old — look for habits and rules that have been around for a long time
🤔 Consider
"If you're not paying for something, you're not the customer, you're the product being sold." — Andrew Lewis
🔗 Navigation
Previous: TLDR 141 • Next: TLDR 143 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Cambridge Analytica Facebook Scandal — Aleksandr Kogan's personality quiz app harvested 270,000 users plus friends' data totaling 50 million Facebook profiles, creating 30 million psychographic voter profiles for Cambridge Analytica used in Trump campaign, Ted Cruz campaign, and Brexit Leave campaign, with Mercer family investing $15M on Steve Bannon's recommendation, CEO Alexander Nix suspended after hidden-camera footage showed bribery offers.
- Surveillance Capitalism Zeynep Tufekci — Zeynep Tufekci exposes Facebook's business model where users come for social interaction but are subjected to enormous surveillance, with Facebook profiling users and selling their attention to advertisers and political actors, the true customers whom the platform works hard to please.
- Facebook Trust Breach — Legal analysis clarifies Cambridge Analytica wasn't a network breach but a breach of user trust and expectations: Kogan didn't need a back door; he walked through the front door Facebook built for developers, highlighting how platform design enables data exploitation by third parties with users paying the price.
- Psychographic Profiling — Cambridge Analytica built on Cambridge University Psychometrics Centre work to create psychographic profiles analyzing personality traits for targeted political messaging, combining data mining, data brokerage, and analysis with strategic communication for electoral manipulation.
- Internet Bill of Rights GDPR — Rather than deleting Facebook and losing valuable digital commons for social connection, need broader public dialogue establishing and enforcing rights, rules, and freedoms users demand online, with Europe's GDPR potentially showing path forward for privacy protection.
- Antifragile Living Hydra — Buster Benson's philosophy for antifragile life inspired by Nassim Taleb: stick to simple rules, build redundancy with no single point of failure, resist suppressing randomness, have soul in game, take small risks, avoid wipeout risks, don't get consumed by data, keep options open, focus on avoiding what doesn't work, respect old habits.
Part of the 📧 Newsletter archive documenting digital literacy and technology.