TLDR 183
Focusing on Importance
Published: 2019-02-02 • 📧 Newsletter
Welcome to Issue 183. Focusing on importance.
Hi all, welcome to TL;DR. My name is Ian O'Byrne. I research, teach, & write about technology in our lives. I try to synthesize what happened this week in tech...so you can be the expert as well. We'll have some changes upcoming for this newsletter to help achieve these goals. :)
I posted a couple of things this week:
- What is Digital Literacy? - I'm serving on two workgroups that are revising/rewriting the definitions of digital literacy for literacy educators. This post shares some insight...and asks for response.
- What is Screentime? - Change your phone's display to grayscale.
- Encouraging & facilitating student peer review using Peergrade - An overview of Peergrade, a tool that I've been using to create a space to allow my students to give and get feedback from peers.
🔖 Key Takeaways
- Facebook VPN Spying: Facebook paid users aged 13 to 35 up to $20 a month to install a VPN that gave the company total access to their phones so it could study their habits, until Apple blocked the app. The program reveals a systematic pattern: spy on users, collect data, and sell it to advertisers.
- Tailored Ads Delusion: Zuckerberg claims users want personalized ads as value in exchange for their data, but research shows this is false; users don't actually want ad personalization, contradicting Facebook's talking points.
- Screentime Depression Minimal: A massive study of 355,358 adolescents shows screentime explains less than 0.4% of the variance in depression, revealing that previous research was deeply flawed and the moral panic unfounded.
- AI Labor Reconfiguration: Data & Society research shows automated and AI technologies reconfiguring work, requiring "human infrastructures" that teach workers to make their work machine-legible for AI systems.
- Internet Nostalgia Reflection: A New Yorker retrospective on the tech moments that brought us here inspires nostalgia, despite valid privacy and security concerns, and demonstrates how far we've come in a short time.
📺 Watch
Mark Manson on Focusing on What Matters
Marie Forleo interviews Mark Manson to identify ways to focus on the things that matter in life. Manson is the author of the book The Subtle Art of Not Giving a F*ck. I just finished reading this book, and it was excellent.
Please be warned...there are some NSFW words in the interview.
Manson's philosophy of selective caring applies directly to technology consumption. You can't care deeply about everything social media throws at you; attention is a finite resource that requires intentional allocation. Caring about what matters means not caring about what doesn't. The digital environment defaults to caring about everything (notifications, updates, viral outrage), leaving no capacity for what actually matters. Focusing on importance requires active curation of what gets our attention and ruthless rejection of manufactured urgency.
📚 Read
An Internet Retrospective
For better or worse, the Internet is an element in all of our lives. This retrospective from Erin Overbey and Joshua Rothman in The New Yorker gives a great review of some of the moments in tech that brought us here.
I often have concerns about privacy, security, and our data in these digital spaces. But, this piece filled me with a bit of nostalgia as I consider how far we've come in such a short amount of time.
Now that we've started on a positive note...
Nostalgia for the early internet reveals how much has changed, and how quickly. Dial-up modems, AOL discs, first Google searches, early social networks: all within a single generation. The retrospective invites appreciation for technological acceleration while maintaining a critical eye on where we've arrived. Nostalgia doesn't mean uncritical acceptance; we can appreciate how far we've come while questioning whether we've come to the right place. The speed of change itself is worth reflecting on: has velocity prevented thoughtful development?
Facebook Has Been Paying Teens $20 a Month for Total Access to Their Phone Activity
Since 2016, Facebook has run a program in which it pays users aged 13 to 35 up to $20 a month to install a VPN (virtual private network) on their phones. Facebook uses this backdoor to suck up all of the user's data to study their habits and practices.
Hours after the story broke, Facebook indicated that it would shut down the program. But, before they did anything, Apple blocked the app on iOS devices.
So why does this matter? Facebook is entering a second year of huge data scandals. There is a pattern in which Facebook spies on users, collects data, and sells this off to advertisers...or worse. When they get caught, they obfuscate, block transparency tools, or deflect. They cannot be trusted with our data.
The VPN program exemplifies surveillance capitalism's exploitation model: target vulnerable populations (teens and young adults who need money), offer minimal payment for total surveillance, and use the data to optimize addictive features and ad targeting. The $20 payment obscures the value extracted; comprehensive behavioral data is worth far more. Calling it "research" frames exploitation as scientific inquiry. Apple's intervention reveals a regulatory vacuum in which platforms police each other, not out of ethical concern but for competitive advantage. Facebook's "shutdown" came only after public exposure, not from an internal ethics review.
Mark Zuckerberg's Delusion of Consumer Consent
When Facebook gets caught, they provide two talking points.
The first is that they don't "steal" or collect your data...you give it to them. This argument is technically true, yet Facebook is less than transparent in its terms of use and in letting you know what it does with your data.
The second is that they're providing you value for your data in the form of "tailored ads." The argument goes: we're all going to see ads, so why not see ads for things you actually like? Ad personalization is a common talking point, but do users really want it?
In this opinion piece, Joseph Turow and Chris Jay Hoofnagle share research in which users suggest that...no...this is not something we want.
The "value exchange" narrative is gaslighting: telling users they want something research shows they don't. When explicitly asked, users overwhelmingly reject personalized ads, surveillance-based targeting, and data collection for advertising. But Facebook frames this as beneficial service users desire. The delusion isn't that Zuckerberg doesn't know users' actual preferences—it's that he doesn't care. The business model requires convincing users (and regulators) that surveillance advertising is value-add service not exploitative extraction. Research puncturing this narrative threatens fundamental business model justification.
The Association Between Adolescent Well-Being and Digital Technology Use
A new paper based on a massive sample of 355,358 adolescents indicates that screentime explains less than 0.4% of the variance in depression. The research also shows that previous research in this area is deeply flawed.
This Twitter thread by Patrick Markey provides an excellent overview of the publication.
The 0.4% finding demolishes the screentime panic. For context, eating potatoes explains more variance in wellbeing than screentime does; so does wearing glasses. The effect size is effectively zero. Previous research showing strong correlations suffered from methodological problems: small samples, self-reported data, correlation-causation confusion, and publication bias favoring alarming findings. This massive study, with rigorous methods, finds essentially no relationship. The implication: maybe focus on the actual drivers of adolescent depression (economic precarity, climate anxiety, school pressure, inequality) rather than moral panic about screens.
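As a quick sanity check on what a 0.4% effect means (my own back-of-the-envelope arithmetic, not a calculation from the paper): "variance explained" is the square of the correlation coefficient, so an upper bound of 0.004 implies a correlation of roughly 0.06 on a scale from -1 to 1.

```python
# Back-of-the-envelope arithmetic only (my own illustration, not from
# the study): "variance explained" is R^2, so the implied correlation
# coefficient r is its square root.
import math

variance_explained = 0.004  # the study's < 0.4% upper bound
r = math.sqrt(variance_explained)
print(f"Implied correlation: r ~ {r:.3f}")  # ~ 0.063 on a -1..1 scale
```

On a scale where 1.0 is a perfect relationship, a correlation of about 0.06 is, for practical purposes, indistinguishable from zero.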
AI in Context: The Labor of Integrating New Technologies
In this new report from Data & Society, researchers Alexandra Mateescu and Madeleine Clare Elish show how automated and AI technologies are reconfiguring work at family-owned farms and grocery stores.
As automation becomes a larger force in our lives (and the lives of our children), we need to consider how we will work with these machines.
The authors discuss the "human infrastructures" needed to integrate with these machines. Put simply, future workers will need to think about how to make their work machine-legible.
"Machine legibility" reveals automation's hidden labor cost: humans must adapt to machine requirements not vice versa. Farmers restructure fields for robot navigation. Grocery workers standardize movements for computer vision systems. Knowledge work gets fragmented into microtasks for algorithm distribution. Making work machine-legible often means deskilling, reducing autonomy, increasing surveillance. The "human infrastructure" is workers absorbing complexity of making incompatible systems function together. Rather than AI adapting to human work, humans adapt to AI limitations. This isn't inevitable but reflects power dynamics where workers bear adjustment costs.
🔨 Do
Performance and Printed Text
This video popped up in my feed earlier this week. It's a great look at the power of performance and printed text.
The interplay between performance and text demonstrates how meaning emerges from multiple modes simultaneously. Text provides precision and permanence. Performance adds tone, emphasis, embodiment, and affect. Neither alone captures the full meaning. This has implications for digital literacy: understanding requires engaging both what's written and how it's presented, who's speaking and from what position. Multimodal analysis recognizes meaning as constructed across words, images, sounds, and contexts, not reducible to a single channel.
🤔 Consider
"Don't just sit there. Do something. The answers will follow." — Mark Manson
Manson's action-oriented philosophy challenges the analysis paralysis plaguing digital culture. The Facebook VPN scandal demands doing something: deleting your account, switching platforms, demanding regulation. The tailored-ads delusion requires action: rejecting the surveillance business model. The screentime panic needs action on the actual causes of adolescent depression, not moral handwringing. AI labor reconfiguration demands worker organizing, not passive acceptance. The internet retrospective's nostalgia should inspire action toward building a better future. Focusing on importance means acting on what matters, not endlessly consuming information about what might matter someday. Sitting produces nothing. Doing produces learning, even from failures. Answers follow action, not contemplation.
🔗 Navigation
Previous: TLDR 182 • Next: TLDR 184 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Facebook Teen VPN Scandal — Facebook paid users aged 13 to 35 up to $20 a month for total phone surveillance through a VPN until Apple blocked the app, revealing a systematic pattern of spying. (Surveillance Capitalism)
- Zuckerberg Tailored Ads Delusion — Research by Joseph Turow and Chris Jay Hoofnagle showing users don't want personalized ads, contradicting Facebook's "value exchange" narrative used to justify surveillance. (Advertising Ethics)
- Adolescent Wellbeing Screentime — A study of 355,358 adolescents showing screentime explains less than 0.4% of the variance in depression, demolishing the panic and revealing flawed previous research. (Media Effects)
- AI Labor Integration — Alexandra Mateescu and Madeleine Clare Elish examine how automation requires "human infrastructures," with workers making their work machine-legible and absorbing the adjustment costs. (Future of Work)
- Internet Nostalgia — A New Yorker retrospective on the tech moments that brought us here, inviting appreciation for acceleration while keeping a critical eye on whether velocity prevented thoughtful development. (Technology History)
Part of the 📧 Newsletter archive documenting digital literacy and technology.