DL 402
The Great Fracturing
Published: September 7, 2025 • 📧 Newsletter
Welcome to Digitally Literate 402. This week's headlines show a digital ecosystem splitting into two competing orders: one that answers risk with centralized control and mass data collection, and another that refuses that bargain to preserve privacy and human agency.
If you've found value in these issues, subscribe here or support me on Ko-fi.
🔖 Key Takeaways
- Exit or Submit: Platforms are choosing to leave markets rather than comply with invasive surveillance requirements
- Protection as Pretext: Age verification, health monitoring, and cybersecurity measures expand data collection under safety rhetoric
- Authenticity Crisis: Even AI industry leaders admit automated content is fundamentally changing the web
- Legal Pushback: Major settlements signal real consequences for extractive data practices
📚 Recent Work
Here are some of my recent posts:
- What’s Actually Inside an AI Model? (It’s Not What You Think) - When we talk about an AI model, what is actually in that model?
- Cracking Open AI Models: What AI Intelligence Actually Looks Like - A deeper dive into the files that you'll find in an AI model.
- Your Personal AI Sandbox: What It Really Means to Run Models Locally - I think it's time to start thinking about running your own local AI models. Here's what that means.
👁️🗨️ Exit or Submit
The internet is splitting in ways that affect all of us. When someone says they're protecting us, it usually means collecting more of our data. We're presented with the same choice again and again: give up privacy for convenience and safety, or push back and accept some friction.
Some platforms are choosing exit over compliance when regulators demand invasive fixes. Bluesky blocked access in Mississippi rather than implement age verification it said would "fundamentally change how users access" the service. A clear example of a platform refusing to normalize universal identity checks.
This isn't isolated resistance. It’s a clear sign of the new digital bargain. Either submit to surveillance-as-standard, or leave the mainstream internet.
🛡️ Protection as a Pretext for Data Capture
“Protection” is becoming the justification for expanding data collection. RFK Jr., who once warned about smart devices being built for surveillance, now wants every American wearing one, even though there are no laws strong enough to stop companies from misusing that data.
Schools face the same trap. To defend against cyberattacks, they roll out monitoring systems that track students more closely. The PowerSchool breach that exposed data on 62 million students shows the risk: even systems meant to protect kids can end up harming them.
When protection becomes infrastructure, surveillance follows.
👾 Authenticity Under Strain
The fabric of the human web may be shrinking. Industry leaders are publicly worrying that automated content is changing the ratio of human to machine activity on the web. Sam Altman recently said he's "suddenly worried" that dead internet concerns may be real. The Dead Internet Theory (DIT) suggests that much of today's internet, particularly social media, is dominated by non-human activity, AI-generated content, and corporate agendas, leading to a decline in authentic human interaction. That is a remarkable admission from Altman, the CEO of OpenAI and one of the people most responsible for flooding platforms with AI-generated content.
Social media is "sloshing" in AI-generated posts. Engagement is collapsing. People keep scrolling, not out of joy, but out of habit.
⚖️ Legal Pushback
Courts and regulators are starting to react. Anthropic's proposed $1.5 billion settlement with authors over using pirated books to train AI models signals meaningful legal consequences for extractive data practices. Notably, the judge ruled that training the AI models on the books was fine, but stealing the raw materials was not.
It will be interesting to see what happens in the next wave of lawsuits, when publishers, music labels, and news organizations arrive with deep pockets.
Meanwhile, the Google antitrust ruling required data sharing but avoided a breakup: Google must hand over its search results and some data to rival companies, but it does not have to sell off its Chrome web browser.
A pattern of accountability theater that creates the appearance of strong action while potentially making underlying problems worse.
💪 From Fracture to Design: Choosing Agency
The bigger picture is clear. We’re being asked to trade privacy for safety. But that’s not the only option.
Instead of rejecting technology, we can demand smarter tech. Tools that collect less, explain themselves clearly, and keep people in control. That may mean accepting a little friction, like double-checking permissions or slowing down to verify a source. But sometimes friction is the feature, not the bug.
The challenge ahead isn’t whether we’ll keep using powerful digital tools. It’s whether we’ll use them on our terms, not theirs.
🤔 Consider
We live in a world where there is more and more information, and less and less meaning.
— Jean Baudrillard
This week’s stories point to the same dilemma. Protection framed as safety, or agency framed as resistance. We don’t have to reject technology, but we do have to insist on tools that collect less, explain themselves, and keep us in the loop. That’s how we bring meaning back to the flood of information.
🔗 Navigation
Previous: DL 401 • Next: DL 403 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Platform Accountability - How companies respond when regulation conflicts with business models
- Digital Rights - The tension between safety measures and privacy protection
- Age Verification - Universal identity requirements disguised as child protection
- Dead Internet Theory - The collapse of authentic human content online
- Surveillance Capitalism - How protection rhetoric enables data collection expansion
- AI Copyright Settlement - Legal consequences for extractive training data practices
- Data Minimization - Designing systems that collect less, not more
- Intentional Friction - Choosing agency over frictionless automation
Also available at digitallyliterate.net with linked connections.