DL 405
The End of Empathy
Why Tech Giants Are Building an Anti-Human World
Published: September 28, 2025 • 📧 Newsletter
TL;DR: When pressure is applied, systems built on control, not care, show their true character. That shift is happening now in AI, platforms, and schools. Below: how, why, and what you can do.
Welcome to Digitally Literate 405.
This week, we ask: Why are the most powerful tech companies starting to work against human interests?
The answer isn’t a bug. It’s built into the design. An older, selfish idea holds that computers are better than people: more rule-based, more predictable, more controllable. Today, that idea is being extended: empathy, social responsibility, and human unpredictability are being actively excluded from the systems we all depend on, from homework platforms to news feeds to classroom tools.
If you've found value in these issues, subscribe here or support me here on Ko-fi.
🔖 Key Takeaways
- The Belief: Tech elites want the world to run like a computer. Orderly, predictable, stripped of empathy.
- The Strategy: Companies embed themselves as the hidden wiring of schools and platforms, making them nearly impossible to unplug.
- The Consequence: When challenged, leaders side with money and power, not people, leaving us with systems that care more about control than humanity.
📚 Recent Work
Over the past year, I worked with colleagues to publish this position statement on exploring Generative Artificial Intelligence in English Teacher Education.
💻 The Secret Blueprint: Why Computers Are Preferred Over People
The ideology shaping today’s tech isn’t new. In the 1990s, journalist Paulina Borsook critiqued Silicon Valley’s “Cyberselfish” libertarianism.
Her point: these pioneers weren’t just building tools; they were wiring society to mirror their own preferences.
Rule-based, controllable systems were valued over messy, unpredictable human life. That preference is now hard-coded into today’s digital backbone, from surveillance to classroom platforms.
🏳️ The Great Surrender: When Power Overtakes Principles
When political or market pressure rises, tech leaders often drop their values and protect their power.
Journalist Steven Levy shows how Silicon Valley’s posture of “progressive disruption” has given way to backroom deals and cozy relationships with political leaders.
Tim Cook offers gifts to avoid tariffs. Mark Zuckerberg shifts rules and branding to appease critics. The rhetoric of idealism collapses quickly when the backbone of the business is at stake.
The myth of “fighting giants” has become a strategy of dining with them.
❤️🩹 The War on Empathy: Redefining Compassion as a Flaw
To fully build the controllable, rule-based world they want, the tech elite must dismantle the last human defense: empathy. Sustaining control-focused systems requires reframing empathy as a liability.
Elon Musk has called empathy a “fundamental weakness” for civilization. At the same time, some religious and ideological voices label empathy a “sin” when it supports progressive causes.
The goal of this shift: make cruelty tolerable. Once society believes compassion is dangerous or weak, harsher policies and less humane responses become easier to accept.
💥 The Real-World Danger: Technology and Our Youth
This anti-human ideology shows up where it matters most: in children’s lives.
- Lock-in by design. Ben Williamson details how companies are embedding AI into classrooms as the “plumbing” of education. Seamless, invisible, and nearly impossible to remove.
- Safety under fire. The FTC has launched an inquiry into AI chatbots marketed as companions, after lawsuits and reports revealed cases where vulnerable teens were encouraged to self-harm instead of seeking human help.
- Trust on the line. Schools are split: some push ahead with AI tools, while others stall until safeguards exist. The stakes are not efficiency, but whether students feel cared for in their learning spaces.
🤔 Consider
If you are neutral in situations of injustice, you have chosen the side of the oppressor. If an elephant has its foot on the tail of a mouse, and you say that you are neutral, the mouse will not appreciate your neutrality.
― Desmond Tutu
Every system reflects choices, and every choice is moral.
When we adopt platforms without scrutiny, or when leaders stay silent, neutrality becomes complicity. Accepting “the way things are” is already siding with the powerful.
The task ahead is clear: resist the push for a society run for machines, not people. Demand connection, in schools, in the public square, and in our homes, that is grounded not in selfish control but in the messy, vital principle of empathy.
⚡ What You Can Do This Week
- Ask: “How are AI tools vetted here, and who checks their safety?”
- Model: Show that empathy is strength, not weakness, in classrooms, meetings, or family life.
- Demand: Transparency about how tools are monitored and how data is used.
🔗 Navigation
Previous: DL 404 • Next: DL 406 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Pressure Systems – Stress reveals system values
- Infrastructural Authoritarianism – Control through infrastructure, not censorship
- Value Abandonment – When ethics fold under pressure
- Platform Migration – Why people move when systems betray them
- Distributed Resilience – Building systems that resist capture