DL 431

The Difference Between Attention and Care

Published: May 6, 2026 • 📧 Newsletter


We are outsourcing the most human parts of learning and care to systems that hallucinate when they try to feel.

Three stories this week, from very different corners of the internet, all connected by a shared question.

What fills the space when the human leaves the room?

That question has been circling digital life for decades. We’ve mostly answered it with optimism: increased scalability, reach, access, connection, and convenience. Some of that promise has been real.

But this week made the costs harder to ignore.


❤️‍🩹 What gets lost when care scales

Anthropic released the results of a qualitative study of over 80,000 people across 159 countries describing what they actually want from AI systems. The striking finding was that most people were not looking for productivity hacks. They were looking for relief, patience, time, access, and someone (or something) that would listen without judgment.

Half of young Europeans now turn to AI for intimate conversations. The kind they might once have had with a counselor, trusted adult, or close friend. This same week, Pennsylvania’s Attorney General sued Character.AI after a chatbot allegedly claimed it was a licensed psychiatrist, complete with a fake license number.

On their own, those stories feel alarming. Together, they point to something larger. We are increasingly routing care, companionship, and emotional support through systems built for scale instead of relationship.

This tension was amplified this week by research published in Nature finding that AI models that attempt to account for users’ feelings become more likely to make errors. The warmer and more emotionally responsive the model becomes, the more it hallucinates. Put simply, when models prioritize relational harmony over honesty, they make things up.

The problem is not simply whether AI can imitate empathy. It is whether we begin confusing simulated attentiveness with actual human responsibility.

That is the deeper challenge emerging here. Youth are inheriting a world where systems can sound caring without understanding, respond instantly without accountability, and maintain emotional presence without relationship.

As more of life moves online, we gain convenience and connection, but often lose the human parts that make relationships real. The question is whether we still believe those things are worth protecting, or whether we’ll only notice them once they’re gone.


🌪️ The story moves faster than the truth

An influential study promoting ChatGPT in education was retracted this week over “red flags” in the research. But the bigger issue is that the original finding had already spread much farther than the correction ever will.

Ben Williamson, a digital education researcher at the University of Edinburgh, pointed out on Bluesky and LinkedIn that many educators, researchers, and policymakers who encountered the study may never realize it was withdrawn. The headline survives. The correction struggles to catch up.

The pattern extended far beyond AI research this week. A new Bloomberg-Feroot investigation found that nine of the 10 largest US health companies are still loading advertising trackers on patient login and registration pages years after public warnings about privacy risks.

Different industries. Same pattern.

The story moves first. The hype scales quickly. The truth arrives later. Quieter, slower, and with far less reach.

Even the web itself is becoming less stable as a historical record. Several major publishers are now blocking the Internet Archive’s Wayback Machine in an attempt to prevent AI companies from training on their content. The concern is understandable, but the consequences are troubling. In trying to slow AI extraction, parts of the public record may simply disappear.

The Wayback Machine is not just an archive of old websites. Journalists, researchers, courts, and Wikipedia use it to verify edits, preserve evidence, and document what institutions once said before pages changed or vanished. I use it constantly in my own work and have linked to archived pages throughout this newsletter for years.

Digital literacy is often framed as learning how to use tools. Increasingly, it may depend on whether we can still build institutions capable of remembering honestly, correcting publicly, and slowing down long enough for the truth to catch up.

Because the danger is no longer just misinformation. It’s that we may be building digital systems where accountability itself cannot keep pace with the algorithmic hype cycle.


🚧 Who gets to shape the transition?

A Chinese court ruling this week sparked headlines claiming companies could no longer fire workers simply because AI could do their jobs. The reality was more technical than that, but the deeper shift still matters. Some societies are beginning to treat AI not just as an innovation race, but as a labor, governance, and public stability problem.

That same conversation is beginning to emerge elsewhere. U.S. Senator Bernie Sanders convened American and Chinese AI researchers this week to discuss the risks of unchecked AI development. Their argument is that the technology is moving faster than democratic oversight and we need more safeguards. Labor groups including the UAW, AFL-CIO, AFT, and NEA have also started publicly organizing around AI protections and worker safeguards.

The question is no longer whether AI will reshape society. It already is.

The real question is who gets to shape the terms of that transition. Markets, governments, workers, educators, or the handful of companies building the systems in the first place.


💭 Consider

*Attention is the rarest and purest form of generosity.*

— Simone Weil


🌱 Final Thought

A thread ran through nearly every story this week.

We are building systems optimized for speed, scale, prediction, convenience, and engagement. But many of the things that matter most in human life move slowly: trust, care, accountability, memory, judgment, presence.

That tension now sits at the center of digital literacy. Not whether students can use AI tools. Whether they can recognize what gets lost when human responsibilities are handed over to systems designed primarily to scale.

The challenge is helping youth navigate digital spaces without surrendering the human capacities those spaces struggle to preserve. Once a society forgets how to protect care, truth, memory, and accountability, it usually notices too late.


The move to Substack also introduces The Understory, a slower, deeper layer beneath Digitally Literate where I follow the connective tissue underneath these stories.

This week’s first issue asks a bigger question sitting beneath all three sections above:

What happens when institutions begin replacing human judgment with systems optimized for scale?

I’m making the first couple of Understory essays free as I build this next phase in public. After that, it becomes part of the paid subscription that helps sustain the work long-term.

If these reflections help you think more clearly about education, technology, and the world students are inheriting, there are a few ways to support the work:

See you next Wednesday on the other side. As always, my email is hello@wiobyrne.com.



