DL 191

On Suffering & Surveillance

Published: 2019-03-30 • 📧 Newsletter

Welcome to Digitally Literate 191. On suffering & surveillance.

Hi all, my name is Ian O'Byrne and welcome to Digitally Literate. In this newsletter, I try to synthesize what happened this week so you can be digitally literate as well.

I posted a couple of other things this week.




📺 Watch

This week my classes started playing with AR and VR using some high-tech and low-tech tools and toys. This video discusses how the two technologies are confusingly similar but utterly different.

Augmented reality overlays digital information onto the physical world: you see your actual environment enhanced with virtual elements through a phone screen or glasses. Virtual reality replaces your environment entirely: a headset blocks out the physical world and substitutes a computer-generated space. AR augments reality; VR creates an alternative reality. The distinction matters pedagogically: AR keeps students grounded in physical context while adding layers of information, whereas VR transports them to otherwise inaccessible places and perspectives. AR works with cheap devices (any smartphone); VR requires dedicated hardware. AR suits contextual learning (identifying plants in the field); VR enables impossible experiences (walking through historical events, exploring molecular structures). Both technologies raise questions of presence: what happens when digital and physical blur?
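The same contrast shows up at the API level. Here is a minimal sketch using the browser's WebXR Device API (my illustration, not from the video): one API drives both technologies, and the only difference is the session mode you request.

```typescript
// Minimal sketch with the WebXR Device API (assumes a WebXR-capable browser;
// immersive sessions must be requested from a user gesture, e.g. a button click).
async function startXR(mode: "immersive-ar" | "immersive-vr") {
  const xr = (navigator as any).xr; // XRSystem; typed loosely to avoid needing @types/webxr
  if (!xr || !(await xr.isSessionSupported(mode))) {
    throw new Error(`${mode} is not supported on this device`);
  }
  // "immersive-ar" composites rendered content over the camera's view of the
  // physical world; "immersive-vr" replaces the view with a rendered scene.
  return await xr.requestSession(mode);
}

// AR on a smartphone:        startXR("immersive-ar")
// VR on a dedicated headset: startXR("immersive-vr")
```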


📚 Read

Nellie Bowles in The NY Times on the possible reasons why dominant thought leaders, from Google to Apple, are focused on inner virtues, self-mastery, and courage. This focus on Stoicism may be an indication that they believe the "world and its current power structure are correctly set" and that they just need to fit right in.

Start-ups big and small believe their mission is to make the transactions of life frictionless and pleasing. But the executives building those things are convinced that a pleasing, on-demand life will make them soft. So they attempt to bring the pain.

The contradiction is telling: billionaires building platforms to eliminate friction in others' lives embrace personal friction as a virtue. Cold showers, fasting, extreme exercise: suffering as a status symbol. Stoicism historically emerged among those without the power to change their circumstances; tech's adoption inverts this, deployed by those with maximum power precisely to avoid changing circumstances. If the problem is internal (weakness, distraction, insufficient discipline), the solution is personal transformation. If the problem is external (inequality, exploitation, concentration of power), the solution is systemic change. Stoicism conveniently locates all problems internally. The philosophy that once comforted the powerless now justifies the powerful.

Great piece by Chris Gilliard responding to an op-ed in the New York Times about a privacy lesson. Kate Klonick, an assistant professor at St. John's University Law School, described an assignment where she asked students to eavesdrop on and surveil unsuspecting people in public to see what information they could gather using only Google search on their phones.

Gilliard summarizes his thinking: Don't surveil people. Don't turn students into spies. Don't divorce privacy from its effects on vulnerable populations.

Gilliard's critique cuts to the fundamental problem with "privacy as abstraction" pedagogy. Treating surveillance as an intellectual exercise, an interesting puzzle about information availability, ignores its differential impact. Some populations face surveillance as an existential threat: undocumented immigrants, domestic violence survivors, political dissidents, marginalized communities targeted by law enforcement. Teaching students to surveil strangers normalizes surveillance as a neutral activity rather than an exercise of power with real victims. The pedagogy also assumes students occupy positions of relative safety: they're the watchers, never the watched. Truly understanding privacy requires recognizing it as protection for the vulnerable against the powerful, not a thought experiment for the privileged.

James Mullarkey on the test we need to give "Internet of Things" devices to make sure they respect our basic rights and personal safety.

How do your devices stand up to these three conditions regarding your data? Roughly: you decide what is collected; the data collected enriches your life rather than the company; and you decide whether, and with whom, it is shared.

Mullarkey's three-part test exposes how thoroughly current IoT devices fail basic data ethics. Smart speakers that are always listening, thermostats that report your schedule, televisions that watch you watching them: none pass even the first test. The individual doesn't decide what's collected; the company does. The purpose of the data isn't user enrichment but corporate value extraction. Sharing decisions are buried in terms of service no one reads. The framework isn't radical; it simply asserts individual sovereignty over personal data. That such basic standards seem utopian reveals how far surveillance capitalism has normalized invasion as convenience.
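To make the test concrete, here's a minimal sketch in TypeScript; the field names are my paraphrase of the three conditions, not Mullarkey's wording, and the smart-speaker example follows the reasoning above.

```typescript
// The three conditions, paraphrased as a checklist (names are mine).
interface DataPractices {
  userDecidesWhatIsCollected: boolean; // does the individual choose what data is collected?
  dataEnrichesUser: boolean;           // is the purpose user enrichment, not value extraction?
  userControlsSharing: boolean;        // does the individual decide who it is shared with?
}

function passesTest(device: DataPractices): boolean {
  return (
    device.userDecidesWhatIsCollected &&
    device.dataEnrichesUser &&
    device.userControlsSharing
  );
}

// The always-listening smart speaker described above fails on all three counts.
const smartSpeaker: DataPractices = {
  userDecidesWhatIsCollected: false, // the company decides what the microphone captures
  dataEnrichesUser: false,           // collected for corporate value extraction
  userControlsSharing: false,        // sharing decisions buried in unread terms of service
};

console.log(passesTest(smartSpeaker)); // false
```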

Angela Chen on a startup, Vainu, that uses "prison labor" to classify data for training artificial intelligence algorithms. The startup is using inmates at two prisons in Finland for a new kind of labor.

Some suggest that this is a partnership and a kind of prison reform that teaches valuable skills. Others suggest it plays into the exploitative economics of prisoners being required to work for very low wages.

Is this empowerment or exploitation?

The question resists simple answers. Traditional prison labor (license plates, furniture) offers no transferable skills. Data tagging provides experience in a growing field, potentially useful after release. But prisoners can't meaningfully consent given the power differential. Low wages extract value from a captive workforce. "Job training" rhetoric often masks exploitation. Finland's prison system is more humane than most, but questions persist about whether prisoners can truly choose. The deeper issue: AI systems require massive labeled datasets, and that labeling is tedious human work. Who does the invisible labor that enables machine learning? Often the most vulnerable: gig workers, contractors in the developing world, prisoners. The AI supply chain obscures its human cost.

Jamie Brooker, entrepreneur and co-founder of Kahoot!, on the regular discussions we have about banning devices from our classrooms.

Brooker suggests that instead we should treat this as an opportunity to help young people today become productive members of society.

By becoming knowledgeable about the wider role technology plays politically, socially, and environmentally, and with a greater appreciation for the positives of what technology could enable if designed empathetically, the next generations will be better placed to create the tools that provide a more sustainable future.

The ban debate often misses the point. Phones aren't the problem; unexamined phone use is. Banning devices in school doesn't teach students to manage devices outside school; it just delays the reckoning. Brooker argues for education about technology rather than protection from it. Students need to understand how platforms manipulate attention, how algorithms shape perception, and how design choices serve corporate interests. This knowledge enables critical engagement rather than passive consumption. The goal isn't digital abstinence but digital citizenship: informed users who navigate technology intentionally rather than reactively.


🔨 Do

Understanding Procrastination

If procrastination isn't about laziness, then what is it about?

A few of the key insights:

…on a neural level, we perceive our "future selves" more like strangers than as parts of ourselves. When we procrastinate, parts of our brains actually think that the tasks we're putting off — and the accompanying negative feelings that await us on the other side — are somebody else's problem.

We must realize that, at its core, procrastination is about emotions, not productivity. The solution doesn't involve downloading a time management app or learning new strategies for self-control. It has to do with managing our emotions in a new way.

Neuroscience reframes procrastination as a failure of temporal empathy. We discount future suffering because the future self feels like a stranger: the neural patterns for thinking about yourself in five years resemble the patterns for thinking about other people. Present comfort beats the future self's problems. Understanding this shifts the intervention strategy: not willpower or scheduling but emotional regulation. The task feels overwhelming? Address the overwhelm. The task triggers anxiety? Address the anxiety. Productivity systems fail because they treat symptoms (missed deadlines) rather than the cause (emotional avoidance). Managing procrastination means befriending your future self.


🤔 Consider

"With technology tracking us everywhere we go, 'cosplay' might become our best defense against surveillance." — Annalee Newitz

Newitz's provocation suggests identity play as a resistance strategy. If surveillance depends on consistent identity, linking behaviors across time and space to a single profile, then inconsistency defeats it. Cosplay traditionally means costume play, adopting fictional identities. Digital cosplay might mean varying personas across platforms, introducing noise into data profiles, and strategic inconsistency that confounds algorithms. Silicon Valley's stoics perfect consistent selves; perhaps the oppressed should perfect inconsistent selves. Privacy through obscurity, identity through multiplicity.
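As a toy illustration of what "introducing noise into data profiles" can look like, here's my sketch in the spirit of obfuscation tools like TrackMeNot, not anything Newitz proposes:

```typescript
// Toy sketch: dilute a behavioral profile by interleaving decoy interest
// signals with real ones, so the aggregate no longer describes one consistent
// person. Real obfuscation tools (e.g., TrackMeNot) do this by issuing
// randomized decoy search queries in the background.
const decoyTopics = [
  "vintage synthesizers",
  "orchid care",
  "curling strategy",
  "medieval falconry",
  "bus timetables for cities you will never visit",
];

function randomDecoy(): string {
  return decoyTopics[Math.floor(Math.random() * decoyTopics.length)];
}

// Emit one decoy signal alongside every real action the user takes.
function withNoise(realAction: string): string[] {
  return [realAction, `decoy: ${randomDecoy()}`];
}

console.log(withNoise("searched: symptoms of burnout"));
// e.g. ["searched: symptoms of burnout", "decoy: orchid care"]
```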

