DL 202
Clearing the Path
Published: 2019-06-22 • 📧 Newsletter
Welcome to issue 202. Clearing the path.
Hi all, my name is Ian O'Byrne and welcome to Digitally Literate. In this newsletter I curate the news of the week and distill it into an easy-to-read resource for you. Thank you for reading.
This week I worked on a lot of things behind the scenes as I'm preparing for a weeklong institute on computational thinking and coding as part of the Infusing Computing initiative.
🔖 Key Takeaways
- Moderator Trauma Crisis: Facebook content moderators at Cognizant facilities experience severe PTSD, workplace filth, and lasting psychological damage from policing toxic content.
- Tech Veganism Philosophy: Doug Belshaw explores preference for open source, big tech suspicion, and high privacy standards as ethical technology stance.
- Art Censorship Irony: Bill Posters' deepfake of Zuckerberg gets labeled and suppressed by Facebook, demonstrating the platform silencing critique of itself.
- Zero-Click Dominance: 49% of Google searches now end without clicks as the platform provides answers directly, harming content creators and enabling "Google says it's true" thinking.
- Mindfulness Mechanism: Research reveals mindfulness reduces depression specifically by decreasing rumination—the repetitive mulling over negative emotions.
📺 Watch
Anouk Wipprecht: Fashion Tech and Wearable Interfaces
Somehow fashion designer Anouk Wipprecht popped up in my feed this week, and I went down a rabbit hole exploring her work.
Up above I shared the "spider dress." Check out the "smoke dress" and the "drinkbot dress." Definitely review Wipprecht's TED talk on Clothing as Emotional Interface.
Wipprecht's work collapses boundaries between fashion, technology, and emotional expression. The spider dress uses proximity sensors to extend mechanical legs when someone approaches too quickly: clothing that enforces personal space. The smoke dress deploys vapor when the wearer feels threatened. These aren't costumes but functional interfaces between body and environment, making internal states externally visible and actionable. The "clothing as emotional interface" concept extends wearable technology beyond fitness tracking toward affective computing: garments that sense, respond, and communicate. This represents fashion's evolution from passive covering to active agent in social interaction.
📚 Read
Facebook Content Moderators: The Trauma Behind the Scenes
A content warning applies to this story, as it describes "violent acts against people and animals, accounts of sexual harassment and post-traumatic stress disorder, and other potentially disturbing content."
Former Facebook content moderators are speaking out about their working conditions in the United States for the first time ever.
Earlier this year, The Verge's Casey Newton broke the story about the working conditions of Facebook moderators in Phoenix, AZ. This latest story is a followup revealing that the pattern of severe workplace conditions extends to a second campus in Tampa, FL, but in even more extreme ways. They describe a scene in which the office is filthy, the work is grim, and the side effects of doing the job last long after it is over.
You can watch the video overview here.
We need to think about the long-term mental health implications for the humans policing our social networks. We also need to think about our own discourse practices as we engage in these spaces.
Casey Newton's investigation reveals the human cost hidden beneath platform cleanliness. Content moderators—often contractors, not employees—view horrific violence, abuse, and exploitation daily, making split-second decisions about what billions of users should see. The working conditions compound trauma: inadequate breaks, surveillance, production quotas, filthy facilities. PTSD symptoms persist long after employment ends. This is the labor that makes social media usable—someone has to watch the worst humanity produces so you don't have to. Facebook's outsourcing to Cognizant distances the company from accountability while profiting from the moderation labor. We discuss what content should be removed; we rarely discuss who removes it and at what personal cost.
What Is No Good for the Hive Is No Good for the Bee
A great post by good friend Doug Belshaw looking at the subject of tech veganism. Tech veganism is:
- a preference for open-source software over proprietary software
- a suspicion of big tech companies
- a high bar for privacy and security
Belshaw threads the needle between "time well spent" in our public social networks (Facebook, Twitter), politics, and power.
My takeaway from this is that we need to think more not only about the practices and tools we use online, but also the real power or freedom that we have in these choices. Once again…this is bringing me back to open source and indieweb philosophies.
The "tech veganism" metaphor illuminates how technology choices become ethical stances. Just as food veganism involves examining supply chains and refusing participation in harmful systems, tech veganism examines who controls software, who profits from data, and what systems your usage supports. The indieweb connection matters: owning your data, hosting your content, building on open protocols rather than proprietary platforms represents technological self-determination. Belshaw's synthesis recognizes this isn't about purity—we all make compromises—but about conscious choice-making. What are you willing to accept? What are you willing to trade? These become moral questions, not just convenience calculations.
The Mark Zuckerberg Deepfakes Are Forcing Facebook to Fact Check Art
I've talked quite a bit about deepfakes over the last couple of years here in this newsletter. My favorite example of this technology is this video mashup of Steve Buscemi's head on Jennifer Lawrence's body.
That was until this deepfake of Mark Zuckerberg was released on June 7th by the digital artist and researcher Bill Posters. Facebook labeled the video and limited its sharing on the network, which is striking given that the platform frequently declines to limit the spread of other viral, and perhaps disingenuous, content.
In response, Posters said that he is "deeply concerned" about Facebook's decision to downrank his art, arguing that it sets a dangerous precedent for other artists who want to critique or challenge systems of power. He has posted another Zuckerberg deepfake on Instagram to protest the first one being labeled as false.
The Zuckerberg deepfake incident exposes platform power asymmetry. Facebook declined to remove manipulated videos of Nancy Pelosi, citing free expression. Yet when an artist uses the same techniques to critique Zuckerberg, the platform labels and suppresses it. The "fact-checking" framing misses the point: Posters' work is obviously art, not deceptive impersonation. The real issue is a platform protecting its leader while permitting attacks on others. Posters' protest—posting more deepfakes in response to censorship—demonstrates how artists can make platform inconsistency visible. When Facebook decides what critique is acceptable, it exercises editorial power while claiming neutrality.
49% of All Google Searches Are No-Click
Zero-click searches have steadily risen over the past three years, up from 12% in 2016 to 48.96% in 2019.
A no-click, or zero-click, search is one in which the search engine results page (SERP) displays the answer to the user's query at the top of the results. This kind of result satisfies the user's intent without requiring a click on any search result links.
This is troubling for a number of reasons. First, it makes life harder for content creators, as Google is slowly building a walled garden in online spaces. We often find fault with Facebook for this, but Google is actively replicating it in the open.
Second, it undermines critical evaluation of online information. If "Google says it's true," people will increasingly take an answer as fact and move on.
The zero-click trend represents Google's transformation from search engine to answer engine—and the distinction matters enormously. Search engines connect you to sources; answer engines provide conclusions. When Google extracts content from websites and displays it directly, creators lose traffic while Google gains engagement time. The content ecosystem that produces answers gets undermined by the platform displaying them. The critical thinking implication is equally concerning: users never see source context, competing perspectives, or uncertainty markers. "Google says" becomes authoritative, bypassing the evaluation skills that clicking through to sources might develop. We're outsourcing not just search but judgment.
Mindfulness Reduces Depressive Symptoms by Decreasing Rumination
New research sheds light on the relationship between depression and mindfulness. The study found that people who exhibit more dispositional mindfulness tend to ruminate less about past events.
The study, published in the journal Mindfulness, suggests that ruminators tend to latch onto a negative emotion and repeatedly mull it over in their minds, whereas mindfulness teaches us not to become entangled with our negative emotions.
One of the authors, Paul Jose, suggested that "Future work would usefully explain more precisely how mindfulness provides a protective resource for individuals coping with their problems. Perhaps the mindfulness facets of non-judging and non-reacting are associated with particularly helpful coping strategies, such as cognitive restructuring?"
The rumination mechanism explains why mindfulness helps: not by eliminating negative thoughts but by changing relationship to them. Rumination involves obsessive replay—the same painful memory or worry circling endlessly. Mindfulness cultivates observation without attachment: noticing thoughts arise without following them down spirals. The "non-judging and non-reacting" facets matter because judgment feeds rumination ("I shouldn't feel this way") while reaction amplifies it. This research suggests mindfulness isn't mystical but mechanical—it interrupts cognitive patterns that sustain depression. The practical implication: mindfulness practices targeting rumination specifically may be more effective than generic meditation.
🔨 Do
Day One: Private Journaling App
Farhad Manjoo writes about the opportunity to journal and document your life in an "unsocial" way using apps like Day One. Manjoo says the practice "has transformed my relationship with my phone, improved my memory, and given me a deeper perspective on my life than the one I was getting through the black mirror of social media."
Think of Day One as a private social network for an audience of one: yourself. You post updates to it just as you might on Instagram or Facebook. The app can handle long text journals, short picture-focused status updates, and pretty much anything else that comes across the digital transom.
The "unsocial network" framing captures what private journaling offers that public posting doesn't: reflection without performance. Social media encourages crafting experiences for audience approval; journaling enables processing experiences for personal understanding. The memory benefit comes from active engagement—writing about events consolidates them differently than passive scrolling past memories served by algorithms. Day One's design recognizes that the interface matters: making journaling as frictionless as posting to Instagram lowers barriers to consistent practice. The "deeper perspective" Manjoo describes emerges from accumulation—patterns visible across entries that individual moments obscure.
🤔 Consider
"Not all storms come to disrupt your life, some come to clear your path." — Paulo Coelho
Coelho's reframing of disruption connects to this issue's challenges and clearings. Content moderators experience storms of trauma that platforms should address. Tech veganism clears paths toward more ethical technology use. Zero-click searches disrupt content creators. Mindfulness clears rumination's storms. The question is which disruptions we accept and which we resist.
🔗 Navigation
Previous: DL 201 • Next: DL 203 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Content Moderation — Casey Newton exposing Facebook moderators' PTSD trauma and workplace conditions as hidden cost of platform cleanliness in Platform Labor.
- Tech Ethics — Tech veganism philosophy preferring open source privacy and big tech suspicion as conscious ethical stance in Digital Rights.
- Platform Power — Facebook suppressing Zuckerberg deepfake art while permitting similar attacks on others demonstrating asymmetric editorial control in Censorship.
- Search Evolution — Zero-click Google searches rising to 49% transforming search engine to answer engine harming creators and critical thinking in Information Literacy.
- Digital Wellbeing — Mindfulness reducing depression through rumination interruption and Day One enabling private reflection versus public performance in Mental Health.
Part of the 📧 Newsletter archive documenting digital literacy and technology.