TLDR 187

Dealing with Trauma

Published: 2019-03-02 • 📧 Newsletter

Welcome to Issue 187. Dealing with trauma.

Hi all, welcome to TL;DR. My name is Ian O'Byrne. I research, teach, & write about technology in our lives. I try to synthesize what happened this week in tech...so you can be the expert as well.

I posted a couple of other things this week:


🔖 Key Takeaways


📺 Watch

Kevin Smith is an American filmmaker, actor, comedian, author, and podcaster. He has directed (and starred in) some of my favorite movies of my youth.

Thanks to the YouTube algorithms, I found out that he also recently suffered a massive heart attack caused by a blockage of the left anterior descending artery. I've lost some important people in my life through heart disease (and cancer) so this caught my attention.

Thankfully, Kevin is alive, but this video has me thinking a bit more about the food that my wife and I consume...and have in the house. We're thinking about eliminating all dairy and gluten from our consumption at home. The kids can still eat whatever they'll eat.

What are your thoughts about diet?

Smith's near-death experience demonstrates how health crises prompt reassessment of daily habits. The "widowmaker" heart attack—a complete blockage of the left anterior descending artery—has extremely high mortality. His survival and subsequent lifestyle changes raise questions about preventive approaches versus reactive interventions. The decision to eliminate dairy and gluten reflects growing awareness of inflammation's role in cardiovascular disease. Dietary choices become a proxy for confronting mortality, control we can exert when facing uncontrollable bodily vulnerability.


📚 Read

National Online Safety Guides

A friend of mine, Peggy Semingson, sent me this link earlier this week to include in my research and outreach. Subscribe to Peggy's YouTube channel for more of her work.

The portal from the National Online Safety organization shares its most up-to-date guides for social media apps and platforms.

Please note, the content at the link above has a content warning. The story contains discussion of serious mental health issues and racism.

Here in TL;DR, I've often taken Facebook, YouTube, and other organizations to task for leaning on algorithms to solve content-moderation problems. But I haven't thought enough about the challenges of having humans work through these same materials.

This post is making me think much more seriously about the challenges with policing these spaces.

The Verge's investigation reveals the profound psychological cost of content moderation. Workers at Cognizant facilities contracted by Facebook experience PTSD, depression, anxiety, and substance abuse from reviewing thousands of disturbing images and videos daily—child abuse, graphic violence, animal cruelty, suicide. The "trauma floor" operates under surveillance, with strict quotas (400+ pieces of content daily), limited breaks, low pay ($28,800 annually), and inadequate mental health support. The contradiction: we demand platforms remove harmful content but externalize the human cost to underpaid contractors. The impossible position: either automate moderation (missing context, making errors) or expose humans to traumatizing material. The systemic failure: treating content moderation as disposable labor rather than specialized work requiring extensive support, fair compensation, and trauma-informed management.

A post from Brian Resnick in Vox addresses a troubling phenomenon: teen anxiety, depression, and suicide are all on the rise. Blaming these trends on screentime, technology, and devices may be a cop-out.

We're not sure what is driving these trends, but it's problematic (lazy) to lump everything into one catchall topic known as "screentime."

"Screen time isn't a thing; it's 100 things," Florence Breslin, a scientist with the Laureate Institute for Brain Research, says. "It's social media, it's video games, it's research, it's reading." Those categories can even be refined further. Playing an online cooperative game with friends is a different experience than playing a solitary game, for example. And researchers ought to expect, or wonder, whether those different applications yield different effects in the mind.

Resnick's critique exposes the intellectual laziness in screentime panic. "Screentime" conflates fundamentally different activities: reading academic articles, cyberbullying, video chatting with grandparents, doomscrolling Twitter, collaborative gaming, passive YouTube consumption. These have wildly different psychological effects. The aggregation obscures actual mechanisms—what specifically about which activities might harm or help? Teen mental health crisis is real and alarming, but monocausal explanations rarely explain complex social phenomena. Other factors: academic pressure, economic anxiety, climate crisis awareness, reduced independence and free play, helicopter parenting, sleep deprivation, social media comparison culture. Blaming screentime provides comforting simplicity and actionable target (take devices away!) while avoiding harder questions about societal structures causing youth distress.

J-PAL North America recently released a new publication summarizing 126 rigorous evaluations of different uses of education technology. Drawing primarily from research in developed countries, the publication looks at randomized evaluations and regression discontinuity designs across four broad categories: (1) access to technology, (2) computer-assisted learning or educational software, (3) technology-enabled nudges in education, and (4) online learning.

The results point to four interesting areas:

The J-PAL meta-analysis provides rare evidence-based clarity in edtech debates. (1) Access alone doesn't improve learning: just giving students devices without pedagogical integration yields no academic benefits, only technical skills. (2) Computer-assisted learning works when software adapts to student level, provides immediate feedback, and supplements teacher instruction—essentially personalized practice in areas like math where algorithms can effectively scaffold. (3) Technology-enabled nudges succeed because behavioral interventions (text reminders about deadlines, personalized encouragement) are low-cost, scalable, and address actual barriers like forgetting and procrastination. (4) Online courses consistently underperform face-to-face instruction, likely due to reduced accountability, social isolation, and lack of immediate support. The implications challenge both techno-optimism (devices don't magically improve learning) and techno-skepticism (well-designed software does help). The critical variable isn't technology itself but pedagogical design and implementation.

Last week in TL;DR, I indicated that Russia was in the process of testing barriers to its connection to the Internet. I asked why they would want to test out these measures.

Apparently the United States military went on the offensive during the 2018 midterm elections, knocking out the internet at a Russian "troll factory" that was trying to spread misinformation online to interfere in the US elections.

There is much more to think about and unpack in this area. Worth noting, at the end of the piece, one US official told the Post, "It's not escalatory. In fact, we're finally in the game."

The cyber operation against the Internet Research Agency marks a shift from a defensive to an offensive posture in information warfare. US Cyber Command deliberately disrupted internet access to prevent Russian operatives from spreading disinformation during the 2018 midterms—direct action against foreign infrastructure conducting active measures. The official framing as "finally in the game" reveals the strategic calculus: treating cyberspace as a domain for military operations, not just defensive monitoring. This escalates conflict while claiming it's a proportional response. Russia's internet disconnect tests make more sense in this context—preparing for international cyber conflicts where nations might be cut off from the global internet. The implications: militarization of information space, potential for cascading retaliation, normalization of offensive cyber operations, fragmentation of the internet into national enclaves. The paradox: defending election integrity through actions that set precedents for broader cyber warfare.


🔨 Do

Keto Diet Research

Keto diets have been quite the rage over the last couple of years. I'm still researching this low-carb diet plan since I first heard about it on the Tim Ferriss Show. I've been really interested in it in light of recent research suggesting it may be an adjuvant therapy in cancer treatments.

Still researching.

Ketogenic diets—high fat, adequate protein, very low carbohydrate—force the body into ketosis, burning fat instead of glucose for energy. Originally developed for epilepsy treatment, they are now marketed for weight loss, performance, and potential therapeutic applications. The cancer research angle: cancer cells rely heavily on glucose metabolism (the Warburg effect), so metabolic therapy through ketosis might selectively starve tumors while nourishing healthy cells. Evidence remains preliminary—promising in animal studies, limited human trials. The dietary intervention is challenging to maintain and carries risks (nutrient deficiencies, kidney stress, keto flu). Like many dietary trends, keto gets oversold by advocates and dismissed by critics when reality likely lies between extremes.


🤔 Consider

"People who do a job that claims to be creative have to be alone to recharge their batteries. You can't live 24 hours a day in the spotlight and remain creative. For people like me, solitude is a victory." — Karl Lagerfeld

Lagerfeld's insight connects directly to dealing with trauma—creative workers and content moderators both face battery depletion requiring solitude for recovery. Content moderators experience trauma from constant exposure to humanity's worst, needing space to process what they've witnessed. Teens experiencing mental health struggles need solitude from constant social media spotlight, not generic screentime reduction. The throughline: sustained attention—whether creative, curatorial, or social—depletes psychological resources requiring restorative solitude. Modern platforms demand 24/7 presence, treating withdrawal as failure when actually solitude enables continued engagement.



Part of the 📧 Newsletter archive documenting digital literacy and technology.