DL 190
Hate, Fear, and Trolling
Published: 2019-03-23 • 📧 Newsletter
Welcome to Digitally Literate 190. Hate, fear, and trolling.
Hi all, my name is Ian O'Byrne and welcome to Digitally Literate. I'm remixing & rebranding a couple of my digital streams as I continue to think about the signals I create & consume online. As part of this, TL;DR is now Digitally Literate.
I research, teach, & write about technology in our lives. In this newsletter, I try to synthesize what happened this week so you can be digitally literate as well. This week I pull together some of the varied discussion about the role of digital, social spaces in last week's Christchurch mosque shootings. If this content is too chilling, please feel free to skip this issue and join us next week.
I posted a couple of other things this week:
- The Technopanic Podcast - My podcast with Kristen Turner went live this week. Subscribe on iTunes, Spotify, PocketCasts, or the podcast catcher of your choice.
- Understanding the differences between privacy and security - It is your responsibility to protect and secure yourself while using digital tools and spaces. This primer gives an overview of some of the language we should use.
- Use the Internet Archive, WordPress, & Blubrry Plugin to set up audio podcasts - A video overview of how I used the tools and spaces listed above to host and share the Technopanic podcast.
🔖 Key Takeaways
- Algorithmic Radicalization: Online extremism operates through platform-driven pipelines that nudge users toward increasingly strident beliefs through recommendation algorithms and engagement optimization.
- Shitposting as Weapon: The Christchurch shooter's manifesto deliberately mixed genuine beliefs with trolling and memes to manipulate journalists into amplifying extremist messaging.
- Platform Design Problem: Social media's fundamental architecture—designed to spread content virally without friction—makes it structurally incapable of preventing harm at scale.
- Media Amplification Trap: Journalism's economic and cultural pressures lead to coverage patterns that inadvertently provide oxygen to extremists seeking attention and recruitment.
- Mental Toughness Necessity: Building psychological resilience becomes an essential survival skill for navigating online spaces weaponized for harassment and radicalization.
📺 Watch
Christchurch Terrorism Analysis
The gunman in the Christchurch mosque shootings shared a racist manifesto online and posted live video of his attack on Facebook. The NY Times spoke to terrorism experts about why this matters.
The Christchurch attack represented a new synthesis of online extremism and real-world violence. The shooter didn't just commit mass murder—he produced it as content, optimized for viral spread: live-streamed on Facebook, manifesto posted to 8chan, shoutouts to YouTube personalities during the killing. The attack was designed to be shared, discussed, and memed. Previous mass shooters sought media coverage; this one created his own media. The platforms faced an impossible task: content spread faster than moderation could work, copies proliferated through screen recordings and re-uploads, and the viral mechanics that make platforms valuable made them complicit in terrorism. The conversation shifted from "platforms need better moderation" to "perhaps platforms shouldn't be designed this way."
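One concrete reason the copies outran moderation: takedown systems that match exact file hashes are defeated by any re-encode or screen recording, because changing a single byte changes the entire hash. Here is a minimal sketch in Python, with invented byte strings standing in for video files; it illustrates the general problem, not any platform's actual system.

```python
import hashlib

# Invented stand-ins for video files; any re-encode or screen
# recording changes the bytes, and therefore the hash, completely.
original = b"video bytes of the original upload"
reupload = b"video bytes of the original upload "  # one byte differs

blocklist = {hashlib.sha256(original).hexdigest()}

def is_blocked(upload: bytes) -> bool:
    """Exact-match takedown: catches only byte-identical copies."""
    return hashlib.sha256(upload).hexdigest() in blocklist

print(is_blocked(original))  # True: the known copy is caught
print(is_blocked(reupload))  # False: a trivially altered copy slips through
```

Perceptual fingerprints that tolerate small edits exist, but they too can be defeated by cropping, filtering, and re-framing, which helps explain how thousands of variants stayed ahead of reviewers.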
📚 Read
Online Extremism and the Radicalization Pipeline
Kevin Roose on the mass murders in Christchurch, New Zealand.
Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.
Roose identifies something crucial: algorithmic radicalization has no offline analog. The recommendation engine doesn't just connect people with similar interests—it optimizes for engagement, and outrage engages. A teenager watching gaming videos gets recommended anti-feminist content, then men's rights material, then white nationalist propaganda. Each step feels natural, chosen. The platform invisibly shepherds users toward extremity because extreme content generates extreme engagement. The radicalization pipeline isn't a bug but an emergent property of engagement-optimized systems. How do we talk about this in schools? The challenge extends beyond media literacy—students need to understand they're navigating systems designed to manipulate their attention and beliefs.
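To make that mechanism concrete, here is a toy model in Python. The catalog, the numbers, and the engagement function are all invented for illustration; this is not any platform's real algorithm, just a sketch of what greedy engagement optimization can do when slightly-more-provocative material reliably earns slightly more engagement.

```python
# Content items scored by "extremity" from 0.0 (mainstream) to 1.0.
catalog = [i / 10 for i in range(11)]

def engagement(item: float, user_position: float) -> float:
    """Assumed user behavior: people engage most with content just
    slightly more extreme than wherever they currently are."""
    return 1.0 - abs(item - (user_position + 0.1))

user = 0.0  # starts on mainstream content
for step in range(10):
    # The recommender greedily picks whatever maximizes engagement...
    pick = max(catalog, key=lambda item: engagement(item, user))
    # ...and consuming the pick shifts the user toward it.
    user = 0.5 * user + 0.5 * pick
    print(f"step {step}: recommended {pick:.1f}, user now at {user:.2f}")
```

Each individual recommendation looks like a small, reasonable step; the drift only shows up over many iterations, which is part of why the pipeline is so hard to see from inside it.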
The Shooter's Manifesto Was Designed to Troll
Taylor Lorenz in The Atlantic.
Significant portions of the manifesto appear to be an elaborate troll, written to prey on the mainstream media's worst tendencies. As the journalist Robert Evans noted, "This manifesto is a trap … laid for journalists searching for the meaning behind this horrific crime. There is truth in there, and valuable clues to the shooter's radicalization, but it is buried beneath a great deal of, for lack of a better word, 'shitposting.'"
Shitposting means posting trollish, ironic content designed to derail conversation or elicit a reaction from those not in on the joke. The Christchurch manifesto weaponized this—mixing genuine white supremacist ideology with internet memes, joke "influences" (he named PewDiePie and the video game Spyro the Dragon), and deliberately provocative statements designed to generate media coverage. The trap: journalists trying to explain the attack would inevitably amplify its message, spreading extremist content even while debunking it. This represents evolved propaganda—not just lying, but poisoning the information environment so that truth becomes impossible to extract. Media literacy now requires understanding that some content is designed specifically to be misreported.
Social Media Are a Mass Shooter's Best Friend
Ian Bogost on how technology platforms police content.
But the internet separates images from context and action from intention, and then it spreads those messages quickly among billions of people scattered all around the globe. The internet was designed to resist the efforts of any central authority to control its content—even when a few large, wealthy companies control the channels by which most users access information.
It's worth remembering that "viral" spread once referred to contagious disease, not to images and ideas. As long as technology platforms drive the spread of global information, they can't help but carry it like a plague.
Bogost's analysis cuts deep: platforms can do better, but better isn't the solution. The problem is the media ecosystem itself—systems designed to spread content without friction will inevitably spread harm. The decentralized architecture that makes the internet resilient also makes it uncontrollable. The viral metaphor is literal: content spreads through networks like pathogens through populations. Platforms face a structural impossibility: the features that make them valuable (rapid sharing, global reach, algorithmic amplification) are precisely the features that make them dangerous. The only solution that works—limiting usability and reach—directly conflicts with business models built on maximizing engagement.
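The epidemic metaphor can even be made precise with the same threshold arithmetic epidemiologists use: content explodes when each viewer recruits more than one new viewer on average (R > 1) and fizzles otherwise. A back-of-the-envelope Python sketch, with invented numbers:

```python
def total_views(initial: int, r: float, generations: int) -> int:
    """Sum a branching process: each generation reaches r times
    as many new viewers as the last."""
    views, current = 0, initial
    for _ in range(generations):
        views += current
        current = int(current * r)
    return views

print(total_views(10, 0.8, 10))  # R < 1: fizzles after a few dozen views
print(total_views(10, 2.0, 10))  # R > 1: explodes past ten thousand views
```

In these terms, algorithmic amplification and one-click sharing are features that push R upward, which is exactly why they are both valuable and dangerous.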
The Attack That Broke the Net's Safety Net
The NY Times Editorial Board on how the Christchurch attack overwhelmed content moderation systems.
It's telling that the platforms must make themselves less functional in the interests of public safety. What happened this weekend gives an inkling of how intractable the problem may be. Internet platforms have been designed to monopolize human attention by any means necessary, and the content moderation machine is a flimsy check on a system that strives to overcome all forms of friction.
The editorial identifies the fundamental tension: platforms designed to maximize engagement have bolted on content moderation as an afterthought. The systems that spread content operate at machine speed; moderation operates at human speed. The Christchurch video spawned copies faster than reviewers could remove them—one upload became dozens became thousands. The "safety net" is structurally inadequate to the system it's meant to catch. The honest solution—making platforms less functional, adding friction, limiting virality—threatens profitability. So we get the worst of both worlds: platforms optimized for spread with moderation teams perpetually overwhelmed.
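The speed mismatch is easy to see with toy numbers. Assume, purely for illustration (these figures aren't drawn from any platform's reports), that copies of a video double every hour while reviewers clear a fixed five hundred items per hour:

```python
uploads, review_capacity = 1, 0
for hour in range(1, 25):
    uploads *= 2            # copies proliferate at machine speed
    review_capacity += 500  # human review scales linearly
    if uploads > review_capacity:
        print(f"hour {hour}: {uploads:,} copies vs "
              f"{review_capacity:,} reviewable: moderation is underwater")
        break
```

However generous the staffing, an exponential process eventually crosses any linear capacity; only changing the growth rate itself, by adding friction or limiting virality, changes the outcome.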
The Oxygen of Amplification
A report on better practices for reporting on extremists by Whitney Phillips. This Data & Society report shows how news media were hijacked from 2016 to 2018 to amplify the messages of hate groups.
The report has three parts: a historical overview of far-right manipulation through trolling and meme culture during 2016; the consequences of reporting on problematic information, along with journalism's structural limitations; and tactical recommendations for establishing newsworthiness and handling manipulators.
Phillips' research documents the systematic exploitation of journalism norms. Extremists learned that outrageous statements guarantee coverage—the more offensive, the more newsworthy. Journalists face an impossible choice: ignore dangerous movements (failing the public interest) or cover them (providing a platform). The report advocates "better practices" over "best practices"—acknowledging that there are no perfect solutions while offering tactical improvements. Key insight: newsworthiness is constructed, not given. Journalists can choose what to amplify. The oxygen metaphor: fire needs oxygen to burn, extremism needs attention to spread. Strategic silence sometimes serves the public interest better than coverage.
🔨 Do
Building Psychological Armor
Although the structures to make you safe online leave a lot to be desired, there are some things you can do to protect yourself from cyberhate. One of the first steps is to get your psychological armor on.
Dr. Sean Richardson spoke about failure and its relation to mental toughness in a TEDx talk.
Build your mental toughness. Manage your expectations. Prevent emotions from getting the best of you. Find your source of motivation. Learn to delay gratification and let things go.
Richardson's framework for mental toughness becomes a survival skill for the digital age. Online spaces are designed to provoke emotional reactions—outrage drives engagement, fear keeps attention. Building psychological resilience means developing the capacity to encounter provocation without a reactive response. This isn't about becoming numb but about choosing when and how to engage. Managing expectations acknowledges that online interactions rarely match offline norms. Finding motivation means connecting to purpose beyond winning arguments. Letting things go recognizes that some battles can't be won and shouldn't be fought. Mental toughness for the internet age means emotional regulation as a practiced skill.
🤔 Consider
"It is easy to hate and it is difficult to love. This is how the whole scheme of things works. All good things are difficult to achieve; and bad things are very easy to get." — Confucius
Confucius captures the asymmetry that platforms exploit. Hate spreads faster than love because hatred is simpler—binary, certain, energizing. Love requires complexity, nuance, patience. Platform design amplifies easy over difficult, fast over slow, reaction over reflection. The Christchurch shooter understood this—hatred is easily packaged, virally spreadable, engagement-optimized. Building counter-movements requires the harder work of connection, understanding, and sustained commitment that algorithms don't reward.
🔗 Navigation
Previous: TLDR 189 • Next: DL 191 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Online Extremism — Algorithmic radicalization pipelines nudging users toward increasingly strident beliefs through engagement optimization and recommendation systems in Platform Design.
- Content Moderation — Platform safety systems operating at human speed while viral spread operates at machine speed creating structural inadequacy in Social Media Governance.
- Media Manipulation — Extremists exploiting journalism norms through shitposting manifestos that mix genuine ideology with trolling to guarantee amplified coverage in Information Warfare.
- Platform Responsibility — Tension between engagement-maximizing design and public safety requiring platforms to make themselves less functional in Tech Ethics.
- Digital Resilience — Building psychological armor and mental toughness as survival skills for navigating spaces designed to provoke emotional reactions in Media Literacy.
Part of the 📧 Newsletter archive documenting digital literacy and technology.