Hate, fear, and trolling
Digitally Lit #190 – 3/23/2019
Hi all, my name is Ian O’Byrne and welcome to Digitally Literate. I’m remixing & rebranding a couple of my digital streams as I continue to think about the signals I create & consume online. I’ll have a post on all of this soon. We still have some loose wires…so please be patient.
I research, teach, & write about technology in our lives. In this newsletter, I try to synthesize what happened this week so you can be digitally literate as well. This week I pull together some of the varied discussions about the role of digital, social spaces in the Christchurch mosque shootings from last week. If this content is too chilling, please feel free to skip this issue and join us next week.
I posted a couple of other things this week:
- The Technopanic Podcast – My podcast with Kristen Turner went live this week. Subscribe on iTunes, Spotify, Google Play Music, PocketCasts, Stitcher…or the podcast catcher of your choice.
- Understanding the differences between privacy and security – It is your responsibility to protect and secure yourself while using digital tools and spaces. This primer gives an overview of some of the language we should use.
- Use the Internet Archive, WordPress, & Blubrry Plugin to set up audio podcasts – A video overview of how I used the tools and spaces listed above to host and share the Technopanic podcast.
The gunman in the Christchurch mosque shootings shared a racist manifesto online and posted live video of his attack on Facebook. The NY Times spoke to terrorism experts about why this matters.
Kevin Roose on the mass murders in Christchurch, New Zealand.
Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.
At the same time, we need to understand and address the poisonous pipeline of extremism that has emerged over the past several years, whose ultimate effects are impossible to quantify but clearly far too big to ignore. It’s not going away, and it’s not particularly getting better. We will feel it for years to come.
How do we talk about this in our schools?
Taylor Lorenz in The Atlantic.
Significant portions of the manifesto appear to be an elaborate troll, written to prey on the mainstream media’s worst tendencies. As the journalist Robert Evans noted, “This manifesto is a trap … laid for journalists searching for the meaning behind this horrific crime. There is truth in there, and valuable clues to the shooter’s radicalization, but it is buried beneath a great deal of, for lack of a better word, ‘shitposting.’”
Shitposting is a slang term used to describe the act of posting trollish and usually ironic content designed to derail a conversation or elicit a strong reaction from people who aren’t in on the joke.
Ian Bogost on how technology platforms police content. Global internet services were designed to work this way, and there might be no escape from their grip.
But the internet separates images from context and action from intention, and then it spreads those messages quickly among billions of people scattered all around the globe.
The internet was designed to resist the efforts of any central authority to control its content—even when a few large, wealthy companies control the channels by which most users access information.
It’s easy to say that technology companies can do better. They can, and should. But ultimately, that’s not the problem. The problem is the media ecosystem they have created. The only surprise is that anyone would still be surprised that social media produce this tragic abyss, for this is what social media are supposed to do, what they were designed to do: spread the images and messages that accelerate interest, without check, and absent concern for their consequences. It’s worth remembering that “viral” spread once referred to contagious disease, not to images and ideas. As long as technology platforms drive the spread of global information, they can’t help but carry it like a plague.
The NY Times Editorial Board on how to beat a system designed to keep the worst of the web out of sight.
It’s telling that the platforms must make themselves less functional in the interests of public safety. What happened this weekend gives an inkling of how intractable the problem may be. Internet platforms have been designed to monopolize human attention by any means necessary, and the content moderation machine is a flimsy check on a system that strives to overcome all forms of friction. The best outcome for the public now may be that Big Tech limits its own usability and reach, even if that comes at the cost of some profitability. Unfortunately, it’s also the outcome least likely to happen.
A report on better practices for reporting on extremists, antagonists, and manipulators by Whitney Phillips. This Data & Society report draws on in-depth interviews conducted by Phillips to show how news media were hijacked from 2016 to 2018 to amplify the messages of hate groups.
The Oxygen of Amplification has three interlocking parts:
- Part 1 provides a historical overview of the relationship between the news media and far-right manipulators who leveraged trolling and meme culture during the 2016 US presidential election.
- Part 2 identifies the consequences of reporting on bigoted, damaging, or otherwise problematic information and the structural limitations of journalism (economic, labor, and cultural) that exacerbate these tensions.
- Part 3 is a tactical guide for newsrooms that recommends “better” practices for establishing newsworthiness, handling objectively false information, and covering specific online harassment campaigns or manipulators, bigots, and abusers.
Although the structures meant to keep you safe online leave a lot to be desired, there are some things you can do to protect yourself from cyberhate. One of the first steps is to get your psychological armour on.
Dr. Sean Richardson spoke about failure and its relation to mental toughness in a November 2011 TEDx talk.
Build your mental toughness. Manage your expectations. Prevent emotions from getting the best of you. Find your source of motivation. Learn to delay gratification and let things go.
It is easy to hate and it is difficult to love. This is how the whole scheme of things works. All good things are difficult to achieve; and bad things are very easy to get.
Digitally Literate is a summary of all the great stuff from the Internet this week in technology, education, & literacy. Follow along here.