The Case for Quarantining
Digitally Lit #262 – 10/3/2020

Welcome back to Digitally Literate and issue #262.

This week I worked on the following:

If you haven’t already, please subscribe if you would like this newsletter to show up in your inbox. Feel free to reach out and let me know what you think of this work at hello@digitallyliterate.net.

Watch

Show Your Work by Austin Kleon

Doug Belshaw had this great post about working out loud which was inspired by Austin Kleon’s book.

The video I share above is Brian Johnson’s review of five big ideas from the same text.

If you really want to dig deep, check out this full session from Kleon at SXSW.

Read

The case for quarantining extremist ideas

This week we heard a lot in the news about the Proud Boys and white supremacist groups that promote and engage in political violence.

This piece by Joan Donovan and danah boyd discusses the topic of strategic silence.

By avoiding amplifying extremist ideas, are we starving them of oxygen in the informational space?

For more on this topic, read this profile of Emily Gorcenski.

Study Finds ‘Single Largest Driver’ of Coronavirus Misinformation: Trump

Cornell University researchers analyzing 38 million English-language articles about the pandemic found that President Trump was the largest driver of the “infodemic.”

The study is the first comprehensive examination of coronavirus misinformation in traditional and online media.

This study identifies and analyzes the most prominent topics of COVID-related misinformation that emerged in traditional media between January 1 and May 26, 2020, based on a sample of over 38 million articles published in English-language media around the world.

While on the topic of COVID, this piece by Zeynep Tufekci asks why some people and areas are super-spreaders…and others are not.

How It Feels When Software Watches You Take Tests

One of my students needed to take a test virtually this past week and immediately relayed to our class the challenges of testing online and dealing with virtual proctors.

This post shares insight on software designed to flag students cheating on tests by doing things like tracking eye movements via a webcam. In a related story, other students indicated that it felt callous and unfair to be suspected of cheating because they read test questions aloud, had snacks on their desks or did other things that the software deemed suspicious.

As a result, some of my students are indicating that they may put their health, and the health of others, in jeopardy and head out to physical locations to test.

Connecting with Youth through Authenticity and Collaboration

Media companies around the world are finding out that when it comes to capturing the attention of youth, authenticity (or at least a sense of it) equals relevancy.

Anyone who has worked in a middle or high school setting can also confirm that teenagers are human lie detectors, unafraid to call out a lack of genuineness when they see it.

Armed with this realization, content creators and distributors continue vying for this group’s attention, through ever-changing media platforms in an increasingly interconnected digital space.

Reimagining Learning Spaces for Uncertain Times

This great resource from UNESCO MGIEP shares insight on the possibilities for a post-pandemic world.

Do

8 Strategies to Improve Participation in Your Virtual Classroom

Really digging the intersection of food, design, and art from Ghetto Gastro. These waffles look awesome…and are definitely not an option on my current diet. 🙂

Read more here about the Ghetto Gastro.

Consider

In a time of deceit telling the truth is a revolutionary act.

George Orwell


With the news that President Trump has tested positive for the coronavirus, the fast-moving information system that is the Internet has kicked into high gear. As a result, it can be hard to separate truth from fiction. It can also be hard to avoid being emotionally manipulated online.

This guide from The Verge and this one from The Washington Post are two resources to help you avoid being part of the problem.

Connect at hello@digitallyliterate.net or on the social network of your choice.

Finland is winning the war on fake news. Other nations want the blueprint

An interactive piece from CNN focusing on the role of education, critical media literacy, and the fight against fake news.

“What we want our students to do is … before they like or share in the social media they think twice – who has written this? Where has it been published? Can I find the same information from another source?” Kari Kivinen, director of Helsinki French-Finnish School and former secretary-general of the European Schools, told CNN.

Much of this we’ve seen before. Education, specifically critical literacy and critical media literacy, is needed in our schools and society.

Part of the challenge is that this practice often fails when it goes up against our value systems, and the very act of routinely questioning everything you read or learn is antithetical to the narrative shared by many parents and educators.

In much of my research, I view this as a need to build healthy skepticism in students. This perspective is often challenged by colleagues…and is evidenced in the piece.

Kivinen cautioned that it is a balancing act: trying to make sure skepticism doesn’t give way to cynicism in students.

I ultimately view this healthy skepticism as the same instinct we have when a stranger rings our doorbell on the weekend offering to repair our roof or sell us solar panels. It is the same perspective we have when a telemarketer calls our phone indicating they have a great new deal for us.

Hopefully we can create a new generation of web-literate individuals who can employ this healthy skepticism…while also utilizing these techniques as they create digital content online.

“It’s very annoying having to fact check everything, not being able to trust anything … or anyone on the internet,” said 15-year-old Tatu Tukiainen, one of the students in Uitto’s class. “I think we should try to put a stop to that.”

SOURCE: CNN

For more insight on this, review this Twitter thread from Mike Caulfield.

Fake ‘Ukrainian’ News Websites Run by Russian ‘Troll Army’ Offshoots

Fake ‘Ukrainian’ News Websites Run by Russian ‘Troll Army’ Offshoots (Global Voices)

A new investigation of Russia's information war has revealed fake 'Ukrainian' news sites are actually hosted, operated, and staffed in Russia without any local correspondents.

Aric Toler on GlobalVoices as part of the RuNet Echo Project. All annotations in context.

…creating a new international news operation called Sputnik to “provide an alternative viewpoint on world events.” More and more, though, the Kremlin is manipulating the information sphere in more insidious ways.

 

In June, BuzzFeed published a detailed feature on this operation, through which the Kremlin supposedly funds a small army of young web-savvy Internet users who flood website comment sections around the world with pro-Russian and anti-Western rhetoric.

Fake news. It's complicated.

Fake news. It’s complicated.
Claire Wardle in First Draft News. All annotations in context.

By now we’ve all agreed the term “fake news” is unhelpful, but without an alternative, we’re left awkwardly using air quotes whenever we utter the phrase. The reason we’re struggling with a replacement is because this is about more than news, it’s about the entire information ecosystem. And the term fake doesn’t begin to describe the complexity of the different types of misinformation (the inadvertent sharing of false information) and disinformation (the deliberate creation and sharing of information known to be false).

 

To understand the current information ecosystem, we need to break down three elements:

1. The different types of content that are being created and shared

2. The motivations of those who create this content

3. The ways this content is being disseminated

 

As Danah Boyd outlined in a recent piece, we are at war. An information war. We certainly should worry about people (including journalists) unwittingly sharing misinformation, but far more concerning are the systematic disinformation campaigns. 

 

Back in November, I wrote about the different types of problematic information I saw circulate during the US election. Since then, I’ve been trying to refine a typology (and thank you to Global Voices for helping me to develop my definitions even further). I would argue there are seven distinct types of problematic content that sit within our information ecosystem. They sit on a scale, one that loosely measures the intent to deceive.


 

Why is this type of content being created?

I saw Eliot Higgins present in Paris in early January, and he listed four ‘Ps’ which helped explain the different motivations. I’ve been thinking about these a great deal and, using Eliot’s original list, have identified four additional motivations for the creation of this type of content: Poor Journalism, Parody, to Provoke or ‘Punk’, Passion, Partisanship, Profit, Political Influence or Power, and Propaganda.

This is a work in progress but once you start breaking these categories down and mapping them against one another you begin to see distinct patterns in terms of the types of content created for specific purposes.

 

Dissemination Mechanisms

Finally, we need to think about how this content is being disseminated. Some of it is being shared unwittingly by people on social media, clicking retweet without checking. Some of it is being amplified by journalists who are now under more pressure than ever to try and make sense and accurately report information emerging on the social web in real time. Some of it is being pushed out by loosely connected groups who are deliberately attempting to influence public opinion, and some of it is being disseminated as part of sophisticated disinformation campaigns, through bot networks and troll factories.

 

When messaging is coordinated and consistent, it easily fools our brains, already exhausted and increasingly reliant on heuristics (simple psychological shortcuts) due to the overwhelming amount of information flashing before our eyes every day. When we see multiple messages about the same topic, our brains use that as a short-cut to credibility. It must be true we say — I’ve seen that same claim several times today.

 
When humans are angry and fearful, their critical thinking skills diminish.

What Do We Know About False News?

What Do We Know About False News? (Harvard Business Review)

A roundup of the latest thinking

From the Harvard Business Review:

As false news has become a global phenomenon, scholars have responded. They’ve ramped up their efforts to understand how and why bad information spreads online — and how to stop it. In the past 18 months, they’ve flooded academic journals with new research and have raised the level of urgency. In a March 2018 article, titled “The Science of Fake News,” in the prestigious journal Science, 16 high-profile academics came together to issue a call to action, urging internet and social media platforms to work with scholars to evaluate the problem and find solutions.

There appears to be some difference in the reach and appetite for these sources in the U.S. and across Europe.
Also very interesting:

Another key, potentially surprising, takeaway from that study: “In general, fake news consumption seems to be a complement to, rather than a substitute for, hard news — visits to fake news websites are highest among people who consume the most hard news and do not measurably decrease among the most politically knowledgeable individuals.”

People are less skeptical of information they encounter on platforms they have personalized — through friend requests and “liked” pages, for instance — to reflect their interests and identity.
Sundar characterizes his research findings in this way: “We discovered that participants who had customized their news portal were less likely to scrutinize the fake news and more likely to believe it.”

The piece offers some guidance on how to stop this, but I found the discussion of what we still don’t know most interesting.

For one, much of the new research centers on U.S. politics and, specifically, elections. But social networks drive conversations about many other topics such as business, education, health, and personal relationships. To battle bad online information, it would be helpful to know whether people respond to these sorts of topics differently than they respond to information about political candidates and elections. It also would be useful to know whether myths about certain subjects — for instance, a business product or education trend — are trickier to correct than others.

There is also a need for a collaborative approach to this problem.

Truth, Disrupted

Truth, Disrupted (Harvard Business Review)

False news spreads online faster, farther, and deeper than truth does — but it can be contained. Here’s how.

Sinan Aral in the Harvard Business Review:
For the past three years Soroush Vosoughi, Deb Roy, and I have studied the spread of false news online. (We use the label “false news” because “fake news” has become so polarizing: Politicians now use that phrase to describe news stories that don’t support their positions.) The data we collected in a recent study spanned Twitter’s history from its inception, in 2006, to 2017. We collected 126,000 tweet cascades (chains of retweets with a common origin) that traveled through the Twittersphere during this period and verified the truth or falsehood of the content that was spreading. We then compared the dynamics of how true versus false news spreads online. On March 9 Science magazine published the results of our research as its cover story.

What we found was both surprising and disturbing. False news traveled farther, faster, deeper, and more broadly than the truth in every category of information, sometimes by an order of magnitude, and false political news traveled farther, faster, deeper, and more broadly than any other type.
The importance of understanding this phenomenon is difficult to overstate. And, in all likelihood, the problem will get worse before it gets better, because the technology for manipulating video and audio is improving, making distortions of reality more convincing and more difficult to detect. The good news, though, is that researchers, AI experts, and social media platforms themselves are taking the issue seriously and digging into both the nature of the problem and potential solutions.

The piece provides guidance on how to recognize and deal with these false news stories.

In this article I’ll examine how we might contain the spread of falsity. A successful fight will require four interrelated approaches — educating the players, changing their incentives, improving technological tools, and (the right amount of) governmental oversight — and the answers to five key questions:

The Other Mr. President

The Other Mr. President (This American Life)

What it’s actually like to live in the confusing information landscape that is Putin’s Russia.

From the This American Life podcast:

Since Russia meddled in our election, there’s been concern that the fake news and disinformation that’s so prevalent there could be taking hold in this country. But is that hyperbole? This week we look at what it’s actually like to live in the confusing information landscape that is Putin’s Russia.

PBS NewsTracker

What is the NewsTracker? (PBS NewsHour)

As the country was reacting to the outcome of the 2016 Presidential election, concerns soared about the problems of misinformation or so-called “Fake News” spreading across social media. To understand the scale and shape of a problem that was incredibly opaque, we began intensive research to collect and analyze the sources of this misinformation.

First developed by PBS for internal use, NewsTracker is a tool that identifies Facebook pages that traffic in misinformation and tracks how often the content there is liked, shared, and commented on. Reporters use this tool to find patterns and trends that may merit reporting. The tool will have a new home soon: the Shorenstein Center at Harvard’s Kennedy School of Government, where it can gain wider testing and use.