DL 228

Under Observation

Published: January 11, 2020 • 📧 Newsletter

Welcome to Digitally Literate, issue 228. Your go-to source for insightful content on education, technology, and the digital landscape.

Hi all, welcome to issue 228 of Digitally Literate. If you haven't already, please subscribe if you would like this to show up in your email inbox.

If you're reading on the website, feel free to leave a comment. You can also use Hypothesis to leave notes and annotations. Reach out and let me know what you think of this work at hello@digitallyliterate.net.

I'm also always reading and learning online. I choose to share some things socially…and others I keep for my reference. If you want to see a feed of my notes…here you go.

Let's get to the news of the week. Thanks for stopping by.

📺 Watch

I'm including more of a focus on surveillance capitalism in my ed tech course this semester.

Frames is a short video that shows how a smart city tracks and analyzes a woman as she walks through the city. The things she does are interpreted and logged by the city's systems, but do those interpretations draw an accurate picture of the woman?

The video is accompanied by a facilitator guide and media pack to support discussions.

This is an excellent resource for helping students visualize what pervasive surveillance actually looks like in practice—and how algorithmic interpretation can diverge from reality.

📚 Read

Chris Gilliard with a must-read essay that explores how tech that tracks creates different spatial experiences for users on opposite ends of the tool. Different realities exist for different races and classes on the receiving end of the surveilling gaze.

…these technologies will continue to fuel a negative feedback loop between individuals and communities on both ends of the surveillance spectrum, where the only real winners are the companies who profit from the fear they help to manufacture

Gilliard's framing is essential: surveillance isn't experienced uniformly. The same technology that provides convenience for some creates constant monitoring and suspicion for others.

A piece in The Outline by Will Partin on how modern conspiracy theorists are co-opting the tools and rhetoric of media literacy and critical thinking to assemble and support their theories.

The wager of this critique is that skepticism can be just as corrosive to society as naïveté. If this is true, then we've been looking at the issues media literacy purports to address backwards. What if the problem is not that people don't check their sources, consult with a friend, or read critically, but precisely that they do?

This challenges a core assumption of media literacy education—that more critical thinking automatically leads to better outcomes. When conspiracy theorists "do their own research," they're following the same process we teach.

Last week in this spot in the newsletter, we discussed the research from Mark Ledwich and Anna Zaitsev that suggested YouTube's recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

If you scroll to the bottom of issue #227, good friend Aaron Davis (subscribe to his newsletter ReadWriteRespond if you haven't already) shared a number of important links that add to the complexity of the story.

Not long after Aaron shared this info, Becca Lewis used it as an opportunity to compile the piece linked at the top of this story. Lewis posits that we need to look beyond the recommendation algorithms to understand YouTube's political propaganda problem.

The platform's design, monetization structure, and creator ecosystem all contribute—focusing only on the algorithm misses the larger picture.

We haven't talked much about the bushfires ravaging Australia. I have many friends who are impacted by these events…especially in New South Wales. My thoughts are with you in these times.

Within this ecological disaster is a media literacy lesson about informational wars and climate change. As the planet burns, disinformation is being used to shift blame and divert attention from climate change.

I'm in the middle of writing a grant proposal for NSF to develop a climate change curriculum for pre-service teachers. I'd love your thoughts on this.

Technology has changed us, robbed us of something important, and we must get it back. It's your devices versus your best life.

This post from Arielle Pardes at Wired shares a great list of books to help as you consider your life in a tech-saturated world. These texts provide some good advice on "rewriting bad habits, reviewing the devices we actually need, and relearning how to listen amid all the noise."

The key insight: technical fixes to biased algorithms don't address the social structures that created the bias. We need political and social solutions, not just engineering ones.

🔨 Do

I use Creative Commons (CC) licensed images in all of my work. My process involves searching on Flickr and verifying that each image carries a license that gives me permission to include it in my work.

Richard Byrne shares guidance on the new Chrome extension from CC to help expedite that process.
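If you'd rather script the search step yourself, here is a minimal sketch (separate from the Chrome extension Richard covers) that queries Flickr for CC-licensed photos using the flickrapi Python package. The API key, search terms, and license IDs shown are placeholders you would swap for your own, and attribution requirements should still be confirmed on each photo's page.

```python
# Minimal sketch: search Flickr for CC-licensed images with the flickrapi package.
# pip install flickrapi
import flickrapi

# Placeholder credentials; request your own from Flickr's developer portal.
API_KEY = "your-flickr-api-key"
API_SECRET = "your-flickr-api-secret"

flickr = flickrapi.FlickrAPI(API_KEY, API_SECRET, format="parsed-json")

# License IDs 1-6 correspond to Flickr's Creative Commons licenses
# (see flickr.photos.licenses.getInfo for the full list).
results = flickr.photos.search(
    text="classroom technology",        # example search term
    license="1,2,3,4,5,6",
    extras="license,owner_name,url_m",  # pull attribution and image URL fields
    per_page=5,
)

for photo in results["photos"]["photo"]:
    title = photo.get("title", "untitled")
    owner = photo.get("ownername", "unknown")
    url = photo.get("url_m", "")
    print(f'"{title}" by {owner}: {url}')
```

However you assemble the candidate list, the attribution itself still has to be written out, by hand or with a tool like the CC extension, so treat a script like this as a starting point rather than a complete workflow.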

🤔 Consider

Under observation, we act less free, which means we effectively are less free.

Edward Snowden

Snowden's observation captures this issue's central theme. Whether it's smart city tracking, surveillance creating different realities for different communities, or algorithmic systems monitoring our every click—observation itself constrains freedom. The Frames video demonstrates this viscerally: being watched changes behavior, which changes who we are.


Neil Peart, the best drummer of all time, passed away as I was finishing this week's issue. Take five minutes to witness some of his brilliance here.

