TLDR 189

Chilling Effects & Other Attempts to Silence

Published: 2019-03-16 • 📧 Newsletter

Welcome to Issue 189. Chilling effects and other attempts to silence.

Hi all, welcome to TL;DR. My name is Ian O'Byrne. I research, teach, & write about technology in our lives. I try to synthesize what happened this week in tech...so you can be the expert as well.

📺 Watch

As regular readers of this newsletter will know, we've talked about deepfakes in the past. This is one of the best examples of a deepfake I've found in a long time. I'll be using it for some upcoming talks.

Deepfakes represent synthetic media where AI generates convincing fake videos by swapping faces, manipulating expressions, or fabricating entire performances. The technology uses generative adversarial networks (GANs) to learn facial features and movements, producing increasingly realistic forgeries. Early deepfakes were obviously fake—uncanny valley effects, poor lighting matches, weird artifacts. Modern versions are nearly indistinguishable from authentic footage. The implications span entertainment (de-aging actors, posthumous performances), misinformation (fake political speeches, fabricated evidence), and abuse (non-consensual pornography). The challenge isn't just detection—which becomes harder as technology improves—but epistemological: how do we maintain shared reality when video evidence becomes unreliable? The solution requires both technical approaches (forensic tools, blockchain verification) and social ones (media literacy, institutional trust-building).
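For readers who want a concrete sense of the adversarial training behind these systems, here is a minimal toy sketch of a GAN loop in PyTorch. The tiny layer sizes and random "images" are placeholders I've chosen for illustration; a real deepfake pipeline adds face detection, alignment, and far larger convolutional networks.

```python
# Toy GAN training loop: a generator learns to fool a discriminator,
# the same adversarial idea that underlies deepfake face synthesis.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 28 * 28  # placeholder sizes, not real face resolution

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # raw score: real vs. generated
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real_images = torch.rand(32, image_dim) * 2 - 1  # stand-in for a real dataset

for step in range(100):
    # 1. Teach the discriminator to separate real images from generated ones.
    fake_images = generator(torch.randn(32, latent_dim)).detach()
    d_loss = (loss_fn(discriminator(real_images), torch.ones(32, 1)) +
              loss_fn(discriminator(fake_images), torch.zeros(32, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Teach the generator to produce images the discriminator calls "real".
    g_loss = loss_fn(discriminator(generator(torch.randn(32, latent_dim))),
                     torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

That arms race in the loop is also why detection keeps getting harder: any detector strong enough to flag fakes can, in principle, be folded back in as a better discriminator.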


📚 Read

World Wide Web Turns 30

Happy 30th Birthday to the World Wide Web.

The world wide web was invented by Sir Tim Berners-Lee in 1989 – originally he was trying to find a new way for scientists to easily share the data from their experiments. Hypertext (text displayed on a computer display that links to other text the reader can immediately access) and the internet already existed, but no one had thought of a way to use the internet to link one document directly to another.

Berners-Lee's breakthrough was elegantly simple: combine hypertext, networking, and universal identifiers (URLs) to create a system where any document could link to any other. The three core technologies—HTML (formatting), HTTP (transfer protocol), URLs (addressing)—enabled a distributed information architecture without central control. Berners-Lee worked at CERN when he developed the Web, crucially deciding to release it freely rather than patent it. This open approach enabled exponential growth: from one website in 1991 to millions within years. The Web transformed from academic tool to commercial platform to social infrastructure to contested political space. Thirty years later, the open Web faces threats from platform consolidation, surveillance capitalism, authoritarian control, and fragmentation. Berners-Lee now advocates for Web renewal through decentralization, data ownership, and a return to the original vision of a distributed, user-controlled network.
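To see how compact that core recipe still is, here's a short Python sketch (standard library only) that exercises all three pieces at once: a URL names a document, HTTP fetches it, and an HTML parser pulls out the hypertext links. The example address is CERN's restored copy of the first website.

```python
# URL + HTTP + HTML in a few lines: fetch a page and list where it links.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag, i.e. the hypertext links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

url = "http://info.cern.ch/hypertext/WWW/TheProject.html"  # the first website
html = urlopen(url).read().decode("utf-8", errors="replace")  # HTTP transfer

parser = LinkCollector()
parser.feed(html)  # HTML parsing
for href in parser.links:
    print(urljoin(url, href))  # resolve relative links into absolute URLs
```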

Facebook appears to be in the middle of a pivot.

In "A Privacy-Focused Vision for Social Networking", a 3,200-word essay that Zuckerberg posted to Facebook on March 6, he says he wants to "build a simpler platform that's focused on privacy first." In apparent surprise, he writes: "People increasingly also want to connect privately in the digital equivalent of the living room."

This piece from Konstantin Kakaes hits the nail on the head:

Zuckerberg's essay is a power grab disguised as an act of contrition. Read it carefully, and it's impossible to escape the conclusion that if privacy is to be protected in any meaningful way, Facebook must be broken up.

Facebook grew so big, so quickly that it defies categorization. It is a newspaper. It is a post office and a telephone exchange. It is a forum for political debate, and it is a sports broadcaster. It's a birthday-reminder service and a collective photo album. It is all of these things—and many others—combined, and so it is none of them.

Zuckerberg describes Facebook as a town square. It isn't. Facebook is a company that brought in more than $55 billion in advertising revenue last year, with a 45% profit margin. This makes it one of the most profitable business ventures in human history. It must be understood as such.

Kakaes exposes Zuckerberg's rhetorical sleight of hand. The "privacy pivot" claims Facebook will shift from public sharing to private messaging while integrating WhatsApp, Instagram, and Messenger into a unified encrypted infrastructure. Sounds positive—except it consolidates Facebook's control across previously separate platforms while laundering its reputation through privacy promises. The town square metaphor obscures the fact that Facebook is a profit-maximizing corporation extracting value from user data and attention. The uncategorizable size argument is crucial: Facebook operates simultaneously as publisher, common carrier, advertising network, payment processor, identity system, news distributor, and democratic infrastructure—wielding power no single entity should have. The historical precedents are the AT&T breakup and the Standard Oil dissolution. The problem isn't that Facebook does each thing badly but that doing everything enables monopoly power incompatible with democratic governance. Breaking up Facebook means separating the platforms (Facebook, Instagram, WhatsApp), prohibiting platform owners from competing on their own infrastructure, and treating essential digital services as utilities requiring public accountability.

Online Activists Are Silencing Us, Scientists Say

Researchers say social media activists' thought-policing is having a chilling effect on pursuing cures for diseases.

Advocates on social media are targeting scientists who release studies that don't fit their views on these diseases, going so far as to wish for the demise of a researcher's career over a single paper. Scientists say this can dissuade researchers from working on certain diseases, setting off a vicious cycle in which patients are the ones who suffer.

The chilling effects phenomenon describes how fear of consequences causes self-censorship even without formal prohibition. Scientists report that publishing findings contradicting patient advocacy narratives about diseases (chronic fatigue syndrome, Lyme disease, myalgic encephalomyelitis) triggers coordinated harassment campaigns: mass emails to employers demanding firing, negative reviews on grant applications, social media pile-ons, career threats. The researchers emphasize they're investigating disease mechanisms to develop treatments—the goal is helping patients. But activists insist that certain proposed etiologies and treatments (psychological factors, incremental rehabilitation) delegitimize suffering and protect the medical establishment. The conflict: activists have legitimate grievances about medical dismissal of poorly understood conditions, while scientists need freedom to investigate unpopular hypotheses. The problem: online harassment replaces scientific debate, making controversial research too costly personally and professionally. The consequence: researchers avoid contentious areas, leaving patients without answers. The resolution requires distinguishing between justified criticism of bad science and harassment suppressing legitimate inquiry, while acknowledging that scientific authority has historically dismissed patient experience, particularly for women and marginalized groups.

Larry Sanger, co-founder of Wikipedia and CIO of Everipedia, writes on information privacy in digital spaces.

Sanger closes by sharing how he is locking down his cyber-life. He describes this as your personal, familial, and civic duty.

Think of it as cyber-hygiene. You need to wash your data regularly. It's time to learn. Our swinish data habits are really starting to stink the place up, and it's making the executives, criminals, and tyrants think they can rule the sty.

Sanger frames information privacy as a civic responsibility, not an individual preference. The personal dimension: protecting against identity theft, financial fraud, harassment. The familial dimension: safeguarding children's data, securing household networks, modeling digital citizenship. The civic dimension: resisting surveillance capitalism, maintaining democratic participation, preventing authoritarian control. His "locking down" approach includes: using privacy-focused browsers and search engines, enabling encryption everywhere, minimizing data sharing with platforms, supporting open-source alternatives, practicing operational security. The cyber-hygiene metaphor is apt: just as physical hygiene prevents the spread of disease, digital hygiene prevents the proliferation of surveillance. The "swinish data habits" critique indicts both users (carelessly sharing everything) and companies (exploiting that carelessness). The concentration of personal data enables executives to manipulate, criminals to exploit, and tyrants to control. Privacy isn't about having something to hide but about maintaining a balance of power—refusing to be ruled requires refusing total transparency.
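As one small, concrete example of that kind of hygiene (my illustration, not Sanger's), here's a Python sketch that checks whether a site you rely on sends a few basic security headers; swap in any URL you like.

```python
# Quick cyber-hygiene check: does this site send common security headers?
from urllib.request import urlopen

def check_security_headers(url: str) -> None:
    """Report a few headers that signal how a site protects traffic and cookies."""
    with urlopen(url) as response:
        headers = response.headers
    for name in ("Strict-Transport-Security",  # forces HTTPS on future visits
                 "Content-Security-Policy",    # limits what scripts can run
                 "X-Content-Type-Options"):    # blocks MIME-sniffing tricks
        value = headers.get(name)
        print(f"{name}: {value or 'MISSING'}")

check_security_headers("https://www.eff.org")  # example URL; try your bank, school, etc.
```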

Sen. Elizabeth Warren published an extensive plan on Friday to break up big tech companies like Facebook, Google, Amazon, and Apple.

Warren, a Massachusetts Democrat, said that if elected president in 2020 she would pass legislation to classify large tech platforms with annual global revenue of $25 billion or more as "platform utilities" and break them apart via antitrust laws.

Today's big tech companies have too much power — too much power over our economy, our society, and our democracy. They've bulldozed competition, used our private information for profit, and tilted the playing field against everyone else. And in the process, they have hurt small businesses and stifled innovation.

Warren's proposal has two main components: platform utility designation and structural separation. Platforms exceeding $25 billion in annual revenue (Facebook, Google, Amazon, Apple) would be designated "platform utilities"—similar to railroads or electricity grids, providing essential infrastructure that requires nondiscriminatory access. Structural separation prohibits platform owners from competing on their own platforms: Amazon can't sell Amazon Basics products on the Amazon marketplace, Google can't favor Google services in search results, Apple can't privilege Apple apps in the App Store. The economic argument: the current system allows giants to use platform power to crush competition, acquire potential rivals, and extract monopoly rents. The democratic argument: concentrated private power over speech, commerce, and information threatens self-governance. The innovation argument: breaking up monopolies releases entrepreneurial energy, just as the AT&T breakup enabled cell phone innovation and the Standard Oil dissolution enabled competitive oil markets. The objection: breaking up successful companies punishes excellence. The response: these companies grew through anticompetitive practices (predatory pricing, forced bundling, data advantages, strategic acquisitions), not just superior products. The precedent: antitrust enforcement has a long pedigree when market concentration threatens the public interest.


🔨 Do

AR/VR Classroom Experiments

Over the next couple of weeks in my tech classes, we'll start diving into augmented reality (AR) and virtual reality (VR). I'm investigating ways to have students make their own Google Cardboard in class.

Google Cardboard democratized VR by providing a low-cost ($5-15) headset that uses a smartphone for display and processing. The DIY approach teaches students about VR technology while enabling hands-on experimentation: understanding optics (lenses create stereoscopic 3D), sensors (the gyroscope tracks head movement), and software (apps render perspective-correct images). Educational VR applications include virtual field trips (visiting distant locations), spatial learning (exploring molecular structures, architectural designs), empathy experiences (perspective-taking scenarios), and skills training (simulated practice environments). AR overlays digital information on the physical world through phone cameras, enabling interactive learning (pointing a phone at a diagram to see a 3D model), contextual information (identifying plants, translating signs), and creative expression (designing virtual sculptures in physical spaces). The pedagogical value isn't the technology itself but how immersive experiences enable learning that's impossible through traditional media—embodied cognition, spatial reasoning, perspective shifts.
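For students curious about what the phone is actually doing inside the viewer, here's a rough numpy sketch of the central trick (my illustration, not Google's SDK): draw the scene twice, from two cameras offset by roughly the distance between your eyes, and let the lenses fuse the pair into one 3D image.

```python
# Stereo rendering in miniature: build left- and right-eye view matrices
# by offsetting one camera position along the head's "right" axis.
import numpy as np

EYE_SEPARATION = 0.064  # ~64 mm average interpupillary distance, in metres

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Return a simple 4x4 view matrix for a camera at `eye` looking at `target`."""
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

head = np.array([0.0, 1.6, 0.0])      # roughly standing eye height
target = np.array([0.0, 1.6, -5.0])   # looking straight ahead
right_axis = np.array([1.0, 0.0, 0.0])

# Each eye sits half the separation to either side of the head position.
left_view = look_at(head - right_axis * EYE_SEPARATION / 2, target)
right_view = look_at(head + right_axis * EYE_SEPARATION / 2, target)

print(left_view.round(3))
print(right_view.round(3))
```

In a real Cardboard app the phone's gyroscope updates the head orientation every frame and a distortion pass compensates for the lenses, but the two-offset-cameras idea is the heart of it.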


🤔 Consider

"Old George Orwell got it backward. Big Brother isn't watching. He's singing and dancing. He's pulling rabbits out of a hat. Big Brother's busy holding your attention every moment you're awake. He's making sure you're always distracted. He's making sure you're fully absorbed. He's making sure your imagination withers. Until it's as useful as your appendix. He's making sure your attention is always filled. And this being fed, it's worse than being watched. With the world always filling you, no one has to worry about what's in your mind. With everyone's imagination atrophied, no one will ever be a threat to the world." — Chuck Palahniuk

Palahniuk's inversion clarifies contemporary control mechanisms. Orwellian surveillance worried about what you're thinking—requiring monitoring, suppression, fear. Modern platform power doesn't need to watch because it controls attention itself—the deeper invasion. The Web's 30th birthday celebrates tools for connection that became engines of distraction. Facebook's privacy pivot obscures that surveillance capitalism depends less on watching than on capturing attention for ad targeting. Scientists experience chilling effects not from surveillance but from attention mobs wielding harassment as a weapon. Warren's Big Tech critique addresses how platforms monopolize attention as much as markets. Cyber-hygiene protects privacy but does little against attention capture. The existential threat isn't Big Brother watching but Big Brother distracting—imagination withering from constant feeding makes resistance unthinkable.


Part of the 📧 Newsletter archive documenting digital literacy and technology.