TLDR 168

Too Long; Didn't Read Issue 168

Published: 2018-10-06 • 📧 Newsletter

Welcome to Issue 168. A nation of sheep.

TL;DR is a weekly review of things that I think you should be reading. A primer of some of the cool things that happened…but you may have missed.

🔖 Key Takeaways


📺 Watch

Thomas Frank: Using Social Media Responsibly

Thomas Frank with some practical tips for using social media responsibly - and breaking your addiction.

A quick question for all TL;DR readers…what connections do you see between behaviorist philosophies and digital tools like social media? I'm working on a writing/research project and would love your thoughts.

Frank's practical advice connects to a deeper question about behaviorist design in social media. Platforms use variable reward schedules (sometimes you get likes or comments, sometimes you don't), which creates compulsive checking behavior more effectively than consistent rewards would. This is the same principle behind slot machines. The pull-to-refresh gesture literally mimics pulling a slot machine's lever, training users to check repeatedly for rewards.
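
To make the mechanism concrete, here's a toy Python simulation (my sketch, not from Frank's video) of one standard account of why variable rewards resist extinction: a user who has learned that long dry spells are normal keeps checking far longer after the rewards stop.

```python
import random

def longest_dry_spell(schedule, n_checks=200, p=0.3, every=3):
    """Training phase: return the longest run of unrewarded checks
    a user experiences under a given reward schedule."""
    gap, max_gap = 0, 0
    for i in range(n_checks):
        if schedule == "fixed":
            rewarded = (i % every == every - 1)  # reward every 3rd check
        else:  # "variable": reward with probability p, like a slot machine
            rewarded = random.random() < p
        if rewarded:
            max_gap, gap = max(max_gap, gap), 0
        else:
            gap += 1
    return max_gap

def checks_before_quitting(max_gap):
    """Extinction phase: rewards stop entirely. The user only gives up
    once the dry spell clearly exceeds anything seen during training."""
    return max_gap + 1

random.seed(42)
for schedule in ("fixed", "variable"):
    gap = longest_dry_spell(schedule)
    print(f"{schedule:>8}: longest dry spell = {gap}, "
          f"checks before giving up = {checks_before_quitting(gap)}")
```

Under the fixed schedule the user quits after a couple of empty checks; under the variable schedule, long dry spells were always part of the experience, so the checking persists. That gap is the engineered compulsion.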

Understanding these behaviorist manipulation techniques helps you resist them. When you recognize that compulsive checking is a designed response rather than a personal weakness, you can implement countermeasures: turning off notifications, batching social media time, removing apps from your phone. The addiction isn't accidental; it's engineered. Recognizing the engineering enables a conscious choice about whether to participate on the platforms' terms or your own.


📚 Read

Chauncey DeVega in Salon arguing that the algorithms that guide our social media feeds may not only be designed to keep us clicking, scrolling, and sharing. They may also be keeping the powerful powerful.

Technology is not neutral. How it is used and for what ends reflects the social norms and values of a given culture. As such, in the United States and around the world, algorithms and other types of artificial intelligence often reproduce social inequality and serve the interests of the powerful—instead of being a way of creating a more equal, free and just social democracy.

DeVega's argument challenges the technological utopianism that assumes algorithms bring objectivity. Algorithms trained on historical data reproduce historical biases: racist lending patterns, gendered hiring discrimination, class-based opportunity gaps. When these biased patterns get automated and scaled through AI, inequality becomes systematic and hidden behind claims of mathematical objectivity. The algorithm becomes a tool for maintaining existing power structures while appearing neutral and inevitable.
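
A minimal sketch of how this laundering works, using hypothetical lending data (the groups, zip codes, and approval rates below are invented for illustration): a model that never sees the protected attribute still reproduces the historical bias through a proxy feature.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical history: equally qualified applicants, but past decisions
# approved group A far more often than group B.
def make_history(n=10_000):
    rows = []
    for _ in range(n):
        group = random.choice("AB")
        zipcode = "10001" if group == "A" else "10002"  # proxy for group
        approved = random.random() < (0.8 if group == "A" else 0.3)
        rows.append((zipcode, group, approved))
    return rows

# A "neutral" model: it never sees the group label, just learns the
# majority historical outcome for each zip code.
def fit(history):
    counts = defaultdict(lambda: [0, 0])  # zipcode -> [denials, approvals]
    for zipcode, _, approved in history:
        counts[zipcode][approved] += 1
    return {z: c[1] > c[0] for z, c in counts.items()}

model = fit(make_history())
for group, zipcode in (("A", "10001"), ("B", "10002")):
    print(f"group {group}: model approves -> {model[zipcode]}")
```

The model's inputs look objective (just a zip code), but its output is the old discrimination, automated and scaled.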


Kara Swisher in the NY Times with an overview and critique of the "overarching values" for the use of the Internet as developed by Ro Khanna.

The internet age and digital revolution have changed Americans' way of life. As our lives and the U.S. economy are more tied to the internet, it is essential to provide Americans with basic protections online.

You should have the right:

  1. To have access to and knowledge of all collection and uses of personal data by companies
  2. To opt-in consent to the collection of personal data by any party and to the sharing of personal data with a third party
  3. Where context appropriate and with a fair process, to obtain, correct or delete personal data controlled by any company and to have those requests honored by third parties
  4. To have personal data secured and to be notified in a timely manner when a security breach or unauthorized access of personal data is discovered
  5. To move all personal data from one network to the next
  6. To access and use the internet without internet service providers blocking, throttling, engaging in paid prioritization or otherwise unfairly favoring content, applications, services or devices
  7. To internet service without the collection of data that is unnecessary for providing the requested service absent opt-in consent
  8. To have access to multiple viable, affordable internet platforms, services and providers with clear and transparent pricing
  9. Not to be unfairly discriminated against or exploited based on your personal data
  10. To have an entity that collects your personal data have reasonable business practices and accountability to protect your privacy

Khanna's proposal codifies digital rights many assume they already have but don't under current law. Opt-in consent (right 2) would fundamentally disrupt the surveillance-capitalism business model, which assumes consent through opaque terms of service. Data portability (right 5) would enable competition by letting users move their information between platforms. Net neutrality protection (right 6) prevents ISPs from creating a tiered internet that favors wealthy companies.

The framework treats internet access and digital privacy as fundamental rights rather than privileges granted by corporations. Whether these principles become law depends on the political will to regulate platforms that currently operate with minimal constraints. But articulating a rights framework shifts the debate from whether to regulate toward how to protect digital citizenship.


Schools are increasingly looking for ways to secure their campuses without making the school look like a prison encampment.

One recent response is to spend hundreds of thousands of dollars to outfit campuses with high-tech surveillance, crisis response teams, and police technologies. This raises questions about where the funds to pay for these solutions come from…as well as whether or not they infringe on individual liberties and freedoms.

School surveillance represents troubling priorities. Hundreds of thousands of dollars go to cameras and monitoring systems while teachers buy classroom supplies out of their own pockets and programs are cut for lack of funding. The security theater addresses symptoms (fear of school shootings) while ignoring causes (gun access, insufficient mental health support, cultures of violence).

Surveillance also normalizes monitoring for students, teaching them that constant observation is a normal condition rather than a violation of privacy. Students growing up under pervasive surveillance may internalize the idea that privacy isn't a legitimate expectation. The data collected on students (behavioral patterns, social networks, disciplinary actions) follows them and can be misused. Police-grade surveillance in schools treats students as potential threats rather than as young people deserving dignity and trust.


I've shared the climate visualization tool Earth.nullschool.net in earlier issues of TL;DR.

PolarGlobe is a large-scale, web-based four-dimensional visualization tool allowing climate data access to anyone with an internet connection. It's capable of illustrating changes in the atmosphere vividly in real time.

Designed specifically for polar scientists seeking to understand the ice caps, the tool is also useful for high school science teachers and weather fanatics.


John Sellars in Quartz with some guidance from Stoic philosophy as we think about happiness and our response to daily events.

The paradox of Stoicism, as Epictetus formulates it, is that we have almost no control over anything, yet at the same time we have potentially complete control over our happiness.

Stoic philosophy offers a vital framework for a digital age overwhelmed by information and outrage. We can't control what platforms do, what algorithms prioritize, what others post, or what news happens. But we do control our responses: how we interpret events, where we direct attention, what meaning we construct. This isn't passive acceptance but an active choice about what deserves our emotional investment and how we engage with circumstances beyond our control.


Sheera Frenkel writes about cybersecurity for the NY Times. In this post she discusses the tools and tactics she uses to make sure she is protected. I like this post because it provides a real-world framing of how to be secure…and still live your life.

Frenkel's assume-you'll-be-hacked approach reflects realistic threat modeling. The question isn't if but when security gets compromised. This mindset drives better practices: two-factor authentication everywhere, password managers, a separate email address for sensitive accounts, regular security audits. Building security on the assumption that a breach will happen creates resilience rather than dependence on perfect prevention.
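
One of those practices, TOTP-based two-factor authentication, is simple enough to demonstrate. Here's a minimal sketch using the pyotp library (the enrollment flow is simplified; real services deliver the secret via a QR code and store it server-side):

```python
# pip install pyotp
import pyotp

# Enrollment: the service generates a shared secret that the user's
# authenticator app stores (usually scanned from a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login: the app derives a six-digit code from the secret and the
# current 30-second time window; the server derives it independently.
code = totp.now()
print("current code:", code)

# The code only verifies within its time window, so a stolen password
# alone isn't enough to get in.
print("verified:", totp.verify(code))
```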


🤔 Consider

"Three things cannot be long hidden: the sun, the moon, and the truth." — Buddha

Buddha's teaching about truth's inevitable emergence resonates with this issue's theme of a nation of sheep and the systems trying to obscure reality. Social media platforms use behaviorist manipulation hoping users won't recognize the engineered addiction, but understanding spreads as people wake up to the exploitation. Algorithms reproduce power structures hoping mathematical objectivity disguises political choices, but critical examination reveals the bias embedded in seemingly neutral systems. Our task isn't preventing truth from being hidden; that happens constantly. Our task is being ready when truth emerges, recognizing it, and acting accordingly rather than being among the sheep who accept comfortable lies over uncomfortable truths.


