DL 312
Start Often, F*@k Achievements
Published: November 21, 2021 • 📧 Newsletter
Welcome back. Here's Digitally Literate, issue 312. Your go-to source for insightful content on education, technology, and the digital landscape.
This week I delved into several background projects and stumbled upon some fascinating insights worth sharing.
🔖 Key Takeaways
- Community Internet Solutions: NYC Mesh demonstrates how decentralized, nonprofit networks can challenge corporate ISP monopolies and provide equitable internet access
- Platform Misinformation Economics: Facebook and Google directly fund clickbait and misinformation actors through advertising payments, creating perverse incentives for harmful content
- Algorithmic Transparency Progress: Twitter's public approach to algorithm research contrasts with other tech giants' secrecy, offering hope for more accountable AI systems
- Open Source Ethical Dilemmas: Mastodon's licensing challenge with Trump's Truth Social highlights tensions between free software ideals and preventing harmful platform uses
- SOFA Creative Philosophy: The principle of starting often without finishing everything challenges achievement culture and embraces process-focused creativity
📺 Watch
NYC's Nonprofit DIY Internet
A nonprofit in NYC challenges corporate giants like Verizon by creating a decentralized, community-driven internet network called NYC Mesh. It offers low-cost or free internet access for residents.
Connectivity is a fundamental right, and projects like these reduce barriers to education, employment, and social interaction.
📚 Read
How Facebook and Google Fund Misinformation
Karen Hao on how the tech giants are paying millions of ad dollars to bankroll clickbait actors, fueling the deterioration of information ecosystems around the world. Many of these actors would not exist without these payments from both platforms.
Over the past few weeks, the Facebook Papers have reaffirmed that Facebook has fueled the spread of hate speech and misinformation around the world. But there's a crucial piece missing from the story: Facebook isn't just amplifying misinformation. The company is also funding it.
It's not just amplification but direct funding that's eroding global information systems.
How Twitter Got Research Right
Casey Newton notes that while other tech giants hide from their internal researchers, Twitter is doing its failing (and its fixing) in public. Here's what Newton learned in his review.
- Twitter is betting that public participation will accelerate and improve its findings.
- Responsible AI is hard in part because no one fully understands decisions made by algorithms.
- There's no real consensus on what ranking algorithms "should" do.
- Twitter thinks algorithms can be saved.
Transparency and public accountability can lead to better algorithmic fairness and trustworthiness.
Disney's Text-to-Speech TikTok Voice Controversy
Disney's TikTok feature censored words like "gay" and "lesbian," raising ethical questions about technology's role in inclusion.
Technology mirrors societal values. How can we build tools that uplift rather than exclude?
More Software Isn't Better Software
A few months after Eugen Rochko earned his degree in computer science, he decided to release an open-source social network not too different from one of his favorite (but, in his view, flawed) sites: Twitter. He named it Mastodon, and it soon took off.
Mastodon was created as free, open-source software with a copyleft license, which means anyone can download it, run it, and change it, on the condition that they continue to work under the same license and freely share any altered version they operate.
Last month, Rochko learned that Mastodon was being used to run Donald Trump's new Truth Social network. Rochko may not agree with the views expressed on the new network, but the software's license means he cannot demand that Truth Social stop using Mastodon. Not only is Trump permitted to use the software for his own peculiar purposes, but the free software saves a startup like Truth Social millions of dollars in programming costs. All Mastodon asks in return is that Truth Social pay it forward. As of this newsletter, Truth Social has complied by making its source code publicly available, as required by the license, the AGPLv3.
It remains to be seen what would happen if Truth Social were to fall out of compliance with the license.
The battle between Mastodon and Trump's Truth Social is a reminder that while the internet has changed, the ideals of free software haven't. That's a problem.
Open source challenges the centralized control of platforms like Truth Social while highlighting ethical dilemmas in software use.
The Expertise Gap
Seth Godin explores the difference between folk wisdom and earned expertise, emphasizing the importance of valuing specialized knowledge.
As misinformation grows, the gap between feelings and expertise becomes more pronounced.
🔨 Do
Combat Zoom Fatigue
A new study published in the Journal of Applied Psychology showed that women and newer employees were more likely to feel exhausted by too much time on video calls. The reason helps illuminate both the causes of Zoom fatigue and how we can all avoid it.
Simply stating that you support your employees' right to choose when they switch on the camera, cutting unnecessary meetings, and scheduling adequate breaks between calls can go a long way toward preventing burnout and getting the best out of people.
Solutions:
- Encourage camera-off meetings when appropriate.
- Reduce unnecessary meetings.
- Schedule breaks between calls to prevent burnout.
🤔 Consider
Everyone is entitled to feelings about things, but expertise is earned.
Seth Godin
This distinction becomes crucial as we navigate the information landscape explored in this issue. The Facebook and Google funding of misinformation shows what happens when algorithms prioritize engagement over expertise. NYC Mesh succeeds because it combines community feelings about internet access with technical expertise in network building.
Twitter's transparent approach to algorithm research demonstrates how platforms can balance public input with specialized knowledge. The Mastodon-Truth Social situation reveals tensions between idealistic feelings about open source software and the expertise needed to prevent its misuse.
Godin's insight reminds us that while everyone's feelings matter, not all opinions carry equal weight. The SOFA principle—starting often without finishing everything—can help us explore ideas while still respecting the difference between casual interest and deep expertise.
I was intrigued by the SOFA principle after reading about it in Doug Belshaw's recent Weeknote.
SOFA stands for "Start Often Finish rArely" or "Start Often F*@k Achievements."
SOFA is the name of a hacker/art collective, and also the name of the principle upon which the club was founded. The point of SOFA club is to start as many things as you have the ability, interest, and capacity to, with no regard or goal whatsoever for finishing those projects.