Welcome to Digitally Literate, issue #380.
I worked on the following this week:
- Down the QAnon Rabbit Hole – In Trust the Plan, Daily Beast reporter Will Sommer delves into the origins and evolution of the QAnon conspiracy theory movement. His book provides important context for how QAnon gained traction and influenced American politics.
- Until Death, It Is Always About Life – Rather than reacting to life’s dramas, we can take charge of our own heroic journey.
- Unlocking Your Inner Genius with the Feynman Method – Richard Feynman advocated always having a dozen favorite problems or questions on your mind. Most of the time, these problems sit dormant in the back of your mind. But whenever you encounter a new idea, result, or technique, pull out your list and test the new concept against each problem.
- Future Trends Forum – Media and Information Literacies – Doug Belshaw, Laura Hilliger, and I were invited on the Future Trends Forum this week by good friend Bryan Alexander. We were there to discuss our recent research on Promoting Informed Citizenship in a Connected World.
The Disappearing Computer: An Exclusive Preview of Humane’s Screenless Tech | Imran Chaudhri
Several months ago, former Apple designer and Humane cofounder Imran Chaudhri gave this TED Talk in which he envisioned a future where AI enables our devices to “disappear.”
Good news! You can now order the AI Pin in the United States starting Nov. 16. The entire system costs $699, plus a $24 monthly subscription that includes a cell phone number and cell service for the pin.
Seeing Like An Algorithm
Fei-Fei Li, a key figure in artificial intelligence, has written a book called “The Worlds I See” that combines her personal journey as an immigrant with her contributions to the field of AI. She discusses her work on the ImageNet project, which classified millions of digital images and paved the way for advancements in AI. Li also addresses the issue of bias in AI algorithms and the need for diversity and inclusivity in the field. She advocates for increased funding for AI research in academia and government to ensure future breakthroughs.
Why this matters: Li believes that AI will transform society and people’s lives, but only if fundamental changes are made to how AI is engineered and who engineers it. Making AI more responsible and ethical is crucial to ensuring positive outcomes for society.
Seeking Accountability
Last week, the White House issued its long-awaited executive order on artificial intelligence. The order outlines the Biden administration’s understanding of the scale of AI’s risks and its demonstrated harms to people’s rights, opportunities, and access. The executive order draws heavily from the Blueprint for an AI Bill of Rights and is made clearer by guidance from the Office of Management and Budget (OMB).
Taken together, the documents emphasize the need to protect people from the wide range of AI risks that already exist, and the need for public interest regulation.
Why this matters: The executive order is a positive step for the administration, even though there is still more work to be done to achieve its goals. Implementing the order’s directives, and moving people and resources quickly enough to deliver an ambitious plan on a short schedule, will be crucial going forward.
Handling Weaponized Content
Terrorist groups have weaponized small platforms that lack the resources to handle takedown requests, using them to share content online. A new tool called Altitude, developed by Google subsidiary Jigsaw in collaboration with Tech Against Terrorism, aims to help smaller online platforms detect and remove terrorist content more easily.
Why this matters: Tools like this aim to improve the quality of content moderation and support the rule of law while preventing the spread of terrorist propaganda.
A Cesspit of Disinformation
Mike Caulfield discusses how, for years, social-media researchers overemphasized the platform now called X. Now, as it rapidly changes into something new and frightening, we risk paying it too little attention.
Why this matters: Although many of the warning lights that we once used to identify emerging misinformation are now out of service, the ones that remain are flashing red.
Digital Rabbit Holes
Tobias Rose-Stockwell discusses the impact of digital technology on our attention and how tech companies are using apps, news feeds, and notifications to capture our attention and turn it into advertising dollars. Rose-Stockwell describes how this addiction to digital devices and social media has changed the way we consume information and how it has affected our ability to focus.
Why this matters: These new tools are fracturing our ability to make sense, cohere, and cooperate around the deepest challenges facing our species.
Critically Engage With New Information
The THIEVES reading technique is designed to get you thinking critically while you read.
THIEVES is just an acronym for the following elements of your content:
- Title
- Headings
- Introduction
- Every first sentence in a paragraph
- Visuals and vocabulary
- End-of-chapter questions
- Summary
When you invent the ship, you also invent the shipwreck; when you invent the plane you also invent the plane crash; and when you invent electricity, you invent electrocution.
Paul Virilio
Cover Photo CC BY using Playground AI
Say hey at hello@digitallyliterate.net or on the social network of your choice.