Digital Literacy
🌳 Forest
Thesis
Digital literacy is not a skill set. It's a stance toward power.
Knowing how to use Google Docs or navigate a learning management system was never the point — though that's where most curricula stop. Real digital literacy means understanding who built the systems you depend on, what they extract from you, and what alternatives exist. It means having the tools, habits, and community support to act on that understanding. The five Groves below — taken together — form a framework for teaching people not just to survive the digital world but to shape it on their own terms.
The Groves
Digital Self-determination
The philosophical foundation. Self-determination gives the framework its spine: the right of individuals and communities to decide how their data, identity, and participation are governed. Without this, the other four Groves are just technical recommendations. With it, every tool choice becomes a statement about agency.
Privacy by Design
The material answer to "what do we actually use?" This Grove makes self-determination concrete by identifying tools that protect users by default — CryptPad, Signal, Matrix, Nextcloud. The key insight: when the tool does the privacy work, adoption stops being a technical burden and becomes a pedagogical opportunity. Tool choices become teaching.
Digital Sovereignty
The infrastructure argument. Privacy by Design picks the right tools. Sovereignty asks: who owns the server? Whose jurisdiction governs the data? This Grove pushes past individual tool swaps toward community-controlled infrastructure — the difference between using Signal and running your own Matrix server.
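In Matrix, "who owns the server?" has a machine-readable answer: a community's domain publishes a `/.well-known/matrix/client` discovery document naming the homeserver that actually holds its data. A minimal sketch of reading that delegation, assuming the cooperative domain below is a hypothetical example:

```python
import json

def homeserver_from_well_known(doc: str) -> str:
    """Return the homeserver base URL declared in a
    /.well-known/matrix/client discovery document."""
    data = json.loads(doc)
    # "m.homeserver" and "base_url" are the field names defined by
    # the Matrix client-server discovery specification
    return data["m.homeserver"]["base_url"]

# A domain delegating to community-run infrastructure (hypothetical values)
doc = '{"m.homeserver": {"base_url": "https://matrix.example.coop"}}'
print(homeserver_from_well_known(doc))  # https://matrix.example.coop
```

If that base URL points at a server the community runs, the community governs its own conversations; if it points at someone else's infrastructure, it is renting.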
Digital Resilience
The human capacity layer. Tools and infrastructure are necessary but not sufficient. Resilience addresses the lived experience — privacy fatigue, overwhelm, the gap between knowing what to do and sustaining the practice. Without resilience, sovereignty and privacy by design become ideals that burn people out.
AI Literacy
The newest frontier. AI tools are simultaneously the most powerful learning amplifiers and the most sophisticated extraction systems ever built. This Grove maps the landscape — foundational concepts, classroom frameworks, safety concerns, and the human questions AI raises about identity, agency, and trust. It connects the safety spending gap, sycophantic AI, and the educator's hidden advantage.
Synthesis
Each Grove alone solves part of the problem. Together they reveal something none of them say individually:
Digital literacy is a practice of collective self-determination, not individual skill acquisition.
The standard model treats digital literacy as a checklist: Can you use this tool? Can you evaluate this source? Can you spot misinformation? That model fails because it places all the burden on the individual and none on the system. A person can be a perfect critical thinker and still be surveilled, manipulated, and locked into extractive platforms.
The five-Grove model reframes literacy as:
- Understanding power (Self-determination) — Who controls the systems I use? What are they extracting? What rights do I have?
- Choosing tools that embody values (Privacy by Design) — Are there alternatives where protection is the default, not the exception?
- Building autonomous infrastructure (Sovereignty) — Can my community own its digital spaces instead of renting them?
- Sustaining the practice (Resilience) — Can I keep doing this without burning out? Can I help others start?
- Interrogating emerging systems (AI Literacy) — What does AI change about agency, extraction, and trust? Can I evaluate new tools before they evaluate me?
This progression — understand, choose, build, sustain — is a curriculum. It moves from awareness to action to community capacity. It works for a workshop, a semester, or a multi-year community initiative.
Implications
For educators: Stop teaching "digital citizenship" as etiquette (don't cyberbully, cite your sources). Start teaching it as political economy: who profits from your attention, who owns your data, and what it would look like to build digital spaces that serve learners instead of extracting from them.
For InitiatED: The five Groves map directly to the Signpost Sessions model — each Grove could anchor a cohort module. Self-determination is the opening frame. Privacy by Design is the hands-on workshop. Sovereignty is the advanced track. AI Literacy is the frontier module. Resilience runs through all of them as the ongoing support structure.
For the Digitally Literate newsletter: Every issue already touches these themes. The Forest gives them a frame. Instead of standalone tips ("try Signal," "here's a privacy setting"), each piece can be positioned within the larger argument: this is one step in a practice of digital self-determination.
For policy: The AI safety spending gap (AI Safety Spending Gap) is not a separate problem. It's the same structural dynamic — capability outpacing safety, extraction outpacing protection — playing out at a different scale. AI literacy and digital literacy are the same fight.
Architecture
If this Forest becomes a course, book, or keynote series:
| Part | Grove | Focus | Output |
|---|---|---|---|
| 1. The Problem | — | Why "learn to Google better" isn't literacy | Framing essay / opening lecture |
| 2. Agency | Self-determination | Power, rights, and context in digital systems | Conceptual foundation |
| 3. Tools | Privacy by Design | Hands-on with privacy-first alternatives | Workshop / practical guide |
| 4. Infrastructure | Sovereignty | Community ownership of digital spaces | Case studies + setup guides |
| 5. Sustainability | Resilience | Habits, support structures, avoiding burnout | Ongoing practice framework |
| 6. AI & the New Frontier | AI Literacy | What AI changes about literacy, agency, and trust | Critical framework + workshops |
| 7. The Bigger Picture | All five | Surveillance, extraction, and the future of digital agency | Synthesis + call to action |
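If the Forest does become a course, the table above could drive a syllabus directly. A sketch of the same architecture as data, with focus and output strings abbreviated from the table (nothing here is fixed):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Part:
    grove: Optional[str]  # None = framing/synthesis spanning all Groves
    focus: str
    output: str

curriculum = {
    "The Problem": Part(None, "Why 'learn to Google better' isn't literacy", "Framing essay"),
    "Agency": Part("Self-determination", "Power, rights, context", "Conceptual foundation"),
    "Tools": Part("Privacy by Design", "Privacy-first alternatives", "Workshop"),
    "Infrastructure": Part("Sovereignty", "Community ownership", "Case studies + setup guides"),
    "Sustainability": Part("Resilience", "Habits, support, avoiding burnout", "Practice framework"),
    "AI & the New Frontier": Part("AI Literacy", "What AI changes about agency and trust", "Critical framework"),
    "The Bigger Picture": Part(None, "Surveillance, extraction, digital agency", "Synthesis + call to action"),
}

# Sanity check: five named Groves, each anchoring exactly one part
groves = [p.grove for p in curriculum.values() if p.grove is not None]
print(len(groves))  # 5
```

Modeling it this way makes the framework's claim testable: every Grove anchors a module, and the parts with no Grove are the ones that belong to all of them.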
Cross-Cutting Themes
Threads that weave through multiple Groves and don't belong to just one.
Equity and Power
- Access and Inclusion in Digital Learning — Who gets left out?
- Digital Redlining — Algorithmic discrimination in access and opportunity
- Techno-colonialism — Extraction dressed as innovation
- Surveillance AI — When AI becomes the surveillance apparatus
Ethics and Accountability
- Ethics in AI — Frameworks for responsible development
- Critical Digital Pedagogy — Teaching that interrogates the tools it uses
- Computational Thinking — The cognitive skill set, not just "learn to code"
Resistance and Futures
- Platform Refusal — Choosing not to participate in extractive systems
- Open Source Movements — Community-built alternatives
- Decentralized Web — Infrastructure without central control
- Speculative Futures in Tech — Imagining what could be different
Projects and Applied Work
- Initiative for Literacy in a Digital Age — Applied research initiative
- STEAM Vanguard Capstone — Capstone project connecting STEAM and digital literacy
- Algorithmic Justice Collection — Curated resources on algorithmic harm
New Groves (added 2026-02-12)
Three new Groves created from the MOC migration that connect to this Forest:
- Teaching Philosophy — Pedagogical frameworks, critical pedagogy, and AI in education. The "how we teach" companion to "what we teach."
- Emotional Intelligence — The human capacity layer: communication, organizational culture, care practices. Resilience requires emotional infrastructure.
- Internet Culture — Platform dynamics, digital identity, misinformation, and online communities. You can't teach digital literacy without understanding the culture it operates in.
Gaps
What's missing before this Forest is complete:
- Grove: AI Literacy — Built. See AI Literacy.
- MOC migration — Done. Three MOCs absorbed into Groves, three promoted to new Groves. See notes above.
- Evergreen: "Digital Citizenship" critique — A polished piece arguing why the standard digital citizenship model is insufficient. Several Plants touch this but none commit to the argument.
- Evergreen: The case for cooperative infrastructure — Connecting Nextcloud/Matrix/CryptPad not as individual tool choices but as a cooperative economic model for communities.
- Grove: Community Practice — The InitiatED/Signpost Sessions model as its own Grove, connecting pedagogy, cooperative governance, and digital autonomy.
Open Questions
- Does this framework translate across contexts? It's built from a U.S. education perspective — does it hold for Global South digital sovereignty movements, Indigenous data governance, or non-English-speaking communities?
- Where does AI fit as a fifth pillar vs. a thread running through all four? → Added as fifth Grove. Still worth asking whether it crosscuts the others.
- How do the three new Groves (Teaching Philosophy, Emotional Intelligence, Internet Culture) relate to the five original Groves? Are they satellites, or do they belong in the Architecture table?
- Is "Forest" the right framing for something that's also a call to action? At what point does this stop being a knowledge architecture and become a manifesto?
Digital literacy isn't about using technology well. It's about refusing to let technology use you.