DL 424
Programming The World
Published: March 8, 2026 • 📧 Newsletter
For a long time, maps showed us where things were. Now they can show us what happened.
A new generation of AI tools is making it possible to stitch together open data into dynamic systems that replay events as they unfold.
What used to require intelligence agencies and specialized infrastructure can now be assembled by individuals using publicly available data and AI-generated code.
This shift isn’t just about better maps. It’s about a deeper change in how technology works.
We are moving from coding systems to directing them. As these tools become easier to build, the real power shifts from those who own the infrastructure to those who can interpret the signals.
If you've found value in these issues, subscribe here or support me on Ko-fi.
📚 Recent Work
I’ve been refining the architecture of my digital garden to better serve as a public resource. Key updates include:
- Refreshed Home Page: A cleaner overview with a "Live Feed" at the bottom showing the latest notes synced to the public vault.
- Enhanced Navigation: The top header now features streamlined search, a dark/light mode toggle, and organized links to different note types.
- Unified Ecosystem: I am moving the newsletter directly into the garden. By housing notes, blog posts, and publications in one place, every idea becomes a "node" you can follow to connect the dots.
🔖 Key Takeaways
- Situational awareness is being democratized. Tools that once belonged to intelligence agencies (satellite tracking, sensor networks, and geospatial analysis) can now be assembled by individuals using open data and AI.
- The interface is no longer the hard part. With AI generating code and connecting data sources, the expensive parts of software (maps, dashboards, integrations) are becoming fast and cheap to build.
- Coding is shifting toward orchestration. In “vibe coding,” humans define goals, aesthetics, and constraints while AI systems generate the underlying logic. The role of the developer increasingly resembles a director.
- Transparency and surveillance now share the same infrastructure. The same open data that allows citizens to investigate events also enables new forms of monitoring and tracking.
- Power moves from tools to interpretation. When everyone can access the same data streams, the advantage shifts to those who can frame, analyze, and narrate what the data means.
🌐 The Power of Open Data: Replaying a War in 4D
Former Google Maps manager Bilawal Sidhu recently built something that looks like it belongs in a Mission: Impossible control room.
Using AI tools, he created WorldView, a digital globe that pulls together live data from across the internet.
It’s not just a map. It’s a “God-view” of the planet showing things like:
- Satellites: Every satellite currently orbiting Earth
- Flights: Over 6,000 aircraft in real time, including military planes
- City sensors: CCTV traffic feeds and earthquake sensors mapped onto 3D buildings
He even added visual filters so the globe looks like you're viewing it through night vision or a thermal camera.
But the real breakthrough isn’t the visuals. It’s time.
Most maps are 3D (length, width, and height). Sidhu added a fourth dimension: time.
When recent Iranian strikes began, he deployed an army of AI agents to capture publicly available information (OSINT, or open-source intelligence) before it disappeared.
By stitching these digital traces together, he created a 4D replay of the conflict.
Instead of just seeing where things are, you can watch what happened. Aircraft leave flight “donuts” showing their paths and sudden turns. Satellite and flight data sync with earthquake sensors and CCTV feeds. When a strike occurs, the seismic sensor blip aligns with the camera flash to the millisecond.
In other words, 3D shows the scene. 4D shows the event.
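The core mechanic of a 4D replay is simple in principle: every record from every feed carries a timestamp, and playback is just iterating over a single merged, time-sorted stream. A minimal sketch in Python (the feed names and sample events here are hypothetical, not Sidhu's actual data):

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    timestamp: float                      # seconds since epoch; the sort key
    source: str = field(compare=False)    # which feed produced the record
    payload: str = field(compare=False)   # what the sensor recorded

def replay(*feeds):
    """Merge several time-sorted event feeds into one chronological stream."""
    yield from heapq.merge(*feeds, key=lambda e: e.timestamp)

# Hypothetical sample data: a seismic blip and a CCTV flash milliseconds apart.
seismic = [Event(1700000000.000, "seismometer", "blip"),
           Event(1700000120.500, "seismometer", "blip")]
cctv    = [Event(1700000000.004, "cctv", "flash")]
flights = [Event(1699999990.000, "adsb", "aircraft turn")]

timeline = list(replay(seismic, cctv, flights))
for e in timeline:
    print(f"{e.timestamp:.3f}  {e.source:12s} {e.payload}")
```

The interesting alignment falls out for free: once everything shares a clock, the seismometer blip and the camera flash sit four milliseconds apart in the merged timeline, which is exactly the kind of correlation the text describes.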
🤖 What Is “Vibe Coding”?
Sidhu didn’t write thousands of lines of code to build WorldView.
He "vibe coded" it.
In vibe coding, the human focuses on ideas, design, and user experience, while AI handles the technical work.
Think of it like building a house:
- You describe the kitchen you want
- AI acts as the architect and contractor
- AI agents handle plumbing, wiring, and painting simultaneously
When something breaks, you don’t debug the code.
You just say:
“That feels clunky. Make the transitions smoother and use a darker palette.”
You’re debugging the experience, not the code.
🕵️‍♂️ The Palantir Paradox
Sidhu’s project also caught the attention of Palantir, the data analytics company used by intelligence agencies and militaries. Named after the "seeing stones" from Lord of the Rings, Palantir is one of the world’s most powerful (and secretive) data firms.
While WorldView helps visualize what already happened, Palantir’s systems aim to predict what will happen next.
Think of the movie Minority Report. Instead of psychics predicting crimes, Palantir uses algorithms that combine massive datasets (credit card transactions, travel patterns, social media activity, networks of associates).
The result is a risk score for people, places, and events. This technology has helped identify human trafficking networks, disrupt terrorist plots, and track disease outbreaks.
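As a purely illustrative toy (not Palantir's actual method, which is proprietary), a pattern-based risk score can be thought of as a weighted sum over signals drawn from combined datasets. Every name and weight below is invented for this sketch:

```python
# Toy illustration only: a weighted-sum risk score over boolean signals.
# Signal names and weights are invented; real systems are proprietary,
# far more complex, and raise the fairness questions discussed below.
SIGNAL_WEIGHTS = {
    "unusual_travel_pattern": 0.30,
    "flagged_associate": 0.45,
    "anomalous_transactions": 0.25,
}

def risk_score(signals: dict) -> float:
    """Sum the weights of every signal that fired; range [0.0, 1.0]."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

score = risk_score({"flagged_associate": True, "anomalous_transactions": True})
print(round(score, 2))  # 0.7
```

Even this toy makes the critique concrete: the person being scored never sees which signals fired or how they were weighted, so there is nothing visible to challenge.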
But it raises a difficult question: What happens when someone becomes suspicious because of a pattern they can’t see or challenge?
⚖️ Democratization vs. Surveillance
Sidhu’s weekend project reveals something bigger about the world we’re entering.
On one hand, tools like WorldView democratize intelligence. A single person with a laptop can now achieve levels of situational awareness once reserved for military command centers.
On the other hand, it exposes the global surveillance infrastructure we’ve quietly built.
If one person can track planes, satellites, and sensors over a weekend, imagine what happens when governments, or malicious actors, deploy entire fleets of AI agents to do the same.
The most surprising part? The expensive parts of software (maps, interfaces, and data connections) are becoming nearly free.
Which means the real question is no longer, “Can someone build this?”
It's now “Who will?”
🔎 Consider
Prescriptive technologies… come with an enormous social mortgage. The mortgage means that we live in a culture of compliance.
— Ursula Franklin
As powerful tools become easier to build, we are entering a world increasingly shaped by systems we didn’t design and can’t fully see.
The vibe of the future is clear. The technology is here, it’s fast, and it’s accessible.
Now we have to decide what kind of world we want to build with it.
⚡ What You Can Do This Week
- Map the data around you. Pick one moment in your day (a commute, a purchase, a social post) and list the systems that might record it. Cameras, sensors, apps, satellites, payment systems. You don’t need to be paranoid. Just curious. Situational awareness begins by noticing how many systems are quietly watching the same moment.
- Notice where the human disappears. When you encounter a system this week (an algorithmic recommendation, automated moderation, or AI-generated result) ask a simple question: Who made the decision here? The person who wrote the code, the model that generated the answer, or the system that structured the options?
- Build something small. Try using an AI tool to create something simple. A script, visualization, or workflow that solves a small problem you understand well. The goal isn’t technical mastery. It’s learning how quickly powerful tools can appear when you combine domain knowledge with AI assistance.
- Slow down one conclusion. When a technological claim sounds inevitable (“AI will replace…”, “This is the future…”), pause before accepting it. Technology moves fast. Interpretation should not.
🔗 Navigation
Previous: DL 423 • Next: DL 425 • Archive: 📧 Newsletter
🌱 Connected Concepts
- Acceleration as Governance — when speed itself becomes a governing strategy: decisions are pushed faster than institutions, norms, or publics can meaningfully respond.
- Friction Removal — the systematic effort to eliminate procedural, technical, or social resistance in the name of scale, efficiency, or competitiveness.
- Compliance Culture — a social condition where systems are designed so that participation requires passive acceptance rather than meaningful consent or oversight.
- Guardrail Inversion — the rhetorical shift where safeguards are reframed as obstacles, allowing their removal to appear pragmatic or inevitable.
- Infrastructure Lock-In — once verification systems, surveillance mechanisms, or platform standards are deployed at scale, they become difficult to reverse even if their original justification fades.
- Legibility Pressure — the demand that people, identities, and behaviors become measurable, verifiable, and machine-readable in order to participate in digital systems.