DL 197
The Privacy Paradox
Published: 2019-05-11 • 📧 Newsletter
Welcome to Issue 197. The privacy paradox.
Hi all, welcome to Digitally Literate. My name is Ian O'Byrne. I research, teach, & write about technology in our lives. I try to synthesize what happened this week in tech...so you can be the expert as well.
I posted a couple of things this week:
- Formative and Summative Assessments - An overview of learning pathways, purpose of assessment, and formative/summative assessments.
- Journaling as a means to scaffold and assess student learning - I gave a talk on formative assessments and journaling at a professional development workshop on campus this week.
- Assistive technology to help people with disabilities - An overview of recent innovations in technology to support individuals with disabilities.
🔖 Key Takeaways
- Privacy Paradox Explained: Research shows people express privacy concerns but continue disclosing data, driven by "privacy calculus" where convenience outweighs abstract risks.
- Corporate Privacy Reframing: Google and Facebook now claim "the future is private" while redefining privacy from "we don't collect data" to "we don't share it elsewhere."
- Values Over Facts: Fact-checking fails when disagreement stems from different values and questions rather than different answers to the same questions.
- Teachers as Daring Leaders: Brené Brown argues teachers model vulnerability and courage daily, deserving recognition as our most daring institutional leaders.
- Memory Through Deletion: Johnny Harris counterintuitively argues the key to remembering your life through photos is deleting most of them to curate meaningful memories.
📺 Watch
How to Remember Your Life
Johnny Harris on how to use your phone's camera to take pictures and organize them so they connect to your long-term memory.
Harris suggests…the key to remembering your life is deleting photos. This is a great assignment for use in high school and higher ed.
Harris's counterintuitive argument—delete to remember—challenges the "capture everything" default that smartphones enable. When thousands of photos exist undifferentiated, none become memorable; the archive overwhelms rather than preserves. Curation creates hierarchy: selecting which moments matter transforms passive capture into active memory-making. The educational application is rich: asking students to review, select, and justify their photo choices develops critical evaluation skills while connecting to broader information literacy. What deserves preservation? What can we let go? These questions extend far beyond photo libraries.
📚 Read
Fact-Checking Can't Do Much When "Dueling Facts" Are Driven by Values
Morgan Marietta and David C. Barker, authors of the Inconvenient Facts blog on Psychology Today, share some insight from the research presented in their book, One Nation, Two Realities.
Values not only shape what people see, but they also structure what people look for in the first place. We call this intuitive epistemology.
Those who care about oppression look for oppression — so they find it.
Those who care about security look for threats to it — and they find them.
In other words, people do not end up with the same answers because they do not begin with the same questions.
The "intuitive epistemology" concept reframes polarization debates. Traditional fact-checking assumes shared questions with disputed answers—we can verify which answer is correct. But if people ask different questions based on different values, fact-checking the answers misses the point. Someone focused on systemic oppression and someone focused on security threats aren't disagreeing about facts; they're investigating different phenomena. This doesn't mean relativism wins—some claims remain empirically false regardless of values. But it explains why fact-checking rarely changes minds: it addresses symptoms rather than underlying value divergences.
Google's Sundar Pichai: Privacy Should Not Be a Luxury Good
This week Google held their annual conference for developers. You can watch the keynote here. I did…and I love this event every year.
This year we're seeing tech companies try to reshape the narrative around privacy and our digital tools. Many of these companies are trying to redefine privacy for us: from "we don't take your data" to "we don't give it to anyone else." There is a big difference between the two.
At multiple points in the Google I/O keynote, they mentioned that "the future is private" and that privacy is for everyone. Google's CEO, Sundar Pichai, puts a finer point on this in the op-ed linked above, where he also highlighted the need for legislation around privacy.
Mark Zuckerberg delivered a similar message at Facebook's developer conference. "The future is private," he said, and indicated Facebook will focus on more intimate communications. He shared these messages in a Washington Post op-ed just weeks before.
This transparency is good, but we need to question how transparent all of these companies are with our data. The takeaway is that they're still collecting your data. They're still using it to train machine learning engines. Some will sell (or hand off) your data to others. Data breaches and hacks will happen. Business will continue as usual as these companies seek to redefine privacy in an attempt to pacify user concern and angst on the topic.
The synchronized "future is private" messaging from Google and Facebook signals coordinated narrative management rather than genuine policy shift. The redefinition sleight-of-hand deserves attention: "privacy" traditionally meant not being observed, not having data collected. The new definition—we collect everything but promise to keep it ourselves—reframes surveillance as protection. The legislation calls are equally strategic: companies preferring federal regulation they can influence over state-by-state restrictions they can't. Digital literacy requires parsing these redefinitions, understanding what's actually changing (messaging) versus what's staying the same (business model).
The Privacy Paradox: Why Do People Keep Using Tech Firms That Abuse Their Data?
As discussed in the earlier story, tech companies are trying to redefine privacy in our use of these digital texts and tools.
The question ultimately becomes…why do we continue to use and trust these companies?
John Naughton examines this problem through the lens of the "privacy paradox" as detailed by Nina Gerber, Paul Gerber, and Melanie Volkamer in Computers & Security. You can read more about the "privacy calculus" that we all employ as we consider our privacy, and willingness to disclose.
The privacy paradox research explains apparent hypocrisy: people say they value privacy but behave as if they don't. The "privacy calculus" framework reveals this isn't hypocrisy but rational trade-offs under uncertainty. Privacy harms are abstract, future, probabilistic—your data might be breached, might be used against you, might enable manipulation. Convenience benefits are concrete, immediate, certain—the app works, connection happens, information flows. Humans systematically discount future uncertain risks against present certain benefits. Understanding this pattern suggests interventions: making privacy harms more concrete and immediate, rather than simply warning people to care more.
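The asymmetry described above (concrete, immediate benefits versus abstract, probabilistic, future harms) can be sketched as a toy calculation. This is only an illustrative model with made-up numbers and a simple annual discount factor, not a formula from the Gerber, Gerber, and Volkamer paper:

```python
# A toy model of the "privacy calculus": an immediate benefit is felt at
# full strength, while a potential future harm is weighted by its
# probability and discounted for every year it sits in the future.
# All values and the discount function are illustrative assumptions.

def perceived_disclosure_value(benefit_now, harm_if_breached,
                               breach_probability, years_until_harm,
                               annual_discount=0.3):
    """Net perceived value of disclosing data right now."""
    discounted_harm = (harm_if_breached * breach_probability
                       * (1 - annual_discount) ** years_until_harm)
    return benefit_now - discounted_harm

# A large potential harm (100) shrinks once it is uncertain (10% chance)
# and five years away, so a modest immediate benefit (5) wins out.
value = perceived_disclosure_value(benefit_now=5, harm_if_breached=100,
                                   breach_probability=0.1,
                                   years_until_harm=5)
print(round(value, 2))  # prints 3.32 — disclosure "feels" worth it
```

Notice that the same harm, made certain and immediate (probability 1.0, zero years away), flips the calculation decisively negative, which is exactly why the paragraph above suggests making privacy harms concrete and immediate rather than just urging people to care more.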
Teachers: Our Most Daring Leaders
I really enjoy all of the work Brené Brown shares. She's now on Netflix, and her blog is a great resource. I'd start with Daring Greatly if you haven't encountered her before.
This post is adapted from Dare to Lead, and it offers support for educators as we celebrated Teacher Appreciation Day here in the U.S.
Brown's vulnerability framework applied to teaching reveals what educators do daily: entering spaces where they might fail publicly, caring about outcomes they can't control, exposing their thinking to critique. This is daring leadership in her terms—choosing courage over comfort repeatedly. The Teacher Appreciation framing matters but risks becoming annual sentiment disconnected from material support. Brown's intervention connects appreciation to understanding: teachers aren't saints enduring hardship but professionals practicing courage. This reframe demands different responses—not gratitude for sacrifice but systems enabling sustainable practice.
A Manifesto for Strategic Pedagogical Change
Peter Bryant with a great post on how to incorporate real pedagogical change in your classroom.
His manifesto is as follows:
- Have a plan
- Reward and recognize excellence and achievement
- Be in the conversation
- Connection is the glue
- We don't know what the students want - but we need to
- Expose yourself to risk
- Be rigorous, evidence-based, and critically reflective
- Enhance, don't replace
- The future happens
Bryant's manifesto balances aspiration with realism. "Have a plan" and "the future happens" create productive tension: preparation matters but adaptability matters more. "Enhance, don't replace" offers technology guidance often missing from edtech enthusiasm—technology serves pedagogy, not vice versa. "We don't know what students want" acknowledges the presumption in much educational design while insisting on seeking student input. The "expose yourself to risk" point connects to Brown's vulnerability work: pedagogical innovation requires willingness to fail publicly. Manifestos risk remaining declarations; Bryant's items are actionable enough to guide actual practice.
🔨 Do
Google Calendar for Fitness Tracking
My wife and I have started to track more of our food consumption because we want to understand how the foods we eat impact our health and wellness. This post from Engadget shares how to use the Goals function in Google Calendar to keep track of workouts, food, maladies, etc.
The Goals feature represents Google's lifestyle integration strategy—becoming infrastructure for daily life beyond information retrieval. Tracking workouts, food, and health patterns in the same system managing work meetings creates a comprehensive behavioral picture. The convenience is real: one place for everything, patterns visible across domains. The privacy paradox this issue explores applies directly: we know this data enables the profiling discussed elsewhere, yet the utility draws us in. Using the tool thoughtfully—being aware of what data accumulates while gaining genuine benefit—exemplifies the calculus we all navigate.
🤔 Consider
"The most courageous act is still to think for yourself. Aloud." — Coco Chanel
Chanel's observation on thinking aloud connects to this issue's themes of privacy, values, and disclosure. The privacy paradox involves calculated silence—what we share, what we withhold. The dueling facts research shows how thinking flows from values we rarely examine aloud. Teachers modeling vulnerability think publicly, demonstrating courage Chanel identifies.
🔗 Navigation
Previous: DL 196 • Next: DL 198 • Archive: 📧 Newsletter
🌱 Connected Concepts:
- Privacy Research — Privacy paradox and privacy calculus explaining why stated concerns don't match disclosure behavior as users trade abstract future harms for concrete present benefits in Behavioral Economics.
- Corporate Rhetoric — Google and Facebook redefining privacy from data non-collection to data non-sharing while business models remain unchanged in Platform Criticism.
- Political Epistemology — Intuitive epistemology showing how values shape questions people ask making fact-checking ineffective for value-driven disagreements in Media Literacy.
- Educational Leadership — Brené Brown framing teachers as daring leaders who practice vulnerability and courage daily in Teaching Practice.
- Pedagogical Change — Peter Bryant's manifesto balancing planning with adaptation and evidence with risk-taking in Instructional Design.
Part of the 📧 Newsletter archive documenting digital literacy and technology.