
Growth & Engagement

WELCOME

Welcome back, friends and family!

This week I also posted the following:

If you haven’t already, please subscribe if you would like this newsletter to show up in your inbox. Feel free to reach out and say hey at hello@digitallyliterate.net.

Watch

The Fastest Way to Learn a New Language: The Video Game Map Theory

Johnny Harris on how video games helped him rethink language learning.

Read

The mess at Medium

When Medium first came out, I was in love. I had students write on the platform. I developed several publications and shared my work there. In the nine years since the platform launched, things have steadily gotten worse.

Things came to a head this week when Medium CEO and Twitter co-founder Ev Williams sent an email to the entire Medium staff announcing that employees charged with doing journalism were free to quit, and that the company would in fact be shifting away from professional journalism altogether.

In truth, it seems the situation was much worse. The platform has grown to include writing of every kind: viral posts about COVID-19; generic business wisdom; tech blogging; productivity porn; actual porn.

There is a lot to learn from this story. For me, the key lesson is about having a space of your own to write. Lately, I’ve been having thoughts about moving this newsletter to Substack. Lessons learned from Medium are a reminder to keep building our own spaces.

If Mark Zuckerberg won’t fix Facebook’s algorithms problem, who will?

It was quite the week for Mark Zuckerberg and Facebook.

The CEOs of Facebook, Twitter, and Google testified before Congress about the spread of extremism, hoaxes, and misinformation online.
Zuckerberg suggested in his written statement that Facebook has a great system in place, and that its future competitors should be eliminated.

Research published by the Tech Transparency Project (TTP), a nonprofit watchdog organization, indicated that militia groups not only remain on the platform, but are using it to build movements.

And…Facebook’s bullying and harassment policy explicitly allows for users to call for the death of public figures.

what demoralization does to teachers

Anne Helen Petersen on chronic burnout and exploitation.

“Demoralization occurs when teachers cannot reap the moral rewards that they previously were able to access in their work. It happens when teachers are consistently thwarted in their ability to enact the values that brought them to the profession.”

This is chronic burnout and deep demoralization as labor is increasingly under-funded, under-valued, and under-resourced.

Telling teachers they’re great isn’t enough. If you value them, act, vote, and speak in a way that evidences that value. They have held a crumbling system together for so long. It’s time to give them relief — and reconsider its construction.

The following image from Al Abbazia seems to capture the moment.

Stanford researchers identify four causes for ‘Zoom fatigue’ and their simple fixes

  • Excessive amounts of close-up eye contact are highly intense. Solution – take Zoom out of full-screen mode, reduce the size of the Zoom window relative to the monitor to minimize face size, and use an external keyboard to increase the personal space bubble between yourself and the grid.
  • Constantly seeing yourself in real time during video chats is fatiguing. Solution – once you see that your face is framed properly, use the “hide self-view” option, which you can access by right-clicking your own video.
  • Video chats dramatically reduce our usual mobility. Solution – an external keyboard or monitor will allow you to sit back and doodle.
  • The cognitive load is much higher in video chats. Solution – give yourself an audio-only break: turn off your camera and turn away from the screen.

You might also want to check out the Zoom Exhaustion Fatigue (ZEF) scale. The paper is here, and the survey is available here.

How politics tested Ravelry and the crafting community

Thanks to one of my friends, I’ve been fascinated with Ravelry. Yes, you’re right. The community site, organizational tool, and yarn & pattern database for knitters and crocheters. Here’s the thing…I don’t knit. 🙂

I am interested in the way they developed this community and networking space all focused on making.

This post discusses the challenges the developers faced, having hoped that everyone would simply behave while in the community.

Do

Zoom Escaper lets you sabotage your own meetings with audio problems, crying babies, and more

Had enough Zoom meetings? Can’t bear another soul-numbing day of sitting on video calls, the only distraction your rapidly aging face, pinned in one corner of the screen like a dying bug? Well, if so, then boy do we have the app for you. Meet Zoom Escaper: a free web widget that lets you add an array of fake audio effects to your next Zoom Call, gifting you with numerous reasons to end the meeting and escape, while you still can.

Discuss


I guess that’s what growing up is. Saying good-by to a lot of things. Sometimes it is easy and sometimes it isn’t. But it is all right.

Beverly Cleary


Fagradalsfjall is a volcano about 25 kilometres from Reykjavík, Iceland, which has been dormant for 6,000 years. On Friday, a new vent opened up just to the south at Geldingadalir, which is forming a new volcano right now. This is the first eruption on Iceland’s Southern Peninsula in 800 years. And we can watch it happening live via webcam.

If you’re really adventurous, you can fly through it via drone.

Connect at hello@digitallyliterate.net or on the social network of your choice.

The Coming War

WELCOME
The Coming War
Digitally Lit #271 – 12/05/2020

Thank you for being here. You are valued.

This week I worked on the following:

  • Trust, But Verify – Users of the Internet become pawns in a flow of information that circulates endlessly in the ether, causing a contagion that is nearly insurmountable.
  • Shades of Gray – Absolute truth becomes even more subjective as there are very few things that are clearly right or wrong.

If you haven’t already, please subscribe if you would like this newsletter to show up in your inbox. Feel free to reach out and say hey at hello@digitallyliterate.net.

Watch

1981 Nightline interview with Steve Jobs

 

Ted Koppel, Bettina Gregory, and Ken Kashiwahara present news stories from 1981 on the relevance of computers in everyday life and how they will affect our future. Included are interviews with Apple Computer Chairman Steve Jobs and writer David Burnham.

Read

Google Researcher Says She Was Fired Over Paper Highlighting Bias in A.I.

Timnit Gebru, a prominent co-leader of the Ethical Artificial Intelligence team at Google, sent an email to her colleagues voicing exasperation over the company’s response to efforts to increase minority hiring.

Gebru had been working on a research paper that she hoped to publish, but ran into resistance from her superiors at Google. And so she sent a letter expressing her frustration to the internal listserv Google Brain Women and Allies.

The paper, titled “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” lays out the risks of large language models—AIs trained on staggering amounts of text data.

A few days later, Gebru was fired; Google reportedly found the email “inconsistent with the expectations of a Google manager.” The email details the struggles Gebru experienced as a Black leader working on ethics research within the company, and presents a bleak view of the path forward for underrepresented minorities there.

The coming war on the hidden algorithms that trap people in poverty

A growing group of lawyers are uncovering, navigating, and fighting the automated systems that deny the poor housing, jobs, and basic services.

Credit scores have been used for decades to assess consumer creditworthiness, but their scope is far greater now that they are powered by algorithms: not only do they consider vastly more data, in both volume and type, but they increasingly affect whether you can buy a car, rent an apartment, or get a full-time job. Their comprehensive influence means that if your score is ruined, it can be nearly impossible to recover. Worse, the algorithms are owned by private companies that don’t divulge how they come to their decisions. Victims can be sent into a downward spiral that sometimes ends in homelessness or a return to their abuser.

Online exam monitoring can invade privacy and erode trust at universities

Bonnie Stewart on how testing and proctoring methods that invade privacy and erode trust end up undermining the very integrity that institutions demand students uphold.

As institutions of higher ed turn to online proctoring in the name of academic integrity, the risks of exchanging the four walls of the classroom for surveillance platforms may be higher than many institutions bargained for.

As Stewart points out at the end of the piece, higher ed doesn’t need proctoring. Timed tests value what students remember.

Is memorization really a valid educational reason for risking privacy, well-being, and tight university budgets in a world where students will spend most of their lives with Google in their pockets?

Examining Screen Time, Screen Use Experiences, and Well-Being in Adults

This study examined the relationship between screen time and well-being in adults, including positive relationships, meaning, and loneliness. It is possibly the first study to investigate how much pleasure and meaning people feel during screen use, and the mediating effects of those experiences.

Screen time was not significantly correlated with well-being, and screen use experiences did not mediate any of the relationships between screen time and well-being.

However, the meaning people found in screen use was positively associated with overall well-being and positive relationships. This finding suggests that focusing on the amount of screen time may be a limited approach; factors related to the quality of screen use may be equally if not more important for well-being.

Teaching in the Pandemic: ‘This Is Not Sustainable’

Teacher burnout will erode instructional quality, stymie working parents and hinder the reopening of the economy.

“If we keep this up, you’re going to lose an entire generation of not only students but also teachers,” said Shea Martin, an education scholar and facilitator who works with public schools on issues of equity and justice.

Do

Enhance Student Engagement with Virtual Social Learning Spaces

Catlin Tucker with ideas and strategies for using virtual shared spaces to create student-centered learning experiences.

Consider


If you get tired, learn to rest, not to quit.

Banksy

Connect at hello@digitallyliterate.net or on the social network of your choice.

Digitally Literate #228


WELCOME
Under Observation
Digitally Lit #228 – 1/11/2020

Hi all, welcome to issue #228 of Digitally Literate. If you haven’t already, please subscribe if you would like this to show up in your email inbox.

If you’re reading on the website, feel free to leave a comment behind. You can also use Hypothesis to leave notes and annotations. Feel free to reach out and let me know what you think of this work at hello@digitallyliterate.net.

I’m also always reading and learning online. I choose to share some things socially…and others I keep for my own reference. If you want to see a feed of my notes…here you go.

Let’s get to the news of the week. Thanks for stopping by.

Watch

Frames (11:11)

I’m including more of a focus on surveillance capitalism in my ed tech course this semester.

Frames is a short video that shows how a smart city tracks and analyzes a woman as she walks through the city. Things she does are interpreted and logged by the city system, but are they drawing an accurate picture of the woman?

The video is accompanied by a facilitator guide and media pack to support discussions.

Read

Caught in the Spotlight

Chris Gilliard with a must-read essay exploring how tech that tracks creates different spatial experiences for users on opposite ends of the tool. Different realities exist for different races and classes at the receiving end of the surveilling gaze.

…these technologies will continue to fuel a negative feedback loop between individuals and communities on both ends of the surveillance spectrum, where the only real winners are the companies who profit from the fear they help to manufacture

What if modern conspiracy theorists are altogether too media literate?

A piece in The Outline by Will Partin on how modern conspiracy theorists are co-opting the tools and rhetoric of media literacy and critical thinking to assemble and support their theories.

The wager of this critique is that skepticism can be just as corrosive to society as naïveté. If this is true, then we’ve been looking at the issues media literacy purports to address backwards. What if the problem is not that people don’t check their sources, consult with a friend, or read critically, but precisely that they do?

All of YouTube, Not Just the Algorithm, is a Far-Right Propaganda Machine

Last week in this spot in the newsletter, we discussed the research from Mark Ledwich and Anna Zaitsev suggesting that YouTube’s recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

If you scroll to the bottom of issue #227, you’ll see that good friend Aaron Davis (subscribe to his newsletter ReadWriteRespond if you haven’t already) shared a number of important links that add to the complexity of the story.

Not long after Aaron shared this info, Becca Lewis used it as an opportunity to compile the piece linked at the top of this story. Lewis posits that we need to look beyond the recommendation algorithm to understand YouTube’s political propaganda problem.

How Rupert Murdoch Is Influencing Australia’s Bushfire Debate

We haven’t talked much about the bushfires ravaging Australia. I have many friends who are impacted by these events…especially in New South Wales. My thoughts are with you in these times.

Within this ecological disaster is a media literacy lesson about informational wars and climate change. As the planet burns, disinformation is being used to shift blame and divert attention from climate change.

I’m in the middle of writing an NSF grant proposal to develop a climate change curriculum for pre-service teachers. I’d love your thoughts on this.

Live Your Best Life—On and Off Your Phone—in 2020

Technology has changed us, robbed us of something important, and we must get it back. It’s your devices versus your best life.

This post from Arielle Pardes at Wired shares a great list of books to help as you consider your life in a tech-saturated world. These texts provide some good advice on “rewriting bad habits, reviewing the devices we actually need, and relearning how to listen amid all the noise.”

Make

How to Use the New Creative Commons Chrome Extension

I use Creative Commons (CC) licensed images in all of my work. My process involves searching on Flickr while ensuring that I have permission to include the images I find in my work.

Richard Byrne shares guidance on the new Chrome extension from CC to help expedite that process.

Consider


Under observation, we act less free, which means we effectively are less free.

Edward Snowden


Digitally Literate is a synthesis of the important things I find as I critically consume and then curate as I work online. I leave my notes behind of everything that catches my eye, and then pull together the important stuff here in a weekly digest.

Neil Peart, the best drummer of all time, passed away as I was finishing this week’s issue. Take five minutes to witness some of his brilliance here.

Feel free to say hello at hello@digitallyliterate.net or on the social network of your choice.

Digitally Literate #222

WELCOME

A Broken World
Digitally Lit #222 – 11/16/2019

Hi all, welcome to issue #222 of Digitally Literate, thanks for stopping by. Please subscribe if you would like this to show up in your email inbox.

This week I posted the following:

Watch

Lil Buck with Icons of Modern Art (3:56)

Do yourself a favor. Put on a pair of headphones and watch this short clip.

It should provide an amuse-bouche as we begin to dig into the news for the week.

Read

I’m the Google whistleblower. The medical data of millions of Americans is at risk

A whistleblower who works on Project Nightingale, the secret transfer of the personal medical data of up to 50 million Americans from one of the largest healthcare providers in the US to Google, has expressed anger that patients are being kept in the dark about the massive deal.

The secret scheme, first reported by the Wall Street Journal, involves the transfer to Google of healthcare data held by Ascension, the second-largest healthcare provider in the US. The data is being transferred with full personal details, including name and medical history, and can be accessed by Google staff. Unlike other similar efforts, it has not been made anonymous through a process of removing personal information known as de-identification.

So the Internet didn’t turn out the way we hoped. Where do we go from here?

The Internet hasn’t lived up to all our dreams for it.

But it also may not conform to the nightmares (of misinformation, of alienation, of exploitation) that so many people spin around it now.

…after decades of imagining it as a utopia, and then a few years of seeing it as a dystopia — we might finally begin to see it for what it is, which is a set of powerful technologies in the midst of some serious flux.

Definitely check out this interactive piece from the NY Times.

The Captured City

Jathan Sadowski with an excellent look at how smart technologies are being used to make surveillance and infrastructure indistinguishable from one another.

The ‘smart city’ is not a coherent concept, let alone an actually existing entity. It’s better understood as a misleading euphemism for a corporately controlled urban future. The phrase itself is part of the ideological infrastructure it requires.

I also recommend checking out this piece from John Torpey in Forbes. Torpey connects the dots from surveillance communism to surveillance capitalism and beyond. Keep this in mind given the news I shared about Google acquiring Fitbit.

Surveillance capitalism, less overtly intrusive, makes our online activities a source of data that private firms harvest for their profit. Self-surveillance, finally, transforms our daily activities into a source of data that we train on ourselves.

Education, privacy, and big data algorithms: Taking the persons out of personalized learning

Results of a literature review on philanthropy in education from Priscilla M. Regan and Valerie Steeves in First Monday.

The research examines the impact that philanthropy by technology company foundations (e.g., the Bill and Melinda Gates Foundation, the Chan Zuckerberg Initiative) and education magazines have on personalized learning, while paying special attention to issues of privacy.

Findings suggest competing discourses on personalized learning revolve around contested meanings about the type of expertise needed for twenty-first century learning, what self-directed learning should look like, whether education is about process or content, and the type of evidence that is required to establish whether or not personalized learning leads to better student outcomes.

Privacy issues remain a hot spot of conflict, as the desire for more efficient outcomes comes at the expense of “student privacy and the social construction of and expectations about data and surveillance.”

‘I am a scavenger’: The desperate things teachers do to get the classroom supplies they need

The Washington Post asked teachers throughout the country how much they spend on supplies, what they buy and why. Teachers — mostly in public school districts but also in charter, private and Catholic schools — sent more than 1,200 emails to The Post from more than 35 states. The portrait that emerges is devastating — and reveals that the problem has existed, without remedy, for decades. And it has gotten worse over time.

In a related story, this piece by Jon Marcus highlights the fact that funding for institutions of higher ed has steadily declined over the last decade.

Our system is broken. We are not investing in our future.

Make

Entire YouTube Studio Setup on ONE DESK (12:09)

I’ve been rebuilding my office and will have some updates coming soon. One thing I’ve been investigating is a simple setup for recording video from my desk.

This setup from the DSLR Video Shooter YouTube channel looks great.

Consider


We came into a broken world. And we’re the cleanup crew.

Kanye West


Digitally Literate is a synthesis of the important things I find as I surf, skim, & scan the Internet each week. I take notes of everything that piques my interest, and then pull together the important stuff here in a weekly digest.

I enjoyed listening to this interview of Noam Chomsky by Zack de la Rocha while finishing up the newsletter.

Feel free to say hello at hello@digitallyliterate.net or on the social network of your choice.

The Messy Reality of Algorithmic Culture

danah boyd argues that we need to develop more sophisticated ways of thinking about technology before jumping to hype and fear.

Data-driven and algorithmic systems increasingly underpin many decision-making systems, shaping where law enforcement are stationed and what news you are shown on social media. The design of these systems is inscribed with organizational and cultural values. Often, these systems depend on the behavior of everyday people, who may not act as expected. Meanwhile, adversarial actors also seek to manipulate the data upon which these systems are built for personal, political, and economic reasons. In this talk, danah will unpack some of the unique cultural challenges presented by “big data” and machine learning, raising critical questions about fairness and accountability.

Video of the talk, with the slide deck off to the side, is available here.

The Devastating Consequences of Being Poor in the Digital Age

Mary Madden discussing the privacy and security violations that occur in our increasingly digitized society. This is especially true for marginalized and vulnerable populations.

The poor experience these two extremes — hypervisibility and invisibility — while often lacking the agency or resources to challenge unfair outcomes.

Madden draws on work that focused on privacy perceptions in the Post-Snowden Era, as well as how these experiences and resources vary by socioeconomic status, race, & ethnicity.

The story of income inequality and differential surveillance practices in America is also deeply intertwined with the history of racial inequalities. In addition to understanding the differing concerns of economically marginalized groups, it’s critical to understand how different racial and ethnic groups experience privacy. 

Madden closes with some important questions about the data ecosystem for all groups, especially as we seek to integrate marginalized and low-income communities. Specifically, her research suggests that low-income Americans, and particularly foreign-born Hispanic adults, “are disproportionately reliant on mobile devices as their primary source of internet access.”

This has me thinking about how data collection and algorithms unfairly shape access to and use of the Internet in and across these groups.

SOURCE: The New York Times

The Age of Cultured Machines


Two robots traverse the desert floor. Explosions from a decades-old conflict have left a pockmarked and unstable territory, though many more improvised bombs lie concealed in its vast reaches. Sunlight splays off the beaten edges of Optimus, the smaller robot. Its motors whir as its claw grasps an u…

From Sapiens.

This imaginary scene shows the power of learning from others. Anthropologists and zoologists call this “social learning”: picking up new information by observing or interacting with others and the things others produce. Social learning is rife among humans and across the wider animal kingdom. As we discussed in our previous post, learning socially is fundamental to how humans become fully rounded people, in all our diversity, creativity, and splendor.

 

If we didn’t have social learning, we wouldn’t have culture. As zoologists Kevin Laland and Will Hoppitt argue, “culture is built upon socially learned and socially transmitted information.” Socially acquired knowledge is distinct from what we learn individually and from information inherited through genes or through imitation.

 

Soon we might add robots to this list. While our fanciful desert scene of robots teaching each other how to defuse bombs lies in the distant future, robots are beginning to learn socially. If one day robots start to develop and share knowledge independently of humans, might that be the seed for robot culture?

 

This system of demonstrating tasks to one robot that can then transfer its skills to other robots with different body shapes, strengths, and constraints might just be the first step toward independent social learning in robots. From there, we might be on the road to creating cultured robots.
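Purely as a toy illustration of that transfer idea (my own sketch, not the system described in the article), a skill demonstrated on one robot might be retargeted onto a second robot whose joints have different ranges of motion:

```python
# Toy sketch of demonstration retargeting (illustrative only; not the
# system from the article). A skill recorded as joint-angle waypoints on
# one robot body is rescaled and clipped to fit a differently-built robot.

def retarget(trajectory, source_limits, target_limits):
    """Map each joint angle from the source robot's range onto the
    target robot's range, clipping to what the target can physically do."""
    result = []
    for pose in trajectory:
        new_pose = []
        for angle, (s_lo, s_hi), (t_lo, t_hi) in zip(pose, source_limits, target_limits):
            fraction = (angle - s_lo) / (s_hi - s_lo)    # where in the source range?
            new_angle = t_lo + fraction * (t_hi - t_lo)  # same fraction of target range
            new_pose.append(min(max(new_angle, t_lo), t_hi))
        result.append(tuple(new_pose))
    return result

# A demonstrated two-joint "reach" skill (hypothetical waypoints).
demo = [(0.0, 0.2), (0.5, 0.8), (1.0, 1.4)]
print(retarget(demo,
               source_limits=[(0.0, 1.0), (0.0, 1.5)],
               target_limits=[(0.0, 0.5), (0.2, 1.0)]))
```

The interesting step for anything like “social” learning would come after this: the second robot treating the retargeted demonstration as knowledge it can refine and pass along, rather than a fixed program.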

The post does a good job (IMHO) of connecting social learning to machine learning. The one loose thread it dangles is culture. I don’t think I’m ready to call this robot culture; I think there is more involved in culture.
It also seems obvious that the robot has no motivation of its own. Its motivation lies in the activities of the human beings in which it is included; the human participants in those activities carry that motivation within themselves. The robot has no more motivation than a hammer has motivation.

All annotations in the source.

How ‘Googling it’ can send conservatives down secret rabbit holes of alternative facts

We saw some of this happening in earlier research on online reading comprehension. Specifically, I had concerns about how algorithms might impact, shape, or modify what we’re looking for.

“Googling it” has become the news equivalent of “do your own research.” But neither Google, nor search terms, are purely neutral. “Even in the face of research and due diligence,” Tripodi wrote in her study, “voters can walk away from Google armed with alternative news and alternative facts.”

“When you search for something on Google, our goal is to provide you with results that are both authoritative and relevant to the query you have typed. This is why when you change your query and use different words, you may get different Search results. However, irrespective of your query, we continue to be committed to providing you the information you need to form your own opinions by surfacing a diversity of sources on our Search results pages,” Google said in a statement.

Tripodi coined the phrase “scriptural inference” to describe the search tactics she observed in her field study, because the method by which the people she observed picked Google search terms was a lot like the research methods taught in Bible study classes. “The conservatives I observed all hold the belief that certain fundamental truths exist, and they critically interrogate media messages in the same way they approach the Bible, focusing on specific passages and comparing what they read, see, and hear to their lived experiences,” Tripodi wrote in the study.

Algorithmic Accountability: A Primer

Big decisions about people’s lives are increasingly made by software systems and algorithms. This primer, originally prepared for the Progressive Congressional Caucus’ Tech Algorithm Briefing, explores issues of algorithmic accountability: the process of assigning responsibility for harm when algorithmic decision-making results in discriminatory and inequitable outcomes.
There are few consumer or civil rights protections that limit the types of data used to build data profiles or that require the auditing of algorithmic decision-making, even though algorithmic systems can make decisions on the basis of protected attributes like race, income, or gender.
This brief explores the trade-offs and debates around algorithms and accountability across several key ethical dimensions (a minimal illustrative audit sketch follows the list), including:
  • Fairness and bias;
  • Opacity and transparency;
  • The repurposing of data and algorithms;
  • Lack of standards for auditing;
  • Power and control; and
  • Trust and expertise.
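
To make the auditing dimension a bit more concrete, here is a minimal sketch of one widely used fairness check: demographic parity and the informal “80% rule” for disparate impact. This is my own illustration, not a method from the primer; the data, group labels, and threshold are all hypothetical.

```python
# Minimal fairness-audit sketch (illustrative only; not from the primer).
# It checks "demographic parity": whether an automated decision approves
# members of different groups at similar rates.

from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in decisions:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {g: a / t for g, (a, t) in counts.items()}

def disparate_impact(decisions, reference_group):
    """Ratio of each group's approval rate to the reference group's rate.
    Ratios far below 1.0 flag potential adverse impact; the informal
    "80% rule" from US employment law is one common threshold."""
    rates = approval_rates(decisions)
    reference = rates[reference_group]
    return {g: rate / reference for g, rate in rates.items()}

# Hypothetical audit log: (group, was the applicant approved?)
data = [("A", True), ("A", True), ("A", False), ("A", True),
        ("B", True), ("B", False), ("B", False), ("B", False)]

for group, ratio in disparate_impact(data, reference_group="A").items():
    flag = "POTENTIAL ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"group {group}: impact ratio {ratio:.2f} ({flag})")
```

Real audits are far harder than this, of course: they require access to the data and the model, agreement on which attributes are protected, and a standard for which metric matters, which is exactly the gap in standards the primer highlights.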