TLDR 45

Too Long; Didn't Read Issue 45

Published: 2016-05-13 • 📧 Newsletter

Welcome to issue 45 of the TL;DR Newsletter. This week...don't forget to trust the algorithm. ;)

If you have feedback, questions, or concerns...please feel free to hit the "reply" button and send me a response. I'd love to hear from you.

In last week's issue, I discussed my field testing of Nuclino. I inadvertently shared the wrong URL in the post. You can sign up for Nuclino and follow them on Twitter. I'm having a meeting with the Nuclino developers this upcoming week to share some things I'd like to see in the platform. Please shoot me a note if you've got some ideas.

This week I worked on the following:


🔖 Key Takeaways


📺 Watch

Excellent video from The School of Life YouTube channel. Perfect advice on the art of listening. As they point out at the start of the video...people suggest that we should all be good listeners...but no one ever taught us how.

Follow The School of Life on Facebook or Twitter depending on which you prefer.


📚 Read

Incredible post from Andrew Sullivan in NY Mag.

What interests me about the post is that it presents a pretty compelling case (IMHO) about the disruptive potential of the Internet and other communication technologies for institutions of power. New regimes and groups are given a microphone to speak...whether we like it or not. Imagine a possible future in which brick and stone do not identify a specific group or affiliation. Rather, we could be identified as a community, or as citizens, based on ideologies and affiliations.

Fascinating piece that I'd love to get a group of people to discuss. The challenge is that the current political climate and dystopian election campaign here in the U.S. sometimes make discussion untenable.


Over the past couple of weeks, many people have been shocked (once again) that Facebook and other companies are adjusting our feeds online. Twitter recently made headlines when they announced that they would start "aggregating" the info they present to you. Facebook and Google have been questioned by governments in the U.S. and globally seeking to better understand how and why some news is privileged over other information. This identification of "editorial judgment" in the news feed is of high interest as the elections are red hot in the U.S.

Please keep in mind that news feeds and search results are determined by algorithms. Data is collected every time you select a link, respond to a comment, or hover your cursor over a spot on a page. Data is always being collected, and has been for years. This information is used to modify online content, improve the algorithm, and improve the feed.

Please also keep in mind that when we ask companies (e.g., Facebook) to explain the process, they indicate that it's the algorithms that are making the decisions. But it is human beings who write the code that makes those algorithms function. They're constantly tweaking things to get better user responses. When Facebook notices that you're not sharing as much as you used to...things change...and you start sharing.

Finally, as more and more content floods online, it's impossible for us to negotiate this firehose. As a result, the feed is dying. We need these algorithms to tell us what to read...and what to ignore. That...and individuals that curate things into newsletters. ;)


One month ago I wrote up a post about bots (or artificial intelligence) acting as teaching assistants in online or hybrid classes. I discussed my work and thinking about these possibilities, and received a decent amount of both positive and negative feedback.

This post from The Wall Street Journal discusses the use of IBM's Watson to provide the brains behind Ms. Jill Watson, a teaching assistant in graduate classes at the Georgia Institute of Technology. They're using Ms. Watson (an AI bot) to answer some of the low-level questions and provide feedback on student work. The post goes on to discuss the role of "bots" in other fields and for other purposes.

As more information floods into online spaces, perhaps it's not such a bad idea to use some "algorithms" to handle low-level discussions and tasks. The problem arises when my email bot is communicating with your email bot and neither of us knows. :)


A guest post from Sarah Guminski in Scientific American examining the motivation that fuels the volunteers who edit Wikipedia.

I'm fascinated by the discourse that happens on Wikipedia...but not on the pages for individual articles. You need to click on the "Talk" link for a page to see the real "dialogue" around a topic. The "Talk" link sits at the top left of every Wikipedia article. A fascinating read is the "Talk" page for the Gamergate article.

The piece by Guminski shares research identifying the common element that binds this volunteerism: the community that is built in the endeavor.


There's been a lot of fascinating news and research lately about the brain and how it is wired. Some of this recent news from neuroscience talks about the way that our brains "take out the trash."

We learn in basic educational psychology classes that as we learn new things (and multitask) we strengthen the neural connections (i.e., myelination). Through our daily cognitive efforts, we also forget and break down old pathways that we no longer regularly use (i.e., synaptic pruning). Sadly, after my doctoral program I cannot remember a single Simpsons or Hogan's Heroes episode. :(

This post describes the brain as a garden, and the role of the glial cell as the "gardener" of the brain as it pulls up weeds, kills pests, and prunes. The microglial cells pay attention to proteins to know what to prune, and what to keep.

All of this cleaning and taking out the trash happens when you sleep and nap. You also have some choice in what your brain decides to keep and delete. Whatever you spend time focusing on will be privileged over other information and save itself from the chopping block. This is possibly why I know too much about psychometrics and not enough about Springfield.


🔨 Do

When I was teaching middle school, one of our school initiatives was to have two hours set aside each week for informal learning activities. During this "advisory" period, I would play a stock market game with the students to build their financial literacy.

I would definitely play this "capture the flag" game with students to build their web literacies.


🤔 Consider

"The first principle is that you must not fool yourself and you are the easiest person to fool." — Richard Feynman



