TLDR 115

Too Long; Didn't Read Issue 115

Published: 2017-09-16 • 📧 Newsletter

Welcome to issue 115 of TLDR. The picture above may change everything you've heard.

This week I sent out the following:


🔖 Key Takeaways


📺 Watch

This video was created by Wit & Wisdom and shared by Greenpeace...and it ultimately made it to my eyes thanks to Laura Hilliger.

This week we saw the launch of the new iPhone to much fanfare in a brand new, "ecologically friendly" Apple campus and Steve Jobs Theater.

As we all line up to purchase the "brand new" and "greater than before"...we need to think about the overall effects of this machine we've helped build.


📚 Read

This week two important Pew Research Reports were released.

The first report documented that two-thirds (67%) of Americans get at least some of their news on social media, and that Twitter, YouTube, and Snapchat have all grown in usage and uptake. We can connect these results to research I've shared previously on TL;DR indicating that most people who get their information on social media treat the platform itself as the "source" and don't evaluate or question the original source, bias, or perspective behind that information. They simply believe it to be true.

The second Pew Report is the link and image I shared above. The research outlines five distinct groups in an "information-engagement typology":

This typology is a useful lens for examining our information-seeking and consumption habits online. I ask that you consider which group you fit within. I also ask you to consider how this affects the way you create and share content online.


I didn't share this news about Equifax last week as I wanted to get some perspective and actionable advice to share. Sadly, a week later...I don't have any. In this post, Cory Doctorow does a good job of setting the scene for what happened.

Basically, Equifax was sloppy and failed to patch a two-month-old bug that led to the breach. Equifax waited five weeks to be transparent with their customers and admit to the breach. In the meantime, executives sold their stock to cash in before the public was informed. Brian Krebs also has an excellent post on exactly how this happened.

Here's what you need to know. First, breaches and hacks like this will happen. The best thing you can do is remain vigilant: use robust passwords, and more importantly, be prepared to change them frequently. Second, if/when a breach happens, we need to require that companies are transparent and let us know a breach has occurred. That allows us to change passwords and habits, and to prepare. I relate this to the hurricane that just passed through my area: I was informed that it was on the way, my family prepared, and now we're monitoring the impact after it has passed. Third, if a company plays it "fast and loose" with your data, you usually have the opportunity to choose other companies, or build your own tools. The problem is that Equifax is one of the three big credit bureaus that you have to use...you don't have a choice. For that reason, we need to really think through our relationships with these companies.
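Since I keep saying "use robust passwords," here is a minimal sketch of what that can look like in practice, using Python's standard secrets module. The function name and the 20-character length are my own choices for illustration, not advice drawn from any of the posts linked above.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation.

    The secrets module (unlike random) is designed for security-sensitive use.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run, e.g. 'r]B7q#...'
```

A password manager will do this for you, but the underlying idea is the same: unique, high-entropy passwords that you can throw away and regenerate the moment a breach like Equifax's is disclosed.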

Finally, I know that it can be a challenge to read through and understand the nuances in privacy, security, and encryption. As a user of the Internet, you need to begin to understand these complexities. That means we all need to read and inform ourselves.


This report from Jisc (formerly the Joint Information Systems Committee) surveyed 22,000 students across 74 organizations in the UK and 10 international universities. The results suggest that students want more digitally savvy instructors who make more meaningful use of tech, and that they want technology embedded into instruction to prepare them for the future workforce.

As Stephen Downes notes, the problem with surveys of this population is that they're surveying students. I've done surveys like this in the past, where we canvass students, alumni, or employers to have them evaluate the program, and more specifically its technology instruction. The results usually suggest that the use of technology in instruction is subpar. My problem with this is that we're not really looking to expand our use of technology, or to think about agile or entrepreneurial uses of tech in instruction. I believe these results are valid and "true"...but we knew that already. We need to think about new uses of tech, not what we needed a decade ago.

Access the full digital student tracker report (pdf) and the accompanying briefing booklet (pdf).


My thinking about technology has been changing drastically over the last two weeks, and that change is primarily evidenced by this link and the next one, about Reddit. I urge you all to think deeply about the many forces at play, and about how you feel about the consequences.

One of the subthreads in the aftermath of the 2016 U.S. elections is that we've put a magnifying glass to the digital businesses that provide us services and spaces online. Recent reporting extends this examination to suggest that this is part of a new theory of war that uses our own information habits against us. As we've discussed here in TL;DR, we regularly believe the information we read online without questioning or evaluating it. This is all the more problematic given that trolls, bots, and filter bubbles are acting against us. Finally, we need to accept the fact that these digital services are helping spread hate and misinformation for profit, even though Facebook (for one) doesn't want to admit it.

My thinking is slowly, steadily changing. What do you think?


After a lot of bad press, and amid much controversy, Reddit began to clean up many of its "hateful" and offensive communities in 2015.

Researchers from the Georgia Institute of Technology took a look at the effectiveness of this strategy. (For the supergeeks out there, they compiled this dataset with PushShift.io.) The results suggest that the practices of quarantining and banning hateful and offensive speech on the site generally worked. They conclude:

For the definition of "work" framed by our research questions, the ban worked for Reddit. It succeeded at both a user level and a community level. Through the banning of subreddits which engaged in racism and fat-shaming, Reddit was able to reduce the prevalence of such behavior on the site.
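For anyone curious how a dataset like this gets assembled, here is a rough sketch of pulling a banned community's historical posts from the Pushshift API. The endpoint, parameters, and field names are my assumptions about Pushshift's public interface, and the subreddit name is a placeholder; this is not the Georgia Tech team's actual pipeline.

```python
# Hypothetical sketch of querying Pushshift for historical Reddit submissions.
# Endpoint and parameters are assumptions about the public Pushshift API,
# not the researchers' actual data-collection pipeline.
import requests

def fetch_submissions(subreddit: str, before: int, size: int = 100) -> list:
    """Fetch up to `size` submissions from `subreddit` posted before the given epoch time."""
    resp = requests.get(
        "https://api.pushshift.io/reddit/search/submission/",
        params={"subreddit": subreddit, "before": before, "size": size},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("data", [])

# Placeholder subreddit name; 1435708800 is 2015-07-01 UTC, shortly after the bans.
posts = fetch_submissions("some_banned_subreddit", before=1435708800)
for post in posts[:5]:
    print(post.get("author"), post.get("title"))
```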

This has me wondering. Given the storyline I've (hopefully) drawn out in this week's issue, what sort of discourse practices do we want in these online social environments? How do we address this when we encounter it? Finally, is there a "terms of use" or "terms of service" for freedom of speech?


🔨 Do

This post from Aaron Davis details his thoughts about his current writing workflow, and iterations over time. This post is motivated primarily by Doug Belshaw's recent series on how to blog.

Davis describes in detail the texts and tools he's used over time, and the challenges with each one. His current system involves the use of Trello, Markdown, and Google Docs. I think I need to set up a Hangout with Aaron and record it, so he can finally teach me to get on board with Markdown.
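For those (like me) who haven't gotten on board with Markdown yet, the appeal is simply readable plain text that converts cleanly to HTML. Here is a small illustrative sketch using the third-party Python-Markdown package; it is just a demo of the format, not part of Aaron's actual workflow.

```python
# Tiny Markdown demo using the third-party `markdown` package
# (pip install markdown). Illustrative only, not Aaron Davis's workflow.
import markdown

source = """# Draft post

Some *emphasis*, a [link](https://example.com), and a list:

- Trello for planning
- Markdown for drafting
- Google Docs for feedback
"""

html = markdown.markdown(source)
print(html)  # <h1>Draft post</h1> ... <ul><li>Trello for planning</li> ...
```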

My workflow has been changing quite a bit, and I've been meaning to get out a post detailing this. One of the primary reasons is that I have a new crop of students starting up websites and blogging, and I want to give them ideas to scaffold their work.

As Aaron asks at the end of his post, what workflow do you use as you write?


🤔 Consider

"It is the mark of an educated mind to be able to entertain a thought without accepting it." — Aristotle





Part of the 📧 Newsletter archive documenting digital literacy and technology.