This week…
Three stories inform the theme for this issue: African languages getting more scientific terms (Sarah Wild, Nature), a brief history of hand gestures (William Park, BBC), and how the co-founder of Snopes plagiarised articles for the fact-checking site (Dean Sterling Jones, BuzzFeed News).
Here’s a selection of other stories on my radar, a few personal recommendations, and the chart of the week.
Bad News: Selling the story of disinformation
Joseph Bernstein for Harper’s Magazine:
The Commission on Information Disorder is the latest (and most creepily named) addition to a new field of knowledge production that emerged during the Trump years at the juncture of media, academia, and policy research: Big Disinfo. A kind of EPA for content, it seeks to expose the spread of various sorts of “toxicity” on social-media platforms, the downstream effects of this spread, and the platforms’ clumsy, dishonest, and half-hearted attempts to halt it. As an environmental cleanup project, it presumes a harm model of content consumption. Just as, say, smoking causes cancer, consuming bad information must cause changes in belief or behavior that are bad, by some standard. Otherwise, why care what people read and watch?
Are we really living in an infodemic? Science suggests a different story
Chico Q. Camargo for the University of Oxford:
However, all this hype hides a basic problem: only a few have actually made the effort to ask if the ‘infodemic’ concept makes much sense, and if its underlying claims are properly backed up by science. Unfortunately, there is a lot of research to suggest that neither is the case.
This article is based on Autopsy of a metaphor: The origins, use and blind spots of the ‘infodemic’, published in the journal New Media & Society.
It’s hard to be a moral person. Technology is making it harder
Sigal Samuel for Vox:
Multiple studies have suggested that digital technology is shortening our attention spans and making us more distracted. What if it’s also making us less empathetic, less prone to ethical action? What if it’s degrading our capacity for moral attention — the capacity to notice the morally salient features of a given situation so that we can respond appropriately?
‘A lot of people are sleepwalking into it’: the expert raising concerns over AI
Stephanie Wood for The Sydney Morning Herald:
“Data centres are basically the backbone of how AI and large-scale data processing work,” [Kate Crawford] says. We think of artificial intelligence as something floating above us, disembodied, suspended and without earthly costs or consequences; it is frequently represented in imagery as blue tunnels of suspended numbers, illuminated circuit boards or white robots, the stuff of science fiction. But Crawford believes that such imaginaries misdirect people from what is unfolding in the real world, the material world. AI is anything but immaterial; for its very existence, it relies on an earthly and unsustainable supply chain.
“The creation of contemporary AI systems depends on exploiting energy and mineral resources from the planet, cheap labour, and data at scale,” Crawford says. Alongside climate change, she believes AI is the most profound story of our time: omnipresent and pervasive and weighted with potential for exploitation and bias. It is one of the planet’s biggest political, cultural and social shifts for centuries. And, she thinks, “a lot of people are sleepwalking into it”.
Long read; good read.
Afghans scramble to delete digital history, evade biometrics
Rina Chandran for Thomson Reuters Foundation:
After years of a push to digitise databases in the country, and introduce digital identity cards and biometrics for voting, activists warn these technologies can be used to target and attack vulnerable groups.
What I read, watch and listen to…
I’m reading Casey Newton’s newsletter about what Twitter can learn from Tinder to solve its verification issue.
I’m watching computer scientist Hilary Mason explain machine learning in five levels of difficulty for WIRED:
I’m listening to Taylor Owen speaking with Jameel Jaffer for CIGI’s Big Tech on free speech in the digital era.
Chart of the week
Neal Rothschild reports on social media engagements for the IPCC climate report for Axios: