129th Block: Pink slime and transparency
Read to the end to debate whether the UK can be called a failed state
This week…
I normally have a pretty sharp BS radar, so when I found out from journalist and epidemiologist Seema Yasmin’s Twitter post that a viral ‘emu girl’ turns out to have a racist past and is potentially starting an avian flu outbreak, I was shocked. I thought they were referring to Kingston-based ‘emu lady’ @useless_farm, whom ‘emu girl’ seems to have modelled her entire TikTok personality after (i.e. an otherwise adorable emu that keeps violently attacking the camera). They turn out to be two different people, and I am glad to report that my ‘emu lady’ remains unproblematic (and her farm animals continue to be useless, just as advertised).
In the process of catching up on this story, I realised that there are many more emus on social media than I would have guessed. I am pretty sure that if they were to wage a social media war, they would win, going by their Great Emu War track record (Wikipedia). Are they as trainable as military dolphins?
Anyway, on to more important things. Here’s a selection of top stories on my radar, a few personal recommendations, and the chart of the week.
‘Journalistic meat or fraudulent filler’ – What is pink slime journalism?
Pride David for Poynter:
Here’s some background: In the past decade, many local news sites have either gone out of business or are struggling to survive, and questionable sites like the West Cook News have replaced them. Many of the articles on pink slime sites are written by inexperienced writers.
The sites appear to be reliable, but in reality, they’re funded by outside companies that receive financing from a partisan source or one that has an interest in a certain type of coverage — or avoidance of other coverage.
The term “pink slime” was first used to describe a meat byproduct used as “filler,” and [media critic Margaret Sullivan] advises one solution: “Skeptical awareness.” She said readers need to figure out the difference between “journalistic meat and fraudulent filler.”
The hunt for Wikipedia’s disinformation moles
Masha Borak for Wired:
“We can see what’s happening on YouTube and Facebook and Twitter and Telegram, we can see how much effort states are putting into trying to control and maneuver in those spaces,” says Carl Miller, a research director at the CASM under UK public-policy think tank Demos. “There’s nothing to me that suggests that Wikipedia would be immune to as much effort and time and thought as in any of those other areas.”
Governments have good reasons to influence Wikipedia: 1.8 billion unique devices are used to visit Wikimedia Foundation sites each month, and its pages are regularly among the top results for Google searches. Rising distrust in institutions and mainstream media has made sources of reliable information all the more coveted.
“Because of its transparency and auditability, Wikipedia became one of the few places where you can actually build a sense of trust in information,” says Mathieu O’Neil, an associate professor of communication at the University of Canberra in Australia who studies Wikipedia. “Governments and states that want to promote a particularly strategic perspective have every reason to try and be there and kind of try and influence it.”
Demonstrating government intervention, however, has proved difficult, even as some cases have raised suspicion. In 2021, the Wikimedia Foundation banned an “unrecognised group” of seven Wikipedia users from mainland China and revoked administrator access and other privileges for 12 other users over doxing and threats to Hong Kong editors. Speculation of “pro-China infiltration,” however, was never proven.
Regardless, no other platform fact-checks with as much scrutiny and transparency as Wikipedia. Meta and Twitter and TikTok and YouTube can only dream.
All our 2021 corrections and clarifications
Canadaland has released its 2021 transparency report, listing all the corrections and clarifications it made last year with a clarity and succinctness rarely seen anywhere else. While reputable newsrooms do their best to get information right and unambiguous on the first go, sometimes things slip through. The rectification process is crucial to maintaining credibility. More of this, please.
What I’m reading, listening to, and watching…
I’m reading “Information Warfare and Wikipedia,” the report by the Institute for Strategic Dialogue and the Centre for the Analysis of Social Media mentioned in the second story above.
I’m listening to “Some of Us Are Brave” by Danielle Ponder.
I’m watching The Watcher by Ryan Murphy, based on the 2018 article in The Cut. I may have imagined this, but I read on Twitter that it is Get Out (2017) for white people. It could also have been a tweet describing a different show and not The Watcher. If there’s any chronically online person out there who may have stumbled upon the tweet, could you send it my way so I can attribute it to the right person? I cannot retrace my steps to find it, and I normally can, which is why I think it is imagined—because it also sounds like something I would have said.
Reviews, opinion pieces and other stray links:
Even school boards are now experiencing severe political polarisation by Sachin Maharaj, Stephanie Tuters, and Vidya Shah for The Conversation.
Experts grade Facebook, TikTok, Twitter, YouTube on readiness to handle midterm election misinformation by Dam Hee Kim, Anjana Susaria, and Scott Shackelford for The Conversation.
Behind TikTok’s boom: A legion of traumatised, $10-a-day content moderators by Niamh McIntyre, Rosie Bradbury, and Billy Perrigo for TBIJ.
The Synthetic Party is a Danish political party led by an AI by Chloe Xiang for Motherboard.
New Zealand passes plain language bill to jettison jargon by Tess McClure for The Guardian.
Chart of the week
Liz Truss lasted 44 days in office before announcing her resignation, becoming the UK’s shortest-serving post-war prime minister, as Martin Armstrong illustrates for Statista.