In this update I want to highlight ten key articles added to the database in September:
- How three conspiracy theorists took ‘Q’ and sparked Qanon
- Blocking information on COVID-19 can fuel the spread of misinformation
- As wildfires rage, false antifa rumors spur pleas from police
- Reporting for Duty: How A Network of Pakistan-Based Accounts Leveraged Mass Reporting to Silence Critics
- Here are the QAnon supporters running for Congress in 2020
- Trump amplified a bogus QAnon misinformation campaign against the CDC that is now flooding social media
- News outlets give Trump the headlines he wants for vaporware drug executive order
- Facebook’s big QAnon crackdown might have come too late
- Unlicensed medical ‘cures’ are flourishing in closed Facebook groups, where cancer treatments — and even surgery — are sold beyond the reach of the law
- Of Virality and Viruses: The Anti-Vaccine Movement and Social Media
How three conspiracy theorists took ‘Q’ and sparked Qanon
Why it’s worth reading: This article provides a fascinating insight into the origin story of QAnon, and how it originally gained traction. It also looks at how three people established QAnon communities on different sites, and highlights evidence that they control the “Q” account.
Key quotes: “Before Q, there was a wide variety of “anon” 4chan posters all claiming to have special government access. In 2016, there was FBIAnon, a self-described “high-level analyst and strategist” offering intel about the 2016 investigation into the Clinton Foundation. Then came HLIAnon, an acronym for High Level Insider, who posted about various dubious conspiracies in riddles, including one that claimed Princess Diana had been killed because she found out about 9/11 “beforehand” and had “tried to stop it.” Then “CIAAnon” and “CIA Intern” took to the boards in early 2017, and last August one called WH Insider Anon offered a supposed preview that something that was “going to go down” regarding the DNC and leaks.”
“One archived livestream appears to show Rogers logging into the 8chan account of “Q.” The Patriots’ Soapbox feed quickly cuts out after the login attempt. “Sorry, leg cramp,” Rogers says, before the feed reappears seconds later.
Users in the associated chatroom begin to wonder if Rogers had accidentally revealed his identity as Q. “How did you post as Q?” one user wrote.
In another livestreamed video, Rogers begins to analyze a supposed “Q” post on his livestream program when his co-host points out that the post in question doesn’t actually appear on Q’s feed and was authored anonymously. Rogers’ explanation — that Q must have forgotten to sign in before posting — was criticized as extremely unlikely by people familiar with the message boards, as it would require knowledge of the posting to pick it out among hundreds of other anonymous ones.”
Blocking information on COVID-19 can fuel the spread of misinformation
Why it’s worth reading: This article highlights the negative consequences of being unable to trust government sources during an “infodemic”: the resulting information black hole is filled by mis- and disinformation.
Key quotes: “The epidemic began with a poignant example of potential life-saving information suppressed as a rumour. On 30 December, Li Wenliang, a young ophthalmologist in Wuhan, China, posted a message to colleagues that tried to call attention to a severe acute respiratory syndrome (SARS)-like illness that was brewing in his hospital. The Chinese government abruptly deleted the post, accusing Li of rumour-mongering. On 7 February, he died of COVID-19.”
As wildfires rage, false antifa rumors spur pleas from police
Why it’s worth reading: There’s a great analysis here of how the disinformation was amplified.
Key quotes: “Some users responding to Romero’s posts said they were sent there by Q and pushed false QAnon talking points that the fires were part of an elaborate political plot. According to data gleaned from the disinformation analytics tool Hoaxy, most of the traffic to Romero’s post came after Q’s post, almost a full day after Romero had posted it.”
““Rumors spread just like wildfire and now our 9-1-1 dispatchers and professional staff are being overrun with requests for information and inquiries on an UNTRUE rumor that 6 Antifa members have been arrested for setting fires in DOUGLAS COUNTY, OREGON,” the Douglas County Sheriff’s Office wrote in a Facebook post on Thursday.”
Reporting for Duty: How A Network of Pakistan-Based Accounts Leveraged Mass Reporting to Silence Critics
Why it’s worth reading: This report provides a deep dive into the operationalisation of co-ordinated mass reporting via Facebook Groups.
It also identifies a dilemma about whether this was genuinely inauthentic activity: some of the brigaded Pages were breaking Facebook’s ToS, so were rightfully reported. That said, the co-ordinated reporting appeared to have ulterior motives, so I think it’s worth including here.
Key quotes: “Group instructions specified that individuals should only use fake accounts when reporting, and that each member must have at least two fake accounts.”
“Administrators of the more exclusive private Groups would then push reporting requests to public Groups, and include instructions on how to do so quickly. It appears they often reported accounts for allegedly using hate speech or being a fake account.”
“A post from a suspended account to the Group “Pakistan of all of us” … “provides shortened links that take the user directly to Facebook’s reporting site. In this case, users are encouraged to report many different parts of the same account (e.g., profile photos, cover photos). The video shows users how to quickly open many URLs at once. The text says, in part, “please report this arrogant page as much as possible.” The targeted Page expresses pride in being an atheist.”
“It does not appear that ordinary individuals who helped to report accounts received any concrete benefits; they appear to have reported out of a feeling of patriotism and service to Pakistan, and service to Islam.”
Here are the QAnon supporters running for Congress in 2020
Why it’s worth reading: This is the article that made me feel most genuinely worried about the state of the world. It provides a detailed list of every QAnon supporter running for Congress, and explains why each has been identified as a QAnon supporter. It is regularly updated with new information, so it will be a great resource to refer back to.
Key quote: “According to NBC News, Cruz believes some of the “Q” posts are “valid information,” saying, “I think that the biggest thing with QAnon is there’s information coming out. And sometimes it is in line with what’s going on in government.” She also told NBC that she believes “there is someone out there putting information on the internet” as part of QAnon, adding that “a conspiracy theory only sounds crazy until it’s proven.””
Trump amplified a bogus QAnon misinformation campaign against the CDC that is now flooding social media
Why it’s worth reading: This article gave a great insight into the waterfall effect that happens when a public figure amplifies disinformation.
News outlets give Trump the headlines he wants for vaporware drug executive order
Why it’s worth reading: This article highlights how headlines in high-profile media outlets fail to accurately represent reality; this matters because many people read only the headlines. First Draft even cites its second category of information disorder (False Connection: when the headline doesn’t match the story) as prevalent in the mainstream media.
Key quotes: “[T]he editors of major news outlets have not incorporated that reality into their coverage. They still don’t seem to understand that the most important information about a Trump policy announcement is not whether his proposal is a good idea or whether it will be politically advantageous, but whether his administration is actually doing what he claimed it was doing. That makes them easy marks for Trump’s disinformation, which exploits their standard journalistic practice of writing up presidential claims as news.”
Facebook’s big QAnon crackdown might have come too late
Why it’s worth reading: This article includes a section comparing how Reddit and Facebook handled nascent QAnon communities, and the impact that had. Few articles focus on the “positioning” stage of the disinformation cycle, so this one was particularly interesting.
Key quotes: “Part of the appeal of QAnon is that many of its core messages are written in code, giving it the feel of an augmented reality game. But coded messages are harder to discern, particularly by policy teams that have not invested in unscrambling them. It’s easier to maintain the ironic detachment that defined our early reactions to the Tide Pods challenge — no one actually believes this stuff, right? — than it is to take any kind of preemptive action.
But in 2020 we know what happens when you let a movement fester. We have seen Reddit ignore its most racist forums until they spun out into thriving standalone communities. And we have now seen QAnon, which Facebook was recommending in its group recommendation algorithms until Tuesday, evolve into something like a new religion. Notably, Reddit banned QAnon forums starting in 2018 — long before it even banned hate speech — after the forums were found to be inciting violence.”
Unlicensed medical ‘cures’ are flourishing in closed Facebook groups, where cancer treatments — and even surgery — are sold beyond the reach of the law
Why it’s worth reading: An accessible, heart-wrenching tale of how closed Facebook groups can spread disinformation regarding healthcare that made me want to call my Mum and tell her it really is time to get off of Facebook.
Key quotes: “On secret Facebook groups, she said, people can be persuaded to turn to unproven treatments instead of proven ones.
“They are allowing people to turn their backs on proven treatments to go with quacks, and they are giving them a platform,” Dalmayne said of Facebook. Her conclusion: “They are allowing people to die.””
“Once someone manages to gain access to a private group like Jewell’s — accessible to search but requiring the permission of moderators to join — some are given access to totally secret groups, which are not publicly searchable.
Here, banned medicines are sold via instant messenger.
“Once you’ve been a member of the private group and they trust that you’re not a secret infiltrator or anything, that’s when they refer you and invite you into the secret group — you see people being brought in more into more closed communities,” she said.”
Of Virality and Viruses: The Anti-Vaccine Movement and Social Media
Why it’s worth reading: This accessible article provides insight into some of the history of anti-vaccine misinformation, and how it has developed with the advent of the internet and social media. It also helps explain how anti-vaccine proponents expertly promote their views across different platforms.
Key quotes: “Zero-cost publishing has been around since the early days of the internet; sites like Geocities and Blogger gave everyone the ability to write and post whatever they wanted online for free. This of course meant that there was misinformation online — including anti-vaccine misinformation. But in the decentralized days of the early blogosphere, that was largely irrelevant; it was very hard to find things, and they didn’t spread very easily. True believers who were looking for the content might find like-minded people, but things didn’t spread very easily.”
“The internet democratized flows of information, making it possible for anyone with a point of view to reach vast numbers of people relatively easily. Anti-vax narratives perform particularly well on social media, where algorithms reward emotionally-engaging personal anecdotes and sensational content — not dry scientific facts.”
“Social platforms and their gameable algorithms have provided a space for the anti-vaccine movement to thrive. Search functions and recommendation engines proactively surface anti-vaccine communities and content. Social networks have profoundly transformed communication, and the anti-vaccine movement is capitalizing on the new infrastructure of speech to amplify its growth and reach new audiences.”
“The anti-vaccine movement is well-funded and technically savvy. They followed the best practices of internet marketers, writing blogs and cross-promoting content and sharing material across all of the new platforms. Social network design choices meant that popularity determined what people saw; even nuanced policy issues began to be run as digital marketing campaigns.”
“Google and Facebook algorithms inadvertently create the illusion of fact and truth out of mere ubiquity; if you make it trend, you make it true.”