TikTok Conspiracy Theories
My research team is investigating TikTok conspiracy videos, AKA “conspiracytok.” We view these videos as disinformation, but we also analyze them along three axes: power, identity presentation, and evidence. Our goal is to trace the relationship between epistemology and identity, specifically regarding the production of counterfactual knowledge.
I will be presenting this paper at AOIR in Dublin on an amazing Feminist Disinformation Studies panel! We have collected 200 videos and the top 20 comments on each video, and are using qualitative content analysis. I hope to submit this paper to a journal by the end of the semester.
My collaborators on this project are Courtlyn Pippert, Katie Furl, and Elaine Schnabel.

Social Science One – Ecosystems of Disinformation
In 2018, Deen Freelon, Daniel Kreiss, Shannon McGregor, Megan Squire, and I applied for a grant from the ill-fated Social Science One initiative, through which Facebook was to make large datasets available to researchers. Two years later, we got it, and four years later, we are still working on it. But I swear it is getting done! I have been running the qualitative team and have taken a general leadership role in getting the project finished (which clearly I am not succeeding at very well). Here’s the general summary of the current paper, which has involved a tremendous amount of quant and qual analysis and about eight people:
Using a mixed-methods, chronological, and relational approach, we examine how disinformation differentially spreads through the internet in a hybrid media system. We analyze three U.S.-based case studies of disinformation: one targeted to right-leaning audiences, one to left-leaning audiences, and one without a clear ideological orientation. The right-wing case concerns false allegations that high school student David Hogg, who was present during a mass shooting in Parkland, Florida, and subsequently became a gun rights activist, was a “crisis actor.” The left-wing case is the Steele Dossier’s ostensibly false allegations of Russian kompromat on President Trump (popularly known as the “pee tape”). The non-ideological case is a fabricated story about a man winning the lottery and dumping manure on his boss’s lawn. Using large Facebook and Twitter datasets as our starting point, we trace the origins and dynamics of disinformation, including the migration of disinformative narratives between media formats, sources, and platforms. While these are unique empirical cases, analytically they reveal dynamics that we can logically generalize to other cases and that can guide further research. First, we find that disinformation originates from mainstream media, social media, and so-called “fake news” websites alike, revealing that disinformation is not simply a problem of platforms. Regardless of origin, disinformative narratives reverberate across media formats, taking different forms as they circulate through the media ecosystem. Second, these cases reveal a complex relationship between disinformation and mainstream media, with news media covering disinformation in detail: sometimes seemingly to debunk it, sometimes to amplify it, and sometimes amplifying it even as they debunk it. This suggests that the “information quality” metric often used to distinguish legitimate news sites from disinformative sources fails to capture the complexity and pervasiveness of disinformation. Third, we find different dynamics at play in our right- and left-wing disinformation cases. While the right-wing case largely follows the common narrative of disinformation, in which fringe sites originate and amplify false content that then moves through mainstream media to broader audiences, in the left-wing case the mainstream media played a key role in furthering the “pee tape” narrative through humor, satire, and more traditional news and political coverage. Finally, we find that non-ideological disinformation has enormous uptake and is shared across political lines.
Future Proves Past: The QClock and Conspiratorial Literacy
My collaboration with Will Partin on QAnon continues with an in-depth discussion of this thing:
This, my friends, is the QClock, an instrument that QAnon researchers use to create relationships between Q drops, Trump’s tweets, and providential dates. We have finished this paper and it is off for review! Abstract:
Recent debates in STS consider the relationship between the symmetry principle, right-wing populism, and “post-truth” in Western societies. Our paper contributes a detailed case study of knowledge production in the QAnon conspiracy, which posits that Donald Trump and his allies are working to expose a global cabal of pedophiles. QAnon’s most distinctive feature is the online “drops” posted by purported Trump insider “Q,” which challenge readers to decode them. Adherents consider drops to be infallible and capable of predicting geopolitical events. We draw on non-participant observation to describe how QAnon researchers use an elaborate interpretive instrument, the QClock, to structure their research; produce the inerrancy of Q’s drops; and transform coincidences into conclusions, legitimizing QAnon overall. These practices constitute a conspiratorial literacy: the ability to sustain the existence of a covert scheme or event, demonstrated through mastery of textual interpretation, the creation of connections between apophenic phenomena, and the production of counterfactual knowledge. This combination of counterfactual conspiracy, knowledge-making community, and right-wing populism makes QAnon useful for assessing STS debates about post-truth. Our findings lend credence to the idea that while knowledge-making may indeed always be “otherwise,” it requires considerable effort, infrastructure, and validation structures to be taken up.
Mountains of Evidence: Processual Redpilling as a Sociotechnical Effect of Disinformation
Katie Furl and I have been coding a gigantic corpus (7 million words) of “redpilling accounts” pulled from Gab, Reddit, and Discord, and have finally submitted our first paper from the project. Here is the abstract:
How do people come to believe the far-right, extremist, and conspiratorial ideas they encounter online? This paper examines how participants in far-right online communities describe their adoption of “redpill” beliefs, as well as the role of disinformation in these accounts. Applying the sociotechnical theory of media effects, we conduct qualitative content analysis of “redpilling narratives” gathered from Reddit, Gab, and Discord. While many users frame redpilling as a moment of conversion, others portray it as a process, something achieved incrementally through years of community participation and “doing your own research.” In both cases, disinformation presented as evidence, and the capacity to determine the veracity of that evidence, play important roles in redpilling oneself and others. By framing their beliefs as the rational and logical results of fully considering a plethora of evidence, redpill adherents are able to justify holding and promoting otherwise indefensible prejudices. The community creation, promotion, and repetition of far-right disinformation, much of which is historical or “scientific” in nature, plays a crucial role in the adoption of far-right beliefs.