Social media misinformation theory draws on classic tragedies, platform algorithms

A man with a broken ankle considers the wisdom of misinformation and social media.

Tales of people reading on social media about suspect, off-label uses of ivermectin to ward off COVID-19, ingesting the livestock dewormer and then suffering gastrointestinal distress might seem like the pinnacle of the 2021 zeitgeist. According to a new theory by two College of Business computer information systems faculty, it’s actually a tale as old as storytelling itself.

Drawing on literature’s roots in Greek and Shakespearean tragedy and on philosophical explorations of the nature of truth, Nick Roberts and Hamed Qahri-Saremi advance a theory that seeks to explain how misinformation on social media platforms can lead people to take real-world actions with disastrous consequences. In “Tragedy, Truth, and Technology: The 3T Theory of Social Media-Driven Misinformation,” the pair maps how social media users struggle to evaluate claims, drawing parallels to Shakespeare’s Othello and the Greek tragic tradition to illustrate how social media algorithms play roles similar to those of villains in literary tragedy.

“I have an undergrad degree in English, and so that’s where I was coming from,” Roberts said. “Hamed and I were talking about these crazy stories about fake news, and I said, ‘This sounds like a tragedy.’ These people, these crazy things don’t sound realistic. Maybe we could apply tragedy to this and go from there.”


Nick Roberts, Associate Professor of Computer Information Systems (left), and Hamed Qahri-Saremi, Assistant Professor of Computer Information Systems

“Tragedy, Truth, and Technology: The 3T Theory of Social Media-Driven Misinformation”
Nicholas Roberts, Hamed Qahri-Saremi
Journal of the Association for Information Systems

Their theory, which will be published in the Journal of the Association for Information Systems, opens new avenues of research and offers platform developers ideas that could curtail the real-world impacts of misinformation.

Doubt and the allure of fake news

The 3T Theory presents a roadmap for how social media users interact with claims made on social platforms, beginning with their first exposure. Initially, users evaluate a claim’s truthfulness by measuring it against facts they can observe themselves, by how well it fits their existing knowledge and worldview, or, often, both.

Overreliance on existing worldviews as a yardstick for truth can lead to confirmation bias, where users accept a claim as fact simply because it slots neatly into their ideas about the world. Alternatively, when users struggle to judge a claim as true or false, they enter what the researchers term the dialectic stage, where they attempt to determine its veracity.

Social media leverages cognitive dissonance and hijacks this dialectic stage, complicating users’ evaluation of a claim.

“We think that social media can have a larger influence on the users if it’s a false claim,” Qahri-Saremi said. “It can lead them toward a maladaptive or distorted decision-making phase.”

Algorithms and the repetition of misinformation

By design, social media algorithms strive to keep users engaged as long as possible, extending the amount of advertising they view. To do this, platforms serve users the content they’re most likely to interact with, drawing on a multitude of criteria: whether individuals clicked on, “liked” or commented on similar stories, how similar users reacted to posts on the same topic, and how a user’s network interacts with comparable content.
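As a rough illustration of that logic, a minimal engagement-optimizing ranker might look like the Python sketch below. The signal names and weights are invented for this example, not taken from the paper or from any platform’s actual code.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    topic: str
    text: str

@dataclass
class UserHistory:
    # topic -> how often this user clicked, "liked," or commented
    own_engagement: dict = field(default_factory=dict)
    # topic -> how often the user's network engaged with similar posts
    network_engagement: dict = field(default_factory=dict)

def engagement_score(post: Post, history: UserHistory) -> float:
    """Predicted engagement for one post. The ranker is content-agnostic:
    it rewards whatever the user and their network interacted with
    before, whether the underlying claims are true or false."""
    return (1.0 * history.own_engagement.get(post.topic, 0)
            + 0.5 * history.network_engagement.get(post.topic, 0))

def rank_feed(posts: list, history: UserHistory) -> list:
    # Posts most likely to hold the user's attention rise to the top.
    return sorted(posts, key=lambda p: engagement_score(p, history),
                  reverse=True)
```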

Although these algorithms fill users’ feeds with posts irrespective of their content, the 3T Theory posits that when users are evaluating a claim’s truthfulness, the algorithms mirror the behavior of villains in classic tragedies, continuing to feed users misinformation.

Two features common across platforms amplify the influence of misinformation among users. First, algorithms use engagement with a post to pull similar content into a feed: users who comment on or click flat-earth misinformation, for example, will see more of it in their feeds. Repetition of an untruth makes it seem more believable.

“When you see something over and over again, the repetition makes it look like a fact,” Qahri-Saremi said. “You see things on your feed from different sources or even the same sources. You have different friends sharing it again and again and again, which is really a unique characteristic of these algorithms, that essentially increases the power of those sort of false claims.”

Second, the social interactions baked into platforms further confound efforts to evaluate a claim’s truthfulness. Users who see that a large number of others have “liked” a post or commented on it positively are more likely to view it as credible.

Both aspects of social media reinforce misinformation, eventually leading a user to believe it.
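A short, self-contained sketch makes that loop concrete. The topic label, helper names and numbers below are hypothetical illustrations, not anything specified in the paper:

```python
def record_click(topic: str, own_engagement: dict) -> None:
    # Feature 1: engagement feeds back into the user's history, so the
    # next ranking pass surfaces more of the same topic. That repetition
    # is what makes a false claim start to feel familiar.
    own_engagement[topic] = own_engagement.get(topic, 0) + 1

def reaction_banner(likes: int, comments: int) -> str:
    # Feature 2: visible social proof. A large like/comment count reads
    # as credibility, independent of whether the claim is true.
    return f"{likes:,} likes, {comments:,} comments"

history: dict = {}
record_click("flat-earth", history)   # one click...
record_click("flat-earth", history)   # ...and another
print(history)                        # {'flat-earth': 2}: ranked higher next time
print(reaction_banner(12400, 980))    # 12,400 likes, 980 comments
```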

Real consequences of fake news

Once users accept a false claim as truth, it’s inevitable that some of them will act on that misbelief. Unsurprisingly, misinformed decisions often result in detrimental actions – think of the person who made themselves ill after mistakenly taking ivermectin. Roberts and Qahri-Saremi cite instances ranging from a man accused of murdering his children because he believed they carried “reptile DNA” to users overloading a human trafficking helpline with tips tied to a sex-ring hoax involving the online retailer Wayfair.

Roberts and Qahri-Saremi see parallels between the consequences of accepting a mistruth encountered on social media and the elements of classical tragedy, where characters come to act in detrimental ways, be it Othello murdering his wife over false claims or Romeo killing himself because of Juliet’s ruse.

“One thing that was fascinating to me was that the whole process does seem very similar to what you see in tragedy, that the person changes and then does some really crazy things that you cannot believe it,” Qahri-Saremi said.

Changing the tragedy narrative

There will always be false claims that lead people to detrimental decisions, but in the age of social media, Qahri-Saremi and Roberts’ theory offers platforms options to fight misinformation and limit its ability to alter users’ perceptions of reality. Strategies the pair suggests include displaying the number of dislikes on a piece of content, negating the social-norming influence a seemingly popular story can have on users, and limiting how widely algorithms share disputed claims.
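In code, such interface tweaks might look something like the hedged sketch below, which extends the toy ranking idea above; the dispute flags and penalty weight are assumptions invented for illustration, not mechanisms described in the paper:

```python
def moderated_score(topic: str, base_score: float,
                    disputed_topics: set, penalty: float = 0.1) -> float:
    # Limit algorithmic spread: content flagged as disputed (e.g., by
    # fact-checkers) keeps only a fraction of its engagement-based score.
    return base_score * penalty if topic in disputed_topics else base_score

def reaction_summary(likes: int, dislikes: int) -> str:
    # Show dislikes next to likes, blunting the one-sided social-proof
    # signal that makes a popular false claim look credible.
    return f"{likes:,} likes, {dislikes:,} dislikes"

print(moderated_score("flat-earth", 42.0, {"flat-earth"}))  # 4.2
print(reaction_summary(12400, 9300))  # 12,400 likes, 9,300 dislikes
```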

“It’s a long road and a long journey, but there are things social media platforms can do to tweak their interfaces in ways that can reduce the diffusion and effects of misinformation,” Roberts said.

The College of Business at Colorado State University is focused on using business to create a better world.

As an AACSB-accredited business school, the College is among the top five percent of business colleges worldwide, providing programs and career support services to more than 2,500 undergraduate and 1,300 graduate students. Faculty help students across our top-ranked on-campus and online programs develop the knowledge, skills and values to navigate a rapidly evolving business world and address global challenges with sustainable business solutions. Our students are known for their creativity, work ethic and resilience—resulting in an undergraduate job offer and placement rate of over 90% within 90 days of graduation.

The College’s highly ranked programs include its Online MBA, which has been recognized as the No. 1 program in Colorado for five years running by U.S. News & World Report and ranked No. 16 worldwide for employability by QS Quacquarelli Symonds. The College’s Impact MBA is also ranked by Corporate Knights as a Top 20 “Better World MBA” worldwide.