Faculty paper inaugurates FSU participation in global extremism project
Extremism and polarization have been much in the news in recent years as issues on the rise in the U.S. and abroad. Attention has focused on social media and news commentary as a breeding ground for extremist thought and activity, resulting in calls for moderating, regulating and limiting access to our most popular social platforms, particularly in the wake of the January 6 insurrection at the U.S. Capitol.
A recently published article by Professor of Sociology Deana Rohlinger, however, states an alternative position right up front in its title. In “We Cannot Just Moderate Extremism Away,” published in Items: Insights from the Social Sciences on March 9, Rohlinger cites her research to argue that “stronger moderation policies alone would fail to account for the many ways that users express political beliefs through online forums.”*
The article is the first to be published in the “Extremism Online” series launched by Items, a digital publication of the Social Science Research Council (SSRC). Rohlinger is one of 13 scholars throughout the world chosen by SSRC to participate in an ongoing project to expand the available academic scholarship on extremism, radicalization and related topics. She has been working with a team of FSU graduate and undergraduate students to systematically assess the characteristics of political expression online and whether moderation might affect how individuals express their political identities and views.
The first phase of the project examined thousands of posts in moderated comment sections accompanying news stories about Brett Kavanaugh’s U.S. Supreme Court nomination and the sexual assault accusations made against him in 2018. The team analyzed the comment sections in right-leaning outlets (FOX News, Breitbart, Daily Caller and Gateway Pundit), left-leaning outlets (MSNBC News, HuffPost, Daily Kos and Raw Story) and more mainstream outlets (USA Today, New York Times, Washington Post and Washington Times).
The study found that political polarization and extremism are not being moderated away.
“Moderation is nebulous territory, in part, because it involves censoring thoughts and ideas that are regarded as bad for a community,” Rohlinger writes. “The problem is that good and bad are not objective categories. Outlets and forum users negotiate their meanings, mutually constituting what is clearly acceptable and unacceptable within an online community.”
The situation is complicated further by user names and profile pictures that amplify an individual’s political expression. This is particularly true in right-leaning forums, according to Rohlinger, where users incorporate not only variants of “conservative” but more incendiary names like “libsrnazi” and images such as the so-called Betsy Ross flag, which has been associated with the extreme right.
Some forums, such as those maintained by Breitbart, claim to enforce community standards against commentary that is “false, misleading, libelous, slanderous, defamatory, obscene, abusive, hateful, or sexually-explicit.” But as Rohlinger points out, on an open forum dedicated to giving voice to a range of opinions, what constitutes some of those categories is not necessarily clear.
Rohlinger draws from her extensive research in recent years on political expression around such diverse issues as the Terri Schiavo right-to-die case in 2005 and the more recent debate over gun control following the shooting at Marjory Stoneman Douglas High School in Parkland, Florida, to illustrate the need for more critical thinking on individual agency and how it affects extremist expression on moderated forums.
“If researchers want to understand the full range of ways in which political polarization and extremism might be expressed online, we need to think more deeply about how moderation policies and practices create gray areas, as well as how individuals might exploit them in their political expression,” she says.
When moderating these forums fails to adequately address the apparent growth in extremist expression online, what can be done?
Rohlinger does not advocate abandoning efforts to improve moderation policies and practices, and she supports calls for algorithmic accountability to make automated review methods more transparent and hold platforms responsible for the online cultures they help to create.
She does, however, argue for additional direct interventions such as political bias training, an idea she pitched in a presentation at the Heroes in Public Safety conference in Tallahassee last December. Integrating such training into our workplaces, she argues, could be an important step in protecting against extremism and encouraging democratic participation.
Rohlinger’s work as part of the SSRC project is now focused on commentary around Amy Coney Barrett’s nomination to the Supreme Court. This semester, she and her student team – graduate students Pierce Dignam and Shawn Gaulden; undergrads Allison Bloomer, Alex Cubas, Alejandro Garcia, Jade Harris, Emily Ortiz and Lauren Torres – will finish coding the data they have been collecting. Rohlinger will then analyze the results and report them this summer.
“I will likely do another article this summer as I work through the analysis of both data sets and have a good grip on whether/how things have changed between 2018 and 2020,” Rohlinger says.
* Excerpts are quoted from Deana A. Rohlinger, “We Cannot Just Moderate Extremism Away,” in “Extremism Online,” series, Items: Insights from the Social Sciences, March 9, 2021. Reprinted with permission. For the full article, please visit this link.