

Where’s the Line? Managing Extreme Speech on Social Media

Focus groups reveal opportunities for social media platforms to better manage extreme speech, according to MU researcher

August 24th, 2017

Story Contact: Sheena Rice, 573-882-8353, ricesm@missouri.edu

COLUMBIA, Mo. – Extreme speech on social media—foul language, threats and overtly sexist and racist language—has been in the spotlight. While such language is not new, a recent increase in extreme and offensive posts on social media has led politicians, celebrities and pundits to call on social media platforms to do more to curb such speech, opening new debates about free speech in the digital age. Now, a new study from the University of Missouri School of Journalism shows that while people tend to dislike extreme speech on social media, there is less support for outright censorship. Instead, people believe sites need to do a better job of promoting healthy discourse online.

“Facebook and Twitter feeds have become the home to various breeds of questionable content,” said Brett Johnson, assistant professor of mass communication. “As public discourse continues to move to online discussions, we are seeing more debates about the limits of free expression in digital realms.”

Johnson used focus groups to examine individuals’ opinions about extreme speech on social media and whether such speech should be regulated. Focus group participants were regular social media users divided into race and gender groups—African-American females, African-American males, white females and white males. The groups were kept homogeneous to facilitate more open and honest conversation about extreme speech without alienating participants or creating conflict among them.

Most participants defined freedom of expression on social media sites in terms of the speech having a very clear purpose or being an expression of one’s individuality. When asked about freedom of expression on social media, participants indicated that while they tend to dislike extreme and offensive speech, they are not overwhelmingly willing to have social media sites censor it.

However, a prevailing belief was that social media platforms need to be transparent about how they manage content and should follow clear standards aimed at both promoting free speech and preventing harm to users. This was particularly true among female participants, who tended to value the overall health of public discourse and the protection of more vulnerable users. African-American female participants explicitly called for platforms to recognize a special duty to protect minority users.

“While the focus groups did not reveal an outright demand to censor extreme and offensive speech, we found a prevailing trend of participants calling for social networking sites to have clear and transparent policies related to permissible content,” Johnson said. “Sites should communicate their policies clearly to users and frame speech policies as a means of promoting healthy public discourse rather than pledging to keep users safe from harmful speech.”

Johnson hopes the findings from the focus groups will help social media companies better understand and manage problems posed by extreme speech on their platforms.

“Tolerating and managing extreme speech on social media” recently was accepted for publication in a special issue of the journal Internet Research devoted to the topic of “the dark side of social media.” The issue is expected to be released in late 2017.

Johnson’s research was funded in part by a $2,500 grant from the Association for Education in Journalism and Mass Communication for being named a 2016-17 AEJMC Emerging Scholar, and a $1,000 grant from the Mixed-Methods Interdisciplinary Graduate Group at the University of Minnesota. Johnson would like to thank the staff at the Information Experience Lab at the University of Missouri’s Allen Institute for facilitating the focus groups, including Neeley Current, Kenneth Haggerty and Shahzaade Cannon. He also would like to thank MU doctoral students Rachel Grant, Marina Hendricks and Yan Wu and graduate student Madeline McClain for their assistance in this research project.

--30--