Facebook release of content guidelines a step in the right direction, Virginia Tech expert says
Facebook this week released, for the first time, its internal enforcement guidelines that determine what content can stay on the platform and what can’t, in a move the company said “will help people understand where we draw the line on nuanced issues.”
Virginia Tech expert Mike Horning said the publication of the guidelines appears to align with public preferences, based on data he has recently collected.
Horning and his colleagues recently conducted a study, set to be published, that asked Americans how they felt about government regulation of social media, an idea Congress had recently discussed in hearings with Facebook CEO Mark Zuckerberg.
“We found that the majority of our respondents were opposed to more government oversight even though they were increasingly concerned with the amount of misinformation that seems to be on social media,” said Horning, an assistant professor of multimedia journalism in the College of Liberal Arts and Human Sciences’ Department of Communication. “This may suggest that Americans may favor more self-regulation on the part of tech companies. In this sense Facebook’s move to impose clearer standards is a good first step in that direction.”
More from Horning on Facebook
“Facebook’s recent post on how it uses its guidelines to enforce community standards provides some additional insights into how the company tackles this difficult problem, but to a large degree Facebook’s application of those standards could still face scrutiny. Facebook has said that a team of people evaluate content and that it has mechanisms in place to ensure that standards are not applied in an arbitrary manner, but it is difficult to say whether the public trusts the platform enough to believe that this internal process will be truly fair.”
“Facebook may provide more transparency in the future by giving the public more insight into the ways that both their team and their algorithms function in the day-to-day to make choices about appropriate content. This process remains quite abstract, and I don’t think the public is still clear on how difficult choices about content are made.”
“Facebook will still likely face some increasing criticism by some political and advocacy groups for a failure to distinguish between hate speech and political speech. The policies as they currently exist define hate speech as a ‘direct attack’ on people based on protected characteristics. This policy is still relatively broad and may unintentionally censor a wide range of speech about contested social policies.”