Congress questions YouTube, Facebook, and Twitter on what it takes to ban an account


During a Tuesday Congressional hearing, representatives from Facebook, Twitter, and YouTube emphasized that if a user on their platforms garners too many strikes (that is, violates too many of the platform’s rules), they’re out. But how many strikes a user gets, and whether a strike will be handed out at all, remains unclear to many users, including members of Congress.

Representatives from the three companies testified in front of the House Judiciary Committee to shed light on their “content filtering practices.” The hearing touched on a slew of the most talked-about controversies involving social media companies in the last year: Russian election interference, Cambridge Analytica, and perceived suppression of conservative voices. But one revealing exchange came when Rep. Ted Deutch (D-FL) asked Facebook’s and YouTube’s representatives specifically what they’re doing to prevent conspiracy theories from spreading on their platforms. Deutch’s district includes Parkland, Florida, the site of the mass shooting at Marjory Stoneman Douglas High School several months ago.

He specifically brought up the case of the far-right blog Infowars, which published a YouTube video accusing survivors of the shooting of being crisis actors. YouTube later took the video down — as YouTube’s global head of public policy and government relations Juniper Downs explained, if an individual or group is claiming that a “specific, well-documented violent attack didn’t happen and you use the name or image of survivors of the attack, that is a malicious attack and it violates our policy.”

YouTube’s community guidelines state that if a channel garners three strikes (three violations of the community guidelines within three months), the channel is terminated. When asked what YouTube was doing to stop the spread of conspiracy theories, Downs said that YouTube’s primary goal was to “demote low quality content” and provide “more authoritative content.”


Deutch also asked Facebook vice president of global policy management Monika Bickert how many strikes Facebook Groups, Pages, or Profiles get before they’re kicked off the platform. Bickert responded that Facebook does have a strikes policy, but that the “threshold varies depending on the severity of different types of violations,” and didn’t offer any further specifics.

Combined, the two answers are a useful distillation of the problems social media platforms, including Facebook and YouTube, face today in keeping harmful content at bay. Yes, social media companies are fond of talking about their community guidelines. But just how consistently the platforms enforce those guidelines, and what it takes for someone to get kicked off entirely, remain the bigger questions. When asked by Rep. Karen Bass (D-CA) what options Facebook offered for activists who feel their content has been unfairly taken down, Bickert said that Facebook had an appeals process. But Rep. Bass questioned how many users know about it.

Facebook was already on damage control duty earlier in the day: a documentary questioning just how consistently Facebook applies its community guidelines is set to air in the U.K. tonight. Details have already leaked; the documentary involved a reporter for U.K. broadcaster Channel 4 working undercover at an Ireland-based contractor that works with Facebook’s content moderation team. One moderator reportedly told the reporter not to take down the page of a far-right activist who had violated Facebook’s policies, because “they have a lot of followers, so they’re generating a lot of revenue for Facebook.” Facebook later published a blog post, attributed to Bickert, pushing back on the idea that the company believed turning a blind eye to bad content was necessary to generate more revenue.

Much of the hearing fell along partisan lines, as many Republicans used their time to ask the Facebook, YouTube, and Twitter representatives why content favorable to conservatives seemed to be censored, or why pages or accounts encouraging violence against conservatives were not. Rep. Matt Gaetz (R-FL) questioned why a Facebook page that appeared to encourage violence against conservatives was still up on the platform, while Rep. Steve King (R-IA) asked why the Facebook traffic of far-right blog Gateway Pundit appeared to have dropped over the past year.

Democrats such as the vice ranking member of the committee, Rep. Jamie Raskin (D-MD), accused the committee of arranging the hearing to “resume consideration of the totally imaginative narrative that social media companies are biased against conservatives.” The Judiciary Committee held a hearing on similar topics three months prior, but invited only a pair of conservative video bloggers known as Diamond and Silk, who complained after receiving a message from Facebook characterizing their page as “dangerous.” Facebook later apologized, saying the characterization was incorrect.
