
YouTube Content Ban Sparks Backlash From Free Speech Advocates

On Wednesday, YouTube changed its content policies and began blocking more of what it considers “borderline” hate speech. These new restrictions resulted in takedowns of non-defamatory accounts — like history and journalism channels — and a backlash from free speech advocates.

YouTube’s actions came after Vox reporter Carlos Maza asked the company in a viral Twitter thread last week to ban conservative commentator Steven Crowder for harassment, citing videos in which Crowder described Maza, who is a gay Cuban American, with a variety of racial and sexual slurs.

On Tuesday, YouTube told Maza that Crowder did not violate its community standards and policies regarding speech. But on Wednesday, YouTube pivoted.

YouTube demonetized Crowder’s channel but did not ban him outright. It also published a blog post, titled “Our ongoing work to tackle hate,” explaining its reasoning and newly tightened standards.

(Maza responded by saying demonetization isn’t enough, even though Crowder believes it will be devastating to his business. YouTube originally told Gizmodo that Crowder didn’t violate YouTube’s policies because he mocked Maza within the context of ideological debate and didn’t encourage his followers to harass and bully Maza — even though they did.)

“Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status,” the blog post says. “This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory.”

YouTube’s blog post discusses how the Alphabet-owned video platform will crack down more aggressively on “borderline” content “to reduce the spread of content that comes right up to the line.”

Many free speech advocates on the right and the left see this as a dangerous shift for YouTube because tech companies often don’t apply their speech and discrimination policies fairly or transparently. Free speech advocates also worry about the slippery slope of tech companies deciding which ideas are allowed on the internet and which are not.

“This is ridiculous. YouTube is not the Star Chamber — stop playing God & silencing those voices you disagree with. This will not end well,” Sen. Ted Cruz (R-Texas) tweeted.

Glenn Greenwald, the progressive reporter and free speech advocate who led reporting on U.K. and U.S. government surveillance following the Edward Snowden leaks, also criticized YouTube’s new direction. “Apparently, creating and implementing vague, arbitrary censorship standards on the fly in response to mob demands and then purging people en masse end up suppressing and punishing many voices that censorship advocates like,” he tweeted on Wednesday. “Who could have guessed this would happen?”

According to a June 2018 report from The Verge, YouTube restricts and demonetizes LGBT creators as well as anti-LGBT creators. The Electronic Frontier Foundation (EFF) finds that tech platforms are notoriously inconsistent in how they enforce content policies, and all too often they prioritize accounts and creators that drive clicks — on both sides of the political spectrum.

“Commercial content moderation practices negatively affect all kinds of people, especially people who already face marginalization,” the EFF says on its website. “We’ve seen everything from black women flagged for sharing their experiences of racism to sex educators whose content is deemed too risqué.”

The EFF sees tech companies’ inconsistent content policy enforcement as a threat to free speech and free expression on the internet, which is why on May 20, the think tank launched TOSsed Out, a platform to track and publicize the ways tech companies “unevenly enforce” content moderation policies with “little to no transparency.”

Independent journalist Ford Fischer, for example, said YouTube demonetized his entire channel, which documents different kinds of activism and extremism, because of Wednesday’s policy update.

“YouTube says ‘our team of policy specialists carefully looked over the videos you’ve uploaded to your channel News2Share. We found that a significant portion of your channel is not in line with our YouTube Partner Program policies.’ As you’ll see in the next tweet, that’s wrong,” Fischer tweeted Wednesday. “YouTube sent me only two specific videos that they’ve taken down. 1st is a video of @JasonRCharter and other #Antifa activists confronting a Holocaust denier. While it’s true that the Holocaust denier says Holocaust-denier-stuff, this is raw vid documenting him being shut down. The only other one flagged was raw video of a speech given by Mike Peinovich ‘Enoch.’ While unpleasant, this documentation is essential research for history.”

Tech platforms like Twitter, YouTube and others sometimes mistakenly pull down accounts for violating content policies, but when the owners of the accounts alert them to the error, they restore the accounts. But all too often, tech companies permanently remove or demonetize benign accounts as well as defamatory ones — sometimes at the direction of government officials.

The EFF argues that if tech platforms were transparent and held accountable for the posts and accounts they take down, there would be fewer mistakes and less censorship. The EFF supports the Santa Clara Principles, which “urge companies to:

  • publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;
  • provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and
  • enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.”

“People rely on internet platforms to share experiences and build communities, and not everyone has good alternatives to speak out or stay in touch when a tech company censors or bans them,” the EFF said. “Rules need to be clear, processes need to be transparent, and appeals need to be accessible.”

Follow Kate on Twitter

Millions of Europeans Sign Petition Opposing EU’s Updated Copyright Directive

Four million Europeans appear to have signed a petition launched by Save the Internet opposing the controversial “upload filter” conditions in the European Union’s updated Copyright Directive — pitting copyright concerns against fears of censorship and anti-competitive market conditions.

But there’s also concern that the petition-signers may not all be European.

In September, the EU updated its Copyright Directive to add stronger protections for content creators — like artists, musicians, writers and especially newspapers.

The Copyright Directive (once finalized) will force big tech platforms like Facebook, Google and YouTube to pay content creators for linking to or providing samples of their content, and it also mandates that platforms regularly scan for and remove any content that violates copyright — in other words, employ an “upload filter,” which many experts believe sets a precedent for censorship.

Silicon Valley vehemently opposed the changes to the directive, and experts have fallen on both sides of the issue: while it’s important that content creators are properly compensated (they’re often not fairly compensated in the current internet ecosystem), it’s also important to balance competition between tech platforms and not set a precedent for censorship.

The Electronic Frontier Foundation (EFF) — which supports Europeans’ grassroots effort to oppose the “upload filter” mandate — argues upload filters are “incredibly expensive to create” and would force out small to medium-sized tech platforms that don’t have the resources of Facebook and Google to comply.

“Filters don’t work, they cost a lot, they underblock, they overblock, they are ripe for abuse,” the EFF’s Special Advisor Cory Doctorow wrote in a Nov. 30 blog post.

Even YouTube’s own efforts to scan for and take down copyrighted material don’t work. Furthermore, Doctorow argues, the mandate creates liability for tech platforms, which undermines the idea of a free and open internet.

“What happens if you strip liability protections from the internet? It means that services are now legally responsible for everything on their site,” Doctorow wrote. “Consider a photo-sharing site where millions of photos are posted every hour. There are not enough lawyers — let alone copyright lawyers — let alone copyright lawyers who specialize in photography — alive today to review all those photos before they are permitted to appear online.”

Furthermore, the EFF argues the upload filter mandate will only concentrate market power among a few content-sharing platforms — like Facebook and Google — force out the smaller platforms and give Facebook and Google more leverage to pay creators less for their content.

Roslyn Layton, a visiting fellow at the American Enterprise Institute (AEI) specializing in tech policy issues, said it isn’t so simple. For one, the Copyright Directive attempts to streamline a patchwork of different European countries’ copyright laws and enforce copyright protections for creators — and the status quo only benefits the tech platforms.

Second, this is the only legislative effort that seeks to fairly compensate content creators — and the legislative language is purposefully vague to avoid censorship problems.

But more importantly, she said, she’s not sure how legitimate this petition is, since anyone can sign it and it may not really represent European interests.

Activist groups can ask anyone to sign the petition, and if you sign a petition on one issue, they’re likely to ping you to sign another one — for any country, no matter where you’re from.

“It’s like ‘oh well you like net neutrality, you’ll probably like this issue, please sign this,'” Layton said. “[For this issue] there’s no way to ensure they’re all Europeans.”

When a similar petition regarding net neutrality in the EU circulated in 2015, she said, one-third of those who signed it were not European.

“It’s part of a world movement called transnational activism, where you’re using social and digital tools to change the norm in different countries,” she said. “Social networks and computer networks are used to create political change. Ostensibly you want to have people who are voters or constituents within the particular country or region [signing the petition]. It’s the same thing with the Russian hacking. You use digital tools to create the appearance of a grassroots movement.”

Not only that, but those who sign online petitions usually represent limited demographics. The elderly and the poor, who may not have consistent online access and often do not prioritize tech policy issues above more pertinent issues like jobs and healthcare, are severely underrepresented.

Those who end up signing these petitions are “the people who are more likely to be online at any given time — usually men,” Layton said. “When you’re trying to have a democratic process, you want to make sure you have an accurate representation of the people. The rich, their voices are heard more than the poor.”


PragerU vs. YouTube Case Is About Choosing the Lesser of Two Evils

Conservative educational platform Prager University sued YouTube last year for “restricting” and “demonetizing” many of its educational videos, citing censorship of conservative ideas, but one think tank argues that siding with PragerU means you support government control of speech.

In an amicus brief filed Nov. 7 and a blog post published Nov. 26, the Electronic Frontier Foundation (EFF) — a leading think tank in digital privacy and free speech issues — sided with YouTube, arguing that YouTube’s flawed content moderation should not grant the government the ability to control what content is published on private platforms.

In March, a federal judge dismissed PragerU’s case, but PragerU has appealed and the case will now be heard by the Ninth Circuit Court of Appeals.

The PragerU case is uniquely difficult because it weighs free speech concerns on one side, and freedom of association issues on the other, which EFF discusses in its brief and blog post.

“YouTube’s actions here with respect to Prager University should be deeply scrutinized in the court of public opinion,” EFF states in the amicus brief. “But YouTube’s actions were constitutionally permissible.”

According to EFF, it is deeply troubling that YouTube has restricted so many of PragerU’s videos for seemingly no reason at all. In the brief, EFF calls out Facebook and Twitter for removing content — posted by conservatives and liberals alike — in ways that seem to contradict their own content policies.

Tech platforms’ decisions regarding content removal, EFF argues, are rife with inconsistencies.

EFF concedes that YouTube may, in fact, be censoring conservative ideas, but it fears that giving the federal government the power to force YouTube to restore the videos would be a slippery slope into fascism, and it believes both moderated and unmoderated internet platforms are necessary for a healthy, flourishing society.

“Given the long history of governments using their power to regulate speech to promote their own propaganda, manipulate the public discourse, and censor disfavored speech, we are very reluctant to hand the U.S. government a role in controlling the speech that appears on the Internet via private platforms,” wrote David Greene, EFF’s senior staff attorney and civil liberties director, in the blog post. “The First Amendment prevents the government from dictating content moderation rules and controlling what platforms can and can’t publish.”

Because YouTube is not a government entity, it should be free to moderate its content as it chooses. However, PragerU argued that because YouTube functions as a de facto public forum, the First Amendment should apply to it.

“The exception to that is the state actor doctrine, in which a private entity is functioning as the government,” Greene told InsideSources. “The issue here is the public function test, in which if you’re performing a function that’s typically a government function, then you are essentially treated as the government.”

Legally, though, YouTube needs to be performing a function that’s traditionally exclusively performed by the government in order to be considered a government entity — and it doesn’t.

“A court cannot import only one facet of the entire doctrine — the designated public forum — and leave the other facets behind,” EFF states in the amicus brief.

Greene told InsideSources that EFF sees both sides of the issue. Deciding which side to take, ultimately, is choosing the lesser of two evils.

“We believe content moderation by platforms is a big problem, they don’t do it well, they do it at the expense of human rights,” he said. “At the same time … We think it’s very dangerous when you are allowing courts to tell the platforms that they must publish something. That’s just a very dangerous power to hand to governments. That’s a power we’ve seen historically and internationally abused.”

In an effort to improve how tech platforms like YouTube, Facebook and Twitter moderate content, EFF supports the Santa Clara Principles — also supported by the American Civil Liberties Union (ACLU) of Northern California, the Center for Democracy and Technology and New America’s Open Technology Institute — which seek to bring more transparency and accountability to content moderation.

As the blog post concludes, “the answer to bad content moderation isn’t to empower the government to enforce moderation practices.”
