More than 100 nongovernmental organizations (NGOs) and government agencies
around the world help police YouTube for extremist content, ranging from
so-called hate speech to terrorist recruiting videos.
Many of them have confidentiality agreements barring Google, YouTube’s parent
company, from revealing their participation to the public, a Google
representative told The Daily Caller on Thursday.
A handful of groups, including the Anti-Defamation League and No Hate
Speech, a European organization focused on combatting intolerance, have
chosen to go public with their participation in the program, but the
vast majority have stayed hidden behind the confidentiality agreements.
Most groups in the program don’t want to be publicly associated with it,
according to the Google spokesperson, who spoke only on background.
The “Trusted Flaggers” program goes back to 2012, but it has
exploded in size in recent years amid a Google push to increase
regulation of the content on its platforms, which followed pressure from
advertisers. Fifty of the 113 program members joined in 2017 as
YouTube stepped up its content policing, YouTube public policy
director Juniper Downs told
a Senate committee on Wednesday.
The third-party groups work closely with YouTube’s employees to crack down
on extremist content in two ways, Downs said and a Google spokesperson
confirmed. First, they are equipped with digital tools allowing them to
mass flag content for review by YouTube personnel. Second, the partner
groups act as guides to YouTube’s content monitors and engineers who
design the algorithms policing YouTube but may lack the expertise needed
to tackle a given subject.
It’s not just terrorist videos that Google is censoring. Jordan B. Peterson,
a professor known for opposing political correctness, had one of his
videos blocked in 28 countries earlier this month. A note sent to
Peterson’s account said YouTube had “received a legal complaint” about
the video and decided to block it.
Peterson used his large social media following to push back, calling out YouTube
on Twitter, where he has more than 300,000 followers. YouTube reversed
Peterson’s block after another popular YouTuber, Ethan Klein, demanded
an explanation on Twitter, where he has more than 1 million followers.
Although the original notice said that YouTube was responding to a legal
complaint, on Twitter the company gave the impression that the block was
overwhelming majority of the content policing on Google and YouTube is
carried out by algorithms. The algorithms make for an easy rebuttal
against charges of political bias: it’s not us, it’s the algorithm. But
algorithms are designed by people. As noted above, Google’s anonymous
outside partners work closely with the internal experts designing the
algorithms. This close collaboration has upsides, Google’s
representatives say, pointing to advances in combatting terrorist
propaganda on the platform. But it also provides little transparency,
forcing users to take Google’s word that they’re being treated fairly.
Google’s partnership with outside organizations to combat extremist content is
just one part of the company’s efforts to prioritize certain kinds of
content over others. YouTube also suppresses certain content through its
“restricted” mode, which screens out videos not suitable for children or
containing “potentially mature” content, as well as by demonetizing
certain videos and channels, cutting off the financial stream to their creators.
Prager University, a conservative nonprofit that makes educational videos, sued
Google in October for both putting their content in restricted mode and
demonetizing it. Prager faces an uphill battle in court (as a private
company, Google isn’t bound by the First Amendment) but the lawsuit has
forced Google to take public positions on its censorship.
The Google representative who spoke with TheDC said that it is the
algorithms that are responsible for placing videos in restricted mode.
But in court documents reviewed by TheDC, Google’s lawyers argued
otherwise. “Decisions about which videos fall into that category are
often complicated and may involve difficult, subjective judgment calls,”
they argued in documents filed on Dec. 29.
In her testimony before the Senate committee on Wednesday, Downs described
some of the steps Google has taken to suppress “offensive” or
“inflammatory” content that falls short of actual violent extremism.
“Some borderline videos, such as those containing inflammatory religious or
supremacist content without a direct call to violence or a primary
purpose of inciting hatred, may not cross these lines for removal. But
we understand that these videos may be offensive to many and have
developed a new treatment for them,” she said.
“This borderline content will remain on YouTube behind an interstitial, won’t
be recommended, won’t be monetized, and won’t have key features
including comments, suggested videos, and likes. Initial uses have been
positive and have shown a substantial reduction in watch time of those
videos,” she added.
The demonetization push, which is affecting some of the most popular
non-leftist political channels, is meant to accommodate advertisers who
seek to avoid controversial content, the Google spokesperson said.
Dave Rubin, a popular YouTube host, has seen his videos repeatedly
demonetized. Rubin posted a video, “Socialism isn’t cool,” on Wednesday.
The video was up a little over 24 hours before YouTube demonetized it on Thursday.
It was later remonetized, a Google representative told TheDC. But
users can’t recoup the advertising dollars they lost while their videos
were erroneously demonetized.
“I suspect that there is some political bent to it but I don’t think it’s
necessarily a grand conspiracy against conservatives or anyone who’s not
a leftist. Part of the problem is their lack of transparency has created
a situation where none of us really know what’s going on,” Rubin told
TheDC. “Does it seem that it is more so affecting non-leftist channels?
Yeah, it does.”