The Google-owned video site was the last of the major social media networks to suspend Trump after the attack on the U.S. Capitol. It said it removed a video uploaded Tuesday for violating its policies and "in light of concerns about the ongoing potential for violence."
YouTube wouldn't confirm which video broke its rules, but a review of archived versions of its site suggests it was a clip from a news conference in which Trump told reporters that his comments to supporters before the Capitol attack were "totally appropriate."
In the same clip, which is available on C-SPAN, Trump said social media companies were making a "catastrophic mistake" and doing a "horrible thing for our country" by penalizing him.
The White House did not respond to a request for comment.
Last Thursday, Facebook said it would cut the president off indefinitely, and "for at least the next two weeks." Facebook COO Sheryl Sandberg later told Reuters that the company had no plans to reinstate the president's account. That same week, YouTube took down one video from the president's account. A day later, Twitter banned him.
YouTube's decision came after a weekend of criticism that the company hadn't acted strongly enough against the president. The newly formed Alphabet Workers Union, a collection of Google employees and contractors, put out a statement saying YouTube's actions in taking down just one video were "lackluster, demonstrating a continued policy of selective and insufficient enforcement of its guidelines."
YouTube has a three-strike process for deciding which channels to take down, which directly affects how quickly it moves. Facebook also has a strike system, but big, complex decisions often roll up directly to Sandberg and CEO Mark Zuckerberg. At Twitter, decisions are made by the company's policy team and signed off on by CEO Jack Dorsey.
YouTube's process can seem "frustratingly slow," but the company actually has more of a thought-out process than Twitter and Facebook do, said Jim Steyer, CEO and founder of Common Sense Media, which advocates for safer technology for children. The suspension of Trump is a positive first step, but the ban should be made permanent, he said.
"I think all the platforms missed this one," Steyer said, saying they should have acted earlier to crack down on misinformation Trump shared.
It's not the first time YouTube specifically has come under fire for moving too slowly.
In June 2019, gay rights activists and other progressive groups lambasted the company for not taking down videos by YouTuber Steven Crowder in which he used homophobic language against another popular YouTuber, journalist Carlos Maza. The company said Crowder's comments were "hurtful" but did not break its rules against promoting hatred. "Opinions can be deeply offensive, but if they don't violate our policies, they'll remain on our site," the company said in a statement at the time.
A day later, YouTube changed its mind, deciding to block Crowder's ability to make money from ads on his videos, but not taking them down completely.
YouTube differs from Facebook and Twitter in sharing advertising revenue with creators. In many cases, the company has chosen to simply turn off the flow of cash to videos called out as harmful. It has also tweaked its algorithm so that those videos don't get as much attention as others. But banning videos and creators altogether is much rarer.
That reluctance to remove videos completely has helped YouTube fly under the radar as other social media sites take the heat for allowing misinformation to proliferate, said Harvard Law School lecturer Evelyn Douek.
Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg have become familiar faces on Capitol Hill, where they were called to testify in front of Congress about tech's power and role in misinformation last year. Google CEO Sundar Pichai has testified as well, but YouTube CEO Susan Wojcicki has escaped the grilling.
YouTube is often mentioned as an afterthought, even as social media companies have been pushed into a harsher spotlight for the lies that spread on their sites.
Researchers tend to focus on text-based Twitter and Facebook, Douek said, because video can be more time-consuming and labor-intensive to sift through. That doesn't mean there is less misinformation floating around on YouTube; in fact, the company has been accused of allowing people to get radicalized on the site by promoting conspiracy theory videos, she added.
YouTube's approach of laying out rules and using a strike system to enforce them is better than ad hoc decision-making by executives, Douek said.
"My view of content moderation is companies should have really clear rules they set out in advance and stick to, regardless of political or public pressure," she said. "We don't just want these platforms to be operating as content cartels and moving in lockstep and doing what everyone else is doing."
On Wednesday, Google also said it wouldn't allow political ads until at least Jan. 21, the day after the inauguration of President-elect Joe Biden. The company paused political ads in the week after the November presidential election as well, following a policy Facebook had laid out earlier.
The strike against Trump's account means he can't add new videos for a minimum of seven days, YouTube said in a Twitter post late Tuesday. The company will also disable comments on his channel indefinitely. A second strike within the next three months would net Trump a two-week suspension, and a third strike would result in a ban, according to YouTube's policies.
Trump's YouTube channel is still visible, and past videos can still be viewed.
The suspensions from YouTube, Facebook and Twitter effectively cut the president off from his usual social media megaphones. Niche right-wing social media platform Parler, which was growing in popularity with conservatives, was knocked offline Monday after Amazon pulled its technical support.
(Amazon founder and chief executive Jeff Bezos owns The Washington Post.)
Facebook and YouTube could allow the president to access his account again as early as next week.
Trump's animosity toward tech companies has become particularly heated in the past year after Twitter and Facebook started labeling his posts. He has repeatedly called for Section 230, an Internet liability shield law, to be revoked, presumably to penalize the companies.
Section 230 shields tech companies from being sued over what their users post on those sites. Politicians on both sides of the aisle generally agree the law needs to be reformed, but open-Internet advocates say revoking it could have a chilling effect on free speech on the Web.
Published: January 14, 2021
By: The Washington Post · Gerrit De Vynck, Rachel Lerman