
Facebook can't win its moderation dilemma

Facebook is hard at work crafting its own Supreme Court of content moderation - even as it moves to make much of its content impossible for itself to review.

This week, Facebook hired the director of its oversight board and released material about the review process. Most cases are expected to take 90 days - which the company that gave us the motto "move fast and break things" surely knows is an eternity in the world of the internet.

Facebook has committed $130 million to fund the board so that it can handle the hardest calls the company faces about which content to remove from its site. For political content and misinformation, a board of elders could make a lot of sense, even though its remit wouldn't extend to encrypted messages (more on that later).

The company is aware that the public has run out of patience. "We know that the initial reaction to the oversight board and its members will basically be one of cynicism - because basically, the reaction to pretty well anything new that Facebook does is cynical," said Nick Clegg, its vice president of global affairs and communications. Sure enough, TechCrunch on Tuesday called the board "toothless."

I'll reserve judgment until we see whom Facebook selects as co-chairs. But not all of Facebook's problems are that hard to adjudicate. Just this week, BuzzFeed wrote about a woman who spent four years trying, without success, to get Facebook to remove an obscene page that used her name. The company became responsive only after BuzzFeed took up the cause.

To take a more extreme example, the New York Times reported last year that some 45 million images of child sexual abuse appeared online in 2018. The investigation pinpointed Facebook Messenger as a place where abusers swap images, including a video of a man sexually assaulting a 6-year-old that went viral in the messaging app.

These cases don't require an oversight board. (And they wouldn't get one: Messages would be exempt under Facebook's current proposal.) But there are other enforcement issues here. Facebook's WhatsApp product already has end-to-end encryption, meaning the company itself can't see the contents. And now, Facebook has plans to encrypt Messenger as well. Some of those Facebook cynics that Clegg knows are out there might even say the company plans to encrypt away its Messenger moderation conundrum.
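For a sense of why encryption forecloses review, here is a minimal sketch of end-to-end encryption in Python, using the PyNaCl library purely for illustration. Messenger and WhatsApp actually build on the Signal protocol, which layers key ratcheting and forward secrecy on top of this basic public-key exchange; the names and message below are invented for the example.

    from nacl.public import PrivateKey, Box

    # Each user generates a keypair; the private half never leaves the device.
    alice = PrivateKey.generate()
    bob = PrivateKey.generate()

    # Alice encrypts with her private key and Bob's public key.
    ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

    # The relaying server sees only the ciphertext. Holding neither private
    # key, it cannot recover the plaintext, and so it cannot moderate it.
    assert Box(bob, alice.public_key).decrypt(ciphertext) == b"meet at noon"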

Facebook's official explanation is that encryption increases privacy - a goal its critics say they share. But elected officials worry about the company's plan to encrypt Messenger perhaps even more than they care about privacy.

Republican Sen. Lindsey Graham, along with Democratic Sen. Richard Blumenthal, is working on a bill that takes aim at both encryption and content moderation. It would "require that companies work with law enforcement to identify, remove, report and preserve evidence related to child exploitation." While the bill doesn't specifically mention encryption, it would be hard for Facebook to cooperate if it couldn't read the underlying messages.

Under the bill, companies that don't cooperate with law enforcement would lose the protection of Section 230 of the Communications Decency Act - the law that shields them from much of the liability for content their users post. In Facebook's eyes, that shield is what makes content moderation tenable at all. Without it, the company could face civil lawsuits over libelous and other problematic content, and it would have a much greater incentive to pull down far more of what users post.

But Graham thinks Section 230 gives companies like Facebook too much freedom in how they run their platforms. He also doesn't like encryption. So if he doesn't get his way on encryption, at least he'll get his way on content moderation. Like Graham's idea or not, you have to admit it's clever.

 
