
YouTube bans some misleading or doctored political videos

YouTube said it will ban misleading or doctored videos that could affect elections, tightening its rules ahead of the 2020 U.S. presidential election.

The video-streaming site said in a blog post Monday that it will remove altered videos such as "deepfakes" and videos with patently false information, such as clips that report a living candidate is dead. It will also target videos that attempt to mislead the public about the voting or election process.

"We've increased our efforts to make YouTube a more reliable source for news and information, as well as an open platform for healthy political discourse," Leslie Miller, YouTube's vice president of government affairs and public policy, wrote in the blog post.

The move came just hours before voters in Iowa gathered to signal their support for presidential candidates in the Democratic field, the first tally of the 2020 primary season. Several primaries are scheduled for the coming weeks, including in strategically crucial states such as California and Virginia.

YouTube, a division of Google, has become a central advertising platform for candidates and a key source of election information for voters. But that prominence has also attracted bad actors who may willfully try to mislead voters with incorrect information about hot-button political issues, or with realistic-looking videos altered to give a false impression of a politician they oppose.

Last year, for instance, manipulated videos of House Speaker Nancy Pelosi (D-Calif.), slowed to make her appear to be drunkenly slurring her words, were spread widely on social media, including YouTube, which removed them.

The dissemination of manipulated videos, in particular, has been a complicated topic in Silicon Valley heading into the 2020 presidential election amid a deeply divided electorate and the advancement of digital tools that make altering content easier than ever. Tech companies have formed special fact-checking teams and employed technical experts to help identify misleading or manipulated content.

Still, false narratives can spread quickly and effectively. Right-wing activists took to Twitter over the weekend to push claims of voter fraud in Iowa, including allegations of voter-registration inaccuracies, that went viral on the social media site.

"It's a huge technical challenge to detect these kind of videos. We just don't have very effective algorithms for finding digitally altered videos at this scale," said Siwei Lyu, director of computer-vision lab at the State University of New York's University at Albany and a member of the Deepfake Detection Challenge's advisory group. "I applaud the effort, but we'll have to see how this can be implemented in an effective way." 

Facebook said last month that it would remove videos it determined had been digitally manipulated by technologies such as artificial intelligence in ways that average users would not easily spot, including attempts to make the subjects of videos say words they never did. But the policy did not appear to apply to doctored videos such as the Pelosi clip, which Facebook allowed to remain on the site. The social network has been criticized for allowing false content to stand as long as it is distributed by politicians or candidates, arguing that such content should be up to voters to judge.

Social media sites Pinterest and Twitter last week also announced policies to combat election misinformation. Twitter users will be able to more easily flag content that contains false information about the process of voting, while Pinterest said it will proactively pull down such posts.

YouTube said such misleading videos represent less than 1% of what is watched in the United States. But with well over 1 billion hours of content streamed globally each day, even 1% would work out to roughly 10 million hours a day, so such videos could still represent tens of millions of hours. Election watchdogs have argued that YouTube, Facebook and other social media sites were central repositories in the efforts to manipulate the 2016 U.S. presidential vote and other global elections.

Google parent Alphabet on Monday disclosed YouTube's advertising revenue for the first time since acquiring the site in 2006. While the $15 billion in annual sales does not account for the payouts the site makes to video creators, Wall Street analysts praised the company for its newfound transparency, which also extended to cloud-computing revenue.

Still, investors sent shares down nearly 5% in after-market trading, as Alphabet missed expectations on key metrics such as operating income, companywide revenue and advertising sales.

In December, YouTube said its efforts to combat conspiracy theories and other debunked claims on its site had led to a reduction in how much time viewers spent watching such content. But its claim of a 70% drop in the average time U.S. viewers spent watching "borderline" content, such as flat-Earth or medical-cure videos, was tempered by a lack of underlying data.

YouTube's ban also covers attempts to artificially inflate the number of likes, views or comments on videos, as well as channels that impersonate others or try to conceal their connection to a government actor, the company said.

YouTube said it promotes videos that it determines are authoritative on a subject, which should marginalize content that may be misleading or erroneous.

 
