Facebook Moves to Stop Election Misinformation - The New York Times

SAN FRANCISCO — Facebook on Thursday moved to clamp down on any confusion about the November election on its service, rolling out a series of changes to limit voter misinformation and to prevent interference from President Trump and other politicians.

The social network, in one of its most sweeping sets of election actions, said it planned to bar any new political ads on its site in the week before Election Day. It said it would also strengthen measures against posts that try to dissuade people from voting. Postelection, Facebook said it would quash any candidates’ attempts at claiming false victories by redirecting users to accurate information on the results.

Facebook is bracing for what is set to be a highly contentious presidential election. With two months to go, President Trump and Joseph R. Biden Jr. have ratcheted up their attacks against each other, clashing over issues including the coronavirus pandemic and racial unrest. Even when the results are in, Mr. Trump has suggested that he may not accept them and has questioned the legitimacy of mail-in voting.

“This election is not going to be business as usual,” Mark Zuckerberg, Facebook’s chief executive, wrote in a post. He said he was concerned about the challenges that people could face when voting in a pandemic and how the count might take days or weeks to finalize, potentially leading to more unrest. As a result, he said, “we all have a responsibility to protect our democracy.”

Facebook has become a key battleground for both Mr. Trump’s and Mr. Biden’s campaigns. Mr. Trump’s campaign has run ads on the social network featuring false corruption accusations about Mr. Biden. Mr. Biden’s campaign has criticized Facebook for allowing lies, while also spending millions of dollars to buy ads on the service to appeal to voters.

The social network is striving to prevent itself from being misused and to keep either Republicans or Democrats from saying that it unduly influenced voters. In particular, the Silicon Valley company wants to avoid a repeat of 2016, when Russians used the service to sway the American electorate with divisive messaging that promoted Mr. Trump.

At the time, Mr. Zuckerberg shrugged off the idea that his social network had influenced the election and Mr. Trump’s victory. After evidence of Russian meddling through Facebook became overwhelming, Mr. Zuckerberg spent billions of dollars to secure the social network, hired thousands of employees to focus on security, and worked with intelligence agencies and other tech companies to guard against foreign meddling.

Even so, Facebook has continued to face criticism as domestic misinformation about this year’s election — including from Mr. Trump — has proliferated. Mr. Zuckerberg has declined to remove much of that false information, saying that Facebook supports free speech. Many of the company’s own employees have objected to that position.

On Tuesday, Facebook said the Kremlin-backed group that interfered in the 2016 presidential election, the Internet Research Agency, tried to meddle on its service again using fake accounts and a website set up to look like a left-wing news site. Facebook said it was warned by the Federal Bureau of Investigation about the Russian effort and removed the fake accounts and news site before they had gained much traction.

Thursday’s changes, which are a tacit acknowledgment by Facebook of how powerful its effect on public discourse can be, are unlikely to satisfy its critics. Some of its measures, such as the blocking of new political ads a week before Election Day, are temporary. Yet they demonstrate that Facebook has sweeping abilities to shut down untruthful ads should it choose to do so.

Facebook said it would begin barring politicians from placing new ads on Facebook and Instagram, the photo-sharing service it owns, starting on Oct. 27. Existing political ads will not be affected. Political candidates will still be able to adjust both the groups of people their existing ads are targeting and the amount of money they spend on those ads. They can resume running new political ads after Election Day, the company said.

In another change, Facebook said it would place what it calls its voting information center — a hub for finding accurate, up-to-date information on how to register to vote, and when and where to do so — at the top of its News Feed, which millions of people see daily, and keep it there through Election Day. The company rolled out the voting information center in June and has continued promoting it to users, with a goal of registering four million people and encouraging them to vote.

To curb misinformation about voting, Facebook said it would remove posts that tell people they will catch Covid-19 if they take part in voting. For posts that use the coronavirus to discourage people from voting in other, less obvious ways, the company said it would attach a label and a link to its voting information center.

Facebook also plans to remove posts that aim to disenfranchise people or keep them from voting, whether explicitly or implicitly; previously, the company removed only posts that actively discouraged people from voting. Now, a post that causes confusion about who is eligible to vote or about some part of the voting process, such as a misstatement about what documentation is needed to receive a ballot, will also be removed.

The company also said it would limit the number of people to whom users can forward messages in its Messenger app to no more than five, down from a previous limit of more than 150. The move mirrors one by WhatsApp, the messaging app also owned by Facebook, which in 2018 limited message forwarding to 20 people from a previous maximum of 250.

Misinformation spread through private communication channels is a much more difficult problem to tackle than misinformation on public social networks because it is hidden from view. Limiting message forwarding could slow that spread.

To provide accurate information on the election’s results, Facebook said it planned to partner with Reuters, the news organization, to supply verified results to the voting information center. If any candidate tries to declare victory falsely or preemptively, Facebook said, it would add a label to those posts directing users to the official results.

Mr. Zuckerberg has said publicly that Facebook exists to “give people a voice,” and that “voting is voice.” On Tuesday, he and his wife, Dr. Priscilla Chan, donated $300 million to support voting infrastructure and security efforts.

Facebook teams have worked for months to walk through different scenarios and contingency plans for handling the election. Over the past four years, the company has built an arsenal of tools and products to safeguard elections. It has also invited officials from government, think tanks and academia to participate in that planning.

In recent months, Facebook also turned more to postelection planning. Mr. Zuckerberg and some of his lieutenants had started holding daily meetings about minimizing how the platform could be used to dispute the election, people with knowledge of the company have said.

In his post on Thursday, Mr. Zuckerberg said the period after the election “could be a period of intense claims and counterclaims as the final results are counted.”

The chief executive was personally involved in the new election-related changes, according to two people familiar with the company, who declined to be identified because the details are confidential. He pushed the team working on the changes to come up with new ways to tamp down on misinformation and voter suppression, they said.

“It’s going to take a concerted effort by all of us — political parties and candidates, election authorities, the media and social networks, and ultimately voters as well — to live up to our responsibilities,” Mr. Zuckerberg said.
