
Meta Turns to Community Notes, Mirroring X

Social media companies are increasingly relying on fact-checks written by their users, allowing companies to step back from politically loaded decisions about what content to take down.

Elon Musk’s X, which stopped using employees to fact-check posts, relies heavily on its users to police its site for misinformation in a program called Community Notes. YouTube has also begun testing a similar feature, although it uses third-party evaluators to determine whether the corrective notes are helpful.

The decision to move away from strict rules about what is allowed on the sites, and from employing thousands of content moderators to police them, follows yearslong complaints from Republicans that social media companies effectively censored conservative voices. And despite the companies' moderation efforts, social media researchers still found myriad posts containing rule-breaking content.

X’s Community Notes began before Mr. Musk acquired the company in 2022. But Mr. Musk aggressively accelerated the program and largely did away with the fact-checking labels the company had once applied to misleading posts about hot-button issues like elections and the Covid-19 pandemic.

Mark Zuckerberg, Meta’s chief executive, nodded to X’s influence in his announcement. “We’re going to get rid of fact checkers and replace them with Community Notes similar to X, starting in the U.S.,” Mr. Zuckerberg said.

Mr. Musk, responding in a post on X on Tuesday, said, “This is cool.”

Community Notes allows users who participate in the program to write fact-checks for any post on X. The approach works for topics on which there is broad consensus, researchers have found. But users with differing political viewpoints have to agree on a fact-check before it is publicly appended to a post, which means that misleading posts about politically divisive subjects often go unchecked.

MediaWise, a media literacy program at the Poynter Institute, found in July that only about 6 percent of the drafted Community Notes on posts about immigration became public, and only 4 percent of drafted fact-checks on posts about abortion were published.

The program has also added fact-checking labels to X posts that turned out to be accurate. During hurricane season, for example, Community Notes participants incorrectly labeled storm forecasts as inaccurate.

Keith Coleman, a vice president of product at X who oversees the Community Notes program, said in a recent interview with Asterisk Magazine that social media users distrusted companies’ fact-checking.

“A lot of people just did not want a tech or media company deciding what was or was not misleading,” Mr. Coleman said. “So even if you could put labels on content, if people think it’s biased, they’re not likely to be very informed by it.”

Content Source: www.nytimes.com
