Twitter is testing a new feature to combat misinformation and fake news on its platform ahead of the 2020 presidential election.
Orange and red badges would appear underneath posts shared by politicians and public figures that are deemed ‘Harmfully Misleading’, according to NBC.
Fact-checkers and journalists verified on Twitter will be tasked with providing the correct information, which will be placed below the badge.
Twitter is also allowing users to participate in community reports, which let the public weigh in on whether a badged post is likely or unlikely to be harmfully misleading.
In 2018, it was found that 80 percent of the accounts used to spread false information during the 2016 presidential campaign were still active and were still publishing more than a million tweets just weeks before the US midterm elections.
However, the new badges appear set to roll out before the potential candidates start revving up their campaigns for the 2020 election.
Twitter confirmed the leaked demo to NBC and said it could roll out as a live feature on March 5th.
‘We’re exploring a number of ways to address misinformation and provide more context for tweets on Twitter,’ a Twitter spokesperson said. ‘Misinformation is a critical issue and we will be testing many different ways to address it.’
The demo shows the red and orange badges on three tweets deemed to be fake news.
The tweets include one about whistleblowers by House Minority Leader Kevin McCarthy, one about gun background checks by Senator Bernie Sanders, and one from an unverified Twitter account posting a doctored video of House Speaker Nancy Pelosi.
Twitter is also looking to the public for help with identifying misinformation with its new ‘Community Reports’.
The feature presents users with a survey that asks them to rate how likely or unlikely it is that the post in question is harmfully misleading.
Users are then asked to rate how likely other community members are to answer in the same way – using a scale from zero to 100.
The final question in the survey lets the participant elaborate on why they believe the tweet does or does not contain harmfully misleading material.
Users earn ‘points’ and a ‘community badge’ if they ‘contribute in good faith and act like a good neighbor’ and ‘provide critical context to help people understand information they see.’
‘Together, we act to help each other understand what’s happening in the world, and protect each other from those who would drive us apart,’ the demo reads.
Twitter explained to NBC that this feature is one of a few others that could go live over the next few weeks.