Facebook-parent Meta on Thursday said it will be activating its Elections Operations Center to watch out for potential abuses that could emerge across the platform related to state elections, as the social media giant outlined its firm resolve to curb hate speech, harmful content and misinformation on its platforms.
Meta, in a blog post, said it has been preparing for these elections in India and has a comprehensive strategy in place to keep people safe and encourage civic engagement.
The move assumes significance as big social media platforms have drawn flak in the past over hate speech, misinformation and fake news circulating on their platforms.
The government had notified IT rules last year to make digital intermediaries more accountable and responsible for content hosted on their platforms.
IT Minister Ashwini Vaishnaw has recently said social media needs to be made more accountable and that stricter rules in this regard could be brought in if there is political consensus on the matter.
Assembly elections in Uttar Pradesh, Uttarakhand, Punjab, Manipur and Goa will be held between February 10 and March 7 in seven phases, with the counting of votes on March 10.
In its blog post on Thursday, Meta underlined its commitment to combating misinformation, harmful content, voter suppression and fake news, while improving the transparency of political and social advertising.
Meta said it is launching a security 'megaphone' notification ahead of the elections to remind users to protect their accounts against online threats by enabling two-factor authentication.
This will be available in three Indian languages, including Hindi.
"We'll be activating our Elections Operations Center so we can monitor and respond to potential abuses that we see emerging related to these elections in real time," Meta said.
Facebook's parent company recently changed its name to Meta. Apps under Meta include Facebook, WhatsApp and Instagram.
Meta said it has a comprehensive strategy in place for these elections, which includes detecting and removing hate speech and content that incites violence, reducing the spread of misinformation, making political advertising more transparent, and partnering with election authorities to remove content that violates local law.
Meta acknowledged that it is well aware of how hate speech on its platforms can lead to offline harm.
The backdrop of elections makes it even more critical for the platform to detect potential hate speech and prevent it from spreading, it pointed out.
"This is an area that we've prioritised and will continue working to address comprehensively for these elections to help keep people safe," it pledged.
Meta said it has invested more than USD 13 billion in teams and technology.
"This has allowed us to triple the size of the global team working on safety and security to over 40,000 including 15,000 plus dedicated content reviewers across 70 languages," it said.
For India, Meta has reviewers in 20 Indian languages.
Under the existing Community Standards, the platform removes certain slurs that it determines to be hate speech.
"We are also updating our policies regularly to include additional risk areas. To complement that effort, we may deploy technology to identify new words and phrases associated with hate speech, and either remove posts with that language or reduce their distribution," it said.
Content that violates its policies against hate speech is removed, it said, adding that content which does not violate those policies but could still lead to offline harm if it became widespread is demoted so fewer people see it.
Claiming it has made significant progress on its efforts, Meta said the prevalence of hate speech on the platform is now down to just 0.03 per cent, although "there is always more work to be done".
Meta will also be offering Election Day reminders to give voters accurate information and encourage them to share the information with friends on Facebook.
Last December, it announced an expansion of its ads enforcement, extending the requirement of 'Paid for by' disclaimers from ads about elections or politics to ads about social issues as well.
"The enforcement will be applicable on ads that discuss, debate, or advocate for or against important topics," Meta said.
Ahead of all elections, Meta said it trains political parties on the responsible use of WhatsApp, and party workers are cautioned that their accounts could be banned if they send messages to people without prior user consent.
"We know that election periods are contentious and they can often be unpredictable. So while we head into these elections in India prepared and ready to meet the challenges we know will be present, we're also ready to adapt to changing circumstances and unforeseen events," it said.
Meta emphasised it will not hesitate to take additional steps if necessary "to protect this important exercise of democracy in India" and keep the platform and the Indian people safe before, during, and after the voting ends.
India is a large market for social media platforms. As per data cited by the government last year, India had 53 crore WhatsApp users, 44.8 crore YouTube users, 41 crore Facebook subscribers, 21 crore Instagram users, while 1.75 crore account holders were on microblogging platform Twitter.
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)