Twitter on Wednesday said it had introduced measures to prevent misinformation during elections in five states and the union territory of Puducherry. Kathleen Reen, Senior Director of Public Policy and Philanthropy, APAC, at Twitter, spoke to Neha Alawadhi via email about the issues the platform has been seeing in India.
Edited excerpts of the interview follow:
What are the top concerns you have heard from the Election Commission and State Election Commissions about Twitter’s role ahead of and during elections?
Every year is an election year on Twitter and, as was the case in the 2019 Lok Sabha election and previous Assembly elections, we continue to maintain proactive engagement and dialogue with the election authorities at the national and state levels. Over the last two years, our teams have held several workshops for Election Commission officers at the national and state levels to help them understand how to use our service to connect with the public during elections, how to report issues, and other escalation-related processes.
We maintain open lines of communication to help support democratic processes in India by improving the integrity and transparency of the electoral process.
Given the current frameworks, how much time would it take Twitter to take down politically sensitive content flagged by the appropriate channels or authorities?
If we receive a valid legal request from a government-authorized entity about potentially illegal content on Twitter, we review it under both the Twitter Rules and local laws. If the content violates the Twitter Rules, it is removed from the service. If it is determined to be illegal in a particular jurisdiction but not in violation of the Twitter Rules, we may withhold access to the content in that location only. We review every report as expeditiously as possible and take appropriate action while holding firm to our fundamental values and our commitment to protecting the public conversation.
What do you think about the new social media rules notified by the Indian government?
Twitter supports a forward-looking approach to regulation that protects the Open Internet, drives universal access, and promotes competition and innovation. We believe regulation is beneficial when it safeguards citizens’ fundamental rights and reinforces online freedoms. We are studying the updated Intermediary Guidelines and engaging with a range of organizations and entities impacted by them. We look forward to continued engagement with the Government of India and hope that a balance is struck across transparency, freedom of expression, and privacy.
Twitter’s pushback against the Indian government’s blocking orders came under a lot of discussion recently. Has the issue been resolved?
We are committed to our core values: building and protecting a safer, open public conversation. We continue to engage and work with the Government of India, and we value the critical open lines of communication and dialogue we have with them on these complex issues.
Besides our elections-related support to various government authorities, we work closely with many other government entities at the national and state levels. For example, we are working closely with the Ministry of Health and Family Welfare on COVID-19 vaccine initiatives such as Vaccine Varta, a weekly talk show hosted on Twitter that allows experts to answer questions about the COVID-19 vaccine.
The role of Big Tech and social media platforms has come under scrutiny in many jurisdictions, including India, over the past couple of years. How do you see Twitter’s role as a major technology company against the backdrop of these conversations?
In order to realise the full potential of the Internet as a global force for good, Twitter advocates for both regional and global regulatory alignment on the principle of preserving and protecting one global, open Internet.
Regulation should enshrine and protect the fundamental rights of everyone online and not privatise the critical role of the state in protecting those rights. We urge regulators to consider whether such measures entrench the dominance of a handful of existing companies or erect barriers to entry by setting impossible-to-meet compliance costs.
We also advocate for moving away from “leave up versus take down” approaches to content moderation, as we believe they rule out other promising alternatives that could better address the spread and impact of problematic content, while safeguarding rights and preserving smaller companies’ ability to compete.