Social media platforms are now the public squares of the world’s democracies. It is through them that political leaders speak to citizens, voters engage with one another, and narratives of one sort or another are built. In most democracies, therefore, there is an open discussion about how these platforms should conduct themselves. It has become clear that India, too, is overdue for such a discussion. While Facebook has said it enforces its content policies “without regard to anyone’s political position or party affiliation”, a recent report in The Wall Street Journal revealed that a senior Facebook executive in India had repeatedly intervened to protect posts by ruling-party politicians that might otherwise have fallen foul of the company’s internal prohibitions on hate speech. This is despite the fact that Facebook Chairman Mark Zuckerberg, in discussions with the company’s own employees, has cited instances of hate speech from India, in the context of the Delhi riots earlier this year, as examples of when Facebook should intervene to control expression on its platform.
It is natural that there are political biases in how social media platforms operate, and it would be difficult to eliminate them entirely. Yet it is clear that in India, at least, the protections that many of these platforms have promised other democracies are not being applied in full measure. Companies like Facebook will have to voluntarily choose to be more transparent and less arbitrary in how they manage political speech, especially in sensitive geographies like India. The Journal report suggested that a single employee could essentially derail whatever management and regulatory processes Facebook has set up internally. These processes, therefore, are not robust enough, and can clearly be subverted by those whose job is not to ensure the appropriateness of content on the platform but to keep the corporation in the good graces of the political powers that be. Facebook will have to make clear changes to its internal systems dealing with Indian content and publicise those changes, alongside ensuring accountability for the employee or employees who subverted them.
If platforms do not undertake such reforms, the cost to both themselves and the countries in which they operate will be great. Hate speech cannot be taken lightly. In countries struggling with internecine tension and even violence, those who incite further violence should not be given a platform. It should also be clear that these platforms’ continued sustainability in democracies depends upon their instituting such reforms. Their credibility as platforms and the security of their operations in democracies depend upon their being seen as neutral to the dispensation in power. Otherwise, they will be exposed to the sort of political risk that will blow back on their shareholders and owners.
Politicians might also be encouraged to demand a closer say in how social media platforms manage content, which would not just hinder these companies’ operations but also have significant consequences for freedom of speech more generally. Facebook in particular has sought to pacify such forces in the United States and Europe, where politicians have been incensed by its inability to control meddling in democratic processes such as elections. In India, its largest market by audience volume, it can do no less. Unless it makes amends publicly, it and other such platforms will find themselves supplanted in time.