
Changing the status quo for social media companies in India

By Ambika Khanna
Feb 18, 2021 06:35 AM IST


The ministry of electronics and information technology (MeitY) is engaged in a tussle with Twitter over the government's directions to block certain accounts. While the legal framework empowers the government to act, the episode throws up larger questions about gaps in policy on content regulation, and about ambiguity even where policy exists. Globally, social media firms are protected by the "safe harbour" provision. This shields an intermediary, say Twitter or Google, from being penalised for harmful or unlawful content, provided the content was not created or modified by the platform, or the platform had no knowledge of such content posted by a user.

Europe is leading the effort to effectively regulate intermediaries. In 2020, building on its e-Commerce Directive, it introduced a comprehensive Digital Services Act for handling online content, liability of intermediaries and diligence requirements, and protection of the fundamental rights of individuals. (Getty Images)

The United States offers similar protection to internet companies through Section 230 of the Communications Decency Act. In Europe, the e-Commerce Directive of 2000 provides protection to internet intermediaries if they act only as a conduit and do not have knowledge of unlawful content. In recent years, the Indian judiciary has tried to clarify ambiguous provisions on the liability of intermediaries to take down unlawful content, while keeping in mind users' fundamental right to freedom of expression.


Europe is leading the effort to effectively regulate intermediaries. In 2020, building on its e-Commerce Directive, it introduced a comprehensive Digital Services Act covering the handling of online content, the liability of intermediaries and their diligence requirements, and the protection of the fundamental rights of individuals. Obligations on intermediaries include timely notification of law enforcement agencies about illegal content, content takedowns, transparency disclosures such as details of account suspensions and content removals, rules on digital advertising, the appointment of compliance officers, and annual audits.

Australia incorporated stricter rules after the Christchurch terrorist attack. The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act, 2019, requires social media platforms to remove abhorrent violent material and imposes a steep penalty for non-compliance: up to 10% of the company's annual turnover.

In India, MeitY proposed amendments in 2018 to the extant Intermediary Guidelines of 2011 to include the mandatory use of technology in content moderation and data disclosures to the government. These are still under review as the government seeks to align them with the pending Personal Data Protection Bill.

While existing provisions give the State enough room to act, a change in the status quo is urgently needed for more credible and effective interventions. This can only happen with the participation and deliberation of tech companies, civil society, academia and governments. Together, they can strike the necessary balance between controlling misinformation and unlawful content and protecting citizens' rights, including freedom of speech.

MeitY should consider following the guiding principles of transparency, accountability and grievance redressal. For transparency, each social media intermediary must disclose, in a timely manner, the process followed in moderating content, the technology applied, the categorisation of content as lawful or unlawful, and the taking down of content. For accountability, make the principle of "duty of care" central, i.e., hold intermediaries responsible by imposing positive obligations on them to prevent users from harming others. And for grievance redress and dispute resolution, set up an independent quasi-judicial body with provisions for following the due process of law.

Additionally, MeitY may consider emulating the European classification of intermediaries, which places social media platforms under a separate sub-category, "online platforms", with its own rules. Global rules on intermediary liability and content takedown are largely absent, and social media companies have been self-regulating. Here, the G20 Digital Economy Taskforce can play an important role. As internet giants have porous territorial boundaries, it can provide a neutral platform for sharing best practices to create global standards and guidelines for the liability of social media intermediaries.

Ambika Khanna is a senior researcher at the international law studies programme, Gateway House. The views expressed are personal.
