
Regulation, the Name of the Game

In order to avoid stringent regulatory action by governments across the globe, websites are making their community standards more elaborate and their AI-driven content moderation stricter

By Ashit Kumar Srivastava

Recently, Twitter India filed a writ petition in the Karnataka High Court challenging orders issued by the Ministry of Electronics and Information Technology (MeitY) for taking down content under the provisions of the Information Technology Act, 2000. The matter is still pending before the court and is therefore sub judice.

However, from a jurisprudential perspective, larger questions underlie these tussles. Incidents like these are not being raised for the first time, and this is certainly not a rookie skirmish between a private player and a deep state trying to over-regulate. Rather, it has turned into a seasoned and sophisticated war between two well-versed warriors, neither of whom is ready to surrender on the battlefield.

The seeds of the battle were first sown in 2016, when Facebook was questioned over its role in the US presidential election of that year, in what came to be known as the Facebook-Cambridge Analytica case. Six years after that incident, governments across the globe are trying to regulate the working of social media websites. These websites are being governed from diverse perspectives, one aspect being data protection. Social media websites are seen as predators that capture the personal data of their audiences and use it to micro-target them and manipulate their choices.

The other aspect is governing the content posted on social media websites, where the audience uses the neutral wall of the website to post content that is less than desirable, such as hate speech, obscene content, misinformation and disinformation. All such content has the capacity to distort the harmonious ambience of society. Websites are not held liable up to a point, as they are merely intermediaries. However, if the content continues to stay on their pages beyond the time period prescribed in the regulations, the websites are no longer regarded as intermediaries, but are held liable as if the content had been posted by them.

Regulatory frameworks of this kind across the globe have brought about sweeping changes within these websites and altered relations between governments and social media intermediaries.

In order to avoid the stringent regulatory measures of governments, these websites have been making their community standards more elaborate and extensive and their Artificial Intelligence-driven content moderation stricter. Interestingly, even after this, there have been several instances in which questions have been raised about the neutrality of content moderation on these websites, with allegations that certain kinds of content have been kept immune from moderation or that the moderation mechanism is inadequate to address all the issues.

In this mix, when government-endorsed rules for regulating social media content, with adequate penalties for breach, were framed in the form of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, several public interest litigations were filed across the country challenging their validity on the ground that they suffocate freedom of expression. It has turned into a sophisticated tussle, with one party showcasing itself as the harbinger of free speech and the government, on the other side, trying to regulate that speech.

However, with time it has become necessary to have greater standardisation of free speech on these platforms. There should also be proper streamlining so that the audience knows how free speech is regulated. Interestingly, the IT Rules, 2021, do this appropriately, as they bring a standard regulatory mechanism for free speech along with an adequate redressal mechanism. This does not mean they are the only mechanism for ensuring standardisation; rather, there can be self-regulation by the websites, leading to what we may call the growth of private regulation or private constitutionalism.

A good recent example is the Aotearoa New Zealand Code of Practice for Online Safety and Harms, a pact that tech giants in New Zealand have agreed to sign to curb harmful online content. How effective such pacts will prove to be is a different question.

Interestingly, in 2011, India too had Intermediary Guidelines prescribing the due diligence to be followed by a social media intermediary. This meant that a website needed to exercise due diligence to ensure that its platform did not carry harmful content. However, the Intermediary Guidelines of 2011, compared with those of 2021, were not stringent enough to create real pressure on the workings of these websites. India saw a plethora of obscene content and hate speech circulating on diverse platforms for close to a decade.

The Intermediary Guidelines, 2021, categorically provide a time period, running from the receipt of actual knowledge or a complaint, within which harmful content needs to be removed; this can be seen in Rule 3. In the absence of any action by the intermediary concerned (refer to Rules 3, 4 and 7 of the Intermediary Guidelines, 2021), the “safe harbour” protection is lost, meaning that the website cannot claim it is merely a neutral wall on which the content is being posted. Rather, it will be held liable as if the content had been posted by it.

Interestingly, this approach to intermediary liability is also followed in Germany, whose regulation is called the Network Enforcement Act, or NetzDG. Thus, regulatory frameworks across the globe are taking shape, whether in the form of government-endorsed regulations or self-regulation by private bodies. Yet, what is more important is streamlining these regulations to ensure that the masses are acquainted with them, thereby helping create a neutral social media ecosystem.

—The writer is Assistant Professor of Law, Dharmashastra National Law University, Jabalpur
