While the outbreak of the Covid-19 virus has forced a good number of industries to count their losses, the social media sector and over-the-top (“OTT”) platforms have been an exception, witnessing a significant surge in usage. As content consumption increases, content production increases as well.
Thus, a regulatory check over operations and content was also required, to ensure that users and intermediaries have a code detailing their rights and duties pertaining to content on digital media. Accordingly, the Indian Government formulated the new Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules”), which to a certain extent mirror the Digital Services Act of the EU and the Online Safety Bill of Australia, both of which are under consideration.
On a cursory reading, the intention of the government appears clear, i.e., to restrict misinformation and to regulate online content. The uncertainty that still looms is whether the government has framed these rules so as to limit their scope to just these objectives. Grievances with respect to online content now have to be redressed within 15 days, in contrast to the previous span of a month. Moreover, under Rule 4 (8) of the IT Rules, if a Significant Social Media Intermediary (“SSMI”) takes down content posted by any person, sufficient reasoning for such action has to be provided, and the creator of the content has a right to dispute the action. This will ensure accountability on the part of the SSMI, and content takedowns for ulterior motives will face a major hurdle. However, various controversies have stirred up regarding the applicability and constitutionality of the IT Rules.
One glaring discrepancy is the first originator clause, which places an obligation on SSMIs to identify the ‘first originator’ of a particular message. The same can be enforced by an order from an authority under Section 69 of the Information Technology Act, 2000 (“IT Act”). The problem here is that tracing the first originator and end-to-end encryption cannot coexist, a point which has already been stressed upon in various pieces. Other issues include the chilling effect that the rules have apparently unleashed. The IT Rules thus come as restrictions on both the right to privacy and the freedom of speech.
On delving deeper into the text of the IT Rules, it appears that there is a host of other shortcomings which have not yet been highlighted. Elements such as overlapping provisions, wide discretionary powers, and evasion of liabilities by SSMIs are concerning, and need to be revisited. Through this piece, the authors provide a brief backdrop of the events which culminated in the making of the IT Rules, point out the lacunae in their provisions, and suggest viable solutions.
In 2019, the Delhi High Court, while hearing a Public Interest Litigation (“PIL”) which sought the framing of guidelines to regulate online platforms, issued a notice to the government enquiring about the same. The Ministry of Information and Broadcasting (“MIB”) and the Ministry of Information and Technology (“MIT”) responded that they did not possess any legal authority to regulate online content. Online platforms, taking note of these proceedings, formulated and released the Universal Self-Regulation Code for Online Curated Content Providers in 2020 (“Code”).
However, as there was a conflict of interest, this Code was summarily rejected by the MIB. Thereafter, another PIL was filed in 2020 seeking directions to the government to create a regulatory body for online streamed content. One of the pertinent points relied upon by the petitioners concerned the film Gunjan Saxena: The Kargil Girl, aired on Netflix. In its attempt to glorify an Indian Air Force (“IAF”) Flight Lieutenant, the film portrayed the IAF as a misogynistic organisation.
The IAF had to write a letter highlighting its concern over the display of demeaning, disrespectful, and false content. However, neither the Central Board of Film Certification nor the MIT had the power to regulate online streamed content. The Honourable Apex Court, upon realizing this lacuna in the law, directed the government to frame guidelines for the regulation of online streaming. In furtherance of the same, the Government of India (Allocation of Business) Rules, 1961 were amended to encompass what the IT Rules define as Social Media Intermediaries (“SMI”) and digital media.
Part III of the IT Rules contains a grievance redressal mechanism for content published by publishers of news and current affairs, and for publishers of online curated content. As per the IT Rules, an Interdepartmental Committee comprising representatives from various Ministries has to be set up for the removal of objectionable content. As per Rule 8 (3), the remedies available under Part III are in addition to those under the Information Technology (Procedure and Safeguards for Blocking of Access of Information by the Public) Rules, 2009 (“IT Safeguard Rules”). The latter Rules, in Rule 7, prescribe a procedure that includes setting up a Committee and reviewing requests for the removal of information.
Another concern is the provision for the emergency blocking of content, a mechanism already available under the IT Safeguard Rules, 2009. While the government’s desire for timely action is appreciated, giving users or viewers multiple procedures for a single cause of action appears to invite a heckler’s veto, wherein socially powerful groups can shut down critical or inconvenient speech by threatening public disorder or disturbance. The consequence is that the Government restricts the freedom of speech and expression of people outside such social groups.
Wide Discretionary Powers
Part II of the IT Rules provides a framework for the regulation of SMIs. Clause 3 (1) (b) (viii) lists ‘public order’ as a red flag, on the basis of which content on social media must be taken down by the intermediary. The next material clause which uses ‘public order’ as a parameter is clause 4 (2), which requires an SSMI to identify the ‘first originator’ of information that may disrupt public order.
According to the Apex Court, ‘public order’ has a wide connotation and thereby accords immense discretionary power to the government. In this context, the IT Rules confer significant powers on the government to take down, under the garb of ‘public order’, content that may be contrary to its political ambitions. Part III, which deals with digital media, provides a solution to this problem in clause 12 (5) (e).
The said clause reads that content can be taken down ‘for preventing incitement to the commission of a cognizable offence relating to public order’. In the event that this phrase is not enough to encompass miscreants who willingly and systematically spread fake news to disrupt public order, clause 3 (1) (b) (x) discourages any content which is false and is circulated with an intent to mislead and cause injury to any person. In this light, it is argued that the wide powers the government derives from the phrase ‘public order’ can be done away with. Doing so will not compromise the intent of maintaining public order, while at the same time not bestowing huge discretionary powers upon the government.
An unnecessary burden on the budgets of organizations is another concerning aspect that may have indirect consequences for their freedom of speech and expression. According to Rule 6, the Central Government can, through notification, mandate an intermediary or SMI to comply with the requirements meant for SSMIs. This results in a steep increase in the cost of management and administration. The method can be used by the Government to deter policy think tanks from criticizing its actions through research, by imposing obligations on them and cutting a hole in their budgets, since these organizations run primarily on crowdfunding and donations. The consequence is that the right to freedom of speech and expression of these organisations is curtailed to a certain degree.
Evasion of Liabilities by SSMIs
Rule 4 (7) of the IT Rules intends to check the spread of incorrect information and deepfakes, and casts an obligation on SSMIs to verify users through ‘any appropriate mechanism’. However, such an open-ended expression gives SSMIs a way to bypass verification in spirit. An SSMI acting in bad faith can misuse the provision by adopting shallow verification mechanisms and later try to absolve itself of liability, citing the absence of any standards of verification. Such a situation would defeat the purpose of the Rules and would also curb the freedom of speech and expression, as the right to receive information is inbuilt in that freedom.
A solution could be a change in the language of the provision: removing ‘any appropriate mechanism’ and adding parameters beyond phone numbers, such as photo ID, that can conveniently be used for verification. The IT Rules give SSMIs another leeway, i.e., the obligation to appoint a Chief Compliance Officer (“CCO”). The problematic part of the rule is that it extends personal liability to the CCO in case of a failure of compliance.
In the opinion of the authors, this creates an issue, as the SSMI can easily absolve itself of liability by pinning it on the CCO. A platform hosting content is not the responsibility of one person; deciding what content is compliant or non-compliant with the internal policy of the organization involves decisions at many levels. Holding a single person liable is therefore inappropriate. This must also be seen in the context of the fact that the IT Rules 2021 are already viewed in a negative light as creating a chilling effect. The issue can be resolved by mandating that the employment contracts of these persons contain indemnity insurance clauses for cases where prima facie negligence of the CCO is not made out, or by arranging for an independent enquiry by the Nodal Officer.
Dealing with a petition with respect to the regulation of OTT platforms, the Supreme Court observed that the IT Rules are formulated more in the shape of guidelines for such platforms, and do not provide an effective mechanism for ensuring compliance and penalising violations. Even when looked at from the most dispassionate viewpoint, the task at hand is a hornet’s nest.
Simply put, a middle ground needs to be drawn between two contrary yet cardinal components of a healthy democracy: ensuring the freedom of speech and expression, and tackling the mounting menace of fake news. Such is the nature of social media that it has spilled into an interesting tussle between two schools of thought. While, in the era of fake news and deepfakes, it is understandable that regulation of the content being consumed is justified, the question that remains to be answered is: how, and to what degree, can this regulation be done?
Toshaar Trivedi and Achyutam Bhatnagar, National Law University Odisha