(This brief note has been prepared by Neha Shetty and Shubham Singh. Neha is a Core Committee Member of PCLS and Shubham is a fourth-year student of HNLU. The meeting was held on 14 August 2020 over Skype.)

An internet intermediary is an entity that facilitates the use of the internet for a user. It acts as a medium or conduit for the transfer of information, ideas and opinions. Since it does not actively participate in generating content, it ideally should not be liable for the acts of the users who generate that content. However, this immunity (also known as safe harbour) is subject to certain conditions. To understand these conditions from different perspectives, PCLS held a discussion on 14 August 2020, as elaborated in the present note.

The Information Technology Act and Safe Harbour
There are two kinds of regulatory structures that deal with internet intermediaries: first, the horizontal approach, where a uniform regime regulates intermediaries; and second, the vertical approach, where each area of law (for example, copyright law or defamation law) deals with intermediaries under its own framework. India follows the horizontal approach, with the guiding principles for regulating intermediaries provided in the Information Technology Act, 2000 (‘Act’). While Section 2(1)(w) defines an intermediary, Section 79 grants it immunity or safe harbour subject to certain exceptions and responsibilities. An intermediary, under Section 2(1)(w), is an entity that, on behalf of another person, receives, stores or transmits an electronic record, or provides any service with respect to that record. The provision also enumerates certain kinds of intermediaries, such as search engines, online marketplaces and cyber cafes. The issue with this definition is its broadness: while other jurisdictions define distinct classes of intermediaries, the Act encapsulates them all generically under Section 2(1)(w), which means that Wikipedia is subject to the same protection and liability as Reliance Jio, even though the two occupy completely different positions in the intermediary framework. The safe harbour that an intermediary enjoys in India is provided under Section 79 of the Act. Section 79(1) states that an intermediary shall not be liable for any third-party information, data or communication link made available or hosted by it. The confusion arises from Section 79(2), which lays down three prerequisites to the immunity safeguarded in the first sub-section.
While the first and second enumerate conditions that essentially require the intermediary to act only as a facilitator of information and not actively participate in the process of publishing, the third requires the intermediary to observe due diligence while discharging its functions. Sections 79(2)(a) and 79(2)(b) are separated by an “or”, indicating an alternative requirement (either ‘a’ or ‘b’); no such division is observed with Section 79(2)(c), and whether due diligence is mandatory or optional is unclear. This was observed and discussed in Super Cassettes Industries Ltd. v. MySpace Inc., where the court said that due diligence is a mandatory requirement. However, this did not form part of the ratio decidendi of that judgment, and hence the position remains ambiguous. Furthermore, Section 79(3) withdraws the immunity where the intermediary has conspired in or abetted an unlawful act, or where, upon receiving actual knowledge or being notified by the government, it fails to remove or disable access to the material concerned. As discussed in the next section, the Act does not define what is ‘unlawful’ or what constitutes ‘actual knowledge’. Therefore, while internet intermediaries have immunity for information hosted by them, that immunity is not unfettered and is subject to vague restrictions.

The Information Technology [Intermediary Guidelines (Amendment) Rules]

Amid the uncertainty of the Act that governs the liability of intermediaries, it becomes pertinent to scrutinise the draft Information Technology [Intermediary Guidelines (Amendment) Rules] 2018 (‘Draft Rules’), framed under the Act, on constitutional as well as practical grounds. Rule 3, which enlists a series of due diligence obligations (an extension of Section 79(2)(c)) to be observed by an intermediary, is the centre of the debate here.
One of the primary objections to the Draft Rules emanates from Rule 3(2), which mandates an intermediary to advise its users to abstain from posting content that is ‘unlawful’, extending to ‘disparaging’ and ‘racially, ethnically or otherwise objectionable’ content, amongst other things. These terms and phrases are defined neither in the Draft Rules nor in the parent Act, which may lead to arbitrary and unauthorised actions against users or intermediaries by the government. At the same time, it is argued that these grounds can be read as an interpretation of Article 19(2) of the Constitution and hence are justified, to be evaluated on a case-to-case basis. Rule 3(4) requires an intermediary to publish, at least once a month, a notice that users are bound by its agreement (terms of use) and privacy policy, that non-compliance would lead to termination of the user’s access and usage rights, and that the intermediary may remove the non-compliant content as well. Although this move is meant to make users aware of the consequences of any possible act, it clearly lacks any direct communication, in the form of a notice or an appeal, that would allow the user to take cognisance of the intermediary’s acts. Further, the entities involved in the operation and usage of intermediaries are spread across various levels, which the Draft Rules do not address. Another vaguely drafted provision is Rule 3(5), which mandates the intermediary to procure and provide certain information within 72 hours of a lawful order issued by a government agency for protective or cyber-security matters connected therewith or incidental thereto. Not only do the meaning of ‘lawful order’ and the procedure for it remain unclear, but the domain of information required is also uncertain.
This, coupled with the 72-hour time period, which may be difficult to meet depending on the quantity of information and assistance required, may lead to arbitrariness on multiple fronts. Rule 3(7) imposes certain requirements on an intermediary that has more than fifty lakh users in India. These requirements are drawn from the Companies Act, 2013, which is contentious, as the Rule is subordinate legislation and may exceed the scope of Section 79 of the parent Act. In addition, a practical problem arises from the count of fifty lakh users: does it refer to fifty lakh currently active users, users active for a given period, or all users registered since the intermediary’s inception in India? The requirement also presupposes a kind of intermediary that keeps user data with itself, which is not the case for a lot of websites. This confusion is neither addressed in the Draft Rules nor resolvable from the parent Act. It can be concluded that the Draft Rules contain a plethora of blind spots that can be exploited by a government agency. These can be resolved by outlining the scope of the Draft Rules as well as of Section 79 of the Act. Furthermore, defining terms like ‘user’, ‘unlawful act’ and ‘information’, as discussed, would give intermediaries, users and the courts clarity on the course of action they should take.

Case Study: France and United States

France

In mid-May 2020, the French Parliament passed the Avia law, intended to regulate online hate speech. The Avia law brought in stringent provisions, forcing social networks to remove ‘hateful and manifestly illegal’ content within a span of 24 hours. Moreover, any content relating to ‘terrorism or child pornography’ was to be taken down within one hour of being flagged. Upon non-compliance, the intermediaries were liable to pay hefty fines.
This law was met with widespread criticism for its chilling effect on free speech through over-regulation by the government. The Constitutional Council struck down significant portions of the law for undermining free speech in an unnecessary and disproportionate manner, remarking that free discourse on social media is vital to a democratic society. The onerous obligations the Avia law imposed on intermediaries would lead to over-regulation, as they placed the onus on private entities to decide what constitutes legitimate speech within a short span of time.

United States

The First Amendment of the United States Constitution protects free speech from government interference. Provisions in the Communications Decency Act and the Digital Millennium Copyright Act deal with intermediary liability. Section 230 of the Communications Decency Act, 1996 broadly protects intermediaries from liability for user-generated content. Notable exceptions to this protection include claims under federal criminal law, intellectual property law, and sex trafficking and prostitution. Additionally, Section 512 of the Digital Millennium Copyright Act, 1998 creates a safe harbour for internet service providers against copyright infringement claims.

Implications for free speech: Article 19, censorship

Recognising the divergence of approaches to regulating online speech in jurisdictions all over the world, the members then discussed the implications that over-regulation of online speech has for the freedom of speech and expression. Complete elimination of safe harbour provisions has a chilling effect on free speech, as it may result in over-censorship of online content by intermediaries seeking to protect themselves from possible legal action. Moreover, passing the onus of regulating online speech entirely to the intermediaries raises the concern of whether private entities can be trusted to determine what constitutes ‘legitimate speech’.

Way forward: How to tackle problems of online hate speech?
The members discussed the United States’ Platform Accountability and Consumer Transparency (PACT) Act, mooted as an amendment to Section 230 of the Communications Decency Act. Under the amendment, an interactive computer service provider’s immunity under Section 230 is conditioned on its ‘knowledge’ of illegal content on its service and its failure to remove such content within 24 hours of gaining that knowledge. The standard of knowledge here requires a written notification identifying the illegal content, coupled with a government or judicial order declaring the content illegal. The PACT Act also seeks to introduce protocols requiring intermediaries to publish transparency reports on content moderation. The model adopted under the PACT Act appears, at first glance, to be a better standard for online content moderation than the one adopted in France. Over-regulation by governmental agencies and complete removal of the safe harbour available to intermediaries have dire effects on free speech. Therefore, any law on online content regulation should, at its heart, aim to protect the freedom of speech and expression.