The PCLS BLOG An initiative by the students of HNLU
(Nikunj Maheshwari and Kapil Shrivastava are 4th- and 3rd-year students, respectively, at the Institute of Law, Nirma University, Ahmedabad)
The COVID-19 pandemic has opened a Pandora's box: on the one hand, it has exposed loopholes in the systems and mechanisms created by humans, and on the other, it has proved the importance of virtual connectivity for the progress of humankind. With the incessant growth of online content-sharing platforms such as YouTube, Facebook, Instagram, and TikTok, to name a few, as well as of the number of users on these platforms, concerns pertaining to the content shared on them have also grown. One incident that has again brought this perennial concern into the limelight is a video created by a TikTok user that allegedly promoted acid attacks on women. When this video surfaced on the internet, the National Commission for Women lodged a complaint against the video's creator. While legal action was taken in this case, a plethora of other sordid videos promoting violence, sexual abuse, and homophobic content receive enormous numbers of views on a regular basis but escape the clutches of the authorities. This causes serious harm to young viewers and, as a consequence, fosters sinister thoughts among users.

Judicial opinion and the exigency for regulation

Multiple High Courts have opined on how objectionable videos and content on unregulated platforms like TikTok are harming society. The Madurai Bench of the Madras High Court, in 2019, issued an interim order in S Muthukumar v. Telecom Regulatory Authority of India and, taking an austere stance, banned the download of the TikTok application along with the circulation of videos created through it on other platforms. The court noted: "From the above arguments, this Court expresses a serious concern over the possibility of women and children of our country being sexually abused by video sharing and some predators are exploiting the innocent victims.
The learned Senior Counsel for the respondents 6 and 9 has agreed with the same and submitted that the Government should be keen in taking appropriate action in the larger public interest."

The Orissa High Court, commenting on the same issue in the recent judgment of Shibani Barik v. State of Orissa, noted that the "Tiktok Mobile App which often demonstrates a degrading culture and encourages pornography besides causing pedophiles and explicit disturbing content, is required to be properly regulated so as to save the teens from its negative impact. The appropriate Government has got the social responsibility to put some fair regulatory burden on those companies which are proliferating such applications." The High Court in the same judgment further commented that "sections 66E, 67 and 67A of the Information Technology Act, 2000 which prohibits and punishes publication and circulation of obscene or lascivious content are grossly insufficient" to tackle the situation.

As it is the duty of intermediaries to moderate content, some platforms have accordingly increased their use of automated flagging and human moderators to keep a check on the quality of content. Other platforms, like TikTok, have miserably failed to do the same. This has raised not only concerns but also a demand for a law or a set of regulations to manage this growing menace. It is not the case that no regulations have been formulated by the Indian government: in 2019, when the same issue arose before the Supreme Court, the Government responded that it would notify the Information Technology [Intermediaries Guidelines (Amendment)] Rules, 2018 (hereinafter, "the Rules") by January 2020, which has not yet been done.
In this post, the authors will argue the necessity of notifying such laws and will critique the proposed law in light of fundamental rights, taking into account multiple incidents, judicial precedents, and international jurisprudence.

Shredding the Fabric of Fundamental Rights

The primary concern raised by any law regulating online content is: what will be its implications for fundamental human rights? Global standards have recognised that the same rights that people have offline must also be protected online. By virtue of this, any law that does not comply with the fundamental rights guaranteed in the Constitution should be held unconstitutional. In this context, the protection of three fundamental rights emerges: first, the freedom of speech and expression under Article 19(1)(a); second, the right to privacy under Article 21; and lastly, the right to carry on business under Article 19(1)(g).
Regulation 3 of the Rules requires online platforms not to display or publish certain categories of information and, further, to use artificial intelligence to manage "unlawful information and content". Multiple words and phrases in this regulation, for instance "unlawful information or content", "otherwise unlawful in any manner whatever", "blasphemous", and "public order", are not defined. With no definitions or clarity provided, the artificial intelligence systems and human moderators run the risk of producing false negatives and false positives, and may end up deleting lawful content on the assumption that it is unlawful. Thus, because of the great arbitrariness, ambiguity and vagueness involved in the law, it has great potential to harm the freedom of speech and expression under Article 19(1)(a) and to be misused by the government to shun any kind of dissent.
The proposed law, however, appears to run contrary to the right to privacy. In the authors' opinion, the right to privacy includes privacy from the government. Clause 3 of the Rules obligates online platforms to compulsorily share information with the government whenever it is asked for. With no checks and balances provided, the present law is not "privacy neutral" and can be misused by the government to suppress the voices of whistle-blowers under the facade of regulating online content. It is therefore pertinent for the Government to introduce regulations that are more "privacy neutral" and actually attempt to solve the problem.
Thus, while framing regulations, the government will have to keep in mind that such harsh steps are reserved for exceptional circumstances only.

International view

France has recently enacted a law to curb offensive content. It directs online platforms to pull down content of a racist, sexist, or similar nature within 24 hours, failing which a fine will be levied. The Network Enforcement Act in Germany, on the other hand, provides a strict take-down timeline for hate speech, with some relaxation where necessary facts must be determined to assess the reliability of the information. The French law has been heavily criticised on the grounds that, first, 24 hours is an extremely short span of time in which to reach conclusions about "hate speech", an intrinsically circumstantial, fact-specific area of law, and second, placing the complete burden and liability on online platforms may push them to take down lawful content by citing hate speech. The German law, by contrast, attempts to take these challenges and perspectives into account before imposing any punishment on online platforms. India should attempt to follow the German model instead of falling prey to the French one, as it will help fix liability on the true offender and will also maintain a healthy market for online platform companies.

Conclusion

It is evident that social media can cause great harm, so much so as to provoke a person to commit suicide. Nonetheless, in a democratic country like India, regulations of any nature must be framed by balancing the fundamental rights of citizens against the cause at hand. Therefore, the authors suggest that the need of the hour is a law that appropriately balances all concerns, i.e., a combination of legal provisions and self-regulation (by apps and their users) coupled with accountability.
For this purpose, the legislature should borrow certain principles and practices from the German model and attempt to address the real issue of regulating content, instead of restricting freedom of speech under the facade of regulating online content.