
May 24, 2019

Any regulation of online speech in India must safeguard the rights to free speech and privacy

Unlike the US, free speech in India is not absolute. Our Constitution, while guaranteeing the freedom of speech and expression, permits “reasonable restrictions” on this basic human right.

Before 2015, online and offline speech were treated differently under the law. Under Section 66A, an infamous provision of India’s Information Technology Act, 2000, anyone who posted material that was grossly offensive, inconvenient, injurious, menacing in character or insulting could be imprisoned for up to three years.

This draconian provision was struck down by India’s Supreme Court in 2015 for being violative of the constitutionally guaranteed right of free speech and expression, in the landmark case, Shreya Singhal vs Union of India.

Besides championing free speech in the online world, the Supreme Court, in Shreya Singhal, absolved content hosting platforms like search engines and social media websites of the obligation to constantly monitor their platforms for illegal content, enhancing existing safe-harbour protection (the legal protection given to internet companies for content posted by their users).

The court made it clear that only authorised government agencies and the judiciary could legitimately request internet platforms to take down content. As content hosting platforms are the gatekeepers of digital expression, this was a turning point in India’s online free speech regime.

Despite Shreya Singhal, state authorities continued to use Section 66A and other legal provisions to curb online speech. In 2017, a youth from the state of Uttar Pradesh was booked under Section 66A for criticising the state’s chief minister on Facebook.

Journalists are often targeted by state authorities for their comments on social media. In September 2018, a Delhi-based journalist was arrested for his tweets on sculptures at the Sun Temple in Konark, Odisha. Another journalist, from Manipur, was booked under the stringent National Security Act, 1980, and jailed for uploading a video to the internet in which he made remarks deemed “derogatory” towards the state’s chief minister.

Proposed amendment

In December, the Union Ministry of Electronics and Information Technology, the nodal ministry for regulating matters on information technology and the internet, released a draft amendment to guidelines under the Information Technology Act, which prescribe certain conditions for content hosting platforms to seek protection for third-party content.

The amendment, which was introduced to tackle the menace of “fake news” and reduce the flow of obscene and illegal content on social media, seeks to mandate the use of “automated filters” for content takedowns on internet platforms and requires them to trace the originator of information on their services (this traceability requirement is believed to be aimed at messaging apps like WhatsApp, Signal and Telegram).

Apart from takedowns ordered by state authorities, content sharing and social media companies take down content in accordance with their community standards and terms and conditions. This is often done arbitrarily and inconsistently.

In February, Twitter was heavily criticised for blocking journalist Barkha Dutt’s account after she posted personal details of people who were sending her rape threats and obscene pictures. While blocking her account, Twitter failed to take down the obscene content directed at Dutt.

Similarly, in March, Facebook blocked the account of prominent YouTuber and social media personality Dhruv Rathee after he shared excerpts from Adolf Hitler’s autobiography Mein Kampf on his Facebook page.

Threat to free speech

Our online speech is heavily dependent on policies (both government and industry-led) which affect digital platforms like Facebook, Twitter and YouTube. Recognising this fact, SFLC.in, in March, published a comprehensive report which captures the legal landscape in India and key international developments on content liability on internet platforms.

We believe that government regulation such as the draft amendment to the rules that regulate platform liability undermines free speech and privacy rights of Indians in the online world, while promoting private censorship by companies.

That said, the problems of circulation of illegal content, legitimate access for law enforcement and disinformation on the internet must be acknowledged. The law should mandate governance structures and grievance mechanisms on the part of intermediaries, enabling quick takedown of content determined to be illegal by the judiciary or appropriate government agencies.

The “filter bubble” effect, where users are shown similar content, results in readers not being exposed to opposing views, leaving them easy targets of disinformation.

The way forward

Content hosting platforms must maintain full transparency on political advertising, and law enforcement agencies should explore existing tools under law (such as Section 69 of the Information Technology Act, and agreements under the Clarifying Lawful Overseas Use of Data or CLOUD Act in the US) for access to information.

Tech companies must also rethink their internal policies to ensure that self-initiated content takedowns are not arbitrary and that users have a way to voice their concerns.

Government agencies should work with internet platforms to educate users in identifying disinformation to check its spread.

Lastly, the government should adhere to constitutionally mandated principles and conduct multi-stakeholder consultations before drafting internet policy to safeguard the varying interests of interested parties.

May 02, 2019

Google v. Visakha: Final Arguments

In 2009, Visakha Industries, a construction company involved in the manufacturing of asbestos cement sheets, filed a criminal defamation case against Ban Asbestos Network India (BANI), its coordinator and Google India. It alleged that the coordinator of BANI had written blog posts on a website owned by BANI, that contained scathing criticism of the company and therefore harmed its reputation in the market. Google India was also arraigned as a party in the litigation because the blog post was hosted on the blog publishing service of Google (Google Groups).

Google India moved the High Court of Andhra Pradesh for dismissal of the criminal charges against it on the ground that it enjoyed safe-harbour protection under Section 79 of the IT Act. It was contended that Google is neither the publisher nor the endorser of the information, and only provides a platform for the dissemination of information; it therefore cannot be held liable. The High Court refused to accept Google’s contention and dismissed the petition on the ground that Google failed to take appropriate action to remove the defamatory material in spite of receiving a take-down notice from the company. Aggrieved by the judgment of the High Court, Google filed an appeal in the Supreme Court in 2011, where the matter is currently pending.

On April 23, 2019, the matter came up for final arguments in the court of Justices Ashok Bhushan and K.M. Joseph. Senior Counsel Sajan Poovayya appeared on behalf of Google India and submitted that Google India is a subsidiary of Google US, a company incorporated under the laws of the United States of America. He asserted that Google India, by virtue of being a mere subsidiary, does not exercise any editorial control over content posted on Google Groups. It was further contended that the relief claimed by the complainants is misdirected and hence not maintainable, since Google India has been wrongly arraigned in this matter. Since Google US is the parent company of Google India, it is the intermediary in the present case, stated Mr. Poovayya. He mentioned that even if Google US is made a co-accused, it will be eligible for safe harbour under Section 79 of the Information Technology Act.

Mr. Poovayya pointed out that child pornography or content that is blatantly illegal is immediately pulled down from platforms, but whether particular content is defamatory cannot be determined by a private company; it has to be a judicial determination. It was highlighted that the terms of service are a document of unimpeachable character which clearly states that there is a contract between the content creator and Google Inc. The terms of service also lay down the types of content that cannot be uploaded on Google Groups. On country-specific domain names, Mr. Poovayya clarified that google.co.in is owned and operated by Google Inc and not Google India, and that the kind of content that can be posted on platforms differs from country to country, depending on the social and cultural context of that particular country.

Mr. Poovayya then emphasized that the present complaint is not against Google, but against technology itself, since the complainant has mentioned that Google provides a service that helps in the dissemination of information. He mentioned that it is not humanly possible to verify each blog post that is posted on the website, since the volume of content is phenomenal. Ms. Madhavi Divan, appearing for the Union of India, interjected and said that Google India cannot wash its hands of the matter by saying that it is a subsidiary and therefore has no control over the content. Someone has to be made responsible.

Mr. Poovayya went on to explain the intermediary liability regime in India, including the procedure for notice and takedown, which had been overhauled by the Shreya Singhal judgment. He asserted that giving adjudicatory powers to intermediaries is dangerous and will have a chilling effect on free speech. The law recognizes that the primary responsibility is on the originator of information, since she is the author, and not on the intermediary. In a free speech democracy like India, there cannot be control of content on the Internet, he emphasized. Justice Ashok Bhushan remarked that defamation is subjective: what may be defamatory for one might not be defamatory for another. At this point, Mr. Poovayya gave the example of flag burning, which is an offence in India but a symbolic act in the United States.

The hearing continued on May 1, 2019.

Mr. Sajan Poovayya, appearing on behalf of Google India, reiterated that removal of any content is in the hands of the parent company (Google Inc) and not Google India. He requested the bench to go through Section 2(1)(w) of the IT Act, which defines ‘intermediary’, and Section 81 (which gives the IT Act overriding effect). He then explained that an intermediary is the connector between the ‘originator’ and the ‘addressee’, as explained in the IT Act. Mr. Poovayya stated that in this case the pre-amendment Section 79 of the Act will apply, but that this makes no difference, because Google India is not an intermediary in the present case.

Mr. Poovayya mentioned that in common law, the intermediary is not the publisher of the information; the author and publisher converge in the electronic world. He pointed out that simply hosting content does not constitute ‘knowledge’. Justice K.M. Joseph enquired about the kind of functions Google India performs, to which Mr. Poovayya responded that Google India is involved in research and development of software. He remarked that Google India is the subsidiary of an intermediary, but to say Google India is an intermediary is wrong.

Mr. Poovayya then addressed the criminal complaint against Google India and asserted that just because Google’s technology is responsible for the dissemination of information, it cannot be made an accused in the present case. He read out Google Inc’s response to the defamation complaint, which said that Google Inc had asked for the exact message ID of the alleged defamatory content and other relevant information. Mr. Poovayya highlighted that the summons issued to Google India in Bangalore was outside the jurisdiction of the court; Google then moved the Andhra Pradesh High Court to quash the complaint. He maintained that Google Inc has no liability by virtue of being an intermediary. Explaining Shreya Singhal v. Union of India, Mr. Poovayya stated that ‘actual knowledge’ was read down to mean that a private takedown notice cannot be sent to the intermediary; it has to be either a court order or a government notification. He remarked that Google cannot sit in judgment to decide the legality of a private notice, and that the private notice and takedown system was therefore scrapped in Shreya Singhal. He cited the American case of Anderson v. New York Telephone Co., wherein it was held that a telephone company is a mere conduit and cannot be held responsible for the actions of a third party.

The counsel for Visakha Industries contended that the exemption given under Section 79 of the IT Act is subject to the fulfilment of certain due diligence conditions, and whether Google adhered to the Rules will be a matter of fact. He mentioned that Google India is very secretive about the kind of functions it performs and should come forward and tell the court what exactly its function in India is. Counsel for Visakha Industries highlighted that Google had every power to remove the content but did not, which suggests that it had consented to the publication of the defamatory content. He maintained that Google may have had a defence before Visakha Industries sent it a notice, but after receiving the notice, it was obligated to remove the content as it affected the complainant’s right to reputation. Concluding his arguments, he stated that Google is present everywhere and hence cannot claim the defence of geographical location; Google ads are local in nature, depending on the geographical location of the user.

Senior Counsel Madhavi Divan appeared for the Union of India. She submitted that the role of intermediaries has changed over time. Until two or three years ago, the view was that an intermediary is not the publisher of information; intermediaries were regarded as neutral highways. That has changed: they now curate content on the basis of user behaviour and therefore cannot be regarded as ‘neutral’, Ms. Divan remarked. She stated that governments all across the world are grappling with the issue of moderating illegal content on the Internet, and gave the example of the Christchurch shooting in New Zealand, where the perpetrator live-streamed the act on Facebook. Citing Shreya Singhal, Ms. Divan maintained that it cannot be left to the subjective judgment of intermediaries to determine the legality of particular content.

The matter has been reserved for judgment, and all parties have been requested to submit their written submissions within a week.

Apr 08, 2019

Madras High Court Bans Downloading of TikTok

On April 3, 2019, the Madurai Bench of the Madras High Court issued an order (the court order is attached below) prohibiting the downloading of the TikTok mobile app. The bench comprising Justices N Kirubakaran and SS Sundar has also restrained the media from telecasting videos made on the app and has asked the Central Government whether it will enact a statute similar to the United States’ Children’s Online Privacy Protection Act. The public interest litigation [Writ Petition (MD) no. 7855 of 2019] was filed by one S. Muthukumar against the Telecom Regulatory Authority of India, Ministry of Communications, District Collector (Madurai District), Commissioner of Police (Madurai) and the Business Head of TikTok.

Citing misuse of the app, the petitioner prayed (a copy of the petition is attached below) for the issuance of a writ of mandamus banning the TikTok mobile application. The petitioner highlighted widespread circulation of pornography, exposure of children to disturbing content and their susceptibility to paedophiles, degrading culture, social stigma and medical health issues among teens as reasons for filing the petition. The petition sought to direct the attention of the court to the Tamil Nadu State Information Technology Minister’s statement before the State Assembly, urging the Central Government to ban the TikTok app. To emphasize the menace created by various apps, the petitioner cited 159 deaths in India due to incidents involving selfies. The petitioner also mentioned the steps taken by countries such as Indonesia, which blocked the TikTok app for circulation of pornographic and blasphemous content, as well as the USD 5.7 million fine imposed on the app by the United States under its Children’s Online Privacy Protection Act (COPPA). The petition does not cite any legal provision and relies simply on conjecture in asking the court to ban the TikTok app.

A quick reading of the order indicates that the court has not relied on prevailing judicial or legal principles on free speech, censorship and intermediary liability in arriving at the said order. Based on one-sided averments, the court has taken recourse to social morality and endorsed the remarkably overbroad language of the petition which says that the app promotes “degrading culture and encourages pornography besides causing pedophiles and explicit disturbing content, social stigma and medical health issues between teens.” The bench goes on to assert that “nobody can be pranked or shocked or being made a subject of mockery by any third party and it would amount to violation of privacy”. The court further affirmed that addictive apps like TikTok spoil the future and the minds of youngsters.

TikTok, an online intermediary that provides a platform for users to create and share short videos, enjoys safe-harbour protection under Section 79 of the Information Technology Act, 2000 (Section 79 provides intermediary platforms like TikTok protection against third-party content on their platforms). The Madras High Court has completely disregarded the dictum of the Supreme Court in the case of Shreya Singhal v. Union of India, wherein the apex court had held that intermediaries enjoy a safe-harbour protection for third-party user generated content on their platforms. The court had also stated that intermediaries are neutral platforms that cannot judge the legitimacy of the content posted on their website(s).

The order violates the fundamental right of free speech as enshrined under Article 19(1)(a) of the Constitution of India. Several judicial pronouncements such as Romesh Thapar v. State of Madras[fn][1950] S.C.R 594 at 602[/fn], Bennett Coleman & Co. v. Union of India[fn][1973] 2 S.C.R 757 at 829[/fn], and Shreya Singhal v. Union of India[fn]AIR 2015 SC 1523[/fn] have held that freedom of speech and expression is the bedrock of democracy. The grounds for prohibiting the use of the TikTok app do not fall within the purview of Article 19(2) (which provides for constitutional restrictions on free speech), but instead seem to be based on a skewed sense of morality, which cannot be a guiding principle for constitutional interpretation. In the case of Navtej Singh Johar v. Union of India, the Supreme Court had laid down that constitutional morality trumps social morality and ‘it is the responsibility of all the three organs of the State to curb any propensity or proclivity of popular sentiment or majoritarianism.’

The gag order on the media is a classic example of judicial pre-censorship and is against the judgment of the Supreme Court on the issue. The position in common law, as espoused by William Blackstone, has been that ‘the liberty of the press is indeed essential to the nature of a free state; but this consists in laying no previous restraints upon publications.’ In 2017, a bench of Justices Khehar and D.Y. Chandrachud clarified that pre-broadcast or pre-publication regulation of content was not in the court’s domain.[fn]Common Cause v. Union of India[/fn] The court also said that the role of a court or a statutory authority will come in only after a complaint is levelled against a telecast or publication.

In the case of R. Rajagopal v. State of Tamil Nadu, the court had held that there is no law that authorizes prior restraint. Citing New York Times v. United States, the court observed that “any system of prior restraints of (freedom of) expression comes to this court bearing a heavy presumption against its constitutional validity.”

Though the Madras High Court has urged the Government of India to enact a statute like the United States’ COPPA for protecting the online privacy of children, it has ultimately banned further downloads of the application. The court’s order prohibiting the use of the app far exceeds a “proportionate regulatory response” to protect children’s data, as recommended by the Justice B.N. Srikrishna Committee.

TikTok has challenged the order of the Madras High Court before the Supreme Court of India.

The petition and a copy of the order are attached below:

Feb 18, 2019

Our Counter Comments to MeitY on the Draft Intermediaries Guidelines (Amendment) Rules, 2018

The Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018 (“the Draft Rules”), were issued by the Ministry of Electronics and Information Technology (“MeitY”) on the 24th of December, 2018. The Draft Rules seek to amend the existing ‘due diligence’ guidelines [The Information Technology (Intermediaries Guidelines) Rules, 2011] which are to be followed by ‘intermediaries’ [as per the Information Technology Act, 2000 (“IT Act”)]. Section 79 of the IT Act provides a safe harbour to intermediaries for “any third party information, data, or communication link made available or hosted by him”. Intermediaries are required to observe due diligence while discharging their duties under the IT Act and to observe guidelines laid down by the Central Government.

We had submitted detailed comments to MeitY on the Draft Rules on January 31, 2019, highlighting concerns such as the ‘one size fits all’ approach to regulation; the use of vague and ambiguous terms; violation of free speech and privacy rights; excessive delegation of legislative powers; and the lack of procedural safeguards. Our detailed comments can be found here.

Subsequently, MeitY uploaded the comments it received from various stakeholders in two batches; they can be found here and here.

The time period for submitting counter comments to MeitY concluded on February 14, 2019, and we submitted our detailed counter comments on the due date. Our counter comments as submitted to MeitY are as follows:

Feb 01, 2019

Our Comments to MeitY on the Draft Intermediaries Guidelines (Amendment) Rules, 2018

The Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018 (“the Draft Rules”), were issued by the Ministry of Electronics and Information Technology (“MeitY”) on the 24th of December, 2018. The Draft Rules seek to amend the existing ‘due diligence’ guidelines [The Information Technology (Intermediaries Guidelines) Rules, 2011] which are to be followed by ‘intermediaries’ [as per the Information Technology Act, 2000 (“IT Act”)]. Section 79 of the IT Act provides a safe harbour to intermediaries for “any third party information, data, or communication link made available or hosted by him”. Intermediaries are required to observe due diligence while discharging their duties under the IT Act and to observe guidelines laid down by the Central Government.

We have submitted detailed comments to MeitY on the Draft Rules, highlighting concerns such as the ‘one size fits all’ approach to regulation; the use of vague and ambiguous terms; violation of free speech and privacy rights; excessive delegation of legislative powers; and the lack of procedural safeguards. Our detailed comments are as follows:

Feb 01, 2019

Joint Letter to MeitY Addressing Key Issues with the Draft IL Rules, 2018

The Ministry of Electronics and Information Technology, Government of India (“MeitY”), released the Draft Intermediaries Guidelines (Amendment) Rules, 2018 on the 24th of December, 2018 (“the Draft Rules”). The Draft Rules seek to amend the existing Intermediaries Guidelines, which enlist certain conditions for online intermediaries to follow in order to qualify for the safe-harbour protection offered to them under Section 79 of the Information Technology Act, 2000. MeitY had requested relevant stakeholders to provide their comments/suggestions on the Draft Rules by the 31st of January, 2019.

We have submitted a joint letter to Shri Ajay Prakash Sawhney, Secretary, MeitY, highlighting key provisions of the Draft Rules which impact basic human rights such as free speech and privacy. The joint letter has been signed by various civil society organizations, free software associations, public-spirited academicians and professionals.

We have also submitted our detailed comments to the Ministry stating our concerns with the Draft Rules. Our detailed comments will be available on this website soon.

We endeavour to regularly consult with concerned government departments on proposals which adversely affect citizens’ digital rights.

We thank all the signatories for extending their support and standing up for digital rights.

A copy of the joint letter is as follows:

Jan 28, 2019

Countering Misinformation: Policies and Solutions, Kochi

SFLC.in is organizing “Countering Misinformation: Policies and Solutions”, a round table discussion to understand the phenomenon of misinformation and disinformation and discuss solutions to tackle the issue.

In recent times, the impact of misinformation and disinformation campaigns in inciting violence, mob lynchings and manipulating elections has reached unprecedented levels. The Internet has undoubtedly facilitated the spread of fake news due to its wide reach and the ease with which it allows information to spread.

This brings us to some important questions that we need to answer in order to engender a discourse to fight fake news. We will seek to answer these questions and brainstorm ideas to inform the policy debate around this issue.

We have published FAQs on Draft Amendment of Intermediary Guidelines Rules in India - https://sflc.in/faq-draft-amendment-intermediary-guidelines-rules-india

Date: January 30, 2019 (Wednesday)

Venue: Hotel Coral Isle, St. Benedict Road, Opp. North Railway Station, Kochi, Kerala

Time: 4.45 pm - 8.00 pm

Please find more details in the agenda below.

Jan 14, 2019

Countering Misinformation: Policies and Solutions, Mumbai

SFLC.in is organizing “Countering Misinformation: Policies and Solutions”, a roundtable discussion to understand the phenomenon of misinformation and disinformation and discuss solutions to tackle the issue.

In recent times, the impact of misinformation and disinformation campaigns in inciting violence, mob lynchings and manipulating elections has reached unprecedented levels. The Internet has undoubtedly facilitated the spread of fake news due to its wide reach and the ease with which it allows information to spread.

This brings us to some important questions that we need to answer in order to engender a discourse to fight fake news. We will seek to answer these questions and brainstorm ideas to inform the policy debate around this issue.

Date: January 16, 2019 (Wednesday)

Venue: Office of MouthShut.com - 7, Pali Village, Bandra (West), Mumbai

Time: 1.00 pm - 5.30 pm

Please find more details in the agenda below.

Jan 14, 2019

Countering Misinformation: Policies and Solutions, Bangalore

SFLC.in is organizing “Countering Misinformation: Policies and Solutions”, a round table discussion to understand the phenomenon of misinformation and disinformation and discuss solutions to tackle the issue.

In recent times, the impact of misinformation and disinformation campaigns in inciting violence, mob lynchings and manipulating elections has reached unprecedented levels. The Internet has undoubtedly facilitated the spread of fake news due to its wide reach and the ease with which it allows information to spread.

This brings us to some important questions that we need to answer in order to engender a discourse to fight fake news. We will seek to answer these questions and brainstorm ideas to inform the policy debate around this issue.

Date: January 15, 2019 (Tuesday)

Venue: Hotel Royal Orchid, Golf Avenue, Airport Road, Bangalore

Time: 1.00 pm - 5.30 pm

Please find more details in the agenda below.