
All Posts | Sep 26, 2019

CJEU rules that search engines cannot be asked to de-list information globally under EU right to be forgotten requests

On September 24, 2019, Europe’s top court, the Court of Justice of the European Union (CJEU), held that search engine operators that have been asked to de-reference links under the right to be forgotten (following an order from a supervisory or judicial authority of an EU member state) are not obliged to carry out the de-referencing on all (global) versions of their services. SFLC.in had intervened in this matter along with other civil society organisations from around the world.

The question was referred to the CJEU by the Conseil d’État in France, after Google refused to de-reference information from all versions of its service (i.e. jurisdictions beyond the EU) as ordered by the French data protection authority, the CNIL (Commission Nationale de l'Informatique et des Libertés). The right to be forgotten (RTBF) is a privacy right enjoyed by the citizens of EU member states, previously under the EU’s personal data protection directive (Directive 95/46/EC), which has now been replaced by the General Data Protection Regulation [Regulation (EU) 2016/679] (GDPR). Since the reference was made to the CJEU before the GDPR came into force, the court examined the law under both the personal data protection directive and the GDPR.

In its reasoning, the CJEU noted that numerous countries around the world either do not recognise the RTBF (including the right to de-referencing) or take different approaches to it. Referring to the GDPR, the court stated that the protection of personal data is not an absolute right and must be balanced against competing rights, such as Internet users’ freedom of information, in accordance with the principle of proportionality. While such a balance has been struck within the EU, no balance has been struck for de-referencing that would apply outside the EU. The court also opined that the balance between these competing rights is likely to vary significantly around the world.

The CJEU clarified that EU law requires search engine operators to carry out de-referencing on all versions of their services that are accessible from EU member states. The court further clarified that search engines are required to implement measures that effectively prevent or seriously discourage Internet users within the EU from accessing the de-referenced links through non-EU versions of the service.

The top court also held that global de-referencing in RTBF matters is not specifically prohibited under EU law. Data protection authorities of EU member states have the power to ascertain whether de-referencing is required globally, after balancing the data subject’s right to privacy on the one hand and the right to freedom of information on the other.

This ruling from the CJEU is a great victory for freedom of speech on the Internet for all users. If nation states start requiring search engines like Google to de-list links from their global versions, the Internet will, in practice, come to be governed by the countries with the most regressive laws on online free speech. The RTBF has often been criticised for enabling bad actors to take down listings from popular search engines, which negatively impacts freedom of information and speech on the Internet. The CJEU’s recognition of the principle of proportionality, and of the need to balance the competing rights of privacy and free information in RTBF cases, puts regulation on the right path.

In a similar case in Canada (Google Inc. v. Equustek Solutions Inc., 2017), Google was asked to de-index listings from its global versions to protect a party’s trade secret rights and refused to do so. The Supreme Court of Canada ruled against Google and ordered a global takedown, requiring the search engine to de-index the relevant listings from all its versions. This judgment was heavily criticised by civil society organisations and Internet advocates for violating the free speech and information rights of global Internet users. SFLC.in had also intervened in the Google v. Equustek matter in Canada.

For a detailed analysis of the RTBF and key cases on it, please refer to our comprehensive report on intermediary liability, Intermediary Liability 2.0: A Shifting Paradigm.

The provisional text of the judgment, as downloaded from the official website of the CJEU, can be accessed here:

All Posts | Oct 12, 2018

Summary Report: Series of Discussion on Personal Data Protection Bill 2018

We at SFLC.in conducted a series of multi-stakeholder round-table discussions on the Personal Data Protection Bill, 2018, submitted by the Expert Committee on Data Protection headed by Justice (Retd.) B.N. Srikrishna. We organised this series of discussions in four cities across India: Delhi (September 4, 2018), Bangalore (September 25, 2018), Mumbai (September 26, 2018) and Kochi (September 27, 2018). Experts from civil society and academia, independent lawyers, and representatives from banks, startups, industry bodies, the media and tech companies participated and expressed their views on the Personal Data Protection Bill, 2018.

The round-table events featured three separate panel discussions, covering data principal rights and data fiduciary obligations; data localisation and exemptions; and administration and enforcement.

These discussions aimed to urge leaders and key stakeholders to put forth their views on the draft Personal Data Protection Bill and to urge the Ministry of Electronics and Information Technology (MeitY) to make appropriate amendments to the Bill. MeitY had invited comments on the Bill from the public by September 10, 2018, a deadline that had been extended to September 30, 2018 at the time of these discussions. The deadline has now been extended further to October 10, 2018 in light of the judgment of the Supreme Court of India in Justice K.S. Puttaswamy (Retd.) v. Union of India [W.P. (C) 494 of 2012], delivered on September 26, 2018, thereby allowing stakeholders more time to submit their research and comments on the Bill. The inputs from these discussions will form part of the recommendations that we will submit to MeitY.

Session one focused on data principal rights and data fiduciary obligations. Key takeaways from this session were:

  • There are many ambiguities in the Bill. Phrases such as ‘fair and reasonable processing’ and ‘sensitive and critical data’, among others, are not clearly defined. Furthermore, the functions of the State are widely worded, neglecting the test of necessity and proportionality.

  • The rights of Internet users have been severely limited, particularly when compared to the European Union’s GDPR. The participants agreed that the concept of the Right to be Forgotten has been inaccurately borrowed from the GDPR and does not include a right to delete or erase one’s personal data.

  • Concerns were raised with respect to the provisions on the age at which a child’s consent may be obtained. It was stated that in India, many teenage girls try to protect their data from their parents, who strictly monitor their phone usage. In that light, it would be ironic if parental consent were needed to protect the data of children. In our country, parents often do not wish their daughters to be on certain social media platforms and discourage them from engaging with the opposite sex. Therefore, if such a provision is strictly implemented, it will directly impact minors.

Session two was on the topic of data localisation. Key takeaways from this session were:

  • Many startup founders expressed that the interests of small and medium enterprises have not been considered. They raised concerns that data localisation would harm small businesses and startups through increased compliance burdens and costs.

  • The Bill would heavily impact the BPO, AI and IoT industries, as they thrive on huge amounts of data that are generally crowdsourced. Data mirroring/localisation requirements would limit the possibilities for business and research. The Bill could benefit from additional clarity on the classification of data, on what data must be stored within the country and on what may be transferred outside, as these provisions are ambiguous at best.

  • India requires significant investment in data centre infrastructure, multiple optic fibre backbones, and enhanced power generation and grid capacity before mandating data localisation/mirroring. Data storage, cloud computing and bandwidth costs in the US are a fraction of the current costs in India, making it economically infeasible to mandate storage of data within India at this point in time. The increased costs would pose a tremendous deterrent to the viability, sustainability and competitiveness of startups in India, and would be detrimental to the government’s efforts to promote a startup ecosystem within the country.

Session three covered issues relating to the administration and enforcement of the Bill. Key takeaways from this session were:

  • It was pointed out that the Data Protection Authority of India (DPAI), the proposed body for the enforcement and administration of the Bill, is not completely independent, considering the critical responsibilities bestowed upon it. Attendees were of the view that excessive governmental control exists via the power to appoint and remove members of the DPAI, the power to determine salaries and allowances, and the power to notify certain categories of personal data that may be processed only in India, among other provisions of the Bill.

  • The Bill provides for criminal liability in cases of breach. It was opined that if employees of companies are to be held liable for data theft committed at much higher levels in the company, then government employees working with the State should also be held accountable. Participants therefore felt that the law should be drafted and executed without any bias.

  • The Bill provides for data mirroring and the creation of data centres, provisions that together amount to a nationalisation of data: data generated in India is to be stored in India in order to create jobs and revenue within the country. At the same time, the Bill requires damages for companies with a foreign presence to be calculated on the basis of their global revenue, which some companies found unfair.

  • The shortcomings of the Bill were highlighted in light of the privacy and Aadhaar judgments. It was opined that the Bill does not address concerns regarding profiling and targeted advertising deployed by state and non-state actors. Participants highlighted the manner in which the Bill fails the test of proportionality laid down in the nine-judge bench Right to Privacy judgment.

The panels across the cities unanimously recommended that there should be adequate sensitisation, training and compliance certification so that people and businesses are able to understand the implications of the Bill. It was agreed that the Data Protection Authority of India (DPAI) has been overburdened with roles and responsibilities. Many participants expressed that the draft law is heavily tilted towards the Central Government and is not a balanced law that considers the interests of all stakeholders.

 

All Posts | Sep 27, 2017

Notable technology and rights related litigations

Below is a compilation of some notable technology- and rights-related litigations from India, both ongoing and concluded. The list includes matters filed before the Supreme Court of India as well as various High Courts, and covers the broad topics of privacy, Aadhaar, intermediary liability, free speech, and the right to be forgotten.

Image credit: Legaleagle86 at en.wikipedia [CC BY-SA 3.0 or GFDL], via Wikimedia Commons

All Posts | Aug 08, 2017

Summary Report: Asia Pacific Regional Internet Governance Forum, 2017 (July 26-29th; Bangkok, Thailand)

The 8th Asia Pacific Regional Internet Governance Forum (APrIGF) convened from 26th to 29th July, 2017 at Chulalongkorn University in Bangkok, Thailand, with the objective of “Ensuring an inclusive and sustainable development in Asia Pacific: A regional agenda for internet governance”.

APrIGF is a multi-stakeholder platform for public policy discussions on the Internet and its impact on society. Since 2010, this premier annual conference has drawn together discussions and incubated collaborations for the development of a universally affordable, accessible, non-discriminatory, secure and sustainable Internet across the region. Discussion points from APrIGF are fed into the global Internet Governance Forum in the form of a ‘Synthesis Document’.

This year, APrIGF saw participation from over 550 stakeholders from around the region, in addition to the 60 youth participants at the concurrent Youth IGF. The broad topics covered during the sessions included access, empowerment, and diversity; cybersecurity, privacy, and a safer Internet; the digital economy and enabling innovation; and ensuring human rights online.

SFLC.in was represented by Prasanth Sugathan (Legal Director) and Vaishali Verma (Counsel) at the APrIGF. We organised two sessions at the forum and participated as speakers in two others.

Sessions Organized:

  • Merger 2 - Understanding Solutions towards Online Harassment (July 26th, 2:30-3:30 PM):

    This session was co-organised by the Digital Rights Foundation, Pakistan, and SFLC.in, and followed a panel discussion format. The panelists for this session included Malavika Jayaram (Executive Director, Digital Asia Hub), Lisa Garcia (Gender Coordinator, Foundation for Media Alternatives), Shmyla Khan (Project Manager, Digital Rights Foundation) and Vaishali Verma (Counsel, SFLC.in). The session was moderated by Prasanth Sugathan.

    During the course of the session, Vaishali briefly spoke about the findings of the report prepared by SFLC.in, titled “Online Harassment: A Form of Censorship”, published in November 2016. The session further elaborated upon the practical challenges faced by the victims of online harassment, the efforts being made by various organisations to address these difficulties and the need to make this discussion mainstream.

    The video archive of this session can be accessed here, and the transcript here.

  • WS 80 - Algorithmic Transparency: Understanding why we are profiled in a certain manner (July 29th, 9:00-10:30 AM)

    This session aimed at understanding the importance of the disclosure of algorithms, and how openness and transparency can increase privacy awareness. The panelists for this session were Dr. Virgil Griffith (Scientist, Ethereum Foundation), Arthit Suriyawongkul (Digital Culture and Internet Policy Researcher, Foundation for Internet and Civic Culture), Rajat Kumar (Program Manager - Digital Transformation, Friedrich-Naumann-Stiftung für die Freiheit), Jyoti Pandey (Senior Policy Analyst, Electronic Frontier Foundation), and Vaishali Verma (Counsel, SFLC.in). The session was moderated by Prasanth Sugathan.

    The panelists deliberated upon the need for transparency in algorithms and the effect it would have on the privacy of individuals. It was acknowledged that disclosure of algorithms would lead to increased privacy awareness among stakeholders. The session also delved into liability assessment in case of an algorithm's malfunction. The panel further discussed possible ways to facilitate the disclosure of algorithms while balancing commercial interests with the public interest.

    The video archive of this session is available here and the transcript here.

Sessions Participated in:

  • Merger 1 - Publicness and the Right to be Forgotten: the Debates Begin (July 28th, 11:00-12:30 PM)

    This session was co-organised by Open Net Korea and the American Bar Association Rule of Law Initiative. The panel discussed the question of the right to be forgotten from the perspective of the visibility of public information, which falls within the realm of the freedom of expression and the right to know, rather than viewing it solely in the context of privacy. Prasanth Sugathan participated in this session on behalf of SFLC.in.

    Official video archive of this session can be accessed here and the transcript here.

  • WS 94 – Engaging with the #KeepItOn Movement: Fighting Internet Shutdowns (July 29th, 11:00-12:30 PM)

    This session was organised by Access Now and reflected on the planned outcomes and developments from the #KeepItOn member organisations. The panelists discussed the status of disruptions and Internet shutdowns in the Asia-Pacific over the first half of 2017 and explored opportunities for possible collaborations and initiatives in the region. SFLC.in was represented by Vaishali Verma in this session, who also spoke about the Internet shutdowns tracker maintained by SFLC.in.

    The video archive of this session is available here and the transcript here.

All Posts | May 29, 2017

SFLC.in & 17 other NGOs file an intervention before France’s highest court on dangers of the ‘right to be forgotten’

SFLC.in joined 17 other non-governmental organisations from across the globe as amici in a voluntary intervention filed before France’s highest administrative court, the Council of State (Conseil d’État), raising serious concerns about a ruling of France’s data protection authority, the Commission nationale de l’informatique et des libertés (CNIL), on the “right to be forgotten”.

In 2014, CNIL ordered Google to remove 21 links from the results of an Internet search on the name of a French citizen who claims a “right to be forgotten”. Google initially removed the links from its French search site (www.google.fr) and other European search sites (such as www.google.ie), but CNIL demanded that it go further. Google then blocked the links from results returned to European users, even when they were using Google’s non-European sites, including www.google.com. CNIL, however, demands that when it orders content to be “forgotten” from search results, the decision must be given effect worldwide, meaning that the results must be made unavailable to all users internationally, regardless of where they access Internet search engines. CNIL has also imposed a fine of €100,000 on Google.

In our legal submissions filed with the Council of State, we highlighted grave concerns about CNIL’s approach and its implications for human rights worldwide. We pointed out that we, like many people across the world, rely on freedom of expression and the free exchange of ideas and information online in order to carry out our important work of protecting human rights internationally. CNIL has unilaterally imposed draconian restrictions on the free expression of all organisations and individuals who use the Internet around the world, even imposing a “right to be forgotten” on countries that do not recognise this principle. The CNIL ruling causes particularly serious damage to human rights protection in the developing world. In our submissions, we urged the Council of State to annul CNIL’s decision, stating:

“In the developing world, given that some governments are already trying to restrict freedoms on the internet through restrictive local laws, a precedent compelling companies to remove content based on already limiting laws will have the effect of eliminating checks and balances that inhere in international law. Countries such as Pakistan are already making efforts to ensure that certain political and critical content is removed from cyber space and the interveners are concerned that compelling companies to follow restrictive laws will further stymie the right to access to information and free speech. Such a precedent will also mean that dissent within a country can be censored in equal measure internationally.

As a result, the order of the CNIL sets a dangerous precedent, by opening the door for national authorities in other countries to impose global restrictions on freedom of expression through remedies grounded solely in their own domestic law. The possible race to the bottom is of the utmost concern to the interveners.”

The decision of the Council of State on Google’s appeal is expected later this year.

The original intervention, as filed in French, can be found here.

An English translation of the intervention is available here: