Delhi High Court Issues Guidelines for Taking Down Involuntarily Shared Explicit Content

March 27, 2024

The High Court of Delhi, in the case of Mrs. X versus Union of India and others, WP (Crl.) No. 1505 of 2021 and Crl. MAs No. 12645 of 2021 and 811 of 2022, decided on April 26, 2023, issued directions and guidelines on the takedown of Non-Consensual Intimate Images (herein referred to as “NCII”) and personal data/information.

FACTS

Mrs. X (hereinafter referred to as the “Petitioner”), in line with a Supreme Court directive aimed at protecting victims of sexual violence, filed a writ petition under Article 226 of the Constitution of India and Section 482 of the Code of Criminal Procedure, 1973 against the accused, while also impleading the Union of India (hereinafter referred to as “the Respondent”). The petition sought the blocking of websites displaying intimate images of the Petitioner and the registration of a First Information Report (hereinafter referred to as “FIR”) based on her complaint to the Lajpat Nagar Police Station in New Delhi.

In December 2019, the Petitioner, a married woman with a young son, met Mr. Richesh Manav Singhal (hereinafter referred to as the “Accused”) online, and the two subsequently exchanged contact numbers. In July 2020, the Accused came over to her rented accommodation, “forced himself upon her”, and transferred explicit photos of Mrs. X, which she “had taken of herself for the purpose of sharing them with her husband”, from her phone to his. It appeared that the Accused had also involved Mrs. X’s minor son in sexual acts.

The Petitioner filed a complaint against the Accused at the Lajpat Nagar Police Station, leading to a Zero FIR being registered and transferred to Gurugram. The Accused then threatened to leak her explicit images online and, on the strength of that threat, extorted money and jewelry from her. When the Petitioner could no longer meet his demands, he leaked her explicit images on various pornographic websites without her consent, leading her to file a fresh complaint on 03.08.2021. The complaint stated that the Accused had created a YouTube channel in the Petitioner’s name and posted her explicit videos and photographs on a daily basis.

The Petitioner approached various grievance cells and filed complaints on cybercrime.gov.in, but her images were not removed from the Internet. Frustrated by the lack of redressal, she filed the instant writ petition seeking directions for the removal of her NCII from the Internet.

The Petition highlighted the Petitioner’s troubled situation and the failure of existing mechanisms to address her grievances, urging the Hon’ble High Court to intervene and protect her rights.

ISSUES BEFORE THE HIGH COURT

  1. Whether intermediaries such as Google LLC and Microsoft are liable to undertake privacy-respecting measures to prevent Technology Facilitated Sexual Violence (hereinafter referred to as “TFSV”)?
  2. Whether there is any correlation between the right to privacy, including the right to be forgotten, and the right to freedom of expression in the context of NCII content?

CONTENTIONS OF THE PARTIES

The Petitioner contended that her intimate images had been distributed without her consent, violating her right to privacy under Article 21 of the Constitution. In her petition, she prayed for her name to be delinked/de-tagged/de-referenced/de-indexed from search engines.

The Petitioner further submitted that despite approaching the Grievance Cells of Respondents No. 3 to 6, i.e., Google LLC, Microsoft India Pvt. Ltd., YouTube.com and Vimeo.com, her NCII was not taken down from the Internet. She also asserted that the offending material was consistently being reproduced and re-uploaded despite her efforts to have it removed.

On the contrary, the Respondent submitted that all the offending material had already been removed from YouTube, and the URLs which had been specifically supplied had been de-indexed by Google so that they could no longer be located through its search engine, and that, in any event, merely directing search engines to de-index the links would not be an adequate solution.

The Respondent further contended that the Petitioner’s prayer for delinking/de-tagging/de-referencing/de-indexing her name from search engines would adversely affect the freedom of speech and expression of other individuals having the same or a similar name as the Petitioner.

Respondent No. 2, i.e., the Government of NCT of Delhi (hereinafter referred to as “GNCTD”), filed a Status Report dated 14.09.2021 with regard to the Uniform Resource Locators (hereinafter referred to as “URLs”), stating that all possible efforts were being made to have the remaining active URLs/links blocked or removed through the concerned intermediaries.

The Respondent also submitted that the grievances of the Petitioner in the present Petition fall under Rule 3(2)(b) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter referred to as the “IT Rules”), and that the Petitioner accordingly has an efficacious remedy: she may approach the intermediary directly, or through any person on her behalf, including law enforcement agencies, for removal of the URLs containing the offending content.

The Learned Amicus Curiae made submissions concerning the obligations of intermediaries under the Information Technology Act, 2000 (hereinafter referred to as the “IT Act”) and the IT Rules. He emphasized that, to avail themselves of the safe harbor under Section 79 of the IT Act, intermediaries must remove not only the specific offending URLs supplied by users but all offending content from their platforms.

Additionally, he asserted that Section 79(3)(b) extends the obligation to remove unlawful content beyond specific URLs to any content declared unlawful by judicial order. Rule 3 of the IT Rules likewise mandates the removal of unlawful content, including content infringing others’ rights or privacy, within specified timeframes upon receipt of complaints or court orders.

Google LLC contended that it had taken all necessary measures to prevent offending content from persisting on its platforms, including disabling re-uploads and removing errant YouTube channels. Google Search, as a search engine, merely indexes content from third-party websites and does not host or control it, and is therefore not an “originator” under Section 2(1)(za) of the IT Act. Its role is that of a library catalog, directing users to content but not generating it. Image-based search results pose additional challenges owing to complex algorithms, making it impractical to direct actions solely at search engines rather than at the primary sources of NCII content.

Microsoft, in its contentions, stated that a dedicated webform is available for reporting instances of NCII, through which any member of the public can request the removal of nude or sexually explicit images or videos shared without their consent. However, its search engine, Bing (www.bing.com), lacks automatic NCII detection technology and can remove such content globally only upon notification; users/victims must work with webpage owners to have NCII removed from the Internet. Microsoft also informed the High Court that it is working on implementing image-scanning technology but awaits the development of cryptographic databases, interoperability standards, and APIs to identify NCII duplication, and is collaborating with other technology companies on user safety.

The Respondent relied on MySpace Inc. v. Super Cassettes Industries Ltd., 2016 SCC OnLine Del 6382, to show that Section 79 of the IT Act protects intermediaries by acknowledging their ability to act diligently. However, forcing intermediaries to distinguish between infringing and non-infringing content would restrict free speech and encourage private censorship, and could expose intermediaries to contempt of court if they cannot comply with impossible orders. Intermediaries should therefore remove content only upon receipt of specific complaints and should not be required to undertake proactive removal.

The Delhi Police, as the Law Enforcement Agency, submitted that it has implemented several measures to monitor and prosecute offenses against women and children on the Internet. These include the www.cybercrime.gov.in portal, which automatically directs complaints to the relevant police station based on the complainant’s residence, and the establishment of District Cyber Police Stations in each district, each equipped with dedicated cells for handling cyber issues relating to women and children, including those involving child pornography and rape. Additionally, a round-the-clock helpline, 1930, has been made available to assist victims of cybercrimes. The former District Cyber Cells have been replaced with these District Cyber Police Stations, whose contact details are accessible on the various Delhi Police websites, ensuring easier access for reporting and addressing such crimes.

DECISION AND FINDINGS

The High Court observed that, given the Internet’s vast reach and borderless nature, unlawful content can be disseminated effortlessly and is difficult to trace and control. This raises concerns for privacy and the right to be forgotten, as content, once uploaded, is nearly impossible to erase.

In addressing these issues, the High Court emphasized the non-adversarial nature of the matter and its aim of minimizing further distress to victims of NCII dissemination. The High Court acknowledged the protection granted to intermediaries under Section 79 of the IT Act, which exempts them from secondary liability for user actions in order to preserve fundamental rights and the free flow of information. However, if an intermediary fails to remove unlawful content upon notification, exceptions to this protection apply.

The High Court observed that it is only in such exceptional circumstances that intermediaries become liable for the actions of third-party users, notwithstanding the harm caused to others. Additionally, the IT Rules provide a mechanism for users/victims to request content removal directly from intermediaries, in addition to removal pursuant to court orders or Government directives, ensuring proactive action against NCII content.

The High Court asserted that the IT Act and the IT Rules provide clear and comprehensive guidelines regarding the obligations of intermediaries, and emphasized the imperative for all relevant authorities and entities to diligently adhere to and enforce these provisions. Furthermore, the High Court granted liberty to the Learned Amicus Curiae to file necessary applications, including those for modification or clarification of the directives and suggestions provided in the present case. The High Court directed intermediaries to remove offending content within 24 hours, as required by Rule 3(2)(b) of the IT Rules. Additionally, intermediaries were instructed to undertake proactive monitoring with automated tools to identify and remove, or disable access to, content exactly identical to the offending content mentioned in the court order.
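
By way of illustration only, and not as a description of any intermediary’s actual systems, the short Python sketch below shows what screening with automated tools against “exactly identical” content can amount to in practice: a cryptographic digest such as SHA-256 catches byte-for-byte copies of content covered by a court order, but not copies that have been even slightly altered. All data in the sketch is invented.

    import hashlib

    def sha256_digest(data: bytes) -> str:
        """Return the SHA-256 hex digest of raw content bytes."""
        return hashlib.sha256(data).hexdigest()

    # Hypothetical register of digests of content already covered by a court
    # order; the sample bytes below are placeholders, not real content.
    blocked_digests = {sha256_digest(b"offending-content-bytes")}

    def is_exact_reupload(upload: bytes) -> bool:
        """True only if the upload is byte-for-byte identical to blocked content."""
        return sha256_digest(upload) in blocked_digests

    print(is_exact_reupload(b"offending-content-bytes"))  # True  -> remove / disable access
    print(is_exact_reupload(b"slightly edited content"))  # False -> not "exactly identical"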

The High Court emphasized that it is morally unjustifiable to expect victims of NCII abuse to continually endure the trauma of searching the Internet for content relating to them and repeatedly approaching the authorities for assistance. To prevent this, the High Court issued the following directions:

1. Dissemination and Reporting: Individuals encountering NCII can report it to Grievance Officers of relevant intermediaries or through the Online Cybercrime Reporting Portal (www.cybercrime.gov.in).

2. Legal Action: Upon receiving a complaint, the police must promptly register a formal complaint under Section 66E of the IT Act and work to apprehend the originator of the NCII.

3. Court Petitions: Victims can approach the court directly to file petitions identifying NCII content and its URLs for the court to determine its illegality.

4. Search Engine De-indexing: Search engines must employ hash-matching technology to de-index webpages containing NCII reported under Rule 3(2)(b) of the IT Rules. Users should be able to request de-indexing of new URLs with previously removed content directly from search engines.

5. Helpline Support: A 24/7 helpline manned by sensitized staff should be available for reporting NCII, directing victims to social and legal support organizations.

6. Takedown Orders: Search engines must use digital identifiers to de-index NCII content and cannot demand specific URLs as a precondition for takedown. Complainants may submit URLs of copies of identified NCII for quicker removal (an illustrative sketch follows this list).

7. Trusted Third-Party Platform: The Ministry of Electronics and Information Technology (hereinafter referred to as “MeitY”) may develop, together with search engines, a trusted encrypted platform for registering and removing NCII content using hash-matching, ensuring content removal while maintaining safeguards against abuse.
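
Direction 6 is easier to picture with a toy example. The Python sketch below, using invented identifiers and URLs, shows why a search engine that can query its index by a content identifier, such as a hash, does not need the complainant to supply every individual link: all URLs associated with the reported identifier can be dropped in a single operation.

    from collections import defaultdict

    # Toy search index keyed by a content identifier (for instance, a hash
    # reported under the court's directions) rather than by URL.
    # All identifiers and URLs below are invented for illustration.
    index_by_identifier: dict[str, set[str]] = defaultdict(set)
    index_by_identifier["ncii-report-001"].update({
        "https://example.com/page-1",
        "https://mirror.example.net/copy-7",
    })

    def deindex(identifier: str) -> set[str]:
        """Drop every indexed URL associated with a reported identifier."""
        return index_by_identifier.pop(identifier, set())

    removed = deindex("ncii-report-001")
    print(f"De-indexed {len(removed)} URLs")  # De-indexed 2 URLs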

Subsequently, the High Court disposed of the writ petition, along with any pending applications.

AMLEGALS REMARKS

The High Court acknowledged the severe psychological and societal consequences of NCII dissemination and, by issuing the above parameters, sought to ease the burden on victims by ensuring that they are not continually tasked with identifying new URLs containing their NCII and seeking their de-indexing.

Hash-matching technology, as deployed through Meta’s ‘Stop NCII’ program, is favored over broad proactive monitoring mandates. This technology allows victims to create unique fingerprints of offending images, which are stored in a database to prevent re-uploads. The judgment emphasizes the importance of including only NCII content in any hash database, so that lawful content is not mistakenly added and continually removed.
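
The matching step behind such fingerprinting can be sketched as follows. The judgment does not specify which hashing scheme the ‘Stop NCII’ program uses, so the Python example below relies on the open-source ImageHash library and invented hash values purely for illustration: perceptual hashes of visually near-identical images differ in only a few bits, so a small Hamming distance flags a likely re-upload, while unrelated images produce distant hashes.

    import imagehash  # pip install ImageHash

    # Fingerprints previously registered by the victim. In a real programme the
    # hash would be computed on the victim's device, e.g. with
    # imagehash.phash(PIL.Image.open(path)), and only the hash -- never the
    # image itself -- would be shared. The hex values here are invented.
    registered = [imagehash.hex_to_hash("d1c1835420b6b994")]

    def matches_registered(candidate: imagehash.ImageHash, max_distance: int = 8) -> bool:
        """A small Hamming distance means the images are visually near-identical."""
        return any(candidate - known <= max_distance for known in registered)

    # A re-upload whose fingerprint differs in a single bit is caught ...
    print(matches_registered(imagehash.hex_to_hash("d1c1835420b6b995")))  # True
    # ... while an unrelated image's fingerprint is far away in Hamming distance.
    print(matches_registered(imagehash.hex_to_hash("0f0f0f0f0f0f0f0f")))  # False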

However, the High Court emphasized the need for transparent and accountable operationalization of such technology due to its potential for abuse. In its judgment, the High Court also issued various directions and recommendations to the MeitY, the Delhi Police, and search engines to tackle the circulation of NCII online.

Notably, it expanded the definition of NCII to include sexual content intended for “private and confidential relationships,” broadening the scope of illegal NCII to encompass instances where content is shared without consent, even if initially obtained consensually within private relationships. This addresses the issue of NCII generated within private realms but illegally shared online.

Strong institutional safeguards are crucial for ensuring public accountability of such databases to prevent large technology companies from imposing their own terms. Although the High Court did not delve deeply into these institutional mechanisms, it suggested that the forthcoming Digital India Bill presents an opportunity to address these issues comprehensively and advance discussions on combating NCII effectively.

– Team AMLEGALS assisted by Ms. Prishita Saraiwala


For any queries or feedback feel free to reach out to mridusha.guha@amlegals.com or jason.james@amlegals.com
