Role of Privacy-Enhancing Technologies in Curbing Data Breaches

May 1, 2024

INTRODUCTION

In today’s digital landscape, data breaches continue to pose substantial risks to individuals’ privacy and security, and the adoption of privacy-enhancing technologies (hereinafter referred to as “PETs”) has emerged as an essential response to mitigate these threats and uphold the integrity of personal information.

Privacy-enhancing technologies comprise a set of digital tools and methodologies that facilitate the collection, processing, analysis, and sharing of information while safeguarding the privacy and confidentiality of personal data. A core privacy principle underlying PETs is data minimization; in keeping with it, PETs strive to offer anonymity, pseudonymity, or unobservability for users and/or data subjects.

The Organisation for Economic Co-operation and Development (hereinafter referred to as “OECD”) published a report titled “Inventory of Privacy-Enhancing Technologies” in 2002, offering a comprehensive definition of PETs which states that “Privacy-enhancing technologies (PETs) commonly refer to a wide range of technologies that help protect personal privacy. Ranging from tools that provide anonymity to those that allow a user to choose if, when, and under what circumstances personal information is disclosed, the use of privacy-enhancing technologies helps users make informed choices about privacy protection.”

TYPES OF PRIVACY-ENHANCING TECHNOLOGIES

PETs help organizations and individuals manage data responsibly and reduce privacy risks in an increasingly data-centric environment. Some of the main types of PETs are as follows:

1. Homomorphic Encryption

Homomorphic encryption is a specialized encryption method enabling computations on encrypted data without requiring decryption first. Consequently, data remains secure even during processing and is accessible only to authorized parties. This encryption technique finds applications across diverse fields, including data storage, sharing, cloud computing, and analytics. It proves particularly valuable for organizations handling extensive datasets, ensuring secure management and sharing while still allowing the data to be used for analytical and operational needs.
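By way of illustration, the following minimal Python sketch uses a toy-sized Paillier cryptosystem (an additively homomorphic scheme): two ciphertexts are combined without ever being decrypted, yet decrypting the result yields the sum of the original plaintexts. The parameters are deliberately small and insecure; real deployments rely on vetted cryptographic libraries and large keys.

# Toy Paillier demo of additive homomorphic encryption: a sum is computed
# on ciphertexts without decrypting them first. Illustrative only, NOT secure.
from math import gcd
import secrets

p, q = 293, 433                                 # toy primes (insecure)
n = p * q
n_sq = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)
mu = pow(lam, -1, n)                            # valid because g = n + 1

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the underlying plaintexts.
c1, c2 = encrypt(17), encrypt(25)
c_sum = (c1 * c2) % n_sq
assert decrypt(c_sum) == 42
print("decrypted sum computed on encrypted data:", decrypt(c_sum))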

2. Synthetic Data Generation

Another method of modifying data for privacy protection involves generating entirely new synthetic data. This goes beyond pseudonymization, where identifying fields in real data are replaced with altered values, and differential privacy, which adds statistical noise to results derived from real datasets. Synthetic data is typically generated using machine learning techniques and mimics the attributes of real-world data.

Through this process, real data is inputted into machine learning algorithms, which then identify patterns and characteristics to replicate in the synthetic data. It is worth noting that while synthetic data can serve the purpose of PETs by preserving privacy, it is also commonly used to augment training datasets for machine learning models, potentially expanding their utility beyond solely privacy protection.
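As a simplified illustration, the sketch below assumes purely numerical, hypothetical records and uses numpy: a basic Gaussian model is fitted to the “real” data and entirely new records are sampled from it. Production pipelines typically use far richer generative models (for example, GANs or copulas), but the underlying principle is the same.

# Minimal sketch of synthetic data generation: learn summary statistics from
# real records, then sample new records that mimic them without copying any row.
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical "real" dataset: age and annual spend of 500 customers.
real = np.column_stack([
    rng.normal(45, 12, size=500),       # age
    rng.normal(2000, 600, size=500),    # annual spend
])

# Fit a simple statistical model to the real data...
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# ...and sample brand-new synthetic records from it.
synthetic = rng.multivariate_normal(mean, cov, size=500)

print("real means:     ", np.round(mean, 1))
print("synthetic means:", np.round(synthetic.mean(axis=0), 1))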

3. Differential Privacy

It is a technique that involves adding noise to the insights derived from underlying data. This approach aims to restrict an outsider’s ability to discern information about individual identities while still allowing for the sharing of meaningful insights about the dataset or a group. It surpasses basic data aggregation by employing an advanced mathematical framework to safeguard the privacy of any individual within the dataset.
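The sketch below illustrates the core mechanism on a hypothetical dataset: Laplace noise, scaled to the query’s sensitivity divided by the privacy parameter epsilon, is added to a simple count query so that any single individual’s presence or absence barely changes the published result. The parameter choices here are illustrative only.

# Minimal sketch of the Laplace mechanism for differential privacy.
import numpy as np

rng = np.random.default_rng()

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count of records satisfying `predicate`."""
    true_count = sum(1 for v in values if predicate(v))
    sensitivity = 1.0   # adding or removing one person changes a count by at most 1
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical dataset: ages of survey respondents.
ages = [23, 35, 47, 52, 61, 29, 44, 38, 71, 55]
print("noisy count of respondents over 40:",
      round(dp_count(ages, lambda a: a > 40, epsilon=0.5), 1))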

4. Secure Multi-Party Computation

It is a method that allows various entities to engage with data while maintaining the confidentiality of the complete underlying information. This technique involves dividing the data into multiple ‘shares’, which are then distributed and processed by different entities. By partitioning the information, the risk of compromising the entire dataset is minimized, even if one entity is compromised. Additionally, multi-party computation can be integrated with approaches like homomorphic encryption, as discussed earlier, ensuring that even the individual ‘shares’ remain undisclosed during data analysis.
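As a simplified illustration, the Python sketch below uses additive secret sharing, a common building block of such protocols, with hypothetical inputs from three hospitals: each input is split into random shares, every party only ever handles shares, and only the recombined partial sums reveal the joint total.

# Minimal sketch of additive secret sharing for secure multi-party computation.
import secrets

MODULUS = 2**61 - 1   # all arithmetic is done modulo a fixed prime

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n random shares that sum to it modulo MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Hypothetical inputs: three hospitals want a joint case count without
# revealing their individual counts to one another.
inputs = {"hospital_A": 120, "hospital_B": 85, "hospital_C": 42}

# Each hospital splits its input and sends one share to each computing party.
all_shares = {name: share(v, 3) for name, v in inputs.items()}

# Each party adds up only the shares it received...
partial_sums = [sum(all_shares[name][i] for name in inputs) % MODULUS
                for i in range(3)]

# ...and combining the partial sums yields the joint result.
total = sum(partial_sums) % MODULUS
assert total == sum(inputs.values())
print("joint total computed without revealing individual inputs:", total)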

5. Trusted Execution Environment

Another important aspect of PETs is the Trusted Execution Environment (hereinafter referred to as “TEE”). A TEE is a secure and isolated area of a device’s main processor that protects the confidentiality and integrity of the code and data loaded inside it, allowing sensitive operations to be performed in isolation from the main operating system and other applications.
Organizations can use TEE to protect critical processes and computations such as secure data processing, key management, and cryptographic operations.

REGULATORY FRAMEWORK GOVERNING PETs

Article 25 of the General Data Protection Regulation (hereinafter referred to as “GDPR”) stipulates that organizations must implement appropriate technical and organizational measures designed to give effect to the data protection principles, in line with the principles of data protection by design and by default.

Further, Article 32 of the GDPR requires organizations to adopt appropriate technical and organizational measures to ensure a level of security commensurate with the associated risks. This includes measures such as pseudonymization and encryption of personal data where appropriate.

In June 2023, the UK Information Commissioner’s Office (hereinafter referred to as “ICO”) issued guidance on PETs (hereinafter referred to as the “ICO Guidance”). The ICO Guidance highlights that PETs not only enhance the protection of personal data but also have the potential to unlock the full value of data and spur innovation.

In 2022, the European Union Agency for Cybersecurity (hereinafter referred to as “ENISA”) released a report titled “Data Protection Engineering,” which concluded that data protection engineering can be seen as a fundamental part of ensuring data protection by design and by default. Its aim is to simplify the recognition, application, and tailoring of appropriate technical and organizational measures to support particular data protection principles.

ENISA defines PETs as “software and hardware solutions comprising technical processes, methods, or knowledge aimed at achieving specific privacy or data protection functions or mitigating risks to the privacy of individuals or groups of natural persons.”

In 2007, the Italian Data Protection Authority (“Garante”) highlighted, in its “Guidelines Applying to the Use of E-Mails and the Internet in the Employment Context,” the responsibility of data controllers to integrate PETs so as to minimize the use of identification data transmitted via the Internet and email within workplace settings.

In 2017, the Office of the Privacy Commissioner of Canada (hereinafter referred to as “OPC”) issued a report titled “Privacy Enhancing Technologies – A Review of Tools and Techniques”, which set out the risks that continuously advancing technologies pose to the rights and freedoms of data subjects (e.g., risks of identity exposure or of correlating data traffic with identity) and the significance of integrating PETs to mitigate these risks.

In Data Privacy Foundation v. Facebook Netherlands BV, Meta Platforms, Inc., and Meta Platforms Ireland Ltd (C/13/683377 / HA ZA 20-468), the Amsterdam District Court ruled that Facebook Ireland, a Meta subsidiary, had violated the GDPR by utilizing personal data for advertising purposes. The Court underscored the necessity for a data controller to strike a balance between its legitimate interests and the rights and interests of data subjects. It enumerated several measures aimed at preventing adverse repercussions for data subjects, such as extensive use of anonymization techniques, data aggregation, privacy-enhancing technologies, privacy by design, and the conduct of privacy and data protection impact assessments.

IMPLEMENTATION OF PETs IN VARIOUS SECTORS

PETs are being utilized across various industries such as healthcare, e-commerce, and finance to foster innovative approaches to data monetization and AI-driven product development, all while preserving user privacy.

One notable deployment of PETs is differential privacy. This technique involves injecting noise into data to protect individual privacy while still maintaining the data’s analytical value. An exemplary case of this technology was the 2020 census conducted by the US Census Bureau, where this approach enabled the public sharing of datasets while ensuring the anonymity of individual residents.

Another instance involves employing secure multi-party computation, enabling multiple parties to collectively examine data without divulging the raw data itself. This method safeguards privacy while facilitating collaborative research and analysis. Its application is especially beneficial in healthcare, where the exchange of sensitive data is crucial for enhancing patient care and fostering innovation.

Within the realm of e-commerce, PETs serve to safeguard user data during its exchange, processing, and utilization for targeted advertising. For instance, encryption, secure multi-party computation, and differential privacy are employed to ensure the confidential exchange of data between parties, maintain the security and confidentiality of personal data throughout processing, and analyze anonymized user data without disclosing personal information.

Within the advertising technology industry, PETs play a pivotal role in safeguarding sensitive user data throughout processes like ad targeting, data analysis, and identity matching. Their implementation ensures the preservation of user privacy while facilitating the effectiveness of advertising strategies.

Similarly, in the financial domain, PETs can be harnessed to safeguard financial data, fortify transactions, and uphold compliance with data protection regulations. This utilization ensures the protection of customer privacy and confidentiality.

AMLEGALS REMARKS

With data breaches and privacy concerns on the rise, the use and growth of PETs have become critical to preserving the integrity and confidentiality of personal information. The adoption of PETs within India’s privacy framework is steadily gaining momentum, particularly in sectors like healthcare, e-commerce, and advertising technology.

However, introducing PETs into India’s privacy landscape may encounter hurdles such as limited awareness, resource constraints, regulatory adherence, interoperability issues, data security apprehensions, user acceptance challenges, and requirements for data localization. Overcoming these obstacles will be crucial for effectively implementing and embracing PETs within India’s privacy framework, thereby ensuring user data protection and compliance with data protection regulations.

– Team AMLEGALS assisted by Ms. Deepanshi Kapoor (Intern)


For any queries or feedback, feel free to reach out to mridusha.guha@amlegals.com or liza.vanjani@amlegals.com
