Data Privacy in the Age of Biometrics

August 17, 2022

INTRODUCTION

With the advancement of technology, unlocking one’s phone is now as simple as a glance or a tap. Face ID or fingerprint authentication has become one of the primary ways to authenticate oneself, whether while using a personal device or entering a workspace.

As biometric technology is increasingly used and accepted in the digital sphere, questions surrounding its privacy and security implications are rapidly on the rise. Because biometric data is stored on personal devices and other such systems, as well as in cloud-based biometric databases, questions inevitably arise as to how our personal data is secured from the outside world.

The Government and several private organisations readily acquire vast amounts of personal data on a day-to-day basis. Additionally, almost everything is now authenticated by biometrics, be it attendance at a workplace or the identity proof of a citizen.

In recent years, technological evolution has produced new ways to identify individuals and collect data through biometric information. In light of this, certain countries are taking steps to expand the definition of privacy in terms of what information will be protected. One such example is the European Union’s (hereinafter referred to as “EU”) General Data Protection Regulation (hereinafter referred to as “GDPR”), which treats biometric data as a special category of personal data that calls for stricter rules on its processing.

HOW DO BIOMETRICS WORK?

A person’s biometric information is first captured and enrolled into a biometric system. This information may be recorded as raw data (such as an image of a fingerprint or a retinal scan) or converted into a digital template, against which later samples are compared for verification. In some instances, the original images of the enrolment characteristics (for example, images of fingerprints) may also be retained. Storage of templates carries a much lower risk than storage of the raw biometric characteristic, such as the image of a fingerprint or a retinal scan, since a template generally cannot be reversed into the original image. Where raw images of the biometric are stored, strict security controls are essential, and regular monitoring and auditing of those controls should be undertaken.
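
To illustrate the template-based approach described above, the following short Python sketch walks through a simplified enrolment and verification flow. It is a minimal sketch only: the extract_template() function, the store dictionary and the matching threshold are hypothetical placeholders, and real biometric systems rely on far more sophisticated (often proprietary) feature extraction and matching algorithms.

import math

def extract_template(raw_sample):
    # Hypothetical feature extraction: real systems derive features such as
    # fingerprint minutiae or facial landmarks from the raw image.
    return [float(b) / 255.0 for b in raw_sample[:8]]

def enrol(raw_sample, store):
    # Store only the derived template, not the raw image, to reduce risk.
    store["template"] = extract_template(raw_sample)

def verify(raw_sample, store, threshold=0.1):
    # Compare the template of a fresh sample against the enrolled template.
    probe = extract_template(raw_sample)
    distance = math.dist(store["template"], probe)
    return distance <= threshold  # accept only if sufficiently similar

store = {}
enrol(bytes([16, 32, 48, 64, 80, 96, 112, 128]), store)
print(verify(bytes([17, 33, 47, 65, 78, 96, 113, 127]), store))  # True: close sample
print(verify(bytes([255, 0, 255, 0, 255, 0, 255, 0]), store))    # False: different sample

Note that it is the template, not the raw image, that is retained and compared, which is precisely why template storage is considered the lower-risk option.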

Limitations of Biometric Systems

While biometric systems are becoming more effective as the technology advances, they are not a foolproof method of authentication or identification. Some of the limitations of biometric systems are outlined below.

  • Failure to enrol

A failure to enrol occurs when a template for a person’s biometric information cannot be successfully created. This may be due to a number of factors, such as low-quality reference information or a person’s physical or medical condition. Ensuring effective enrolment rates is therefore crucial to the successful operation of a biometric verification or authentication system.

  • False acceptance and rejection rates

Biometric systems can make two basic errors. A “false positive” (false acceptance) occurs when the system incorrectly matches an input to a non-matching template. Conversely, a “false negative” (false rejection) occurs when the system fails to detect a match between an input and a matching template. An illustrative calculation of these error rates appears at the end of this list.

  • Spoofing

Biometrics are not a bullet-proof solution against fraud or identity theft. As with other security measures, the use of biometrics has vulnerabilities and can be compromised. Fake artefacts can sometimes be created and presented to a biometric sensor in order to compromise it. This is commonly known as spoofing.

  • Compromised Biometrics

Unlike passwords or ID tokens, biometric characteristics cannot be reissued or cancelled. If a person’s fingerprint or other physiological biometric data is compromised or incorrectly recorded, it is very difficult, if not impossible, to change it.
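
Returning to the false acceptance and rejection rates discussed above, the short Python sketch below shows how the two error rates are conventionally computed from labelled comparison scores at a chosen decision threshold. All scores and the threshold are made-up illustrative values, not measurements from any real system.

def error_rates(genuine_scores, impostor_scores, threshold):
    # A comparison is "accepted" when its similarity score meets the threshold.
    false_rejections = sum(1 for s in genuine_scores if s < threshold)
    false_acceptances = sum(1 for s in impostor_scores if s >= threshold)
    frr = false_rejections / len(genuine_scores)    # genuine users wrongly rejected
    far = false_acceptances / len(impostor_scores)  # impostors wrongly accepted
    return far, frr

genuine = [0.91, 0.88, 0.55, 0.95, 0.79]   # same-person comparisons
impostor = [0.12, 0.64, 0.30, 0.08, 0.22]  # different-person comparisons
far, frr = error_rates(genuine, impostor, threshold=0.60)
print(f"FAR = {far:.0%}, FRR = {frr:.0%}")  # FAR = 20%, FRR = 20%

The two rates trade off against each other: raising the threshold reduces false acceptances but increases false rejections, so operators tune the threshold to the risk profile of the application.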

PRIVACY CHALLENGES

Biometrics, like many other technologies, can jeopardize privacy. However, it is crucial to emphasise that biometrics are not inherently incompatible with privacy; the extent to which biometrics enhance or infringe on people’s privacy is determined by how the relevant systems are built and used. The following are some of the privacy concerns that may arise from the use of biometrics.

  • Function Creep

Function creep becomes a concern when information is used for a purpose other than the one for which it was collected, and this secondary use is not communicated to the individual at the time they provide their information. For example, an organisation may collect an employee’s facial biometric information for authentication purposes, such as to enable access to a building, but thereafter use it for other purposes that are not communicated to the employee.

  • Covert Collection

Another privacy risk is the covert or passive collection of individuals’ biometric data without their consent, participation or knowledge. Facial biometric information can be captured from photographs that individuals do not know are being taken, and latent fingerprints can be lifted from hard surfaces long after contact.

  • Secondary Information

Some biometric characteristics can reveal secondary information about an individual, beyond the ambit of what the biometric data was initially collected for. A raw image of a facial biometric could, for instance, reveal health information that an individual may not want to provide or did not consent to providing.

  • Consent

Biometrics challenge the notion of consent in the context of information privacy. Individuals may be unable to provide consent or exercise control over what biometric information is collected and how it is used. Apart from privacy concerns, there may also be legal restrictions on such systems.

  • Data Privacy Impact Assessment

Public sector organisations should undertake a data privacy impact assessment (hereinafter referred to as “DPIA”) in the early stages of considering the implementation of biometric systems. A DPIA will help an organisation to assess the privacy impacts of its initiative and identify any potential privacy risks that may arise, as well as develop risk mitigation strategies to address those risks.

REGULATION OF BIOMETRIC SYSTEMS UNDER THE INDIAN REGIME

Personal information or personal data is information that relates to a natural person and is capable of identifying such person, whether independently or in combination with other available information. It is pertinent to note that the Information Technology Act, 2000 (hereinafter referred to as “IT Act”) regulates biometric data, since such data can be collected and processed using a computer resource and constitutes a form of personal data. Generally speaking, the best practices prevalent in the industry accord a higher level of protection and stricter rules for processing, dealing with or handling any data or information that qualifies as Sensitive Data. Some of the key practices are as under:

  • Consent for Collection: Biometric data can only be collected with consent, for a lawful purpose that is essential to an organisation’s function. Consent entails giving the data subject the option not to provide the biometric data sought by the body corporate requesting it. Given its sensitive nature, this consent must be given in writing.
  • Retention: Once the purpose is fulfilled, the entity can no longer retain the biometric data collected by it.
  • Disclosure: Biometric data can be used for identity verification, or for the prevention, investigation, prosecution and punishment of offences. Disclosure may also be made for compliance with law, or to Government agencies mandated to obtain such information. The entity must obtain the data subject’s permission before sharing their biometrics with third parties.

The use of biometric data has become more regulated after the landmark judgment of Justice K.S. Puttaswamy and Ors. v. Union of India and Ors, (2017) 10 SCC 1 (hereinafter referred to as “Puttaswamy judgment”).

USE OF BIOMETRIC DATA, POST AADHAAR JUDGMENT

The Puttaswamy judgment centres on the issue of data privacy with regard to Aadhaar, a biometrics-based identification system for residents of India. The Supreme Court, while deliberating the issues pertaining to data privacy, decided the larger issue of the constitutionality of the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016 (hereinafter referred to as “Aadhaar Act”).

A number of changes were brought about after the Puttaswamy judgment was passed, including the introduction of the Personal Data Protection Bill, 2019, which has since been withdrawn. One of the primary issues discussed in the Puttaswamy judgment was the purposes for which, and the extent to which, a private entity can use biometric data to perform Aadhaar-based authentication.

The Reserve Bank of India (hereinafter referred to as “RBI”) recently updated the Master Direction – Know Your Customer Directions, 2016 (hereinafter referred to as “KYC Directions”) to add specific scenarios for utilising Aadhaar authentication technologies. The KYC Directions detail the procedures for establishing account-based relationships and monitoring transactions for RBI-regulated organisations.

In terms of the amended KYC Directions, only banks are permitted to use the biometric-data-driven Aadhaar authentication facility for opening customer accounts, and only where the customer voluntarily uses their Aadhaar number for authentication. Non-bank players, on the other hand, are only permitted to use verification mechanisms that do not involve the collection of biometric data.

AMLEGALS REMARKS

On a concluding note, the existing Indian legal environment treats biometric data as sensitive data. Such sensitive data should be governed and processed in a regulated manner. Entities storing biometric data should be extremely cautious and ensure the implementation of proper data protection measures.

Following the Supreme Court’s decision, it will be interesting to see what authentication mechanisms the Government implements that do not require the use of biometric data, as well as the safeguards prescribed for the use of such data by private bodies that are not otherwise overseen by sectoral regulators such as the RBI.


For any queries or feedback, please feel free to get in touch with chaitali.sadayet@amlegals.com or mridusha.guha@amlegals.com.
