IFF explains DigiYatra: Turbulence ahead

tl;dr

Continuing our prior explainer of the DigiYatra Scheme (“Scheme”) and its accompanying policy document (“Policy”), which is scheduled to take flight in March 2022, we answer practical questions on the basis of evidence. Will DigiYatra force you to sacrifice your privacy at the altar of convenience? Will it even deliver the convenience it promises? In this post, we examine these concerns and also look at how other jurisdictions have responded to similar facial recognition schemes for air travel.

Who will have access to your data?

In the first part of this series, we looked closely at the Policy to understand how the Scheme will operate. While the Policy states that airports using the DigiYatra Biometric Boarding System (BBS) will adhere to the data protection laws applicable and mandated by the Government of India, India presently has no specific data protection law. Moreover, since the Policy does not have the force of law and is untethered to any legal framework, even the privacy protection principles it directly includes will not be enforceable against any authority.

It is essential to ensure that data collected under this Scheme is processed only for those specific purposes necessary to carry out the Scheme. However, in the absence of an enforceable purpose limitation principle, again because no legal framework underpins the Scheme, the personal data collected may end up being shared further and processed for purposes to which the data principal never consented. In our previous post, we discussed how the personal data collected may be shared with government and security agencies, which could lead to potential rights violations.

What happens if your data is shared with private entities?

The Scheme will be implemented by a joint venture company (JVC) or special purpose vehicle (SPV) under Section 25 of the Companies Act, 1956, which will be established by the Airports Authority of India (with a minority stake) and all private airport operators. Presently, the only legal framework regulating how sensitive personal data is to be processed is the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“SPDI Rules”).

Under Rule 3 of the SPDI Rules (link), sensitive personal data or information of a person means personal information relating to passwords; financial information such as bank account, credit card, debit card or other payment instrument details; physical, physiological and mental health condition; sexual orientation; medical records and history; and biometric information. Rule 6 of the SPDI Rules states that disclosure of sensitive personal data or information by a body corporate to any third party requires prior permission from the provider of such information. Therefore, the JVC or SPV will have to obtain the consent of the passenger before sharing their biometric information, such as facial scans, with a third party. However, Rule 6 also creates an exemption under which information may be shared by the body corporate with government agencies for the purpose of verification of identity, or for the prevention, detection, investigation (including of cyber incidents), prosecution and punishment of offences.

It is important to note that while the SPDI Rules do create some protection, for the data collected under this Scheme that protection extends only to biometric information, and their implementation in terms of enforcement or remedy (“Can I file a complaint and actually obtain damages?”) is wholly absent. The JVC or SPV may still be able to share information which does not fall under Rule 3 with third parties without obtaining consent from passengers. This could include information such as mobile numbers and other travel details, which private companies could then use to “hypertarget” customers for advertising. Some might still argue that the convenience DigiYatra brings outweighs any future privacy concerns. However, this view may be mistaken, as the Scheme may end up causing more inconvenience to passengers.

Will DigiYatra truly increase the ease of air travel?

It is highly unlikely that DigiYatra will satisfactorily deliver on its main claim, which, as per the DigiYatra Policy, is to “enhance passenger experience and provide a simple and easy experience to all air travellers”. (To understand the objectives behind the Scheme in detail, please refer to pg. 8 of the Policy document here.) This is due to the simple fact that facial recognition technology is inaccurate, especially for people of color (which includes Indians) and women. (Studies on lower accuracy rates for people of color linked here and for women linked here.)

Imagine a situation where you are running late for your flight and decide to use the DigiYatra Scheme to get through the airport formalities quickly before your flight departs. You register for the DigiYatra Scheme online and select your Aadhaar card as the ID against which your face is to be verified at the airport. However, when you reach the registration kiosk, the machine fails to match your face with the photograph in the Aadhaar database. You lose precious time resolving this issue at the registration kiosk, and end up being hassled over the same issue at every checkpoint where DigiYatra facial recognition is needed, which includes the entry point check, entry into the security check, self-bag drop, check-in and aircraft boarding. Ultimately, you end up losing both your flight and your privacy.

This situation could soon become reality. Mumbai’s Chhatrapati Shivaji Maharaj International Airport handled a record of over 91,000 travellers in a single day on October 17, 2021. (Report titled “Passenger traffic at Mumbai airport on Sun highest since Mar 23'20” dated October 21, 2021, published in the Business Standard, linked here.) Even assuming that the facial recognition technology adopted under the Scheme has an inaccuracy rate as low as 2% (which is unlikely, since facial recognition technology is known to be less accurate for people of color, as mentioned above), almost 1,820 passengers a day would not be correctly verified at the Mumbai airport alone, contributing enormously to overall delays. Thus, the Scheme’s claims of increasing convenience may be far-fetched and require an independent, third-party audit, even if limited to efficiency.
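As a rough illustration of how this figure scales, the back-of-the-envelope arithmetic can be sketched as below. The passenger count comes from the Business Standard report cited above; the 2% error rate is the optimistic assumption used in the text, and the higher rates are hypothetical values added here to show how quickly the number of affected passengers grows if accuracy is worse.

```python
# Back-of-the-envelope estimate of daily facial recognition failures at a
# single airport. All figures are illustrative assumptions, not measured
# DigiYatra performance data.

daily_passengers = 91_000  # record single-day traffic at Mumbai airport (Oct 17, 2021)

# Assumed share of passengers whose faces are not correctly verified.
# 2% is the optimistic rate used in the text; 5% and 10% are hypothetical
# higher rates reflecting the documented accuracy drop for some groups.
for error_rate in (0.02, 0.05, 0.10):
    failed = daily_passengers * error_rate
    print(f"error rate {error_rate:.0%}: ~{failed:,.0f} passengers misidentified per day")

# error rate 2%: ~1,820 passengers misidentified per day
# error rate 5%: ~4,550 passengers misidentified per day
# error rate 10%: ~9,100 passengers misidentified per day
```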

But haven’t similar schemes been launched in the United States already?

Yes, a similar scheme has been implemented in the United States by the US Department of Homeland Security (DHS). However, according to a report by Georgetown Law’s Center on Privacy & Technology, the system has multiple legal, technical and privacy problems similar to the ones we have described above (refer to the full report here). Similar legal concerns have also been raised by the Electronic Frontier Foundation, which states that “(w)e cannot overstate how big a change this will be in how the federal government regulates and tracks our movements or the huge impact this will have on privacy and on our constitutional “right to travel” and right to anonymous association with others”. They have also highlighted how the system will end up discriminating against minorities due to technical problems.

“Additionally, these systems are notoriously inaccurate, contain out-of-date information, and due to the fact that immigrants and people of color are disproportionately represented in criminal and immigration databases, and that face recognition systems are less capable of identifying people of color, women, and young people, the weight of these inaccuracies will fall disproportionately on them. It will be the passengers who bear the burden when they are stuck watching the flights they paid for take off without them because there was an error with a database or an algorithm, or because they preferred non-biometric options that weren’t in place.” (Refer to the EFF’s statements here and here)

As a result of these shared concerns, the American Civil Liberties Union ultimately sued the DHS and other related agencies for records related to the US government’s use of facial recognition technology, which the group said could pose “grave risks to privacy”. (Report titled “ACLU sues federal agencies seeking records of facial-recognition use at airports and U.S. border” dated March 12, 2020, published in the Washington Post, linked here.)

In short, the DigiYatra Scheme sounds too good to be true, because it is. If implemented, it will create multiple legal, technical and privacy problems whose brunt the common passenger will have to bear, while failing to deliver any of the promised convenience.

  1. The dangers of DigiYatra & facial recognition enabled paperless air travel, dated January 18, 2022 (link)
  2. DigiYatra Project on the Panoptic website (link)
Published By
Anushka Jain

Policy Counsel, IFF