
The Privacy Cost Of Digi Yatra’s Seamless Travel Promise

The efficiency introduced by Digi Yatra makes a strong business case, but significantly undermines privacy and autonomy. Ultimately, it is not worth shorter queues and slightly less time spent at airports.

HYDERABAD—On July 1, 2019, Hyderabad’s international airport launched a pilot project, Digi Yatra, that uses facial recognition for passenger check-in, security and boarding. This industry-led project falls under the Ministry of Civil Aviation (MCA), and seeks to facilitate efficient, speedy and secure travel. Face scans at various checkpoints throughout the airport will allow passengers to avoid cumbersome security checks and lengthy queues.

While this sounds like a desirable travel experience, we believe that Digi Yatra’s promise of seamless travel comes at too high a cost. Some of our concerns, both technical and legal, are as follows.

First, facial recognition systems undermine privacy by definition—a passenger who opts in consents to having her face mapped out, reduced to a data point and then stored for future authentication. Each time her face is scanned, the image is matched against the database to confirm her identity, and this could happen multiple times within an airport. She has little or no control over her data, nor is there any way to constrain how it is processed and mined.


The current facial recognition system undergoing trials is designed for a 1:1 matching of an individual’s face with the photo stored in the database. This means that you have to enter your Digi Yatra ID, and then your face is compared against your stored photograph.

The more invasive version of this system is 1:N matching, where one individual’s face is compared to all images in the database, with no ID needed. This will be introduced, according to the Digi Yatra Policy, in a phased manner.
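To make the distinction concrete, the rough Python sketch below contrasts the two modes. It assumes faces have already been reduced to numeric embeddings compared by cosine similarity; the database, function names and threshold are illustrative assumptions, not details of Digi Yatra’s actual system.

```python
import numpy as np

# Toy "database": Digi Yatra IDs mapped to stored face embeddings.
# In a real system the embeddings would come from a deep face-recognition
# model; here they are random vectors purely for illustration.
rng = np.random.default_rng(0)
enrolled_db = {f"DY{i:04d}": rng.normal(size=128) for i in range(1000)}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_1_to_1(digi_yatra_id, live_embedding, threshold=0.6):
    """1:1 matching: the passenger supplies an ID, and the live scan is
    compared against only that one stored photograph."""
    stored = enrolled_db[digi_yatra_id]
    return cosine_similarity(stored, live_embedding) >= threshold

def identify_1_to_n(live_embedding, threshold=0.6):
    """1:N matching: no ID is supplied; the live scan is compared against
    every enrolled face and the best match above the threshold is returned."""
    best_id, best_score = None, -1.0
    for passenger_id, stored in enrolled_db.items():
        score = cosine_similarity(stored, live_embedding)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)
```

The difference in invasiveness is visible in the sketch itself: 1:1 matching touches a single stored record, while 1:N matching sweeps the entire enrolled population on every scan.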

Second, while the Digi Yatra policy mandates adherence to applicable data protection law, India does not currently have a legal framework that could apply. Even though the Srikrishna Committee, appointed to frame the data protection law, submitted its report a full year ago, the government has not yet brought a bill before Parliament.

This essentially grants carte blanche to companies and airport authorities to collect, use, share and store data as they see fit; for instance, the policy states that companies are currently allowed to use the biometric information collected through Digi Yatra for marketing.

The absence of purpose limitation is a particular cause for concern, as datasets containing passengers’ biometric information could be put to use in completely different, and troubling, contexts. While the current proposal requires deletion of biometric data after a passenger’s takeoff, the data inferred while the passenger was in the airport, and the insights derived from it, are not subject to the same level of scrutiny.

Third, privacy concerns exist beyond just one’s movement in the airport. Facial recognition systems require large datasets to train on, and investigative research has found that most systems are trained using pictures from the internet without the consent or knowledge of the individuals whose faces are being tracked, classified and logged.

The sources of this data can be anything from CCTV footage to photos scraped from social media sites such as Facebook, as well as training datasets assembled by companies like IBM. Digi Yatra is particularly dangerous for individual autonomy as it will be backed by Aadhaar and allows for eKYC transactions (there are also plans for rolling out facial recognition for Aadhaar).


Fourth, Digi Yatra is built through a public-private joint venture between the Airports Authority of India and private companies, with the former holding a minority stake. This means that public authorities’ responsibilities for authentication and security are being outsourced to privately developed technology.

Facial recognition is essentially surveillance infrastructure, and it is problematic regardless of who deploys it; it is especially worrying here because secondary uses by private companies can be dangerous, particularly in a regulatory vacuum.

File notings of the Digi Yatra project show that meetings between the government and private companies took place behind closed doors, and that the impact on individuals was left out of key discussions.

Fifth, facial recognition technology is fundamentally inaccurate and riddled with controversy. It has shockingly low accuracy rates around the world, including in India (where accuracy rates are in single digits), the United Kingdom, and the United States. These systems are particularly inaccurate for minorities, exposing them to the risk of being wrongly identified, with potentially severe consequences.

Finally, while Digi Yatra is currently voluntary, we are skeptical that it will stay that way. Voluntary schemes often become mandatory once they acquire enough information and takers. Experience from around the world shows that once facial recognition is rolled out at scale at airports, opting out is not easy to navigate, and is often made intentionally difficult. Even though Digi Yatra is currently voluntary, it may not be long before it becomes mandatory, like its peer biometric systems.

As facial recognition systems are being implemented across the country, it is important to have a deliberate approach to deployment, where consequences and risks are considered before rollout. The efficiency introduced by Digi Yatra makes a strong business case, but significantly undermines privacy and autonomy. Ultimately, it is not worth shorter queues and slightly less time spent at airports.
