
Scanning Style: Biometric Surveillance and Data Protection in Fashion Retail


Fashion retail is no longer confined to fabrics and silhouettes; it includes data-driven personalisation and immersive digital experiences. Augmented reality try-ons, smart mirrors, and body scanning applications are gaining popularity. These services are powered by biometric technologies, which capture and process sensitive biological characteristics such as facial structure, body dimensions, and gait. However, their unregulated use risks infringing individual privacy, especially when consumers are unaware of how their data is processed, stored or shared.

Biometric Data and Its Use in Fashion

Biometric data encompasses physical or behavioural characteristics that can uniquely identify an individual. These may include fingerprints, facial recognition patterns, retina or iris scans, voice recognition, and even gait analysis. With 70% of shoppers reportedly craving personalised experiences, biometrics allow retailers to deliver exactly that, and businesses anticipate that customising the shopping experience will boost customer loyalty and satisfaction. According to a 2023 survey, at least half of consumers are interested in biometrics, and merchants could persuade them to use the technology. In fashion retail, biometric inputs are utilised in several ways.

  1. Smart Mirrors and Augmented Reality Technology: Employed in physical stores to capture real-time images of a customer’s face and body. These systems use advanced sensors to assess body measurements, proportions, and physical contours to suggest garments that fit better and match a customer’s style.
  2. AI-based Virtual Assistants: Leverage facial recognition data to analyse a customer’s facial features and expressions. This information informs style recommendations, colour suggestions, or product pairings tailored to the individual’s aesthetic preferences.
  3. Size Prediction Tools and Virtual Fitting Rooms: Use biometric body measurements to customise clothing sizes, reducing return rates and improving customer satisfaction. They rely on detailed scanning and algorithms to identify the most suitable clothing dimensions for a customer’s unique physique.

Since virtual try-on technology (VTOT) improves the bottom line while facilitating a higher-quality shopping experience, it is not unexpected that major retail companies like Walmart, Adidas, and Prada have adopted various VTOT models. These applications offer convenience, personalisation, and improved customer experiences, but the underlying biometric data collected remains extremely sensitive. Such data is globally classified as sensitive personal data, which requires heightened levels of protection because it can uniquely and irrevocably identify individuals.

Global Regulatory Frameworks

  1. Digital Personal Data Protection Act (DPDP)

The Digital Personal Data Protection (DPDP) Act, 2023, is India’s principal statute governing digital personal data processing. Although it does not explicitly define biometric data, the Act includes biometric information under the umbrella of personal data, since such information relates to an individual who is identifiable by or in relation to it.

Section 6 mandates that personal data must not be processed without obtaining freely given, informed, specific, and unambiguous consent from the data principal. This means that retailers must ensure customers are aware of what biometric data is being collected and for what purpose. The Act enshrines the grounds for processing personal data under Section 4. Consequently, biometric data can only be collected and used for the specific purpose communicated to the individual at the time of consent. Individuals retain the right to withdraw their consent at any time under Section 6(4). This requires retailers to provide accessible options for customers to retract their consent without affecting service quality.

Under Section 8, data fiduciaries (the entities that process personal data) are obligated to adopt reasonable security safeguards to prevent data breaches, leaks, or unauthorised access to biometric data.

However, despite these provisions, the Act lacks detailed guidelines on how biometric data should be processed. It also does not outline specific retention periods, storage protocols, or accountability measures unique to biometric information, leaving significant gaps in enforcement.


  2. IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules

The IT Rules, framed under Section 43A of the Information Technology Act, 2000, provide further legal grounding for the regulation of biometric data in India. These rules specifically classify biometric information as Sensitive Personal Data (SPD). According to these rules, entities must obtain prior written consent before collecting or processing biometric data. Consent must not be implied or embedded in general terms and conditions but must be explicitly given by the individual for each data category.

Entities are also required to publish a clear and accessible privacy policy that explains the types of biometric data collected, the purpose of collection, methods of storage, duration of retention, and contact information for grievance redressal. Furthermore, organisations must implement comprehensive security practices, including encryption, restricted access, and routine audits, to protect biometric data. Non-compliance can result in both civil liability under Section 43A and reputational harm.

Despite existing provisions and rules, there is a lack of clear guidelines on biometric data processing, retention, and storage. Coupled with low awareness among fashion retailers and the absence of a central enforcement authority until the DPDP Act is fully operationalised, significant enforcement gaps remain.

  3. General Data Protection Regulation (GDPR)

The GDPR, enforced since 2018, treats biometric data used for uniquely identifying an individual as special category data under Article 9. The default position is that such data cannot be processed unless specific exceptions apply.

These exceptions include obtaining the explicit consent of the individual, processing necessary for employment or public interest, or for the establishment, exercise, or defence of legal claims. Retailers in the EU must therefore ensure that any biometric data used for personalisation purposes is collected with explicit consent, and that data subjects are informed of their rights, including the right to access, rectify, and erase such data.

Retailers must also adhere to GDPR principles such as data minimisation, where only the minimal necessary data is collected, and storage limitations, which mandate that data be retained only for as long as necessary for the stated purpose.

  4. Illinois’ Biometric Information Privacy Act (BIPA)

Enacted in 2008, the Biometric Information Privacy Act (BIPA) was an early legislative response to the rise of biometric technology in retail. It:

  • Requires companies to implement clear policies and procedures for the secure and transparent use of biometric data.
  • Prohibits the sale or profit from biometric data due to its personal and immutable nature.
  • Covers biometric identifiers such as retina/iris scans, fingerprints, voiceprints, and facial or hand geometry scans.
  • Defines biometric information as any data derived from these identifiers used to identify an individual.

Retailers who gather any biometric data covered by BIPA are required to do the following, among other things:

  • Notify the customer in writing of what data is being collected or stored;
  • Inform the customer why the biometric data is being collected and for how long it will be collected, stored, and used; and
  • Obtain the customer’s written consent before collecting and storing the biometric data.

  5. Texas’s Capture or Use of Biometric Identifier Act (CUBI)

Enacted in 2009, Texas’s Capture or Use of Biometric Identifier Act (CUBI) closely follows Illinois’s BIPA and imposes serious penalties for violations. CUBI regulates the capture, possession, sharing, and retention of biometric identifiers for commercial purposes, including retina or iris scans, fingerprints, voiceprints, and hand or face geometry.

Before collecting such data, businesses must inform the individual and obtain their consent. Once collected, companies are required to protect biometric data using reasonable care, not sell or disclose it, and destroy it within a reasonable period, unless an exception applies.

  6. California Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA)

California does not have a law solely focused on biometric data. Still, the California Consumer Privacy Act (CCPA), amended by the California Privacy Rights Act (CPRA), includes biometric information within its broader framework for protecting personal data.

The CCPA broadly defines biometric information, covering physical traits like DNA, facial recognition data, and vein patterns, as well as behavioural biometrics such as gait patterns, sleep, health, and exercise data, offering broader protection than most other states.

Under the CCPA, businesses collecting biometric data from California residents must notify consumers about the collection and use of their data, provide clear notice at the point of collection, implement reasonable security measures, and respect consumer rights related to accessing and deleting their personal information.

The CCPA also provides consumers with a private right of action, similar to Illinois’s BIPA. However, lawsuits are allowed only in cases of data mishandling or breaches caused by a lack of proper security procedures. Unlike BIPA, the CCPA applies only to businesses meeting specific thresholds, thereby exempting smaller companies from its scope.

Cases

In 2022, Louis Vuitton North America (LVNA) was sued under Illinois’ BIPA for its online VTOT sunglasses program. Consumers alleged violations of Sections 15(a) and 15(b). Although LVNA moved to dismiss the case, the court allowed the 15(b) claim to proceed. The parties reportedly settled in 2023.

In 2022, the U.S. District Court for the Northern District of Illinois ruled that Estée Lauder could face a BIPA lawsuit for using VTOT on one of its subsidiary websites. The court held that the company had purposefully availed itself of the Illinois cosmetics market, creating a substantial connection between the alleged biometric data collection and the state, thus establishing specific jurisdiction. The decision also clarified that plaintiffs need only present a coherent and plausible narrative, not merely recite the statutory language, to allege a BIPA claim sufficiently.

In early 2024, another BIPA class action against Estée Lauder involving VTOT was dismissed. This time, the court held that plaintiffs must show that the biometric data collected could be used to identify individuals, either alone or in combination with other data. This decision introduced a stricter pleading requirement, potentially making it harder for consumers to bring successful BIPA claims against retailers using similar technologies.

In 2023, Amazon was hit with a class action lawsuit under New York City’s biometric information law, which mandates that businesses clearly notify customers before collecting or using their biometric identifier information. The lawsuit claimed that Amazon failed to provide the required notices at its Amazon Go stores, where the “Just Walk Out” technology uses palm scans, computer vision, deep learning, and body measurement tracking to identify customers and track purchases without checkout. Although the case was eventually dismissed, it reflects rising public concern over how companies use unique biometric data without adequate transparency or consent, even in the name of convenience.

Conclusion

Biometric technology presents a powerful tool for revolutionising customer experiences in fashion retail. However, unchecked deployment without legal safeguards endangers individual privacy, risks abuse and invites regulatory backlash. Fashion retailers must transition to a privacy-first approach that respects consumer autonomy. Integrating privacy by design principles, transparent data governance, and ethical AI development can help bridge the gap between innovation and individual rights, paving the way for sustainable, lawful, and responsible fashion commerce.
