
The Right of Publicity vs. AI Influencers: Who Owns a Digital Likeness?


In this new era of AI, a new generation of influencers has emerged: smarter, sleeker, algorithm-driven, and entirely virtual. AI-generated influencers and digital personalities are displacing human influencers at an unprecedented rate, raising concerns about the violation of publicity rights and the misuse of AI to mislead consumers. As these digital AI influencers enter the market, an important question arises: who bears responsibility for their actions?

According to a 2022 survey by The Influencer Marketing Factory, more than half of social media users (58%) follow at least one virtual influencer, with YouTube being the most popular platform for them (28.7%).[1] This highlights how virtual personas have already become an integral part of mainstream digital culture. As this influence grows, so does the need for greater responsibility, especially where consumers’ rights and publicity rights are at stake. This article explores publicity rights, their infringement, and the rise of AI influencers, and examines their impact on society at large and the reforms necessary to address these issues.

What are Publicity Rights? 

The right of publicity has been defined by the International Trademark Association[2] as an intellectual property right that protects against the misappropriation of a person’s name, likeness, or other indicia of personal identity, such as a nickname, pseudonym, voice, signature, or photograph, for commercial benefit.

The Hon’ble Indian courts have recognized publicity rights in various judgments, most recently in Anil Kapoor v. Simply Life India & Ors (2023).[3] The Delhi High Court ruled that Anil Kapoor’s publicity rights should be protected against abuse by third parties who had used artificial intelligence to modify his voice, dialogues, and photos without authorisation, diluting his reputation in the market. Similarly, in Titan Industries v. Ramkumar Jewellers (2012),[4] the Delhi High Court held that the unauthorised use of Amitabh and Jaya Bachchan’s names and photos in jewellery advertisements violated their publicity rights.

The right of publicity is rooted in Article 21 of the Constitution of India, which guarantees the right to life and personal liberty and has been read to include the right to privacy. The Supreme Court’s decision in Justice K.S. Puttaswamy v. Union of India (2017)[5] laid the constitutional foundation by interpreting the right to privacy to include a person’s control over their identity and personal data, thereby providing the constitutional bedrock for personality and publicity rights.

Therefore, in this evolving digital landscape, where AI-generated avatars, voice clones, and deepfake technologies are becoming common, the protection of publicity rights is more vital than ever.

The AI Influencers 

Radhika Subramaniam,[6] India’s first Gen Z AI travel influencer, speaks both Tamil and English. Her content mainly focuses on travelling to small villages and telling their local stories, deeply rooted in culture and identity. Audiences enjoy her content because it taps into the current trend of exploring one’s culture and the idea of ‘where we come from’, yet she is never physically present, which raises doubts about authenticity.

Credits: @indiawithradhika | Instagram

This illustrates the drawbacks of AI influencers. Since she is never physically present at the location, her recounting of a story about culture and roots feels artificial. Although her content appears authentic, she has never set foot in these villages. This absence undermines the lived creativity and cultural depth that only physical presence can provide. AI influencers cannot absorb the atmosphere, emotions, or cultural essence of a place; they can only mimic what they have been shown. Nor can they convey the concerns a real traveller faces, such as the safety of women travelling alone, or the discomfort, fear, and joy that make up genuine storytelling, in the way a real-life Gen Z influencer can. The experiences presented by these influencers often differ from reality, making reliance on them risky. AI influencers can serve as inspiration, but they cannot be relied upon entirely.

Another example is the influencer Noonoouri,[7] who has 400,000 followers on Instagram and has starred in fashion campaigns for Dior, Balenciaga, and Valentino. Her content consists of songs she performs and posts on Instagram and YouTube, including a collaboration with American singer ENISA on the track “Up All Nite”. Her voice reportedly combines real vocal samples with AI, raising concerns about authenticity and the dilution of artistic originality.

Credits: @noonoouri | Instagram

Similarly, the AI influencer Shudu Gram,[8] created in April 2017 by British photographer Cameron James Wilson using DAZ 3D and Blender, became the world’s first digital supermodel. Her lifelike appearance, darker skin, distinctive aesthetic, and editorial poses have convinced people that she is real.[9] Within months, Shudu garnered over 100,000 Instagram followers, with influential brands like Fenty Beauty reposting her content. Business Insider recounted its own confusion: “I seriously thought this computer‑generated Instagram model with 100,000 followers was real”.

Credits: @shudu.gram | Instagram

Therefore, if used correctly, AI influencers can become an asset to society; however, they must be handled responsibly, as they can have harmful repercussions if they are not.

Who owns the rights to these AI Influencers? 

AI influencers have become relevant in today’s society, with YouTube hosting the highest share of virtual influencers, at 28.7 per cent, surpassing other platforms.[10] In Thaler v. Perlmutter,[11] a U.S. case, Thaler owned a computer system known as the “Creativity Machine”. The Creativity Machine, an artificial intelligence system, generated a piece of visual art of its own accord titled “A Recent Entrance to Paradise”. Thaler sought to register the copyright for the work with the U.S. Copyright Office, listing the Creativity Machine as the author and explaining that the copyright should transfer to him as the owner of the machine. The court denied authorship to an AI, holding that copyright extends only to works created by human beings. In China, Shenzhen Tencent v. Shanghai Yingxun (2019)[12] held that Tencent’s AI-written Dreamwriter article was copyrightable because it reflected human creative input, but made clear that AI alone cannot hold copyright. In India, Section 2(d)(vi) of the Copyright Act, 1957[13] states that “author” means, “in relation to any literary, dramatic, musical or artistic work which is computer-generated, the person who causes the work to be created”. Thus, AI itself cannot hold copyright: current laws recognize only human creators or AI-assisted works involving human creative input.

The rights to AI influencers currently lie with the operator, the creator, or the third party to which they are sold. Lil Miquela,[14] a virtual influencer, was initially created and owned by the company Brud, which held all associated intellectual property rights, including copyright in her design, trademarks in her name, and related branding elements.

Credits: @lilmiquela | Instagram

In 2022, these rights were acquired by Dapper Labs, demonstrating that the “likeness” of a virtual persona constitutes a bundle of intellectual property assets under the control of its creator or operator, which can be transferred, licensed, or sold like any other commercial asset.

Why do brands choose AI influencers and how can they affect consumers? 

Brands favour AI influencers because they are cost-effective, produce content quickly, and adapt easily to platform algorithms. According to Forbes,[15] human influencers earn around 46 times more than AI influencers, and a Gartner survey found that AI influencers have cut advertising costs by 30 per cent, making them far cheaper to work with. Brands also choose AI influencers for their wide reach, and audiences genuinely enjoy their content. For example, Lil Miquela has a large following, and brands such as Prada and Calvin Klein collaborate with her to increase engagement with their products.

However, this growing reliance on virtual influencers raises significant concerns: they lack transparency, promote unrealistic beauty standards, and can displace human influencers, who are far more transparent. Most importantly, they still lack emotional intelligence and may inadvertently make misleading or harmful claims, which can significantly affect consumers. Audiences may trust them as real people, raising concerns about consumer deception and publicity infringement, and their rise narrows creative opportunities for real influencers. Current rules have not kept pace with these developments, and legal reform is needed to address virtual influencers.

Recent laws for virtual influencers and the need to update outdated laws

Recent AI laws touch on virtual influencers, but they remain insufficient and outdated. There is an urgent need to update AI regulation so that people are not exploited or misled by AI.

The EU Artificial Intelligence Act (Regulation (EU) 2024/1689)[16] aims to create a single, transparent legal framework for AI across Europe. Its purpose is to ensure that AI is human-centric, safe, and trustworthy, protecting health, fundamental rights, democracy, and the environment, while also promoting innovation. Although the Act addresses transparency, accountability, and copyright and training-data obligations, it does not expressly mention virtual influencers, their impact on consumers, or measures to mitigate that impact.

The Advertising Standards Council of India’s 2021 guidelines[17] define virtual influencers as “fictional computer-generated people or avatars who have realistic characteristics, features, and personalities of humans, and behave similarly as influencers”. The guidelines state that any advertisement on their accounts must carry a clear disclosure label (e.g., Ad, Sponsored, Partnership), placed prominently so it cannot be missed. Most importantly, a virtual influencer must also disclose that consumers are not interacting with a real human being, so that audiences are not deceived. A drawback of these ASCI guidelines is that they are self-regulatory, with no strict enforcement, so the requirements are not always followed.

The Indian Influencer Governing Council (IIGC) Code of Standards, April 2025,[18] establishes a guiding framework for influencers, ensuring that their content is legal, honest, transparent, and respectful of societal values. Under the IIGC guidelines, virtual influencers must be clearly identified as AI, make ethical endorsements, and avoid manipulating audiences. They should not resemble real human beings, nor mimic the likeness, voice, or persona of any real individual, and their use must align with national and international guidelines on advertising, AI ethics, and data protection. The IIGC code, too, is self-regulatory and hence not strictly enforceable.

Thus, current laws remain inadequate and must be updated.

Conclusion

AI influencers are reshaping the contours of influence, but they raise serious questions around identity, authenticity, and ownership. While virtual faces can engage audiences and run campaigns, they exist in a legal grey zone. The law must evolve to reconcile digital creativity with real-world rights.

References

[1] Steven Lai, ‘35% of People Have Bought a Product or Service Promoted by a Virtual Influencer’ (ION – Influencer Orchestration Network, 14 April 2022) https://www.ion.co/over-half-of-people-follow-at-least-one-virtual-influencer accessed 21 August 2025.

[2] International Trademark Association, Right of Publicity (INTA, 2025) https://www.inta.org/topics/right-of-publicity/ accessed 21 August 2025

[3] Anil Kapoor v Simply Life India & Ors (Delhi High Court, 30 April 2024).

[4] Titan Industries Ltd v M/s Ramkumar Jewellers (Delhi High Court, 26 April 2012).

[5] Justice K. S. Puttaswamy (Retd) & Anr v Union of India & Ors (Supreme Court of India, 26 September 2018) https://api.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_26-Sep-2018.pdf accessed 21 August 2025

[6] TOI Tech Desk, ‘Who is Radhika Subramaniam? Tamil-English speaking AI influencer who’s redefining travel content in India; the Gen Z solo traveller’ The Times of India (12 June 2025) https://timesofindia.indiatimes.com/technology/social/who-is-radhika-subramaniam-tamil-english-speaking-ai-influencer-whos-redefining-travel-content-in-india-the-gen-z-solo-traveler/articleshow/121797014.cms accessed 21 August 2025.

[7] Noonoouri (VirtualHumans.org) https://www.virtualhumans.org/human/noonoouri accessed 21 August 2025

[8] Shudu Gram, Wikipedia (last modified 21 August 2025) https://en.wikipedia.org/wiki/Shudu_Gram accessed 21 August 2025.

[9] Lauren Michele Jackson, ‘Shudu Gram Is a White Man’s Digital Projection of Real-Life Black Womanhood’ (The New Yorker, 4 May 2018) https://www.newyorker.com/culture/culture-desk/shudu-gram-is-a-white-mans-digital-projection-of-real-life-black-womanhood accessed 21 August 2025.

[10] Steven Lai, ‘35% of People Have Bought a Product or Service Promoted by a Virtual Influencer’ (ION – Influencer Orchestration Network, 14 April 2022) https://www.ion.co/over-half-of-people-follow-at-least-one-virtual-influencer accessed 21 August 2025.

[11] Thaler v Perlmutter (US Court of Appeals for the D.C. Circuit, No 23-5233, decided 18 March 2025) https://media.cadc.uscourts.gov/opinions/docs/2025/03/23-5233.pdf accessed 21 August 2025.

[12] Shenzhen Tencent Computer System Co Ltd v Shanghai Yingxun Technology Co Ltd (Shenzhen Nanshan District People’s Court, (2019) Yue 0305 Min Chu 14010, 24 December 2019) https://www.chinajusticeobserver.com/law/x/2019-yue-0305-min-chu-14010 accessed 21 August 2025.

[13] Copyright Act 1957 (India), s 2(d)(vi).

[14] Miquela (Wikipedia, last modified August 2025) https://en.wikipedia.org/wiki/Miquela accessed 21 August 2025.

[15] Goldie Chan, ‘Human Influencers Still Earn 46× More Than AI Influencers’ (Forbes, 2 July 2024) https://www.forbes.com/sites/goldiechan/2024/07/02/human-influencers-can-still-earn-46x-more-than-ai-influencers/ accessed 21 August 2025.

[16] Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 on artificial intelligence (Artificial Intelligence Act) [2024] OJ L 168/1

[17] Advertising Standards Council of India, Guidelines for Influencer Advertising in Digital Media (effective 14 June 2021) https://www.ascionline.in/influencer-resource/ accessed 21 August 2025.

[18] Indian Influencer Governing Council, Code of Standards for Influencers (April 2025) https://iigc.org/code-of-standards/influencers/ accessed 21 August 2025. 
