By: Halie B. Peacher

By 2025, Augmented Reality (“AR”) is expected to reach $198.17 billion in global revenue.[1]  AR refers to applications and devices that allow users to personalize their real-world experience.[2]  This is done by using an AR device to add computer-generated images or features to the world around the user.[3]  For example, Sephora uses AR to let consumers virtually try on makeup by looking into a mirror, a laptop, or a cell phone.[4]  Similarly, Amazon plans to create a personalized shopping experience via AR changing rooms that allow a consumer to virtually try on clothes.[5]

One of the rising issues with AR devices that use facial recognition technology is the technology’s ability to mine a person’s data, which raises concerns about user privacy rights.[6]  Consumer privacy may be affected when AR technology collects the user’s personal data to create the user’s personal image.[7]  That personal data may then be used for cross-device tracking, re-targeting for advertisement purposes, and combining the user’s online and offline data.[8]  For example, Amazon’s virtual changing room will collect the user’s personal data from all of the user’s social media sites and devices and will then use that data to guess what the consumer wants, when the consumer wants it, and how the consumer wants to view himself or herself.[9]

To protect consumer privacy, regulators must ensure that users consent to the collection and use of their data in either an opt-in or an opt-out format.[10]  In obtaining user consent, businesses must be transparent about what data is collected, how the data is collected, and what the data is used for.[11]  This transparency also protects businesses from liability by ensuring that users acknowledge that the business is using their data and that consumers are at least somewhat informed of the data collection and use policy.[12]

In addition, regulators must ensure that a business does not use a person’s data in a way that could impact intellectual property use, added-value sponsorships, and advertisements.[13]  For example, a targeted ad might pop up on your phone screen or mirror while you virtually try on clothes through Amazon’s virtual “changing room.”[14]  To prevent this, regulators could enact a law that makes it very difficult for Amazon or any other company to track and identify a person’s personal interests for advertising purposes.[15]  This is loosely enforced through the General Data Protection Regulation (“GDPR”) and the California Consumer Privacy Act (“CCPA”), which provide that a business may use a consumer’s data only for business-related purposes.[16]

As AR becomes a bigger part of the way users shop, businesses will undoubtedly face more legal implications.[17]  However, to avoid heavy restrictions that could detract from the use of AR, businesses should put consumers first by implementing policies and best practices that protect consumers’ interest in their data.[18]

[1] See Ian Dyer, Why Augmented Reality Stocks Show Long-Term Growth, Banyan Hill (Feb. 7, 2019).

[2] See Virtual Legality: Virtual Reality and Augmented Reality – Legal Issues, Dentons, 1, 3 (Feb. 2017).

[3] See id.

[4] See Ashley Carman, Sephora’s Latest App Update Lets You Try Virtual Makeup on at Home with AR, The Verge (Mar. 16, 2017, 1:13 PM).

[5] See Steve Bird, Amazon to “Revolutionise” Shopping with ‘Virtual Changing Room’ App, The Telegraph (Jan. 26, 2019, 9:00 PM) (Amazon’s virtual changing room allows a consumer to see how he or she would look in certain clothes by scanning the consumer’s pictures to create a virtual depiction of that user).

[6] See Venable LLP, Are You Prepared for the Legal Issues of Augmented Reality?, Lexology (July 26, 2017); see also Definition of ‘Data Mining’, The Economic Times: Analytics (last visited Feb. 13, 2019) (explaining that data mining is a process “used to extract usable data from a larger set of any raw data” through behavioral analysis that predicts patterns, predictions based on likely outcomes, and creations of decision-based information); see also Dentons, supra note 2, at 1.

[7] See Vejay Lalla et al., Top Legal Issues for Retailers Enhancing Customer Experience with New Formats, Chain Store Age (Mar. 16, 2018); see also Amanda G. Ciccatelli, The Intersection of Fashion, Virtual Reality and the Law, IPWatchdog (Aug. 16, 2017).

[8] See Lalla, supra note 7.

[9] See Bird, supra note 5.

[10] Under the General Data Protection Regulation (“GDPR”), businesses must give consumers the right to opt in and to access and delete their collected data. See generally 2016 O.J. (L 119) 1.

[11] See Kris Lahiri, What Is General Data Protection Regulation?, Forbes (Feb. 14, 2018, 1:21 PM).

[12] See generally Kevin Granville, Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens, New York Times (Mar. 18, 2018), technology/facebook-cambridge-analytica-explained.html (explaining that investigations and fines might be proper when a company fails to inform users of the use of the data and that the company gave a third party access to the data).

[13] See Dentons, supra note 2 (explaining that intellectual property use may be impacted when a business asserts rights over a product such as a shirt with a particular logo or the user’s virtual self).

[14] See Bird, supra note 5.

[15] See id.; see also Dacia Green, Big Brother Is Listening to You: Digital Eavesdropping in the Advertising Industry, 16 Duke L. & Tech. Rev. 352, 383-91 (“A statute similar to the BROWSER Act, which allows consumers to opt-in to the collection of sensitive information and to opt-out of the collection of non-sensitive information, could be an effective solution”).

[16] See 2016 O.J. (L 119) 1; California Consumer Privacy Act, Cal. Civ. Code §§ 1798.100-1798.198 (2018).

[17] See Augmented and Virtual Reality: Emerging Legal Implications of the “Final Platform,” Reed Smith LLP, 1, 1-7 (2017).

[18] See id.
