By: Meghan Chilappa

Recently, the state of California passed legislation to protect its citizens’ privacy.  A few weeks ago, the legislature moved to ban facial recognition technology (FRT) in law enforcement body cameras, making California the largest state to do so.[1]  This legislation could affect not only law enforcement but also the private sector, including technology companies such as Amazon, Microsoft, Accenture, and Axon.[2]  The bill, AB 1215,[3] gained national attention when FRT mistakenly matched twenty percent of California state lawmakers to criminals in its database.[4]  Proponents of FRT, the law enforcement community in particular, argue that the technology reduces crime, promotes effective policing, and enhances public safety during large-scale events such as Coachella or the Rose Bowl.[5]  Opponents argue that the technology turns communities into a surveillance state and will ultimately strain trust between citizens and law enforcement.[6]  Although California has moved to enact this three-year ban on facial recognition technology in law enforcement body cameras, it is ultimately up to Congress to regulate the technology at the federal level.[7] 

In addition, California has enacted the California Consumer Privacy Act (CCPA), which passed the legislature in 2018 and takes effect in January 2020.[8]  The CCPA is a monumental law that heightens consumer protection and, among other things, will give residents the ability to tell companies not to sell or retain their personal information.  While some states choose to focus on “reasonable security requirements” in their privacy laws and place greater emphasis on cybersecurity compliance by businesses, the CCPA is more concerned with California residents’ individual control over their privacy rights.[9]  Given its sweep, the CCPA is often compared to the European Union’s General Data Protection Regulation (GDPR), though it is not quite as stringent.[10]  

California district courts, and the Ninth Circuit, have taken an expansive view of the “digital is different” approach.  In a recent ruling, the Ninth Circuit affirmed a Northern District of California decision certifying a class of Facebook users, recognizing that the plaintiffs suffered a concrete injury when the company used facial-recognition technology to create “face templates” without the users’ consent.[11]  It was the first federal circuit court of appeals to do so.[12]  Most civil cases under the Illinois Biometric Information Privacy Act (BIPA) have not cleared the first hurdle of litigation: establishing that the plaintiffs meet the constitutional standing requirements to be heard before the court.  Other district courts disagree with the Ninth Circuit, and if the divergent views on standing and the concrete-injury requirement become more pronounced, the Supreme Court will have to resolve the issue.[13]  It appears that California is willing to be the battleground for these legal challenges to large technology corporations.

FRT and the CCPA are intrinsically linked to the business and business law communities. Amazon and Microsoft are the two major technology corporations that produce FRT tools, yet even they (Microsoft in particular) agree that the technology needs to be regulated.[14]  Law firms that assist clients with compliance are already gearing up for a potential onslaught of CCPA litigation.[15]  Looking ahead, California may set the standard that Congress looks to when developing federal data privacy and facial recognition legislation.

[1] See Reis Thebault, California Could Become the Largest State to Ban Facial Recognition in Body Cameras, Wash. Post (Sept. 11, 2019) (explaining the legislative history and process behind the FRT bill).

[2] See id. (discussing private sector involvement in FRT, and the relationship between Amazon and law enforcement specifically).

[3] Cal. Penal Code § 832.19 (pending Governor’s signature).

[4] See Thebault, supra note 1 (citing an ACLU finding that “twenty percent of legislators were mismatched to someone who had been arrested.”).

[5] See id.

[6] See id. (noting that communities will distrust law enforcement even more if they know that their faces and movements are being consistently tracked through technology without their consent).

[7] See Matt Binder, Congress Agrees: It’s Time to Regulate Facial Recognition Technology, Mashable (May 22, 2019) (explaining the tone at the federal level related to FRT).

[8] See Jeff John Roberts, Here Comes America’s First Privacy Law: What the CCPA Means for Business and Consumers, Fortune (Sept. 13, 2019) (outlining the key provisions of the CCPA, who it aims to protect, and what types of data protection are regulated).  

[9] See Stuart L. Pardau, The California Consumer Privacy Act: Towards a European-Style Privacy Regime in the United States?, 23 J. Tech. L. & Pol’y 68, 99 (2018) (discussing that the CCPA is more focused on “consumer control”).

[10] See Jon Fielding, Four Differences Between the GDPR and the CCPA, Help Net Sec. (Feb. 4, 2019) (explaining four areas in which the GDPR’s penalties and oversight measures are stricter than the CCPA’s, namely: the businesses that must comply, the financial penalties, consumer rights, and enactment and enforcement).

[11] See Patel v. Facebook, Inc., 932 F.3d 1264, 1273 (9th Cir. 2019) (concluding that “. . . [t]he development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.”); see also Torsten Kracht & Bennett Sooy, Ninth Circuit Facebook Ruling Adds Another Piece to BIPA Standing Chessboard, Bloomberg Law News (Aug. 14, 2019) (signaling that statutory and legislative intent is enough to confer a concrete injury for Article III standing purposes).

[12] Kracht & Sooy, supra note 11.

[13] See Rivera v. Google, Inc., 366 F. Supp. 3d 998, 1010, 1014 (N.D. Ill. 2018) (holding that Google users did not suffer a concrete injury for Article III purposes because, among other reasons, there was no evidence of a substantial risk that their biometric identifiers would be distributed to a third party).

[14] See Brad Smith, Facial Recognition Technology: The Need for Public Regulation and Corporate Responsibility, Microsoft on the Issues (July 13, 2018) (recognizing that human rights, privacy, and freedom of expression are on the line with FRT absent federal legislation).

[15] See Rod Christensen, CCPA Compliance: Preparing for the California Consumer Privacy Act, Corporate Compliance Insights (July 17, 2019) (sharing tips for businesses and areas of compliance that various industries need to start thinking about before January 2020).
