By: Brad Espinosa


Google’s automated flagging system came under scrutiny in a landmark digital privacy rights case decided by the Ninth Circuit on September 21, 2021. In United States v. Wilson, the Ninth Circuit Court of Appeals held that law enforcement officials need a warrant to access digital information such as email attachments, even when a third party’s automated system reported the attachments as illegal.[1] Federal law imposes a duty on providers to report facts or circumstances indicating apparent child sexual exploitation offenses.[2] In keeping with that law, Google deployed an automated system to scan its servers for child sexual abuse material (hereinafter “CSAM”).[3] As a result, Google reported to the National Center for Missing and Exploited Children (hereinafter “NCMEC”) that “Wilson had uploaded four images of apparent child pornography to his email account as email attachments.”[4] No human at Google or NCMEC reviewed the content.[5] NCMEC then forwarded the images to the San Diego Internet Crimes Against Children Task Force (ICAC), where an officer viewed the email attachments and applied for warrants for both Wilson’s email account and his house, describing the attachments in detail in the warrant application.[6]

Wilson argued that the government’s warrantless search of his email attachments violated the Fourth Amendment. Law enforcement officials countered that the search was permitted under the “private search doctrine,” which provides that when a private party conducts a search and turns the resulting information over to the government, a government search of that “same information does not implicate the Fourth Amendment.”[7] The Ninth Circuit controversially held that the search was not justified under the Fourth Amendment, breaking with decisions of the Fifth and Sixth Circuits.[8]

Circuit courts are split on whether law enforcement officers need a search warrant to view images flagged by the automated crime-detection programs that large tech companies have put in place.[9] Previously, law enforcement officials were permitted to view, without a warrant, images that service providers had flagged as CSAM, because courts held that the officers were only “confirming” the accuracy of the flags.[10] The Fifth and Sixth Circuits held that the private search exception justified the government’s warrantless search because the government agent’s viewing of the images was akin to “opening the file merely to confirm that the flagged file was indeed child pornography as suspected.”[11] In its decision, the Ninth Circuit made clear that if law enforcement wanted to see the images, it would need a warrant.[12] Because neither Google nor NCMEC had viewed the images, and the police were the first to actually see them, the court concluded that the government’s actions did not fall within the private search exception.[13] The decision adds further confusion to how courts apply the Fourth Amendment and the private search exception to government searches of images flagged by private-sector companies.[14]

In the wake of these decisions, companies are reexamining their automated detection systems while balancing privacy concerns.[15] Changes may include the insertion of human review and closer scrutiny of automated detection systems, lest users come to believe that the companies deploying them put individual privacy under threat. Apple, for example, recently delayed plans for a system to scan user images for signs of child sexual abuse material after backlash from privacy advocates.[16] Apple intends to add a layer of human review to verify images flagged as explicit before a user is reported to law enforcement.[17] As the circuit courts split over what law enforcement can do and see, companies must now tread carefully in seeking to protect children while maintaining user privacy.

[1] See United States v. Wilson, No. 18-50440, 2021 WL 4270847, at *1 (9th Cir. Sept. 21, 2021).

[2] See generally 18 U.S.C. § 2258A.

[3] See Jennifer Lynch, In U.S. v Wilson, the Ninth Circuit Reaffirms Fourth Amendment Protection for Electronic Communications, Electronic Frontier Foundation, (Sept. 28, 2021),

[4] Wilson, 2021 WL 4270847 at *3.

[5] Id.

[6] Id.

[7] Adam A. Bereston, The Private Search Doctrine and the Evolution of Fourth Amendment Jurisprudence in the Face of New Technology: A Broad or Narrow Exception?, 66 Cath. U. L. Rev. 445, 446 (2017) (citing United States v. Jacobsen, 466 U.S. 109, 117 (1984)).

[8] See United States v. Miller, 982 F.3d 412, 427 (6th Cir. 2020); see also United States v. Reddick, 900 F.3d 636 (5th Cir. 2018).

[9] Andrea Vittorio, Google-Flagged Child Porn Case Shows Court Split on Privacy, Bloomberg L. (Sept. 24, 2021, 5:01 AM),

[10] Id.

[11] See Miller, 982 F.3d at 427; see also Reddick, 900 F.3d 636.

[12] See generally United States v. Wilson, No. 18-50440, 2021 WL 4270847 (9th Cir. Sept. 21, 2021).

[13] Id. at *20–22.

[14] Vittorio, supra note 9.

[15] Id.

[16] Id.

[17] Id.
