United States v. Wilson

United States District Court, S.D. California

June 26, 2017



         Before the Court is Defendant Luke Noel Wilson's (“Defendant's” or “Wilson's”) Motion to Suppress Evidence as a Result of an Illegal Search. (Dkt. No. 57.) The motion has been fully briefed. (Dkt. Nos. 62, 65.) The Court conducted an evidentiary hearing and took the matter under submission on May 18, 2017. (Dkt. No. 67.) Upon consideration of the moving papers, applicable law, and argument of counsel, and for the reasons set forth below, the Court DENIES Defendant's motion to suppress.


         A. Factual Background

         1. Google, Inc. (“Google”) Has a Statutory Duty to Report Known Child Pornography Violations.

         Google is required by law to report known child pornography violations to the CyberTipline of the National Center for Missing and Exploited Children (“NCMEC”). 18 U.S.C. § 2258A(a) mandates that Internet service providers (“ISPs”) that “obtain[] actual knowledge of any facts or circumstances” evincing “apparent” child pornography violations must submit, “as soon as reasonably possible,” reports to the CyberTipline.[1] 18 U.S.C. § 2258A(a). An ISP may include in the report information about the identity and geographic location of the individual involved; historical reference information regarding the uploading, transmittal, or receipt of the apparent child pornography, or regarding the circumstances of the ISP's discovery of the apparent child pornography; any image of apparent child pornography relating to the incident in the report; as well as “[t]he complete communication containing any image of apparent child pornography.” Id. § 2258A(b). ISPs that “knowingly and willfully” fail to make a report to the CyberTipline face financial sanctions. See id. § 2258A(e). The statute requires NCMEC to forward each report it receives to federal law enforcement agencies and permits NCMEC to forward the reports to state and local law enforcement. See id. § 2258A(c).

         2. Google Proactively Screens for Child Pornography to Further its Private Business Interests.

         To further its private business interests, Google takes proactive measures beyond what is statutorily mandated by 18 U.S.C. § 2258A to screen for, report, and remove child pornography from its products and services. (See Dkt. No. 62-2, Declaration of Cathy A. McGoff (“McGoff Decl.”).)

Google has a strong business interest in enforcing our terms of service and ensuring that our products are free of illegal content, and in particular, child sexual abuse material. We independently and voluntarily take steps to monitor and safeguard our platform. . . . Ridding our products and services of child abuse images is critically important to protecting our users, our product, our brand, and our business interests.

(Id. ¶ 3.)

         Google identifies and removes child pornography by employing a process that involves both visual inspection by trained employees and technological screening by Google's proprietary “hashing” technology. Google has been using its own hashing technology to identify child pornography since 2008. (Id. ¶ 4.) The process is as follows. First, Google trains a team of employees on Google's statutory duty to report apparent child pornography. (Id. ¶ 6.) This team is further “trained by counsel on the federal statutory definition of child pornography and how to recognize it on [Google's] products and services.” (Id.)

         Second, offending images are catalogued and assigned “hash values,” which are often described as “digital fingerprints.”[2] Specifically,

Each offending image, after it is viewed by at least one Google employee, is given a digital fingerprint (“hash”) that our computers can automatically recognize and is added to our repository of hashes of apparent child pornography as defined in 18 USC § 2256. Comparing these hashes to hashes of content uploaded to our services allows us to identify duplicate images of apparent child pornography to prevent them from continuing to circulate on our products.

(Id. ¶ 4 (emphasis added).)[3]

         Third, Google's system searches its products and services for hash values that match hash values in its repository of known child pornography images.

When Google's product abuse detection system encounters a hash that matches a hash of a known child sexual abuse image, in some cases Google automatically reports the user to NCMEC without re-reviewing the image. In other cases, Google undertakes a manual, human review, to confirm that the image contains apparent child pornography before reporting it to NCMEC.

(Id. ¶ 7.) Finally, Google provides a CyberTipline Report to NCMEC. (Id. ¶ 8.) As a result of this multi-tiered process, Google's proprietary hashing technology “tag[s] confirmed child sexual abuse images” that are “duplicate images of apparent child pornography” previously identified by at least one trained Google employee. (Id. ¶ 4 (emphasis added).)
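The matching step described in the record — computing a fingerprint of uploaded content and checking it against a repository of fingerprints of previously confirmed images — can be sketched in outline as follows. Google's hashing technology is proprietary (and, unlike this sketch, may tolerate minor alterations to an image); this illustration substitutes an exact-match SHA-256 digest and hypothetical byte strings purely to show the comparison logic.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the file's 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

# Repository of hashes of images previously confirmed by human review
# (illustrative placeholder content, not actual data).
known_hashes = {fingerprint(b"previously-confirmed-image-bytes")}

def matches_known_image(upload: bytes) -> bool:
    """Compare an upload's hash against the repository; a match
    identifies the upload as a duplicate of a confirmed image."""
    return fingerprint(upload) in known_hashes
```

On this model, a hash match identifies only a byte-for-byte duplicate of a file a trained employee already viewed, which is why the declaration describes matched files as “duplicate images” of previously confirmed content.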

         3. Defendant Agreed to Google's Terms of Service and Created a Google Email Account.

         On March 13, 2014, Defendant created a Google email account with the username (Dkt. No. 62-1 at 2, Ex. 1.) Defendant agreed to Google's November 11, 2013 Terms of Service upon creation of the account. (Dkt. No. 62 at 4.) On April 14, 2014, Google modified its Terms of Service. (Dkt. No. 62-2 at 5-7, McGoff Decl. Ex. A.)[4] The April 14, 2014 Terms of Service contained the following provisions, in relevant part.

         Google instructed users: “You may use our Services only as permitted by law,” and “[w]e may suspend or stop providing our Services to you if you do not comply with our terms or policies or if we are investigating suspected misconduct.” (Dkt. No. 62-2 at 5, McGoff Decl. Ex. A.)

Regarding user content, Google stated,
We may review content to determine whether it is illegal or violates our policies, and we may remove or refuse to display content that we reasonably believe violates our policies or the law. But that does not necessarily mean that we review content, so please don't assume that we do.


         Google notified users, “Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored.” (Id.)

Finally, Google reminded users that Google
may modify these terms or any additional terms that apply to a Service to, for example, reflect changes to the law or changes to our Services. You should look at the terms regularly. We'll post notice of modifications to these terms on this page. . . . If you do not agree to the modified terms for a Service, you should discontinue your use of that Service.


         4. Google Identified Four Confirmed Child Pornography Images in Defendant's Email and Provided a CyberTipline Report to NCMEC.

         On June 4, 2015, Google, by way of its proprietary hashing technology, became aware that Defendant uploaded four image files depicting child pornography to an email in his Google account. (Dkt. No. 62-3 at 4, 11-13, Ex. 3.) Google complied with its legal obligation under 18 U.S.C. § 2258A and provided a CyberTipline Report to NCMEC. (Id.) Defendant's email account was terminated on June 4, 2015. (Dkt. No. 62-2, McGoff Decl. ¶ 9; Dkt. No. 62-1 at 2.)

         On June 5, 2015, NCMEC received CyberTipline Report # 5074778 from Google. (Dkt. No. 62-3, Ex. 3.) The report included information about the date and time Defendant uploaded the four child pornography images, the email address and recent login information associated with the account (including logins from a device possessing Internet protocol (“IP”) address on June 4, 2015 at 15:07:19 UTC and on May 9, 2015 at 15:48:04 UTC), and the secondary email address associated with the account, (Id.) The report also included the four image files, each of which Google classified as “A1” in accordance with the industry classification standard. (Id.; see also Dkt. No. 62-2, McGoff Decl. ¶¶ 9-11.) “A1,” in short, indicates that the file content contains a depiction of a prepubescent minor engaged in a sex act. (Id.)

         Specifically, “A” signifies “Prepubescent Minor,” whereas “B” signifies “Pubescent Minor.” (Dkt. No. 62-3 at 14, Ex. 3.) “1” denotes “Sex Act,” defined as: “Any image of sexually explicit conduct (actual or simulated sexual intercourse including genital-genital, oral-genital, anal-genital, or oral-anal whether between person of the same or opposite sex), bestiality, masturbation, sadistic or masochistic abuse, degradation, or any such depiction that lacks serious literary, artistic, political, or scientific value.” (Id.) “2” denotes “Lascivious Exhibition,” defined as: “Any image depicting nudity and one or more of: restraint, sexually suggestive poses, focus on genitals, inappropriate touching, adult arousal, spreading of limbs or genitals, and such depiction lacks serious literary, artistic, political, or scientific value.” (Id.)

         Google did not forward the email itself to NCMEC. The report did not include any email body text or header information associated with the reported offending content. (Dkt. No. 62-2, McGoff Decl. ¶¶ 9-11.) The report indicated that a Google employee did not manually review the images after Google's proprietary hashing technology tagged the images as apparent child pornography.[5] (Id.)

         5. NCMEC Forwarded the CyberTipline Report to the San Diego Internet Crimes Against Children (“ICAC”) Task Force Program.

         On or about June 17, 2015, NCMEC forwarded the CyberTipline Report to the San Diego ICAC Task Force Program. (Dkt. No. 62-3 at 17, Ex. 3.) NCMEC forwarded the information supplied by Google and the four image files to ICAC. (Id.) NCMEC did not forward the email itself. (Id.) NCMEC clarified, “Please be advised that NCMEC has not opened or viewed any uploaded files submitted with this report and has no information concerning the content of the uploaded files other than information provided in the report by the ESP.” (Id.)

         6. Homeland Security Investigations (“HSI”) Special Agent (“SA”) William Thompson Reviewed the CyberTipline Report, Visually Examined the Four Image Files, and Confirmed the Four Images Depict Child Pornography.

         The San Diego ICAC office printed the report it received from NCMEC and the four attached image files. The printed report and images were given to SA Thompson. SA Thompson's review was limited to the contents of the CyberTipline Report and the four image files. He did not view or have access to Defendant's email at this time.

         SA Thompson visually examined the four images and confirmed that they depict child pornography. Each of the images depicts a prepubescent minor engaged in a sex act, in line with Google's classification of the images as “A1” content. SA Thompson described the four images as follows.

1. 140005125216.jpg - This image depicts a young nude girl, approximately five (5) to nine (9) years of age, who is lying on her stomach with her face in the nude genital region of an older female who is seated with her legs spread. A second young girl, approximately five (5) to nine (9) years of age, is also visible in this image and she is partially nude with her ...
