Judge Tosses Biometric Data Suit Against X
X's child porn detection system doesn’t violate an Illinois biometric privacy law, the judge ruled.
A federal judge dismissed a lawsuit concerning the software X (formerly Twitter) uses to find illegal porn images. The suit was brought by Mark Martell, who objected to X using Microsoft's PhotoDNA software.
Martell argued that PhotoDNA—which is used across the tech industry to detect and report child porn—required the collection of biometric data and that this collection violated Illinois' Biometric Information Privacy Act (BIPA).
A win for Martell could have imperiled the use of PhotoDNA and similar software by all sorts of tech companies, thwarting tools that have proved useful in fighting sexually explicit images of minors, non-consensual adult pornography (a.k.a. "revenge porn"), terroristic images, and extremely violent content. Tech companies voluntarily employ these tools in a way that seems minimally invasive to the privacy of your average user—no biometric data collection actually required.
So, while "dude loses biometric privacy suit against big tech" may seem on its surface like a sad story, it's probably good news that U.S. District Judge Sunil R. Harjani granted X's motion to dismiss the case.
PhotoDNA 'cannot be used to identify a person'
"An Illinois federal judge has thrown out a proposed class action accusing X Corp. of violating the state's biometric privacy law through its use of software to police pornographic images," Law360 explains. Reading that summary, I'll admit, I was prepared to be outraged about the decision. But upon closer examination, the ruling seems like a good one.
This is not actually a case about X using biometric data, nor about X using it to police sex workers or adult sexuality.
Martell's objections center on the use of hashes, which function as a sort of digital fingerprint for photos. Using PhotoDNA, X assigns hashes to newly uploaded photos and compares them to hashes in an existing database.
Hashes can be a powerful tool for fighting known objectionable content, helping platforms identify it and prevent it from being reposted endlessly. Using a database of known (illegal or objectionable) images that have already been assigned hashes, tech companies can easily compare newly uploaded images against that database and flag any matches for further review.
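For readers curious what that lookup looks like in practice, here is a minimal Python sketch. It is not PhotoDNA, which uses a proprietary perceptual hash designed to survive resizing and recompression; this version uses an ordinary cryptographic hash purely to illustrate the compare-against-a-known-list flow, and the hash list and file paths are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes for known objectionable images.
# In a real system these come from a trusted clearinghouse, and the
# hashes are perceptual (robust to minor edits), not cryptographic.
KNOWN_HASHES = {
    "a3f5c9e1d2b4...placeholder...",
}

def hash_image(path: Path) -> str:
    """Return a hex digest of the raw file bytes (exact-match only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(upload_path: Path) -> bool:
    """True if an uploaded image's hash matches a known bad image."""
    return hash_image(upload_path) in KNOWN_HASHES
```

The key point the sketch illustrates: the platform never needs to identify who is in a photo, only whether the photo matches one already known to be illegal.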
Of course, any tools used to moderate bad stuff must be weighed against their potential to cross privacy lines. If X were using PhotoDNA to run facial recognition on all posted images (or all potentially explicit images), that would be cause for concern.
But it doesn't seem that's what the PhotoDNA software does.
"PhotoDNA is not facial recognition software and cannot be used to identify a person or object in an image," according to Microsoft.
Illinois Biometric Act Explicitly Excludes Photos
In his complaint, Martell argued that the hash creation process necessarily involved facial scans. But Martell "did not allege facts indicating that the hash is a scan of face geometry, as opposed to merely a record of the photo," Harjani wrote.
I might be missing something, but the method employed here doesn't seem particularly worrying from a privacy perspective.
Nor does it fall within the confines of the Illinois BIPA.
BIPA applies when biometric identifiers are used to identify individuals and defines "biometric identifier" as "a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry." The law—which explicitly excludes photos from the definition of biometric identifiers—requires companies to provide written notice and receive explicit consent before collecting "an individual's biometric identifier used to identify an individual."
"Allegations that a photo was scanned are insufficient to plausibly allege that PhotoDNA creates a scan of an individual's face geometry under BIPA," writes Judge Harjani in his decision. "He failed to allege that any type of facial scan occurs during the hash creation process. Without that, there can be no scan of face geometry which could be used to identify an individual, as is required to be considered a biometric identifier under BIPA."
The judge left open the possibility for Martell to file an amended complaint.
More Sex & Tech News
Netflix has announced the first two cities for its gigantic new in-person experience venues, slated to open in 2025.
The new Netflix Houses — to open next year in King of Prussia, Pa. and Dallas — will feature a wide array of shopping, eateries and experiential activities tied…
— Variety (@Variety) June 18, 2024
• D.M. Bennett, publisher of The Truth Seeker, was convicted in 1879 of violating the Comstock Act by publishing an Ezra Heywood critique of marriage. The Comstock Act is a Victorian-era law that prohibits mailing "obscene, lewd, or lascivious" material, and some conservatives want to revive it today to go after abortion pills. The Foundation for Individual Rights and Expression (FIRE) has launched a campaign to get the Biden administration to posthumously pardon Bennett and "also to put the administration on record in opposing the resurgence of this law that threatens the rights of all Americans," writes FIRE lawyer Robert Corn-Revere.
• "What the public needs to understand" about the Supreme Court's mifepristone ruling last week "is that this is a procedural ruling and not a proclamation that the distribution of mifepristone is lawful or that birth control is protected by the Constitution," said University of Colorado Law School professor Aya Gruber (author of the very excellent book The Feminist War on Crime). "Plaintiffs with standing may lodge the same challenge that the [U.S. Food and Drug Administration] lacked authority to approve mifepristone. Additionally, pro-life groups are dusting off an 1800s obscenity law, the 'Comstock Act,' to argue for a national ban on abortion and 'abortifacient' drugs, equipment, and even information. This fight is far from over."
• Law professor Nadine Strossen will be chatting with the Woodhull Freedom Foundation's Ricci Levy about free speech tomorrow at 1 p.m., in a talk you can watch online. (Read my interview with Strossen earlier this year here and here.)
• "California lawmakers are debating an ill-advised bill that would require internet users to show their ID in order to look at sexually explicit content," notes the Electronic Frontier Foundation, which is trying to rally opposition to the measure (Assembly Bill 3080).