Privacy-minded consumers could soon spur demand for camouflage clothing, not merely as a fashion statement, but as a matter of practicality.
A group of tech-savvy artists is preparing to unveil a line of patterned garments this month specially designed to confuse facial-recognition software. The international collaboration builds on past projects that have used asymmetric hair-and-makeup styles for the same purpose: to give back to individuals a shred of the anonymity they have lost to mass surveillance by governments and businesses alike.
Berlin-based artist and technologist Adam Harvey says his ideas, though lighthearted on their surface, have drawn attention to important topics and drawn the attention of important people.
“These projects, I approach them in a playful way, but they also touch on some very serious issues of national security and surveillance,” Mr. Harvey told an audience last month during a half-hour presentation at the Chaos Communication Congress in Hamburg. “What I can’t predict is who will find these threatening or interesting.”
American officials have taken notice, Harvey said, showing the audience that the US Air Force General Counsel had tweeted about his past work and that a redacted “three-letter agency” had sought permission to republish some of his imagery in “an internal publication.”
Facial recognition software is used for military and law enforcement purposes, but it has also become common among private businesses. Facebook scans photos and prompts users to tag their friends, and Amazon analyzes the faces of those who enter its physical stores, as The Guardian reported.
The purported commercial goal, tailoring ads to individual users, may seem innocent enough, but Facebook and other technology firms have faced lawsuits challenging their practices. Since the massive social network can recognize people’s faces 97.25 percent of the time – more accurate than the FBI’s similar technology and just 0.25 percentage points shy of human performance – privacy advocates worry Facebook could be providing biometric data to law enforcement without proper consent, as The Christian Science Monitor reported last year.
These data have already been used for law enforcement, with problematic results in the United States, where police keep images of 117 million adults as part of facial recognition programs. After a year-long study by Georgetown Law’s Center on Privacy and Technology raised serious questions about racial bias, saying black Americans were disproportionately targeted in criminal investigations, more than 40 groups petitioned the Department of Justice last fall to investigate the technology.
“These algorithms don’t see race in the same way you and I do, but that doesn’t mean they’re not racist,” Jonathan Frankle, a PhD student at the Massachusetts Institute of Technology who worked on the study, told the Monitor at the time.
Officials have sought out this facial recognition technology not only for domestic law enforcement but also for anti-terrorism efforts. A contractor for the US Department of Homeland Security hired Israeli firm Faception, which claims to combine machine learning with facial recognition to “identify everything from great poker players to extroverts, pedophiles, geniuses, and white collar criminals,” as Evan Selinger and Woodrow Hartzog wrote in an opinion piece for the Monitor last year.
“That’s a problem,” Dr. Selinger and Dr. Hartzog wrote. “The government should not use people’s faces as a way of tagging them with life-altering labels. The technology isn’t even accurate. Faception’s own estimate for certain traits is a 20 percent error rate. Even if those optimistic numbers hold, that means for every 100 people, the best-case scenario is that 20 get wrongly branded as a terrorist.”
For that reason, Harvey ranked Faception among the most pernicious facial-recognition services available.
“What all this reminds me of is Francis Galton and eugenics. And who the real criminal in these cases would be is people who are perpetrating this idea, not the people who are being looked at,” Harvey said to applause.
To offer individuals a modest shield against these prying eyes, Harvey is collaborating with artists Ashley Baccus-Clark, Carmen Aguilar y Wedge, Ece Tankal, and Nitzan Bartov on Hyphen-Labs’ NeuroSpeculative AfroFeminism project. Although the final designs are still being developed, a textile print is scheduled to launch Jan. 16 at the Sundance Film Festival, as VICE’s The Creators Project reported.
Much like his silver-lined anti-thermal-imaging hijab and burqa prototypes, the anti-facial-recognition textiles are not designed to make a wearer seem to disappear, Harvey said.
“I think camouflage is often misunderstood as a Harry Potter invisibility cloak, when camouflage actually is about optimizing the way that you appear and reducing visibility,” he said, “moving matter between different parts of the electromagnetic spectrum, possibly just even for a brief moment to evade observation.
“So achieving 100 percent of course would be great, but I don’t think that should be a requirement for the way we think about camouflage.”
New York Times
Picture source: Alamy
Picture used for illustrative purposes only