
Facial recognition search engine shows ‘potentially explicit’ photos of children

theintercept.com/2022/07 pimeyes/

Quotation:


The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the Internet, raising an alarming array of possible uses, an Intercept investigation has found. Often called the Google of facial recognition, PimEyes’ search results include images the site labels “potentially explicit,” which could lead to increased exploitation of children at a time when the dark web has unleashed an explosion of abuse imagery. “There are privacy concerns raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the surveillance oversight project at the Electronic Privacy Information Center. “But it’s especially dangerous when you’re talking about children, when someone can use that to identify a child and find them.”

In recent years, several child-victim advocacy groups have lobbied for police use of surveillance technology to combat trafficking, arguing that facial recognition can help authorities locate victims. Thorn, the child abuse prevention nonprofit founded by Ashton Kutcher and Demi Moore, has even developed its own facial recognition tool. But searches on PimEyes for 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily those same tools can be turned against the people they’re supposed to help. …

On its website, PimEyes states that people should only use the tool to search for their own faces, claiming that the service is “not intended for the surveillance of others and is not designed for that purpose.” Yet the company offers subscriptions that allow users to perform dozens of unique searches per day. The cheapest plan, at $29.99 per month, offers 25 daily searches. People who shell out for the premium service can set alerts for up to 500 different images or combinations of images, to be notified whenever a particular face appears on a new site.

PimEyes’ owner, Giorgi Gobronidze, claimed that many of the site’s subscribers are women and girls searching for revenge-porn images of themselves, and that the site allows multiple searches so those users can get more robust results. “With one photo you can get one set of results, and with another photo you can get a totally different set of results, because the index combination is different for each photo,” he said. Sometimes, he added, people find new illicit images of themselves and have to set additional alerts to search for those images. He acknowledged that 500 unique alerts is a lot, though he said that as of Thursday, 97.7% of PimEyes subscribers held lower-tier accounts.

The former owners of PimEyes marketed it as a way to pry into the lives of celebrities, the German digital rights site Netzpolitik reported in 2020. Following criticism, the company pivoted to presenting the search engine as a privacy tool. Gobronidze said such features were being reviewed under his ownership. “Previously, I can say that PimEyes was tailor-made for stalkers, [in that] it used to crawl social media,” he said. “Once you dropped a photo, you could find the social media profiles for everyone. Now it is limited only to public searches.”

But many people clearly don’t see PimEyes as a privacy aid. The site has already been used to identify adults in a wide variety of cases, from so-called sedition hunters working to identify perpetrators after the January 6 insurrection, to users of the notorious site 4chan seeking to harass women. PimEyes’ marketing materials also do not suggest great concern for privacy or ethics.
In a version of the “guns don’t kill people, people kill people” argument favored by the US gun lobby, a blog post on the site gleefully alludes to its many uses: “PimEyes merely provides a tool, and the user is obligated to use the tool responsibly. Everyone can buy a hammer, and everyone can either craft with this tool or kill.”

“These technologies should only be used with the clear and informed consent of users,” said Daly Barnett, a technologist at the Electronic Frontier Foundation. “This is just another example of the larger, global problem with technology, whether it is surveillance-based or not. There is no privacy by default, and users have to opt out of having their privacy compromised.”


Read the whole thing. Viewed from a “this is not a ‘can we do this?’ conversation but a ‘should we do this?’ conversation we need to have” POV, this definitely raises the hairs on the back of my neck.

Cheers

Harry L. Blanchard