Wednesday, 20 December 2017

UK Police Have a Porn-Spotting AI That Gets Confused by Desert Photos

UK police are turning to artificial intelligence to help wage war against the scourge of child pornography, but the system currently has a tricky problem: it has a hard time telling the difference between nudity and photos of deserts.

The Telegraph reports that the Metropolitan Police’s digital forensics department is deploying AI to scan child pornography suspects’ phones and computers so that human police officers are no longer subjected to the psychological trauma that comes with the task.

The department, which had to search through 53,000 devices just last year, hopes to have an AI system capable of doing the job within two or three years.

Although the system is quite good at spotting some subjects in photos, such as drugs, guns, and money, it still struggles to distinguish naked bodies from sand dunes.

“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” says Met digital forensics head Mark Stokes. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color.”
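
That failure mode is easy to picture: skin detection has long been approximated with simple color-range rules, and desert sand falls inside the same region of color space as human skin. Here is a minimal Python sketch, using the well-known Peer et al. RGB heuristic rather than anything the Met has disclosed, with made-up sample pixel values, showing how a naive skin-color test flags sand:

```python
# A minimal sketch, not the Met's system: the classic Peer et al. (2003)
# RGB skin-detection rule, to illustrate why sand tones get flagged.

def looks_like_skin(r: int, g: int, b: int) -> bool:
    """Return True if an RGB pixel falls in a common 'skin tone' color range."""
    return (
        r > 95 and g > 40 and b > 20
        and max(r, g, b) - min(r, g, b) > 15  # enough spread between channels
        and abs(r - g) > 15                   # red clearly separated from green
        and r > g and r > b                   # red is the dominant channel
    )

# Hypothetical sample pixels, chosen for illustration only
samples = {
    "skin tone":   (224, 172, 105),
    "desert sand": (237, 201, 175),
    "sky blue":    (135, 206, 235),
}

for name, (r, g, b) in samples.items():
    print(f"{name:11s} flagged as skin: {looks_like_skin(r, g, b)}")

# Both the skin tone and the desert sand pass the rule; sky blue does not.
# Sand sits squarely inside the color region the heuristic calls "skin".
```

Modern systems use learned classifiers rather than hand-written rules like this one, but the underlying problem is the same: sand and skin occupy overlapping color and texture statistics, so a model trained mostly on one can misfire on the other.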

Image recognition AI has come a long way in recent years; Google's photo-captioning AI, for example, can describe photos with roughly 94% accuracy. But these systems have had notable blunders as well: both Google and Flickr apologized in 2015 after their auto-tagging systems labeled people with darker skin as apes.

(via Telegraph via Gizmodo)


Image credits: Desert photos by Uncoated Photos and TTS_Adliswil



from PetaPixel https://petapixel.com/2017/12/20/uk-police-porn-spotting-ai-gets-confused-desert-photos/
