Artificial intelligence will take over from British police in the scanning of child abuse images to save officers from “trauma,” according to a report.
In its report, the Telegraph claimed AI “will take on the gruelling task of scanning for images of child abuse on suspects’ phones and computers so that police officers are no longer subjected to psychological trauma within ‘two to three years.’”
“The Metropolitan Police’s digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, already uses image recognition software but it is not sophisticated enough to spot indecent images and video,” the Telegraph explained, adding, “The force is currently drawing up an ambitious plan to move its sensitive data to cloud providers such as Amazon Web Services, Google or Microsoft.”
In a comment to the Telegraph, Mark Stokes, the Metropolitan Police Service’s head of digital and electronics forensics, claimed, “We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans.”
“You can imagine that doing that for year-on-year is very disturbing,” he said, adding, however, that the software the force currently uses frequently proves problematic.
“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said Stokes. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.”
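The failure Stokes describes is easy to reproduce with the simplest kind of colour-based filtering. The sketch below is purely illustrative, not the Met’s actual software: it flags an image when enough pixels fall in a rough skin-tone RGB range, and the function names, thresholds, and sample pixel values are all invented for this example. Sandy desert tones sit inside the same colour range as skin, so the heuristic misfires exactly as described.

```python
# Illustrative sketch of naive skin-tone detection (hypothetical, not the
# Met's system): flag an image when enough pixels match a crude RGB range.

def looks_like_skin(pixel):
    """Crude RGB range check for skin-like tones (illustrative heuristic)."""
    r, g, b = pixel
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

def skin_ratio(pixels):
    """Fraction of pixels that match the skin-tone heuristic."""
    return sum(looks_like_skin(p) for p in pixels) / len(pixels)

def flag_image(pixels, threshold=0.3):
    """Flag an image as 'possible skin content' above a pixel-ratio threshold."""
    return skin_ratio(pixels) > threshold

# A desert screensaver is dominated by sandy tones such as (210, 180, 140),
# which fall inside the same RGB range as skin -- so the filter misfires.
desert = [(210, 180, 140)] * 80 + [(120, 150, 200)] * 20  # sand plus sky
print(flag_image(desert))  # the desert photo is wrongly flagged
```

This is why Stokes distinguishes such software from the machine-learning systems the force hopes to adopt: a learned model can use texture and shape rather than raw colour alone.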
The Metropolitan Police is not the only service increasingly relying on artificial intelligence, with Facebook using AI to scan users’ posts for suicide warning signs, terrorism-related content, and revenge porn.
In September, it was also reported that artificial intelligence can predict whether your relationship will last, while this month it was revealed that realistic-looking fake celebrity porn videos were being produced using AI.