Chinese scientists have developed an artificial intelligence (A.I.) system that can examine “evidence” and charge people with “crimes,” as defined by China’s totalitarian parody of a Western justice system.
The developers claim the A.I. prosecutor is over 97 percent accurate when it files charges.
The South China Morning Post (SCMP) on Sunday reported the system was “built and tested by the Shanghai Pudong People’s Procuratorate, the country’s largest and busiest district prosecution office.”
The project was managed by the “big data and knowledge management laboratory” at the Chinese Academy of Sciences, which predicted the robo-prosecutor would reduce the workload of human prosecutors by handling routine cases, allowing them to “focus on more difficult tasks.”
Chinese prosecutors already employ a data processing tool called “System 206” that helps them “evaluate the strength of evidence, conditions for an arrest and how dangerous a suspect is considered to be to the public,” SCMP noted.
According to Chinese Academy of Sciences development team leader Shi Yong, the new A.I. system can go a step further by making decisions based on its data and filing charges automatically, without human intervention:
The AI prosecutor developed by Shi’s team could run on a desktop computer. For each suspect, it would press a charge based on 1,000 “traits” obtained from the human-generated case description text, most of which are too small or abstract to make sense to humans. System 206 would then assess the evidence.
The machine was “trained” using more than 17,000 cases from 2015 to 2020. So far, it can identify and press charges for Shanghai’s eight most common crimes.
They are credit card fraud, running a gambling operation, dangerous driving, intentional injury, obstructing official duties, theft, fraud and “picking quarrels and provoking trouble” – a catch-all charge often used to stifle dissent.
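The SCMP report gives no technical detail beyond the 1,000 text-derived "traits" and the 17,000-case training corpus, so the actual design is unknown. Purely as an illustration of the general task — predicting a charge category from case-description text — the toy sketch below uses a bag-of-words naive-Bayes classifier with invented case snippets and labels. It bears no relation to the real Pudong system; every string in it is made up.

```python
from collections import Counter, defaultdict
import math

# Toy training data -- entirely invented. The real system reportedly
# drew on 1,000 "traits" extracted from more than 17,000 case files.
TRAIN = [
    ("suspect used stolen credit card at store", "credit card fraud"),
    ("counterfeit card used to withdraw cash", "credit card fraud"),
    ("suspect drove at high speed while intoxicated", "dangerous driving"),
    ("drunk driver ran red light", "dangerous driving"),
    ("suspect took phone from victims bag", "theft"),
    ("shoplifting of goods from market stall", "theft"),
]

def train(examples):
    """Fit a multinomial naive-Bayes model: per-class word counts and priors."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    vocab = set()
    for text, label in examples:
        words = text.split()
        word_counts[label].update(words)
        class_counts[label] += 1
        vocab.update(words)
    return word_counts, class_counts, vocab

def predict(text, word_counts, class_counts, vocab):
    """Return the charge category with the highest log-probability."""
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label, n in class_counts.items():
        score = math.log(n / total)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            # Laplace smoothing so unseen words don't zero out the class
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

model = train(TRAIN)
print(predict("suspect used a stolen card", *model))  # → credit card fraud
```

A real system at this scale would of course use far richer features and a far larger model; the point of the sketch is only that "press a charge based on traits obtained from the case description text" is, at bottom, a text-classification problem — which is also why the anonymous prosecutor's question about who owns a misclassification is so pointed.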
It does not exactly take state-of-the-art processing power to charge hapless Chinese dissidents with “picking quarrels and provoking trouble,” a “crime” that requires very little in the way of evidence and almost nothing that would be recognized as due process by citizens of the free world. It is not difficult to see how an A.I. system could go overboard charging political dissidents with “obstructing official duties.”
The robo-prosecutor seems like the next logical step in China’s tyrannical “social credit system,” which is already capable of detecting dissident behavior and punishing citizens in various ways with minimal human oversight.
While Shi burbled happily about the A.I. system growing even more powerful and auto-filing charges for ever more complex offenses, an unnamed Chinese prosecutor told the SCMP he was worried about prosecutors losing what little autonomy and responsibility they have.
“The accuracy of 97 per cent may be high from a technological point of view, but there will always be a chance of a mistake. Who will take responsibility when it happens? The prosecutor, the machine or the designer of the algorithm?” the anonymous critic asked.
“A.I. may help detect a mistake, but it cannot replace humans in making a decision,” he insisted.