China Develops A.I. ‘Prosecutor’ Capable of Evaluating Crimes and Filing Charges

A man wearing a face mask looks at a robot at the China National Convention Centre. (WANG ZHAO/AFP via Getty Images)

Chinese scientists have developed an artificial intelligence (A.I.) system that can examine “evidence” and charge people with “crimes,” as defined by China’s totalitarian parody of a Western justice system.

The developers claim the A.I. prosecutor is over 97 percent accurate when it files charges.

The South China Morning Post (SCMP) on Sunday reported the system was “built and tested by the Shanghai Pudong People’s Procuratorate, the country’s largest and busiest district prosecution office.”

The project was managed by the “big data and knowledge management laboratory” at the Chinese Academy of Sciences, which predicted the robo-prosecutor would reduce the workload of human prosecutors by handling routine cases, allowing them to “focus on more difficult tasks.”

Chinese prosecutors already employ a data processing tool called “System 206” that helps them “evaluate the strength of evidence, conditions for an arrest and how dangerous a suspect is considered to be to the public,” SCMP noted.

Meishan Intermediate People’s Court (Photo by China Photos/Getty Images)

According to Chinese Academy of Sciences development team leader Shi Yong, the new A.I. system can go a step further by making decisions based on its data and filing charges automatically, without human intervention:

The AI prosecutor developed by Shi’s team could run on a desktop computer. For each suspect, it would press a charge based on 1,000 “traits” obtained from the human-generated case description text, most of which are too small or abstract to make sense to humans. System 206 would then assess the evidence.

The machine was “trained” using more than 17,000 cases from 2015 to 2020. So far, it can identify and press charges for Shanghai’s eight most common crimes.

They are credit card fraud, running a gambling operation, dangerous driving, intentional injury, obstructing official duties, theft, fraud and “picking quarrels and provoking trouble” – a catch-all charge often used to stifle dissent.
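
The SCMP report does not describe the underlying model, but the workflow it outlines (extract roughly 1,000 textual “traits” from a human-written case description, then map them to one of eight charge labels) has the shape of a conventional supervised text classifier. The sketch below is a minimal, hypothetical illustration of that shape; the 1,000-feature cap and the list of charges come from the report, while the library choices, function names, and training data are assumptions, not a description of the actual system.

```python
# Hypothetical sketch of a text-to-charge classifier of the kind described above.
# Only the 1,000-feature cap and the eight charge labels come from the SCMP report;
# everything else (libraries, model choice, data format) is assumed for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

CHARGES = [
    "credit card fraud",
    "running a gambling operation",
    "dangerous driving",
    "intentional injury",
    "obstructing official duties",
    "theft",
    "fraud",
    "picking quarrels and provoking trouble",
]

def train_charge_classifier(case_texts, charge_labels):
    """Fit a classifier on labelled historical cases.

    case_texts: human-written case descriptions.
    charge_labels: the charge filed in each case (one of CHARGES).
    A real corpus (the report cites more than 17,000 cases from 2015 to 2020)
    would be needed; none is included here.
    """
    model = Pipeline([
        # Cap the vocabulary at 1,000 features, echoing the 1,000 "traits" in the report.
        ("traits", TfidfVectorizer(max_features=1000)),
        ("classifier", LogisticRegression(max_iter=1000)),
    ])
    model.fit(case_texts, charge_labels)
    return model

def press_charge(model, case_description: str) -> str:
    # Return the single most likely charge for a new case description.
    return model.predict([case_description])[0]
```

In a pipeline of this shape, the model is fit once on labelled historical cases and then assigns a charge to each new description; per the report, the real system would then hand its output to System 206 for evidence assessment.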

It does not exactly take state-of-the-art processing power to charge hapless Chinese dissidents with “picking quarrels and provoking trouble,” a “crime” that requires very little in the way of evidence and almost nothing that would be recognized as due process by citizens of the free world. It is not difficult to see how an A.I. system could go overboard charging political dissidents with “obstructing official duties.”

The robo-prosecutor seems like the next logical step in China’s tyrannical “social credit system,” already capable of detecting dissident behavior and punishing citizens in various ways with minimal human oversight.

Security guards patrol below surveillance cameras on a corner of Tiananmen Square in Beijing on September 6, 2019. (GREG BAKER/AFP via Getty Images)

While Shi burbled happily about the A.I. system growing even more powerful and auto-filing charges for ever more complex offenses, an unnamed Chinese prosecutor told the SCMP he was worried about prosecutors losing what little autonomy and responsibility they have.

“The accuracy of 97 per cent may be high from a technological point of view, but there will always be a chance of a mistake. Who will take responsibility when it happens? The prosecutor, the machine or the designer of the algorithm?” the anonymous critic asked.

“A.I. may help detect a mistake, but it cannot replace humans in making a decision,” he insisted.
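
To put that error margin in rough perspective, a back-of-the-envelope calculation is sketched below. The 17,000 figure is simply the size of the reported training corpus, borrowed here as a stand-in for the scale at which such a system might operate; it is not an actual caseload figure.

```python
# Illustration of the critic's point, not a claim about the real system's numbers.
ACCURACY = 0.97
CASES = 17_000  # size of the reported training corpus, used here only for scale

wrong_charges = round(CASES * (1 - ACCURACY))
print(f"At {ACCURACY:.0%} accuracy over {CASES:,} cases, "
      f"roughly {wrong_charges:,} charges would be filed in error.")
# -> At 97% accuracy over 17,000 cases, roughly 510 charges would be filed in error.
```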
