The Beijing Institute of Technology (BIT) this week announced an ambitious “experimental program for intelligent weapons systems” that recruited several dozen teenagers with exceptional high-school grades to design killer robots.
As the South China Morning Post reported on Thursday, brains are not enough for this cutting-edge program. Political reliability and a passion for warfare are also essential:
“These kids are all exceptionally bright, but being bright is not enough,” said a BIT professor who was involved in the screening process but asked not to be named because of the sensitivity of the subject.
“We are looking for other qualities such as creative thinking, willingness to fight, a persistence when facing challenges,” he said. “A passion for developing new weapons is a must … and they must also be patriots.”
Each student will be mentored by two senior weapons scientists, one from an academic background and the other from the defense industry, according to the program’s brochure.
After completing a short program of course work in the first semester, the students will be asked to choose a speciality field, such as mechanical engineering, electronics or overall weapon design. They will then be assigned to a relevant defense laboratory where they will be able to develop their skills through hands-on experience.
BIT said it has recruited 27 boys and four girls so far, a gender mix that social justice warriors would consider highly problematic if the Chinese Communists had to worry about such things. On the BIT website, one of the male recruits rhapsodized about his lifelong fascination with guns and said he “couldn’t resist the attraction” of the A.I. weapons program.
The A.I. warfare curriculum is expected to last four years and lead into a full Ph.D. program, creating a new generation of combat cyber experts. Observers outside China found the prospect of recruiting such young people into A.I. warfare programs disturbing:
Eleonore Pauwels, a fellow in emerging cybertechnologies at the Centre for Policy Research, United Nations University in New York, said she was concerned about the launch of the BIT course.
“This is the first university programme in the world designed to aggressively and strategically encourage the next generation to think, design and deploy AI for military research and use.”
While the US had similar programmes, such as those run by the Defence Advanced Research Projects Agency, they operated in relative secrecy and employed only the cream of established scientists, Pauwels said.
Pauwels imagined the young recruits combining A.I. technology with cutting-edge research in other fields to produce such horrors as nanobot swarms seeding enemy food supplies with chemical and biological weapons, or killer robots that can “target, with surgical precision, specific populations” using facial recognition technology. The Chinese have lately developed a keen interest in targeting specific populations in troublesome regions.
“The fact that China’s AI national strategy is built on a doctrine of civil-military fusion means that an AI prototype for military use could be co-opted and perverted for surveillance or harm in the civilian context,” as Pauwels put it.
There are three particularly disturbing aspects of the coming generation of A.I. weapons: they can execute strategies that would require prohibitive amounts of time and manpower if delegated to humans; they will perform those tasks without hesitation or remorse; and they can act so quickly that human commanders will have no chance to rescind an order before its unexpectedly horrific consequences unfold. The world can only hope extensive ethics courses are part of every cyber-warfare training curriculum.