Chinese Officials Want National ‘Data Bank’ of Faces and Fingerprints


Top Chinese political adviser Tan Jianfeng on Tuesday called for the establishment of a national “data bank” of biometric data, including facial and fingerprint recognition data, to protect Chinese “national security” and “information security.”

China’s state-run Global Times quoted Tan as suggesting that biometric data must be aggressively harvested and zealously protected because it will become increasingly necessary for life in the pervasively monitored Communist state, and the data is extremely difficult to replace if lost, corrupted, or stolen:

Data security has become an important issue concerning national security, said Tan Jianfeng, a member of the National Committee of the Chinese People’s Political Consultative Conference, noting that some key data, such as personal biometric data (face, fingerprint, and DNA data) have unique and non-renewable characteristics that can’t be recovered and changed once they are stolen and bring huge and irreversible risks.

Tan, who is also the head of the Shanghai Information Security Trade Association, proposed accelerating the establishment of relevant laws and regulations and strictly standardizing the collection, storage, and use of key data.

He also suggested establishing data classification and management systems, along with a negative list prohibiting the use of data from key areas such as biology and medicine on the internet.

UK-based technology firm Comparitech released a study in January that found China was “the world’s worst offender for its invasive use of biometric data.”

China’s oppressive use of fingerprint and facial recognition technology includes drones that flew through Chinese cities last year, using facial recognition to identify people violating coronavirus lockdown orders, facial systems that can identify people wearing masks, cameras that identify everyone who uses mass transit systems, and cameras that identify and shame jaywalkers and toilet-paper thieves.

Chinese companies have installed systems that monitor the brainwaves of workers to gauge their productivity. In case brainwave monitoring does not work, the Chinese are developing “smart cushions” that can monitor the other end of their employees.

This relentless march to total biometric surveillance is proceeding at a rapid clip even though polls show some 90 percent of Chinese citizens are uncomfortable with it. Tan Jianfeng’s call for a heavily protected “national data bank” was part of the authoritarian government’s effort to reassure citizens that their biometrics will be protected.

“Ordinary people here in China aren’t happy about this technology but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. There’s always that demand and we’re here to fulfil it,” shrugged a representative of Taigusys, a firm that produces emotion-recognition technology, as quoted by the UK Guardian on Wednesday.

The Guardian noted that the emotion-recognition industry is “booming in China,” to a projected revenue of $36 billion in 2023, because dictator Xi Jinping and other top officials have “emphasized the creation of ‘positive energy’ as part of an ideological campaign to encourage certain kinds of expression and limit others.”

Taigusys boasts that its technology can also be used to “predict dangerous behavior by prisoners, detect potential criminals at police checkpoints, problem pupils in schools and elderly people experiencing dementia in care homes.” Emotion-recognition systems have been installed everywhere from nursing homes to schools.

Critics said this latest evolution of surveillance technology is based on physiological quackery, has few safeguards for privacy, discriminates against China’s oppressed racial minorities, and has little demonstrable effectiveness at predicting sudden violent outbursts.

“A lot of biometric surveillance, I think, is closely tied to intimidation and censorship, and I suppose [emotion recognition] is one example of just that,” digital human rights analyst Vidushi Marda told the Guardian.
