China enlisted surveillance firms to help draw up standards for mass facial recognition systems, researchers said on Tuesday, warning that an unusually heavy emphasis on tracking characteristics such as ethnicity created wide scope for abuse.

The technical standards, published by surveillance research group IPVM, specify how data captured by facial recognition cameras across China should be segmented by dozens of characteristics – from eyebrow size to skin color and ethnicity.

“It’s the first time we’ve ever seen public security camera networks that are tracking people by these sensitive categories explicitly at this scale,” said the report’s author, Charles Rollet.

The standards are driving the way surveillance networks are being built across the country – from residential developments in the capital, Beijing, to police systems in the central province of Hubei, he said.

In one instance, the report cites a November 2020 tender for a small “smart” housing project in Beijing, requiring suppliers for its surveillance camera system to meet a standard that allows sorting by skin tone, ethnicity and hairstyle.

“It’s ripe for abuse,” said Rollet, whose report comes amid growing global scrutiny over Beijing’s treatment of Uighur Muslims and other minorities in the western region of Xinjiang. The Chinese government denies any rights abuses in the region.

Concern over racial detection in facial recognition systems is growing globally, said Caitlin Bishop, a campaigns officer with the human rights group Privacy International.

But while police in New York, Italy, New Zealand and elsewhere have sought technology to filter faces by race or ethnicity, Bishop said the scale and centralization of the Chinese approach was unmatched.

And given China’s role as a major exporter of surveillance technology, it could be of international concern.

“If you’re meeting these repressive standards in China, that’s a big problem already,” Bishop said. “But then you’re sending technology around the world and that could be even more worrying.”

Rollet’s report looked at surveillance system standards drawn up by the Ministry of Public Security and police departments in Henan province, the Xinjiang region and the city of Shenzhen.

It said the standards, which are followed by government agencies – including police – seeking to build any kind of camera system, were drafted by the government in conjunction with some of the country’s biggest surveillance firms.

They included security camera manufacturers Uniview, Hikvision and Dahua. Washington blacklisted Hikvision and Dahua in 2019 for their alleged role in “high-technology surveillance against” ethnic minorities in China.

Dahua described as “false” media reports that it had helped draft government standards for detecting individual ethnic groups.

“Dahua was not involved in creating the database section of the document that mentions ethnic groups,” the company added in an emailed statement.

Asked about the IPVM report, a Hikvision spokesman said the company was “committed to upholding the highest standards and respect for human rights”.

“As a manufacturer that does not oversee the operation of our products, we do ensure our cameras are designed to protect communities and property,” he added.

Neither Uniview nor the Chinese embassy in Washington responded to a request for comment.
Some of the standards – including the rules cited in the Beijing housing scheme – have been in place for several years, while others are set to come into force in the near future, the IPVM report found.

One standard for systems to be used in video surveillance for public security, set to be adopted in May 2021, catalogues skin color into five categories – white, Black, brown, yellow and others.

Although some of the standards describe ethnic sorting as an “optional” feature, IPVM said they were understood in practice as requirements.

Using such criteria would make it easier for authorities to comb different databases for specific individuals, or members of a particular ethnic group such as the Uighurs in Xinjiang, the report and rights campaigners said.

“We’re not talking about a standard for how long pencils should be – this is not neutral technology,” said Maya Wang, a senior China researcher with Human Rights Watch.

Activists and U.N. rights experts say at least a million Muslims have been detained in camps in Xinjiang. The activists and some Western politicians accuse China of using torture, forced labor and sterilizations.

China has repeatedly denied all accusations of abuse and says its camps offer vocational training and are needed to fight extremism.

Wang said the facial recognition standards documented in the IPVM report provided fresh evidence of cooperation between the Chinese government and local surveillance tech companies on measures that could target minority groups.

“They are working together on mass surveillance, the mass tracking of people – there’s no way it’s compatible with basic human rights standards,” she said.

In January, IPVM said a number of major Chinese AI companies had applied for patents describing technology to analyze images for the presence of Uighurs, which could hook into existing surveillance camera and facial recognition networks.

Still, experts said it was not clear to what extent ethnic facial recognition systems – or associated national databases – were fully operational in China.

The standards detailed in the new report could reflect Beijing’s future plans to develop mass AI-powered tracking systems, said Greg Walton, an expert on Chinese surveillance at the Canadian think-tank SecDev.

“These standards reveal the aspirations for all the data the Chinese government wants to gather from people in urban places,” said Walton, who studies live police databases.

Many of those databases already contain the details specified in the standards – eyebrow length, race and skin color – though those fields are not always filled in, he said.

“The level of details they want to collect – it’s an extraordinary breach of privacy,” he added.

Another purpose of the government standards might be to organize data in such a way as to enable the development of more powerful facial recognition systems in years to come, said Liz O’Sullivan, who worked on image annotation for the computer vision firm Clarifai Inc.

“It’s horrifying to imagine an increasingly precise biometric detection and collection system that captures so much detailed information, and could be turned on minorities,” she said.