Huawei tested facial recognition that ID’d Uighur Muslims, set off alarm: report


Huawei reportedly tested a "Uyghur Alarm" feature on its network of cameras, using Megvii's facial recognition technology.


Óscar Gutiérrez/CNET

The Chinese tech giant Huawei and artificial intelligence company Megvii developed and tested facial recognition software that triggered alerts whenever the technology detected Uighur Muslims, according to an internal document obtained by researcher IPVM and provided to The Washington Post.

The document dates back to January 2018, when Huawei tested Megvii’s Face++ facial recognition on its network of cameras and gave a passing grade to its ability to recognize people’s age, gender and ethnicity. The test report also highlighted a passing grade for a “Uyghur Alarm” — an alert designed specifically to identify members of the oppressed minority population in China. 

The Chinese government has used surveillance technology, including facial recognition, in myriad ways to oppress Uighur Muslims. Its actions include what US lawmakers have described as "the largest mass incarceration of a minority population in the world today," with an estimated 1 million people detained.

Chinese tech companies are helping with this: Facial recognition, surveillance cameras and voice recognition are all being used to track and identify Uighur Muslims in the country. In October 2019, the US Commerce Department blacklisted eight Chinese companies for contributing to human rights abuses against the minority population.

While you might recognize Huawei as the world's second-largest phone maker, it is also China's biggest tech company and supplies surveillance cameras both across the country and internationally.

Megvii is among the eight blacklisted Chinese companies, and one of the largest facial recognition providers in the world. Its technology is used across China for daily activities like boarding trains and entering offices.

The document detailing Huawei and Megvii's tests was labeled confidential, but IPVM discovered it publicly available on Huawei's European website. It has since been removed, after The Washington Post first reported on the discovery.

Huawei and Megvii didn't respond to requests for comment. In a statement to IPVM, Huawei said that the Uyghur Alarm feature was "simply a test and it has not seen real world application." The Chinese tech giant didn't explain why it would even need to test a technology specifically to target an oppressed minority group.

Megvii told the organization that its technology was “not designed or customized to target or label ethnic groups,” despite the test with Huawei doing exactly that. 

Huawei and Megvii are not the only Chinese tech companies that have offered facial recognition capabilities to identify and track Uighur Muslims. Hikvision, the world's largest surveillance camera provider, also marketed its ability to identify the population, IPVM reported in November 2019.

The Chinese government has also used malware and phone hacking to specifically target Uighur Muslims. In March, a group of 17 senators called out China for using technologies such as facial recognition as "instruments of state power."

Facial recognition raises privacy concerns because of its ability to track and identify people on a mass scale. Police in the US have used it to track and identify protesters, despite the United Nations' human rights chief calling for a moratorium on the practice.
