
Huawei reportedly worked with 4 additional companies to build surveillance tools that track people by ethnicity, following recent revelations that it tested a 'Uighur alarm'




In 2018, Huawei tested an AI-powered facial-recognition technology that could trigger a "Uighur alarm" for Chinese authorities when it identified a member of the persecuted minority group, The Washington Post reported last week.

At the time, Huawei spokesperson Glenn Schloss told The Post that the tool was "simply a test and it has not seen real-world application."

But a new investigation published by The Post on Saturday found that Huawei has worked with dozens of security firms to build surveillance tools — and that products it developed in partnership with four of those companies claimed to be able to identify and monitor people based on their ethnicity.

Documents publicly available on Huawei's website detailed the capabilities of those ethnicity-tracking tools as well as more than 2,000 product collaborations, according to The Post. The publication also reported that after it contacted Huawei, the company took the website offline temporarily before restoring the site with only 38 products listed.

FILE PHOTO: The Huawei headquarters building in Reading, Britain, July 14, 2020. REUTERS/Matthew Childs/File Photo

"Huawei opposes discrimination of all types, including the use of technology to carry out ethnic discrimination," a Huawei spokesperson told Business Insider. "We provide general-purpose ICT [information and communication technology] products based on recognized industry standards."

"We do not develop or sell systems that identify people by their ethnic group, and we do not condone the use of our technologies to discriminate against or oppress members of any community," the spokesperson continued. "We take the allegations in the Washington Post's article very seriously and are investigating the issues raised within."

According to The Post, Huawei worked with Beijing Xintiandi Information Technology, DeepGlint, Bresee, and Maiyuesoft on products that made a variety of claims about estimating, tracking, and visualizing people's ethnicities. It also partnered with other Chinese tech companies on tools to suppress citizens' complaints about wrongdoing by local government officials and to analyze "voiceprint" data.

Beijing Xintiandi Information Technology, DeepGlint, Bresee, and Maiyuesoft could not be reached for comment.

Human rights groups, media reports, and other independent researchers have extensively documented China's mass surveillance and detainment of as many as one million Uyghurs, Kazakhs, Kyrgyz, and other Muslim minority groups in internment camps, where reports allege they are subjected to torture, sexual abuse, and forced labor for little or no pay.

To help it build the surveillance apparatus that enables such widespread detainment, the Chinese government has at times turned to the country's technology firms.

"This is not one isolated company. This is systematic," John Honovich, the founder of IPVM, a research group that first discovered the 2018 test, told The Post. He added that "a lot of thought went into making sure this 'Uighur alarm' works."

In October 2019, the US Commerce Department blacklisted 28 Chinese government agencies and tech companies, including China's five "AI champions" (Hikvision, Dahua, SenseTime, Megvii, and iFlytek), adding them to its "entity list" and preventing US firms from exporting certain technologies to them.

Still, some of those blacklisted companies have managed to continue exporting their technologies to Western countries, and BuzzFeed News reported last year that US tech firms, including Amazon, Apple, and Google, have continued selling those companies' products to US consumers via online marketplaces.

In the US, law enforcement agencies and even schools have also increased their reliance on facial recognition software and other AI-powered surveillance technologies, despite growing evidence that such tools exhibit racial and gender bias.

But recent pushback from activists, tech ethicists, and employees has led some tech companies to temporarily stop selling facial-recognition tools to law enforcement, and some US cities have issued moratoriums on their use, highlighting divides between approaches to policing in the US and China.
