Xinjiang covers 16 percent of China’s landmass but includes only a tiny fraction of its population—22 million people, roughly 13 million of whom are Uighur and other Turkic Muslims, out of nearly 1.4 billion people in China. Hardly lax about security anywhere in the country, the Chinese government is especially preoccupied with it in Xinjiang, justifying the resulting repression as a fight against the “Three Evils” of “separatism, terrorism, and extremism.”

Yet far from targeting bona fide criminals, Beijing’s actions in Xinjiang have been extraordinarily indiscriminate. As is now generally known, Chinese authorities have detained one million or more Turkic Muslims for political “re-education.” This latest “Strike Hard Campaign” has yielded the world’s largest case of mass arbitrary detention in decades.

Beijing has tried to pass off the proliferating indoctrination centers as “vocational training” sites. In reality, the purpose is forced assimilation. Turkic Muslims are confined indefinitely until authorities determine that they have sufficiently replaced their religious and ethnic identity—their Islamic beliefs, language, culture, and traditions—with loyalty to the Chinese Communist Party. In some areas, the government considers children with a parent or parents in detention to be “orphans” and holds them in state-run orphanages where they face similar brainwashing.

But the use of mass detention is only part of Xinjiang’s story. What is even more striking is Beijing’s establishment there of a surveillance state, which plays a central role in determining who will be detained. The scope and intrusiveness of this effort may well be unprecedented. If this new form of totalitarianism is not curtailed, it portends a dystopia that other governments can be expected to emulate, threatening us all.

Even in countries where the legal protection of privacy is more developed than in China, the law often lags far behind the changing technical capacities illustrated in Xinjiang. There is an urgent need to elaborate the right to privacy in concrete regulations that constrain a government’s surveillance powers, whether in China or the rest of the world.


The extraordinary nature of China’s surveillance effort in Xinjiang begins with the vast resources devoted to it. One million government employees are regularly dispatched to stay as “guests” in the homes of Turkic Muslims in Xinjiang, with instructions to report any sign of religiosity or unusual thinking. The authorities have also recruited tens of thousands of new police officers for Xinjiang, set up thousands of new police stations and checkpoints throughout the region, and dramatically increased the public-security budget.

Beijing then uses the latest technology to collect and analyze information gathered about Muslims there. Some Xinjiang checkpoints are equipped with special machines called “data doors” that—unbeknownst to the people passing through them—vacuum up identifying information from their mobile phones and other electronic devices. Machine-readable QR codes are engraved on knives and posted on people’s front doors (officials carry mobile apps to scan them), allowing the authorities to quickly link individuals to their homes and possessions. To track, monitor, and profile Turkic Muslims, agents also rely on artificial intelligence, including facial and number-plate recognition, connected to the surveillance cameras that blanket both the region and other parts of the country. In addition, the authorities collect biometric data—including voice samples, iris scans, and DNA—and store them in searchable databases.

Chinese authorities have had to deploy a new system to integrate, sort, and analyze this enormous quantity of data. The mobile app that police and other officials use to communicate with the Integrated Joint Operations Platform (IJOP), one of the region’s main policing platforms, provides insight into how that system works.

Based on its aggregated data, the IJOP program flags for officials anyone deemed a potential threat. Some of those suspects are targeted for further investigation, and some for detention and re-education. By “reverse-engineering” this mobile app—examining its source code—our organization, Human Rights Watch (HRW), was able to see the vast array of information it collects. The breadth of that intelligence-gathering helps explain the bewildering set of questions that Xinjiang residents report being asked by the police.

That information ranges from obvious personal attributes—a person’s blood type or height—to their “religious atmosphere” and political affiliations. It includes whether someone has obtained a new phone number, donated to a mosque, or preached the Qur’an without authorization. The platform incorporates assessments of whether a person might not be “socializing with neighbors” or is “often avoiding using the front door.” If a phone suddenly goes “off-grid,” the system sends an alert to an official nearby to investigate. All of this information is fed into the Integrated Joint Operations Platform’s central system and linked to a person’s national identification card number.
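The logic described above amounts to a checklist of rules run against a person’s record, keyed to a national ID number. The minimal Python sketch below illustrates how such a rules engine might work; every field name, rule, and threshold here is invented for illustration and is not drawn from the actual IJOP code.

```python
# Hypothetical sketch of rules-based flagging, modeled on behaviors the
# article describes. All field names and thresholds are invented; none
# come from the real system.
from dataclasses import dataclass

@dataclass
class Record:
    id_number: str                      # national ID card number, the join key
    donated_to_mosque: bool = False
    new_phone_number: bool = False
    phone_off_grid_days: int = 0
    uses_front_door: bool = True

def flag(record: Record) -> list[str]:
    """Return the list of 'suspicious' rules a record trips."""
    reasons = []
    if record.donated_to_mosque:
        reasons.append("donated to a mosque")
    if record.new_phone_number:
        reasons.append("obtained a new phone number")
    if record.phone_off_grid_days >= 1:
        reasons.append("phone went off-grid")
    if not record.uses_front_door:
        reasons.append("often avoids using the front door")
    return reasons
```

A person tripping any rule would be surfaced to a nearby official for investigation; an empty list means no alert.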

In some cases, investigations require officials to check people’s phones. One Turkic Muslim from Xinjiang told HRW what happened when he was pulled over by police in a traffic stop: “SWAT police officers came and demanded that I give them my phone. I did, and they plugged the phone in.” A few days later, his wife experienced a similar check on her phone while they were stopped at a gas station. The platform treats as “suspicious” fifty-one types of software and communications tools, including VPNs and applications that permit end-to-end encryption, such as WhatsApp, Viber, and Telegram.

The platform predictably devotes special interest to personal relationships: Is a person in question connected to someone who has recently obtained a new phone number? Has that person traveled with someone whom the authorities find problematic? Is the person in contact with anyone abroad?
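These relationship checks amount to guilt by association propagated over a contact graph. A hedged sketch, with an assumed one-hop rule and invented data structures, might look like this:

```python
# Hypothetical sketch of relationship-based flagging: a person draws
# attention merely for being linked to someone the system already
# flags. The one-hop rule and data layout are assumptions for
# illustration, not the platform's actual logic.

def flagged_by_association(person: str,
                           flagged: set[str],
                           contacts: dict[str, set[str]]) -> bool:
    """True if the person is directly linked to anyone already flagged,
    e.g. a travel companion or a contact abroad."""
    return any(c in flagged for c in contacts.get(person, set()))
```

Under such a rule, one flagged individual taints everyone in their contact list, which helps explain why ties to relatives abroad alone can trigger police attention.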

The scope of the surveillance can be terrifying to Xinjiang residents, who have no ability to challenge it. “There is a place I go at night that nobody knows. But the app knows. That’s when I got really scared,” said a Uighur Muslim who was familiar with the system from his time in Xinjiang, when shown the reverse-engineered app. Once, he input his friend’s ID card number into the system and was shocked when the app spat out “immediate arrest.”

The platform works with the region’s many checkpoints so that, when people pass through, their movement can be restricted depending on how “trustworthy” the computer system (or its programmers) deems them. Former residents said they have been stopped at checkpoints and taken for police interrogation simply because their relatives were being held in a political re-education camp. The system also stops people who are traveling to a different location from the one where they are registered to live. The effect of all this is to impose a series of digital fences around Xinjiang residents.
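One way to picture the checkpoint logic described above is as a small decision function consulting a computed “trust” tier, the traveler’s registered residence, and family links to detainees. The tier names, rule order, and outcome strings below are assumptions for illustration only.

```python
# Hypothetical sketch of a checkpoint decision as the article describes
# it: movement gated by a "trustworthiness" tier, by whether the
# traveler is outside their registered home area, and by whether a
# relative is held in a camp. All labels are invented for the sketch.

def checkpoint_decision(trust_tier: str,
                        registered_area: str,
                        current_area: str,
                        relative_in_camp: bool) -> str:
    if relative_in_camp:
        return "stop for interrogation"
    if current_area != registered_area:
        return "stop: outside registered area"
    if trust_tier == "trustworthy":
        return "pass"
    return "detain for further checks"
```

Chained across the region’s many checkpoints, even so simple a function would produce exactly the “digital fences” former residents describe.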

Children play outside the entrance to a school ringed with barbed wire, security cameras, and barricades, in Peyzawat, Xinjiang, August 31, 2018.

 © 2018 AP Photo


The use of mass surveillance is not limited to Xinjiang. The Chinese police are developing and deploying similar mass surveillance systems throughout the country. For example, Human Rights Watch has documented the use of a big-data policing platform called Police Cloud, which collects and integrates people’s personal data—from their supermarket memberships to their health records.

Another system designed to shape social behavior is the “social credit” system that Chinese authorities are developing. Under this system, which the government has begun to put into operation and hopes to roll out more fully by 2020, people are, as the government puts it, “rewarded everywhere” for good social behavior and “restricted everywhere” for bad behavior. Some types of measured behavior might seem relatively innocuous, such as whether a person obeys traffic regulations, pays court fines, or refrains from eating on public transport. But it would take little to add political criteria.

The details of the system vary in different parts of the country but have in common an attempt to link social reliability to eligibility for desirable social goods. Does one get residency in an attractive city? The ability to send one’s children to a private school? Permission to travel on a plane or high-speed train?
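The reward-and-restriction mechanics could be pictured as a score that gates access to each of these goods. The point values and eligibility thresholds in the sketch below are invented; the actual pilots vary by locality.

```python
# Hypothetical sketch of a social-credit score gating access to
# services ("rewarded everywhere" / "restricted everywhere"). The
# infraction list, point values, and thresholds are all invented for
# illustration; regional pilots differ in detail.

PENALTIES = {
    "traffic violation": 5,
    "unpaid court fine": 50,
    "eating on public transport": 2,
}

def credit_score(infractions: list[str], base: int = 100) -> int:
    """Deduct invented penalty points from an invented base score."""
    return base - sum(PENALTIES.get(i, 0) for i in infractions)

def privileges(score: int) -> dict[str, bool]:
    """Map a score to eligibility for desirable goods."""
    return {
        "high_speed_rail": score >= 80,
        "air_travel": score >= 80,
        "private_school": score >= 90,
    }
```

The political danger the article notes is visible in the structure itself: adding a line such as `"criticized the government": 100` to the penalty table is trivial.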

The ingenuity of these social control systems is that, for most people, the desire for such social benefits will be enough to keep them in line, even without the threat of detention. That is all the more true because most people in China, for reasons of self-preservation, already exercise a significant degree of self-censorship. They know to refrain from publicly criticizing the government and to keep their distance from outspoken acquaintances.

Given the human resources needed to build and maintain such elaborate systems of social control, the Chinese government recognizes that it must also monitor and regulate the conduct of the large number of police and bureaucrats who operate the system, particularly because many of the tasks involved are tedious and grueling. The Xinjiang police officer who completes the eleven pages of information requested by the Integrated Joint Operations Platform is engaged in data collection at its most mundane. The app monitors how well officers carry out these tasks, giving them a score that is available to both the officer and his or her supervisors.
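The officer-scoring could be something as simple as a task-completion rate. The formula below is an assumption for illustration; the reporting describes only that a score exists and is visible to officers and their supervisors.

```python
# Hypothetical sketch of officer monitoring: a completion-rate score
# over assigned data-collection tasks. The formula is invented; only
# the existence and visibility of a score is documented.

def officer_score(tasks_assigned: int, tasks_completed: int) -> float:
    """Completion rate as a 0-100 score, shared with supervisors."""
    if tasks_assigned == 0:
        return 100.0
    return round(100.0 * tasks_completed / tasks_assigned, 1)
```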

Technology also helps to ease any qualms that police officers might have about the consequences of their work. Unlike the executioner or the torturer who knows that what he is doing is wrong, the officer inputting material into the platform is just doing routine police work, albeit with an unusual level of intrusiveness. The resulting evil is the consequence of computer programs, managed by siloed parts of the police state, that determine who is to be arrested. Responsibility is diffuse.


Taken together, these surveillance powers in Xinjiang suggest that the Chinese government is perfecting a system of social control that is both all-encompassing and highly individualized, using a mix of mechanisms to impose varying levels of supervision and constraint on people depending on their perceived threat to the state. John Garnaut, an expert on Chinese politics, traced from Mao Zedong to Xi Jinping the Communist Party leaders’ lineage as “engineers of the soul.” Both Mao and Xi shared the belief that humans can be conditioned “in the same way that [the Russian psychologist] Pavlov had learned to condition dogs” by “controlling all incentives and disincentives” in their lives, he said. That is why the Chinese government under Xi—who enjoys greater resources, more advanced technologies, and a stronger bureaucracy than Mao—rarely needs to resort to overt violence.

That is also why, for most people in China, life can seem “normal,” despite the social controls. This illusory effect also works in China’s favor abroad, because many visitors miss how carefully and coercively choreographed its superficial calm is. Yet even in Hong Kong—a city under Chinese sovereignty that still retains some freedoms—many participants in the continuing pro-democracy protests are taking steps to protect themselves, with measures such as turning off location-tracking on their phones, buying old-format subway cards with cash, pointing laser beams at surveillance cameras, wearing face masks, and switching to encrypted communication platforms like Telegram to avoid identification and tracking.

Terrifying as the emerging system of social control is, though, it has its limits. Researchers developing these surveillance systems have bemoaned the difficulty of mining genuinely useful analytics from such huge quantities of data. Among the problems cited are that frontline officers lack the motivation to collect data accurately, or that surveillance systems developed by different companies are not fully compatible. While the ubiquity of surveillance tools, from biometric databases linked to national ID numbers to pervasive surveillance cameras, suggests fearsome capabilities, many of these systems do not yet work as intended.