
How China’s Pre-Crime Algorithms Work — and Their Fatal Flaw

The algorithms target, for example, those who complain about or draw attention to social injustices and abuses

In a previous article, we looked at the way George Orwell’s dystopian 1984 is looking less and less like fiction as the Chinese Communist Party exploits the capabilities of AI and Big Data to surveil its entire population. But beyond surveilling citizens’ movements in real time, the CCP also hopes to predict crimes and protests before they happen.

In a follow-up story in the New York Times, Paul Mozur, Muyi Xiao, and John Liu look at how the CCP is also bringing the dystopian world of Philip K. Dick’s Minority Report (2002) to real life, with one difference: Rather than human “precogs” who can predict the future, the CCP relies on algorithms that can interrogate large swaths of data for patterns of behavior.

Surveillance cameras at Tiananmen Square in Beijing, China

The Times notes that the algorithms — which would prove controversial in other countries — are often trumpeted as triumphs by the Chinese government:

The new approach to surveillance is partly based on data-driven policing software from the United States and Europe, technology that rights groups say has encoded racism into decisions like which neighborhoods are most heavily policed and which prisoners get parole. China takes it to the extreme, tapping nationwide reservoirs of data that allow the police to operate with opacity and impunity.

Paul Mozur, Muyi Xiao, and John Liu, “An invisible cage: How China is policing the future” at New York Times (June 25, 2022)

Chinese provincial authorities have posted bidding documents for algorithms that can flag patterns of behavior that indicate a potential future crime. However, in China under the Communist Party’s rule, crimes can include practicing a forbidden religion or organizing a protest, if Beijing sees the religion or protest as a threat to stability. The surveillance tools automatically flag certain groups, whether or not they have any criminal history, including Uyghurs, other ethnic minorities, migrant workers, and those with mental illness.

Wi-Fi sniffers, for example, will intercept phone communications, and phone apps like a Uyghur-to-Chinese dictionary are automatically flagged. Additionally, the surveillance tools include a list of people to ignore, known as the “Red List,” which The Times says consists mostly of government officials.

Surveillance in the regions surrounding Beijing is particularly extensive.

The algorithm targets those who draw attention to social problems

The system collects data on legions of Chinese petitioners, a general term in China that describes people who try to file complaints about local officials with higher authorities.

It then scores petitioners on the likelihood that they will travel to Beijing. In the future, the data will be used to train machine-learning models, according to a procurement document.

Paul Mozur, Muyi Xiao, and John Liu, “An invisible cage: How China is policing the future” at New York Times (June 25, 2022)

Beijing becomes particularly apprehensive when discontent appears organized and widespread. The government does not want people gathering in the capital or drawing attention to a problem. For this reason, the Cyberspace Administration of China censors online disgruntlement over government failures. The Times and ChinaFile’s study of procurement documents showed:

Many people who petition do so over government mishandling of a tragic accident or neglect in the case — all of which goes into the algorithm. “Increase a person’s early-warning risk level if they have low social status or went through a major tragedy,” reads the procurement document.

Paul Mozur, Muyi Xiao, and John Liu, “An invisible cage: How China is policing the future” at New York Times (June 25, 2022)
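
To make the reported logic concrete, here is a minimal sketch in Python of the kind of rule-based “early-warning” scoring the procurement documents describe. Every field name, weight, and the Red List check below is hypothetical; none of it is drawn from the actual system.

# Hypothetical sketch of rule-based "early-warning" scoring of the sort the
# procurement document describes. Field names, weights, and thresholds are
# invented for illustration only.

def early_warning_score(person: dict) -> float:
    """Return a risk score for a petitioner; higher means more scrutiny."""
    if person.get("on_red_list"):
        return 0.0                           # people on the "Red List" are ignored
    score = 0.0
    if person.get("has_petitioned_before"):
        score += 1.0                         # prior complaints raise the score
    if person.get("low_social_status"):
        score += 0.5                         # the document says to raise risk here...
    if person.get("experienced_major_tragedy"):
        score += 0.5                         # ...and here
    return score

# Example: a repeat petitioner whose complaint stems from a mishandled accident
print(early_warning_score({"has_petitioned_before": True,
                           "experienced_major_tragedy": True}))   # -> 1.5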

Recently, several protests have been quelled, sometimes with violence, and then censored online. One “erased” protest called for a boycott of mortgage payments until developers completed stalled housing projects. Another concerned a rural bank’s freeze on customers’ savings accounts.

How accurate is algorithmic prediction anyway?

Science fiction is prescient here. In Minority Report, three people with superhuman abilities can predict when a murder (in the film) or any crime (in the earlier novella) will occur. Police can then make an arrest on charges of a “pre-crime” and thus prevent the murder. However, things get strange when one of the officers is accused of the future murder of someone he doesn’t know. Dick titled the novella from which the film originated “The Minority Report” because the computer used the two precogs’ reports that agreed to predict the crime; the one report that did not predict the crime was the minority report.

As the story unfolds in the novella, the police officer realizes the precogs’ reports are not as similar as people thought. The computer system pieces together similarities in the reports and assumes that this is the correct assessment of the future. There are more twists and turns in both the film and the novella, but both stories deal with the question of security and freedom in a techno-dystopian near-future world.

Our world does not have “precogs” who can see possible futures. Instead, we have algorithmic systems that rely on historical trends, which is one of the origins of algorithmic bias. Algorithms are predictive only insofar as human behavior matches the historical data used to train them. They leave no room for people who break out of those patterns, either by choice or because of their unique experiences. Algorithms, for example, do not account for changes in infrastructure or the work of ministries and charitable organizations that can give people opportunities to break out of established patterns.
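
A toy example illustrates the point. The “model” below merely memorizes which behavior patterns preceded an incident in an invented training set, so any pattern it has never seen falls back on the overall base rate; all data and labels are made up for illustration.

# Toy illustration of prediction from historical data: the "model" just
# memorizes which behavior patterns preceded an incident in an invented
# training set, so a pattern it has never seen falls back to the base rate.
from collections import defaultdict

history = [                                 # (behavior pattern, incident followed?)
    (("petitioned", "bought_ticket_to_capital"), True),
    (("petitioned", "bought_ticket_to_capital"), True),
    (("petitioned", "stayed_home"), False),
    (("no_petition", "stayed_home"), False),
]

counts = defaultdict(lambda: [0, 0])        # pattern -> [incidents, observations]
for pattern, incident in history:
    counts[pattern][0] += int(incident)
    counts[pattern][1] += 1

base_rate = sum(incidents for incidents, _ in counts.values()) / len(history)

def predict(pattern):
    incidents, seen = counts.get(pattern, (0, 0))
    return incidents / seen if seen else base_rate    # unseen pattern: base rate only

print(predict(("petitioned", "bought_ticket_to_capital")))   # 1.0 -- matches history
print(predict(("started_charity", "moved_for_new_job")))     # 0.5 -- model has no idea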

In this sense, perhaps, a better science fiction analogy than Minority Report (2002) is Isaac Asimov’s Foundation trilogy, which inspired the recently launched Apple TV+ series Foundation. In the book series, Hari Seldon, a psychohistorian, uses statistical analysis of the past to predict the future. His calculations predict the fall of galactic civilization, which cannot be prevented, so he develops a plan to shorten the dark age that follows. The plan spans millennia, during which society discovers holographic messages he has left. All goes according to his predictions until the Mule appears.

The Mule is an anomaly, a mutant that can read and even manipulate others’ minds. The Mule represents the problem with real-life attempts to use algorithms and big data to predict the future. Algorithms cannot predict anomalies in human behavior.

The fatal flaw in algorithmic prediction

Even if an algorithm can, in most cases, predict human behavior, an even more fundamental flaw inhibits algorithmic attempts at prediction: Algorithms only work when their surveillance subjects do not know how they work. In Asimov’s story, the Achilles heel in Hari Seldon’s plan to save society is that society cannot know about two groups of scientists and engineers, known as the Foundations, who live on opposite sides of the galaxy. The Mule, who knows about them, thus seeks to thwart Seldon’s plan and rule over one of the Foundation planets.

Similarly, in real life, predictive algorithms do not work when a person “games the system.” A subject who knows what the algorithm is looking for or how it works can manipulate it. That’s why the architects of these algorithms try to keep their function (and sometimes their existence) a secret.
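
A short, equally hypothetical sketch shows what “gaming the system” means in practice: once the watched signals are known, a subject can simply avoid them, and the score collapses even though nothing about the underlying intent has changed.

# Hypothetical illustration of gaming a known scoring rule. The monitored
# signals below are invented; the point is that a subject who knows them can
# suppress exactly those signals and score as "low risk."

WATCHED_SIGNALS = {"uses_flagged_app": 1.0, "bought_ticket_to_capital": 2.0}

def risk_score(observed_signals: set) -> float:
    return sum(WATCHED_SIGNALS.get(s, 0.0) for s in observed_signals)

naive_subject = {"uses_flagged_app", "bought_ticket_to_capital"}
informed_subject = {"used_borrowed_phone", "traveled_by_carpool"}  # same intent, unseen signals

print(risk_score(naive_subject))     # 3.0 -- flagged
print(risk_score(informed_subject))  # 0.0 -- invisible to the rule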

Chinese citizens are willing to compromise privacy and acquiesce to some surveillance in the name of stability and security. But in the last five years, the CCP has had to crack down on more and more discontent among the people. As Maya Wang, senior China researcher at Human Rights Watch, summarizes the current scene, “This is an invisible cage of technology on society.”


You may also wish to read:

Big Brother is watching you (and trying to read your mind) (Gary Smith)

China is quite serious about total surveillance of every citizen (Heather Zeiger)

and

Largest data grab ever stole Shanghai’s mass state surveillance (Heather Zeiger)


Heather Zeiger

Heather Zeiger is a freelance science writer in Dallas, TX. She has advanced degrees in chemistry and bioethics and writes on the intersection of science, technology, and society. She also serves as a research analyst with The Center for Bioethics & Human Dignity. Heather writes for bioethics.com and Salvo Magazine, and her work has appeared in Relevant, MercatorNet, Quartz, and The New Atlantis.
