- WND - https://www.wnd.com -

Frightening system targets anyone not behaving to gov't standards

 

(Image courtesy Pixabay)

Many Americans object to receiving news reports produced by artificial intelligence, or to having videos censored because some tech geek created an algorithm that flags them as inappropriate.

They should thank their lucky stars they’re not in China.

Human Rights Watch has uncovered the “inner workings” of software used by police in the autonomous region of Xinjiang that monitors people for any deviation from “an algorithmically determined norm.”

The Electronic Frontier Foundation said the app is used by authorities to communicate with the Integrated Joint Operations Platform (IJOP), which collects information from all surveillance in the region.

The system targets “just about anyone” who isn’t behaving according to the government’s standards.

EFF noted IJOP “requires a massive amount of manual labor, all focused towards data entry and translating the physical world into digital relationships.”

Xinjiang is home to the Uighurs and other Turkic minority groups that have been targeted in crackdowns by the Chinese communists.

“Political education centers” now house an estimated 1 million people, and spying on citizens is a standard practice.

“While we fight the introduction and integration of facial recognition and street-level surveillance technologies in the U.S., existing research from Human Rights Watch gives us insight on how facial-recognition-enabled cameras already line the streets in front of schools, markets, and homes in Kashgar. WiFi sniffers log the unique addresses of connected devices, and police gather data from phone inspections, regular intrusive home visits, and mandatory security checkpoints,” EFF said.

Human Rights Watch obtained a copy of a mobile app that police officers use and released its source code, revealing that it allows police to carry out “investigative missions” such as tracking individuals or vehicles.

It also incorporates facial recognition.

The app focuses on 36 “suspicious” person types.

One example is someone who travels, such as leaving the region, even when the travel is entirely legal.

Those who have communication software on their phones, such as WhatsApp or Telegram, get extra attention.

A red flag is also raised when anyone’s online presence disappears.

“Any small deviation from what the IJOP system deems ‘normal behavior’ could be enough to trigger an investigation and prompt a series of intrusive visits from a police officer,” EFF said.

The report said: “And as all behavior is pulled into the state’s orbit, ordinary people can become instant suspects, and innocent actions have to be rigorously monitored. Using certain software becomes, if not a crime, then a reason for suspicion. Wandering from algorithmic expectations targets you for further investigation.”