Pre-Crime and Social Credit Systems – On The Way

27th June 2017 / United Kingdom

By Graham Vanbergen – Last May I reported on Britain’s latest anti-crime technology, already in use in the UK, in a piece entitled “Britain’s Minority Report Style ‘Pre-Crime’ Programme Unexpectedly Unearthed”.

I described how Britain already had a reputation for deploying the most intrusive surveillance systems in the Western world against its own people, and how “Pre-Crime” systems were being tested, with police using big data to stalk potential offenders who have yet to break the law.

At the time I also mentioned that police departments around the world were partnering with private companies to use public data, personal information and algorithms to predict where illegal actions are most likely to occur and, crucially, who is most likely to commit them.

A recent documentary aired in Canada, “Pre-Crime”, examined these systems. One such system in the USA is at the forefront of the technology:

There’s a list in Chicago with 1,500 people on it. They are under surveillance by the police and there is a special algorithm that calculates the risk of them committing a crime.

This is pre-crime technology that professes to predict when a crime will be committed and by whom. In just a few short months, various systems have become operational around the world. But read on, because the more we learn about these systems the more dystopian they get: something else is coming on the back of them.
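
Neither the documentary nor the Chicago police publish how the algorithm behind that list actually works. Purely as an illustration of what a risk-scored watch list of this kind amounts to, the sketch below ranks hypothetical records by a weighted score and keeps those above a cut-off; every factor, weight and threshold in it is invented for illustration, not taken from the real system.

```python
# Purely illustrative sketch of a "heat list" style risk score.
# Factors, weights and the threshold are hypothetical; they are NOT
# taken from the real Chicago system, whose internals are not public.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 0.40,
    "known_associates_on_list": 0.35,
    "age_under_25": 0.15,
    "prior_victim_of_violence": 0.10,
}


def risk_score(person: dict) -> float:
    """Weighted sum of hypothetical risk factors (each factor is 0 or 1)."""
    return sum(weight * float(person.get(factor, 0))
               for factor, weight in HYPOTHETICAL_WEIGHTS.items())


def build_watch_list(population: list, threshold: float = 0.5) -> list:
    """Rank everyone by score and keep those at or above the threshold."""
    ranked = sorted(population, key=risk_score, reverse=True)
    return [p for p in ranked if risk_score(p) >= threshold]


if __name__ == "__main__":
    people = [
        {"id": "A", "prior_arrests": 1, "known_associates_on_list": 1},
        {"id": "B", "age_under_25": 1},
    ]
    print([p["id"] for p in build_watch_list(people)])  # -> ['A']
```

The point of the sketch is how little is needed: a handful of weights and a threshold are enough to put someone on a list before any crime has been committed.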

 

First up – Bloomberg, on China: “China’s effort to flush out threats to stability is expanding into an area that used to exist only in dystopian sci-fi: pre-crime. The Communist Party has directed one of the country’s largest state-run defence contractors, China Electronics Technology Group, to develop software to collate data on jobs, hobbies, consumption habits, and other behaviors of ordinary citizens to predict ‘terrorist’ acts before they occur. “It’s very crucial to examine the cause after an act of terror,” Wu Manqing, the chief engineer for the military contractor, told reporters at a conference in December. “But what is more important is to predict the upcoming activities.”

The program is unprecedented because there are no safeguards from privacy protection laws and minimal pushback from civil liberty advocates.” In all of these examples from around the world, laws are being quickly changed, usually under the guise of protecting citizens from acts of terrorism.

 

The Daily Beast: “TOKYO—On Thursday morning, 15th June 2017, Japan’s deceptively named “anti-terrorism” bill was steamrolled into law by its parliament, after the ruling coalition gutted standard legislative protocol, avoiding more embarrassing questions about the bill known as the “criminal-conspiracy law.” It stipulates 277 crimes that police can arrest people for planning, or simply discussing. Technically, because social media is covered in the legislation, even liking a related tweet or retweeting it could now be grounds for arrest on conspiracy charges.”

BigThink – “United States: Led by the U.S. Department of Homeland Security, an initiative called Future Attribute Screening Technology, or FAST, aims to use sensor technology to detect cues “indicative of malintent”, defined by the Department of Homeland Security as intent or desire to cause real harm, “rapidly, reliably, and remotely.” It would be used, they say, to fight terror.

The FAST system has the capability to monitor physiological and behavioural cues without contact. That means capturing data like the heart rate and steadiness of gaze of passengers about to board a plane. The cues are then run through algorithms in real time to compute the probability that an individual is planning to commit a crime. According to the science journal Nature, the first round of field tests for the program was completed in an undisclosed location in the northeast several months ago. In lab tests, FAST has a reported 70% accuracy rate.”
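
The DHS has never published how FAST actually combines these cues, so the following is no more than a hedged sketch of the general approach described above: normalised sensor readings are weighted, squashed into a probability, and compared against a flagging threshold. The cue names, coefficients and threshold are all invented for illustration.

```python
import math

# Hypothetical coefficients and threshold: NOT FAST's real parameters,
# which have never been published.
COEFFS = {"heart_rate_z": 0.8, "gaze_instability": 1.2, "fidgeting": 0.6}
BIAS = -2.0
FLAG_THRESHOLD = 0.7


def malintent_probability(cues: dict) -> float:
    """Combine normalised sensor cues into a 0..1 score via a logistic function."""
    z = BIAS + sum(COEFFS[name] * cues.get(name, 0.0) for name in COEFFS)
    return 1.0 / (1.0 + math.exp(-z))


def flag_for_screening(cues: dict) -> bool:
    """Flag a passenger for secondary screening if the score crosses the threshold."""
    return malintent_probability(cues) >= FLAG_THRESHOLD


if __name__ == "__main__":
    calm = {"heart_rate_z": 0.1, "gaze_instability": 0.2, "fidgeting": 0.0}
    agitated = {"heart_rate_z": 2.5, "gaze_instability": 1.8, "fidgeting": 1.0}
    print(flag_for_screening(calm), flag_for_screening(agitated))  # False True
```

Even taking the reported 70% lab accuracy at face value, a screen of this kind applied to millions of passengers would inevitably produce a very large number of false flags, which is part of what makes the technology so contentious.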

That’s all pre-crime tech. Unnerving as it may sound, what comes next is the ‘social credit mechanism’.

 

As Civil Society Futures points out in a recent article on civil society in an age of surveillance: “Citizens are increasingly categorised and profiled according to data assemblages, for example through data scores or by social credit scores, as developed in China. The purpose of such scores is to predict future behaviour and allocate resources and eligibility for services (or punishment) accordingly. In other words, rules will be set for citizens to live by.”

And those rules are not just hypothetical. In China, parts of the social credit mechanism have already been put into practice, according to Rogier Creemers, a researcher specialising in Chinese law and governance at the Van Vollenhoven Institute at Leiden University.

“When rules are broken and not rectified in time, you are entered in a list of ‘people subject to enforcement for trust breaking’ and you are denied access from things.”

According to a document released by China’s State Council, “trust-breakers” can face penalties on subsidies, career progression, asset ownership and the ability to receive honorary titles from the Chinese government.

In a similar vein, those who fail to repay debts are punished by travel restrictions. Just last month, the Supreme People’s Court announced that 6.15 million people in the country had been banned from air travel over the last four years for defaulting on court orders, according to local media.

To enforce penalties, the court announced it was working with a total of 44 government institutions.
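
To make concrete what the enforcement mechanism described above amounts to, here is a minimal sketch based only on the steps reported: rules broken and not rectified in time put a person on a ‘trust-breaking’ list, and cooperating institutions then deny them restricted services such as air travel. The record fields, service names and rules in the sketch are hypothetical, not the actual Chinese system.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record and rules, loosely modelled on the enforcement
# steps described above; not an actual social credit data structure.
RESTRICTED_SERVICES = {"air_travel", "subsidies", "honorary_titles", "promotion"}


@dataclass
class CitizenRecord:
    citizen_id: str
    unrectified_violations: List[str] = field(default_factory=list)

    @property
    def on_trust_breaking_list(self) -> bool:
        # Rules broken and not rectified in time put the citizen on the list.
        return len(self.unrectified_violations) > 0


def record_violation(record: CitizenRecord, rule: str, rectified_in_time: bool) -> None:
    """Only violations that are not rectified in time count against the citizen."""
    if not rectified_in_time:
        record.unrectified_violations.append(rule)


def is_eligible(record: CitizenRecord, service: str) -> bool:
    """Cooperating institutions deny restricted services to listed citizens."""
    return not (record.on_trust_breaking_list and service in RESTRICTED_SERVICES)


if __name__ == "__main__":
    record = CitizenRecord("C-001")
    record_violation(record, "unpaid_court_order", rectified_in_time=False)
    print(is_eligible(record, "air_travel"))  # False
    print(is_eligible(record, "groceries"))   # True
```

Once such a list exists, extending the set of restricted services is a one-line change, which is why the number of cooperating institutions matters so much.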

 

Just four years ago, when it became evident that the British and American surveillance agencies GCHQ and NSA were collecting citizens’ data on everything from emails, calls, conversations, contacts, health and financial information to intimate images taken in people’s homes without permission, people rapidly started changing their online behaviour. This is called self-censorship. And this, alongside reward and penalty systems, is what the government has in store for its citizens next.

A Guardian article just two months ago on Britain’s rapidly declining democracy revealed another truly disgraceful use of technology, put forward without any public debate:

“Documents seen (by the Observer) show that this was a proposal to capture citizens’ browsing history en masse, recording phone conversations and applying natural language processing to the recorded voice data to construct a national police database, complete with scores for each citizen on their propensity to commit crime. The plan put to the minister was Minority Report. It was pre-crime. And the fact that Cambridge Analytica (the company involved in data collection) is now working inside the Pentagon is absolutely terrifying.”

 

Equate this to an expansion of, say, the speeding-fine system of penalty points on your driving licence. You are caught speeding by a speed camera, a penalty notice is automatically issued and points are added to your licence. You are then further penalised by insurance companies for the known increased risk, and so you try not to get caught speeding. Now imagine the same system applied to protesting at a fracking site or joining a civil liberties group that questions systems such as these.
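
Stripped to its logic, that loop is trivially simple, and trivially extendable: detect, issue a notice, accumulate points, penalise. The toy sketch below shows how mechanically a new category of ‘offence’ could be bolted on; the point values, thresholds and offence names are invented for illustration, not drawn from any real system.

```python
# Toy sketch of an automatic penalty-points loop; the point values,
# thresholds and offence categories are invented for illustration.

POINTS = {
    "speeding": 3,
    # The article's warning: the same machinery could be pointed at lawful
    # behaviour simply by adding an entry like the one below.
    "attended_protest": 2,
}
DISQUALIFY_AT = 12


def record_offence(licence: dict, offence: str) -> dict:
    """Detection triggers an automatic notice and adds points to the record."""
    updated = dict(licence)
    updated["points"] = updated.get("points", 0) + POINTS[offence]
    updated["notices"] = list(updated.get("notices", [])) + [offence]
    return updated


def insurance_multiplier(licence: dict) -> float:
    """Insurers penalise the known increased risk: a hypothetical 10% per point."""
    return 1.0 + 0.1 * licence.get("points", 0)


if __name__ == "__main__":
    licence = {"holder": "example", "points": 0, "notices": []}
    licence = record_offence(licence, "speeding")
    licence = record_offence(licence, "attended_protest")
    print(licence["points"],                        # 5
          round(insurance_multiplier(licence), 2),  # 1.5
          licence["points"] >= DISQUALIFY_AT)       # False
```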

Blacklisting union members has already been covertly practised by UK corporations in the construction industry; thousands were stopped from gaining contracts or work simply for being members of a trade union. The British government blacklists some companies as poor outsourcers, quite rightly, but then awards contracts to G4S even after the numerous crimes it has committed against the state. The point is that the government actively manages people and companies, without their knowledge, for its own gain. The jump from speeding points to blacklisting is not a big one, and for all we know it is coming sooner than we think. After all, none of us were aware that images of us were being taken in our own homes, our conversations recorded and our internet browsing tracked, until Edward Snowden exposed our government’s big secret.

Make no mistake, pre-crime technology has already arrived, and it has more than the simple purpose of predicting crime. Combined with other mass data collection, it will by its very nature have the power to force self-censorship, and over time technology such as social credit systems will apply punishments and rewards similar to those being trialled in China right now.