New government agency surveillance system found

8th November 2018 / United Kingdom

By TruePublica: Steve Shaw, the local democracy reporter at Your Thurrock, has published a story about the local council and its use of technology to monitor local residents. It demonstrates just how unaccountable government organisations and their agencies can be.


It was only two years ago that we found out that local councils were using anti-terror laws to spy on people walking dogs or feeding pigeons, deploying secret listening devices, cameras and surveillance systems, and even hiring private detectives equipped with all manner of technologies.

Wolverhampton used covert surveillance to check on the sale of dangerous toys and car clocking; Slough did so to aid an investigation into an illegal puppy farm; and Westminster to crack down on the sale of fireworks to children. All of these examples may seem justified; the problem is that none of them can be considered acts of terrorism or matters of national security.

Technology is being used for all sorts of government activities without debate or permission. Take the police, for instance.

We found out in May that the police were using big data to stalk potential offenders who have yet to break the law: in essence, a pre-crime, predictive software system for arresting people. Last July we found out police were using facial recognition systems with a mismatch rate of over 90 per cent. In August we found out that London police are roaming the streets armed with fingerprint scanners, the modern equivalent of being asked to show your papers.

The list of these alarming discoveries is getting lengthier each month.

Britain already has the unenviable reputation of deploying the most intrusive surveillance systems of any government in the Western world against its own people. But now we have something new.


“New information has been revealed about a controversial computer system that is being used to create profiles on vulnerable people in Thurrock. Information on the system known to the council as the “predictive modelling platform” has been revealed through a freedom of information request. It outlines how council data from housing, education, social care, benefits and debt all contribute to the creation of a profile that is used to predict whether a person is at risk.”


The report says that the profiles then assign “potentially vulnerable people” a score that indicates whether they need attention from social services. That risk score is stored centrally, with identifiable details replaced by artificial ones, a process known as pseudonymisation.

The identifying details are only re-added when the case is accessed by a professional, such as a social worker.
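For readers unfamiliar with the term, pseudonymisation simply means swapping real identifiers for artificial ones, while keeping a separate, restricted means of linking them back. The short Python sketch below illustrates the general idea only; the field names, key handling and role check are made-up assumptions for illustration and say nothing about how Xantura's platform actually works.

    # Illustrative sketch only: NOT Xantura's system. Shows the general idea of
    # pseudonymisation described above, using invented field names.
    import hmac, hashlib

    # Assumption: a secret key held by the data controller, kept apart from the scores.
    SECRET_KEY = b"held-separately-by-the-data-controller"

    def pseudonymise(real_id: str) -> str:
        """Replace a real identifier with an artificial one (a keyed hash)."""
        return hmac.new(SECRET_KEY, real_id.encode(), hashlib.sha256).hexdigest()

    # A risk profile as it might be stored centrally: no directly identifying details.
    record = {
        "pseudonym": pseudonymise("AB123456C"),   # artificial identifier
        "risk_score": 0.8,                        # e.g. likelihood a case needs safeguarding
        "source_datasets": ["housing", "education", "social care", "benefits", "debt"],
    }

    # Re-identification only happens when an authorised professional opens the case,
    # via a separate pseudonym -> identity mapping that is not stored with the scores.
    reidentification_table = {pseudonymise("AB123456C"): "AB123456C"}

    def open_case(pseudonym: str, role: str) -> str:
        # Assumption: a simple role check stands in for real access control.
        if role != "social worker":
            raise PermissionError("not authorised to re-identify this record")
        return reidentification_table[pseudonym]

    print(open_case(record["pseudonym"], "social worker"))

The point of the design is that anyone with access to the central store sees only scores attached to artificial identifiers; linking a score back to a named person requires the separate mapping, which is restricted to professionals handling the case.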

The system has been created by a private company called Xantura and aims to cut costs by allowing services to identify cases early, before they become more complex. It is estimated that by 2020 the council will have spent £1.14 million on the system.

In four years the system has become so embedded within Thurrock’s social services that it is responsible for 100 per cent of referrals to the Troubled Families programme, a government-led scheme aimed at early social work intervention. The council also claims an 80 per cent success rate in predicting which children are at risk and should enter safeguarding.


To the unsuspecting or uninformed, here lies the problem: mission creep. It is the hallmark of all things to do with surveillance technology and government.

A new phase of this particular project got underway last July, which sees predictions being made for “homelessness prevention and anti-social behaviour profiling”. Apparently, it is also being used for debt collection.

Xantura has denied that the goal is to accuse people of future crimes and stresses that it does not have the ability to recommend what actions the council should take once the profiles have been created.

When details of the programme emerged last month, civil liberties group Big Brother Watch branded it a “terrible idea” which risks “profiling families and casting suspicion over their parenting abilities on the basis of high tech stereotyping”.

Councillor Sue Little, portfolio holder for Children and Adult Social Care, said: “It is important to emphasise that data analytics systems are only part of the process and further verification and checks are carried out prior to any intervention.”

However, like local authorities’ abuse of the Terrorism Act to collect parking fines, this system has already morphed from its primary role into one with a much wider remit across the community. At what point does this stop? And how long will it be before some or all of the extremely sensitive data this privately owned company holds on entire communities is compromised, or sold?
