UK’s most vulnerable subjected to welfare decisions by algorithms

29th November 2018 / United Kingdom
UK’s poorest are put at risk by automated welfare decisions

Big Brother Watch wrote to the UN Special Rapporteur on Extreme Poverty and Human Rights ahead of his visit to raise the alarm about the hidden use of automation and predictive analytics in decisions about benefit entitlements and social care, and the negative human rights impact on the UK’s poorest people. The submission follows the organisation’s sending of over 1,000 Freedom of Information requests to authorities about their use of AI, algorithms and big data in decision-making.

In its submission to the UN Special Rapporteur, BBW explains that authorities’ lack of transparency and inadequate legal frameworks mean that the rapid adoption of new technologies is engaging human rights in ways that are difficult to analyse or challenge.

 

“Claimants already have to deal with a frequently changing and punitive assessment process – now they are being affected by complex technological systems they rarely know about, cannot understand, and are not able to challenge.”

 

The new tools being used by authorities include risk analysis and profiling; automated fraud detection; predictive analytics, spanning from late rent payments to ‘child abuse’ and ‘youth offending’; and benefit entitlement calculations – and have previously even included voice analytics to detect claimants who are ‘lying’.

These new technologies are largely supplied by private contractors and are fuelled by big datasets, sometimes involving hundreds of millions of data items and including sensitive fields such as health data, education records and ethnicity.

 

Human rights and welfare

The UK’s welfare system touches on a spectrum of rights: the right to life, the right to health, the right to be free from inhuman or degrading treatment, freedom from discrimination, the right to education, the rights of children, the right to fair work, access to justice, and the right to peaceful enjoyment of property.

In the context of welfare assessments and decisions, the stakes could not be higher: this myriad of rights is engaged, and people’s lives, health and social integration are often at risk.

Welfare and social care decisions should be transparent, accessible and open to challenge for officials and claimants alike – not just for highly trained lawyers, but for the members of the public actually affected by the decisions.


Loopholes in the law

As it stands, UK citizens can be subject to purely automated decisions by the authorities – even where those decisions engage their rights. This is a step change, and one that could have huge implications with the dawn of new and emerging technologies.

Big Brother Watch lobbied during the passage of the Data Protection Act 2018 for vital protections that would ensure citizens are always entitled to a human decision where their rights are engaged. However, the Government rejected this safeguard.

Crucially, citizens should be notified when an automated decision has been made about them and given a right to appeal – but a loophole in the new Act allows authorities to circumvent those minimal safeguards if there is merely tokenistic human involvement in the decision. In its submission to the Special Rapporteur, BBW raised concerns that this new legal framework is:

 

“leaving claimants vulnerable to welfare decisions that are for all intents and purposes automated decisions, without individuals being notified of this fact or their right to appeal.”

 

BBW raised additional concerns about the Digital Economy Act 2017, which permits mass data sharing between public authorities and private companies for “public benefit”. In fact, the Act allows broad data sharing to improve citizens’ “physical and mental health”, “emotional well-being”, “the contribution made by them to society”, and “their social and economic well-being”, enabling new forms of paternalistic intrusion into the private lives of those who are most vulnerable. BBW has raised concerns that data sharing to evaluate and improve the “contribution” one makes to society risks amplifying the punitive potential of some welfare sanctions, such as work schemes that have adversely affected people with disabilities and ill health.

The Act also allows data sharing between the state and private companies to prevent or detect crime or anti-social behaviour, for criminal investigations, for legal proceedings, for “safeguarding vulnerable adults and children”, for HMRC purposes, or where required by EU obligations. BBW told the Special Rapporteur:

 

“In effect, personal data can be shared across government departments to investigate, penalise or otherwise intrude on the lives of those in receipt of welfare, pensioners, and some of the country’s most vulnerable people.”

 

Hidden systems

There is currently no mechanism in place for authorities to report their use of algorithms, despite the UK’s claim to be a world leader in technology innovation. Indeed, the new legal frameworks fail to require that vital transparency.

BBW issued over 1,000 Freedom of Information requests in an attempt to provide that transparency, but were unable to gain a comprehensive picture of authorities’ use of technologies in welfare. They explore some of the reasons for this in their submission, including a lack of shared definitions and understandings of new technologies within authorities; their reluctance to share information; their use of highly integrated and complex systems; and their agnosticism towards the functions and impact of new systems.

 

Wilful blindness

New technologies are being rapidly and enthusiastically adopted by local authorities, partly due to the climate of austerity and partly due to the general trend of technological solutionism. BBW’s research reveals agnosticism within authorities towards the inner workings of the technologies they use, sometimes with a wilful blindness to the data fed to the tools, and frequently with apathy towards their impact on citizens.

Human rights are engaged in complex ways by new and emerging technologies. The impact must be assessed – but transparency is currently being obstructed, and our rights are at risk. Big Brother Watch looks forward to engaging with the UN Special Rapporteur further and shining a light on hidden new technologies.

 

Big Brother Watch exposes and challenges threats to our privacy, our freedoms and our civil liberties at a time of enormous technological change in the UK.

 

 
