Digital Welfare
We hear a lot about the impact of automation, robots and artificial intelligence on work. Whole professions may disappear, millions will need new skills to survive, working hours will be cut drastically; any number of predictions are made on a daily basis.
There is far less debate about the impact of these technologies on the lack of work, and on the systems set up to deal with unemployment and benefits, yet the consequences for those needing this support may be even greater. New technologies, algorithmic decision-making and the use of huge datasets are being introduced across the globe with little citizen engagement, transparency or debate about the ethical issues.
The Department for Work & Pensions (DWP) is investing in AI to both augment and replace some of its processes and decision-making, but the extent and content of this work and its implications are not in the public domain, and there has so far been no public engagement or debate.
In Colombia, the Sisben social assistance programme for people on a low income uses large amounts of both public and private sector data to assess eligibility and identify fraud.
The Bolsa Familia conditional cash transfer programme in Brazil collects enormous amounts of data as an integral part of the design of the system.
Biometrics are used in Venezuela to control access to food and medicines.
These and other examples tend to exhibit common areas of concern:
Opacity
What data is collected, how it is used, where it comes from and how it is shared is rarely known by people interacting with digitised welfare systems. The algorithms used to analyse data and make decisions on claims, eligibility, ID, fraud or conditionality are rarely made public. Without access to this information, individuals wishing to claim benefits or challenge decisions have very little to go on.
Discrimination and anomalies are very difficult to identify or correct in a data-driven system, and indeed can be baked into the core functioning of systems, by accident or design. A Dutch system was ruled to be in breach of human rights law for targeting poorer neighbourhoods without due cause. Welfare systems are rarely known for their transparency or simplicity of decision-making, and datafication appears to be adding another layer of mystery.
Privacy and inequality
If claimants do not know what data is being used to assess and process their claims, they are equally unlikely to know who has access to what is very likely extremely personal information. The combination of multiple datasets allows authorities to piece together highly detailed profiles of individuals, and in some cases such as Bolsa Familia it is believed that far more data is used than is necessary to administer the programme.
Who has access to this information beyond those running programmes may also be unknown; in cases where the private sector is involved in public programmes this presents significant concerns. Can private companies sell this data or use it to improve their own products? Are claimants asked for their consent, and do they have the chance to opt out of their data being used in this way?
Individuals and households needing public assistance are required to give up their rights to privacy in exchange for support in ways that those with sufficient wealth or income are not. Inequalities are further entrenched and the autonomy of beneficiaries is undermined.
Technologies are not neutral
The use of data and technologies can be framed as a neutral, technical solution that simply speeds up existing processes, introducing efficiencies and reducing errors. But the decisions about what technology to use, for what purposes and the ‘settings’ of the tools are entirely driven by policy decisions, and therefore ideology and politics. A system that already penalises claimants for infractions of the rules, such as Universal Credit, will surely develop data-driven approaches that build on its core narratives of the undeserving unemployed, benefit cheats and families dependent on welfare.
When you layer ideological decisions on top of the biases well known to exist in algorithmic systems, the potential for discrimination and unfair decisions just ramps up further. The existing problems are deepened, rather than ameliorated.
Digital divide
Making benefits and public assistance programmes ‘digital by default’ disadvantages those unable to use digital technologies, whether through lack of skills, confidence or suitable equipment. In 2018, 10% of the UK adult population were not regular users of the internet.
Older people, people with disabilities, women, the economically inactive and those on lower incomes are more likely to be non-users and to lack access to the right tools, for example an internet connection at home. The potential for these groups, already often excluded and marginalised economically, to be further disadvantaged by digital welfare systems is clear.
Where do we go from here?
The surge of benefit claims in the UK as a result of Covid-19, together with the temporary (and unfortunately short-lived) changes to the conditions attached to receiving benefits, started to change the debate about how these benefits should work, and for whom. At the same time, Covid-19 appears to be accelerating the adoption of new technologies in many fields.
Now is therefore the time when we should be asking how technology could and should be used in making decisions about social assistance and benefits. Data and AI could open up access to those not captured by current systems, offer claimants more information and transparency on how decisions are made, and be used to enhance support for jobseekers with tailored advice based on local labour market data and personal details. These questions need active engagement and debate before a new age of digital welfare is ushered in without scrutiny.
(c) Anna Dent 2020. I provide social research, policy analysis and development, writing and expert opinion, and project development in Good Work and the Future of Work / In-Work Poverty and Progression / Welfare benefits / Ethical technology / Skills / Inclusive growth