The Use of Artificial Intelligence in Housing, Welfare, and Social Care
‘Social services already treat us as numbers. Most professionals are very quick to judge you based on a report, before even meeting you. They don’t try to put themselves into your shoes. Artificial intelligence will make this even worse, treating us as a series of tick boxes instead of as whole people who are part of families.’
– Tammy Mayes, an activist representing ATD Fourth World
On 19 July, four representatives of ATD Fourth World were invited to the University of Sheffield to offer critical feedback from lived experience of poverty. The event began with the presentation of findings from a rapid scoping review by the Artificial Intelligence (AI) in Housing, Welfare, and Social Care Network (AI-HoWS) on what a knowledge exchange professional called ‘a thorny issue that needs a balanced approach’.
Before travelling to Sheffield, Tammy, Eric Knibbs, Amanda Button, and Patricia Bailey met with a dozen other members of ATD Fourth World with lived experience of poverty to discuss the issues. Among the many concerns raised were:
- that computer algorithms based on generalised data reinforce existing negative stereotypes, for instance about low-income neighbourhoods where heavy policing can produce artificially high crime statistics;
- that the requirement of using computers to apply for benefits creates accessibility barriers for people in poverty and increases the risk of identity theft or coercion of people who have not had the opportunity to learn digital literacy;
- that positive social work practices will not be taken on board by the AI specialists who design the algorithms;
- that inequalities will be exacerbated, as will negative social work practices and stereotypes.
[Photo caption: Tammy Mayes, second from left]
These activists also had questions about the supposed benefits of AI. Government policy makers see AI ‘as essential for providing service improvements and efficiencies and for lowering internal costs’. Activists asked:
- Could the increase in efficiency be designed to lower waiting times, for example the five weeks that people must currently wait before receiving Universal Credit?
- Could the cost savings be invested directly to benefit people in poverty, for instance by offering them training in digital literacy or by funding more widespread access to computers in public libraries?
During the workshop in Sheffield, ATD Fourth World pointed out that algorithms based on risk and past negative situations do not allow policies to build on the hope and aspirations that help people to overcome the obstacles of poverty. Dr. Stephen Potter, a researcher in the AI-HoWS Network, built on this point to stress that ‘AI based on past data is designed to perpetuate the existing systems and decision-making processes. By its nature, it conserves past policies and does not allow for creative innovation.’
Researcher Jenny Hayes warned that in some places AI is being used prescriptively, where the computer’s decisions about possible benefit fraud are imposed on humans. The workshop participants all felt it would be better for AI to be used only in a more open approach where humans can use their emotional intelligence to override algorithms.
Throughout the session, participants were invited to engage in break-out dialogues with the others seated at their tables. Eric Knibbs had mixed feelings about his table. He said, ‘On the whole it was fairly good, with a lot of interesting discussions. The problem was that certain people were very fixed on their ideas as to the use of AI in assessments. They were not going to give an inch to other people’s views. I would be happy to go back for further sessions, just in the hope that I would be in a different group and have people who were prepared to listen to other people’s views.’
Amanda Button was pleasantly surprised by the interactions at her break-out table: ‘I was expecting at least one person to be disagreeable, but everyone seemed quite open to a frank discussion. Each of us had a chance to put our point of view across. Everybody listened, and took on board what others had said. We enjoyed the discussion so much that we even got carried away and wanted to keep adding points when the time was up.’
Several other participants gave feedback highlighting the need for ‘jargon-busting with plainer English and real-life examples’ and a more inclusive approach, such as involving people with lived experience of poverty in planning the structure and activities of the day.
In addition to sharing existing academic knowledge, the event’s purposes included establishing the network’s research aims, attracting diverse potential collaborators, shaping the priorities for this research to go forward, and promoting the representation of people with lived experience in the development of AI in public services. Dr. Desiree Fields pointed out that algorithms are constantly tweaked, adding that this means it would be important for people in poverty to be involved continually.
Another researcher commented about the event: ‘Great mix of people in the room. We need more diverse research networks like this, not the usual way of doing things.’
Researcher Calum Webb thanked the ATD Fourth World delegates, saying, ‘You all bring so much valuable knowledge and energy to the discussions. It wouldn’t have been the same without you. We’re a very new network and had some great feedback from this first event. We are really keen to make sure that we listen to everyone’s voices and create a space where everyone feels welcome, involved, and listened to. It’s great that our faculty has supported us in involving people with lived experience from the start and we’d now like to build on that and involve people further and in more meaningful ways.’
Despite the technical nature of the issue, activists with lived experience of poverty felt that they had made an important contribution to the discussion, and this is a subject they want to keep contributing to, in collaboration with other stakeholders, as the research develops. ATD Fourth World is grateful for the opportunity to bring its knowledge to this debate and for support from the University of Sheffield.
To contact the Artificial Intelligence (AI) in Housing, Welfare, and Social Care Network and to register interest in future events, please visit: ai-hows.sheffield.ac.uk
For more information about the use of AI in public government services, please see the Report of the United Nations Special Rapporteur on extreme poverty and human rights on his visit to the United Kingdom, which states:
‘Fraud and error detection and prevention is also being automated. The Department [of Work and Pensions] has invested in data matching to identify fraud and error. It has subsidized “risk-based verification systems”, mostly built by private IT vendors, which flag claimants as being at low, medium or high risk of fraud and error and covertly subject those flagged as high risk to more intense scrutiny. The Department is also developing a “fully automated risk analysis and intelligence system for fraud and error”. More public knowledge about the development and operation of automated systems is necessary. In the absence of transparency about the existence and workings of automated systems, the rights to contest an adverse decision and to seek a meaningful remedy are illusory.’