Plan to trial 'big data' analytics in social care sparks ethics debate
Published by Professional Social Work magazine – 12 February 2019
A trial to test whether computers can use predictive analytics to spot risks in families has sparked debate over the ethical and privacy issues involved.
The What Works Centre for Children’s Social Care wants to test the ability of “machine learning” to spot patterns by analysing “large quantities” of case notes and outcomes data from children’s services departments in England that sign up to the project.
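The centre has not published technical details of the models it plans to trial. Purely as an illustration, the sketch below shows one common form such “pattern spotting” could take: a supervised classifier trained on free-text case notes and recorded outcomes. Everything here, including the notes, the labels and the choice of model, is hypothetical and invented for this example.

```python
# A minimal, hypothetical sketch of the kind of text-classification
# approach a pilot like this might trial: case notes are converted to
# word-frequency features and a model learns to associate them with
# recorded outcomes. None of the data below is real.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented case notes and outcome labels (1 = case escalated,
# 0 = case closed) -- purely illustrative.
notes = [
    "missed three school days, parent disengaged from support",
    "family attending sessions regularly, home conditions improved",
    "repeat referrals from neighbours, unexplained injuries noted",
    "stable placement, positive contact with social worker",
]
outcomes = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, outcomes)

# The model outputs a probability rather than a decision -- echoing
# the centre's stated aim of supporting, not replacing, practitioners.
print(model.predict_proba(["parent disengaged, referrals from school"]))
```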
Last year The Guardian revealed that at least five English councils had developed predictive analytics software for child safeguarding, some in partnership with private firms.
The government-created What Works Centre says there is a pressing need for “an evidence-led debate on when and where these tools are effective, ethical and acceptable”.
Its pilot is backed by Anne Longfield, the Children’s Commissioner for England, who said it would help test the technology’s potential to help services spot children who need support earlier.
But some social workers and children’s campaigners say the ethical issues need more consideration before the pilot launches.
Sue White, professor of social work at the University of Sheffield, said: “There needs to be a debate on the ethics of this before it is done - it can’t just be tacked on and it must involve far more than just our sector because the implications of using artificial intelligence in this way are profound.
“A major ethical concern I have is that it can only increase surveillance of people who are already in difficult circumstances. Then there are questions of how this data would be used in the future.
"I also have doubts over the legality of doing this at the level of individuals and families - how do they meet the threshold for data sharing without consent?
“Given the services we are talking about, the data being shared will be incredibly sensitive. This would be a major change in the way the state interacts with citizens.”
Kathy Evans, chief executive of children’s charity Children England, said the issue of consent was among a host of questions that needed to be answered about the pilot.
She said she had been alarmed by a privacy assessment of Hackney council’s predictive analytics work, released by the council last year under the Freedom of Information Act. It showed people whose data was used would not be informed or given the chance to opt out.
While welcoming the What Works Centre’s call for dialogue, she also stressed an ethics discussion was needed before experimenting on “people’s real lives”.
Other revelations on the impact of big data and algorithms on society, such as the Cambridge Analytica scandal, also pointed to the need for a much wider debate on the technologies, Evans added. She drew a parallel with the Warnock Commission of the 1980s, which considered the possibilities opened up by advances in fertility treatment and the safeguards required.
“I remember how huge that was when I was a child. Medical technology was advancing to the point there were serious ethical questions that had to be answered before it went further.
"The commission was set up in recognition that there were some incredible benefits that might come from it but it could also go in directions that nobody would want – so let’s take a look and put in place an ethical framework around it. I feel like we’re probably at that point.”
Writing on Twitter, Steve Walker, director of children’s services at Leeds council, said most directors could already point to the localities of greatest need in their areas without recourse to algorithms.
“I have no problem with new technologies. What I am saying is simply we know a lot about the issues that affect children and result in them experiencing poor outcomes.
“What we should be doing is focusing on how we tackle them through better practice.”
Maris Stratulis, national director of BASW England, said machine learning in social care raised “significant ethical, consent and clinical issues about the value of how we engage with children and families”.
She added: "The increasing role of digital profit-making companies in the field of social care is an area of concern and rather than focusing on what we can learn from machines and algorithms, let’s focus on good relationship based social work practice.”
The What Works Centre said it would publish a report on the ethics of using machine learning in social care as part of the pilot.
Its executive director Michael Sanders said there were “a number of risks using big data, as well as a lot of misinformation out there regarding the application of predictive analytics which we hope to uncover, consider and address.”
He added the pilots aimed to see if big data analytics could help frontline workers make decisions and were not about “making decisions for them.”
Asked about the issue of consent, the What Works Centre said local authorities taking part in the pilot would be responsible for considering issues around the processing of specific individuals’ data, including permissions.
“Needless to say, we’ll work closely with all potential partners to ensure these pilots fully comply with all aspects of the data protection legislation, and individuals’ rights under it,” a spokesperson added.
England’s chief social worker for children and families, Isabelle Trowler, urged social workers to “get on the front foot” in contributing to discussions on the issue, adding they were “well placed to lead the ethical debates involved”.
The What Works Centre plans to host a discussion event on machine learning in social care in March or April in order “to have an open conversation” about the topic.
This article is published by Professional Social Work magazine, which provides a platform for a range of perspectives across the social work sector. It does not necessarily reflect the views of the British Association of Social Workers.