Singing like a state
3 May 2024
Sam Fender and Richard Dawson
At the emotional peak of Seventeen Going Under, Sam Fender (think Springsteen, but Geordie) hollers ‘I see my mother, DWP see a number!’.
It’s an old trope, but I get a shiver when I hear it, and I think of Longbenton - the giant DWP/HMRC office on the edge of Newcastle, where the song is set.
Another Geordie singer, Richard Dawson, presents the view from the other side of the Job Centre desk.
In Civil Servant he adopts the role of a DWP employee, doling out penalties to benefits claimants who are only slightly more impoverished than he is.
It contains the astounding lyric:
I don't want to go into work this morning
I don't think I can deal with the wrath of the general public
And I don't have the heart to explain to another poor soul
Why it is their Disability Living Allowance will be stopping shortly
By the end of the song the narrator has completely snapped, reduced to shouting ‘Refuse! Refuse! Refuse!’ in a high falsetto.
The right to refuse #
Dawson's civil servant eventually refuses to participate in the cruelty of the system he finds himself part of.
But as we automate more and more of our services, how can we retain that power - to pull the plug, blow the whistle or just make an exception?
All states have reductive tendencies. That’s how you scale governance: by reducing real things (people, situations) to categories, then applying rules to them.
It’s like dialling down the resolution of an image until it’s a blocky matrix. If you’re an edge case, you get anti-aliased.
But something is lost in this process, and this can cause harm - often to more vulnerable people and environments.
I think harm reduction is a crucial role of designers and user researchers (and all of us, really) working on government services.
If we start with the assumption that our service sometimes causes harm, it can motivate us to seek it out, call it out and address it.
But if increasing automation means we have fewer people designing and running services, will there be fewer opportunities to recognise when harm is caused, and act?
It's possible that AI automation will be able to meet the challenge of accommodating the rich diversity of human circumstance without overly reducing it.
But if it can't adequately explain why it's making the decisions it's making, can we trust it?
PS - The brilliant Seeing Like a State by James C. Scott contains many examples of reductive and harmful state schemes, from imposed last names and scientific forestry to city planning and official languages.
PPS - The awesome Vicky Teinaki reminded me that DWP's information charter currently states that it does not use AI to 'determine or deny a payment to a claimant'.