(2024-08-01) Davies Seeing Like A State Machine

Dan Davies: seeing like a state machine. I was asked by a friend to expand on this short comment, in an obituary for the political thinker and author of “Seeing Like A State”, James C Scott:

What I meant by it is that Scott is an example of what I’ve called elsewhere “intellectual carcinization” – the phenomenon whereby people from other fields reinvent some of the important principles of management cybernetics, simply because they’re studying the same problems and the underlying mathematical structures are there to be found.

So, inaugurating a new series called “If I Had Been Present At The Creation (I could have offered some useful advice)”, I will try in this post to explain how Seeing Like A State could have addressed many of the objections later made by Brad DeLong and by Henry Farrell in the post linked above, if only its author had ever heard of management cybernetics.

To start with: the central problem of Seeing Like A State is recognisable as what I’ve argued in the past is the central problem of all management theory – that of “getting a drink from a firehose”.

States do this by engaging in “The Politics Of Large Numbers”, to take the title of Alain Desrosières’s excellent book. They collect numbers. As always, collecting numbers is never an innocent or technical business. (legibility)

Furthermore, the act of tabulation and collection is always also an exercise in deciding what you’re not going to collect information about.

Oversimplifying mightily, he argues that lots of things go to crap because the bureaucracy isn’t able to handle “metis”-type information, and so ends up trampling over populations and institutions that it doesn’t understand.

And the schemes have indeed often failed. There’s an absolutely horrifying anecdote (from Susan Greenhalgh’s book, but told to me by Dan Wang) about how the Chinese one-child policy basically began when one of the country’s best engineers was first introduced to the idea of using applied mathematics.

But … it’s a big step from noting that things can go off the rails in this way, to presuming that it’s an intrinsic failing of the bureaucratic state.

Couldn’t it just be a design failure? Early rockets and steam engines blew up, a lot, but that didn’t mean they were intrinsically bound to fail as a means of propulsion.

As I think Scott would have seen if he’d come at things from a more cybernetic perspective – treating states as designed objects rather than taking an anthropological view – big systems are very dependent on the “plumbing” of their information, even though plumbing isn’t architecture.

Big corporations and states need to have information channels going from the bottom to the top – the “red handle signals”, as I call them in my book, that can bypass the normal hierarchy and get information to the decision-making centre, in time and in a form which can be understood. It’s the lack of that which has caused so many schemes to fail.
