Future Sailors: Notes on Inventing The Future (ii)

What does the “full” in “full automation” mean? Does it mean the automation of literally everything, or the automation of everything in some class of presumably-automatable things? We can rule out the former immediately, because of the halting problem: there are perfectly well-specified tasks, such as deciding whether an arbitrary program will ever finish running, that no automatic procedure can carry out. This is a more abstract, and much less interesting, reason than “what about care work?”; but it also shows that “full automation” can’t interestingly be taken to mean the automation of literally everything: it must mean the automation of some class of presumably-automatable things, and that immediately opens up the question of how that class is to be specified. (Again, for abstract and not very interesting reasons, it can’t be specified automatically: it’s impossible to have an automatic procedure for determining whether or not something is automatable, at least if we assume that “automatable” implies “computable” in some sense.)
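For anyone who wants the abstract reason spelled out, here is a minimal sketch of the standard diagonalisation argument behind the halting problem. The names (`halts`, `contrary`) are hypothetical, invented for illustration; the whole point is that no such `halts` procedure can actually be written, and so neither can a general checker for “is this automatable?”.

```python
# Sketch of the classical diagonalisation argument, not a working decision procedure.
# If "automatable" implies "computable", a universal automatability-checker would
# have to be able to answer questions like the one `halts` pretends to answer.

def halts(program, argument):
    """Hypothetical oracle: True iff program(argument) eventually stops."""
    raise NotImplementedError("No such total procedure can exist.")

def contrary(program):
    """Do the opposite of whatever the oracle predicts about program run on itself."""
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    return "halted"          # predicted to loop, so halt immediately

# Feeding `contrary` to itself yields a contradiction either way:
# if halts(contrary, contrary) were True, contrary(contrary) would loop forever;
# if it were False, contrary(contrary) would halt. So `halts` cannot be written,
# and nor can any procedure that decides automatability in general.
```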

Without totally dismissing the idea that some kinds of automation might make some kinds of care work easier to do (in the manner of “labour-saving devices”), I think we can rule out robot mental health nurses, childminders, or other kinds of looker-after-of-vulnerable-human-beings. That human beings are recurrently vulnerable in ways that require looking after, and that this looking-after isn’t amenable to automation, ought to frame our understanding of the ways in which automation can expand our capacity for action, or relieve us of the necessity of having to do certain kinds of work for ourselves. Lyotard makes the point that all human beings pass through a kind of neoteny, which we call childhood, and that this is as much “the season of the mind’s possibilities” as it is a kind of impairment. Childhood is a condition under which the things we want to do, and the things we need doing for us, are complex and dialogical, and we remain substantially under this condition as adults even if we have managed to find transitional objects to tide us over. Automation can do very little to relieve us of the work of childhood (although psychoanalysis is often concerned with a kind of automatism that takes hold in these processes, replacing dialogic tension and release with monologic fixation).

So, the Universal Abstract Subject that finds its opportunities for self-and-other-actualisation enhanced and amplified by technology is in a sense a subject separated from its childhood, a grown-up subject, with relatively stable needs and purposes. I am playing a game, and I want to be more successful at it; I script a bot to execute efficiently some of the combinations of moves I commonly have to make, or to evaluate the state of play and suggest winning strategies. Or: I have been given a well-specified and repetitive task to do, and it occurs to me that something much less intelligent than me could do it instead. Or: I offload part of the cognitive burden of detecting patterns in sensory information to a machine-learning system that has become superlatively good at noticing when something looks a bit like a cat, so that I can concentrate on something the machine-learning system isn’t quite so good at doing. What do these scenarios have in common? That the goals to be achieved are specified in advance, and that technical means exist through which they can be accomplished by a proxy agent with less and less involvement from the agent (me) whose goals they originally were.
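To make the shape of these scenarios concrete, here is a toy sketch of the scripted-bot case. Everything in it (the `Game` class, `opening_combo`, `proxy`) is invented for illustration; what it shows is just the structure described above: the goal and the moves are fixed in advance, and a proxy executes them with no further involvement from me.

```python
from dataclasses import dataclass, field

@dataclass
class Game:
    """Stand-in for any game whose state is advanced by named moves."""
    log: list = field(default_factory=list)

    def play(self, move):
        self.log.append(move)

# The combination of moves I already know I want: specified once, up front.
opening_combo = ["advance", "fortify", "trade", "advance"]

def proxy(game, script):
    """Replay a pre-specified script of moves so that I don't have to."""
    for move in script:
        game.play(move)

game = Game()
proxy(game, opening_combo)
print(game.log)  # ['advance', 'fortify', 'trade', 'advance']
```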

“Full” automation, then, means that things we already know we want to do, and already know how to do, should be done less and less by us, and more and more by proxies, so that we can spend more time on things we don’t already know we want to do, or how to do. There isn’t actually any specifiable endpoint to this process: we’ll never know when we’ve finished, when we’ve automated all the things. The argument of ItF, as I understand it, is that we’re lagging a long way behind: that there are still a great many things that human beings are doing unnecessarily, because capitalism (like Sports Direct) will happily use cheap labour rather than even quite “low” technology for as long as it can get away with it. The demand for full automation is then a (perfectly reasonable) demand to “catch up” with what technology has already made possible. But the dynamic I’ve been describing here suggests that this will mean not so much the elimination of work as its ongoing transformation.