Tuesday, 25 October 2022

Postdigital values, Marion Milner and John Seddon

I'm giving a talk on Thursday at the Carnet Users Conference (https://cuc.carnet.hr/2022/en/programme/) as part of the extensive strand on "postdigital education". My talk goes under the rather pompous title of "Practical Postdigital Axiology" - also the title of a book chapter I am writing for the Postdigital group - but really it is about something very simple. It is about "values" (axiology is the study of value), and values result from processes in which each of us is an active participant. Importantly, technology provides new ways of influencing the processes through which values are made and maintained.

It's become fashionable in recent years to worry about the ethics of technology, and to write voluminous papers about what technology ought to be or how we should not use it. In most of this discourse there is an emotional component which goes unexamined. This is what MacIntyre calls "emotivism" in ethical inquiry (in After Virtue), and it is part of what he blames for the decline in the intellectual rigour of ethical thought in modern times.

I wonder if the emotivism that MacIntyre complains of relates more to mechanisms of value which precede ethics. Certainly, emotivist ethical thought is easily confused with value-based processes. The emotion comes through in declaring something "unethical" when what has actually happened is a misalignment of values, usually between those who make decisions and those who are subject to them. More deeply, this occurs because those in power believe they have the right to impose new conditions or technologies on others. It would not happen if we understood that effective organisation - the form of organisation which benefits everyone - is the form in which values are aligned. This suggests to me that the serious study of value - axiology - is what we should be focusing on.

I think this approach to value is a core principle behind the idea of the "postdigital". The label has emerged from a mix of critique of technology and a deeper awareness that we are all now swimming in this stuff. A scientific appreciation of what we are swimming in is needed, and for me a key objective of postdigital science is to understand the mechanisms which underpin our social relations in an environment of technology. It is about understanding the "betweenness" of relations, and I think our values are among the key things that sit between us.

This orientation towards the betweenness of value is not new - indeed it predates the digital. In my talk, I am going to begin with Marion Milner, who in the 1930s studied the education system from a psychoanalytic perspective. In her "The Human Problem in Schools", she sought to uncover the deeper psychodynamics that bound teachers, students and parents together in education. It is brilliant (and very practical) work which has gone largely ignored in education research. In her book, Milner made a striking statement:

"much of the time now spent in exhortation is fruitless; and that the same amount of time given to the attempt to understand what is happening would, very often, make it possible for difficult [students] to become co-operative rather than passively or actively resistant. It seems also to be true that very often it is not necessary to do anything; the implicit change in relationship that results when the adult is sympathetically aware of the child's difficulties is in itself sufficient."

This is a practical axiological strategy. If, in our educational research with technology, we sought to manage the "implicit change in relationship that results when the "teacher" or "manager" is sympathetically aware of the "other's" difficulties", we would achieve far more. Partly this is because we would be aware of the uncertainties and contingencies in our own judgements and in the judgements of others, and we would act (or not act) accordingly. What are presented as "ethical" problems are almost always the result of unacknowledged uncertainties. Even with things like machine learning and "bias", the problem often lies in overlooking or ignoring the uncertainty of a classification, not in any substantive problem with the technology itself.
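
To make that concrete, here is a minimal sketch in Python - with invented numbers, not any real system - of the point about uncertainty. If we report how confident a classifier actually is and defer to human judgement when it is unsure, the outcomes that get labelled "biased" or "unethical" look rather different from when we force a yes/no decision on every case.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predicted probabilities for two groups; group B's predictions
# are systematically less certain (closer to 0.5), e.g. because it is
# under-represented in the training data.
p_a = rng.beta(8, 2, size=1000)   # mostly confident predictions
p_b = rng.beta(3, 2, size=1000)   # much hedgier predictions

def decide(p, threshold=0.5, abstain_band=0.0):
    # Make a yes/no decision, but abstain (defer to a human) whenever the
    # prediction falls inside the uncertainty band around the threshold.
    decisions = np.where(p >= threshold, 1.0, 0.0)
    decisions[np.abs(p - threshold) < abstain_band] = np.nan
    return decisions

for band in (0.0, 0.2):
    d_a, d_b = decide(p_a, abstain_band=band), decide(p_b, abstain_band=band)
    print(f"abstain band {band}: "
          f"A positive rate {np.nanmean(d_a):.2f} (deferred {np.isnan(d_a).mean():.0%}), "
          f"B positive rate {np.nanmean(d_b):.2f} (deferred {np.isnan(d_b).mean():.0%})")

Acknowledging the uncertainty changes the relationship: the uncertain cases become occasions for conversation rather than automated judgement.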

In my new job in the occupational health department at Manchester University (which is turning into something really interesting), there is a similar issue of value-related intervention. One of the emerging challenges in occupational health is the rising level of stress and burnout, particularly in service industries. A few years ago I invited John Seddon to talk at a conference I organised on "Healthy Organisations". It was a weird, playful but emotional conference (two people cried because it was the first time they had had a chance to express how exhausted they were), but Seddon's message struck home: stress is produced by what he calls "failure demand" - the system being misaligned and making more work for itself. The actual demand that the system is meant to manage is, according to Seddon, often stable.

It strikes me that Seddon's call to "study the demand" is much the same idea as the one contained in Milner's statement. It is not, strictly speaking, a call to do nothing, but a call to listen to what the environment actually demands and to respond to it appropriately. That way, we can understand the potential value conflicts that exist and deal with them constructively.
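
Studying the demand needn't be complicated. As an illustration (my own sketch, not Seddon's method, and with a made-up log): tag each incoming request to a service as value demand (what the service exists to do) or failure demand (rework caused by the service not getting it right first time), and the ratio shows how much of the workload the system is generating for itself.

from collections import Counter

# A hypothetical demand log for a service, tagged (by hand or by simple rules)
# as "value" demand or "failure" demand.
demand_log = [
    ("new referral",                  "value"),
    ("chase missing report",          "failure"),
    ("clarify a confusing letter",    "failure"),
    ("new referral",                  "value"),
    ("re-book cancelled appointment", "failure"),
]

counts = Counter(kind for _, kind in demand_log)
total = sum(counts.values())
print(f"failure demand: {counts['failure'] / total:.0%} of everything the service handles")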

