There's so much critique of higher education these days that little thought goes into how an institution might be optimally organised in the modern world. This is partly because critique is "cheap". Bateson made the point that we are good at talking about pathology and bad at talking about health, partly because to talk about health you need to talk about the whole system, whereas to talk about pathology you only need to point to one bit of it which isn't working and apportion blame for its failure. Often, critique is itself a symptom of pathology, and may even exacerbate it.
The scientific problem here is that we lack good tools for analysing, measuring and diagnosing the whole system. Cybernetics provides a body of knowledge - an epistemology - which can at least serve as a foundation, but it is much weaker empirically. Indeed, some aspects of second-order cybernetics appear almost to deny the importance of empirical evidence. Unfortunately, without experiment, cybernetics itself risks becoming a tool for critique. Which is pretty much what's happened.
Within the history of cybernetics, there are exceptions. Stafford Beer's work in management is one of the best examples. He used a range of techniques from information theory to personal construct theory to measure and analyse systems and to transform organisations. More recently, Loet Leydesdorff has used information theory to produce models of the inter-relationship between academic discourse, government policy and industrial activity, while Robert Ulanowicz has used information theory in ecological investigations.
Information theory is something of a common denominator. It was recognised by Ross Ashby that Shannon's formulae were basically expressing the same idea as his concept of "variety", and that this equivalence could be used to analyse complex situations in almost any domain.
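To make the connection concrete, here is a minimal sketch (my illustration, not Ashby's or Shannon's own working): the variety of a system with n equally likely states is log2(n) bits, which is exactly the Shannon entropy of a uniform distribution over those states.

```python
# Shannon entropy as a measure of Ashby-style variety.
import math
from collections import Counter

def shannon_entropy(observations):
    """Shannon entropy (in bits) of the empirical distribution of observed states."""
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four equiprobable states: entropy = log2(4) = 2 bits, i.e. Ashby's variety in bits.
print(shannon_entropy(["A", "B", "C", "D"]))            # 2.0
# The same four states used unevenly carry less information (~1.79 bits).
print(shannon_entropy(["A", "A", "A", "B", "C", "D"]))
```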
However, there are some big problems with Shannon's information theory. Not least, it assumes that complex systems are ergodic - i.e. that their statistical behaviour over a short stretch of time is representative of their behaviour over the long run. All living systems are non-ergodic - they develop new features and new behaviours which are impossible to predict at the outset.
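As a sketch of what non-ergodicity does to the measurements (my own illustration, not a formal argument): for a source that keeps growing its repertoire of states, the entropy of its early behaviour is a poor guide to the entropy of its later behaviour.

```python
# Compare windowed entropy estimates for a stationary source and for a source
# whose alphabet of states grows over time (a crude stand-in for a living system).
import math
import random
from collections import Counter

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
stationary = [random.choice("AB") for _ in range(2000)]
emergent = ([random.choice("AB") for _ in range(1000)]       # only A and B at first...
            + [random.choice("ABCD") for _ in range(1000)])  # ...then C and D appear

for name, seq in [("stationary", stationary), ("emergent", emergent)]:
    early, late, whole = entropy(seq[:500]), entropy(seq[-500:]), entropy(seq)
    print(name, round(early, 2), round(late, 2), round(whole, 2))
# For the stationary source the three numbers roughly agree; for the emergent
# source the early window badly underestimates what the system later does.
```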
Another problem with information theory is the way that complexity itself is understood in the first place. For Ashby, complex systems are complex because of the number of states they can exist in. Ashby's variety was a countable thing. But how many countable states can a human being exist in? Where do we draw the boundary around the things that we are counting and the things that we ignore? And then the word "complex" is applied to things which don't appear to have obvious states at all - take, for example, the "complex" music of J.S. Bach. How many states does that have?
I think one of the real breakthroughs in the last 10 years or so has been the recognition that it is not information which is important, but "not-information" - "constraint", "absence" or "redundancy". Terry Deacon, Loet Leydesdorff, Robert Ulanowicz and (earlier) Gregory Bateson and Heinz von Foerster can take the credit for this. In the hands of Leydesdorff, however, this recognition of constraint and absence became measurable, using Shannon information theory and Daniel Dubois's theory of anticipatory systems.
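One crude way of seeing how constraint becomes measurable (my own gloss; Leydesdorff's actual work uses mutual information and redundancy across several dimensions of discourse) is to treat redundancy as the gap between the maximum entropy a system could exhibit and the entropy it actually does.

```python
# Redundancy as "what could be said but isn't": H_max minus observed entropy.
import math
from collections import Counter

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def redundancy(seq, alphabet_size):
    """Constraint, in bits: the maximum possible entropy minus the observed entropy."""
    return math.log2(alphabet_size) - entropy(seq)

# A hypothetical conversation that could range over 8 topics but keeps circling
# around 2 of them is highly constrained - and the constraint is what we measure.
utterances = list("ABABABABABCA")
print(round(redundancy(utterances, alphabet_size=8), 2))   # roughly 1.7 bits
```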
This is where it gets interesting. An anticipatory system contains a model of itself. It is the epitome of Conant and Ashby's statement that "every good regulator of a system must be a model of that system" (see https://en.wikipedia.org/wiki/Good_regulator). Beer integrated this idea into his Viable System Model in what he called "System 4". Dubois, meanwhile, expresses an anticipatory system as a fractal, and this potentially means that Shannon information can be used to generate this fractal and provide a kind of "image" of a healthy system. Which takes us to a definition:
A well-run university contains a good model of itself. How many universities do you know like that?
Here however, we need to look in more detail at Dubois's fractal. The point of a fractal is that it is self-similar at different orders of scale. That means that what happens at one level has happened before at another. So theoretically, a good fractal can anticipate what will happen because it knows the pattern of what has happened.
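Here is a toy illustration of why self-similarity permits anticipation (mine, not Dubois's own formalism): if coarse-graining a binary pattern reproduces the pattern itself, then the coarse scale we have already seen tells us what the finer scale will do.

```python
# Measure how far a binary pattern, viewed at half the resolution, reproduces itself.
def coarse_grain(bits, factor=2):
    """Keep every `factor`-th symbol - a crude change of scale."""
    return bits[::factor]

def self_similarity(bits, factor=2):
    """Fraction of positions where the coarse-grained pattern matches the original prefix."""
    coarse = coarse_grain(bits, factor)
    return sum(a == b for a, b in zip(bits, coarse)) / len(coarse)

# The Thue-Morse sequence is a classic self-similar pattern: taking every other
# symbol gives back the sequence itself, so the coarse past "knows" the fine future.
seq = "0"
for _ in range(6):
    seq = "".join("01" if c == "0" else "10" for c in seq)
print(self_similarity(seq))                   # 1.0 - perfectly self-similar
print(self_similarity("1010101010101010"))    # 0.5 - merely periodic, not fractal
```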
I've recently done some work analysing student comments from a comparative judgement exercise on a variety of documents drawn from science, technology, creativity and communication. (I did this for the Global Scientific Dialogue course in Vladivostok last year - https://dailyimprovisation.blogspot.com/2018/03/education-as-music-some-thoughts-on.html). The point of the comparative judgement was to stimulate critical thought and disrupt expectations. In other words, it was to re-jig any anticipatory system that might have been in place, and encourage the development of a fresh one.
I've just written a paper about it, but the pictures are intriguing enough on their own. Basically, they were generated by taking a number of Shannon entropy measurements of different variables and examining the relative entropy between these elements. This produces a graph, and the movements of the entropy line in the graph can be coded as 1s and 0s to produce a kind of fractal. (I used the same technique for studying music here - https://dailyimprovisation.blogspot.com/2019/05/bach-as-anticipatory-fractal-and.html)
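For the curious, here is a hedged sketch of the coding step (my reconstruction, not the code behind the paper; the actual analysis compares relative entropies between several variables, and the window sizes and sample text here are invented): compute the Shannon entropy of a token stream over a sliding window, then code each movement of the entropy line as 1 (rise) or 0 (fall or flat), giving a binary string whose self-similarity can then be examined.

```python
# Turn the ups and downs of a rolling entropy measurement into a binary pattern.
import math
from collections import Counter

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_series(tokens, window=20, step=5):
    """Shannon entropy of the token stream over a sliding window."""
    return [entropy(tokens[i:i + window])
            for i in range(0, len(tokens) - window + 1, step)]

def code_movements(series):
    """1 where the entropy line rises, 0 where it falls or stays flat."""
    return "".join("1" if b > a else "0" for a, b in zip(series, series[1:]))

# Hypothetical input: one student's comments reduced to a stream of word tokens.
phrase = ("the robot seems creative but maybe it only copies patterns "
          "music and science both rely on pattern and surprise ")
comments = (phrase * 5).split()
print(code_movements(entropy_series(comments)))   # a binary string, one digit per step
```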
So here are my pictures below. Now I suppose there is a simple question - can you tell who are the good students and who are the bad ones?
But what about the well-run institution? I think it too must have an analysable anticipatory fractal. There will be patterns of activity at all levels of management - from learner utterances (like these graphs) through to teacher behaviour, management actions, policies, technologies and relations with the environment. Yet I suspect that if we tried to do this today, we would find little coherence in the ways in which modern universities coordinate their activities with the world.