Wednesday, 4 December 2019

Institutions, Art and Meaning

Much of what I am reading at the moment - Simondon, Erich Hörl, Stiegler and Luhmann - is leading me to rethink what an institution is in relation to an "individual". It's like doing a "reverse Thatcher" (which is a good slogan): there is no such thing as an "individual". There is a continual process of distinction making (and remaking) and transduction by which "institutions" - biological organisms like you and me, or families, friendship groups, universities, or societies - preserve meaning. This is a Copernican shift in perspective, and it is something that I think Luhmann and Simondon saw most clearly, although there are aspects of their accounts which miss important things.

This is a helpful definition, because it seems we live in a time when our institutions don't work very well. Life in them becomes meaningless and alienating. So what's going on?

I think the answer has something to do with technology. Technology did something to institutions, and a hint at an answer is contained in Ross Ashby's aphorism that "any system which categorises throws away information". Those words echo like thunder for me as I'm in the middle of trying to upgrade the learning technology in a massive institution.

So institutions do something with information which preserves meaning. Institutions which lose information risk losing meaning. Thanks to so-called IT systems, most of our institutions from schools to government are losing information.

I've been thinking (again alongside Luhmann) about art and music. A string quartet or an orchestra is an institution, and through their operations, there is little doubt that meaning is preserved. But what is interesting me is that this preservation process is not simply in the current operations of the group - the practice schedule, or performance for example. It is also something to do with history.

Playing Beethoven is to preserve the meaning in Beethoven. And we have a good idea that Beethoven meant his meaning to be preserved: "alle menschen werden brüder" and all that. What is the mechanism for preserving this meaning? A big part of it is notation: a codification of bodily skilled performances to reproduce a historical consciousness.

The art system preserves meaning over a large-scale diachronic period. It seems commonsense to suppose that if the skills to perform were lost, then the process of preserving the meaning would be damaged. Would we lose this stuff? But is this right? What if the skills to perform are lost, but recordings survive? Some information is lost - but is it the technology of recording which loses the information about performance skill, or does the loss of performance skill necessitate recording as a replacement?

In an age of rich media, "performance" takes new forms. There is performance in front of the camera which might end up on social media. There is a kind of performance in the reactions of the audience on Twitter. But is the nuance of "playing Beethoven" (or anything else) lost?

We need a way of accounting for why this "loss" (if it is a loss) is significant for an inability to preserve meaning. Of course, we also need a way of accounting for meaning itself.

So let me make an attempt: meaning is coherence. It is the form something takes which articulates its wholeness. More abstractly, I suspect coherence is an anticipatory system (borrowing this from the biological mathematics of Robert Rosen and Daniel Dubois). It is a kind of hologram which expresses the totality of the form from its beginning to its end in terms of self-similar (fractal) structures.

The act of performing is a process of contributing to the articulation of an anticipatory system. If information is lost in an institution, or an art system, then the articulation of coherence becomes more difficult. This may be partly because what is lost in not-performing is not information, but redundancy and pattern. Coherence is borne through redundancy and pattern. How much redundancy has been lost in the rituals of convivial meetings within our institutions, where now email or "Teams" takes over?

If our lives and our institutions have become less coherent, it is because technology has turned everything into information in which contingency and ambiguity are lost. As Simon Critchley argued in his recent "Tragedy, the Greeks and Us", this loss of ambiguity is a serious problem in the modern world, and it can only be resolved, in his view, through the diachronic structures of ritual and drama. We have to re-enchant our institutions.

I think he's right, but I think we can move towards a richer description of this process. Technology is amazing. It is not technology per se which has done this. It is the way we think.

Saturday, 16 November 2019

Maximum Entropy

On discussing Rossini's music with Loet Leydesdorff a couple of weeks ago (after we had been to a great performance of the Barber of Seville), I mentioned the amount of redundancy in the music - the amount of repetition. "That increases the maximum entropy," he said. This has set me thinking, because there is a lot of confusion about entropy, variety, uncertainty and maximum entropy.

First of all, the relationship between redundancy and entropy is one of figure and ground. Entropy, in Shannon's sense, is a measure of the average surprisingness in a message. That surprisingness is partly produced because all messages are created within constraints - whether it is the constraints of grammar on words in a sentence, or the constraints of syntax and spelling in the words themselves. And there are multiple constraints - letters, words, grammar, structure, meaning, etc.

Entropy is easy to calculate. There is a famous formula without which much on the internet wouldn't work.
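The formula is Shannon's H = -Σ p_i * log2(p_i): the average surprisal, in bits, of the symbols in a message. A minimal sketch in Python (the function name is mine):

```python
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Average surprisal, in bits per symbol, of a message."""
    counts = Counter(message)
    total = len(message)
    # each symbol contributes p * -log2(p), its probability times its surprisal
    return sum((n / total) * -log2(n / total) for n in counts.values())

print(shannon_entropy("aaaa"))   # 0.0 -- total repetition, no surprise
print(shannon_entropy("abab"))   # 1.0 -- one bit per symbol
print(shannon_entropy("abcd"))   # 2.0 -- four equiprobable symbols
```

The constrained, repetitive message scores low; the maximally varied one scores high, which is the figure/ground relation with redundancy described above.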

Of course, there are lots of questions to ask about this formula. Why is the log there, for example? Just to make the numbers smaller? Or to give weight to something? (Robert Ulanowicz takes this route when arguing that the log was there in Boltzmann in order to weight the stuff that wasn't there.)

Redundancy can be calculated from entropy... at least theoretically.

Shannon's formula implies that for any "alphabet" there is a maximum value of entropy, called the maximum entropy. If the measured entropy is seen as a number between 0 and this maximum, then to calculate the "ground", or the redundancy, we simply take the ratio of the measured entropy to the maximum entropy and subtract it from 1.

Now mathematically, if the redundancy increases, then either the amount of information (H) decreases or the maximum entropy (Hmax) increases. If we simply repeat things, then you could argue that the entropy (H) goes down because the message becomes less surprising, and therefore R goes up. If by repeating things we generate new possibilities (which is also true in music), then we could say that Hmax goes up.
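The two routes can be shown numerically. This is my own toy illustration, taking Hmax = log2(alphabet size): repetition lowers the measured H, while counting the same message against a larger alphabet of possibilities raises Hmax, and either way R = 1 - H/Hmax grows.

```python
from collections import Counter
from math import log2

def entropy(msg):
    n = len(msg)
    return sum((c / n) * -log2(c / n) for c in Counter(msg).values())

def redundancy(msg, alphabet_size):
    """R = 1 - H / Hmax, where Hmax = log2(alphabet size)."""
    return 1 - entropy(msg) / log2(alphabet_size)

# Route 1: repetition lowers H, so R rises (alphabet fixed at 4 symbols)
print(redundancy("abcd", 4))       # 0.0  -- maximally surprising
print(redundancy("aabcd" * 4, 4))  # ~0.04 -- 'a' repeated, H falls

# Route 2: the same message against a larger alphabet raises Hmax, so R rises
print(redundancy("abcd", 8))       # ~0.33 -- new possibilities, unexercised
```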

No composer, and no artist, ever literally repeats something. Everything is varied - the variation form in music being the classic example. Each new variation is an alternative description, and each introduces new possibilities. So I think it is legitimate to say that the maximum entropy increases.

Now, away from music, what do new technologies do? Each of them introduces a new way of doing something. That too must be an increase in the maximum entropy. It's not an increase in entropy itself. So new technologies introduce redundant options which increase maximum entropy.

If maximum entropy is increased, then the complexity of messages also increases - or rather the potential for disorder and surprise. The important point is that in communicating and organising, one has to make a selection. Selection, in this sense, means to reduce the amount of entropy so that against however many options we have, we insist on saying "it's option x". Against the background of increasing maximum entropy, this selection gets harder. This is where "uncertainty" lies: it is the index of the selection problem within an environment of increasing maximum entropy.

However, there is another problem which is more difficult. Shannon's formula for entropy counts an "alphabet" of signals or events like a, b, c, etc. Each has a probability and each is added to the eventual number. Is an increase in the maximum entropy an increase in the alphabet of countable events? Intuitively it feels like it must be. But at what point can a calculation be made when at any point the full alphabet is incomplete?
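The difficulty can be made concrete with a toy of my own: if Hmax is computed from the alphabet observed so far, it keeps moving as the stream reveals new symbols, so any entropy or redundancy figure calculated mid-stream is provisional.

```python
from math import log2

def running_hmax(stream):
    """Hmax = log2(alphabet size), recomputed as new symbols appear."""
    seen = set()
    history = []
    for symbol in stream:
        seen.add(symbol)
        history.append(log2(len(seen)))
    return history

# Hmax ratchets upwards each time an unseen symbol arrives:
# 0.0 while only 'a' is known, 1.0 after 'b', ~1.58 after 'c', 2.0 after 'd'
print(running_hmax("aababcabcd"))
```

The measure never settles because the alphabet never closes; that is the non-ergodic problem in miniature.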

This is the problem of the non-ergodic nature of life processes. I've attempted a solution to this which examines the relative entropies over time, considering new events as unfolding patterns in these relations. It's a bit simplistic, but it's a start. The mechanism that seems to drive coherence is able, through the production of redundancies which increase maximum entropy, to construct over time a pattern which serves to make the selection and reduce the entropy to zero. This is wave-like in nature. So the process of increasing maximum entropy which leads to the selection of entropy to zero is followed by another wave, building on the first, but basically doing the same thing.

In the end, everything is zero.

Sunday, 10 November 2019

Design for an Institution: the role of Machine Learning #TheoryEDTechChat

There's an interesting reading group in Cambridge on the theory of educational technology at the moment. Naturally enough, the discussion focuses on the technology, and then it focuses on the agency of those operating the technology. Since the ontology of technology and the ontology of agency are mired in metaphysics, I'm not confident that the effort is going to go anywhere practical - although it is good to see focus on Simondon, and the particularly brilliant Yuk Hui.

But that raises the question: What is the thing to focus on if we want to get practical (i.e. make education better!)? I don't think it's technology or agency. I think it's institutions - we never really talk about institutions! And yet all our talk is framed by institutions, institutions pay us (most of us), and institutions determine that it is (notionally) part of our job to think about a theory of educational technology. But what's an institution? And what has technology done to them?

It is at this point that my theoretical focus shifts from the likes of Simondon, Heidegger, and co (great though I think this work is), to Luhmann, Stafford Beer, Leydesdorff, von Foerster, Ashby and Pask.

Luhmann is a good place to start. What's an institution? It is an autopoietic system which maintains codes of communication. "Autopoietic" in this sense means that codes of communication are reproduced by people ("psychic systems"), but that the "agency" of people in communicating is driven by the autopoietic mechanism (in Luhmann's jargon, it is "structurally coupled"). "Agency" is the story we tell ourselves about this, but it is really an illusion (as Erich Hörl has powerfully discussed in his recent "The archaic illusion of communication").

By this mechanism, institutions conserve meaning. I wonder if they also conserve information, and Leydesdorff has done some very important work in applying Shannon's information theory to the academic discourse.

Ashby's insight into information systems becomes important: "Any system that categorises effectively throws away information" he wrote in his diary. That seems perverse, because it means that our so-called information systems actually discard information! But they do.

For Luhmann, discarding information means that the probability that communications will be successful (i.e. serve the mechanism of autopoiesis in the institution) will be reduced. As he pithily put it in his (best) book "Love as Passion": "All marriages are made in heaven, and fall apart in the motorcar". What he means is that when one person in a couple is driving, their lifeworld is completely different to their partner's. The context for meaningful communication is impaired by the mismatch in communicative activity which each is engaged in.

In our social media dominated world, where alternative lifeworlds metastasise at an alarming rate, the effect of technology in damaging the context for the effective conservation of meaning is quite obvious.

In the technocratic world of the modern university, where computer systems categorise students with so-called learning analytics, it is important to remember Ashby: with each categorisation, information is thrown away. With each categorisation, the probability that communications will be successful is diminished as the sphere of permissible speech acts becomes narrower. Instead of talking about the important things that matter most deeply, conversations become strategic, seeking to push the right buttons which are reinforced by the institutional systems: not only the bureaucratic systems of the university, but the discourse system of the publishers, and the self-promotion system of social media. This is the real problem with data.

The problem seems quite clear: Our institutions are haemorrhaging information. It is as if the introduction of information systems was like putting a hole in the hull of the institutional "ship".

Stafford Beer knew this problem. It is basically what happens when the coordination and control part of his "viable system model" (what he called "System 3") takes over, at the expense of the more reflective, curious and exploratory function that probes the environment and examines potential threats and opportunities (what he called "System 4"). In companies, this is the R&D department. It is notable that universities don't have R&D departments! Increasingly, R&D is replaced by "analytics" - the system 4 function is absorbed into system 3 - where it doesn't belong.

But let's think more about the technology. System 3 tools categorise stuff - they have to - it's part of what system 3 has to do. This involves selecting the "right" information and discarding the rest. It is an information-oriented activity. However, the opposite of information is "redundancy" - pattern, repetition, saying the same thing in many ways... in education, this is teaching!

Machine learning is also predominantly a redundancy-based operation. Machine learning depends on multiple descriptions of the same thing from which it learns to predict data that it hasn't seen before. I'm asking myself whether this redundancy-oriented operation is actually a technological corrective. After all, one of the things that the curious and exploratory function of system 4 has to do is to explore patterns in the environment, and invent new interventions based on what it "knows". Machine learning can help with this, I think.
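The point about learning from multiple descriptions can be sketched very simply. This is my own toy, not a real machine learning system: a nearest-centroid classifier built from many redundant, noisy descriptions of two prototypes, which can then label a description it has never seen.

```python
import random

random.seed(0)  # reproducible noise

def noisy_copies(prototype, n, jitter=0.3):
    """Many redundant, varied descriptions of the same underlying thing."""
    return [[x + random.uniform(-jitter, jitter) for x in prototype]
            for _ in range(n)]

def centroid(points):
    return [sum(xs) / len(points) for xs in zip(*points)]

# Two "things", each known only through repeated, varied description
examples = {"circle": noisy_copies([1.0, 1.0], 50),
            "square": noisy_copies([4.0, 4.0], 50)}
centroids = {label: centroid(pts) for label, pts in examples.items()}

def classify(sample):
    """Label an unseen description by its nearest learned centroid."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

print(classify([3.8, 4.2]))  # "square" -- a description it has never seen
```

The redundancy in the training examples is exactly what makes the prediction of the unseen case possible.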

But only "help". Higher level coordination functions such as system 4 require human intelligence. But human intelligence needs support in being stimulated to have new kinds of conversations within increasingly complex environments. Machine learning can be incredibly and surprisingly creative and stimulating. It can create new contexts for conversations between human beings, and find new ways of coordinating activities which our bureaucratic systems cannot.

My hunch is that the artists need to get on to this. The new institutional system 4, enhanced by machine learning, is the artist's workshop, engaging managers and workers of an organisation into ongoing creative conversation about what matters. When I think about this more deeply, I find that the future is not at all as bleak as some make out.

Tuesday, 5 November 2019

Non-Linear Dynamics, Machine Learning and Physics meets education

In my recent talk about machine learning (in which I've been particularly focussing on convolutional neural networks, because they present such a compelling case for how the technology has improved), I explored recursive functions, such as k-means, which can be used to classify data. The similarity between the non-linear dynamics of agent-based modelling and the recursive loss functions of convolutional neural network training is striking. It is hard for people new to machine learning to understand that we know very little of what is going on inside. The best demonstration of why we know so little comes from demonstrating the non-linear dynamic emergent behaviour in an agent-based model. Are they actually the same thing in different guises? If so, then we have a way of thinking about their differences.

The obvious difference is time. A non-linear agent-based model's behaviour emerges over time. Some algorithms will settle on fixed points (if k-means didn't do this it would be useless), while other models will continue to feed their outputs into their inputs endlessly producing streams of emergent behaviour. The convolutional process appears to settle on fixed points, but in fact it rarely fully "settles" - one can run the python "" function for ever, and no completely stable version emerges, although stability is established within a small fluctuating range.
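The fixed-point behaviour is easy to see in k-means itself. A minimal one-dimensional sketch of Lloyd's algorithm (my own, no libraries): outputs are fed back into inputs until the centroids stop moving.

```python
def kmeans_1d(points, centroids, steps=20):
    """Lloyd's algorithm in one dimension: iterate to a fixed point."""
    for _ in range(steps):
        # assign each point to its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # recompute centroids; an unchanged result is a fixed point
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids

# settles close to [1.0, 9.0], the means of the two clumps
print(kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 5.0]))
```

Here the recursion reaches an exact fixed point in a couple of iterations; gradient-based training of a deep network performs an analogous feedback loop, but typically only wanders within a small fluctuating range rather than halting.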

I discussed this fluctuation with Belgian mathematician Daniel Dubois yesterday. Daniel's work is on anticipatory systems, and he built a mathematical representation of the dynamics that were originally introduced by biologist Robert Rosen. Anticipation, in the work of Dubois, results from fractal structures. In a sense, this is obvious: to see the future, the world needs to be structured in a way in which patterns established in the past can be seen to relate to the future. If machine learning systems are anticipatory (and they appear to be able to predict categories of data they haven't seen before), then they too will contain a fractal structure.
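One concrete example from Dubois's work, as I understand it, is the incursive logistic map, in which the next state appears on both sides of the equation: x(t+1) = a·x(t)·(1 - x(t+1)). Solving algebraically for x(t+1) gives a minimal sketch (the code is mine) of how building the future state into the present computation stabilises an otherwise chaotic system:

```python
def incursive_logistic(x, a=4.0, steps=20):
    """Dubois-style incursive logistic map: x(t+1) = a*x(t)*(1 - x(t+1)),
    solved algebraically as x(t+1) = a*x(t) / (1 + a*x(t))."""
    trajectory = [x]
    for _ in range(steps):
        x = a * x / (1 + a * x)
        trajectory.append(x)
    return trajectory

# Where the classical logistic map at a=4.0 is chaotic, the incursive
# version converges to the fixed point x* = (a - 1) / a = 0.75
print(incursive_logistic(0.2)[-1])
```

The anticipatory (incursive) formulation turns a chaotic recursion into one with a stable fixed point, which is at least suggestive of why anticipation and fixed-point structure keep appearing together.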

Now a fractal is produced through a recursive non-linear process which results in fixed points. This all seems to be about the same thing. So the next question (one which I was asking both Daniel Dubois, and Loet Leydesdorff who I saw at the weekend) is how deep does this go? For Loet, the fractal structures are in communication systems (Luhmann's social systems), and (importantly) they can be analysed using Shannon's information theory. Daniel (on whose work Loet has constructed his system) agrees. But when we met, he was more interested to talk about his work in physics on the Dirac equation and what he believes to be a deeper significance of Shannon. I don't fully understand this yet, but we both agreed that if there is a deeper significance to Shannon, then it was a complete accident, because Shannon only half-understood what he was doing... Half-understanding things can be a way forwards!

Daniel's work on Dirac mirrors that of both Peter Rowlands in Liverpool and Lou Kauffman in Chicago (and now Novosibirsk). They all know each other very well. They all think that the physical world is basically "nothing". They all agree on the language of "nilpotents" (things multiplying to zero) and quaternions (complex numbers which produce a rotational geometry) as the fundamental building blocks of nature. There is an extraordinary intellectual confluence emerging here which unites fundamental physics with technology and consciousness. Who could not find that exciting?? It must have significance for education!

What's it all about? The clue is probably in Shannon: information. And I think it is not so much the information that is involved in learning processes (which has always been the focus of cognitivism). It is the way information is preserved in institutions - from the very small institutions of friendship and family, to larger ones like universities and countries.

Our technologies are technologies of categorisation and they throw away information. Since the computer revolution, holes have appeared in our social institutions which have destabilised them. The anticipatory function, which is essential to all living things, was replaced with a categorising function. The way we use machine learning also tends to categorise: this would make things worse. But if it is an anticipatory system, it can do other things - it can provide a stimulus for thought and conversation, and in the process put information back into the system.

That is the hope. That is why we need to understand what this stuff does. And that is why, through understanding what our technology does, we might understand not only what we do, but what our institutions need to do to maintain their viability.

Education is not really about schools and universities. Those are examples of institutions which are now becoming unviable. Neither, I think, is it really about "learning" as such (as a psychological process - which ultimately is uninspectable). Education is about "institutions" in the broadest sense: families, friendships, coffee bars, businesses, hospitals... in fact anywhere which maintains information. To understand education is to understand how the processes which maintain information really work, how they can be broken with technologies, and how they can be improved with a different approach to technology. 

Tuesday, 29 October 2019

About Aboutness and Relations: Thoughts on #TheDigitalCondition

As part of the Cambridge Culture, Politics and Global Justice group on the Digital Condition, I made a video response which sought to bring a cybernetic perspective to Margaret Archer's views on the "Practical domain" as pivotal in the relations between nature and the social. I remember challenging Archer on this many years ago when she gave a talk in London about her work on reflexivity and I suggested that Maturana and Varela's concept of "structural coupling" provided a clearer explanation of what she was trying to articulate in terms of the relations between people, practices and things. She brushed the point aside at the time, although more recently I heard her talk more approvingly of autopoietic theory, so I'd be interested to know what she thinks now. This is my video:

One of the things about making a video like this is that it is a very different kind of thing from Archer's paper that we were all reading. Because it is more conversational, it expresses a certain degree of uncertainty about what it attempts to say - not just in the messy diagrams, but in the pauses as I try to find the words for what I want to say. It is also worth mentioning that, having drawn the diagram, making the video was very quick. Why don't we do this more often? My suspicion is that as academics we are rather reluctant to reveal our uncertainty about things. Academic papers full of sophisticated verbiage are safer spaces to hide uncertainty. Personally, I think we should be doing the opposite of hiding uncertainty - and we have the technology to do it.

Anyway, this has elicited some defence of Archer - particularly in arguing that my critique is a misrepresentation of her argument. Well... I'm not sure.

In her paper she begins by focusing on "aboutness" and the relationship between consciousness and reality:
"Deprived of this reference to, or regulation by, reality, then self-referentiality immediately sets in – consciousness becomes to be conscious of our own ideas (generic idealism), experience equates knowledge with the experienced (pragmatism and empiricism), and language becomes the internal relationship between linguistic signs (textualism). Instead, consciousness is always to be conscious of something"
So "consciousness" is a thing which refers to another thing, "reality". So here are two distinctions. They are, of course, unstable and uncertain. What is reality? Well, what isn't reality?? What is consciousness? Well, what isn't consciousness?? (are rocks conscious, for example?) And if whatever is consciousness must refer to whatever is reality in order to be conscious, then what not-consciousness? Does that refer to anything? Lying behind all this is an implicit "facticity" behind the concepts of "consciousness", "reality" and "reference". Imposing the facticity effectively removes the uncertainty.

Archer says "consciousness has to be conscious of something", retreating from self-referentiality. But what if consciousness is self-referential? What does that do? It does two things:

  1. it creates a boundary, since self-reference is a circle.
  2. it creates uncertainty, since whatever is contained in the boundary lies in distinction to what is outside it, where the nature of that distinction is unclear. Additionally, the totality of what is contained within the boundary cannot be accounted for within the logic of the boundary (Gödel).

As I explain in my video, this then unfolds a topology.

So then what is reference? It must be about the way in which distinctions maintain themselves within the self-referential processes of consciousness. As I explain, this process entails transduction processes which operate both vertically (within the distinction) and horizontally (between a distinction and its environment).

There's a very practical example of this from biology. One of the central questions about DNA is "How does a molecule come to be about another molecule?" (thanks to Terry Deacon for that!). This is a profound question which throws into doubt what is known as the "central dogma" of biology, which places DNA at the centre of life. It really can't be right.

What is more likely is that there are processes of bio-chemical self-reference initially involving lipid membranes maintaining their internal organisation and boundaries in the context of an ambiguous environment. DNA can then be seen as an epiphenomenon of the evolution of this communicative process. In other words, the aboutness of DNA is our distinction concerning the emergent epiphenomena of self-reference.

It's the same with reference more generally. Once we see consciousness as self-reference, then the categories that we invent about "nature" or "the social" can be seen as epiphenomena of the self-referential process. This makes explaining this stuff a lot simpler, in my view. (And of course, explanation is another epiphenomenon of self-reference). It also helps to explain the ways in which we bring technology to bear on processes which help us to maintain our distinctions.

Wednesday, 23 October 2019

Ancestrality and Consciousness

Over the past year, I've been increasingly convinced of the correctness of the evolutionary biological theory of John Torday concerning the connection between consciousness, cellular evolution and "big" evolutionary history (from the deep origins of space and time). Of course, it's hugely ambitious - but we should be hugely ambitious, shouldn't we?

John's work in physiology and medicine (primarily focused on lung physiology and asthma) has presented a number of empirical phenomena that point towards a biological theory that includes evolutionary history, where consciousness is part of a process that explains how cells evolve from lipid bi-layers to sophisticated inter-cellular communication systems. It also addresses what he, and many other biologists, see as a scientific problem within their discipline - that it is not explanatory in the way that physics or chemistry describe causal mechanisms, but descriptive. In our extensive conversations, I have noted that education suffers the same problem: there is no mechanistic explanation for educational phenomena, only description. Since education is also a manifestation of consciousness, I am concerned to make the connection between these different lines of inquiry (biology, physics and evolution).

The central question in evolution is the relationship between diachronic (time-based) process and synchronic structures. What time-based process makes cells absorb parts of their environment (bacteria for example) as mitochondria? What time-based process introduces cholesterol as the critical ingredient to animate life? What time-based process governs the expression of proteins and their transformations in the cell signal-transduction pathways, which despite their complexity, maintain the coherence of the organism?

Time itself introduces further questions. When we look at evolutionary history - maybe at the red-shift of the expanding universe - what are we looking at exactly? "Once upon a time, there was absolutely nothing, and then there was this enormous explosion..." really?

I've been re-reading Quentin Meillassoux's "After Finitude". I have some misgivings about it, but this must surely be one of the greatest philosophical works of the last 20 years. The question of time is one of his central questions, and he calls it "ancestrality". The question is about the nature of reality, and particularly the reality of things like fossils, or electromagnetic radiation from outer-space. We either assume that the world is made by our consciousness, or we assume that the world exists in a pre-existing domain that exists independently of human consciousness and agency (what Bhaskar calls the "intransitive domain"). Meillassoux pursues the Platonist position which denies both (in line with Alain Badiou) arguing that objects are mathematically real to us - logic, in other words, is prior to materiality. At the heart of Meillassoux's (and Badiou's) argument is the contingency of nature. He asks, "Given this contingency, how do things appear stable?"

Pursuing this, the ancestrality of the universe - the big bang, evolutionary history - is (Meillassoux would claim) "logically" real. But this puts the emphasis on the synchronic reality of things - their logical structure out of time - and it assumes that the consciousness that conceives of this logic is similarly structured. Indeed, I'm not convinced that Meillassoux's own position escapes one of his central targets: the Western philosophical habit of "correlation" between ideas and reality. But the care with which he lays out his arguments is nevertheless highly valuable, and his emphasis on contingency seems right to me (but maybe not! - how can anyone say an ontology of contingency is "right"?).

Torday's situating of time in the material and biological evolution of consciousness means that this "logic" has to become a "topo-logic": space and time - the diachronic dimension - are not separable from the "logic" of synchronic structure. What do we get when we have a "topo-logic"? We get contingency, process, uncertainty and the driving necessity for coherence. In essence, we get life.

Somehow, we have to grapple with topology. For a long time, I struggled with the concept of time within cybernetics. After all, you have to have time to have a mechanism, but where did "time" come from? There must be something prior to "mechanism". It turns out that when we think through one of the other key distinctions of cybernetics - difference - we find the answer. A difference results from a distinction. A distinction is a boundary which marks what is inside from what is outside. But distinctions are essentially unstable: whatever mark is made generates a question. It's the same question that Gödel addressed: the distinction demarcates a "formal system", but there are propositions expressible within the formal system which cannot be proved within that system. Uncertainty is inevitable - how is it managed?

Something must be invented in order to mop-up the uncertainty. Time is a powerful invention. By creating past, present and future, ambiguities can be situated in a way where contradictions can be expressed "now it is x" and "now it is not-x". The implications of this are that the topology becomes richer as the distinctions about time must also be negotiated. Part of the richness of this topology may also be the creation of deep symmetries in time and space, including concepts such as "nothingness" or nilpotency, and "dimensionality" - in essence, as my colleague Peter Rowlands would argue, the foundational principles of the material world. The invention of time entails a double management process, where part of it must coordinate with an environment which is also the cause of uncertainty. There are many distinctions in the universe, each creating time, space and matter, and each constraining other distinctions.

So is the "big bang" story a manifestation of a topology? Does the topology pre-exist the consciousness which conceives of it? (what does "pre-" mean in that sentence?) If this is so, what is "evolution"? I feel myself skirting foundationalism while denying the possibility of any "foundation"... and then seeing that process as a foundation... then denying it... then seeing it as a foundation... and so on.

"In the beginning was the word" says St. John's Gospel. That's a distinction - it unfolds a topology. Theologians like Arthur Peacocke imagined that "logos" might also mean "information". If there is information, then there is topology, and then the "beginning" is "the word" - the distinction. And beginnings are everywhere, not least the beginnings created by the distinctions of consciousness. But consciousness's beginnings have their roots in the beginnings of matter - in the "word".

We're very close to Torday's essential point: cells, from which consciousness emerges, are stardust which trace their evolutionary history to the beginning. In the topology of maintaining distinctions, new distinctions must be made as the ambiguity of the environment is dealt with. Indeed, the difference between atoms and organisms may be that atoms, in maintaining their distinction, must find a way of organising themselves such that new distinctions may be made. The environment within which atoms organise is the essential driver for new forms of organisation. That way of organising is what life is: a search for new ways of making distinctions which manage the uncertainty generated by those distinctions. It is this, I suspect, which is the mechanism of evolution. It's only about history in the sense that our unstable distinctions require us to invent history to maintain our distinctions about ourselves and our environment.

What we call "homeostasis" is the cell's drive for coherence in its distinction-making. What we call "information" or "negentropy" is the cell's interface with its environment. What we call "chemiosmosis" is the disturbance to the cell's equilibrium by forces in its environment and its gaining of energy.

Thought itself is the universe's way of making new distinctions. Since the universe is imprinted in the biology of consciousness, the symmetries of physics, biology and consciousness will contrive to form coherences which enfold both synchronic and diachronic dimensions. This may be why the crazily complex protein dance hangs together - because of the deep coherence between diachronic and synchronic dimensions.

What then of science itself? Of empiricism? In a world produced by thought, something happens within the time we invent to establish coherence between thought and the world. Thought looks closely at what it has made, it discards certain aspects of what it sees, it executes control on what is left, it observes what happens - not just one mind, but many - and then collectively it thinks more deeply. A new level of coherence is arrived at and the topology unfolds once more. 

Sunday, 6 October 2019

Creatively defacing my copy of Simon Critchley's "Tragedy, the Greeks and Us"

I've been defacing my copy of Simon Critchley's "Tragedy, the Greeks and Us". For me, this vandalism is a sign that something has got me thinking. It's not just Critchley. I went back to Jane Harrison's "Ancient Art and Ritual" the other day, partly in response to my recent experiences in Vladivostok and a central question concerning the structure of drama and the structure of education. Basically: is education drama? Should it be? And is our experience online drama? Critchley's not dismissive of Harrison and the Cambridge ritualists - which I find encouraging - and I like his suggestion that art may not be so much "ritual" as "meta-ritual". 

It's funny how things revolve. I was introduced to Harrison as a student by Ian Kemp at Manchester University, who was also a passionate expert on Berlioz. Yesterday evening I took my daughter to hear a performance of Berlioz's Roméo et Juliette, which is Berlioz's brilliant and beautiful refashioning of Shakespeare into the form of a symphony via Greek drama: it has explicit sections of chorus, prologue, sacrifice, feast, etc. Beethoven meets the Greeks!

I'm very impressed with Critchley - and I very much get his vibe at the moment - that tragedy and the ambiguity of dramatic structure were overlooked in favour of philosophy (Plato particularly), and that we are now in a mess because of it. I agree. If we replace "tragedy" with "the drama of learning" or "the dialectic of self-discovery" then I think there are some important lessons for education. Critchley makes the point that our modern lives are determined by endless categorisation, and the resulting incoherence of this drives us back to Facebook and social media:
"We look, but we see nothing. Someone speaks to us, but we hear nothing. And we carry on in our endlessly narcissistic self-justification, adding Facebook updates and posting on Instagram. Tragedy is about many things, but it is centrally concerned with the conditions for actually seeing and actually hearing"
That's what I was missing in "The Twittering Machine". 

But he has an axe to grind about philosophy and Plato - and particularly about his contemporary philosophers, most notably Alain Badiou. Since Badiou also has a deep interest in the arts (and opera particularly) this is interesting, and I think Critchley is seeing a dichotomy where there isn't one. And that is where my doodling starts...

The essence of this goes back to the relationship between the synchronic, categorical frame of rationality and experience which demarcates times, and the diachronic, ambiguous frame which sees time as a continuous process. Critchley doesn't seem to see that the two are compatible. But I think they are in a fundamental way. 

The issue concerns what a distinction is, and the relationship of a distinction to time. We imagine that distinctions are made in time, and that time pre-exists any distinction. But it is possible that a distinction - the drawing of a boundary - entails the creation of time. So this was my first doodle:

The distinction on the right is simply a self-referential process. It embraces something within it, but it occurs within a context which cannot be known (in this case, a "universe" and an "earth" and an "asteroid"...) All of those things are distinctions too, and they are all subject to the same process as I will describe now. The essential point is that all distinctions are unstable.

To see how distinctions are unstable, think of a distinction about education. There are in fact no stable distinctions about education. Everything throws us into conversation. In fact the conversation started long before anyone thought of education. Indeed, it may have started with the most basic distinctions of matter.

If I were to draw this instability, it would be a dark shadow emerging within the distinction. These are the unstable forces which will break apart the distinction unless they are absorbed somehow. So we need something to absorb the uncertainty. It cannot be inside the distinction - it must be outside. So we are immediately faced with a duality - two sides of the Möbius strip.

But more than that, absorbing uncertainty (or ambiguity if you want) is a battle on two fronts. The internal uncertainty within the distinction is one thing, but it must be balanced with what might be known about the environment within which the distinction emerges and maintains itself.

Let's call this "uncertainty mop" a "metasystem".  And here it is. Note the shadow in the distinction and the shadow in the environment. Part of me wants to draw this like a Francis Bacon "screaming pope".

A war on two fronts is hard - the metasystem better get its act together! Part of it must deal with the outside and part of it must deal with the inside - and they must talk to each other.

The interesting and critical thing here is that in order to make sense of this balancing act, the metasystem has to create new distinctions: Past, Present and Future. We might call these "imaginary" but they are entirely necessary, because without them, there is no hope of any kind of coherent stability in our distinction. But what have we done? Our distinction has made time!

By inventing time, we have invented the realm of the "diachronic". This is the realm of drama and music. Whatever time is - and how can we know? - it expands our domain of distinction-making, and helps us to see the connection between past, present and future. In the language of "anticipatory systems" of Daniel Dubois and Robert Rosen, this is the difference between recursion (the future modelled on the past), incursion (the future modelled on the future), and hyperincursion (selection among many possible models of the future).
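Dubois's three modes can be sketched computationally. The following is a minimal illustration (the function names, parameter values and the use of the logistic map are my own assumptions, loosely modelled on Dubois's incursive logistic map, not his exact notation): in recursion the future is computed from the past alone; in incursion the future state appears on both sides of its own equation; in hyperincursion the equation admits several possible futures and something must select among them.

```python
import math

def recursive_step(x, a=2.0):
    # Recursion: x(t+1) = a*x(t)*(1 - x(t)).
    # The future is modelled entirely on the past state.
    return a * x * (1 - x)

def incursive_step(x, a=2.0):
    # Incursion: x(t+1) = a*x(t)*(1 - x(t+1)).
    # The future state appears on both sides of the equation;
    # here it happens to be solvable algebraically:
    #   x(t+1) = a*x(t) / (1 + a*x(t)).
    return a * x / (1 + a * x)

def hyperincursive_step(x, choose_upper, a=4.0):
    # Hyperincursion: x(t) = a*x(t+1)*(1 - x(t+1)) has TWO roots
    # for x(t+1), so an extra "decision" input (choose_upper)
    # must select which of the possible futures is realised.
    disc = math.sqrt(max(0.0, 0.25 - x / a))
    return 0.5 + disc if choose_upper else 0.5 - disc
```

The point of the sketch is structural: the recursive and incursive steps each return one future, but the hyperincursive step cannot run without a selection being supplied from outside the equation - which is where anticipation, and arguably choice, enters.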

I ought to say that all of this happens all at once. A distinction is like a bomb going off - or even a "big bang". But these bombs are going off all the time - or rather, they are going off all the time that they themselves make. A distinction entails time, which entails dialectic (and conversation). But it also entails hope for a coherence and stability of a distinction within what it now sees as a changing world.

I think the best picture of coherence is the relationship between a fractal as a kind of map of the environment and the unfolding patterns of action within that environment. It's a fractal because only a fractal can contain the seed of the future based on its past. Life goes on in the effort to find a coherent structure. When it does, we die.
Mostly, the search for coherence leads to new distinctions, and so the process goes on in a circle. This, I think, is what education is.

The structure of tragedy unfolds this circle in front of us for us to see. It is a circle of nature - of logic. It is the logic of every atom, cell, fermion, quark, whatever... in the universe. It is the logical structure of a distinction. 

When Critchley says of tragedy that it is about "actually seeing and actually hearing" he is spot-on. But I think his anti-Platonist stance is a reaction to where we are now. "Actually seeing and actually hearing" has been replaced with the processing of data. The part of the metasystem which does that is the lower-part, mopping up the internal uncertainty, but not really thinking about the environment. The diachronic bit - the time-making bit - has been crowded-out by our computer powered categorisation functions. If we continue like this, hope will be extinguished, because hope also sits in the upper bit.

Biological systems do not suffer this imbalance. We have had it forced on us by our rationality. That is our tragedy. Yet we are not as rational as we think - and our rationality is a biological epiphenomenon. It feels as if our technology is out of balance: a dialectical imbalance that presents us with a challenge to be overcome for the future. But this may be as much a question of how we organise our institutions as of what kind of technology we produce in the future. 

We have the option to organise our education differently, and learn something from the past. We also have the option to use our technologies differently, using them to reinforce the ambiguities of distinctions rather than the information-discarding processes of categorisation.