Wednesday 27 March 2019

@NearFutureTeach Scenarios and Getting many brains to think as one

I went to the final presentation of the Near Future Teaching project at Edinburgh yesterday. I've been interested in what's happening at Edinburgh for a while because it looked to me like a good way of getting teachers and learners to talk together about teaching and to think about the future. As with many things like this, the process of doing this kind of project is all important - sometimes more important than the products.

I'm familiar with a scenario-based methodology because this is what we did on the large-scale iTEC project, which was coordinated by European Schoolnet. Near Future Teaching has followed a similar plan: identification of shared values, co-design of scenarios, and technological prototyping/provoking (using what they neatly called "provo-types"). iTEC took its technological prototypes a bit more seriously, which - on reflection - I think was a mistake (I wrote about that here).

During iTEC I wasn't sure about scenario-building as a methodology. It seemed either too speculative or not speculative enough: the future was imagined through the same lenses we use to see the present. We're always surprised by the future, often because it involves getting a new set of lenses. I was talking to a friend at Manchester University on Monday about how theologians/religious people make the best futurologists: Ivan Illich, Marshall McLuhan, C.S. Lewis (his "Abolition of Man" is an important little book), Jacques Ellul. Maybe it's because the lens that allows you to believe in God is very different from the lens that looks at the world as it is - so these people are good at swapping lenses.

After Near Future Teaching, I'm a bit more enthusiastic about scenarios. I spoke to a primary school teacher who was involved in the project, and we discussed the fact that nobody is certain about the future. Uncertainty is the great leveller: teachers and learners are in the same boat, and this is a stimulus for conversation and creativity. It's not a dissimilar idea to this one.

But then there is something deeper about this kind of process. Uncertainty is a disrupter to conventional ways of looking at the world. Each of us has a set of categories or constructs through which we view the world. Sometimes the barriers to conversation are those categories themselves, and making interventions which loosen the categories is a way of creating new kinds of conversation. Introducing "uncertain topics" does this.

In his work on organisational decision-making, Stafford Beer did a similar thing with his "syntegration" technique. That involved surfacing emerging issues in a group, and then organising conversations which deliberately aimed to destabilise any preconceived ways of looking at the world. Beer aimed to create a "resonance" in the communications within the group as existing categories were surrendered and new ones formed in the context of conversation. The overall aim was to "get many brains to think as one brain". Given the disastrous processes of collective decision-making which we are currently witnessing, we need to get back to this!

Having said this, there's something about the whole process which IS teaching itself. That leads me to think that the Near Future Teaching project is closely aligned to the methods of near future teaching itself. Maybe the scenarios can be dispensed with; almost certainly we have to rethink assessment, the curriculum and the institutional hierarchy; but the root of it all is conversation which disrupts existing ways of thinking and establishes coherence within a group.

If we had this in education, Brexit would just be a cautionary tale.

Sunday 24 March 2019

Human Exceptionalism and Brexit Insanity

Why have we managed to tie ourselves in knots? It's (k)not just over Brexit. It's over everything - austerity, welfare, tax, university funding, climate change, the point of education...

Following on from my last post, a thought has been niggling me: is it because we think human consciousness is exceptional? Is our belief in the exceptionalism of consciousness in the human brain stopping us from seeing ourselves as part of something bigger? The problem is that as soon as we see ourselves as something special - that our consciousness is somehow special - we come to consider one person's consciousness more special than another's. Then we hold on to our individual thoughts or "values" (they're a problem too) and insist that the thoughts and values of one person must hold out against those of another. If consciousness is in fact not exceptional, is that why we end up in such a terrible mess?

If consciousness is not exceptional, what does it do? What is its operating principle?

In my book, Uncertain Education, I argued that "uncertainty" was the most useful category through which to view the education system. I think uncertainty is a good category through which to view an unexceptional consciousness too. Consciousness, I think, is a process which coordinates living things in managing uncertainty. It is a process which maintains coherence in nature.

This process can be seen in all lifeforms from cells to ants to humans. What we call thinking is an aggregate of similar processes among the myriad of cellular and functionally differentiated components from which we are made, and which constitute our environment. The brain is one aggregation of cells which performs this role. It is composed of cells managing their uncertainty, and the aggregate of their operation and co-operation is what we think is thinking. Really, there's a lot of calcium and ATP which is pumped around. That's the work our cells do as they manage their uncertainty.

The same process occurs at different levels. The thing is fractal in much the same way that Stafford Beer described his Viable System Model. But we know a lot more about cells now than Beer did.

But what is the practical utility of a cellular view of consciousness?

Understanding that cells are managing uncertainty is only the beginning. More important is to realise that organisms and their cells have developed ("evolved") by absorbing parts of their environment as they have managed their uncertainty over history. This absorption of the environment helps in the process of managing environmental uncertainty: uncertainty can only be managed if we understand the environment we are in. Importantly, though, each stage of adaptation entails a new level of accommodation with the environment: we move from one stable state to the next "higher" level. You might imagine a "table" of an increasingly sophisticated "alphabet" of cellular characteristics and capacities to survive in increasingly complex environments.

The cellular activity of "thinking", like all processes of adaptation, occurs in response to changes in the environment. It may be that an environment once conducive to higher-level "thought" becomes constrained in a way that forces cells back to a previous, simpler state of organisation in order to remain viable. It's a kind of regression - the kind we see in intelligent people at the moment, paralysed by Brexit. In history, it is the thing that made good people do bad things in evil regimes. We become more primitive. Put a group of adults in a school classroom, and they will start to behave like children....!

Understanding this is important because we need to know how to go the other way - how to produce the conditions for increasing sophistication and richer adaptiveness. That is education's job. It is also the politician's job. But if we have a mistaken idea about consciousness, we are likely to believe that the way to increase adaptiveness is to do things which actually constrain it. This is austerity, and from there we descend back into the swamp.

Saturday 16 March 2019

Depth in Thought: Cosmological perspectives

Jenny Mackness is writing some great blog posts on Iain McGilchrist at the moment. Her post today is on the dynamic relationship between what composer Pauline Oliveros called "attention" and "awareness", and McGilchrist's take on this. As Jenny points out, this is not an idea unique to McGilchrist, and others - particularly Marion Milner, whom she mentions - have had a similar insight. Her previous post was on "depth", and this is what I want to focus on here.

McGilchrist's argument is based on a kind of updated bicamerality - not the rather crude distinctions about the "rational" left and "artistic" right, but a more sophisticated articulation of the way that attention and awareness work together. More importantly, he has pursued the social implications of his theory, suggesting that as a society we have created an environment within which attention is rewarded - particularly in the form of technology - and awareness and contemplation are confined to the shadows. There's a great RSA Animate video of his ideas.

There's much I agree with here. But something unnerves me in a similar way to previous theories of bicamerality like that of Julian Jaynes. Behind them all is the assumption that human consciousness is exceptional.

The problem is that "human exceptionalism" as biologist John Torday calls it, is a pretty devastating thing for the environment of everything - not just us. We think we're so great, so we have the arrogance to believe we know how to "fix" our problems. So we try to fix our problems - to treat our human problems as if they were technical problems (McGilchrist might say, to render the world in terms of the left hemisphere). And it doesn't work. It makes things worse. As an educational technologist, I see this every day. And I think if there is a "turn" in educational technology, it is that we once believed we could fix our problems with technology. Now we see that we've just made everything more complicated.

What if consciousness is not exceptional? We would first have to decide where it came from. Brains? Can we rule out consciousness in bacteria or plants? Eventually, we arrive at the cell. Brains are made from cells. In fact, recent research in unpicking neural communication mechanisms (which I know Antonio Damasio, among others, has been heavily involved in) has discovered that non-synaptic communication exists alongside communication along what we have always imagined to be a dendritic "neural network".

Cells talk to each other throughout nature. The way they talk concerns a process characterised as transduction: the balancing of messages and protein expression by DNA inside the cell with the reception of other proteins on the surface of the cell from its environment. I find this fascinating because these transduction processes look remarkably like the psychodynamics of Freud and Jung. Is there a connection? Does our thinking go to the heart of our cells? (Or the cells of our heart?)

But there's more to this. One of the great mysteries of the cell is how it came to be as it is. Lynn Margulis's endosymbiotic theory suggests that all those mitochondria were once independent elements in the environment. Somehow an earlier version of the cell "decided" that it could organise itself better if it included those mitochondria within its own structure. At an evolutionary level, cooperation took the place of competition. As a basic principle, Torday argues that cells have always organised themselves according to the ambiguity of their environment. Consciousness is an emergent phenomenon arising from this process.

Each evolutionary stage moves from one state of homeostasis with the environment to another. Somehow, evolutionists tell us, we were once fish. Something happened to the swim bladder of the fish that turned it into the breathing organ we have in our chests. There must have been some kind of crisis which stimulated a fundamental change to cellular organisation.... and it stuck. Our conscious cells contain a myriad of vestigial fossils, of which the oldest is probably the cholesterol which allows my fingers to do this typing, and allows all of us to move about. In each of us is not only an operational mechanism which responds to immediate changes in its environment to maintain stability. In each cell there is also a history book, containing in microcosm the millions of stages of endosymbiotic adaptation which took us to this point, and which we see in the physical and geological evidence around us. We really are stardust.

This isn't something that biologists alone are talking about. It coincides with physics. David Bohm talked about the difference between the surface, manifest features of the world - the "explicate order" - and the deep coherent structure of the universe - the "implicate order". This implicate order, Bohm imagined, was a kind of hologram - or rather a "holo-movement" (because it is not fixed) - which acts as the root of everything. As a hologram, it has a fractal structure (holograms are a fractal encoding of the light interference patterns of 3D images). This means that within each cell is a copy of a self-similar pattern of the cosmos, formed through the evolutionary history book that cells contain. Each evolutionary stage of the cell, and each organisational configuration it forms (like the bicameral brain, bodies, fingers), is an expression of what the physicists call "broken symmetries" of its initial organisation. Our manifest consciousness - the ideas we share (like this one) - is such a manifestation of our cellular broken symmetries.

When we think deeply, we think WE are doing the work. But the work is done by our cells (particularly the calcium pumps). They think deeply. Their behaviour is an attempt to bring coherence to their environment, and the ultimate coherence is to return to their origin and to get closer to the implicate order. Deep thought is time-travel. This is why, I think, a philosopher like John Duns Scotus in the 13th century could have anticipated the logic of quantum mechanics. In our current society, deep thought is not impossible, but the institutional structures we established to help it arise (the universities) have largely been vandalised.

I share many of McGilchrist's concerns about the modern mind. But we need to look deeper than the brain. And we need to look deeper than us. I once asked Ernst von Glasersfeld, whose theory of Radical Constructivism has been very influential in education, where the desire to learn came from. It was all very well, I suggested, to say what we thought the learning process was. But we never say why it is we want to learn in the first place. He didn't have an answer. Now I can tentatively suggest one. We don't want to learn. But our cells, and we who are constituted by them, need to organise themselves in relation to an environment so that it is coherent. Our drive to learn is the cell's search for the implicate order at its origin. All we need to do is listen - but in today's world, that is getting hard.

Saturday 9 March 2019

Implication-Realisation and the Entropic Structure of Everything

The basic structure of any sound is that it starts from nothing, becomes something, and then fades to nothing again. In terms of the flow of time, this is a process of increasing entropy as the features of the note appear; then subtle variation around a stable point (the sustain of the note, vibrato, dynamics, etc.), where entropy decreases (because there is less variation than when the note first appeared); and finally an increase in entropy again when the note is released.

A single note is rarely enough. It must be followed or accompanied by others. There is something in the process of the growth of a piece of music which entails an increase in the "alphabet" in the music. So we start with a single sound, and add new sounds, which add richness to the music. What determines the need for an increase in the alphabet of the sound?

In Eugene Narmour's Implication-Realisation theory of music, there is a basic idea that if there is an A, there must be an A* which negates and complements it. What it doesn't say is that if the A* does not exactly match the A, then there is a need to create new dimensions. So we have A, B, A*, B*, AB and AB*. That is no longer as simple as a single note: for the completion of this alphabet, we require the increase and decrease of entropy not only in a single variable, but in another variable too, alongside an increase and decrease in entropy of the composite relations AB and AB*. The graph below shows the entropy of intervals in Bach's 3-part invention no. 9:

What happens when that alphabet is near-complete, but potentially not fully complete? We need a new dimension, C. So then we require A, A*, B, B*, AB, AB*, C, C*, AC, AC*, BC, BC*, ABC, ABC*. That requires a more complex set of increases and decreases of entropy to satisfy.
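To make the combinatorics concrete, here is a minimal Python sketch of how such an alphabet grows. This is just my illustration - the symbol names and the star-for-negation notation follow the post, but nothing in Narmour's theory prescribes this encoding:

```python
from itertools import combinations

def alphabet(dimensions):
    # Every non-empty combination of dimensions, each paired with
    # its negating/complementary form (marked with '*').
    symbols = []
    for r in range(1, len(dimensions) + 1):
        for combo in combinations(dimensions, r):
            name = "".join(combo)
            symbols.append(name)
            symbols.append(name + "*")
    return symbols

print(alphabet(["A", "B"]))       # A, A*, B, B*, AB, AB* -- six symbols
print(alphabet(["A", "B", "C"]))  # fourteen symbols, as enumerated above
```

With n dimensions the alphabet has 2 x (2^n - 1) symbols, which is one way of seeing why each added variable demands so much more entropic work to complete.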

The relational values AB, AB*, AC, AC*, ABC, ABC* are particularly interesting because one way in which the entropy can increase for all of these at once is for the music to fall to silence. At that moment, all variables change at the same time. So music breathes in order to fulfil the logic of an increasing alphabet. In the end, everything falls into silence.

The actual empirical values for A, B and C might be very simple (rhythm, melody, harmony, etc.). But equally, the most important feature of music is that new ideas emerge as composite features of basic variables - melodies, motivic patterns, and so on. So while at an early stage of the alphabet's emergence we might discern the entropy of notes, intervals or rhythms, at a later stage we might look for the repetition of patterns of intervals or rhythms.

It is fairly easy to first look for the entropy of single intervals, and then the entropy of pairs of intervals, and so on. This is very similar to text analysis techniques which look for bigrams and trigrams in a text (sequences of contiguous words).
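As a rough sketch of that kind of analysis (the melody here is invented data, and treating pitch intervals as the symbols is just one possible encoding):

```python
from collections import Counter
from math import log2

def entropy(symbols):
    # Shannon entropy (in bits) of a sequence of symbols.
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def ngrams(seq, n):
    # Contiguous n-grams of a sequence, as tuples.
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

# An invented melody as MIDI pitch numbers (illustrative data only)
pitches = [60, 62, 64, 65, 67, 65, 64, 62, 60, 62, 64, 65]
intervals = [b - a for a, b in zip(pitches, pitches[1:])]

print(entropy(intervals))             # entropy of single intervals
print(entropy(ngrams(intervals, 2)))  # entropy of interval pairs (bigrams)
print(entropy(ngrams(intervals, 3)))  # entropy of interval triples (trigrams)
```

Comparing the entropy of single intervals with that of bigrams and trigrams over a moving window is one way of watching the alphabet of a piece emerge as it unfolds.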

However, music's harmonic dimension presents something different. One of the interesting features of its harmony is that the frequency spectrum itself has an entropy, and that across the flow of time, while there may be much melodic activity, the overtones may display more coherence across the piece. So, once again, there is another variable...