Tuesday, 18 June 2019

Machine Learning as a Personal Anticipatory System

Can a living system survive without anticipation? As humans we take anticipation for granted as a function of consciousness: without an ability to make sense of the world around us, and to preempt changes, we would not be able to survive. We attribute this ability to high-level functions like language and communication. At the same time, it is apparent that all living things adapt to their environments without always displaying anything like language, although many scientists are reluctant to attribute consciousness to bacteria or cells. Ironically, this reluctance probably has more to do with our human language for describing consciousness than it does with the nature of any "language" or "communication" of cells or bacteria!

We believe human consciousness is special, or exceptional, partly because we have developed a language for making distinctions about consciousness which reinforces a separation between human thought and other features of the natural world. In philosophy, the distinction boils down to "mind" and "body". We have now reached a stage of development where continuing to think like this will most likely destroy our environment, and us with it.

Human technology is a product of human thought. We might believe our computers and big data to be somehow "objective" and separate from us, but we are looking at the manifestations of consciousness. Like other manifestations of consciousness such as art, music, mathematics and science, our technologies tell us something about how consciousness works: they carry an imprint of consciousness in their structure. This is perhaps easiest to see in the artifice of mathematics, which whilst being an abstraction, appears to reveal fundamental patterns which are reproduced throughout nature. Fractals, and the complex numbers upon which they sit, are good examples of this.

It is also apparent in our technologies of machine learning. Behind the excitement about AI and machine learning lies a fundamental problem of perception: these tools display remarkable properties in their ability to record patterns of human judgement and reproduce them, but we have little understanding of how they work. Of course, we can describe the architecture of a convolutional neural network (for example), but in terms of what is encoded in the network, how it is encoded, and how results are produced, we have little understanding. Work with these algorithms is predominantly empirical, not theoretical. Computer programmers have developed "tricks" for training networks, such as taking a network pre-trained on existing public domain image sets (using, for example, the VGG16 model), and then retraining only its final layer for the specific images that they want identified (for example, images of diabetic retinopathy, or faces). This works better than training the whole network from scratch on the specific images. Why? We don't know - it just does.
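The retraining trick can be sketched in miniature: freeze the layers that extract features and fit only a new final layer on the task-specific data. What follows is a toy numpy illustration of the pattern, not the VGG16 pipeline itself - the data, shapes and learning rate are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pretrained backbone: a frozen feature extractor.
# In a real pipeline this would be e.g. VGG16 minus its classification head.
W_frozen = rng.normal(size=(20, 64))      # "pretrained" weights, never updated

def features(x):
    return np.tanh(x @ W_frozen)          # the frozen representation

# Synthetic task-specific data: labels depend on the first input dimension.
X = rng.normal(size=(200, 20))
y = (X[:, 0] > 0).astype(float)

# Retrain only the new final layer on the task-specific data.
w_head = np.zeros(64)
F = features(X)
for _ in range(500):
    p = 1 / (1 + np.exp(-(F @ w_head)))        # sigmoid output
    w_head -= 0.1 * F.T @ (p - y) / len(y)     # logistic-loss gradient step

acc = ((F @ w_head > 0) == (y == 1)).mean()
print(round(acc, 2))
```

Only `w_head` is ever updated; the "pretrained" part is left alone, which is the essence of the trick described above.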

It seems likely that whatever is happening in a neural network is some kind of fractal. The training process of back-propagation involves recursive processing which seeks fixed points in the production of results across a vast range of variables from one layer of the network to the next. The fractal nature of the network means that retraining the network cannot be achieved by tweaking a single variable: the whole network must be retrained. Neural networks are very dissimilar from human brains in this way. But the fractal nature of neural networks does raise a question as to whether the structure of human consciousness is also fractal.

There is an important reason for thinking that it might be. Fractals are by definition self-similar, and self-similarity means that a pattern perceived at one level with one set of variables can be reproduced at another level, with a different set of variables. In other words, a fractal representation of one set of events can have the same structure as the fractal pattern of a different set of events: perception of the first set can anticipate the second set.

I've been fascinated by the work of Daniel Dubois on Anticipatory Systems recently, partly because it is closely related to fractals, and partly because it seems to correlate strongly with the way that neural networks work. Dubois makes the point that an anticipatory system processes events over time by developing models that anticipate them, whilst also generating multiple possible models and selecting the best fit. Each of these models is a differently-generated fractal.
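One of Dubois's simplest illustrations of anticipation is his "incursive" map, in which the next state depends on that same next state: x(t+1) = a·x(t)·(1 − x(t+1)), which can be solved algebraically at each step. A minimal sketch (the parameter and starting values are my own choices for illustration):

```python
def incursive_verhulst(x, a):
    """One step of Dubois's incursive Pearl-Verhulst map.
    Defined implicitly as x(t+1) = a*x(t)*(1 - x(t+1)),
    which rearranges to the explicit form returned here."""
    return a * x / (1 + a * x)

a, x = 2.0, 0.1
trajectory = [x]
for _ in range(50):
    x = incursive_verhulst(x, a)
    trajectory.append(x)

# Unlike the ordinary (recursive) Verhulst map, which oscillates or goes
# chaotic for larger a, the incursive form settles on the stable fixed
# point x* = (a - 1) / a.
print(round(trajectory[-1], 6))  # → 0.5
```

The anticipatory twist is in the definition: each step is computed using the state it is about to produce, which is what stabilises the dynamics.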

If we want to understand what AI and machine learning really mean for society, we need to think about what use an artificial anticipatory system might be. One dystopian view is that it means the "Minority Report" - total anticipatory surveillance. I am sceptical about this, because an artificial anticipatory system is not a human system: its fractals are rigid and inflexible. Human anticipation and machine anticipation need to work together. But a personal artificial anticipatory system is something that is much more interesting. This is a system which processes the immediate information flows of experience and detects patterns. Could such a system help individuals establish deeper coherence in their understanding and action? It might. Indeed, it might counter the deep dislocation produced by overwhelming information that we are currently immersed in, and provide a context for a deeper conversation about understanding.

Sunday, 16 June 2019

Machine Learning and the Future of Work: Why eventually we will all create our own AIs

I'm on my way to Russia again. I've had an amazing couple of days with a Chinese delegation from Xiamen Eye Hospital and the leading experts in retinal disease in China, who are collaborating with us on a big EPSRC project. There was a very special atmosphere: despite the language differences, we were all conscious of staring at the future of medical diagnostics where AI and humans work in partnership.

There's a lot of critical dystopian stuff about technology in society and education in the e-learning discourse at the moment. I think history will see this critical reaction more as a response to desperately nasty things going on in our universities, rather than an accurate prediction of the future. I am also subject to these institutional pathologies, but I suspect both the dystopian critiques and the institutional self-harm are symptoms of more profound changes which are going to hit us. Eventually we will rediscover a sane way of organising human thought and creativity once more, which is what our universities used to do for society.

So this is what I'm going to say to the students in Vladivostok:

Machine Learning, Scientific Dialogue and the Future of Work
It is not unusual today to hear people say how the next wave of the technological revolution will be Artificial Intelligence. Sometimes this is called the "4th industrial revolution": there will be robots everywhere - robot teachers, robot doctors, robot lawyers, etc. In this imagined future, machines take the place of humans. But this is misleading: the future will instead involve a deeper partnership between humans and intelligent machines. In order to understand this, it is important to understand how our technologies of AI work, how the processes of creating AIs and machine learning are becoming available to you and me, and how human work is likely to change in the face of technologies which have remarkable new capabilities.
In this presentation, I will explain how it will become increasingly easy to create our own AIs. Even now, the technologies of Machine Learning are widely available, increasingly standardised and accessible to people with a bit of computer programming knowledge. The situation at the moment is very much like the early web in the 1990s, when to create a website, people needed a bit of knowledge of HTML. As with the web, creating our own AIs will become something everyone can do.  
Drawing on my work, I will explain how in a world of networked services, there is one feature about Artificial Intelligence which is largely ignored by those not informed of its technical nature: AI does not need to be centralised. A machine learning algorithm is essentially a single (and often not very large) file, which can be embedded in any individual device (this is how, for example, the facial recognition works on your phone). The world of AI will be increasingly distributed. 
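The point that an AI is essentially a single small file can be shown in miniature: a trained model is, at bottom, just its learned weights, which can be serialised and carried to any device. A hedged sketch - the toy model, its training task and the file name are all invented for illustration:

```python
import os
import pickle
import tempfile

import numpy as np

# Train a tiny model (a stand-in for any learned network).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
y = (X @ np.array([1.0, -1.0, 0.5, 0.0]) > 0).astype(float)
w = np.zeros(4)
for _ in range(200):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.5 * X.T @ (p - y) / len(y)

# The whole trained model is just its weights: one small file.
path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(w, f)
size = os.path.getsize(path)
print(size, "bytes")   # a few hundred bytes - easily embedded on-device

# Any device holding the file can reproduce the model's judgements.
with open(path, "rb") as f:
    w_loaded = pickle.load(f)
assert np.allclose(w, w_loaded)
```

Real networks are larger, of course, but the principle is the same: the model travels as data, with no need for a central server.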
Finally, I will consider what this future means for human work. One of the important distinctions between human decision-making and AI is that humans make judgements in a context; AI, however, ignores context. In other words, AI, like much information technology, actually discards information, and this has many negative consequences on the organisation of institutions, stable society and the economy. The most potentially powerful feature of AI in partnership with humans is that it can preserve information by preserving the context of human judgement. I will discuss ways in which this can be done, and why it means that those things which humans do best – empathy, criticality, creativity and conversation – will become the essence of the work we do in the future.

Tuesday, 4 June 2019

German Media Theory and Education

I'm discovering a branch of media studies which I was unaware of before Steve Watson pointed me to Erich Hörl's "Sacred Channels: The Archaic Illusion of Communication". Hörl's book is amazing: cybernetics, Luhmann, Bataille, Simondon & co all spiralling around a principal thesis that communication is an illusion, and that many of our current problems arise from the fact that we don't think it is. The "illusion" of communication is very similar to David Bohm's assertion that "everything is produced by thought, but thought says it didn't do it". This is not "media studies" as we know it in UK universities. But it is how the Germans do it, and have been doing it for some time.

Just as Luhmann has been a staple of the German sociology curriculum for undergraduates for 20 years now, so Luhmann's thinking informed a radical view of media which Hörl inherited from Friedrich Kittler. Kittler died in 2011, leaving behind a body of work which teased apart the boundaries between media and human being. Most importantly, he overturned Marshall McLuhan's hypothesis that media "extend" the human. Echoing Luhmann, Kittler says that media make humans. Just as Luhmann pokes at the distinction between psychology and sociology (he really doesn't believe in psychology), Kittler dissolves the "interface" between the human and the media.

The result is that practically everything counts as media. Wagner's Bayreuth was media (Kittler wrote extensively about music, culminating in a four-volume work he never finished, "Music and Mathematics"), AI is media, the city is media. So is education media? Not just the media that education uses to teach (which educational technologists know all about), but education itself - the systemic enveloping of conversations between students and teachers: is that media?

As Erich Hörl has pointed out, these ideas are very similar to those of another voice in technology studies who has gained an increasingly devoted following since his death, Gilbert Simondon. Like Kittler, Simondon starts with systems and cybernetics. Simondon's relevance to the question of education and technology is quite fundamental. Kittler, I don't think, knew his work well, and Hörl acknowledges that he has further to go in his own absorption of it. Simondon made a fundamental connection between media, or machine, and human beings as distinction-making, individuating entities. The individuation process - that process which Jung saw as the fundamental process of personal growth - was tied up with the process of accommodating ourselves to the media which comprise us. This accommodation was achieved through levels of "transduction" - the multiple processes which produce multiple levels of distinctions, from the distinctions between our cells, to the distinctions in our language, to the distinctions with our environment. What happens in education, basically, is that the media which make us who we are are transformed through changes in the way the transductions are organised at different levels.

I described a lot of this in my book, albeit not in the elegant fashion that Kittler, Hörl (or Simondon) would have done. Kittler, Simondon and Hörl have got me thinking in a new way about how we think about education. There's much more to say about this however, because Kittler and Hörl's approach opens the way for a more empirical approach to understanding education as media. I was privileged to have learnt about Luhmann through one of his best disciples, Loet Leydesdorff. Leydesdorff's work has been dedicated to making Luhmann's theory empirically useful, which he has done by relating it to Shannon (which Luhmann did in the first place), and to the mathematics of anticipation by the Belgian mathematician, Daniel Dubois.

Here, we may yet have a science of education which straddles the boundaries between technology, critique, pedagogy and phenomenology whilst maintaining an empirical focus and theoretical coherence. That is the best way of getting better education. This science of education may well turn out to be exactly the same as the empirical and coherent science of media that Kittler and Hörl are aiming for, which transcends the sociological critique of media (seeing that as simply more media!), by providing a meta-methodology for making meaningful distinctions about our distinction-making processes in our media-immersed state.

Sunday, 2 June 2019

Two kinds of information in music and media

My recent music has been exploring the idea that there are two kinds of information in the world. I am following the theory of my colleague Peter Rowlands, who had this to say (in the video below) on the subject of how nature is a kind of information system, but very different from the information systems of our digital computers. Peter summarises the difference by saying that digital information is made from 1s and 0s, but the significant thing is the 1. Nature, he contends, operates with multiple levels of zero. His reasons for thinking this are a thoroughly worked-through mathematical account of quantum mechanics, and particularly the Dirac equation (the only equation in Westminster Abbey!). Nature is all "Much ado about nothing".

I've been fascinated by "nothing" for a long time. Nothing is "absence" as opposed to "presence", and absence is (according to philosophers like Roy Bhaskar, cyberneticians like Gregory Bateson, and biologists like Terry Deacon) constraint. Constraint is important in digital information because it is represented by Shannon's concept of "redundancy". So there is a connection between nothing and redundancy. This resonates for me with music, because music is so full of redundancy, and it does appear to be "much ado about nothing".
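The link between repetition and Shannon redundancy can be made concrete. Redundancy is conventionally 1 − H/H_max; measured over note pairs, a repeating musical figure shows far more redundancy than the same notes in scrambled order. A rough sketch, with toy note strings of my own invention:

```python
import math
from collections import Counter

def entropy(seq):
    """Zeroth-order Shannon entropy (bits) of a symbol sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def bigram_redundancy(notes):
    """Redundancy of consecutive note pairs: 1 - H(bigrams) / H_max,
    where H_max assumes any pair over the note alphabet is equally likely."""
    pairs = [notes[i:i + 2] for i in range(len(notes) - 1)]
    h_max = 2 * math.log2(len(set(notes)))
    return 1 - entropy(pairs) / h_max

motif = "CDECDECDECDE"        # a repeating three-note figure
scrambled = "DCEECDCEDECD"    # the same notes in a jumbled order
print(round(bigram_redundancy(motif), 3),
      round(bigram_redundancy(scrambled), 3))
```

The repeating motif constrains which pair can follow which, so its pair-entropy is low and its redundancy high - a small numerical picture of "much ado about nothing".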

There is something we do when we make music which somehow makes sense. The patterns we create create the conditions for richer patterns which eventually define a structure. Musicians create redundancy in the form of repetition which brings coherence to the music. There are different kinds of redundancy: pitch, rhythm, timbre, intervals, etc. Much of this patterning occurs in the context of an external nature which is always shifting the context in which the music is made. It might be the sound of the wind, or water, or traffic, computer sounds, or elevator music - our sonic environment is moving around us all the time. The musical sense may be the natural pattern-making response to this which seeks to produce coherence. If this is the case, then birdsong and the noises of all animals, and maybe even language itself, can be seen as a process of maintaining coherence of perception within an environment. This is a radical view when applied to language - it means that we don't communicate. We don't transfer "information" between us. As Niklas Luhmann says in his most famous quote,
"Humans cannot communicate; not even their brains can communicate; not even their conscious minds can communicate. Only communication can communicate."
He could be right. It's also quoted in Erich Hörl's new book "Sacred Channels: The Archaic Illusion of Communication". Hörl follows a line of inquiry from Friedrich Kittler (who is also new to me) who argued that "media studies" needs to reject Marshall McLuhan's view that media extend the human; media make the human. Gilbert Simondon said the same thing in connecting technology with human individuation. If there is a new theoretical way forwards for our thinking about technology, media and education, it rests with these people. Cybernetics is at the heart of it.

My music works with this idea: the electronic component of the piece represents the unstable, shifting lifeworld of nature. Because this is "digital", we might think of it as not only the noise of the wind, but the noise of computers - digital information. The piano represents the musician's attempt to create pattern and maintain coherence in the whole. It is engaged in much ado about nothing.

Saturday, 1 June 2019

Augar's Intergenerational Conversation

"Education" as a topic is very complex and hard to define. We might think of schools, classrooms, teachers, but whatever we choose to include as "education" inevitably excludes something. This is the problem of making a distinction about anything - but it is exacerbated when we think of education. The exclusion/inclusion problem creates uncertainty, and this uncertainty has to be managed through a process which usually involves talking to each other. Since talking to each other about important things is something we do in education, the topic of "education" is uniquely caught in a web of conversation. At the beginning of my book, I quoted Everett Hughes, who I think gets it about right when he says that education is a "complex of arts" where:
"the manner of practicing them is the very stuff of the clash of wills and interests; thus, the stuff of politics."
This is the same confrontation of wills and interests that parents face with their children, that the younger generation faces with the older. But all the way through, it is conversation which is the process of negotiation.

Philip Augar's review of post-18 education funding has been fairly warmly received - partly because of the thoughtful tone it sets, and its modest reprimands against some of the more outrageous excesses of marketised higher education. However, as many commentators have pointed out, in cutting the headline fee for students but increasing the repayment period, it appears more socially regressive than the current system. The message hasn't changed: it is the job of students (the young) to pay for their education (pay for their elders to teach them) over the course of their lives, although it is recommended that the loan funding to pay for education may be available for more flexible study options. The rationale is that the young benefit financially from education.

This week I've been involved in two separate discussions about the future of work. That Artificial Intelligence and global data is going to transform the workplace is barely beyond doubt. Exactly what kind of impact it will have on opportunities for the young is as yet unclear. Will every automated service create an equivalent number of jobs in other areas? Will the growth of profits of large corporations which benefit from a falling salary bill trickle-down to those left behind in the rush to reduce expensive human labour? Or are we heading for a data-coordinated future of globalised gig-work at globalised rock-bottom wages? If this is the future for the young, who could blame them for questioning the fairness of the financial burden they bear for an education which turns out to fall short of the promises made by their universities?

This is how we depress the future. As Stafford Beer said (in an unpublished notebook):
"In a hundred years from any `now', everyone alive will be dead: it would therefore be possible for the human race to run its affairs quite differently - in a wise and benevolent fashion. Education exists to make sure this does not happen."
What is AI? What is the Web? Are they technologies "for organising our affairs quite differently"? They could be. "In a wise and benevolent fashion"? Not currently, according to Tim Berners-Lee and many others, but they could be. Then we come to education. Beer is making a point about education's role in reproducing social class divisions, which Bourdieu famously explained. But education is conversation, and more importantly, an intergenerational conversation. Our technologies are tools which both afford the coordination of conversation and create new kinds of remarkable artefacts for us to talk about. And these conversations are intergenerational: to be able to summon up movies, videos or documents on demand and watch or read them together, whether online or in the living room with our kids, is profound and powerful. Something very special happens in those conversations.

In these kinds of simple things - of the elders sharing resources and talking with the young - there is something very important that we've missed in our educational market. Teaching involves the revealing of one's understanding, and the existential need to teach may lie with the elders, not the young. The gains for the young to participate are not always obvious to them (or anyone else). Promises made by the elders to the young about future riches are not always believable, but behind them lies the desire of the elders to encourage the young and preserve humanity after the elders are dead. Successful companies understand the importance of supporting the next generation, and they don't do it for the future financial benefit of the young. They do it to preserve the viability of the business.

If the existential need is to teach, not for the young to learn for future financial gain, then the elders should pay the young to be taught, for them to reveal their understanding to the next generation before the elders die. Only seeing it this way round makes any sense looking into the future: the young will have their own children, they will become the elders, they will have an existential need to teach, and they will pay their young to learn. The spirit of encouragement drives one generation to the next.

Now look at what Augar has tweaked but otherwise left untouched. Despite some florid prose extolling the virtues of education, the underlying existential issue is financial gain for the young through the acquisition of knowledge and certificates. The elders (of whom Augar is one) are merely functionaries in delivering knowledge and certificates. The promise of financial gain will be broken amidst employment insecurity, rents, lifelong debt and inequality. The young will look at the elders and see their big houses and long lifespans (damn it, they won't even die quickly and leave an inheritance!), and ask how it is that their hopes for the future were diminished. Their only respite will be to inflict a similar injustice on their own children as they mutter "there is no alternative". This is positive feedback: the spirit of despair passes from one generation to the next.

Augar's report is thoughtful though, so I don't want to dismiss it. One of his targets is the breaking down of the monolith of the 3-year degree course, and reconfiguring the way the institution's transactions with its students work. This is good. But Corbyn was right about the financing of education and who should pay. It's not just an argument about one generation of students. It's an argument about a viable society. 

Thursday, 23 May 2019

Polythetic Analysis: Can we get beyond ethnography in education?

I had a great visit to Cambridge to see Steve Watson the other day - an opportunity to talk about cybernetics and education, machine learning, and possible projects. He also shared with me a great new book on cybernetics and communication about which I will write later - it looks brilliant: https://www.aup.nl/en/book/9789089647702/sacred-channels

One thing came up in conversation that resonated with me very strongly. It was about empirically exploring the moment-to-moment experience of education - the dynamics of the learning conversation, or of media engagement, in the flow of time. What's the best thing we can do? Well, probably ethnography. And yet, there's something which makes me feel a bit deflated by this answer. While there are some great ethnographic accounts out there, it all becomes very wordy: that momentary flow of experience which is beyond words becomes pages of (sometimes) elegant description. I've been asking myself if we can do better: to take experiences that are beyond words, and to re-represent them in other ways which allow for a meta-discussion, but which also are beyond words in a certain sense.

Of course, artists do this. But then we are left with the same problem as people try to describe what the artist does - in pages of elegant description!

This is partly why Alfred Schutz's work on musical communication really interests me. Schutz wanted to understand the essence of music as communication. In the process, he wanted to understand something about communication itself as being "beyond words". Schutz's descriptions are also a bit wordy, but there are some core concepts: "tuning-in to one another", "a spectrum of vividness of sense impressions", and most interestingly, "polythetic" experience. Polythetic is an interesting word - which has led me to think that polythetic analysis is something we could do more with.

If you google "polythetic analysis", you get an approach to data clustering where things are grouped without having any core classifiers which separate one group from another. This is done over an entire dataset. Schutz's use of polythetic is slightly different, because he is interested in the relations of events over time, where there is never any core classifier which connects one event to another, and yet they belong together because subsequent events are shaped by former events. I suppose if I want to distinguish Schutz from the more conventional use of polythetic, then it might be called "temporal polythetic" analysis.

While there are no core classifiers which distinguish events as belonging to one another, there is a kind of "dance" or "counterpoint" between variables. Schutz is interested in this dance. I've been working on a paper where the dance is analysed as a set of fluctuations in entropy of different variables. When we look at the fluctuations, patterns can be generated, much like the patterns below (which are from a Bach 3-part invention). The interesting question is whether one person's pattern becomes tuned-in to another person's. If it is possible to compare the patterns of different individuals over time, then it is possible to have a meta-conversation about what might be going on - to compare different experiences and different situations. In this way, a polythetic comparison of online experience versus face-to-face might be possible, for example, or a comparison of watching different videos.
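One hedged sketch of what such an analysis could look like: compute a sliding-window entropy for each variable, reduce each to a rise/fall trace, and compare the traces across variables (or, in principle, across people). Everything here - the streams, the window size, the agreement measure - is invented for illustration, not the analysis behind the Bach figures.

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (bits) of a symbol sequence."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def entropy_signature(events, window=4):
    """For each sliding window, mark whether entropy rises (1) or falls/holds (0)
    relative to the previous window: a crude increase/decrease trace."""
    hs = [entropy(events[i:i + window]) for i in range(len(events) - window + 1)]
    return [1 if b > a else 0 for a, b in zip(hs, hs[1:])]

# Two hypothetical streams from one performance: pitches and durations.
pitches = list("CDECDEFGFEDC")
durations = [1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 2, 4]

sig_p = entropy_signature(pitches)
sig_d = entropy_signature(durations)

# "Tuning-in" could then be read off from how often the traces agree.
agreement = sum(a == b for a, b in zip(sig_p, sig_d)) / len(sig_p)
print(sig_p, sig_d, round(agreement, 2))
```

The traces, not the raw data, become the objects of comparison - which is what would make a meta-conversation about different experiences possible.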

So in communication, or conversation, there are multiple events which occur over time: Schutz's "spectrum of vividness" of sense impressions. As these events occur, and simultaneously to them, there is a reflective process whereby a model which anticipates future events is constructed. This model might be a bit like the fractal-like pattern shown above. In addition to this level of reflection, there is a further process whereby there are many possible models, many possible fractals, that might be constructed: a higher level process requires that the most appropriate model, or the best fit, is selected. 

Overall this means that Schutz's tuning-in process might be represented graphically in this way:

This diagram labels the "flow of experience" as "Shannon redundancy" - the repetitive nature of experience, the reflexive modelling process as "incursive", and the selection between possible models as "hyperincursive" (this is following the work on anticipatory systems by Daniel Dubois). 

Imagine if we analysed data from a conversation: everything can have an entropy over time - the words used, the pitch of the voice, the rhythm of words, the emphasis of words, and so on. Or imagine we examine educational media: we can look at the use of camera shots, or slides changing, or words on the screen, and spoken words. Our experience of education and media is all contrapuntal in this way.

Polythetic analysis presents a way in which the counterpoint might be represented and compared so that it acts as a kind of "imprint" of meaning-making. While ethnography tries to articulate the meaning (often using more words than were present in the initial situation), analysing the imprint of the meaning may enable us to create representations of the dynamic process, and to make richer and more powerful comparisons between different kinds of experience.

Wednesday, 8 May 2019

Bach as an anticipatory fractal - and thoughts on computer visualisation

I've got to check that I've got this right, but it seems that an algorithmic analysis I've written of a Bach 3-part invention reveals a fractal. It's based on a table of entropies for different basic variables (pitch, rhythm, intervals, etc). An increase in entropy is a value for a variable "x", where a decrease in entropy is a value for "not-x". Taking the variables as A, B, C, D, etc, there are also values for the combined entropies of AB (and not-AB), AC, BC, etc. And also for ABC, ABD, BCD, and so on.
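A table of single and combined entropies of this kind can be generated mechanically: compute the Shannon entropy of each variable's stream, and the joint entropy of every pair and triple by zipping the streams together. A sketch over invented data (these are not the Bach values, and the variable streams are made up):

```python
import math
from collections import Counter
from itertools import combinations

def joint_entropy(*streams):
    """Shannon entropy (bits) of the tuple-valued stream zip(*streams)."""
    tuples = list(zip(*streams))
    n = len(tuples)
    return -sum(c / n * math.log2(c / n) for c in Counter(tuples).values())

# Hypothetical per-note variables extracted from a score.
variables = {
    "pitch":    list("CDECDEFG"),
    "rhythm":   [1, 1, 2, 1, 1, 2, 1, 1],
    "interval": [0, 2, 2, -4, 2, 2, 1, 2],
}

# Entropy of every combination: P, R, I, PR, PI, RI, PRI.
table = {}
for r in range(1, len(variables) + 1):
    for combo in combinations(variables, r):
        key = "".join(name[0].upper() for name in combo)
        table[key] = joint_entropy(*(variables[name] for name in combo))

for name, h in table.items():
    print(name, round(h, 3))
```

Sliding this computation down the piece window by window would then give the rows of the table, with the combined variables increasing in complexity from left to right.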

The raw table looks a bit like this:
But plotting this looks something like this:

What a fascinating thing that is! It should be read from left to right as an index of increasing complexity of the variables (i.e. more combined variables), with those at the far left the simplest basic variables. From top to bottom is the progress in time of the music. 

My theory is that music continually creates an anticipatory fractal, whose coherence emerges over time. The fractal is a selection mechanism for how the music should continue. As the selection mechanism comes into focus, so the music eventually selects that it should stop - that it has attained a coherence within itself. 

Need to think more. But the power of the computer to visualise things like this is simply amazing. What does it do to my own anticipatory fractal? Well, I guess it is supporting my process of defining my own selection mechanism for a theory!

Tuesday, 7 May 2019

"Tensoring" Education: Machine Learning, Metasystem and Tension

I've been thinking a lot about Buckminster Fuller recently, after I gave a talk to architecture students about methods in research (why does research need a method?). One of the students is doing an interesting research project on whether tall buildings can be created in hot environments without requiring artificial cooling systems. The tall building is a particular facet of modern society which is overtly unsustainable: we seem only to be able to build these monoliths and make them work by pumping a huge amount of technology into their management systems. Inevitably, the technology will break down, or become too expensive to run or maintain. One way of looking at this is to see the tall building as a "system", which makes a distinction between itself and its environment, but whose distinction raises a whole load of undecidable questions. Technologies make up the "metasystem" - the thing that mops up the uncertainty of the building and keeps its basic distinction intact. Overbearing metasystems are the harbinger of doom - whether in a passenger plane (the Boeing 737 Max story is precisely a story about multiple levels of overbearing metasystems), in society (universal credit, surveillance), or in an institution (bureaucracy).

Buckminster Fuller made the distinction between "compression" and "tension" in architecture. We usually think of building in terms of compression: that means "stuff" - compressed piles of bricks on the land. His insight was that tension appeared to be the operative principle of the universe - it is the tension of gravity, for example, that keeps planets in their orbit. Fuller's approach to design was one of interacting and overlapping constraints. This is, of course, very cybernetic, and the geodesic dome was an inspiration to many cyberneticians - most notably, Stafford Beer, who devised a conversational framework around Fuller's geometrical ideas called "syntegrity".

In education too, we tend to think of compressed "stuff": first there are the buildings of education - lecture halls, libraries, labs and so on. Today our "stuff"-focused lens is falling on virtual things - digital "platforms" - MOOCs, data harvesting, and so on, as well as the corporate behemoths like Facebook and Twitter. But it's still stuff. The biggest "stuff" of all in education is the curriculum - the "mass" of knowledge that is somehow (and nobody knows exactly how) transferred from one generation to the next. Fuller (and Beer) would point out that this focus on "stuff" misses the role of "tension" in our intergenerational conversation system.

Tension lies in conversation. Designing education around conversation is very different from designing it around stuff. Conversation is the closest analogue to gravity: it is the "force" which keeps us bound to one another. As anyone who's been in a relationship breakdown knows - as soon as the conversation stops, things fall apart, expectations are no longer coordinated, and the elements that were once held in a dynamic balance, go off in their different directions. Of course, often this is necessary - it is part of learning. But the point is that there is a dynamic: one conversation breaks and another begins. The whole of society maintains its coherence. But our understanding of how this works is very limited.

Beer's approach was to make interventions in the "metasystems" of individuals. He understood that the barriers to conversation lay in the "technologies" and "categories" which each of us has built up within us as a way of dealing with the world. Using Buckminster Fuller's ideas, he devised a way of disrupting the metasystem and, in the process, opening up individuals to their raw uncertainty. This then necessitated conversation as individuals had to find a new way to balance their inner uncertainty with the uncertainty of their environment.

The design aspect of tensored education focuses on the metasystem. Technology is very powerful in providing a context for people to talk to each other. However, there is another aspect of "tensoring" which is becoming increasingly important in technology: machine learning. Machine learning's importance lies in the fact that it is a tensored technology: it is the product of multiple constraints - much like Buckminster Fuller's geodesic dome. The human intelligence that machine learning feeds on is itself "tensored" - our thoughts are, to varying extents, ordered. Expert knowledge is more ordered in its tensored structure than that of novices. Machine learning is able to record the tensoring of expert knowledge.

When devising new ways of organising a tensored education, this tool for coordinating tension in the ordering of human understanding, and for avoiding "compression", may be extremely useful.

Sunday, 28 April 2019

How the Roli Seaboard is changing the way I think about music

I am making very weird noises at the moment. Partly encouraged by a richly rewarding collaboration with John Hyatt and Mimoids (see https://www.facebook.com/john.hyatt.9210/videos/10212046977604430/), a digital musical instrument - the Roli Seaboard - is becoming my favoured mode of musical expression. A year ago, I would have thought that highly improbable. For me, nothing could touch the sensitivity, breadth of expression and sophistication that is possible with an acoustic piano - if you have the technique to do it. Having said that, I do wonder if we've run out of ideas within that medium.

Part of the problem with contemporary music is that the only way forwards is towards greater complexity. And with greater complexity sometimes comes a barrier with people: music becomes "clever" or "difficult" and we lose something of what matters about the whole thing in the first place.

While I've been thinking about this, I've also been thinking about what music really is in the first place. Why do I have some kind of "soundtrack" running in my head all the time? What's going on? Is it connected to the way I make sense of the world?

Music's profound quality arises from redundancy. That's interesting because it raises the question as to why my cognitive system has to continually generate redundancy. The interesting thing is that redundancy can create coherence. So maybe that continual soundtrack is simply my consciousness making sense of the chaos around me. I'm beginning to wonder about this with regard to all communicative musicality... even learning conversations: they seem to arise from some profound need to make sense of things - and not just by learners, but by teachers too.

This also helps to explain why class music lessons in school are often terrible. Attempting to rationally codify the very thing that we use all the time to make sense of the world is likely to result in some kind of adverse reaction.

In a complex world, simplicity is important. Which brings me back to contemporary music. Not that we want to create simple music and put it on the pedestal of high art. But we need to express something of what music does to us, and perhaps to understand how it works better. The piano is a sophisticated and delicate instrument which can make simple things sound interesting. But the Roli Seaboard is an instrument which expresses ambiguity, complexity and variety in a way that the piano cannot. To me, the Seaboard sounds like the world around us - the noisy world of loudspeakers, garish colours, and distraction. The Seaboard is context, and it creates a frame for our simpler and more traditional forms of music to reveal what they really do for us: to create coherence, and (in terms of collective singing) conviviality.

Saturday, 27 April 2019

Tradition, Redundancy and Losing the Way

This week there was a rare opportunity to hear Michael Tippett's piano concerto in Manchester (it's rare anywhere) with Steven Osborne playing (who was a fellow student with me at Manchester University in the late 80s). I hadn't heard the Tippett for years - it's incredibly radiant and warm music. Another composer, John McCabe, said something fascinating about him: "I find Tippett's music tends to make me feel better" (see https://www.youtube.com/watch?v=NoS22TCM-7Q). I agree, and Tippett was very conscious that he was attempting to do something physiological with sound (he got this from Vincent D'Indy - see https://dailyimprovisation.blogspot.com/2012/01/vincent-dindy-and-breath-of-music.html). This, in his mind, was deeply connected to social concerns and emancipation, as well as to depth psychology. Jung and T.S. Eliot were profound influences.

Both these issues have been on my mind. On the day of the concert I had had a job interview (the first for a long time), which, although I didn't get the job, prompted a fascinating discussion about individuation, both from a Jungian perspective and from that of Simondon. But during the concert I was thinking about the ritual of playing music, and returning to music from many years ago, and thinking about Eliot's famous essay "Tradition and the Individual Talent", which I had first got to know at Manchester with Tippett's biographer.

The whole arts world is a kind of ritual, seeming to preserve an elite social order. When that order is challenged - for example, by an 850-year-old cathedral burning down - the human reaction seems irrational - but its elite nature is clear for all to see. The irony is that great art - and Tippett was a visionary artist - is made in the spirit of challenging the social order (he was also a Marxist). His piano concerto is a superb case-in-point: unlike any other concerto, it is anti-heroic. Few pianists would take it on because it doesn't put them in the spotlight. Audiences are disoriented because their expectations are frustrated by a fiendishly difficult piano part which causes the soloist to work very hard, but which remains veiled behind a collective radiant wall of sound. For most of it, the soloist is an accompaniment, or a catalyst. Tippett was making a statement: one that is echoed in Eliot's essay -
The emotion of art is impersonal. And the poet cannot reach this impersonality without surrendering himself wholly to the work to be done. And he is not likely to know what is to be done unless he lives in what is not merely the present, but the present moment of the past, unless he is conscious, not of what is dead, but of what is already living.
Steven Osborne and Andrew Davis take this on because they understand this and believe in it. But there are contradictions (even in upholding them as "champions"!). Even in the wonderful performance in Manchester, I wondered if the point was lost on most of the audience. How do we get the point across about accompaniment or catalysis in a world which fetishises the individual achievement? Another way of asking this is to say "How do we see relations and the conversation as more important than the individual?" This was really what I talked about in my interview. And I have reflected on it more as I have thought that most of what I have done - in academia and in music - was catalysis.

But there are deeper questions about ritualised tradition. If one were to compress the years since the composition of Beethoven's 5th symphony, and examine the many millions of performances, then the ritualised repetition whereby people gather together and re-perform a set of instructions looks full of redundancy. Is redundancy the basis of tradition?

Redundancy is the basis of so much communication, from the crying of a baby, to the squawks of crows or music itself. Teaching depends on the redundancy of saying the same thing many different ways. Like playing Beethoven 5 in different ways (but rarely that different - apart from this... https://www.youtube.com/watch?v=wOiBlL9pHMw). What is it? What's going on?

My speculation is that the world is a confusing place. All living things struggle to bring coherence to it - and they do this through conversation. We are thrown into conversation from birth. Through conversation, living things negotiate the differences between the different distinctions they make. Although we see those agreed distinctions - like words in a language - as "information", the really important thing is the redundancy that sits in the background of the process that makes it. It's the redundancy that brings coherence - just as the redundancy of Beethoven's motifs gives form to his symphony.
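The role of redundancy here can be made concrete with Shannon's measure: redundancy is what remains when you divide the entropy a sequence actually has by the maximum entropy it could have. A minimal Python sketch (the function name and the toy sequences are mine, purely for illustration):

```python
from collections import Counter
from math import log2

def redundancy(symbols):
    """Shannon redundancy R = 1 - H/H_max of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    # Observed entropy of the symbol distribution, in bits
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    # Maximum entropy: all distinct symbols equally likely
    h_max = log2(len(counts)) if len(counts) > 1 else 1.0
    return 1 - h / h_max

# A motif that insists on repeating itself is highly redundant;
# a sequence that never repeats a symbol has no redundancy at all.
assert redundancy("AAAA") == 1.0   # pure repetition
assert redundancy("ABCD") == 0.0   # maximal "information", zero pattern
assert redundancy("AAAB") > redundancy("AABB")
```

On this view, a Beethoven motif hammered out again and again sits at the high-redundancy end, and it is exactly that repetition, not the novel "bits", that gives the music its form.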

To accompany, to catalyse, we have to see the redundancy that needs to be added to bring coherence. I think this is really what teachers do. It's actually the opposite of "information". What Eliot describes as "surrendering to the work to be done" is the process of identifying the redundancy that needs to be created. In Gregory Bateson's terms, it is identifying the "pattern that connects". The ritual of teaching and the ritual of performance of tradition are all about the coherence of our civilisation. There's something profoundly necessary about it, and yet within it are dangers which can produce incoherence.

To lose one's way is to lose sight of the process of creating redundancy, of catalysing ongoing conversations. This can happen if we codify the products of a previous age to the point that we believe that merely repeating these "products" - the information - will maintain civilisation. It will instead do the opposite. That's why Tippett's message - and his example - is important. It's not the figure; it's the ground - the earth - our shared context.

Monday, 15 April 2019

Kaggle and the Future University: Learning. Machine. Learning.

One of the most interesting things that @gsiemens pointed out the other day in his rant about MOOCs was that people learning machine learning had taught themselves through downloading datasets from Kaggle (http://kaggle.com) and using the now abundant code examples for manipulating and processing these datasets with the Python machine learning libraries which are also all on GitHub, including tensorflow and keras. Kaggle itself is a site for people to engage in machine learning competitions, for which it gathers huge datasets on which people try out their algorithms. There are now datasets for almost everything, and the focus of my own work on diabetic retinopathy has a huge amount of stuff in it (albeit a lot of it not that great quality). There is an emerging standard toolkit for AI: something like Anaconda with a Jupyter notebook (or maybe PyCharm), and code which imports tensorflow, keras, numpy, pandas, etc. It's become almost like the ubiquity of setting up database connectors to SQL and firing queries (and is really the logical development of that).

Whatever we might think of machine learning with regard to any possibility of Artificial Intelligence, there's clearly something going on here which is exciting, increasingly ubiquitous, and perceived to be important in society. I deeply dislike some aspects of AI - particularly its hunger for data which has driven a surveillance-based approach to analysis - but at the same time, there is something fascinating and increasingly accessible about this stuff. There is also something very interesting in the way that people are teaching themselves about it. And there is the fact that nobody really knows how it works - which is tantalising.

It's also transdisciplinary. Through Kaggle's datasets, we might become knowledgeable in Blockchain, Los Angeles's car parking, wine, malaria, urban sounds, or diabetic retinopathy. The datasets and the tools for exploring them are foci of attention: codified ways in which diverse phenomena might be perceived and studied through a coherent set of tools. It may matter less that those tools are not completely successful in producing results: they do something interesting which provides us with alternative descriptions of whatever it is we are interested in.

What's missing from this is the didacticism of the expert. What instead we have are algorithms which for the most part are publicly available, and the datasets themselves, and a question - "this is interesting... what can we make of it?"

We learn a lot from examining the code of other people. It contains not just a set of logic, but expresses a way of thinking and a way of organising. When that way of thinking and way of organising is applied to a dataset, it also expresses a way of ordering phenomena.

Through my diabetic retinopathy project, I have wondered whether human expertise is ordinal. After all, what do we get from a teacher? If we meet someone interesting, it's tempting to present them with various phenomena and ask them "What do you think about this?". And they might say "I like that", or "That's terrible!". If we like them, we will try to tune our own judgements to mirror theirs. The vicarious modelling of learning seems to be something like an ordinal process. And in universities, we depend on expertise being ordinal - how else could assessment processes run if experts did not order their judgements about student work in similar ways?
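If expertise really is ordinal, the claim is testable: two experts ranking the same student work should produce correlated orderings. Kendall's tau is the standard rank-correlation measure for exactly this. A stdlib Python sketch - the two "examiners" and their essay rankings are invented for illustration:

```python
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    """Kendall rank correlation between two orderings of the same items.
    +1 means identical order, -1 fully reversed, 0 unrelated."""
    pos_a = {item: i for i, item in enumerate(rank_a)}
    pos_b = {item: i for i, item in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # Same sign: both examiners order the pair the same way
        agree = (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y])
        if agree > 0:
            concordant += 1
        elif agree < 0:
            discordant += 1
    total = len(rank_a) * (len(rank_a) - 1) / 2
    return (concordant - discordant) / total

# Two hypothetical examiners ranking the same five essays, best first:
examiner_1 = ["essay_c", "essay_a", "essay_e", "essay_b", "essay_d"]
examiner_2 = ["essay_c", "essay_e", "essay_a", "essay_b", "essay_d"]
print(kendall_tau(examiner_1, examiner_2))  # 0.8: one swapped pair out of ten
```

High tau between independent markers is precisely what exam boards implicitly rely on; low tau is where the bureaucracies of moderation step in.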

The problem with experts is that when expertise becomes embodied in an individual it becomes scarce, so universities have to restrict access to it. Moreover, because universities have to ensure they are consistent in their own judgement-making, they do not fully trust individual judgement, but organise massive bureaucracies on top of it: quality processes, exam boards, etc.

Learning machine learning removes the embodiment of the expertise, leaving the order behind. And it seems that a lot can be gained from engaging with the ordinality of judgements on their own. That seems very important for the future of education.

I'm not saying that education isn't fundamentally about conversation and intersubjective engagement. It is - face-to-face talk is the most effective way we can coordinate our uncertainty about the world. But the context within which the talking takes place is changing. Distributing the ordinality of expert judgements creates a context where talk about those judgements can happen between peers in various everyday ways rather than simply focusing on the scarce relation between the expert and the learner. In a way, it's a natural development from the talking-head video (and it's interesting to reflect that we haven't advanced beyond that!).


Every improvisation I am making at the moment is dominated by an idea about the nature of reality as being a hologram, or fractal. So the world isn't really as we see it: it's our cells that make us perceive it like that, and it's our cells that make us perceive a "me" as a thing that sees the world in this way.

This was brought home to me even more after a visit to the Whitworth gallery's wonderful exhibition of ancient Andean textiles. They were similar to the one below (from Wikipedia)

It's the date which astonishes: sometime around 200CE. Did reality look like this to them? I wonder if it might have done.

This music is kind-of in one key. It's basically just a series of textures and slides (which are meant to sound like traffic) that embellish a fundamental sound. I like to think that each of these textures overlays some fundamental pattern with related patterns at different levels. The point is that all these accretions of pattern produce a coherence by forming a fractal.

Saturday, 13 April 2019

Comparative Judgement, Personal Constructs and Perceptual Control

The idea that human behaviour is an epiphenomenon of the control of perception is an idea associated with Bill Powers's "Perceptual Control Theory", which dates back to the 1950s. Rather than human consciousness and behaviour being "exceptional", individual, etc, it is rather seen as the aggregated result of the interactions of a number of subsystems, of which the most fundamental is the behaviour of the cell. So if our cells are organising themselves according to the ambiguity of their environment (as John Torday argues), and in so doing are "behaving" so as to maintain homeostasis with their environment by producing information (or neg-entropy), and reacting to chemiosmotic changes, then consciousness and behaviour (alongside growth and form) is the epiphenomenal result.

So when we look at behaviour and learning, and look back towards this underlying mechanism, what do we see? Fundamentally, we see individuals creating constructs: labels with which individuals deal with the ambiguity and uncertainty of the world. But what if the purpose of the creation of constructs is analogous to the purpose of the cell: to maintain homeostasis by producing negentropy and reacting to chemiosmosis (or perhaps noise in the environment)?

We can test this. Presenting individuals with pairs of different stimuli and asking them which they prefer and why is something that comparative judgement software can do. It's actually similar to the rep-grid analysis of George Kelly, but rather than using 3 elements, 2 will do. Each pair of randomly chosen stimuli (say bits of text about topics in science or art), are effectively ways of stirring-up the uncertainty of the environment. This uncertainty then challenges the perceptual system of the person to react. The "construct", or the reason for one choice or another, is the person's response to this ambiguity.

The interesting thing is that as different pairs are used, so the constructs change. Moreover, the topology of what is preferred to what also gradually reveals contradictions in the production of constructs. This is a bit like Powers's hierarchies of subsystems, each of which is trying to maintain its control of perception. So at a basic level, something is going on in my cells, but as a result of that cellular activity, a higher-level system is attempting to negotiate the contradictions emerging from that lower system. And then there is another higher-level system which is reacting to that system. We have layers of recursive transduction.
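The mechanics of this are easy to sketch: collect pairwise preferences, derive an ordering from them (a crude win-count stand-in for the Bradley-Terry models that real comparative judgement software uses), and look for the preference cycles that signal contradictory constructs. All the names and example "texts" below are invented:

```python
from collections import defaultdict
from itertools import permutations

def order_from_judgements(judgements):
    """Derive an ordering from (winner, loser) pairs by counting wins -
    a crude stand-in for a proper Bradley-Terry fit."""
    wins = defaultdict(int)
    items = set()
    for winner, loser in judgements:
        wins[winner] += 1
        items.update((winner, loser))
    return sorted(items, key=lambda i: wins[i], reverse=True)

def contradictions(judgements):
    """Find preference cycles A>B>C>A: the 'contradictions' that the
    topology of preferences gradually reveals."""
    beats = set(judgements)
    items = {i for pair in judgements for i in pair}
    return [(a, b, c) for a, b, c in permutations(items, 3)
            if (a, b) in beats and (b, c) in beats and (c, a) in beats]

# Invented judgements over three text fragments:
consistent = [("text_a", "text_b"), ("text_b", "text_c"), ("text_a", "text_c")]
cyclic = [("text_a", "text_b"), ("text_b", "text_c"), ("text_c", "text_a")]
print(order_from_judgements(consistent))  # ['text_a', 'text_b', 'text_c']
print(len(contradictions(cyclic)))        # the one cycle, in its 3 rotations
```

The cycles are the interesting part: a consistent judge produces a clean ordering, while a cyclic set of preferences is exactly the kind of lower-level conflict that a higher-level system has to resolve.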

It's interesting to reflect on the logic of this and compare it to our online experience. Our experience of Facebook and the media in general is confusing and disabling precisely because the layers of recursive transduction are collapsed into one. Complexity requires high levels of recursion to manage it, and most importantly, it requires the maintenance of the boundaries between one layer of recursion and another. From this comes coherence. Without this, we find ourselves caught in double-binds, where one layer is in conflict with another, with no capacity to resolve the conflict at a new level of recursion.

If we want to break the stranglehold of the media on our minds, we need new tools for bringing coherence to our experiences. I wonder whether, were we to have these tools, self-organised learning without institutional control might become a much more achievable objective.

Tuesday, 9 April 2019

The OER, MOOC and Disruption Confusion: Some thoughts about @gsiemens claim about MOOCs and Universities

George Siemens made a strong claim yesterday that "Universities who didn't dive into the MOOC craze are screwed". He justified this by acknowledging that although the MOOC experiment has so far not been entirely successful, the business of operating at scale in the environment is the most important thing for universities. The evidence he points to is that many machine learning/big data experts taught themselves how to do it through online resources. Personally, I can believe this is true. George's view prompted various responses among many who are generally hostile to the "disruption" metaphor of technology in education (particularly MOOCs), but the most interesting responses suggested that the real impact of the MOOC was on OER, and that open resources were the most important thing.

I find the whole discussion around disruption, MOOCs and OER very confusing. It makes me think that these are not the right questions to be asking. They all seem to view the activities which happen online between individuals and content through the lens of what happens in traditional education:
Disruption = "hey kids, school's closed. Let's have a lesson in the park!";
MOOCs = "Hey kids, we're going to study with 6 million other schools today!";
OER = "Hey kids, look a free textbook!". 
The web is different in ways which we haven't fathomed yet. It's obvious now that this difference is not really being felt directly in education: as I have said in my book, and Steve Watson said the other day in Liverpool (see https://www.eventbrite.co.uk/e/cybernetics-and-de-risking-the-universities-talkdiscussion-tickets-59314680807#), education is largely using technology to maintain its existing structures and practices. But the difference is being felt in the workplace, in casualisation, among screen-addicted teens, and in increasingly automated industries which would once have provided those teens with employment.

The web provides multiple new options for doing things we could do before. The free textbook is a co-existing alternative to the non-free textbook; the MOOC is a not-too-satisfying but co-existing alternative to expensive face-to-face education. What we have seen is an explosion of choice, and an accompanying explosion of uncertainty as we attempt to deal with the choice. Our institutions and technology corporations have both been affected by the increase in uncertainty.

What we are now discovering in the way we use our electronic devices provides a glimpse into how our consciousness deals with uncertainty and multiplicity. On the surface, it doesn't look hopeful. We appear to be caught in loops of endless scrolling, swiping and distraction. But what we do not see is that this pathological behaviour is the product of a profit-driven model which demands that tech companies increase the number of transactions that users have with their tools: their share-prices move with those numbers. Every new aspect of coolness, from Snapchat image filters to Dubsmash silliness and VR immersive environments, serves to increase the data flows. Our tech environment has become toxic, resulting in endless confusion and double-binds. But we are told a lie: that technology does this. It doesn't. Corporations do this, because this is the way you make money in tech - by confusing people. It is, unfortunately, also the way universities are increasingly operating. Driven by financial motives, they have become predatory institutions. Deep down, everything has become like this because turning things into money is a strategy for dealing with uncertainty.

All human development involves bringing coherence to things. It is, fundamentally, a sense-making operation. Coherence takes a multiplicity of things and orders them in a deeper pattern. Newman put it well:
"The intellect of man [...] energizes as well as his eye or ear, and perceives in sights and sounds something beyond them. It seizes and unites what the senses present to it; it grasps and forms what need not have been seen or heard except in its constituent parts. It discerns in lines and colours, or in tones, what is beautiful and what is not. It gives them a meaning, and invests them with an idea. It gathers up a succession of notes into the expression of a whole, and calls it a melody; it has a keen sensibility towards angles and curves, lights and shadows, tints and contours. It distinguishes between rule and exception, between accident and design. It assigns phenomena to a general law, qualities to a subject, acts to a principle, and effects to a cause." 
This is what consciousness really does. What Newman doesn't say is that the means by which this happens is conversation. And this is where the web we have falls down. It instead acts as what Stafford Beer called an "entropy pump" - sowing confusion. The deeper reasons for this lie in fundamental differences between online and face-to-face conversation, which we are only beginning to understand. But we will understand them better in time.

I find myself agreeing with Siemens. I do not think that the traditional structures of higher education will survive a massive increase in technology-driven uncertainty. In the end, it will have to change into something more flexible: we will dispense with rigid curricula and batch-processing of students. Maybe the MOOC experiment has encouraged some to think the unthinkable about institutional organisation. Maybe.

A university, like any organism, has to survive in its environment. Universities are rather like cells, and like cells, they evolve by absorbing aspects of the environment within their own structures (those mitochondria were once independently existing organisms). In biology this is called endosymbiosis. That is how to survive - to embrace and absorb. Technology is also endosymbiotic in the sense that it has embraced almost every aspect of life. It feels like we are in something of a stand-off between technology and the university, where the university is threatened and as a result is putting up barriers, reinforced by "market forces". This is also where our current pathologies of social media are coming from. Adaptation will not come from this.

Creating and coordinating free interventions in the environment is at least a way of understanding the environment better. Personally, I think grass-roots things like @raggeduniversity are also important. MOOCs were an awkward way of doing this. But the next wave of technology will do it better, and eventually I think it will create the conditions whereby human consciousness can create coherence from conversations amid the uncertainty of the challenging world of AI and automation in which it finds itself.

Sunday, 7 April 2019

Natural information processing and Digital information processing in Education

In my book, I said:

"Educational institutions and their technology are important because they sit at the crossroads between the ‘natural’ computation which is inherent in conversation and our search for knowledge, and the technocratic approach to hierarchy which typifies all institutions."

I don’t mean to say that education institutions have to be hierarchical – but they clearly are. Nor do I mean to say that they have to be technocratic – but, increasingly and inevitably, they clearly are. It’s more about a distinction between the kind of “computing” that goes on in technocratic hierarchies, and the kind of “computing” that goes on in individuals as they have conversations with one another. And education seems to have to negotiate these two kinds of computing.

Without conversations, education is nothing. Without organisation, coherent conversation is almost impossible.

It’s as if one form of information – the information of the computer, of binary choice, of logistical control – has to complement the information of nature, organic growth and emotional flux. When the balance is right, things work. At the moment, the technocratic idea of information and its technologies dominate, squeezing out the space for conversation. And that’s why we are in trouble.

We know how our silicon machines work (although we may be a bit confused by machine learning!), but we don’t know how “natural” computing works. But we have some insights.

Natural computing seems to work on the basis of pattern – or, in information theoretical terms, redundancy. Only through the production of pattern do things acquire coherence in their structure. And without coherence, nothing makes sense: “can you say that again?”… We do this all the time as teachers – we make redundancy in our learning conversations.

Silicon, digital, information conveys messages in the form of bits, and while redundancy is a necessary part of that communication process, it is the “background” to the message. It is, in the simplest way, the “0” to the information’s “1”.

So is natural computing all about “0”? Is it “Much ado about nothing”? I find this an attractive idea, given that all natural systems are born from nothing, become something, and eventually decay back to nothing again. A sound comes from silence, wiggles around a bit, and then fades to silence again. All nature is like this. The diachronic structure of nature is about nothing.

Moreover, in the universe, Newton’s third law tells us that the universe taken as a totality must essentially be nothing too. There may have been a “big bang”, but there was also an “equal and opposite reaction”. Somewhere. And this is not to say anything about spiritual practices, which are almost always focused on nothingness.

When we learn and grow, we do so in the knowledge that one day we will die. But we do this in the understanding that others will live and die after us, and that how we convey experience from one generation to the next can help to keep a spirit of ongoing life alive.

Schroedinger’s “What is Life” considered that living systems exhibit negative entropy, producing order, and working against the forces of nature which produce entropy or decay. I think this picture needs refinement. Negative entropy can be “information” which in Shannon’s sense is measured in “bits” – or 1s and 0s. But negative entropy may also be an “order of nothings”. So life is an order of nothing from which an order of bits is an epiphenomenon?

Our “order of bits” has made it increasingly difficult to establish a coherent order of nothing. Our digital technologies have enforced an “order of bits” everywhere, not least in our educational institutions. But the relationship between digital information and natural information can be reorganised. Our digital information may help us to gain deeper insight into how natural information works.

To do this, we must turn our digital information systems on our natural information systems, to help us steer our natural information systems more effectively. But the key to getting this to work is to use our digital technologies to help us understand redundancy, not information.

Techniques like big data focus on information in terms of 1s and 0s: they take natural information systems and turn them into digital information. This is positive feedback, from which we seek an “answer” – a key piece of information with which we can then make a decision. But in looking for an answer we are looking for the wrong thing. We need instead to create coherence.

Our digital tools may be turned towards identifying patterns in nature: the mutually occurring patterns at different levels of organisation. They can present these things to us not in the form of an answer, but in the form of a more focused question. Our job, then, is to generate more redundancy – to talk to each other, to do more analysis – to help to bring coherence to the patterns and questions which are presented to us. At some point, the articulation of redundancies will bring coherence to the whole living system.

I think this is what we really need our educational technology to do. It is not about learning maths, technology, science, or AI (although all those things may be the result of creating new redundancies). It is about creating ongoing coherence in our whole living system.

Wednesday, 27 March 2019

@NearFutureTeach Scenarios and Getting many brains to think as one

I went to the final presentation of the Near Future Teaching project at Edinburgh yesterday. I've been interested in what's happening at Edinburgh for a while because it looked to me like a good way of getting teachers and learners to talk together about teaching and to think about the future. As with many things like this, the process of doing this kind of project is all-important - sometimes more important than the products.

I'm familiar with a scenario-based methodology because this is what we did on the large-scale iTEC project (see http://itec.eun.org/web/guest;jsessionid=491AB3788AEB8152821138272A35C5E4) which was coordinated by European Schoolnet. Near Future Teaching has followed a similar plan - identification of shared values, co-design of scenarios, technological prototyping/provoking (using what they neatly called "provo-types"). iTEC took its technological prototypes a bit more seriously, which - on reflection - I think was a mistake (I wrote about it here: https://jime.open.ac.uk/articles/10.5334/jime.398/).

During iTEC I wasn't sure about scenario-building as a methodology. It seemed either too speculative or not speculative enough: the future was imagined through the lenses with which we see the present. We're always surprised by the future, often because it involves getting a new set of lenses. I was talking to a friend at Manchester University on Monday about how theologians/religious people make the best futurologists: Ivan Illich, Marshall McLuhan, C.S. Lewis (his "Abolition of Man" is an important little book), Jacques Ellul. Maybe it's because the lens that allows you to believe in God is very different to the lens that looks at the world as it is - so these people are good at swapping lenses.

After Near Future Teaching, I'm a bit more enthusiastic about scenarios. I spoke to a primary school teacher who was involved in the project, and we discussed the fact that nobody is certain about the future. Uncertainty is the great leveller: teachers and learners are in the same boat, and this is a stimulus for conversation and creativity. It's not a dissimilar idea to this: https://dailyimprovisation.blogspot.com/2018/10/transforming-education-with-science-and.html

But then there is something deeper about this kind of process. Uncertainty is a disrupter to conventional ways of looking at the world. Each of us has a set of categories or constructs through which we view the world. Sometimes the barriers to conversation are those categories themselves, and making interventions which loosen the categories is a way of creating new kinds of conversation. Introducing "uncertain topics" does this.

In his work on organisational decision-making, Stafford Beer did a similar thing with his "syntegration" technique. That involved surfacing issues in a group, and then organising conversations which deliberately aimed to destabilise any preconceived ways of looking at the world. Beer aimed to create a "resonance" in the communications within the group as their existing categories were surrendered and new ones formed in the context of conversation. The overall aim was to "get many brains to think as one brain". Given the disastrous processes of collective decision-making which we are currently witnessing, we need to get back to this!

Having said this, there's something about the whole process which IS teaching itself. That leads me to think that the methods of Near Future Teaching are closely aligned with near future teaching itself. Maybe the scenarios can be dispensed with; almost certainly we have to rethink assessment, the curriculum and the institutional hierarchy; but the root of it all is conversation which disrupts existing ways of thinking and establishes coherence within a group.

If we had this in education, Brexit would just be a cautionary tale.

Sunday, 24 March 2019

Human Exceptionalism and Brexit Insanity

Why have we managed to tie ourselves in knots? It's (k)not just over Brexit. It's over everything - austerity, welfare, tax, university funding, climate change, the point of education...

Following on from my last post, a thought has been niggling me: is it because we think human consciousness is exceptional? Is our belief in the exceptionalism of consciousness in the human brain stopping us from seeing ourselves as part of something bigger? The problem is that as soon as we see ourselves as something special - that our consciousness is somehow special - we consider that one person's consciousness is more special than another's. Then we hold on to our individual thoughts or "values" (they're a problem too) and see to it that the thoughts and values of one person must hold out against the thoughts and values of another. Could it be that, precisely because consciousness is not exceptional, this belief creates such a terrible mess?

If consciousness is not exceptional, what does it do? What is its operating principle?

In my book, Uncertain Education, I argued that "uncertainty" was the most useful category through which to view the education system. I think uncertainty is a good category through which to view an unexceptional consciousness too. Consciousness, I think, is a process which coordinates living things in managing uncertainty. It is a process which maintains coherence in nature.

This process can be seen in all lifeforms from cells to ants to humans. What we call thinking is an aggregate of similar processes among the myriad of cellular and functionally differentiated components from which we are made, and which constitute our environment. The brain is one aggregation of cells which performs this role. It is composed of cells managing their uncertainty, and the aggregate of their operation and co-operation is what we think is thinking. Really, there's a lot of calcium and ATP which is pumped around. That's the work our cells do as they manage their uncertainty.

The same process occurs at different levels. The thing is fractal in much the same way that Stafford Beer described his Viable System Model. But we know a lot more about cells now than Beer did.

But what is the practical utility of a cellular view of consciousness?

Understanding that cells are managing uncertainty is only the beginning. More important is to realise that organisms and their cells have developed ("evolved") by absorbing parts of their environment as they have managed their uncertainty over their history. This absorption of the environment helps in the process of managing environmental uncertainty: uncertainty can only be managed if we understand the environment we are in. Importantly, though, each stage of adaptation entails a new level of accommodation with the environment: we move from one stable state to the next "higher" level. You might imagine a "table" of an increasingly sophisticated "alphabet" of cellular characteristics and capacities to survive in increasingly complex environments.

The cellular activity of "thinking", like all processes of adaptation, occurs in response to changes in the environment. It may be that an environment once conducive to higher-level "thought" becomes constrained in a way that forces cells back to a previous, simpler state of organisation in order to remain viable. It's a kind of regression - the kind that we see with intelligent people at the moment, paralysed by Brexit. In history, it is the thing that made good people do bad things in evil regimes. We become more primitive. Put a group of adults in a school classroom, and they will start to behave like children....!

Understanding this is important because we need to know how to go the other way - how to produce the conditions for increasing sophistication and richer adaptiveness. That is education's job. It is also the politician's job. But if we have a mistaken idea about consciousness, we are likely to believe that the way to increase adaptiveness is to do things which actually constrain it. This is austerity, and from there we descend back into the swamp.

Saturday, 16 March 2019

Depth in Thought: Cosmological perspectives

Jenny Mackness is writing some great blog posts on Iain McGilchrist at the moment. Her post today is on the dynamic relationship between what composer Pauline Oliveros called "attention" and "awareness", and McGilchrist's take on this. As Jenny points out, this is not an idea unique to McGilchrist, and others - particularly Marion Milner, who she mentions - have had a similar insight. Her previous post was on "depth" (https://jennymackness.wordpress.com/2019/03/07/the-meaning-of-depth-and-breadth-in-education/) and this is what I want to focus on.

McGilchrist's argument is based on a kind of updated bicamerality - not the rather crude distinctions about the "rational" left and "artistic" right, but a more sophisticated articulation of the way that attention and awareness work together. More importantly, he has pursued the social implications of his theory, suggesting that as a society we have created an environment within which attention is rewarded - particularly in the form of technology - and awareness and contemplation are confined to the shadows. There's a great RSA Animate video of his ideas.

There's much I agree with here. But something unnerves me in a similar way to previous theories of bicamerality like that of Julian Jaynes. Behind them all is the assumption that human consciousness is exceptional.

The problem is that "human exceptionalism" as biologist John Torday calls it, is a pretty devastating thing for the environment of everything - not just us. We think we're so great, so we have the arrogance to believe we know how to "fix" our problems. So we try to fix our problems - to treat our human problems as if they were technical problems (McGilchrist might say, to render the world in terms of the left hemisphere). And it doesn't work. It makes things worse. As an educational technologist, I see this every day. And I think if there is a "turn" in educational technology, it is that we once believed we could fix our problems with technology. Now we see that we've just made everything more complicated.

What if consciousness is not exceptional? We would first have to decide where it came from. Brains? Can we rule out consciousness in bacteria or plants? Eventually, we arrive at the cell. Brains are made from cells. In fact, recent research unpicking neural communication mechanisms (in which, I know, Antonio Damasio among others has been heavily involved) has discovered that non-synaptic communication exists alongside communication along what we have always imagined to be a dendritic "neural network".

Cells talk to each other all throughout nature. The way they talk concerns a process characterised as transduction: the balancing of messages and protein expression by DNA inside the cell with the reception of other proteins at the surface of the cell, in its environment. I find this fascinating because these transduction processes look remarkably like the psychodynamics of Freud and Jung. Is there a connection? Does our thinking go to the heart of our cells? (Or the cells of our heart?)

But there's more to this. One of the great mysteries of the cell is how it came to be as it is. Lynn Margulis's endosymbiotic theory suggests that all those mitochondria were once independent elements in the environment. Somehow an earlier version of the cell "decided" that it could organise itself better if it included those mitochondria within its own structure. At an evolutionary level, cooperation took the place of competition. As a basic principle, Torday argues that cells have always organised themselves according to the ambiguity of their environment. Consciousness is an emergent phenomenon arising from this process.

Each evolutionary stage moves from one state of homeostasis with the environment to another. Somehow, evolutionists tell us, we were once fish. Something happened to the swim bladder of the fish that turned it into the breathing organ we have in our chests. There must have been some kind of crisis which stimulated a fundamental change to cellular organisation.... and it stuck. Our conscious cells contain a myriad of vestigial fossils, of which the oldest is probably the cholesterol which allows my fingers to do this typing, and allows all of us to move about. In each of us is not only an operational mechanism which responds to immediate changes in its environment to maintain stability. In each cell is a history book, containing in a microcosm the millions of stages of endosymbiotic adaptation which took us to this point, and which we see in the physical and geological evidence around us. We really are stardust.

This isn't something that biologists alone are talking about. It coincides with physics. David Bohm talked about the difference between the surface, manifest features of the world as the "explicate order", and the deep coherent structure of the universe as the "implicate order". This implicate order, Bohm imagined, was a kind of hologram - or rather a "holo-movement" (because it is not fixed) - which acts as the root of everything. As a hologram, it has a fractal structure (holograms are a fractal encoding of the light interference patterns of 3D images). This means that within each cell is a copy of a self-similar pattern of the cosmos, formed through the evolutionary history book that cells contain. Each evolutionary stage of the cell, and each organisational configuration it forms (like the bicameral brain, bodies, fingers), is an expression of what the physicists call "broken symmetries" of its initial organisation. Our manifest consciousness - the ideas we share (like this one) - is such a manifestation of our cellular broken symmetries.

When we think deeply, we think WE are doing the work. But the work is done by our cells (particularly the calcium pumps). They think deeply. Their behaviour is an attempt to bring coherence to their environment, and the ultimate coherence is to return to their origin and to get closer to the implicate order. Deep thought is time-travel. This is why, I think, a philosopher like John Duns Scotus in the 13th century could have anticipated the logic of quantum mechanics. In our current society, deep thought is not impossible, but the institutional structures we established to help it arise (the universities) have largely been vandalised.

I share many of McGilchrist's concerns about the modern mind. But we need to look deeper than the brain. And we need to look deeper than us. I once asked Ernst von Glasersfeld, whose theory of Radical Constructivism has been very influential in education, where the desire to learn came from. It was all very well, I suggested, to say what we thought the learning process was. But we never say why it is we want to learn in the first place. He didn't have an answer. Now I can tentatively suggest one. We don't want to learn. But our cells, and we who are constituted by them, need to organise themselves in relation to an environment so that it is coherent. Our drive to learn is the cell's search for the implicate order at its origin. All we need to do is listen - but in today's world, that is getting hard.

Saturday, 9 March 2019

Implication-Realisation and the Entropic Structure of Everything

The basic structure of any sound is that it starts from nothing, becomes something, and then fades to nothing again. In terms of the flow of time, this is a process of an increase in entropy as the features of the note appear, a process of subtle variation around a stable point (the sustain of a note, vibrato, dynamics, etc) where entropy will decrease (because there is less variation than when the note first appeared), and finally an increase in entropy again when the note is released.
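One crude way to watch this profile is to synthesise a note, quantise its samples window by window, and measure the Shannon entropy of each window. This is a sketch under stated assumptions: the sample rate, envelope shape, window size and bin count are all arbitrary illustrative choices, and sample-value entropy is only one of many ways to operationalise the idea.

```python
import math
from collections import Counter

def window_entropy(samples, bins=16):
    """Shannon entropy (bits) of sample values quantised into bins over [-1, 1]."""
    width = 2.0 / bins
    quantised = [min(int((s + 1.0) / width), bins - 1) for s in samples]
    counts = Counter(quantised)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rate = 8000  # illustrative sample rate

def tone(n, amp, freq=440):
    """n samples of a sine wave at the given amplitude and frequency."""
    return [amp * math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

# silence -> attack -> sustain -> release -> silence
note = ([0.0] * 400
        + [s * (i / 400) for i, s in enumerate(tone(400, 1.0))]      # attack ramp
        + tone(1600, 0.9)                                            # sustain
        + [s * (1 - i / 400) for i, s in enumerate(tone(400, 0.9))]  # release decay
        + [0.0] * 400)

# trace entropy across the note: zero in silence, rising as the note appears
for start in range(0, len(note), 400):
    print(start, round(window_entropy(note[start:start + 400]), 2))
```

The silent windows sit in a single bin (entropy zero); the sounding windows spread across many bins. Richer measures of the "features" the text mentions (vibrato, dynamics) would need entropy over envelope or spectral descriptors rather than raw samples.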

A single note is rarely enough. It must be followed or accompanied by others. There is something in the process of the growth of a piece of music which entails an increase in the "alphabet" in the music. So we start with a single sound, and add new sounds, which add richness to the music. What determines the need for an increase in the alphabet of the sound?

In the Implication-Realisation theory of music of Eugene Narmour, there is a basic idea that if there is an A, there must be an A* which negates and complements it. What it doesn't say is that if the A* does not exactly match the A, then there is a need to create new dimensions. So we have A, B, A*, B*, AB and AB*. That is no longer as simple as a single note - for the completion of this alphabet, we require not only the increase and decrease of entropy in a single variable, but in another variable too, alongside an increase and decrease in entropy of the composite relations AB and AB*. The graph below shows the entropy of intervals in Bach's 3-part invention no. 9:

What happens when that alphabet is near-complete, but potentially not fully complete? We need a new dimension, C. So then we require A, A*, B, B*, AB, AB*, C, C*, AC, AC*, BC, BC*, ABC, ABC*. That requires a more complex set of increases and decreases of entropy to satisfy.
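The combinatorics of this growing alphabet are easy to enumerate. A minimal sketch, following the text's convention of starring only whole composites (so AB and AB*, but no doubly-starred forms like A*B):

```python
from itertools import combinations

def alphabet(dimensions):
    """Every composite symbol buildable from the dimensions, each paired
    with its starred complement: A/A*, AB/AB*, ABC/ABC*, and so on."""
    symbols = []
    for r in range(1, len(dimensions) + 1):
        for combo in combinations(dimensions, r):
            base = "".join(combo)
            symbols.append(base)
            symbols.append(base + "*")
    return symbols

print(alphabet(["A", "B"]))            # ['A', 'A*', 'B', 'B*', 'AB', 'AB*']
print(len(alphabet(["A", "B", "C"])))  # 14 symbols, matching the text's list
```

Each new dimension roughly doubles the alphabet ((2^n - 1) * 2 symbols for n dimensions), which suggests why completing it demands ever more complex coordinated swings of entropy.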

The relational values AB, AB*, AC, AC*, ABC, ABC* are particularly interesting because one way in which the entropy can increase for all of these at once is for the music to fall to silence. At that moment, all variables change at the same time. So music breathes in order to fulfil the logic of an increasing alphabet. In the end, everything falls into silence.

The actual empirical values for A, B and C might be very simple (rhythm, melody, harmony) etc. But equally, the most important feature of music is that new ideas emerge as composite features of basic variables - melodies, motivic patterns, and so on. So while at an early stage of the alphabet's emergence we might discern the entropy of notes, or intervals or rhythms, at a later stage, we might look for the repetition of patterns of intervals or rhythms.

It is fairly easy to first look for the entropy of a single interval, and then to look for the entropy of a pair of intervals, and so on. This is very similar to text analysis techniques which look for digrams and trigrams in a text (sequences of contiguous words).
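As a sketch of that analogy, the same entropy measure can be applied to n-grams of an interval sequence (the repeated three-interval motif here is invented for illustration):

```python
import math
from collections import Counter

def ngram_entropy(sequence, n):
    """Shannon entropy (bits) of the n-grams of a sequence:
    single items for n=1, digrams for n=2, trigrams for n=3."""
    grams = [tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)]
    counts = Counter(grams)
    total = len(grams)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A toy interval sequence (in semitones): one motif repeated eight times
motif = [2, 2, -4] * 8

print(ngram_entropy(motif, 1))  # entropy of single intervals
print(ngram_entropy(motif, 2))  # entropy of digrams (pairs of intervals)
print(ngram_entropy(motif, 3))  # entropy of trigrams
```

Note that once the n-gram length reaches the motif length, the entropy stops growing: the patterning of the music has been captured, which is the point of moving up from notes to composite features.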

However, music's harmonic dimension presents something different. One of the interesting features of its harmony is that the frequency spectrum itself has an entropy, and that across the flow of time, while there may be much melodic activity, the overtones may display more coherence across the piece. So, once again, there is another variable...
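A hedged sketch of that further variable: the entropy of a normalised power spectrum distinguishes a pure tone from a harmonically rich one. The naive DFT and the test signals are illustrative only (a real analysis would use an FFT library and windowing).

```python
import cmath
import math

def spectral_entropy(samples):
    """Shannon entropy (bits) of the normalised power spectrum,
    computed with a naive DFT (adequate for short illustrative signals)."""
    n = len(samples)
    power = []
    for k in range(n // 2):
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power.append(abs(coeff) ** 2)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    return -sum(p * math.log2(p) for p in probs)

n, rate = 256, 8000
# frequencies chosen to land exactly on DFT bins, avoiding leakage
pure = [math.sin(2 * math.pi * 500 * t / rate) for t in range(n)]
rich = [sum(math.sin(2 * math.pi * f * t / rate) for f in (500, 1000, 1500))
        for t in range(n)]

print(spectral_entropy(pure))  # energy in one band: low entropy
print(spectral_entropy(rich))  # energy spread over overtones: higher entropy
```

Tracking this value over successive windows of a piece would show whether the overtone structure stays coherent while the melodic surface churns, which is the observation the paragraph above makes.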

Tuesday, 26 February 2019

Dialectic and Timelessness

One of the great arguments in physics concerns the nature of time: is it real, or is it a fiction which we construct? Physicists like Lee Smolin argue that time is not only real but the foundation of every other physical process. Leonard Susskind upholds the "anthropic principle" - we make this stuff (time) up. Smolin's objection to this is that it is unfalsifiable (see https://www.edge.org/conversation/lee_smolin-leonard_susskind-smolin-vs-susskind-the-anthropic-principle).

I want to approach this from a different angle. Physics underpins biology in some way, and our biology appears to be the basis of our consciousness. Consciousness in turn is responsible for social goods and ills in the world, and these social goods and ills seem to be produced over time. Moreover consciousness gives us our ideas of physics and biology, and it allows us to create our institutions of science wherein those ideas are manufactured. To some extent, these ideas imprison us.

Our lives appear as an ecological ebb-and-flow of perceptions and events which from a broader vantage point look like what philosophers call "dialectic". For Marx and Hegel, dialectic is one of the fundamental constituents of reality - although Hegel's dialectic is an "ideal" one - it is ideas which oppose one another synthesising new ideas - whereas Marx's dialectic has to do with the fundamental material constitution of reality, which underpins social structures. The Marxist underpinnings are scientific - it is physics at the root. However, if time is not real, what happens to dialectic?

The intellectual challenge is this: imagine a timeless world where there is no past or future, but a whole and total "implicate order" from which we construct our "now" and our "then". In our constructing of a "now" and "then", we give ourselves the impression of a dialectical process, but actually this is an illusion which causes us to mistake the nature of reality, and in the process, leads to social ill.

So how might we re-conceive reality in a way that we don't impose an idealised dialectical process, but rather attempt to grasp the whole of time in one structure?

One of the problems is the hold that evolutionary theory has on us - and evolutionary theory was also influential on Marx. What if all the stages of evolution co-exist at any instant? It's not so difficult to imagine that the "you" that is now includes the "you" that was a child. But it's more challenging to think that the cells that make up "you" each include the cells that existed in the primordial soup of the beginning of life. If we accept this for a minute, then some interesting things emerge. For example, we might think of a dialectical process being involved in being struck by a bacterial infection, and fighting it off: in Hegelian language, thesis - healthy cells; antithesis - cells under bacterial attack; synthesis - healing, production of antibodies. This is time-based. But what if it is seen as a step-wise movement through a "table" of co-existing biological states?

Let's take our healthy cell as a stage in a "table" of evolutionary states. When the cell is attacked by bacteria, its physical constitution changes. In fact, it seems to regress to a previous stable evolutionary state (you might imagine the cell "moving" towards the left of a table). The healing process finds a path from this regressive state back to its original state - moving back towards the right. This is "dialectic" as a process of movement from one stable state to another - rather in the way that electrons shift from one energy band to another. John Torday remarked to me the other day that the cells of the emphysema lung become more like the cells of the lung of a frog. Disease is evolution in reverse.

So what about social disease? What about oppression or exploitation? If the free and enlightened human being exists on a table of possible states of being human (probably on the right hand side), and the slave exists on the left, how does this help us think about a dialectic of emancipation? Like the cell under attack, what pushes the cell to take an evolutionary step backwards is a threat in its environment (bacteria). What matters in both cases is the relationship to the environment:  the relationship between cells, and the relationship between people. In examining people at different stages of freedom, we are seeing different sets of relations with others. The pathology lies in the master-slave relation, not in the slave; health resides in the convivial openness of the enlightened person with all around them, not in the person themselves.

Marx's principal insight lay in the recognition that emancipation from slavery could arise from the organisation of the oppressed: workers of the world unite! The organisation of the oppressed might be seen as the creation of the conditions for growth from a basic state of evolution (slave) to a more advanced state. It is similar to the healing process of a wound. Marx's dialectic becomes a coordination between people where the collective management of the environment outweighs the pathological effects of that environment on any one individual. Each stage of development towards emancipation is a "stable state" which can be attained progressively with the production of the right conditions. Equally, evolution in reverse can be produced with the creation of negative conditions - for example, austerity.

Dialectic is not a temporal process: it is not a matter of "now" and "then". It is a process of structural alignment in a structure which simultaneously contains all possible states of increasingly sophisticated "wholes". Time is implicit in this structure. The better we understand the structure and how it affects the way we think, feel and act, the better our chances of survival in the future.