Tuesday, 27 July 2021

Challenging Perceptions of "Traditional University Education"

There are three critical and fundamental developments underway in higher education today:

  1. Confusion about the place of technology in education, which manifests as a crude "face-to-face vs online" discussion, but which reveals a much deeper confusion about education, learning and tool-use more generally.
  2. Confusion about the moral status of traditional disciplines in light of increasing awareness of their colonial roots, and acknowledgement of the effect that traditional disciplines (and those that uphold them) play in reproducing structural inequalities in society which have their roots in patriarchy and racism.
  3. Confusion about the purpose of education and its usefulness - made worse by the increasingly transactional nature of learning within most universities.

No discipline is safe from any of these. Technology has obviously made its presence felt over the last 18 months or so, and yes - we are all "zoomed out". But are we zoomed-out because of Zoom, or because of what we've done with Zoom? Even if the pandemic hadn't happened, we would probably have been staring at screens just as much as we did during it. Perhaps we weren't quite so rigidly staring into a camera resisting the temptation to pick our noses, but we were staring at screens nonetheless. So what was it about our perception that was different during the pandemic? What was different in our physiology? This is a question about perception - technology blurs the bounds of conventional categories.

Perceptions of the categories concerning the moral status of education are also changing. Black Lives Matter and MeToo have heralded fundamental changes to the way we look at institutions and the behaviour of people in them. What was it about the film industry that let Weinstein do what he did and get away with it? What is it about architecture education that allows for casual racism and sexism (as students at the Bartlett this week have revealed: Bartlett launches investigation after racism and sexism allegations (msn.com))? Why does traditional music education teach about a load of white male composers, or teach theories developed by other white men who had a vested interest in using their theory to promote the supremacy of their own culture? Some might be dismayed by the iconoclasm or the no-platforming - but iconoclasm always involves a fundamental perceptual shift. That's what matters.

Then there is the perception of the value of education itself. This is the difference between what is perceived in the university and what is perceived in the outside world. Do they fit? Does one prepare for the other? Or does the university simply take intelligent young people - who, if thrown into the world, would work it out without a university - and cosset them away from reality for a huge fee, stress them with pointless assignments and grade obsessions, and set them up with unreasonable expectations when they leave? This is going to become a bigger question as the world moves on and universities don't.

What are we to make of these perceptual shifts? What's to be done with the "traditional academy"?

At the root of all of this is the relation between perception and knowledge. Universities have never taken perception that seriously. Their game was always knowledge, because knowledge was measurable, structured and certifiable (even if such measurement is notoriously inaccurate). But knowledge sits on perception, and in the end, it is perception and its close cousin, "judgement", that matter.

I wonder if, despite its best efforts (for example, reflective learning - although hardly a success), traditional education can't enter this confused realm of perception and judgement. Perhaps only direct engagement with the real world can do that. And educationally, only personal experience and experiment with one's own life can really produce the kind of learning which can equip the young with sufficient flexibility and good judgement to navigate a world where the traditional categories are vapourising before us. To borrow from Marion Milner, we are all now living "A Life of One's Own", and the focus of our inquiry is not so much on mastering traditional disciplines as on finding enough space in our lives for individuation and creativity.

Monday, 26 July 2021

The Empirical Phenotype

One of the realisations that has been creeping up on me is that the view of ontology which I subscribed to for many years, Critical Realism, is the wrong way round. My accidental journey as an academic began 20 years ago in the University of Bolton, and one of the first things I did there was to explore the "ontology of technology". Clive Lawson held a conference in Cambridge on this topic which I didn't attend, but I read about it and subscribed to the "Realist Workshop". This set me reading the work of Roy Bhaskar - and that was where I got my ontology from.

In Bhaskar's ontology, "reality" is envisaged as a set of embedded layers. The "empirical" layer concerns that aspect of reality which is directly experienced. Obviously this includes the observations of scientists who discover laws of nature. The empirical in this scheme is the most circumscribed layer of reality. Above the empirical is what Bhaskar calls the "actual": this is the world of possible events, which may or may not be experienced, but which result from deeper mechanisms in nature which may be unknown to our scientific understanding. At the deepest level, the "real" is the totality of "generative mechanisms" which exist independently of human ability to observe them. Discovery of these generative mechanisms, which Bhaskar argues can be either "intransitive" (existing independently of human agency) or "transitive" (existing through human agency), is the point of inquiry in the physical and social sciences. It's through this basic scheme that Bhaskar builds an impressive argument to say that scientific inquiry and emancipatory politics are tied together: Critical Realism was a kind of lifeboat for Marxists disillusioned by what had transpired in the name of Marxism.

I liked this because I found that the emphasis on "mechanism" led to some powerful parallels between cybernetics and Critical Realism. However, the notion of mechanism is problematic. If we ask what a mechanism is, can we say that it is any more than an idea - something produced by an observer, not something inherent in a thing? This observer problem haunts Critical Realism as it does cybernetics.

If a mechanism is an idea, what is an idea? It seems that an idea arises (or is constructed) through a process of inquiry which is akin to a kind of "dance with nature". More importantly, there is a dance with some unknowable and ambiguous environment. As human beings, we know that we do this. But we also see similar dances going on in the microscopic world of our immune system, or in the fertilization of an egg, or the development of an embryo. It seems that everything dances - or at least, we know ourselves to dance, and perceive similar dances in our observations of nature. The resonance between self-knowledge and perception of nature is enough to justify the word "everything".

The fundamental nature of dance is experiment. But given that all living things dance, and that our own dance seems to relate to the dance of our own cells, we should ask:

  1. Who had the first dance?
  2. How does it all hold together?

It is surprising that the study of dance is not taken more seriously. Only Maxine Sheets-Johnstone has really made a profound contribution. But because everything dances, the study of any particular dance can shed light on other levels of dance. From the dance of white blood cells to a tango, or just the music of the tango, fundamentally there is resonance, and the resonance too is part of the dance. When we dance, we move together with something or someone. We create multiple descriptions of the same thing. But what's the point? 

If a dance didn't stop, we would be exhausted. Indeed, our exhaustion can be what brings things to a pause. Dances have a beginning and an end, and the middle is a progress from one to the other. In the end, there is no dance - nothing. Everything is in the process of creating nothing. And nothing is made with resonance - multiple descriptions of the same thing.

Making nothing is not a trivial thing. It requires the seeking-out of new descriptions which can resonate or interfere with existing descriptions so that eventually things "cancel out". I suspect this seeking out of new descriptions or differences is what might otherwise be called "agency" - it is what calls us to act. We are always being called to act to create a clearing, and our response can range from meditation to rage. 

Another word for "nothing" is equipoise. The dance of cells is driven by this principle. They seek equipoise with an ambiguous environment, maintaining their boundary and garnering sufficient energy to continue their empirical journey. This is the feature of any phenotype: as far as life is concerned, the first cell had the first dance.  And the biological phenotype gave us observation by virtue of the fundamental distinction of the boundary between itself and its environment. Observation always requires a boundary. Before the first cell, did anything observe?

But making nothing may well have been going on before this. Newton, Einstein and Dirac all amount to nothing - but we can only intuit that through our biological lens of thought. And, perhaps most importantly, that biological lens of thought refers back to the "first dance", as does all the dancing of every cell, organ, plant, bacteria, etc. 

And when we experiment, this is what we are dancing with: not only physical nature, but evolutionary history. So Bhaskar is wrong: the empirical is the most fundamental, most active layer of reality, which underpins the principle of making nothing. It is tied to physiology. We might imagine deeper mechanisms and make observations of matter through particle accelerators or (eventually) quantum computers - but our imagination sits on a physiology which is already doing the thing that we see in our physics. And the politics of science? A society which fails to create the conditions for making clearings in the minds of its members will destroy itself. It is education's job to do this - right now we fill people's minds with noise.

Monday, 28 June 2021

Creativity and Energy

One of the features of our online experience is its continuity: there's no space for breath. Personally, I find that endless continuity is very draining. I find myself in need of creating a clearing amidst a chaotic flow of events.

There is a connection between making a clearing and doing something new. A new departure, like all creative acts, must start from nothing. So making nothing before we do anything is the most important thing. In the digital flow of experience, making nothing is difficult. Perhaps we should turn everything off - but this is not the same as making nothing. I do wonder if creating nothing was easier in the past. Perhaps.

It is important to understand the dynamics of nothing. Nothing is not a state - it is a process. You can imagine that you are in a state of nothingness, but physiologically, your cells continue to reproduce themselves, your neurons pump calcium, you breathe, and so on. How is this state of nothing different from a state of something? 

The way I think about it is that getting into a nothing state is a process of tuning in to our most basic processes - the processes which in the beginning were the root of existence. The first cells, the quantum mechanics of the atoms which make up the substances from which our cells are made, the origin of the universe. Thinking is a manifestation of those processes. Nothing is where consciousness harmonises with those processes. Another way of thinking about this is to say that nothing is where energy manifests in its purest state.

The materialism of ordinary existence is the continual making of things around us - the thing from which we might wish to escape. If nothing is energy in its purest state - a harmony - materialism is disordered energy. Matter is an epiphenomenon of things not being tuned in. I find this a helpful perspective because it sees matter and energy on a continuum - from disorder to deep order.

Some aspects which we might consider material - for example, crystals or coral - reveal deep order to us. And some aspects of our scientific knowledge give us an insight into the processes of this deep order.  It is what Bohm called the "implicate order".

So to create, we have to become crystalline. Then we can make a start - with a mark.

So what happens then? In making a mark - it could be a line, words, some notes, etc - we start a new continuity. This is something which contains the seeds of its continuation. It takes energy to do this because what the energy must do is make a choice: this is part of the continuity and this isn't. And most continuities created like this quickly exhaust the energy that was the impetus to create them. We run out of steam.

But then the same thing must happen. We have a continuity which exhausts us from which we need to create a clearing. But if you cut a continuity, you create two continuities with nothing in between them. For some reason I find the act of cutting things and creating space between them gives me energy again. I guess if you cut an atom, you get energy. Perhaps cutting a continuity releases the energy inherent in it, and that then feeds a new process of making new marks in the clearing between them. Cutting a continuity creates a dynamic tension between the two parts too.

In this way, creation - like cells - grows from the middle-out. Cuts are made, clearings created, new continuities constructed. But the point of the whole is the creation of nothing. The point of a piece of music is the last note and then the silence. It is the making of a clearing for others. As things grow, nothing can be created not only by cutting and clearing, but by harmonising - which is a form of adding something to create nothing (no harmony is strictly necessary).

What does this all mean? 

Among our many problems, we have an education system which is materialist in its orientation, while it trumpets the virtue of creativity (often as "innovation") without really thinking about what this is. Creativity is not "accretion" - it is not the continual adding of new things on top of one another, in the way our journals now work. That just creates noise and is exhausting.

Creativity is harmonising with the processes of nothing. Some of our technologies are showing us how this principle might work (particularly AI which is a "nothing creating" technology). If we listened to what is happening, what we should do next would become much clearer to us. Perhaps the clearest signal is that learning is not "absorption", but creation. It's not that we learn mathematics, computing, chemistry or art. It is that we learn to generate nothing in the context of mathematics, computing, chemistry and art. 

Sunday, 13 June 2021

AI, Experiment and Pedagogy - Why we need to step back from the critical "Punch and Judy" battles and be scientific

There are some things going on around technology in education which I'm finding quite disturbing. Top of the list is the "Punch and Judy" battle going on between the promoters of AI, and the critics of AI. One way or another, it is in the interests of both parties that AI is talked about - for the promoters, they want to encourage big money investment (usually, but not always, for nefarious purposes); for the critics, they want a platform upon which they can build/enhance their academic reputations - what would they do without AI??

Nowhere, in either case, is there real intellectual curiosity about the technology. Both parties see it as a functionalist thing - either delivering "efficiency" (and presumably, profits), or delivering educational disaster. The former is possible, but a huge missed opportunity; the latter is unlikely because the technology is not what the critics imagine it is. In fact, the technology is very interesting in many ways, and if we were scientists, we would be taking an active interest in it.

As I have said before, "Artificial Intelligence" is a deeply misleading term. Machine learning is an exploitation of self-referential, recursive algorithms which display the property of an "anticipatory system": it predicts the likely categories of data it hasn't seen before, by virtue of being trained to construe fundamental features of each category. We have not had technology like this before - it is new. It is not a database, for example, which will return data that was placed in it in response to a query (although AI is a kind of evolution from a database).
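The anticipatory property described here - predicting the likely category of unseen data from features construed during training - can be sketched with a deliberately tiny classifier. This nearest-centroid model is my own illustration, not a description of any particular system: it learns a summary of each category from examples, then anticipates the category of points it has never seen.

```python
# A minimal "anticipatory" classifier: learn a summary (centroid) of
# each category, then predict the category of unseen points.
# Illustrative only - real machine learning learns far richer features.

from math import dist

def train(examples):
    """examples: dict mapping category -> list of (x, y) points."""
    centroids = {}
    for category, points in examples.items():
        n = len(points)
        centroids[category] = (sum(p[0] for p in points) / n,
                               sum(p[1] for p in points) / n)
    return centroids

def predict(centroids, point):
    """Anticipate the category of an unseen point: nearest centroid."""
    return min(centroids, key=lambda c: dist(centroids[c], point))

examples = {
    "low":  [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4)],
    "high": [(0.9, 0.8), (0.7, 0.9), (0.8, 0.7)],
}
model = train(examples)
print(predict(model, (0.85, 0.9)))  # an unseen point -> "high"
```

The point of the sketch is only that the model responds sensibly to data it was never shown - which is what distinguishes it from a database lookup.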

"Artificial Anticipatory Systems" are extremely important for reasons we haven't begun to fathom. The deep issue is that, like all biological systems, we are also "anticipatory systems". Moreover, the principles of anticipation in biological systems are remarkably similar to the principles of anticipatory systems in machine learning: both rely on vast amounts of "information redundancy" - that is, different descriptions of the same thing. Redundancy was identified by Gregory Bateson (long ago) as fundamental to meaning-making and cognition. Karl Pribram wrote a brilliant paper about the nature of redundancy and memory (see T-039.pdf (karlpribram.com)). Poets (Wallace Stevens in "The Necessary Angel"), musicians (Leonard B. Meyer), physicists (David Bohm) and many others have said the same thing about multiple descriptions and redundancy. How does it work? We don't know. But instead of using the opportunity to inspect the technology, foolish academics posture either trying to shoot the stuff down, or to wear it as a suit.
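A minimal illustration of how "different descriptions of the same thing" support meaning-making - a simple repetition code, which is my own toy example rather than Bateson's or Pribram's model: each bit is sent as three descriptions, and majority voting recovers the message even when individual descriptions are corrupted.

```python
# Redundancy as multiple descriptions of one message: a 3x repetition
# code lets a reader recover the message despite corrupted copies.

def encode(bits, copies=3):
    """Send several redundant descriptions of each bit."""
    return [b for b in bits for _ in range(copies)]

def decode(stream, copies=3):
    """Recover each bit by majority vote over its descriptions."""
    out = []
    for i in range(0, len(stream), copies):
        chunk = stream[i:i + copies]
        out.append(1 if sum(chunk) > copies // 2 else 0)
    return out

message = [1, 0, 1, 1]
sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] = 0                     # corrupt one description
sent[4] = 1                     # corrupt another
print(decode(sent) == message)  # -> True
```

No single description is trustworthy, but the ensemble of descriptions is - which is the structural point being made about both neural networks and biological anticipation.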

To hell with the lot of them!

I read a recently-published remark against empiricism itself the other day by a well-known and highly intelligent scholar. The argument was basically that "flat earth" campaigners (and other conspiracy theorists) were empiricists because they appealed to simple observations. What was needed in place of this "empiricism" was the carefully constructed critical argument of the social science discourse. 

I think I partly blame the philosophy of "Critical Realism" for this (and I speak as someone who for a long time had a lot of time for CR). Roy Bhaskar makes a distinction between the "empirical", the "actual" and the "real", arguing that the empirical is the most constrained situation, because it involves observation of events (typically, but not exclusively, through artificial closed-system experiments designed to produce observable and reproducible successions of events). The actual, by contrast, considers events that might occur, but might not be observed. The real, finally, involves the world as it is beyond human perception - what Bhaskar considers to be the result of "generative mechanisms".

Now what's wrong with this deflating of empiricism? The real problem arises because Bhaskar bases his arguments on a particular reading of Hume's scientific theory which suggests that science results from experiments producing regular events, and scientists constructing causes to explain these events. Hume's position is unfortunately misconstrued by many as a defence of a naive mind-independent empirical reality (which was the opposite of what Hume was really saying), but Bhaskar's point is to say that Hume was wrong in saying causes were constructs. [The tortuous complexity of these arguments exhausts me!] However, behind all this is a deeper problem in that experiment is seen as a thoroughly rational and cognitive operation - which it almost certainly is not. Moreover, this cognitive and rational view becomes embedded in the kind of authoritarian "school science" that we all remember. 

The flat earthers are not empirical. They are authoritarian - borrowing the cognitive misinterpretation of science from the schoolroom to make their points.  

As physiological entities, scientists are engaged in something much more subtle when doing experiments. Science is really a "dance with nature" - a process of coordinating a set of actions against a set of unknown constraints from nature. Producing regular successions of events is a way of codifying some of the constraints that might be uncovered, but that in a way is really an epiphenomenon of the empirical enterprise. Codification is important for reasons of social status in science, and perhaps for social coordination (if it is codification of the genome of a virus, for example). But it is not what drives the empirical effort. That is driven by continually asking new questions, and making new interventions to get ever-richer versions of reality. The drive for this curiosity may be to do with evolution, or energy and negentropy. As David Bohm pointed out, scientific understanding is rather like a continual accretion of multiple descriptions of nature. It is also about redundancy.

This is why I find the machine learning thing so important, and why the ridiculous posturing around it drives me crazy. This is a technology which embodies (an inappropriate term, of course) a principle which lies at the heart of our sense-making of the world. Studying it will shed light on some deep mysteries of consciousness and learning, and our relationship with technology. As we move closer to quantum computing, and closer towards being able to study nature "in the raw", some of the insights from the current development phase of machine learning will provide a useful compass for future inquiries. They are, I'm sure, related. 

It may be more of a tragedy that the critics of AI are not scientists than that many of the promoters of AI in education are criminals.

Saturday, 29 May 2021

What is happening with "digitalization" in Education?

I am currently involved in a large-scale project on digitalization. The aim of the project is to instil practices relating to the manipulation of data, coding, algorithmic thinking and creativity throughout the curriculum in the University. While this appears, on the one hand, an attempt to reignite the "everyone should code" kind of stuff, something is clearly happening with the technology which is necessitating a reconfiguration of the activities of education with the activities of society. 

As with many big structural changes to education, there are already quite a few signs of changes in practice in the University: many courses in the humanities and sciences are using programming techniques, often in R and Python. My university has established faculty-based "data centres" - rather like centres for supporting e-learning - which provide services in analysis and visualisation. However, across the sector, there is as yet little coherence in approaches. It is rather like the situation with teachers using the web in the late 90s, where enthusiastic and forward-thinking teachers would put their content on websites or serve it from institutional machines. The arrival of VLEs codified these practices and coordinated expectations of staff, students and managers. This is what is likely to happen again, but instead of codifying the means of disseminating content, it will codify programming practices across different aspects of the curriculum.

There is a further implication of this, however, which has to do with the nature of disciplines and their separation from one another. One of the reasons why digitalization has such a hold in education at the moment is the dominance of digitalization in industry across all sectors. Where sectors might have distinguished themselves according to expectations formed around concepts, products and markets, increasingly we are seeing coordination of industrial activities around practices and processes. This has been slowly happening for the last 20 years or so, but is evidenced by the way that industries are realigning themselves by synergising practices and technologies across different fields of activity. Think of Amazon. This has been coupled with increasing "institutional isomorphism" in the management of institutions across the board. This has produced many problems in institutional organisation - partly because the old identities of institutions have been torn up and new identities imposed which, although they exploit the new technologies available, almost always reinforce and amplify the hierarchies and inequalities of the old institution.

With this in mind, this next phase of digitalization is going to be very interesting. The old hierarchies of the university are established around academic departments and subjects. These are basically codified around concepts which, within academia, operate to define and redefine themselves in contradistinction to one another. This is not to say that interdisciplinarity isn't something that's emerged: obviously we have things like biochemistry or quantum computing, but even within these new fields which appear interdisciplinary, the codification around concepts is the central mechanism which provides coherence. Look, for example, at how academic communities fracture and form tribes: not just the mutual antipathy between psychology and sociology, but between "code biology" and "biosemiotics", heterodox vs classical economics, etc. A lot of this kind of division has to do not just with disciplinary identity, but personal identity. Concepts are tools for amplifying the ego (am I not doing it here?), and the principal mechanism for this process has been the way we conduct scientific communication. 

Digitalization means that increasingly we are going to see research and learning coordinated around practices with tools. This is a more fundamental change to what is loosely called the "knowledge economy". It won't be enough to simply name a concept. We will need to show how what is represented by a concept actually works. Argument will be increasingly embellished with concrete examples, some in code, all presented in a way in which mechanisms can be communicated, experimented with, applied to new data, refined, and continually tested. More importantly, because these practices become common, and because practices supersede concepts in scientific inquiry, the traditional distinctions between disciplines will be harder to defend. This will produce organisational difficulties for traditional institutions in which disciplines will perceive threats in the digitalization process and seek to defend themselves.
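As a toy sketch of this shift, a named concept like "compound growth" can be shipped as a runnable mechanism rather than a bare term. The figures below are purely illustrative assumptions of mine, chosen only to show the pattern of concept-as-testable-code:

```python
# A concept expressed as a mechanism a reader can run, vary and re-test,
# rather than a term to be cited. All figures are illustrative.

def linear(start, step, years):
    """A linear trajectory: fixed increment per year."""
    return start + step * years

def compound(start, rate, years):
    """A compounding trajectory: proportional growth per year."""
    value = start
    for _ in range(years):
        value *= 1 + rate
    return value

# With these figures, compounding overtakes the linear path by year 20.
print(compound(100, 0.07, 20) > linear(100, 10, 20))  # -> True
```

A reader who doubts the claim doesn't argue with the name of the concept; they change the parameters and re-run it - which is exactly the mode of scholarly exchange being described.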

Another threat may come in the form of what might be called the "status machine" of the university. Concepts don't only codify a discipline, they codify the status of those who rise to positions where they can declare concepts (what Searle, who is not alone in pointing out this mechanism, calls "deontic power").  While new practices are codified in a similar way, practices are only powerful if they are adopted widely, and in being adopted, they are continually adjusted. Eventually we don't care about the concept or who thought of it, but about being part of the game which is developing and upholding a common set of practices. The operating system Linux is a good example: nobody really cares about who invented it; but we do care about using and developing it. We can start to make a list of similar practices which fit this model: computer languages, online programming environments, visualisation tools, etc.

But the university is a "status machine": its business ultimately lies in selling certificates and through codifying status. So if it comes to be about practice rather than status, what does the University then do? New forms of personal status codification are emerging. The online machine learning competition site Kaggle, for example, provides opportunities to do real and meaningful data analytic activities. Winning a Kaggle competition is an important marker of personal status carrying more meaning than a degree certificate because it demonstrates what someone can actually do, with references to things that were actually done. But Kaggle does not lock its status mechanisms behind the expensive closed system of an institution: it is open and free, funded by the fact that the fruits of intellectual labour become the property of Kaggle (and by extension, Google). Intellectual activity given to the platform is exchanged for status enhancement. It is in many ways an extension of the Web 2.0 business model with some important differences.

What happens in Kaggle educationally is particularly interesting. Kaggle teaches simply by providing a network of people all engaging in the same activities and addressing the same, or related, problems. There is no curriculum. There are emerging areas of special interest, techniques, etc. But nothing codified by a committee. And it exists in the ecosystem of the web which includes not only what Kaggle does, but what StackOverflow does, or anything else that can be found on Google. Human activity contributes to this knowledge base, which in turn develops practices and techniques. Learners are effectively enlisted as apprentices in this process. Experts, meanwhile, will go on to exploit their knowledge in new startups, or other industrial projects, often continually engaging with Kaggle as a way of keeping themselves up-to-date.

The University Professor, meanwhile, has become both increasingly managerial and increasingly status-hungry as they seek the deontic power to declare concepts, or make managerial things happen ("we should restructure our University" is a common professorial refrain!). But increasingly (and partly because there are so many of the bastards now), nobody is really interested - apart from those who will lose their jobs as a result of professor x. We just end up with a lot of mini-Trumps. Deontic power doesn't work if nobody believes you, and it doesn't do any good if, even when they listen to you, they merely repeat the conceptualisations you claim (but with different understandings). The academic publishing game has become very much about saying more and more about less and less, where each professorial utterance merely adds to a confusing noise that only benefits publishers.

Kaggle shows us that we don't need professors. There will always be "elders" or "experts" who have more skin in the game and know how to use the tools well, or to apply deeper thinking. But it is not about leading through trying to coin some attractive neologism.  It is about leading through practice and example. 

Here we come to the root of the organisational challenge in the modern university. Their layers of management are not full of people leading by example with deep skill in the use of digital tools. They are full of people who postured with concepts. And yet, these are the people who have to change as the next wave of digitalization sweeps over us. I suspect it's not going to be an easy ride.

Thursday, 6 May 2021

Technology, Conversation and Maturana

I wrote this last month for the Post-Pandemic University blog (see Technology and Conversation – The post-pandemic university (postpandemicuniversity.net)). Maturana was on my mind. I saw him speak at the American Society for Cybernetics conference in Asilomar, CA in 2012. There were many other cybernetics luminaries there for the Bateson celebration "An Ecology of Ideas" (see Microsoft Word - all.docx (asc-cybernetics.org))  I also remembered from that event Jerry Brown's motorcade arriving (he was a Bateson student), Nora Bateson's film, Graham Barnes's talk ("How loving is your world?"... Graham also died recently), Terry Deacon's talk and Klaus Krippendorff's birthday celebration. It was quite an event.  

I don't remember an awful lot of Maturana's talk except for a remark he made about learning in response to a question: "What we learn, we learn about each other". 

That deserves a huge YES!

So here's my postpandemic piece. And that comment from Maturana about learning runs all the way through it.


Biologist Humberto Maturana once wrote a poem called “The Student’s Prayer” in response to the unhappiness of his son in school. It goes:

Don't impose on me what you know,
I want to explore the unknown
And be the source of my own discoveries.
Let the known be my liberation, not my slavery.
The world of your truth can be my limitation;
Your wisdom my negation.
Don't instruct me; let's walk together.
Let my richness begin where yours ends.
Show me so that I can stand
On your shoulders.
Reveal yourself so that I can be
Something different.
You believe that every human being
Can love and create.
I understand, then, your fear
When I asked you to live according to your wisdom.
You will not know who I am
By listening to yourself.
Don't instruct me; let me be
Your failure is that I be identical to you.

Maturana’s poem speaks of the importance of exploration and conversation in learning – what he calls “walking together”. Taken literally, conversation is actually “dancing together” because the Latin “con-versare” means “to turn together”. I find this a useful starting point for thinking through the confusing categories by which we distinguish online activities and artefacts from face-to-face engagements. Anyone who has danced with anyone else knows that it doesn’t work by one person imposing something on the other. It does require “leading”, but the leader of the dance engages in a kind of steering which takes into account the dynamics of the whole situation including themselves and their partner.

What happens in this steering process is also revealed in “conversation”: it is a negotiation of constraints – “this is how we can move”, “this is how I am able to move”, and so on. Like dancing, conversation is not about imposition. In a conversation, participants reveal their understanding and their uncertainty through the many utterances that they make. Those utterances are multiple attempts to describe something which lies beyond description. But taken together, something is revealed, and if it works, like the dancer and their partner, each person becomes a different version of the same thing – rather like a counter-melody to a familiar tune. A richer reality emerges through the counterpoint of multiple descriptions.  

This understanding of conversation is the antithesis of the increasingly transactional way in which the education system seems to view the “delivery” of education. This is not merely an exchange of words, essays, text messages, tweets, or blog posts. It is a coordination. Or perhaps more deeply (to borrow some terminology from Maturana) it is a “coordination of coordinations”.

When I try to explain this, I sometimes use some software called “Friture” which graphs the spectrum of sound in real-time. You can download the software here (http://friture.org). I ask people to sing a single note (or at least try) and capture it on the computer. The resulting spectrum shows a set of parallel lines representing the many frequencies which combine in making the single sound. Conversation is like this, I say.

Indeed, we can explore things further with sound. If you sing the note while gradually changing the shape of your mouth to make the different vowel sounds, the number of lines decreases and increases. The narrow “e” sounds are rather like a tinny transistor radio. The fuller “ah” sounds are more “hi-fi” and rich. So the more simultaneous versions of the same thing, the more “real” it feels. Try it!
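Friture itself is a GUI application, but the spectral picture it shows can be sketched in a few lines of numpy. The sketch below synthesizes a vowel-like note (the 220 Hz fundamental and the harmonic amplitudes are invented for illustration - a real voice supplies its own, shaped by the vowel) and then finds the "parallel lines" in its spectrum:

```python
import numpy as np

# One second of a sung note: a 220 Hz fundamental plus harmonics at
# integer multiples - the "parallel lines" on a spectrum display.
# Amplitudes are invented for illustration; an open "ah" would have many
# strong harmonics, a narrow "e" far fewer.
sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
fundamental = 220.0
amplitudes = [1.0, 0.6, 0.4, 0.25, 0.15]
signal = sum(a * np.sin(2 * np.pi * fundamental * (n + 1) * t)
             for n, a in enumerate(amplitudes))

# Take the spectrum and keep only the prominent peaks: they fall at
# 220, 440, 660 ... Hz - one "line" per simultaneous version of the note.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)
peak_freqs = freqs[spectrum > 0.1 * spectrum.max()]
print(np.round(peak_freqs))  # multiples of 220 Hz
```

Singing with fewer harmonics (a narrower vowel) simply shortens the `amplitudes` list - and the spectrum thins out accordingly, just as it does in Friture.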

The message is that our grasp on reality and the effectiveness of our social coordination requires the coordination of diverse voices – and that is what conversation is about. The physicist David Bohm, who made the connection between a view of quantum mechanics and scientific dialogue, explained it more elegantly here: David Bohm on perception – YouTube. And there is a political message: the richness in our understanding of reality entails the conditions of a free society which embraces diversity and creates the conditions for conversation: that is, one that doesn’t impose one particular description of the world on everyone else. That merely produces the tinniest of transistor radios!

Technically, in the world of information theory, multiple descriptions of the same thing are termed “redundancy”, which is another word for “pattern”. This is useful when we try to make sense of the relationship between the conversations that we have face-to-face, and the phenomena that we experience online. 
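In Shannon's terms, redundancy measures how far a message falls short of the maximum entropy its alphabet allows: the shortfall is pattern. The toy sketch below computes this from single-character frequencies only (real linguistic redundancy involves longer-range patterns too, so this is an illustration, not a serious measure):

```python
import math
from collections import Counter

def redundancy(message: str) -> float:
    """Shannon redundancy R = 1 - H/H_max for a string.
    0 means no pattern (every symbol equally surprising);
    values near 1 mean heavy repetition - 'multiple descriptions
    of the same thing'."""
    counts = Counter(message)
    total = len(message)
    # Entropy of the observed symbol frequencies, in bits per symbol.
    entropy = -sum((c / total) * math.log2(c / total)
                   for c in counts.values())
    # Maximum entropy: all distinct symbols used equally often.
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return 1 - entropy / max_entropy

print(redundancy("abcdefgh"))             # 0.0 - no pattern at all
print(round(redundancy("aaaaaaab"), 3))   # 0.456 - mostly repetition
```

A flat enumeration of distinct symbols carries no redundancy; a repetitive, patterned utterance carries a lot - which is why repetition in conversation is not waste but the very thing that lets meaning settle.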

The Internet’s Multiplicity

The internet has vastly expanded the multiplicity of descriptions of the world. Does the internet dance in much the same way that our face-to-face conversation dances? I think it does, but to understand how it does, we have to look a bit more deeply at the kinds of multiplicity involved in all conversation. 

The internet produces its multiplicities differently. Face-to-face conversation is composed of gestures, words, phonemes, and prosody – we wave our arms, use our eyes, change the pitch of our voice, and often repeat ourselves. The repetition we might think of as a “diachronic” (over time) redundancy; the arm-waving, voice pitch, gestural stuff is “synchronic” (simultaneous). Returning to the sound spectrum analyzer, the parallel lines identify the synchronic aspects, while if we were to sing a melody, the changing pattern over time represents the diachronic dimension. 

So what if the balance between synchronic redundancy and diachronic redundancy can be shifted around? What if parts of what is synchronic, become diachronic? Isn’t that what happens on the internet? 

Our snippets of text, video, emails, game plays, hyperlinks, blogs, timelines, likes, shares and status updates do not happen at the same time. While some of them (like video) contain rich synchronic aspects similar to face-to-face engagement, and text itself is a remarkably rich synchronic medium (without which poetry wouldn’t exist!), much of the multiplicity (or the redundancy) occurs diachronically as well as synchronically. The timeline matters; the concern for a particular individual’s understanding matters. And we may never meet somebody face-to-face, but following them on Twitter might mean that we get to know them as if we had, and perhaps a bit better.

Why don’t your Zoom lectures Dance?

So why, when it comes to education online, does so much seem deathly? Why doesn’t your Zoom lecture dance? If the internet dances, why can’t education join in?

To answer this, we have to examine education’s constraints. And here we meet the very things that Maturana was railing against. Why is it so dreadful? Because the basic function of the system is to impose on students what is already known, examine them and certificate them. It instructs, reproduces and fails (at least, in Maturana’s terms): more goose-step than dance. 

There are of course reasons why this is so. After all, how would a meaningful assessment system operate if learners were allowed to be different from one another or do completely different kinds of activities? Well-intentioned though ideologies like “constructive alignment” are, inevitably they get used to hammer abstract “learning outcomes” into students in the same way that we hammer in facts. In short, online learning is crap not because of technology, but because of the constraints of the institution. But our institutions took their form in a world where our available tools were limited, meaning that this was the most effective way to organise education at the time. If we started from scratch today, with the tools that we now have, might our institutions look and behave very differently?

We have a vestigial education system which increasingly insists on a transactional “delivery of learning” and its measurement. Shifting this ideology online brings the added disadvantage that the internet does not afford the same synchronic richness as face-to-face situations (which at least mitigates the pain of instruction), while the education system cannot adapt to the internet’s rich diachronic mode of operation.

Dancing on Stilts

But it was always obvious to the pioneers of technology in education that learning with technology was a different kind of dance. One would end up dancing on stilts or trying to play Mozart wearing mittens if one insisted on reproducing established ways of institutional education online. 

In a very revealing passage explaining his core idea of “teach back” (where a teacher would ask a learner to teach back what they had learnt), Gordon Pask noted something fundamental in the patterns of teaching and learning processes that chimes with Maturana’s poem:

“The crucial point is that the student’s explanation and the teacher’s explanation need not be, and usually are not, identical. The student invents an explanation of his own and justifies it by an explanation of how he arrived at it” (Pask 1975)

What Pask argued was that it was the redundancy of the interactions that mattered in the dance. 

Technology and Institutional Structure

Now that we have amazing technology, this redundancy can come in many forms and many different kinds of media: videos, blogs, social media interactions, and so on. And yet within the context of formal education, we rarely harness this diversity because it presents organisational problems in the assessment and management of the formal processes of education. The root cause of why we dance on stilts lies in the structures of education, not in any particular pedagogy or “ed-tech”. 

This is the paradox of the current state of the uses and abuses of technology in education. The need for technical innovation in education lies in the use of technology to reform the management and structures of education which constrain teachers and learners to such an extent that it makes online education unbearable. The actuality of “ed-tech” innovation in education lies in corporations feeding on the obvious inadequacies of online learning, looking for a chunk of the enormous sums of money going into education, and pitching for minor improvements to fundamentally broken processes, while often burdening institutions with increasingly complex technical infrastructure and expensive subscriptions. 

There is hope. The internet really does dance, and the students are getting increasingly good at using it (and indeed, some teachers!). Educational institutions are like a Soviet-style old-guard in a rock-and-roll world. Technology is the lubricant that will eventually free everything up – but the old guard will be slow to shift. 

We have to decide what our educational institutions are for. Are they there to “deliver learning” and make profits and pay vice-chancellors obscene salaries? Or are they there for creating the contexts for new conversations? It is a fork in the road. One way lies a future of ed-tech which feeds on the inadequacies of the existing system like a parasite. The other lies a future of technology being used to transform the self-steering both of institutions and individuals in their dances with each other and society.

Thursday, 29 April 2021

Real "Digital" vs Education's idea of "digital": Some reflections on computational thinking

Digitalization is (once again) the hot topic in education. Amid concern that students leave university without digital skill, educational policy is focusing on "instilling digital skill" from primary school upwards. In Europe and the US, this is labelled "computational thinking", and is closely related to the (rapidly failing) drive to push computer science in schools.

Rather like the STEM agenda, to which it is of course related, there is a difference between education's idea of "digital" and the real world of "digital" which is happening in institutions and companies, for which skills are definitely needed. 

What is the real world of digital? Perhaps the first thing to say is that there is no single "real world". There are dispositions which are shared by technical people working in a variety of different environments. And there are vast differences between the kinds of environments and the levels of skill involved. For example, Python programming to analyse data is one thing, using tools like Tableau is another. There are the hard-core software development skills involved in enterprise system development with various frameworks (I'm banging my head against Eclipse, Liferay and Docker at the moment), and then there are those areas of skill which relate to the sexier things in technology which grab headlines and make policymakers worry that there is a skills gap - AI particularly.

So what do governments and policy makers really mean when they urge everyone towards "digitalization"? After all, engineering is pretty important in the world, but we don't insist on everyone learning engineering. So why computational thinking? 

Part of the answer lies in the simple fact of the number of areas of work where "digital" dominates. The thinking is that "digital skill" is like "reading" - a form of literacy. But is digital skill like reading and writing? Reading, after all, isn't merely a function which enables people to work. It is embedded in culture as a source of pleasure, conviviality, and conversation. By contrast "digital skill" is very pale and overtly functionalist in a way that reading and writing aren't.  

The functionalism that sits behind computational thinking seems particularly hollow. These are, after all, digital skills to enable people to work. But to work where? Yes, there is a need for technically skilled people in organisations - but how many? How many software developers do we need? How many data analysts? Not a huge number compared to the number of people, I would guess. So what does everyone else do? They click on buttons in tracker apps that monitor their work movements, they comply with surveillance requests, they complete mindless compulsory "training" so that their employers don't get sued, they sit on Zoom, they submit ongoing logs of their activities on computers in their moments of rest, they post inane comments on social media and they end up emptied and dehumanized - the pushers of endless online transactions. Not exactly a sales pitch. Personally, I would be left wishing I'd done the engineering course!

A more fundamental problem is that most organisations have more technically-skilled people than they either know about, or choose to use effectively. This is a more serious and structural problem. It is because people who are really good at "digital" (whatever that means) are creative. And the last thing many organisations (or many senior managers in organisations) want is creativity. They want compliance, not creativity. They want someone who doesn't show them up as being less technically skilled. And often they act to suppress creativity and those with skills, giving them tasks that are well beneath their abilities. I don't think there's a single organisation anywhere where some of this isn't going on. Real digital skill is a threat to hierarchies, and hierarchies kick back.  

Educational agendas like computational thinking are metasystemic interventions. Other metasystemic interventions are things like quality controls and standards, curricula, monitoring and approved technical systems. The point of a metasystemic intervention is to manage the uncertainty of the system. Every system has uncertainty because every system draws a distinction between itself and the environment - and there is always a question as to where that boundary should be drawn, and how it can be maintained. The computational thinking agenda is an attempt to maintain an already-existing boundary.

Our deep problem is that the boundary between all institutions, companies and other social activities and their environments has been upheld and reinforced with the increasing use of technology. It is technology in the environment of these institutions that has driven them to reinforce their boundaries with technology in the first place. Technology is in the very fabric of the machine that maintains the institutions we have - institutions which have themselves used technology to avoid being reconstructed. The problem institutions face is that in order to maintain their traditional boundaries they must be able to maintain their technologies. They therefore need everyone to comply with and operate their technologies, and a few to enhance them. But how does this not end in over-specialisation and slavery? How does it create rewarding work and nurture creativity?

No education system and no teacher should be in the business of preparing people for servitude. So what's to be done?

The question is certainly not about digital "literacy". It is about emancipation, individuation and conviviality in a technological environment. Our technologies are important here - particularly (I think) AI and quantum computing. But they are important because they can help us redesign our institutions, and in the process discover ourselves. That, I suspect, is not what the policy makers want because ultimately it will threaten their position. But it is what needs to happen.