Tuesday 5 October 2021

On Smartphones, Sandwiches and Teddy Bears: Energy flow in the Academy

I had an interesting discussion today about the nature of technology. It partly revolved around Latourian interpretations of agency, with the kind of criss-cross of concrete examples and theory which is particularly enjoyable when talking about technology. The "is x technology?" game is fun, as is its companion, "does x act in a network?". It's perhaps defensible to play this game with a smartphone or computer, but harder with concrete examples like food or teddy bears. 

Does your carrot act on you? Is your sandwich technology? Is this even a useful question? Well - it is if it exposes some fundamental weaknesses in social theory to which we are otherwise blind, and which, if overturned, change our perspective on other categories of understanding. 

In Latour's work, the sociomaterial co-construction of tools and humans arising through the emergent interaction of human and non-human "actors" is pivotal to an analytical approach which seeks to map out these networks and their dynamics. It's an idea which relates partly to systems-theoretical, process-oriented ontologies, and to mapping techniques which have become increasingly available in the wake of computers. 

I've never been particularly comfortable with the concept of "agency", and particularly with the concept of non-human agency. Apart from anything else, what is conceived as non-human agency seems really to be human agency at a distance. But then there is a question as to whether one can even draw a distinction around human agency itself. To what extent is the concept of agency meaningful - when do we "not-act"? Is thinking agency? Even when people's physical liberty is constrained, they are still able to think. Anna Akhmatova's composing of poems in her head, committed to memory for fear of writing anything down, is agency, isn't it? 

For a number of reasons, I wonder if "energy" may provide a better way of thinking. It is useful to think in terms of energy because when education loses energy, it is not very good - irrespective of the agency involved. Moreover, when we act, we are involved in some kind of "energy exchange": Akhmatova's organising of her poems required energy in her body; participating in a conversation requires energy; depression and other mental health problems drain us of energy; and all educational development is the realising of "potential"... do we in fact miss the word "energy" from that? Are we really about unlocking the "potential energy" for future transformation that could be exercised by a student?

Discussing controversial objects as "technologies" also presents a further case for thinking in terms of energy. For example, is a sandwich technology? Irrespective of the fact that human agency is involved in its construction, what is a sandwich but a container of energy? The same can be said of a carrot (and ultimately carrots and sandwiches get their energy from the sun). I find this interesting because seeing a sandwich as an energy container then throws the spotlight on the eater of the sandwich. The seeking out of food, the moving of the jaw, salivation, etc all require energy. This latter energy is physiological, while the energy in the sandwich is "potential" (for want of a better word). But this is not a "gaining of energy" by eating the sandwich; it is a transformation of one kind of energy (physiological) into another (the sandwich's), which in turn is transformed by metabolism (another process requiring energy) back into physiology. Across the interaction between the eater and the sandwich, total energy is conserved, but transformed from one form into another. The sandwich is a transducer.

What about a child's teddy bear? That is even more interesting, I think. My daughter raised this with me when she was about 6 or 7 (she's now 21!). The child's reaction to her teddy bear is to expend energy on it: she hugs it, maybe talks to it, is concerned for it, she invents a world for it to exist in. She draws an imaginary distinction about the teddy bear as a "person" in her life. So what is happening energetically? It is as if the teddy bear is transforming energy within the child. Hugging requires the expenditure of energy. But what is gained is epigenetic information about the environment. The smell of the teddy and the release of oxytocin (the "hug hormone") are all critically important features of this interaction. It is an energy transformation in which energy is expended by the child and information about the environment is returned. The results are new distinctions, new actions, new (imaginary) conversations - and indeed some real conversations (when the parent asks where the teddy bear is). 

In the philosophy of Gilbert Simondon, all technologies are transducers. They exist at the boundaries of our interaction with an uncertain environment, what Simondon calls the "margin of indeterminacy between two domains ... that which brings potential energy to its actualization". Moreover, life at all levels is made up of "margins of indeterminacy": the relationship between cells and their extra-cellular matrix, the functional differentiation of the organs of the body, organisms in an epigenetic environment, the boundaries of social institutions, the interfaces of technologies, or concepts of personal identity. And the point about technology as transduction is that this "interface" that I perceive between myself and my computer is a process of energy conversion which is connected to every other process of energy conversion in my body and the universe. 

The physicists tell us that energy is conserved. They also tell us (at least the quantum mechanics people) that there is a dynamic balance between local and non-local phenomena. What happens at the boundary is a transformation of one form of energy into another which amounts to the same quantity: the total energy in the system is preserved. In biology, we may see this energy transformation in the form of a balance between the energetic processes inside a cell which lead to protein production by DNA, alongside the epigenetic environment of the cell and its communication and organisation with other cells. It is the cell boundary which serves as a transducer (which is, of course, exactly how the cell biologists describe it: Signal transduction - Wikipedia)

Transduction is both the process of converting energy from one form to another, and the process of identifying the boundary across which transduction occurs. Thought generates transducers in the form of concepts. Akhmatova generated her poetry in her head, and as she did so, she created herself - what Simondon calls "individuation". Is there any real difference between the transducers in our heads, our concepts, and the transducers we type at and browse the web with? I doubt it. Gordon Pask was probably the first to think this: he saw concepts and objects in a very similar way - the results of self-referential processes, what von Foerster called "Eigenforms". Simondon's transduction is very similar. 

What this means is that we are looking at technology wrong. If we see tools as "objects" we will miss what is actually happening, and if we fail to understand what is actually happening in terms of energy, we will not be able to control effective energy flows. That is basically what has happened in education and technology. It is (yet another) explanation for why Zoom education is usually crap! It also helps us to explain what happens when we become "addicted" to social media: transductions can be highly inefficient, creating demand for more engagement which in turn is increasingly unrewarding.

However, if we see energy manifesting and transforming in human relations through tools, language, interfaces and even sandwiches, then we can gain better and more effective control of ways in which the "potential" (energy) of every individual in education can be realised. 

Sunday 26 September 2021

Digital Endosymbiosis

Digital Endosymbiosis is a realignment of universals and particulars between the activities that take place in an institution and those which take place outside it. At the moment, disciplinary discourse is confused about what is particular and what is universal. One has only to look at the critical discourse to see the identification of "pathology" in various areas (obesity? bullying? global warming?), with little attempt to see that all pathologies have the same structure with specific realisations. As the tools we use to teach become better and more refined, the underlying patterns of universals will become increasingly apparent. However, I think it is likely that these tools will emerge inter-institutionally because this will be the most effective way that institutions can realise their plans to "digitalize" the curriculum. 

This seems to be what's happening in Copenhagen, where I think real progress has been made on our digitalization project (steering collective understanding of something nobody understands is hard - but we're getting there). Also in the last week, I have been running the Global Scientific Dialogue course in Vladivostok, which will feed into an inter-institutional initiative from the Russians in the form of a Learning Futures Laboratory. I see no reason why these things shouldn't come together. Meanwhile, just to remind me of how current institutional arrangements are not viable, also this week I have been in some pretty intense negotiations with my former university about intellectual property and potential commercialisation of a project I have been involved in for 5 years.  Everything seems to happen at once. Having said that, it may all be quite good in the end.

The Global Scientific Dialogue course was excellent - again. This year I did much more around technical skill - it really is a course that is tool-led, rather than content-led - and the tools I introduced the students to included AI, Google Colab and P5. All of this is framed around asking "big questions" about the future, wicked problems, etc, for which they work in small groups supervised by a team of 20 teachers. I was worried I had lost them because I tried to do something quite ambitious with Python, generating word-clouds with the students' own data. Actually, it turns out that many gained exactly what I'd hoped: "programming is not so easy, but it is really interesting, I realise I can do it, and I want to know more". In a course for people many of whom have never programmed before, you can't hope for more than that. And there was a remarkable moment when those students who were more experienced effectively "took over" the class to help those who were struggling, each student sharing their screen, and other Russian students talking them through how to fix their problems. I've never seen that before online - it was like "Twitch does programming". 
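The word-cloud exercise boils down to a very small amount of Python. This is a minimal sketch of the frequency-counting step behind it (my reconstruction, not the actual course notebook; the stopword list and function name are illustrative):

```python
# Count word frequencies in a student's own text - the step behind any
# word-cloud. The tokeniser handles Latin and Cyrillic script, since
# students might be working with Russian as well as English data.
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "of", "in"}  # illustrative, not exhaustive

def word_frequencies(text, top=5):
    """Lowercase, tokenise, drop stopwords, and return the top words."""
    words = re.findall(r"[a-zа-яё]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(top)

print(word_frequencies("The future of the city and the future of learning"))
# → [('future', 2), ('city', 1), ('learning', 1)]
```

In Colab the resulting frequencies can be handed to a plotting library to draw the cloud itself; the point of the exercise was less the picture than the realisation that the pipeline from one's own data to an image is graspable.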

In Copenhagen there's been some tension around whether digitalization is about changing the curriculum (and so developing more technical skill in students), or if it is about changing teaching. It is, of course, both. The problem we have in teaching in institutions today is that all our technology has been taken over by industrial concerns. This is the deep reason why education has become increasingly transactional - the industrial systems we use are transactional. I've hatched a plan to challenge this by developing new teaching tools in-house which invite students into a collaborative process of making those tools better, introducing them to the technical discourse outside. Moreover, this improvement process can be done inter-institutionally. Vladivostok, Copenhagen and Liverpool might be my first pilot users. It means that institutions can start to build a technical niche for themselves in a shared environment which connects them more directly with the world outside. Without getting carried away by this, it may mean that in the fullness of time, the inter-institutional niche becomes the main focus of educational activity, with institutionally-bound activities becoming more specifically focused on deep conversation around disciplines and research. I see it as a kind of institutional endosymbiosis - universals outside, particulars inside. 

I think if there's a thread that runs through everything it is that the internet will eventually transform the institution of education in ways that it clearly hasn't until now. It won't be about online classes or any other online reproduction of the traditional academy. It will be about the necessity of every individual to adapt to the digital environment as it is actually unfolding, not as institutions teach it. This necessity will mean that something will happen between institutions, not inside them. The institution will not die, but it will change into something other than what it is at the moment. As tools for teaching different subjects are refined, it will become increasingly obvious that our tools reveal the commonalities between our disciplines - universals. Beyond the development of deeper tools for inquiry, the need is for institutions to conduct the conversation about particulars. This is what I suspect the disciplinary discourse will turn into - much more about the discovery of special cases in nature or society - a critical movement which feeds into the ongoing refinement of our universals. 


Tuesday 21 September 2021

Energy Collages in Vladivostok

When I went to the Paula Rego exhibition in London the other month, something really struck me about what Rego said about her technique of "collage". She talked about the sensual energy of tearing into something - pulling things apart (I can't remember the exact quote). My own experience too has suggested to me that there is something about tearing things apart and reassembling them. In the Global Scientific Dialogue course I've been running in Russia this week, collage and energy have been something of a theme: breaking things and fixing them. Today the students made collages from objects they found around them (inspired by Andy Goldsworthy). I'll post some of their images when I have permission from them. Yesterday, I made the connection between tearing things apart and getting "stuck into" coding - breaking code and fixing it. There's more to be done with this, but it's all very promising. 

A lot of our social media is a collage. That is basically what a Facebook or Twitter "feed" is. It has an energy, and the continual rearrangement seems to keep on regenerating this energy. Is this why we find it addictive? Of course, it's not unusual for "cheap" sources of energy to become addictive... that's what keeps McDonalds and Coca Cola in business, after all. The energy of the collages we make ourselves is hard-won; the breaking-apart of things is a real agony, and the rearrangement is a discovery. This is where the learning is.  

I've been reading Simondon's "Individuation in light of notions of Form and Information". There's a lot of stuff about energy there - both in physics and biology. Simondon goes back to the Aristotelian idea of "hylomorphism", critiquing the basic concept that in order to have any kind of "stuff", there must be some ideal form of the stuff to begin with. Hylomorphism was a doctrine to explain how it might be possible to get "something" from "nothing". The "idea" behind the thing was a way of explaining how this might happen. 

Getting something from nothing is a problem that has preoccupied physicists for most of the 20th century. If there was a "big bang" for example, where did that come from? Is a singularity something or nothing? 

I have been fascinated by Peter Rowlands's work because he turns this question around - it's not about "somethings" at all - everything is in a process of making successive "nothings". The algebra to support this idea derives from Hamilton's quaternions, and using this, it is possible to show how Einstein's mass-energy-momentum equation is really a Pythagorean triple, which factorises to zero.  But more basically, if everything is about nothing, and it is nothing which drives the process of creation in search of nothing, then we have no need for hylomorphism. 

But we do need energy. If E^2 = (pc)^2 + (mc^2)^2, or rather E^2 - (pc)^2 - (mc^2)^2 = 0, and this can be factorized into two expressions which represent "local" and "non-local" physical systems (whose product is nothing), then it might be possible to see how "tearing something apart" releases energy - the E in the equation. All as part of the continual process of resolving the tension between local and non-local to zero. 
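In symbols (my own gloss, in natural units where c = 1; the quaternion detail is Rowlands's, and the sign conventions here are schematic):

```latex
% The mass-energy-momentum relation as a "Pythagorean triple":
E^2 = p^2 + m^2 \quad \Longleftrightarrow \quad E^2 - p^2 - m^2 = 0

% At rest (p = 0) the factorisation into two expressions "whose
% product is nothing" is elementary:
(E - m)(E + m) = E^2 - m^2 = 0

% Rowlands handles the general case with quaternion units i, j, k,
% yielding a nilpotent operator - something which squares to zero:
(\pm ikE \pm i\mathbf{p} + jm)^2 = 0
```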

Is this the driving force behind biological systems and learning? Does this explain why we are curious to know more? Cell division is, after all, a cut in the system - the creation of an asymmetry, rather like the tearing into a picture to make a collage.  Consciousness sits on cell division and self-organisation; local and non-local factors are mirrored in the relation between DNA and epigenetic marks. 

The mark of learning is to tear into things - to break things as a way of seeing things new. In Vladivostok I hope I have been there to support people doing this, and maybe in a few cases, to pick up the pieces when the shock of breaking something is too much. 

Wednesday 1 September 2021

Technology has no Curriculum (How to teach fish about water)

If there is a central tension in the wrestling match between technology/digitalization and Universities, it is that the curriculum is the central pillar of educational organisation, and the web organises itself quite differently. The online world is the epitome of self-organisation - it is no accident that the systems theorists whose work gave rise to the technology also produced the constructivist epistemology which described how natural systems needed no rigid blueprint for their development. 

Education's "curriculum-blinkers" mean that it "schoolifies" the world: everything it encounters in the environment must be boxed-off with learning outcomes, a course plan and a timetable.  If this cannot be done, then basically education can't deal with it. The problem can be described in systems terms: it is basically a problem of "requisite variety".

Education works by attenuating the environment (the world) into organisational structures whose fundamental purpose is to coordinate conversations and award certificates. Education has lower variety (complexity) than the environment, but because it also creates an important part of its cultural environment in the world (it creates a niche for itself), it is able to maintain a stable existence in a complex world: it has "requisite variety". Niche construction takes many forms, but includes creating criteria for certification which can only be offered by education, professionalisation, producing artificial scarcity of knowledge and learning opportunities, creating "failure" and "success", and enculturing the young from birth into the habits of formal education.
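Ashby's law of requisite variety, invoked above, can be made concrete with a toy model (my own sketch, not from the post): a regulator can hold outcomes to a goal state only if its repertoire of responses matches the variety of the disturbances it faces.

```python
# Toy model of Ashby's law: outcome = (disturbance + response) mod N.
# For each disturbance, the regulator picks the response steering the
# outcome closest to the goal state 0. With full variety it always
# succeeds; an attenuated regulator must tolerate N/|responses| outcomes.

def reachable_outcomes(disturbances, responses, n):
    """Outcomes the best possible regulator is still forced to accept."""
    return {min((d + r) % n for r in responses) for d in disturbances}

N = 6
full = reachable_outcomes(range(N), range(N), N)   # requisite variety
poor = reachable_outcomes(range(N), [0, 3], N)     # only two responses

print(full)  # → {0}: every disturbance can be regulated to the goal
print(poor)  # → {0, 1, 2}: outcome variety = 6 / 2
```

On this reading, education's niche construction is the attenuation step: it does not match the world state-for-state, but reduces the disturbances it admits until its limited repertoire suffices.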

This process of education's niche construction has depended on the world being "schoolifiable" without too much loss of information about reality. Technology threatens this. As much as the champions of digitalization try to persuade us otherwise, technology has no curriculum. It is essentially and irreducibly transdisciplinary. Of course, we can teach those aspects of "computer science" that encapsulate some of the skills and techniques of using technology, but this is only a small aspect of what technology is, what it does, how we should think about it, and what we might do with it. 

Heidegger called the essence of technology "enframing" in his famously pessimistic but penetrating essay on technology ("The Question Concerning Technology"): enframing was a kind of encapsulation of the thinkable world, rather like Blake's "mind-forged manacles".  Reactionary idealist that he was, he wanted to escape into a world of poetry instead, which he saw as offering a different mode of encounter: what he called "dwelling". But this is the same Heidegger who saw that the future of philosophy lay in cybernetics. His own struggles mirror the struggles that education is now having in dealing with a world that simply doesn't fit its conceptual scheme. 

Heidegger knew he was struggling to deal with conceptualising something that resisted conceptualisation. He was a product of a traditional education system. Little wonder technology troubled him so much. Those thinkers who came from rather less conventional backgrounds like Illich had a better grasp. Technology, rather like time, gives us nothing to get hold of in the conceptual frame from which we inspect it. Our organisational structures can't grasp it. And yet, in our daily practice, our industrial practices, our communications, our creativity, our concerns about what is right and wrong, we are all swimming in it. If fish had universities, would they be able to teach them about water?

All this is telling us that our curriculum-based practices will eventually have to give way to a different way of organising human development and organisation. And yet, say this to anyone in a University and they will look at you as if you are mad. But look at what is on the horizon. The noise the technology companies are making about the future of education, and the enormous sums of money being invested, amount mostly to hot air and greed - but not entirely. 

In evolutionary history, the most flexible and adaptable organisms survive. So compare a university (any university - they're pretty much the same) with a company like Microsoft or Google. Which is more adaptable? Which is more flexible? 

This is not to say that Google's current pitches for the future of education are the future. They are unlikely to work. But they are playing a long-game. Many of our current technologies would have been considered science fiction 30 years ago. We cannot begin to imagine our technological environment in 30 years time. But preparing for the future is what adaptive organisations do. 

Companies caught in an "adaptation block" are now developing separate branches free from the constraints of conventional business organisation. Universities need to be doing this. 

Thursday 12 August 2021

Digitalization In the Wires of the Institution

The defenders of the need for digitalization in Universities will point to the fact that the world does indeed seem to be "going digital". AI, big data and coding skills do appear to be needed in industry, and Universities are currently not ensuring that enough of their graduates are equipped with these skills. However, this is not to say that "going digital" is always an advisable move. Systems consultant John Seddon made the point a while back that "going digital" can be the last thing an organisation needs to do when what is really needed is a careful and strategic analysis of demand. One of the problems with digitalization is that it can generate its own demand (what Seddon calls "failure demand"), and this can exacerbate any underlying problems that a business had in meeting the already existing demand. Going digital is an easy management action - but a great deal of thought and care needs to be taken. Recent experiences with failed Covid apps are a good case in point. 

But if you examine those companies which are "digital", or who have gone digital successfully, being digital means more than simply using AI or data analysis. Not everyone in a digital company does data analysis, but successful companies will wire themselves in such a way that those who have deep technical knowledge can communicate with those who are more concerned with customer relations, personnel, or finance. These interconnections are vital to effective organisational adaptation: as technology advances and demand shifts, so arguments must be made as to what to purchase, develop, update, who to train, delivery targets, and so on. One might conceive of the digital business as a kind of "network", but it is more than that. It is a network which knows how to rewire itself. One has only to look at Microsoft, Amazon or IBM to see the power of the ability to rewire a business. 

The process of "rewiring" is not simply a process of assuming that certain connections will automatically be made with the "right tools". Human beings, like many biological organisms, are built for rewiring themselves, but this takes time and energy, during which we need to learn about ourselves and our own wiring. Every new connection requires the conditions within which trusted communications can evolve. In biology, the creation of these conditions is the critical moment in the establishment of connection: it is the creation of a niche for communication. In industry, niche construction is a precursor to organisational shifts which ultimately result in changes to the ways in which transactions are conducted with customers. Of course, customers only see the transactions - they don't see the processes which underlie the organisational changes to how the business operates. This is a problem when businesses see education as a customer.  

Education is all about rewiring - it's basically another word for learning. It is much more about rewiring than it is about transactions. Because technologies have been adopted by education from industry, there has been a steady shift away from seeing education as about rewiring, to seeing education as being about transactions. Worse still is the fact that, when positioned as a customer of industries (like Microsoft or Instructure, for example), education doesn't even see that the businesses that sell to it are actually much better at rewiring themselves than the educational institutions whose fundamental purpose is rewiring. The transactional processes it becomes absorbed in mask the importance of the niche-construction that is necessary for the rewiring to take place. 

If digitalization in education is to mean anything at all, it must mean that more flexible institutions which know how to rewire themselves are developed. Only with this kind of flexibility will educational institutions be able to adapt to a changing world and equip learners with their own capacity to rewire themselves. Unfortunately, digitalization is seen in terms of either "knowledge" (for example, digital literacy) or skill (programming proficiency), both of which miss the point. The point is adaptability, and wiring the institution to instil adaptability lies at the heart of successful digitalization in industry. 

What needs to happen to instil adaptive tendencies? It is probably the capacity to create niches for innovative communications, experimentation and the development of new forms of organisation. That means, in turn, being prepared to embrace uncertainty, throw away trusted models, look at the world differently: uncertainty is the key driver where new communications are born.

We only have to look at industry to see how this is done. One of Satya Nadella's first acts at Microsoft was to introduce a new range of concepts including cloud computing and service oriented architecture, and to deprioritise the key products which the "old guard" believed were the cornerstone of corporate stability (notably, Windows). What that did was create a good deal of uncertainty, which in turn created the conditions for new networks and activities. What would this kind of shift of priorities look like in a University? Dispose of the curriculum? Commit to free and open education?  Disband the Computer Services department? Cap the salaries of managers so that management becomes a service to academics? This is niche creation.

In the niche, we learn new things about each other. This is the most important thing about rewiring. It's not just the technical architecture that needs to be rewired. It is the people - teachers, learners and managers. That can only happen if teachers, learners and managers understand how each other is wired. Technology follows: it is the thing which facilitates the rewiring, but in many ways it is the last stage. 

The real problem we have with the digital in education is not skills or tools; it is that the prevailing structures of the institution prevent rewiring. When everything is turned into a transaction, there is no space to create a new niche. When everything is turned into a transaction, universities have become the mere customers of corporations who, it turns out, are much better at transforming themselves than the universities are. 

This situation can be fixed, but it requires a combination of technical imagination and humane leadership. More importantly, it requires that the technical imagination can get under the skin of the institution and into its wiring. If it can do that, then new niches are possible, and new forms of organisation can be created. That, in the end, is what the digital can do for us - but it is for us to demand it. 

Monday 9 August 2021

Looking where the niche is: Creating the Conditions for Dialogue in Education

One of the things that Covid exposed in education is the extent to which learning has become transactionalised. When face-to-face engagements were removed and technology took centre stage, it was highly noticeable that the learning platforms with which we are all now familiar managed transactions: "watch this video, respond to this forum, write this essay, here are your marks". 

Of course, education has basically been this for a very long time, but with the removal of the physical context, the raw transactions seemed particularly cold. There is a yearning to "get back to normal" - despite the fact that "normal" is little different in terms of the transactions of education, or indeed, its platforms. But the physical context of education makes the transactional stuff bearable - and perhaps this is a problem.

As institutions massified their operations, making things more transactional appeared an essential requirement to deal with scale. The only way this could be done without people complaining was to amplify the compensation for transactionalisation. That was the coffee bar, the sports hall, the evermore plush (and expensive) student accommodation, and so on. There are very important educational and developmental things going on in these contexts, and because they obscured the transactional coldness of the business of the university, university managers might have believed that the transactionalising of education could continue unabated.

The online move has represented a change of physiological context. It is not a matter of online vs face-to-face, but different epigenetic environments (myself and a couple of friends wrote about this a while ago: Covid-19 and the Epigenetics of Learning | SpringerLink). The biology of learning has barely touched on this, but fundamental to any biological learning adaptation is the construction of a niche within which growth and development are possible. We don't as yet have the means of studying this in detail, but it seems to me that understanding the processes of niche-construction is essential if we are to have institutions which embrace the technological context that we are all now in and encourage dialogue.

What is a niche? It is a home. At a systems level, it is a taming of the complexities of the environment such that growth is possible. Think of a spider's web - that is a niche which the spider constructs. We do the same in education, but a niche in education creates the conditions for dialogue. When Rupert Wegerif talks about the importance of trust (here: What is a 'dialogic self'? - Rupert Wegerif), he is describing the process of establishing a home for learners and teachers together, which makes their communications not only possible, but probable.

I'm very interested in how niches are constructed. Niches are not constructed through transactions: something else happens, and I think this is what was missing in our Covid technological experiments. 

The key feature of a niche is pattern (again, think of a spider's web, a bird's nest, a beehive, etc.). Information theorists call pattern "redundancy", and I find this technical description useful. Patterns can be formed by the rules of a game (although "rules" themselves emerge through patterns of interaction). More deeply, I think patterns are discovered through deep physiological engagement. It is the root of intersubjectivity - what Alfred Schutz called "tuning-in to the inner world of each other". When we do "ice-breakers", this is what is really happening.
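The information theorists' sense of "redundancy" can be made concrete: a patterned message compresses well because it keeps saying the same thing in different ways, while noise does not. A minimal sketch (the strings here are invented for illustration; compression ratio is only a crude proxy for redundancy):

```python
import random
import string
import zlib

def redundancy(text: str) -> float:
    """Crude redundancy estimate: the fraction by which a string
    shrinks under compression. Patterned (redundant) text shrinks
    a lot; noise-like text barely shrinks at all."""
    raw = text.encode()
    compressed = zlib.compress(raw, 9)
    return 1 - len(compressed) / len(raw)

# A web-like repeating pattern versus a noise-like string of the same length
patterned = "ab" * 500
random.seed(0)  # fixed seed so the "noise" is reproducible
noisy = "".join(random.choice(string.ascii_letters) for _ in range(1000))

print(redundancy(patterned))  # close to 1: almost pure pattern
print(redundancy(noisy))      # much lower: little pattern to exploit
```

The niche, on this reading, is the high-redundancy region: enough repeated structure that the inhabitants can anticipate what comes next.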

What I think is particularly remarkable is that deep questioning in the light of some shared experience can begin to reveal a niche for dialogue. There is a difference between deep questions and shallow questions: it may be a physiological difference. Certainly, thinking deeply feels different to shallow thinking. Why is this?

It's as if we spin our webs inside us, and join them up at their deepest point. The deepest point (and this was, I think, something Schutz was aware of, even if he didn't spell it out) is that we are all made of the same cell-stuff. So does the depth go right back into our cells? Our current cognitive/neuro obsessions prevent us from thinking this, I suspect - but neurons are cells: in fact they are cells which stem from the same developmental "germ layer" as our skin, and like all cells, they share their earliest developmental zygotic processes with all living creatures (you can see the germ layers being formed in this amazing video: Watch a single cell become a complete organism in six pulsing minutes of timelapse | Aeon Videos). Is thinking "deep" a going-back to the evolutionary origins where we all came from? Is that our real niche?

If this deep niche construction is what is needed for dialogue, then the transactional and shallow focus of institutional organisation needs to change. While dialogue is the central purpose of all education, the institutional conditions for dialogue are the institutional conditions to facilitate niche construction at a scalable level. Technology can, I believe, help us to do this. 

There is no reason why the physiological conditions of learning cannot embrace technology or even remote engagement. But attending to the physiology before we think about the transactions is, I believe, critical. Creative activities or games, for example, are not just a "different" kind of activity; they are deeper physiologically. This is quite obvious when we see kids engaging with each other on Twitch.

The niche is where the light gets in...


Monday 2 August 2021

Technology and Education as Energy Flows

There is some biological evidence for the role of inter-related parameters - particularly with regard to time and gravity in cellular development. For example, when taken into zero gravity, the biological development of cells "stalls" (this has been shown with yeast and lung cells). Cells become a kind of "zombie" - nothing happens, time stops. When gravity is restored, development continues. So the removal of one of the fundamental parameters on which the equation for life depends (gravity) produces interference in another parameter of development: time. Normally we might think that such an interference in biological processes would occur through an epigenetic intervention (some chemical in the environment). But this interference is not chemical - it comes from a fundamental force. This suggests that physics goes very deep into biology, and into biological origins.

One of the ideas of John Torday which has got me thinking most is the idea that a phenotype - any phenotype - is an "agent" which seeks information from its environment, driven by the demands of its internal operations with the ultimate aim of resolving the totality of its evolutionary history within its current informational context. What "resolving" means here, for Torday, is reconciliation within its original evolutionary state - the original "unicell". And since we are all phenotypes built on phenotypes, this process is ongoing at many levels of organisation, resulting (and this is the really intriguing thing) in the operations of mind.

Another interpretation of "resolution" is the "creation of zero" which one would get from the balancing of an equation, or a homeostatic equilibrium. Torday has been influenced by Peter Rowlands's physics, in which zero is everything; Rowlands has shown how the mathematics of Hamilton's quaternions can make explicit the way zero emerges from the equations of Einsteinian relativity, or Dirac's quantum mechanics. What this means is that zero is an attractor, driving an ongoing evolutionary process.

Another way of thinking about zero as an attractor is to see it as a flow of energy from one "level" of zero to another at a higher organisational level. There is some justification for thinking about zero as energy: Einstein's mass-energy-momentum equation can be reinterpreted in the form of Pythagoras's triangle, which can then be shown to be an expression of zero in terms of mass, space, time and charge. What's powerful about that is that time is a fundamental parameter in the resolution: we tend not to think that material things embrace time, but Einstein says they do (because of the speed of light in the equation). Obviously, biological material things do embrace time, so this backs up the intuition that evolutionary history must be considered in the behaviour of organisms - time and history are embedded in their structure. The same goes for social institutions (which are perhaps another kind of "phenotype").
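The Pythagorean reading of Einstein's equation is standard special relativity, and the rearrangement to zero is trivial to write down (the further decomposition into mass, space, time and charge is Rowlands's quaternion construction, which is not reproduced here):

```latex
% Energy-momentum relation: a right triangle with hypotenuse E
% and sides pc (momentum) and mc^2 (rest mass)
E^2 = (pc)^2 + (mc^2)^2

% Rearranged, the "resolution to zero" is explicit:
E^2 - (pc)^2 - (mc^2)^2 = 0
```

Note that c, the speed of light, carries the time dimension into both terms on the right, which is the sense in which "material things embrace time".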

Technologies embrace time too. The late Bernard Stiegler was broadly right about this: technologies are not mind-independent objects. A materialist view of technology is an error. So what about thinking of technology as energy - or rather, technology as a fundamental component in evolutionary energy flows?

Intuitively this makes sense to me. Quite literally, I find technology gives me energy - but only at some points. None of us would do anything with technology if it didn't excite us. It can, of course, also drain us. So at the point where it's exciting, there is an opening up of new possibilities. Is that the phenotype establishing a new level of balance?

I think the important point here is that technological development is not material development, but energetic development. And technology is only one dimension of energetic development. Art, love, learning and politics are all things which can produce energetic development. They too are about gaining information from the environment. And each dimension (or parameter) is related to the others. Technology and science produce political effects, for example. But if the parameters are co-related, then we need a way of examining the dynamics between them.

At the level of human experience, our emotions are barometers of energy flows. We stall when one of the parameters we need to drive ourselves forwards is missing: something is lacking in the epigenetic environment. Restoring it requires finding ways of rebalancing our evolutionary development.

Our thinking about education is materialist and causal. We perceive educational outcomes as material products, not as flows of energy. This is partly because we haven't known how to organise ourselves to do anything else. What fascinates me about this scientific work is that it might give us new options for examining what actually happens in development beyond material productions.

Tuesday 27 July 2021

Challenging Perceptions of "Traditional University Education"

There are three critical and fundamental developments underway in higher education today:

  1. Confusion about the place of technology in education, which manifests as a crude "face-to-face vs online" discussion, but which reveals a much deeper confusion about education, learning and tool-use more generally.
  2. Confusion about the moral status of traditional disciplines in light of increasing awareness of their colonial roots, and acknowledgement of the effect that traditional disciplines (and those that uphold them) play in reproducing structural inequalities in society which have their roots in patriarchy and racism.
  3. Confusion about the purpose of education and its usefulness - made worse by the increasingly transactional nature of learning within most universities.
No discipline is safe from any of these. Technology has obviously made its presence felt over the last 18 months or so, and yes - we are all "zoomed out". But are we zoomed-out because of Zoom, or because of what we've done with Zoom? Even if the pandemic hadn't happened, we would probably have been staring at screens just as much as we did during it. Perhaps we weren't quite so rigidly staring into a camera, resisting the temptation to pick our noses, but we were staring at screens nonetheless. So what was it about our perception that was different during the pandemic? What was different in our physiology? This is a question about perception - technology blurs the bounds of conventional categories.

Perceptions of the categories concerning the moral status of education are also changing. Black Lives Matter and MeToo have heralded fundamental changes to the way we look at institutions and the behaviour of people in them. What was it about the film industry that let Weinstein do what he did and get away with it? What is it about architecture education that allows for casual racism and sexism (as students at the Bartlett revealed this week: Bartlett launches investigation after racism and sexism allegations (msn.com))? Why does traditional music education teach a load of white male composers, or theories developed by other white men who had a vested interest in using their theory to promote the supremacy of their own culture? Some might be dismayed by the iconoclasm or the no-platforming - but iconoclasm always involves a fundamental perceptual shift. That's what matters.

Then there is the perception of the value of education itself. This is the difference between what is perceived in the university and what is perceived in the outside world. Do they fit? Does one prepare for the other? Or does the university simply take intelligent young people who, if thrown into the world, would work it out without a university, and instead cosset them away from reality for a huge fee, stress them with pointless assignments and grade obsessions, and set them up with unreasonable expectations when they leave? This is going to become a bigger question as the world moves on and universities don't.

What are we to make of these perceptual shifts? What's to be done with the "traditional academy"?

At the root of all of this is the relation between perception and knowledge. Universities have never taken perception that seriously. Their game was always knowledge, because knowledge was measurable, structured and certifiable (even if the measurement is notoriously inaccurate). But knowledge sits on perception, and in the end, it is perception and its close cousin, "judgement", that matter.

I wonder if, despite its best efforts (for example, reflective learning - hardly a success), traditional education can't enter this confused realm of perception and judgement. Perhaps only direct engagement with the real world can do that. And educationally, only personal experience and experiment with one's own life can really produce the kind of learning which can equip the young with sufficient flexibility and good judgement to navigate a world where the traditional categories are vapourising before us. To borrow from Marion Milner, we are all now living "A Life of One's Own", and the focus of our inquiry is not so much on mastering traditional disciplines as on finding enough space in our lives for individuation and creativity.

Monday 26 July 2021

The Empirical Phenotype

One of the realisations that has been creeping up on me is that the view of ontology which I subscribed to for many years, Critical Realism, is the wrong way round. My accidental journey as an academic began 20 years ago at the University of Bolton, where one of the first things I did was to explore the "ontology of technology". Clive Lawson held a conference in Cambridge on this topic which I didn't attend, but I read about it and subscribed to the "Realist Workshop". This set me reading the work of Roy Bhaskar - and that was where I got my ontology from.

In Bhaskar's ontology, "reality" is envisaged as a set of embedded layers. The "empirical" layer concerns that aspect of reality which is directly experienced. Obviously this includes the observations of scientists who discover laws of nature. The empirical in this scheme is the most circumscribed layer of reality. Above the empirical is what Bhaskar calls the "actual": this is the world of possible events, which may or may not be experienced, but which result from deeper mechanisms in nature which may be unknown to our scientific understanding. At the deepest level, the "real" is the totality of "generative mechanisms" which exist independently of human ability to observe them. Discovery of these generative mechanisms, which Bhaskar argues can be either "intransitive" (existing independently of human agency) or "transitive" (existing through human agency), is the point of inquiry in the physical and social sciences. It's through this basic scheme that Bhaskar builds an impressive argument to say that scientific inquiry and emancipatory politics are tied together: Critical Realism was a kind of lifeboat for Marxists disillusioned by what had transpired in the name of Marxism.

I liked this because I found that the emphasis on "mechanism" led to some powerful parallels between cybernetics and Critical Realism. However, the notion of mechanism is problematic. If we ask what a mechanism is, can we say that it is any more than an idea - something produced by an observer, not something inherent in a thing? This observer problem haunts Critical Realism as it does cybernetics.

If a mechanism is an idea, what is an idea? It seems that an idea arises (or is constructed) through a process of inquiry which is akin to a kind of "dance with nature". More importantly, it is a dance with some unknowable and ambiguous environment. As human beings, we know that we do this. But we also see similar dances going on in the microscopic world of our immune system, or in the fertilization of an egg, or the development of an embryo. It seems that everything dances - or at least, we know ourselves to dance, and perceive similar dances in our observations of nature. The resonance between self-knowledge and perception of nature is enough to justify the word "everything".

The fundamental nature of dance is experiment. But given that all living things dance, and that our own dance seems to relate to the dance of our own cells, we should ask:

  1. Who had the first dance?
  2. How does it all hold together?

It is surprising that the study of dance is not taken more seriously. Only Maxine Sheets-Johnstone has really made a profound contribution. But because everything dances, the study of any particular dance can shed light on other levels of dance. From the dance of white blood cells to a tango, or just the music of the tango, fundamentally there is resonance, and the resonance too is part of the dance. When we dance, we move together with something or someone. We create multiple descriptions of the same thing. But what's the point? 

If a dance didn't stop, we would be exhausted. Indeed, our exhaustion can be what brings things to a pause. Dances have a beginning and an end, and the middle is a progress from one to the other. In the end, there is no dance - nothing. Everything is in the process of creating nothing. And nothing is made with resonance - multiple descriptions of the same thing.

Making nothing is not a trivial thing. It requires the seeking-out of new descriptions which can resonate or interfere with existing descriptions so that eventually things "cancel out". I suspect this seeking out of new descriptions or differences is what might otherwise be called "agency" - it is what calls us to act. We are always being called to act to create a clearing, and our response can range from meditation to rage. 

Another word for "nothing" is equipoise. The dance of cells is driven by this principle. They seek equipoise with an ambiguous environment, maintaining their boundary and garnering sufficient energy to continue their empirical journey. This is the feature of any phenotype: as far as life is concerned, the first cell had the first dance.  And the biological phenotype gave us observation by virtue of the fundamental distinction of the boundary between itself and its environment. Observation always requires a boundary. Before the first cell, did anything observe?

But making nothing may well have been going on before this. Newton, Einstein and Dirac all amount to nothing - but we can only intuit that through our biological lens of thought. And, perhaps most importantly, that biological lens of thought refers back to the "first dance", as does all the dancing of every cell, organ, plant, bacteria, etc. 

And when we experiment, this is what we are dancing with: not only physical nature, but evolutionary history. So Bhaskar is wrong: the empirical is the most fundamental, most active layer of reality, which underpins the principle of making nothing. It is tied to physiology. We might imagine deeper mechanisms and make observations of matter through particle accelerators or (eventually) quantum computers - but our imagination sits on a physiology which is already doing the thing that we see in our physics. And the politics of science? A society which fails to create the conditions for making clearings in the minds of its members will destroy itself. It is education's job to do this - right now we fill people's minds with noise.

Monday 28 June 2021

Creativity and Energy

One of the features of our online experience is its continuity: there's no space for breath. Personally, I find that endless continuity is very draining. I find myself in need of creating a clearing amidst a chaotic flow of events.

There is a connection between making a clearing and doing something new. A new departure, like all creative acts, must start from nothing. So making nothing before we do anything is the most important thing. In the digital flow of experience, making nothing is difficult. Perhaps we should turn everything off - but this is not the same as making nothing. I do wonder if creating nothing was easier in the past. Perhaps.

It is important to understand the dynamics of nothing. Nothing is not a state - it is a process. You can imagine that you are in a state of nothingness, but physiologically, your cells continue to reproduce themselves, your neurons pump calcium, you breathe, and so on. How is this state of nothing different from a state of something? 

The way I think about it is that getting into a nothing state is a process of tuning in to our most basic processes - the processes which in the beginning were the root of existence. The first cells, the quantum mechanics of the atoms which make up the substances from which our cells are made, the origin of the universe. Thinking is a manifestation of those processes. Nothing is where consciousness harmonises with those processes. Another way of thinking about this is to say that nothing is where energy manifests in its purest state.

The materialism of ordinary existence is the continual making of things around us - that from which we might wish to escape. If nothing is energy in its purest state - a harmony - materialism is disordered energy. Matter is an epiphenomenon of things not being tuned in. I find this a helpful perspective because it sees matter and energy on a continuum - from disorder to deep order.

Some aspects which we might consider material - for example, crystals or coral - reveal deep order to us. And some aspects of our scientific knowledge give us an insight into the processes of this deep order.  It is what Bohm called the "implicate order".

So to create, we have to become crystalline. Then we can make a start - with a mark.

So what happens then? In making a mark - it could be a line, words, some notes, etc - we start a new continuity. This is something which contains the seeds of its continuation. It takes energy to do this because what the energy must do is make a choice: this is part of the continuity and this isn't. And most continuities created like this quickly exhaust the energy that was the impetus to create them. We run out of steam.

But then the same thing must happen. We have a continuity which exhausts us from which we need to create a clearing. But if you cut a continuity, you create two continuities with nothing in between them. For some reason I find the act of cutting things and creating space between them gives me energy again. I guess if you cut an atom, you get energy. Perhaps cutting a continuity releases the energy inherent in it, and that then feeds a new process of making new marks in the clearing between them. Cutting a continuity creates a dynamic tension between the two parts too.

In this way, creation - like cells - grows from the middle-out. Cuts are made, clearings created, new continuities constructed. But the point of the whole is the creation of nothing. The point of a piece of music is the last note and then the silence. It is the making of a clearing for others. As things grow, nothing can be created not only by cutting and clearing, but by harmonising - which is a form of adding something to create nothing (no harmony is strictly necessary).

What does this all mean? 

Among our many problems, we have an education system which is materialist in its orientation while trumpeting the virtue of creativity (often as "innovation") without really thinking about what this is. Creativity is not "accretion" - it is not the continual adding of new things on top of one another, in the way our journals now work. That just creates noise and is exhausting.

Creativity is harmonising with the processes of nothing. Some of our technologies are showing us how this principle might work (particularly AI which is a "nothing creating" technology). If we listened to what is happening, what we should do next would become much clearer to us. Perhaps the clearest signal is that learning is not "absorption", but creation. It's not that we learn mathematics, computing, chemistry or art. It is that we learn to generate nothing in the context of mathematics, computing, chemistry and art. 

Sunday 13 June 2021

AI, Experiment and Pedagogy - Why we need to step back from the critical "Punch and Judy" battles and be scientific

There are some things going on around technology in education which I'm finding quite disturbing. Top of the list is the "Punch and Judy" battle between the promoters of AI and the critics of AI. One way or another, it is in the interests of both parties that AI is talked about: the promoters want to encourage big-money investment (usually, but not always, for nefarious purposes); the critics want a platform upon which they can build or enhance their academic reputations - what would they do without AI?

Nowhere, in either case, is there real intellectual curiosity about the technology. Both parties see it as a functionalist thing - either delivering "efficiency" (and presumably, profits), or delivering educational disaster. The former is possible, but a huge missed opportunity; the latter is unlikely because the technology is not what the critics imagine it is. In fact, the technology is very interesting in many ways, and if we were scientists, we would be taking an active interest in it.

As I have said before, "Artificial Intelligence" is a deeply misleading term. Machine learning exploits self-referential, recursive algorithms which display the property of an "anticipatory system": it predicts the likely categories of data it hasn't seen before, by virtue of having been trained to construe the fundamental features of each category. We have not had technology like this before - it is new. It is not a database, for example, which will only return data that was placed in it in response to a query (although AI is a kind of evolution from a database).
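The anticipatory character shows up in even the simplest trained model: it stores not the data itself but a construal of each category's features, which then lets it classify points it has never encountered. A toy nearest-centroid sketch (the data and the labels "small"/"large" are invented for illustration; real machine learning is vastly more elaborate, but the anticipatory structure is the same):

```python
def train(examples):
    """Construe each category as the centroid (average) of its
    training points - a crude 'fundamental feature' of the category."""
    centroids = {}
    for label, points in examples.items():
        dims = len(points[0])
        centroids[label] = tuple(
            sum(p[d] for p in points) / len(points) for d in range(dims)
        )
    return centroids

def predict(centroids, point):
    """Anticipate the category of a point never seen in training,
    by its distance to the learned centroids."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], point))

# Invented 2-D training data for two categories
examples = {
    "small": [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9)],
    "large": [(5.0, 5.5), (4.8, 5.1), (5.2, 4.9)],
}
model = train(examples)
print(predict(model, (1.0, 1.1)))  # a point not in the training set
```

Unlike a database, the model cannot return its training points at all: it holds only the construed features, and answers queries about data that was never placed in it.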

"Artificial Anticipatory Systems" are extremely important for reasons we haven't begun to fathom. The deep issue is that, like all biological systems, we too are "anticipatory systems". Moreover, the principles of anticipation in biological systems are remarkably similar to the principles of anticipatory systems in machine learning: both rely on vast amounts of "information redundancy" - that is, different descriptions of the same thing. Redundancy was identified by Gregory Bateson (long ago) as fundamental to meaning-making and cognition. Karl Pribram wrote a brilliant paper about the nature of redundancy and memory (see T-039.pdf (karlpribram.com)). Poets (Wallace Stevens in "The Necessary Angel"), musicians (Leonard B. Meyer), physicists (David Bohm) and many others have said the same thing about multiple descriptions and redundancy. How does it work? We don't know. But instead of using the opportunity to inspect the technology, foolish academics posture, either trying to shoot the stuff down or to wear it as a suit.

To hell with the lot of them!

I read a recently-published remark against empiricism itself the other day by a well-known and highly intelligent scholar. The argument was basically that "flat earth" campaigners (and other conspiracy theorists) were empiricists because they appealed to simple observations. What was needed in place of this "empiricism" was the carefully constructed critical argument of the social science discourse. 

I think I partly blame the philosophy of "Critical Realism" for this (and I speak as someone who for a long time had a lot of time for CR). Roy Bhaskar  makes a distinction between the "empirical", the "actual" and the "real", arguing that the empirical is the most constrained situation, because it involves observation of events (typically, but not exclusively, through artificial closed-system experiments designed to produce observable and reproducible successions of events). The actual, by contrast, considers events that might occur, but might not be observed. The real, by contrast, involves the world as it is beyond human perception - what Bhaskar considers to be the result of "generative mechanisms". 

Now what's wrong with this deflating of empiricism? The real problem arises because Bhaskar bases his arguments on a particular reading of Hume's scientific theory which suggests that science results from experiments producing regular events, and scientists constructing causes to explain these events. Hume's position is unfortunately misconstrued by many as a defence of a naive mind-independent empirical reality (which was the opposite of what Hume was really saying), but Bhaskar's point is to say that Hume was wrong in saying causes were constructs. [The tortuous complexity of these arguments exhausts me!] However, behind all this is a deeper problem in that experiment is seen as a thoroughly rational and cognitive operation - which it almost certainly is not. Moreover, this cognitive and rational view becomes embedded in the kind of authoritarian "school science" that we all remember. 

The flat earthers are not empirical. They are authoritarian - borrowing the cognitive misinterpretation of science from the schoolroom to make their points.  

As physiological entities, scientists are engaged in something much more subtle when doing experiments. Science is really a "dance with nature" - a process of coordinating a set of actions against a set of unknown constraints from nature. Producing regular successions of events is a way of codifying some of the constraints that might be uncovered, but that is really an epiphenomenon of the empirical enterprise. Codification is important for reasons of social status in science, and perhaps for social coordination (if it is codification of the genome of a virus, for example). But it is not what drives the empirical effort. That is driven by continually asking new questions, and making new interventions to get ever-richer versions of reality. The drive for this curiosity may be to do with evolution, or energy and negentropy. As David Bohm pointed out, scientific understanding is rather like a continual accretion of multiple descriptions of nature. It, too, is about redundancy.

This is why I find the machine learning thing so important, and why the ridiculous posturing around it drives me crazy. This is a technology which embodies (an inappropriate term, of course) a principle which lies at the heart of our sense-making of the world. Studying it will shed light on some deep mysteries of consciousness and learning, and our relationship with technology. As we move closer to quantum computing, and closer towards being able to study nature "in the raw", some of the insights from the current development phase of machine learning will provide a useful compass for future inquiries. They are, I'm sure, related. 

It may be a greater tragedy that the critics of AI are not scientists than that many of the promoters of AI in education are criminals.


Saturday 29 May 2021

What is happening with "digitalization" in Education?

I am currently involved in a large-scale project on digitalization. The aim of the project is to instil practices relating to the manipulation of data, coding, algorithmic thinking and creativity throughout the curriculum in the University. While this appears, on the one hand, an attempt to reignite the "everyone should code" kind of stuff, something is clearly happening with the technology which is necessitating a reconfiguration of the activities of education with the activities of society. 

As with many big structural changes to education, there are already quite a few signs of changes in practice in the University: many courses in the humanities and sciences are using programming techniques, often in R and Python. My university has established faculty-based "data centres" - rather like centres for supporting e-learning - which provide services in analysis and visualisation. However, across the sector, there is as yet little coherence in approaches. It is rather like the situation with teachers using the web in the late 90s, where enthusiastic and forward-thinking teachers would put their content on websites or serve it from institutional machines. The arrival of VLEs codified these practices and coordinated the expectations of staff, students and managers. This is likely to happen again, but instead of codifying the means of disseminating content, it will codify programming practices across different aspects of the curriculum.

There is a further implication of this, however, which has to do with the nature of disciplines and their separation from one another. One of the reasons why digitalization has such a hold in education at the moment is the dominance of digitalization in industry across all sectors. Where sectors might once have distinguished themselves according to expectations formed around concepts, products and markets, increasingly we are seeing coordination of industrial activities around practices and processes. This has been slowly happening for the last 20 years or so, and is evidenced by the way that industries are realigning themselves by synergising practices and technologies across different fields of activity. Think of Amazon. This has been coupled with increasing "institutional isomorphism" in the management of institutions across the board. This has produced many problems in institutional organisation - partly because the old identities of institutions have been torn up and new identities imposed which, although they exploit the new technologies available, almost always reinforce and amplify the hierarchies and inequalities of the old institution.

With this in mind, this next phase of digitalization is going to be very interesting. The old hierarchies of the university are established around academic departments and subjects. These are basically codified around concepts which, within academia, operate to define and redefine themselves in contradistinction to one another. This is not to say that interdisciplinarity hasn't emerged: obviously we have things like biochemistry or quantum computing, but even within these new fields which appear interdisciplinary, the codification around concepts is the central mechanism which provides coherence. Look, for example, at how academic communities fracture and form tribes: not just the mutual antipathy between psychology and sociology, but between "code biology" and "biosemiotics", heterodox vs classical economics, etc. A lot of this kind of division has to do not just with disciplinary identity, but personal identity. Concepts are tools for amplifying the ego (am I not doing it here?), and the principal mechanism for this process has been the way we conduct scientific communication.

Digitalization means that increasingly we are going to see research and learning coordinated around practices with tools. This is a more fundamental change to what is loosely called the "knowledge economy". It won't be enough to simply name a concept. We will need to show how what is represented by a concept actually works. Argument will increasingly be embellished with concrete examples, some in code, all presented in ways that allow mechanisms to be communicated, experimented with, refined with new data, and continually tested. More importantly, because these practices become common, and because practices supersede concepts in scientific inquiry, the traditional distinctions between disciplines will be harder to defend. This will produce organisational difficulties for traditional institutions, in which disciplines will perceive threats in the digitalization process and seek to defend themselves.

Another threat may come in the form of what might be called the "status machine" of the university. Concepts don't only codify a discipline, they codify the status of those who rise to positions where they can declare concepts (what Searle, who is not alone in pointing out this mechanism, calls "deontic power").  While new practices are codified in a similar way, practices are only powerful if they are adopted widely, and in being adopted, they are continually adjusted. Eventually we don't care about the concept or who thought of it, but about being part of the game which is developing and upholding a common set of practices. The operating system Linux is a good example: nobody really cares about who invented it; but we do care about using and developing it. We can start to make a list of similar practices which fit this model: computer languages, online programming environments, visualisation tools, etc.

But the university is a "status machine": its business ultimately lies in selling certificates and through codifying status. So if it comes to be about practice rather than status, what does the University then do? New forms of personal status codification are emerging. The online machine learning competition site Kaggle, for example, provides opportunities to do real and meaningful data analytic activities. Winning a Kaggle competition is an important marker of personal status, carrying more meaning than a degree certificate because it demonstrates what someone can actually do, with references to things that were actually done. But Kaggle does not lock its status mechanisms behind the expensive closed system of an institution: it is open and free, funded by the fact that the fruits of intellectual labour become the property of Kaggle (and by extension, Google). Intellectual activity given to the platform is exchanged for status enhancement. It is in many ways an extension of the Web 2.0 business model with some important differences.

What happens in Kaggle educationally is particularly interesting. Kaggle teaches simply by providing a network of people all engaging in the same activities and addressing the same, or related, problems. There is no curriculum. There are emerging areas of special interest, techniques, etc. But nothing codified by a committee. And it exists in the ecosystem of the web which includes not only what Kaggle does, but what StackOverflow does, or anything else that can be found on Google. Human activity contributes to this knowledge base, which in turn develops practices and techniques. Learners are effectively enlisted as apprentices in this process. Experts, meanwhile, will go on to exploit their knowledge in new startups, or other industrial projects, often continually engaging with Kaggle as a way of keeping themselves up-to-date.

The University Professor, meanwhile, has become both increasingly managerial and increasingly status-hungry, seeking the deontic power to declare concepts or make managerial things happen ("we should restructure our University" is a common professorial refrain!). But increasingly (and partly because there are so many of the bastards now), nobody is really interested - apart from those who will lose their jobs as a result of professor x. We just end up with a lot of mini-Trumps. Deontic power doesn't work if nobody believes you, and even if people do listen, it does no good if they merely repeat the conceptualisations you claim (but with different understandings). The academic publishing game has become very much about saying more and more about less and less, where each professorial utterance merely adds to a confusing noise that only benefits publishers.

Kaggle shows us that we don't need professors. There will always be "elders" or "experts" who have more skin in the game and know how to use the tools well, or to apply deeper thinking. But it is not about leading through trying to coin some attractive neologism.  It is about leading through practice and example. 

Here we come to the root of the organisational challenge in the modern university. Its layers of management are not full of people leading by example with deep skill in the use of digital tools. They are full of people who postured with concepts. And yet, these are the people who have to change as the next wave of digitalization sweeps over us. I suspect it's not going to be an easy ride.

Thursday 6 May 2021

Technology, Conversation and Maturana

I wrote this last month for the Post-Pandemic University blog (see "Technology and Conversation" at postpandemicuniversity.net). Maturana was on my mind. I saw him speak at the American Society for Cybernetics conference in Asilomar, CA in 2012. There were many other cybernetics luminaries there for the Bateson celebration "An Ecology of Ideas" (see asc-cybernetics.org). I also remember from that event Jerry Brown's motorcade arriving (he was a Bateson student), Nora Bateson's film, Graham Barnes's talk ("How loving is your world?"... Graham also died recently), Terry Deacon's talk and Klaus Krippendorff's birthday celebration. It was quite an event.

I don't remember an awful lot of Maturana's talk except for a remark he made about learning in response to a question: "What we learn, we learn about each other". 

That deserves a huge YES!

So here's my postpandemic piece. And that comment from Maturana about learning runs all the way through it.

.....

Biologist Humberto Maturana once wrote a poem called “The Student’s Prayer” in response to the unhappiness of his son in school. It goes:

Don't impose on me what you know,
I want to explore the unknown
And be the source of my own discoveries.
Let the known be my liberation, not my slavery.
The world of your truth can be my limitation;
Your wisdom my negation.
Don't instruct me; let's walk together.
Let my richness begin where yours ends.
Show me so that I can stand
On your shoulders.
Reveal yourself so that I can be
Something different.
You believe that every human being
Can love and create.
I understand, then, your fear
When I asked you to live according to your wisdom.
You will not know who I am
By listening to yourself.
Don't instruct me; let me be.
Your failure is that I be identical to you.

Maturana’s poem speaks of the importance of exploration and conversation in learning – what he calls “walking together”. Taken literally, conversation is actually “dancing together” because the Latin “con-versare” means “to turn together”. I find this a useful starting point for thinking through the confusing categories by which we distinguish online activities and artefacts from face-to-face engagements. Anyone who has danced with anyone else knows that it doesn’t work by one person imposing something on the other. It does require “leading”, but the leader of the dance engages in a kind of steering which takes into account the dynamics of the whole situation including themselves and their partner.

What happens in this steering process is also revealed in “conversation”: it is a negotiation of constraints – “this is how we can move”, “this is how I am able to move”, and so on. Like dancing, conversation is not about imposition. In a conversation, participants reveal their understanding and their uncertainty through the many utterances that they make. Those utterances are multiple attempts to describe something which lies beyond description. But taken together, something is revealed, and if it works, like the dancer and their partner, each person becomes a different version of the same thing – rather like a counter-melody to a familiar tune. A richer reality emerges through the counterpoint of multiple descriptions.  

This understanding of conversation is the antithesis of the increasingly transactional way in which the education system seems to view the “delivery” of education. This is not merely an exchange of words, essays, text messages, tweets, or blog posts. It is a coordination. Or perhaps more deeply (to borrow some terminology from Maturana) it is a “coordination of coordinations”.

When I try to explain this, I sometimes use some software called “Friture” which graphs the spectrum of sound in real-time. You can download the software here (http://friture.org). I ask people to sing a single note (or at least try) and capture it on the computer. The resulting spectrum shows a set of parallel lines representing the many frequencies which combine in making the single sound. Conversation is like this, I say.

Indeed, we can explore things further with sound. If you sing the note while gradually changing the shape of your mouth to make the different vowel sounds, the number of lines decreases and increases. The narrow "e" sounds are rather like a tinny transistor radio. The fuller "ah" sounds are more "hi-fi" and rich. So the more simultaneous versions of the same thing, the more "real" it feels. Try it!
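For the curious, what Friture shows live can be sketched offline in a few lines of NumPy. This is my own illustration, not Friture's code: the harmonic amplitudes are invented to mimic a narrow "e" and a full "ah" sung on the same fundamental.

```python
import numpy as np

# A sung note is one fundamental plus many harmonics sounding at once.
# Synthesise two "vowels" on the same 220 Hz fundamental - a narrow
# "e"-like tone with few strong harmonics and a fuller "ah"-like tone
# with many - and count the prominent spectral lines in each.

SR = 44100              # sample rate (Hz)
F0 = 220.0              # fundamental frequency (Hz)
t = np.arange(SR) / SR  # one second of samples

def tone(harmonic_amps):
    """Sum of harmonics of F0 with the given amplitudes."""
    return sum(a * np.sin(2 * np.pi * F0 * (k + 1) * t)
               for k, a in enumerate(harmonic_amps))

def count_lines(signal, threshold=0.05):
    """Count spectral peaks above a fraction of the strongest line."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / SR)
    # sample the spectrum at the first 20 harmonics of F0
    peaks = [spectrum[np.argmin(np.abs(freqs - F0 * (k + 1)))]
             for k in range(20)]
    return sum(p > threshold * max(peaks) for p in peaks)

narrow_e = tone([1.0, 0.3, 0.02])                           # few harmonics: "tinny"
full_ah = tone([1.0, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2])    # many harmonics: "rich"

print(count_lines(narrow_e))  # → 2
print(count_lines(full_ah))   # → 8
```

The fuller tone lights up more parallel lines: the same note, more simultaneous versions of itself.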

The message is that our grasp on reality and the effectiveness of our social coordination require the coordination of diverse voices – and that is what conversation is about. The physicist David Bohm, who made the connection between a view of quantum mechanics and scientific dialogue, explained it more elegantly in a YouTube clip, "David Bohm on perception". And there is a political message: the richness in our understanding of reality entails the conditions of a free society which embraces diversity and creates the conditions for conversation: that is, one that doesn't impose one particular description of the world on everyone else. That merely produces the tinniest of transistor radios!

Technically, in the world of information theory, multiple descriptions of the same thing are termed “redundancy”, which is another word for “pattern”. This is useful when we try to make sense of the relationship between the conversations that we have face-to-face, and the phenomena that we experience online. 
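To make "redundancy" concrete: in Shannon's terms it is the gap between the entropy a message actually carries and the maximum its alphabet could carry, R = 1 - H/H_max. A toy calculation (my own sketch, nothing more) shows how repetition - saying the same thing many times - raises it:

```python
import math
from collections import Counter

def redundancy(message):
    """Shannon redundancy R = 1 - H / H_max of a symbol sequence."""
    counts = Counter(message)
    n = len(message)
    # observed per-symbol entropy
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    # maximum entropy for this alphabet (uniform symbol use)
    h_max = math.log2(len(counts))
    return 1 - h / h_max if h_max > 0 else 1.0

print(redundancy("abcdefgh"))  # every symbol once: no pattern, → 0.0
print(redundancy("aaaaaaab"))  # heavy repetition: ≈ 0.456
```

The more a message repeats itself, the more pattern and the less surprise it holds - which is precisely what makes repetition useful for making sense of things.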

The Internet’s Multiplicity

The internet has vastly expanded the multiplicity of descriptions of the world. Does the internet dance in much the same way that our face-to-face conversation dances? I think it does, but to understand how it does, we have to look a bit more deeply at the kinds of multiplicity involved in all conversation. 

The internet produces its multiplicities differently. Face-to-face conversation is composed of gestures, words, phonemes, and prosody – we wave our arms, use our eyes, change the pitch of our voice, and often repeat ourselves. The repetition we might think of as a "diachronic" (over time) redundancy; the arm-waving, voice pitch, gestural stuff is "synchronic" (simultaneous). Returning to the sound spectrum analyzer, the parallel lines identify the synchronic aspects, while if we were to sing a melody, the changing pattern over time represents the diachronic dimension.

So what if the balance between synchronic redundancy and diachronic redundancy can be shifted around? What if parts of what is synchronic, become diachronic? Isn’t that what happens on the internet? 

Our snippets of text, video, emails, game plays, hyperlinks, blogs, timelines, likes, shares and status updates do not happen at the same time. While some of them (like video) contain rich synchronic aspects similar to face-to-face engagement, and text itself is a remarkably rich synchronic medium (without which poetry wouldn’t exist!), much of the multiplicity (or the redundancy) occurs diachronically as well as synchronically. The timeline matters; the concern for a particular individual’s understanding matters. And we may never meet somebody face-to-face, but following them on Twitter might mean that we get to know them as if we had, and perhaps a bit better.

Why don’t your Zoom lectures Dance?

So why, when it comes to education online, does so much seem deathly? Why doesn't your Zoom lecture dance? If the internet dances, why can't education join in?

To answer this, we have to examine education’s constraints. And here we meet the very things that Maturana was railing against. Why is it so dreadful? Because the basic function of the system is to impose on students what is already known, examine them and certificate them. It instructs, reproduces and fails (at least, in Maturana’s terms): more goose-step than dance. 

There are of course reasons why this is so. After all, how would a meaningful assessment system operate if learners were allowed to be different from one another or do completely different kinds of activities? Well-intentioned though ideologies like "constructive alignment" are, inevitably they get used to hammer abstract "learning outcomes" into students in the same way that we hammer in facts. In short, online learning is crap not because of technology, but because of the constraints of the institution. But our institutions took their form in a world where our available tools were limited, meaning that this was the most effective way to organise education at the time. If we started from scratch today, with the tools that we now have, might our institutions look and behave very differently?

We have a vestigial education system which increasingly insists on a transactional “delivery of learning” and its measurement. Shifting this ideology online brings the added disadvantage that the internet does not afford the same synchronic richness of face-to-face situations (which at least mitigate the pain of instruction), while the education system cannot adapt to the internet’s rich diachronic mode of operation.

Dancing on Stilts

But it was always obvious to the pioneers of technology in education that learning with technology was a different kind of dance. One would end up dancing on stilts or trying to play Mozart wearing mittens if one insisted on reproducing established ways of institutional education online. 

In a very revealing passage explaining his core idea of “teach back” (where a teacher would ask a learner to teach back what they had learnt), Gordon Pask noted something fundamental in the patterns of teaching and learning processes that chimes with Maturana’s poem:

“The crucial point is that the student’s explanation and the teacher’s explanation need not be, and usually are not, identical. The student invents an explanation of his own and justifies it by an explanation of how he arrived at it” (Pask 1975)

What Pask argued was that it was the redundancy of the interactions that mattered in the dance. 

Technology and Institutional Structure

Now we have amazing technology, this redundancy can come in many forms and many different kinds of media. Videos, blogs, social media interactions, and so on. And yet within the context of formal education, we rarely harness this diversity because it presents organisational problems in the assessment and management of the formal processes of education. The root cause of why we dance on stilts lies in the structures of education, not in any particular pedagogy or “ed-tech”. 

This is the paradox of the current state of the uses and abuses of technology in education. The need for technical innovation in education lies in the use of technology to reform the management and structures of education which constrain teachers and learners to such an extent that it makes online education unbearable. The actuality of “ed-tech” innovation in education lies in corporations feeding on the obvious inadequacies of online learning, looking for a chunk of the enormous sums of money going into education, and pitching for minor improvements to fundamentally broken processes, while often burdening institutions with increasingly complex technical infrastructure and expensive subscriptions. 

There is hope. The internet really does dance, and the students are getting increasingly good at using it (and indeed, some teachers!). Educational institutions are like a Soviet-style old-guard in a rock-and-roll world. Technology is the lubricant that will eventually free everything up – but the old guard will be slow to shift. 

We have to decide what our educational institutions are for. Are they there to “deliver learning” and make profits and pay vice-chancellors obscene salaries? Or are they there for creating the contexts for new conversations? It is a fork in the road. One way lies a future of ed-tech which feeds on the inadequacies of the existing system like a parasite. The other lies a future of technology being used to transform the self-steering both of institutions and individuals in their dances with each other and society.

Thursday 29 April 2021

Real "Digital" vs Education's idea of "digital": Some reflections on computational thinking

Digitalization is (once again) the hot topic in education. Amid concern that students leave university without digital skill, educational policy is focusing on "instilling digital skill" from primary school upwards. In Europe and the US, this is labelled "computational thinking", and is closely related to the (rapidly failing) drive to push computer science in schools.

Rather like the STEM agenda, to which it is of course related, there is a difference between education's idea of "digital" and the real world of "digital" which is happening in institutions and companies, for which skills are definitely needed.

What is the real world of digital? Perhaps the first thing to say is that there is no single "real world". There are dispositions which are shared by technical people working in a variety of different environments. And there are vast differences between the kinds of environments and the levels of skill involved. For example, Python programming to analyse data is one thing, using tools like Tableau is another. There are the hard-core software development skills involved in enterprise system development with various frameworks (I'm currently banging my head against Eclipse, Liferay and Docker), and then there are those areas of skill which relate to the sexier things in technology which grab headlines and make policymakers worry that there is a skills gap - AI particularly.

So what do governments and policy makers really mean when they urge everyone towards "digitalization"? After all, engineering is pretty important in the world, but we don't insist on everyone learning engineering. So why computational thinking? 

Part of the answer lies in the simple fact of the number of areas of work where "digital" dominates. The thinking is that "digital skill" is like "reading" - a form of literacy. But is digital skill like reading and writing? Reading, after all, isn't merely a function which enables people to work. It is embedded in culture as a source of pleasure, conviviality, and conversation. By contrast "digital skill" is very pale and overtly functionalist in a way that reading and writing aren't.

The functionalism that sits behind computational thinking seems particularly hollow. These are, after all, digital skills to enable people to work. But to work where? Yes, there is a need for technically skilled people in organisations - but how many? How many software developers do we need? How many data analysts? Not a huge number compared to the number of people overall, I would guess. So what does everyone else do? They click on buttons in tracker apps that monitor their work movements, they comply with surveillance requests, they complete mindless compulsory "training" so that their employers don't get sued, they sit on Zoom, they submit ongoing logs of their activities on computers in their moments of rest, they post inane comments on social media and they end up emptied and dehumanized - the pushers of endless online transactions. Not exactly a sales pitch. Personally, I would be left wishing I'd done the engineering course!

A more fundamental problem is that most organisations have more technically-skilled people than they either know about, or choose to use effectively. This is a more serious and structural problem. It is because people who are really good at "digital" (whatever that means) are creative. And the last thing many organisations (or many senior managers in organisations) want is creativity. They want compliance, not creativity. They want someone who doesn't show them up as being less technically skilled. And often they act to suppress creativity and those with skills, giving them tasks that are well beneath their abilities. I don't think there's a single organisation anywhere where some of this isn't going on. Real digital skill is a threat to hierarchies, and hierarchies kick back.  

Educational agendas like computational thinking are metasystemic interventions. Other metasystemic interventions are things like quality controls and standards, curricula, monitoring and approved technical systems. The point of a metasystemic intervention is to manage the uncertainty of the system. Every system has uncertainty because every system draws a distinction between itself and the environment - and there is always a question as to where that boundary should be drawn, and how it can be maintained. The computational thinking agenda is an attempt to maintain an already-existing boundary.

Our deep problem is that the boundary between institutions, companies and other social activities and their environments has been upheld and reinforced with the increasing use of technology. Technology in those environments is the very reason the institutional boundaries had to be reinforced with technology in the first place: it is in the fabric of the machine that maintains our existing institutions, which have used it to avoid being reconstructed. To maintain their traditional boundaries, institutions must be able to maintain their technologies. Therefore they need everyone to comply with and operate those technologies, and a few to enhance them. But how does this not end in over-specialisation and slavery? How does it create rewarding work and nurture creativity?

No education system and no teacher should be in the business of preparing people for servitude. So what's to be done?

The question is certainly not about digital "literacy". It is about emancipation, individuation and conviviality in a technological environment. Our technologies are important here - particularly (I think) AI and quantum computing. But they are important because they can help us redesign our institutions, and in the process discover ourselves. That, I suspect, is not what the policy makers want because ultimately it will threaten their position. But it is what needs to happen. 

Tuesday 27 April 2021

Spaceship Earth's Education System

Buckminster Fuller's account of "specialisation" in "An Operating Manual for Spaceship Earth" is fascinating me because he sets up an opposition between those who anticipate and those who can't, between those who think in systems terms and those who "specialise". Against the majority of the specialised land-dwelling people of the pre-20th century planet, who saw only a fraction of the earth, believed the world was flat and "thought its horizontally extended plane went circularly outward to infinity", Buckminster Fuller sets "the Pirates", who sailed the seas and

had high proficiency in dealing with celestial navigation, the storms, the sea, the men, the ship, economics, biology, geography, history, and science. The wider and more long distanced their anticipatory strategy, the more successful they became.

Anticipation counters specialism. "Leonardo da Vinci is the outstanding example of the comprehensively anticipatory design scientist." And then the Great Pirates who:

came to building steel steamships and blast furnaces and railroad tracks to handle the logistics, the Leonardos appeared momentarily again in such men as Telford who built the railroads, tunnels, and bridges of England, as well as the first great steamship. 

But this leads to imperialism. Fuller says imperialism was a new form of specialism in which the "Leonardos" were put to work by "sword-bearing patrons".

You may say, "Aren’t you talking about the British Empire?" I answer, No. The so-called British Empire was a manifest of the world-around misconception of who ran things and a disclosure of the popular ignorance of the Great Pirates’ absolute world-controlling through their local-stooge sovereigns and their prime ministers, as only innocuously and locally modified here and there by the separate sovereignties’ internal democratic processes. As we soon shall see, the British Isles lying off the coast of Europe constituted in effect a fleet of unsinkable ships and naval bases commanding all the great harbours of Europe. Those islands were the possession of the topmost Pirates. Since the Great Pirates were building, maintaining, supplying their ships on those islands, they also logically made up their crews out of the native islanders who were simply seized or commanded aboard by imperial edict. Seeing these British Islanders aboard the top pirate ships the people around the world mistakenly assumed that the world conquest by the Great Pirates was a conquest by the will, ambition, and organization of the British people. Thus was the G. P.’s grand deception victorious. But the people of those islands never had the ambition to go out and conquer the world. As a people they were manipulated by the top pirates and learned to cheer as they were told of their nation’s world prowess.

And from there we have the beginning of schools:

And this is the way schools began as the royal tutorial schools. You realize, I hope, that I am not being facetious. That is it. This is the beginning of schools and colleges and the beginning of intellectual specialization. Of course, it took great wealth to start schools, to have great teachers, and to house, clothe, feed, and cultivate both teachers and students. Only the Great Pirate-protected robber-barons and the Pirate-protected and secret intelligence-exploited international religious organizations could afford such scholarship investment.

And the warning that we all know about: "But specialization is in fact only a fancy form of slavery wherein the "expert" is fooled into accepting his slavery by making him feel that in return he is in a socially and culturally preferred, ergo, highly secure, lifelong position".

Now the slavery of specialization is completely obvious to all who work for universities.

And we have become very much like the land-bound foolish specialists, duped by misconceptions of the powers of the mind by the trappings of grandeur of university life. We believe the horizon to be infinitely extended as our publications, impact, salaries, status (for some, at least) and citations increase - and we believe all of this is what matters. And we are caught in the gears of a machine of our own construction that is shredding everything of value that once existed in those institutions.

Worse still, our anticipatory powers are fading even as we herald a new era of anticipatory technology. Just when we should be asking of the next wave of technology where the boundary lies between human anticipation and machine learning, or quantum computing, we seem destined instead to replace human anticipation altogether with machines. This is a new wave of specialisation. To put it mildly: it won't work. To put it more strongly: "Extinction is always the result of over-specialisation".

In formulating a new and positive vision, Buckminster Fuller argues that we need new ways of looking at our resources for organising. This he calls "wealth":

"Wealth is our organized capability to cope effectively with the environment in sustaining our healthy regeneration and decreasing both the physical and metaphysical restrictions of the forward days of our lives."

Now, where does that organized capability come from? It must come from communication. Buckminster Fuller always drew a distinction between building by "compression" and building by "tension". This, I think, is where our new anticipatory technologies might be very powerful.

Our ability to communicate depends on anticipation: I cannot write these words if I do not have some idea of how they are likely to be read and understood. But I am communicating to a complex audience, and I would like to try some experiments, see how saying different kinds of things might "play out", and then choose the best form of utterance I can to achieve what I want to achieve. 

This is partly why students go to university - to try things out, to see how things might play out. But universities are increasingly bad at doing anticipation - partly because they've become divorced from their history - and anticipation without history cannot be any good. In place of real anticipation, we have empty promises, and a lot of young people with degrees who can't get jobs. 

A network of communication - a network of friends - is a complex system of inter-acting anticipations - what Husserl called "horizons of meaning". It holds its structure because people understand each other. It is very much like Bucky's icosahedron. That is a structure built from tension, not compression.

The early e-learning pioneers had hoped that the internet itself would create these forms of communicative tension. But what happened was that the networks quickly became new "empires" - indeed a new communicative "life form" which consumed human identities and desires. Rather like the British Empire. 

But the network is only half the story. We have had to wait for 30 years for the missing ingredient - the anticipatory technology. We're very close to having this now. It will soon be a fact of everyday life. Anticipation is the thing which tightens the strings, and gives new structures solidity. While we might one day celebrate the extra flexibility - the extra "wealth" as Buckminster Fuller puts it - that this brings, we will surely find that this is only a necessary adaptation for the survival of humankind.