Monday, 15 April 2019

Kaggle and the Future University: Learning. Machine. Learning.

One of the most interesting things that @gsiemens pointed out the other day in his rant about MOOCs was that people learning machine learning had taught themselves by downloading datasets from Kaggle (http://kaggle.com) and using the now abundant code examples for manipulating and processing these datasets with the python machine learning libraries, which are also all on GitHub, including tensorflow and keras. Kaggle itself is a site for machine learning competitions, for which it gathers huge datasets on which people try out their algorithms. There are now datasets for almost everything, and the focus of my own work, diabetic retinopathy, has a huge amount of material (albeit much of it not of great quality). There is an emerging standard toolkit for AI: something like Anaconda with a Jupyter notebook (or maybe PyCharm), and code which imports tensorflow, keras, numpy, pandas, etc. It's become almost as ubiquitous as setting up database connectors to SQL and firing queries (and is really the logical development of that).
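To make that concrete, here is a minimal sketch of the kind of workflow people teach themselves — using only pandas and numpy. The column names and data are invented stand-ins for a downloaded Kaggle CSV, and a simple nearest-centroid baseline stands in for the deep learning model one would really build with keras:

```python
# A minimal sketch of the ubiquitous Kaggle-style workflow:
# load a dataset with pandas, split features from labels, fit a baseline.
# (The data here is an invented stand-in for pd.read_csv("train.csv")
# on a real downloaded dataset; a real project would add keras on top.)
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "feature_a": [0.1, 0.4, 0.35, 0.8, 0.9, 0.05],
    "feature_b": [1.0, 0.9, 0.8, 0.2, 0.1, 0.95],
    "label":     [0,   0,   0,   1,   1,   0],
})

X = df[["feature_a", "feature_b"]].to_numpy()
y = df["label"].to_numpy()

# Nearest-centroid baseline: predict the class whose mean point is closest
centroids = {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(row):
    return min(centroids, key=lambda c: np.linalg.norm(row - centroids[c]))

accuracy = np.mean([predict(r) == t for r, t in zip(X, y)])
```

The point is less the model than the ritual: import, load, split, fit — the same shape whatever the dataset happens to be about.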

Whatever we might think of machine learning with regard to any possibility of Artificial Intelligence, there's clearly something going on here which is exciting, increasingly ubiquitous, and perceived to be important in society. I deeply dislike some aspects of AI - particularly its hunger for data which has driven a surveillance-based approach to analysis - but at the same time, there is something fascinating and increasingly accessible about this stuff. There is also something very interesting in the way that people are teaching themselves about it. And there is the fact that nobody really knows how it works - which is tantalising.

It's also transdisciplinary. Through Kaggle's datasets, we might become knowledgeable in Blockchain, Los Angeles's car parking, wine, malaria, urban sounds, or diabetic retinopathy. The datasets and the tools for exploring them are foci of attention: codified ways in which diverse phenomena might be perceived and studied through a coherent set of tools. It may matter less that those tools are not completely successful in producing results - but they do something interesting which provides us with alternative descriptions of whatever it is we are interested in.


What's missing from this is the didacticism of the expert. What instead we have are algorithms which for the most part are publicly available, and the datasets themselves, and a question - "this is interesting... what can we make of it?"

We learn a lot from examining the code of other people. It contains not just a set of logic, but expresses a way of thinking and a way of organising. When that way of thinking and way of organising is applied to a dataset, it also expresses a way of ordering phenomena.

Through my diabetic retinopathy project, I have wondered whether human expertise is ordinal. After all, what do we get from a teacher? If we meet someone interesting, it's tempting to present them with various phenomena and ask them "What do you think about this?". And they might say "I like that", or "That's terrible!". If we like them, we will try to tune our own judgements to mirror theirs. The vicarious modelling of learning seems to be something like an ordinal process. And in universities, we depend on expertise being ordinal - how else could assessment processes run if experts did not order their judgements about student work in similar ways?

The problem with experts is that when expertise becomes embodied in an individual it becomes scarce, so universities have to restrict access to it. Moreover, because universities have to ensure they are consistent in their own judgement-making, they do not fully trust individual judgement, but organise massive bureaucracies on top of it: quality processes, exam boards, etc.

Learning machine learning removes the embodiment of the expertise, leaving the order behind. And it seems that a lot can be gained from engaging with the ordinality of judgements on their own. That seems very important for the future of education.

I'm not saying that education isn't fundamentally about conversation and intersubjective engagement. It is - face-to-face talk is the most effective way we can coordinate our uncertainty about the world. But the context within which the talking takes place is changing. Distributing the ordinality of expert judgements creates a context where talk about those judgements can happen between peers in various everyday ways rather than simply focusing on the scarce relation between the expert and the learner. In a way, it's a natural development from the talking-head video (and it's interesting to reflect that we haven't advanced beyond that!). 

Reality

Every improvisation I am making at the moment is dominated by an idea about the nature of reality as a hologram, or fractal. So the world isn't really as we see it: it's our cells that make us perceive it like that, and it's our cells that make us perceive a "me" as a thing that sees the world in this way.

This was brought home to me even more after a visit to the Whitworth gallery's wonderful exhibition of ancient Andean textiles. They were similar to the one below (from Wikipedia)



It's the date which astonishes: sometime around 200CE. Did reality look like this to them? I wonder if it might have done.

This music is kind-of in one key. It's basically just a series of textures and slides (which are meant to sound like traffic) that embellish a fundamental sound. I like to think that each of these textures overlays some fundamental pattern with related patterns at different levels. The point is that all these accretions of pattern produce a coherence through producing a fractal.

Saturday, 13 April 2019

Comparative Judgement, Personal Constructs and Perceptual Control

The idea that human behaviour is an epiphenomenon of the control of perception is associated with Bill Powers' "Perceptual Control Theory", which dates back to the 1950s. Rather than human consciousness and behaviour being "exceptional", individual, etc., they are seen as the aggregated result of the interactions of a number of subsystems, of which the most fundamental is the behaviour of the cell. So if our cells are organising themselves according to the ambiguity of their environment (as John Torday argues), and in so doing are "behaving" so as to maintain homeostasis with their environment by producing information (or negentropy) and reacting to chemiosmotic changes, then consciousness and behaviour (alongside growth and form) are the epiphenomenal result.

So when we look at behaviour and learning, and look back towards this underlying mechanism, what do we see? Fundamentally, we see individuals creating constructs: labels with which individuals deal with the ambiguity and uncertainty of the world. But what if the purpose of the creation of constructs is analogous to the purpose of the cell: to maintain homeostasis by producing negentropy and reacting to chemiosmosis (or perhaps noise in the environment)?

We can test this. Presenting individuals with pairs of different stimuli and asking them which they prefer and why is something that comparative judgement software can do. It's actually similar to the rep-grid analysis of George Kelly, but rather than using 3 elements, 2 will do. Each pair of randomly chosen stimuli (say bits of text about topics in science or art), are effectively ways of stirring-up the uncertainty of the environment. This uncertainty then challenges the perceptual system of the person to react. The "construct", or the reason for one choice or another, is the person's response to this ambiguity.
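A minimal sketch of what such comparative judgement software might do — the stimuli, the judging rule, and the recorded "constructs" are all invented stand-ins for real texts and a real human judge:

```python
# Sketch of a comparative judgement session: present pairs of stimuli,
# record which is preferred and the stated reason (the "construct").
# The stimuli and the deterministic judge() are invented for illustration;
# in a real system the judgement comes from a person.
import itertools

stimuli = ["text on entropy", "text on cubism",
           "text on malaria", "text on blockchain"]

def judge(a, b):
    """Stand-in for a human judgement: return the preferred item and a
    free-text construct. Here an arbitrary deterministic rule
    (alphabetical order) stands in for taste."""
    winner = min(a, b)
    return winner, f"prefers '{winner}'"

# Present every pair (a real system would sample pairs adaptively)
wins = {s: 0 for s in stimuli}
constructs = []
for a, b in itertools.combinations(stimuli, 2):
    winner, construct = judge(a, b)
    wins[winner] += 1
    constructs.append(construct)

# The win counts induce an ordering - the ordinality of the judgements
ranking = sorted(stimuli, key=wins.get, reverse=True)
```

What matters here is that the pairs elicit both an ordering (the ranking) and a growing pile of constructs — the person's responses to the ambiguity each pairing stirs up.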

The interesting thing is that as different pairs are used, so the constructs change. Moreover, the topology of what is preferred to what also gradually reveals contradictions in the production of constructs. This is a bit like Powers' hierarchies of subsystems, each of which is trying to maintain its control of perception. So at a basic level, something is going on in my cells, but as a result of that cellular activity, a higher-level system is attempting to negotiate the contradictions emerging from that lower system. And then there is another higher-level system which is reacting to that system. We have layers of recursive transduction.
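Those contradictions can be made concrete: if each choice is an edge in a directed graph, an intransitive cycle (A preferred to B, B to C, C to A) is exactly such a contradiction. A sketch, with invented preferences:

```python
# Detect intransitive cycles (contradictions) in pairwise preferences.
# Each tuple means (winner, loser); the data is invented for illustration.
preferences = [("A", "B"), ("B", "C"), ("C", "A"),  # an intransitive cycle
               ("A", "D")]

def has_cycle(edges):
    """Depth-first search for a cycle in the directed preference graph."""
    graph = {}
    for winner, loser in edges:
        graph.setdefault(winner, []).append(loser)
    visiting, done = set(), set()

    def visit(node):
        if node in visiting:
            return True          # back-edge: we have looped round
        if node in done:
            return False
        visiting.add(node)
        for nxt in graph.get(node, []):
            if visit(nxt):
                return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(visit(n) for n in graph)
```

A cycle is the signature of a lower level in conflict with itself — the thing a higher level of the hierarchy would have to resolve.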

It's interesting to reflect on the logic of this and compare it to our online experience. Our experience of Facebook and the media in general is confusing and disabling precisely because the layers of recursive transduction are collapsed into one. Complexity requires high levels of recursion to manage it, and most importantly, it requires the maintenance of the boundaries between one layer of recursion and another. From this comes coherence. Without this, we find ourselves caught in double-binds, where one layer is in conflict with another, with no capacity to resolve the conflict at a new level of recursion.

If we want to break the stranglehold of the media on our minds, we need new tools for bringing coherence to our experiences. I wonder whether, if we had these tools, self-organised learning without institutional control would become a much more achievable objective.

Tuesday, 9 April 2019

The OER, MOOC and Disruption Confusion: Some thoughts about @gsiemens claim about MOOCs and Universities

George Siemens made a strong claim yesterday that "Universities who didn't dive into the MOOC craze are screwed". He justified this by acknowledging that although the MOOC experiment has not so far been entirely successful, operating at scale in the online environment is the most important thing for universities. The evidence he points to is that many machine learning/big data experts taught themselves through online resources. Personally, I can believe this is true. George's view prompted various responses among the many who are generally hostile to the "disruption" metaphor of technology in education (particularly MOOCs), but the most interesting responses suggested that the real impact of the MOOC was on OER, and that open resources were the most important thing.

I find the whole discussion around disruption, MOOCs and OER very confusing. It makes me think that these are not the right questions to be asking. They all seem to view whatever activities which happen online between individuals and content through the lens of what happens in traditional education:
Disruption = "hey kids, school's closed. Let's have a lesson in the park!";
MOOCs = "Hey kids, we're going to study with 6 million other students today!";
OER = "Hey kids, look a free textbook!". 
The web is different in ways which we haven't fathomed yet. It's obvious now that this difference is not really being felt directly in education: as I have said in my book, and Steve Watson said the other day in Liverpool (see https://www.eventbrite.co.uk/e/cybernetics-and-de-risking-the-universities-talkdiscussion-tickets-59314680807#), education is largely using technology to maintain its existing structures and practices. But the difference is being felt in the workplace, in casualisation, among screen-addicted teens, and in increasingly automated industries which would once have provided those teens with employment.

The web provides multiple new options for doing things we could do before. The free textbook is a co-existing alternative to the non-free textbook; the MOOC is a not-too-satisfying but co-existing alternative to expensive face-to-face education. What we have seen is an explosion of choice, and an accompanying explosion of uncertainty as we attempt to deal with the choice. Our institutions and technology corporations have both been affected by the increase in uncertainty.

What we are now discovering in the way we use our electronic devices provides a glimpse into how our consciousness deals with uncertainty and multiplicity. On the surface, it doesn't look hopeful. We appear to be caught in loops of endless scrolling, swiping and distraction. But what we do not see is that this pathological behaviour is the product of a profit-driven model which demands that tech companies increase the number of transactions that users have with their tools: their share-prices move with those numbers. Every new aspect of coolness, from Snapchat image filters to Dubsmash silliness and VR immersive environments, serves to increase the data flows. Our tech environment has become toxic, resulting in endless confusion and double-binds. But we are told a lie: that technology does this. It doesn't. Corporations do this, because this is the way you make money in tech - by confusing people. Unfortunately, it is also the way universities are increasingly operating. Driven by financial motives, they have become predatory institutions. Deep down, everything has become like this because turning things into money is a strategy for dealing with uncertainty.

All human development involves bringing coherence to things. It is, fundamentally, a sense-making operation. Coherence takes a multiplicity of things and orders them in a deeper pattern. Newman put it well:
"The intellect of man [...] energizes as well as his eye or ear, and perceives in sights and sounds something beyond them. It seizes and unites what the senses present to it; it grasps and forms what need not have been seen or heard except in its constituent parts. It discerns in lines and colours, or in tones, what is beautiful and what is not. It gives them a meaning, and invests them with an idea. It gathers up a succession of notes into the expression of a whole, and calls it a melody; it has a keen sensibility towards angles and curves, lights and shadows, tints and contours. It distinguishes between rule and exception, between accident and design. It assigns phenomena to a general law, qualities to a subject, acts to a principle, and effects to a cause." 
This is what consciousness really does. What Newman doesn't say is that the means by which this happens is conversation. And this is where the web we have falls down. It instead acts as what Stafford Beer called an "entropy pump" - sowing confusion. The deeper reasons for this lie in fundamental differences between online and face-to-face conversation, which we are only beginning to understand. But we will understand them better in time.

I find myself agreeing with Siemens. I do not think that the traditional structures of higher education will survive a massive increase in technology-driven uncertainty. In the end, it will have to change into something more flexible: we will dispense with rigid curricula and batch-processing of students. Maybe the MOOC experiment has encouraged some to think the unthinkable about institutional organisation. Maybe.

A university, like any organism, has to survive in its environment. Universities are rather like cells, and like cells, they evolve by absorbing aspects of the environment within their own structures (mitochondria were once independently existing organisms). In biology this is endosymbiosis. That is how to survive - to embrace and absorb. Technology is also endosymbiotic in the sense that it has embraced almost every aspect of life. It feels like we are in something of a stand-off between technology and the university, where the university is threatened and as a result is putting up barriers, reinforced by "market forces". This is also where our current pathologies of social media are coming from. Adaptation will not come from this.

Creating and coordinating free interventions in the environment is at least a way of understanding the environment better. Personally, I think grass-roots things like @raggeduniversity also are important. MOOCs were an awkward way of doing this. But the next wave of technology will do it better, and eventually I think they will create the conditions whereby human consciousness can create coherence from conversations within the context of uncertainty in the challenging world of AI and automation that it finds itself in. 

Sunday, 7 April 2019

Natural information processing and Digital information processing in Education


In my book, I said:

"Educational institutions and their technology are important because they sit at the crossroads between the ‘natural’ computation which is inherent in conversation and our search for knowledge, and the technocratic approach to hierarchy which typifies all institutions."

I don’t mean to say that education institutions have to be hierarchical – but they clearly are. Nor do I mean to say that they have to be technocratic – but, increasingly and inevitably, they clearly are. It’s more about a distinction between the kind of “computing” that goes on in technocratic hierarchies, and the kind of “computing” that goes on in individuals as they have conversations with one another. And education seems to have to negotiate these two kinds of computing.

Without conversations, education is nothing. Without organisation, coherent conversation is almost impossible.

It’s as if one form of information – the information of the computer, of binary choice, or logistical control – has to complement the information of nature, organic growth and emotional flux. When the balance is right, things work. At the moment, the technocratic idea of information and its technologies dominate, squeezing out the space for conversation. And that’s why we are in trouble.
We know how our silicon machines work (although we may be a bit confused by machine learning!), but we don’t know how “natural” computing works. But we have some insights.

Natural computing seems to work on the basis of pattern – or, in information theoretical terms, redundancy. Only through the production of pattern do things acquire coherence in their structure. And without coherence, nothing makes sense: “can you say that again?”… We do this all the time as teachers – we make redundancy in our learning conversations.

Silicon, digital information conveys messages in the form of bits, and while redundancy is a necessary part of that communication process, it is the "background" to the message. It is, in the simplest way, the "0" to the information's "1".
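This relation between pattern and message can be quantified: in Shannon's terms, redundancy is one minus the ratio of a message's actual entropy to the maximum its alphabet allows. A minimal sketch — using only symbol frequencies, which is the simplest possible measure and deliberately ignores sequential pattern:

```python
# Shannon entropy and redundancy of a message, over its own alphabet.
# The example messages are invented for illustration.
import math
from collections import Counter

def entropy(message):
    """Shannon entropy in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def redundancy(message):
    """1 - H/Hmax: the proportion of the message that is pattern."""
    h_max = math.log2(len(set(message)))
    return 1 - entropy(message) / h_max if h_max > 0 else 1.0

patterned = "aaaaaaaaab"   # heavily repeated symbol: high redundancy
varied    = "abcdefghij"   # every symbol distinct: no redundancy
```

The patterned message is mostly "background" in Shannon's sense; the varied one is all surprise, with nothing for a listener to lean on.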

So is natural computing all about “0”? Is it “Much ado about nothing”? I find this an attractive idea, given that all natural systems are born from nothing, become something, and eventually decay back to nothing again. A sound comes from silence, wiggles around a bit, and then fades to silence again. All nature is like this. The diachronic structure of nature is about nothing.

Moreover, Newton's third law suggests that the universe taken as a totality must essentially be nothing too. There may have been a "big bang", but there was also an "equal and opposite reaction". Somewhere. And this is not to say anything about spiritual practices, which are almost always focused on nothingness.

When we learn and grow, we do so in the knowledge that one day we will die. But we do this in the understanding that others will live and die after us, and that how we convey experience from one generation to the next can help to keep a spirit of ongoing life alive.

Schroedinger’s “What is Life” considered that living systems exhibit negative entropy, producing order, and working against the forces of nature which produce entropy or decay. I think this picture needs refinement. Negative entropy can be “information” which in Shannon’s sense is measured in “bits” – or 1s and 0s. But negative entropy may also be an “order of nothings”. So life is an order of nothing from which an order of bits is an epiphenomenon?

Our “order of bits” has made it increasingly difficult to establish a coherent order of nothing. Our digital technologies have enforced an “order of bits” everywhere, not least in our educational institutions. But the relationship between digital information and natural information can be reorganised. Our digital information may help us to gain deeper insight into how natural information works.

To do this, we must turn our digital information systems on our natural information systems, to help us steer our natural information systems more effectively. But the key to getting this to work is to use our digital technologies to help us understand redundancy, not information.

Techniques like big data focus on information in terms of 1s and 0s: they take natural information systems and turn them into digital information. This is positive feedback, from which we seek an "answer" – a key piece of information with which we can then make a decision. But we are looking for the wrong thing in looking for an answer. We need instead to create coherence.

Our digital information may be turned to identify patterns in nature: the mutually occurring patterns at different levels of organisation. It can present these things to us not in the form of an answer, but in the form of a more focused question. Our job, then, is to generate more redundancy – to talk to each other, to do more analysis – to help to bring coherence to the patterns and questions which are presented to us. At some point, the articulation of redundancies will bring coherence to the whole living system.

I think this is what we really need our educational technology to do. It is not about learning maths, technology, science, or AI (although all those things may be the result of creating new redundancies). It is about creating ongoing coherence in our whole living system.

Wednesday, 27 March 2019

@NearFutureTeach Scenarios and Getting many brains to think as one

I went to the final presentation of the Near Future Teaching project at Edinburgh yesterday. I've been interested in what's happening at Edinburgh for a while because it looked to me like a good way of getting teachers and learners to talk together about teaching and to think about the future. As with many things like this, the process of doing this kind of project is all important - sometimes more important than the products.

I'm familiar with a scenario-based methodology because this is what we did on the large-scale iTEC project (see http://itec.eun.org/web/guest;jsessionid=491AB3788AEB8152821138272A35C5E4) which was coordinated by European Schoolnet. Near Future Teaching has followed a similar plan - identification of shared values, co-design of scenarios, technological prototyping/provoking (using what they neatly called "provo-types"). iTEC took its technological prototypes a bit more seriously, which - on reflection - I think was a mistake (I wrote about it here: https://jime.open.ac.uk/articles/10.5334/jime.398/).

During iTEC I wasn't sure about scenario-building as a methodology. It seemed either too speculative or not speculative enough: the future was imagined through the lenses with which we see the present. We're always surprised by the future, often because it involves getting a new set of lenses. I was talking to a friend at Manchester University on Monday about how theologians/religious people make the best futurologists: Ivan Illich, Marshall McLuhan, C.S. Lewis (his "Abolition of Man" is an important little book), Jacques Ellul. Maybe it's because the lens that allows you to believe in God is very different to the lens that looks at the world as it is - so these people are good at swapping lenses.

After Near Future Teaching, I'm a bit more enthusiastic about scenarios. I spoke to a primary school teacher who was involved in the project, and we discussed the fact that nobody is certain about the future. Uncertainty is the great leveller: teachers and learners are in the same boat, and this is a stimulus for conversation and creativity. It's not a dissimilar idea to this: https://dailyimprovisation.blogspot.com/2018/10/transforming-education-with-science-and.html

But then there is something deeper about this kind of process. Uncertainty is a disrupter to conventional ways of looking at the world. Each of us has a set of categories or constructs through which we view the world. Sometimes the barriers to conversation are those categories themselves, and making interventions which loosen the categories is a way of creating new kinds of conversation. Introducing "uncertain topics" does this.

In his work on organisational decision-making, Stafford Beer did a similar thing with his "syntegration" technique. That involved surfacing issues in a group, and then organising conversations which deliberately aimed to destabilise any preconceived ways of looking at the world. Beer aimed to create a "resonance" in the communications within the group as existing categories were surrendered and new ones formed in the context of conversation. The overall aim was to "get many brains to think as one brain". Given the disastrous processes of collective decision-making which we are currently witnessing, we need to get back to this!

Having said this, there's something about the whole process which IS teaching itself. That leads me to think that the process of Near Future Teaching is closely aligned to its subject. Maybe the scenarios can be dispensed with, almost certainly we have to rethink assessment, we have to rethink the curriculum and the institutional hierarchy, but the root of it all is conversation which disrupts existing ways of thinking and establishes coherence within a group.

If we had this in education, Brexit would just be a cautionary tale.


Sunday, 24 March 2019

Human Exceptionalism and Brexit Insanity

Why have we managed to tie ourselves in knots? It's (k)not just over Brexit. It's over everything - austerity, welfare, tax, university funding, climate change, the point of education...

Following on from my last post, a thought has been niggling me: is it because we think human consciousness is exceptional? Is our belief in the exceptionalism of consciousness in the human brain stopping us from seeing ourselves as part of something bigger? The problem is that as soon as we see ourselves as special, that our consciousness is somehow special, we consider that one person's consciousness is more special than another's. Then we hold on to our individual thoughts or "values" (they're a problem too) and see to it that the thoughts and values of one person must hold out against those of another. If consciousness is not in fact exceptional, might this mistaken belief be what creates the terrible mess?

If consciousness is not exceptional, what does it do? What is its operating principle?

In my book, Uncertain Education, I argued that "uncertainty" was the most useful category through which to view the education system. I think uncertainty is a good category through which to view an unexceptional consciousness too. Consciousness, I think, is a process which coordinates living things in managing uncertainty. It is a process which maintains coherence in nature.

This process can be seen in all lifeforms from cells to ants to humans. What we call thinking is an aggregate of similar processes among the myriad of cellular and functionally differentiated components from which we are made, and which constitute our environment. The brain is one aggregation of cells which performs this role. It is composed of cells managing their uncertainty, and the aggregate of their operation and co-operation is what we think is thinking. Really, there's a lot of calcium and ATP which is pumped around. That's the work our cells do as they manage their uncertainty.

The same process occurs at different levels. The thing is fractal in much the same way that Stafford Beer described his Viable System Model. But we know a lot more about cells now than Beer did.

But what is the practical utility of a cellular view of consciousness?

Understanding that cells are managing uncertainty is only the beginning. More important is to realise that organisms and their cells have developed ("evolved") by absorbing parts of their environment as they have managed their uncertainty over history. This absorption of the environment helps in the process of managing environmental uncertainty: uncertainty can only be managed if we understand the environment we are in. Importantly, though, each stage of adaptation entails a new level of accommodation with the environment: we move from one stable state to the next "higher" level. You might imagine a "table" of an increasingly sophisticated "alphabet" of cellular characteristics and capacities to survive in increasingly complex environments.

The cellular activity of "thinking", like all processes of adaptation, occurs in response to changes in the environment. It may be that an environment once conducive to higher-level "thought" becomes constrained in such a way that cells are forced back to an earlier, simpler state of organisation in order to remain viable. It's a kind of regression. It is the kind we see in intelligent people at the moment, paralysed by Brexit. In history, it is the thing that made good people do bad things in evil regimes. We become more primitive. Put a group of adults in a school classroom, and they will start to behave like children....!

Understanding this is important because we need to know how to go the other way - how to produce the conditions for increasing sophistication and richer adaptiveness. That is education's job. It is also the politician's job. But if we have a mistaken idea about consciousness, we are likely to believe that the way to increase adaptiveness is to do things which actually constrain it. This is austerity, and from there we descend back into the swamp.