Tuesday, 17 July 2018

How is music meaningful? An Information Theoretical approach to analyse the communication of meaning without reference

These are notes for a paper on music. It is not meant to be in any way coherent yet!

1 Introduction

Over the centuries in which music has been considered as a scientific phenomenon, focus has tended to fall on music’s relation to number and proportion, a subject of scholarly debate since antiquity. In scientific texts on music from Pythagoras to the Enlightenment, attention concentrated on those aspects of sound which are synchronic, of which the most significant is harmony and the proportional relation between tones. Emerging scientific awareness of the physical properties of sound led scientists towards a deeper analysis of the universe. This scientific inquiry exhibited feedback with music itself: deeper scientific understanding inspired by music, harmony and proportion led to advances in the manufacture of musical instruments and the consequent emergence of new stylistic norms.

Less attention has been paid to music’s diachronic structure, despite the fact that synchronic factors such as timbre and harmony appear strongly related to diachronic unfolding, and that the diachronic dimension of music exemplifies Rosen’s observation that “the reconciling of dynamic opposites is at the heart of the classical style”. In more recent years, music has played an increasing role in the understanding of psychosocial and biological domains. Recent work in the area of “Communicative Musicality” has focused on the diachronic unfolding of music alongside its synchronic structure, for it is here, it has been suggested, that the root of music’s communication of meaning lies, and that this communication is, in Langer’s phrase, a “picture of emotional life”.

Diachronic structures present challenges for analysis, partly because dialectical resolution between temporally separated moments necessarily entails novelty, and partly because the analysis throws up fundamental issues of growth within complexes of constraint. These topics of emerging novelty within interacting constraints have absorbed scientific investigation in biology and ecology in recent years. Our contribution here is to point to the homologous analysis of innovation in discourse, and so we draw on insights into the dynamics of meaning-making within systems which have reference as a way of considering the meaning-making within music, which has no reference, with the aim of providing insights into both. The value of music for science is that it focuses attention on the ineffable aspects of knowledge: a reminder of how much remains unknown, how the reductionism of the Enlightenment remains limited, and how other scientific perspectives, theories and experiments remain possible. And for those scientists for whom music is a source of great beauty, no contemplation of the mysteries of the universe can be complete without considering the mysteries of music.

2 What do we mean by meaning?

Any analysis of meaning invites the question “what do we mean by meaning?” For an understanding of meaning to be meaningful, that understanding must participate in the very dynamics of meaning communication that it attempts to describe. It may be because of the fundamental problems of trying to identify meaning that focus has tended to shift towards understanding “reference”, which is more readily identifiable. There is some consensus among scholars focused on the problem that meaning is not a “thing” but an ongoing process: whether one calls this process “semiosis” (Peirce), follows Whitehead’s conception of meaning, or treats it as the result of the management of cybernetic “variety” (complexity). The problem of meaning lies in the very attempt to classify it: while reference can be understood as a “codification” or “classification”, meaning appears to drive the oscillating dynamic between classification and process in mental life. So there is a fundamental question: is meaning codifiable? For Niklas Luhmann, this question was fundamental to his theory of “social systems”. Luhmann believed that social systems operate autopoietically by generating and reproducing communicative structures of transpersonal relations. Through utterances, codes of communication are established which constrain the consciousness of individuals, who in turn reproduce and transform those codified structures. Luhmann saw the different social systems of society as self-organising discourses which interact with one another. Thus, the “economic system” is distinct from the “legal system” or the “art system”. Meaning is encoded in the ongoing dynamics of the social system. An individual’s utterance is meaningful because the individual’s participation in a social system leads to coordinated action, by virtue of the fact that one has some certainty that others are playing the same game: one is in tune with the utterances of others, and with the codified social system within which one is operating.
Luhmann borrowed a term which Parsons invented, inspired by Schutz’s work: “double contingency”. All utterances are made within an environment of expectation about the imagined responses of the other (what Parsons calls “Ego” and “Alter”).

The problem with this perspective is that meaning results from a kind of social harmony: meaning ‘conforms’. Fromm, for example, criticizes this attitude as the “cybernetic fallacy”: “I am as others wish me to be”. Yet meaningful things are usually surprising. An accident in the street is meaningful to all who witness it, but it is outside the normal codified social discourse. Even in ordinary language, one emphasizes what is meaningful by changing the pitch or volume of one’s voice for emphasis on a particular word, or by gesturing to some object as yet unseen by others: “look at that!”. Whatever we mean by meaning, the coordination of social intercourse and mutual understanding is one aspect, but novelty or surprise is the critical element for something to be called “meaningful”. It is important to note that one could not have surprise without the establishment of some kind of ‘background’.
However, there is a further complication. It is not simply that surprise punctuates a steady background of codified discourse: codified discourse creates the conditions for novelty. The effects of this have been noted in ecology (Ulanowicz) and biology (Deacon). The establishment of regularity and pattern appears to create the conditions for the generation of novelty in a process which is sometimes characterized as “autocatalytic” (Deacon, Ulanowicz, Kauffman). This, however, is a chicken-and-egg situation: is it that meaning is established through novelty and surprise against a stable background, or is it the stable background which gives rise to the production of novelty, which in turn is deemed to be meaningful?

This dichotomy can be rephrased in terms of information theory. This is not to make a claim for a quantitative index of meaning, but the more modest claim that quantitative techniques can be used to bring greater focus to the fundamental problem of the communication of meaning. Our assertion is that this understanding can be aided through the establishment of codified constraints through techniques of quantitative analysis, in order to establish a foundation for a discourse on the meta-analysis of meaningful communication. This claim has underpinned prior work in communication systems and economic innovation, where one of us has analysed the production of novelty as the result of interacting discourses between the scientific community, governments and industries. Novelty, it is argued, results from mutual information between discourses (in other words, similarly codified names or objects of reference), alongside a background of mutual redundancy or pattern. Information theory provides a way of measuring both of these factors. In conceiving of communication in this way, there is a dilemma as to whether it is the pattern of redundancy which is critical to communication or whether it is the codified object of reference. Music, which communicates without reference, suggests that it is the former.
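The quantities invoked here can be made concrete. Below is a minimal sketch, in Python, of measuring mutual information between two codified sequences standing in for interacting discourses; the sequences and their symbols are invented for illustration, not empirical data.

```python
# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) between two
# symbol sequences, treated as paired observations. When the two
# "discourses" are codified identically, their mutual information
# equals their shared entropy; an unrelated coding drives it to zero.
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) computed from paired observations of two sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Two toy "discourses" sharing the same codification
a = ['s', 'g', 's', 'i', 'g', 's', 'i', 'g']
b = list(a)
print(mutual_information(a, b))  # equals entropy(a): full overlap
```

The same functions can equally be applied to the musical dimensions discussed later (pitch, rhythm, interval), since all that is required is a sequence of codified symbols.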

3 Asking profound questions

In considering music from the perspective of information theory, we are challenged to address some fundamental questions: 
1. What does one count in music?
2. From where does musical structure emerge?
3. What is the relation between music’s synchronic structure and its diachronic unfolding?
4. Why is musical experience universal?

4 Information and Redundancy in Music: Consideration of an ‘Alberti bass’

Music exhibits a high degree of redundancy. Musical structure relies on regularity, and in all cases of the establishment of pattern and regularity, music presents multiple levels of interacting pattern. For example, an ‘Alberti bass’ establishes regularity, or redundancy, in the continual rhythmic pulse of the notes that form a harmony. It also exhibits regularity in the notes that are used, and in the pattern of intervals which are formed through the articulation of those notes. It also exhibits regularity in the harmony which is articulated through these patterns. Its sound spectrum displays a coherent pattern. Indeed, there is no case of a musical pattern which is not subject to multiple versions of redundant patterning: even if a regular pattern is established on a drum, the pattern is produced in the sound, the silence, the emphasis of particular beats, the acoustic variation from one beat to another, and so on. Multiple levels of overlapping redundancy mean that no single musical event is ever exactly the same. This immediately presents a problem for the combinatorics of information theory, which seeks to count occurrences of like events. No event is ever exactly ‘like’ another, because the interactions of the different constraints for each single event overlap in different ways. The determination of a category of event that can be counted is a human choice. The best that might be claimed is that one event has a ‘family resemblance’ to another.

The Alberti bass, like any accompaniment, sets up an expectation that some variety will occur which sits within the constraints that are established. The critical thing is that the expectation is not for continuation, but for variety. How can this expectation for variety be explained?

In conventional information theoretical analysis, the entropy of something is a summation over the probabilities of the variety of different codified events which are deemed to be possible and can be counted, each weighted by the logarithm of its probability. This approach suffers from an inability to account for those events which emerge as novel, and which could not have been foreseen at the outset. In other words, information theory needs to be able to account for a dynamically self-generating alphabet.
Considering the constraint, or the redundancy, within which events occur presents a possibility for focusing on the context of existing or potentially new events. But since the calculation of redundancy requires the identification of information together with the estimation of the “maximum entropy”, this estimation too is of limited use in calculating novelty.

However, we have suggested that the root of the counting problem is that nothing is ever exactly the same as something else. But some things are more alike than others. If an information theoretical approach can be used to identify the criteria for establishing likeness, or unlikeness, then this might serve as a generative principle for the likeness of things which emerge but cannot be seen at the outset. Since novelty emerges against the context of redundancy, and redundancy itself generates novelty, a test of this approach is whether it can predict the conditions for the production of novelty.
In other words, it has to be able to distinguish the dynamics of an Alberti bass which articulates a harmony for two seconds from those of an Alberti bass which articulates the same harmony for more than a minute with no apparent variation (as is the case with some minimalist music).

  • How do our swift and ethereal thoughts move our heavy and intricately mobile bodies so our actions obey the spirit of our conscious, speaking Self?
  • How do we appreciate mindfulness in one another and share what is in each other’s personal and coherent consciousness when all we may perceive is the touch, the sight, the hearing, the taste and smell of one another’s bodies and their moving?
These questions echo the concern of Schutz, who in his seminal paper “Making Music Together” identifies that music is special because it appears to communicate without reference. Music occupies a privileged position in the information sciences as a phenomenon which makes itself available to multiple descriptions of its function, from notation to the spectral analysis of sound waves. Schutz’s essential insight was that musical meaning-making arose from the tuning-in of one person to another, through the shared experience of time.

This insight has turned out to be prescient. Recent research suggests that there is a shared psychology, or “synrhythmia” and “amphoteronomy”, involved in musical engagement. How then might this phenomenon be considered meaningful in the absence of reference, when in other symbolically codified contexts (e.g. language) meaning appears to be generated in ways which can be more explicitly explained through an understanding of the nature of “signs”?

5 Communicative Musicality and Shannon Entropy

Recent studies on communicative musicality have reinforced Schutz’s view that the meaningfulness of musical communication rests on a shared sense of time. The relation between music and time makes manifest many deep philosophical problems: music unfolds in time, but it also expresses time, and time in music does not always move at the same speed: music is capable of making time stand still. If time is considered as a dimension of constraint, alongside all the other manifestations that music presents, then the use of time as a constraint on the analysis of music itself can help unpick the dynamics of those other constraints. While one might distinguish clock-time from psychological time, what might be considered psychological time is in reality the many interacting constraints, while clock-time is an imagined single description which at least allows us to carve up experience in ways that can be codified.

Over time, music displays many changes and produces much novelty. Old patterns give way to new patterns. At the root of all these patterns are some fundamental features which might be counted in the first instance:
1. Pitch
2. Rhythm
3. Interval
4. Harmony
5. Dynamics
6. Timbre

So what is the pattern of the Alberti bass?

If its note pattern is C-G-E-G-C-G-E-G, its rhythm is q-q-q-q-q-q-q-q, its harmony a C major triad, and its intervallic relations are 5, -3, 3, -5, 5, -3, 3, -5, then over this period the Shannon entropy (in bits) for each element will be:
1. Pitch: 1.5 (C and E each occur with probability 1/4, G with probability 1/2)
2. Rhythm: 0 (an unvarying pulse)
3. Interval: 2 (four interval types, each with probability 1/4)
4. Harmony: 0 (a single sustained triad)
5. Dynamics: 0 (assuming an unvarying dynamic)
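These figures can be checked with a short calculation. The following is a minimal sketch in Python, assuming base-2 logarithms and treating each of the eight events in the figure as one observation:

```python
# Shannon entropy (bits) of the dimensions of the Alberti figure
# C-G-E-G-C-G-E-G; the sequences follow the description in the text.
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy in bits; '+ 0.0' normalises -0.0 to 0.0."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values()) + 0.0

pitch    = ['C', 'G', 'E', 'G', 'C', 'G', 'E', 'G']
rhythm   = ['q'] * 8                      # an unbroken quaver pulse
interval = [5, -3, 3, -5, 5, -3, 3, -5]   # intervals as given in the text
harmony  = ['Cmaj'] * 8                   # a single articulated triad

print(entropy(pitch))     # 1.5
print(entropy(rhythm))    # 0.0
print(entropy(interval))  # 2.0
print(entropy(harmony))   # 0.0
```

The zero entropies of rhythm and harmony express numerically what the text calls the redundancy of the figure: these dimensions carry pure pattern and no variety.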

The important thing here is that novelty becomes necessary if there is insufficient variation between interacting constraints. If mutual redundancy is high, then there is also very high mutual information. The phrasing of an Alberti bass is an important feature of its performance: musicians often consider that the Alberti pattern should be played as a melody.

Consider the opening of Mozart’s piano sonata.

The playing of the C in the right hand adds a further layer of sound entropy. Each sound decays, but the decay of the C above the decay of the Alberti bass means that the context of the patterns of the other constraints changes. This means that the first bar can be considered as a combination of constraints.

The surprise comes at the beginning of the second bar. Here the pattern of the constraint of the melody changes because a different harmony is articulated, together with a sharp interval drop (a 6th), an accent, followed by a rhythmic innovation in the ornamental turn. Therefore the characteristic of the surprise is a change in entropy in multiple dimensions at once.

It is not enough to surprise somebody by simply saying “boo!”. One says “boo!” with a raised voice, a change of tone, a bodily gesture, and so on. The stillness which precedes this is a manifestation of entropies which are aligned.

The intellectual challenge is to work out a generative principle for a surprise.

6 A generative principle of surprise and mental process

The differentiation between different dimensions of music is an indication of the perceptual capacity expressed through relations between listeners and performers. The articulation and play between different dimensions creates the conditions for communicating conscious process. If entropies are all common across a core set of variables, then there is effectively collapse of the differentiating principles. All becomes one. But the collapse of all into one is a precursor to the establishing of something new.
In music’s unfolding process, a system of immediate perception works hand-in-hand with a meta-system of expectations (an anticipatory system). This can be expressed in terms of a system’s relation to its metasystem. Because the metasystem coordinates the system, circumstances can arise where the metasystem becomes fused to the system, so that differences identified by the system become equivalent to differences identified by the metasystem. In such a case, the separation between system and metasystem breaks down. The cadence is the most telling example of this. In a cadence, many aspects of constraint work together, all tending towards a silence: harmony, melody, dynamics, rhythm, and so forth.

Bach’s music provides a good example of the ways in which a simple harmonic rhythm becomes embellished with levels of counterpoint, which are in effect, ways of increasing entropy in different dimensions. The melody of a Bach chorale has a particular melodic entropy, while the harmony and the bass imply a different entropy. Played “straight”, the music as a sequence of chords isn’t very interesting: the regular pulse of chords creates an expectation of harmonic progress towards the tonal centre, where all the entropies come together at the cadence:

Bach is the master of adding “interest” to this music, whereby he increases entropy in each voice by making their independent rhythms more varied, with patterns established and shared from one voice to another:

By doing this, there is never a complete “break-down” of the system: there is some latent counterpoint between different entropies in the music, meaning that even at an intermediate cadence, the next phrase is necessary to continue to unfold the logic of the conflicting overlapping redundancies.

This is the difference between Baroque music and the music of the classical period. In the classical period, short gestures simply collapse into nothing, for the system to reconstruct itself in a different way. In the Baroque, total collapse is delayed until the final cadence. The breaking-down of the system creates the conditions for the reconstruction of stratified layers of distinction-making through a process of autocatalysis. Novelty emerges from this process of restratifying a destratified system. The destratified system exhibits a cyclic process whereby different elements create constraints which reinforce each other, much like the dynamics of a hypercycle described in chemistry, or what is described in ecology as autocatalysis.

Novelty arises as a symptom of the autocatalyzing relation between different constraints, where the amplification of pattern identifies new differences which can feed into the restratification of constructs in different levels of the system.

This can be represented cybernetically as a series of transductions at different levels. Shannon entropy of fundamental features can be used as a means of being able to generate higher level features as manifestations of convergence of lower-level features, and consequently the identification of new pattern.

It is not that music resonates with circadian rhythms. Music articulates an evolutionary dynamics which is itself encoded into the evolutionary biohistory of the cell.

7 Musical Novelty and the Problem of Ergodicity

In his introduction to the concept of entropy in information, Shannon discusses the role of transduction between senders and receivers in the communication of a message. A transducer, according to Shannon, encodes a message into signals which are transmitted in a way such that they can be successfully received across a communication medium which may degrade or disrupt the transmission. In order to overcome signal degradation, the sending transducer must add extra bits to the communication – or redundancy – such that the receiver has sufficient leeway in being able to correctly identify the message.
Music differs from this Shannon-type communication because the redundancies of its elements are central to the content of music’s communication: music consists primarily of pattern. Moreover, music’s redundancy is apparent both in its diachronic aspects (for example, the regularity of a rhythm or pulse), and in its synchronic aspects (for example, the overtones of a particular sound). Music’s unfolding is an unfolding between its synchronic and diachronic redundancy. Shannon entropy is a measure of the capacity of a system to generate variety. Shannon’s equations and his communication diagram re-represent the fundamental cybernetic principle of Ashby’s Law of Requisite Variety, which states that in order for one system to control another, it must have at least equal variety. In order for sender and receiver to communicate, the variety of the sender in transmitting the different symbols of the message must be at least equal to the variety of the receiver which reassembles those symbols into the information content. The two transducers, sender and receiver, communicate because they maintain stable distinctions between each other: information has been transferred from sender to receiver when the receiver is able to predict the message. Shannon describes this emergent predictive capacity in terms of the ‘memory’ within the transducers. It is rather like the ability to recognise a tune on hearing the first couple of notes.

In using Shannon to understand the diachronic aspects of music, we must address one of the weaknesses of Shannon’s equation: it demands an index of the elements which comprise a message (for example, the letters of an alphabet). In cases where the index of elements is stable, the calculation of entropy over a long stretch of communication will approximate the entropy over shorter stretches, since the basic ordering of the elements will be similar. This property is called ergodicity, and it applies to those cases where the elements of a communication can be clearly identified. Music, however, is not ergodic in its unfolding: the elements which might be indexed in Shannon’s equations are emergent. Music’s unfolding reveals new motifs, themes, moods and dynamics. None of these can be known at the outset of a piece of music.
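This distinction can be illustrated numerically. Below is a minimal sketch contrasting two invented sources: a stationary one, whose entropy estimate settles as more of the sequence is observed, and one whose alphabet keeps growing, as music’s emergent motifs do.

```python
# Entropy estimates over growing prefixes of two artificial sources:
# a stationary source with a fixed two-symbol alphabet, and a source
# in which new symbols keep appearing.
from collections import Counter
from math import log2

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

stationary = ['A', 'B'] * 64               # fixed alphabet: ergodic
emergent   = [i // 4 for i in range(128)]  # a new symbol every 4 steps

for n in (16, 64, 128):
    print(n, entropy(stationary[:n]), entropy(emergent[:n]))
# The stationary estimate stays at 1.0 bit; the emergent estimate
# keeps rising (2.0, 4.0, 5.0 bits) and never settles.
```

For the stationary source, long-run and short-run estimates agree; for the emergent source there is no stable value to converge to, which is exactly the obstacle music places in front of Shannon’s equations.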

In order to address this, some basic reconsideration of the relationship between order and disorder in music is required. In Shannon, order is determined by the distribution of symbols within constraints: Shannon information gives an indication of the constraint operating on the distribution. The letters in a language, for example, are constrained by the rules of grammar and spelling. If passages of text are ergodic in their order, it is because the constraint operating on them is constant. Non-ergodic communications like music result from emergent constraints. This means that music generates its own constraint.

In Shannon’s theory, the measure of constraint is redundancy. So to say that music is largely redundant means that music participates in the continual generation of its own constraints. Our technical and empirical question is whether there is a way of characterising the emergent generation of constraint within Shannon’s equations.
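As a baseline before considering emergent constraint, redundancy in the classical Shannon sense can be sketched as follows. This assumes a fixed alphabet (here, twelve chromatic pitch classes) against which maximum entropy is defined; the note sequences are illustrative.

```python
# Shannon redundancy R = 1 - H/H_max, where H_max = log2(alphabet size).
# A fixed alphabet of 12 pitch classes is assumed for H_max; as the text
# notes, an emergent alphabet would make H_max itself a moving target.
from collections import Counter
from math import log2

def redundancy(seq, alphabet_size):
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return 1 - h / log2(alphabet_size)

alberti = ['C', 'G', 'E', 'G'] * 4   # a highly patterned figure
scale   = list('CDEFGABC')           # a more varied line

print(round(redundancy(alberti, 12), 3))  # 0.582: mostly pattern
print(round(redundancy(scale, 12), 3))    # 0.233: more variety
```

The calculation only works because the alphabet size is fixed in advance; where the alphabet is emergent, as argued above, there is no principled H_max, and this is precisely the point at which the classical measure breaks down.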

To address the problem of emergent features, we need to address the problem of symbol-grounding. Stated most simply, the symbol grounding problem concerns the emergence of complex language from simple origins. Complex symbols and features can only emerge as a result of interactions between simple components. A further complication is that complex structures are neither arbitrary nor are they perfectly symmetrical. Indeed, if the complex structures of music, like the complex structures of nature are to be described in terms of their emergence from simple origins, they are characterized by a “broken symmetry”. In this sense, musical emergence is in line with recent thinking in physics considering the symmetry of the universe, whereby very small fluctuations cause the formation of discrete patterns.

Symmetry-breaking is a principle by which an analysis involving Shannon entropy of simple components may be used to generate more complex structures. In the next section we explain how this principle can be operationalized into a coherent information-theoretical approach to music using Shannon’s idea of Relative Entropy.

8 Relative Entropy and Symmetry-Breaking

As we have stated, music exhibits many dimensions of constraint which overlap. Each constraint dimension interferes with every other. So what happens if the entropy of two dimensions of music change in parallel? This situation can be compared to the joint pairing of meta-system and system which was discussed earlier.
If the constraints of redundancy of the many different descriptions of sound (synchronic and diachronic) constrain each other, then the effect of the redundancy of one kind of description in music on another would produce non-equilibrium dynamics that could be constitutive of new forms of pattern, new redundancy and emergent constraint. This effect of one aspect of redundancy on another can be measured with Shannon’s concept of relative entropy.

The effect of multiple constraints operating together in this way is similar to the dynamics of autocatalysis: where effectively one constraint amplifies the conditions for the other constraint. What is to stop this process leading to positive feedback? The new constraint, or new dynamic within the whole system is necessary to maintain the stability of the other constraints.

So what happens in the case of the Alberti bass? The relative entropy of the notes played, the rhythm, intervals, and so on would lead to collapse of distinctions between the components: a new component is necessary to attenuate the dynamic.

This helps explain the changes to the relative entropy in Mozart’s sonata. But does it help to explain minimalism? In the music of Glass, there is variation of timbre, and the gradual off-setting of the accompanimental patterns also helps to give the piece structure.

However, whilst the explanation from relative entropy helps to explain why novelty – new ideas, motifs, harmonies, and so on – is necessary, it does not help to explain the forms that novelty takes.
In order to understand this, we have to understand that the symmetry-breaking principle works hand-in-hand with the autocatalytic effects of the overlap of redundancy. Novelty is never arbitrary; it is always in “harmony” with what is already there. What this means is that novelty often takes the form of a change to one dimension, whilst other dimensions remain similar in constitution.
Harmony is another way of describing a redundant description. The melody of Mozart’s piano sonata is redundant in the sense that it reinforces the harmony of the accompaniment. The listener is presented with an alternative description of something that has gone before. The redundancy is both synchronic and diachronic.

From a mathematical perspective, what occurs is a shift in entropies over time between different parts.

The coincidence of entropy relations between parts over a period of time is an indication of the emergence of novelty. Symmetry breaking occurs because the essential way out of the hypercycle is the focus on some small difference from which some new structure might emerge. This might be because the combined entropies of multiple levels throw into relief other aspects of variation in the texture.

This would explain the symmetry-breaking of the addition of new voices, the subtle organic changes in harmony, and so forth. However, it does not account for the dialectical features of much classical music. What must be noted is that the common pattern before any dialectical moment in music is a punctuation, silence or cadence. For example, in the Mozart example, the first four-bar phrase is followed by a more ornamental passage where, whilst the harmony remains the same, a semiquaver rhythm takes over in the right hand, in the process introducing more entropy into the diachronic structure.

Once again, within the dialectical features of music, it is not the case that musical moments are arbitrary. Rather, dialectical moments are chosen which seem continually to surprise and yet relate to previous elements. Both in the process of composing music and in the process of listening to it, what is striking is that the experience shared between composer, performer and listener is one of profundity. Where does this profundity come from?

The concept of relative entropy is fundamental to all Shannon’s equations. The basic measure of entropy is relative to a notional concept of maximum entropy. This is explicit in Shannon’s equation for redundancy, where in order to specify the constraint producing a particular value of entropy, the scalar value of entropy has to be turned into a ratio between entropy and maximum entropy. The measurement of mutual information, which is the index of how much information has been transferred, is a measurement of the entropy of the sender relative to the entropy of the receiver. The problem is that nobody has a clear idea of what maximum entropy actually is, beyond a general statement (borrowed from Ashby) that it is the measurement of the maximum number of states in a system. In a non-ergodic emergent communication, there is no possibility of establishing maximum entropy. Relative entropy necessarily involves studying the change of entropy in one dimension relative to the change of entropy in another. One section of music over a particular time period might exhibit a set of descriptions or variables, each of which can be assigned an entropy (for example, the rhythm or the distribution of notes played). At a different point in time, the relationship between the entropies of the same variables will be different. If we assume that the cause of this difference is the emergent effect of one constraint on another, then it would be possible to plot the shifts in entropy from one point to the next and study their relative effects, producing a graph.

Such an approach is revealing because it highlights where entropies move together and where they move apart. What it doesn’t do is address the non-ergodicity problem of where new categories of description – with new entropies – are produced, how to identify them, and how to think about the effect that new categories of redundancy have on existing categories. Does the emergence of a new theme override its origins? Does the articulation of a deeper structure override the articulation of the parts which constitute it? However, the analysis can progress from relative entropy to address these questions if we begin by looking at the patterns of relative entropy from an initial set of variables (say, notes, dynamics, rhythm, harmony). If new dimensions are seen to be emergent from particular patterns of inter-relation between these initial dimensions, then a mechanism for the identification of new dimensions can be articulated. In effect, this is a second-order analysis of first-order changes in relative entropy.
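The first-order analysis described above can be sketched as follows: each dimension of the music is coded as a symbol sequence, entropy is computed over successive windows, and the resulting trajectories show where dimensions move together and where they move apart. The sequences are invented illustrations, not a transcription of a real score.

```python
# Windowed entropy trajectories for two musical "dimensions".
# Comparing the trajectories shows where the entropies of different
# dimensions shift together and where they diverge.
from collections import Counter
from math import log2

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values()) + 0.0

def entropy_trajectory(seq, window):
    """Entropy of each successive non-overlapping window."""
    return [entropy(seq[i:i + window])
            for i in range(0, len(seq) - window + 1, window)]

pitch  = ['C', 'G', 'E', 'G'] * 4 + ['F', 'A', 'D', 'B'] * 4  # pattern, then a change
rhythm = ['q'] * 16 + ['q', 's'] * 8                          # a later rhythmic shift

print(entropy_trajectory(pitch, 8))   # [1.5, 1.5, 2.0, 2.0]
print(entropy_trajectory(rhythm, 8))  # [0.0, 0.0, 1.0, 1.0]
```

In this toy example the pitch trajectory rises a window before the rhythm trajectory does: it is patterns of inter-relation such as these that the second-order analysis would take as its raw material.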

A second point is to say that first-order, second-order and n-order dimensions of constraint will each interfere with each other over time. The experience of listening to music is a process of shifts between different orders of description. In Schenker's music-analytical graphs, this basic idea is represented as different layers of description: deep structure, middleground and foreground. A Shannon equivalent is relative entropy, relative entropy of relative entropy, relative entropy of relative entropy of relative entropy… and so on. Shannon entropy has fascinated music scholars, both in the analysis of music (for example, Herbert Brün) and in methods for the construction of musical composition (Xenakis). Here we offer a new perspective drawing on the application of Shannon entropy to the analysis of scientific discourse. In this work, we have explored how mutual redundancy between different dimensions of communication in a discourse can be used as an index of meaning. In applying this approach to music, we consider the ways in which emergent mutual redundancy and relative entropy between different descriptions of music as it unfolds can identify similar patterns of meaning-making in musical communication.

9 Common origins expressed through time

One of the problems of Shannon entropy, and of the transduction which it essentially expresses, is that, as an aspect of Ashby's Law, it is fundamentally conservative: the Shannon communicative model manages variety, in the same way that the different levels of regulation within Beer's Viable System Model work together to articulate the coherence of the whole. What Shannon entropy does not do is articulate a dynamic whereby new things are created. Yet in order to account for music's continual evolution, and also for the shared recognition of profundity, we need an account of the generation of profundity from some common, and simple, origin.

Beer suggested a way of doing this in his later work which articulates his “Platform for change”. Here the process of emergence is fundamentally about the management of uncertainty.
Uncertainty is an important metaphor for understanding music. After all, for any composer, the question is always "what should I do next?" The choices faced by a composer are innumerable. In biology, there is evidence that cellular organisation, too, is coordinated around uncertainty (Torday).

Uncertainty is an aspect of variety: it is basically the failure of a metasystem to manage the variety of a system, or at least a failure to identify that which needs to be attenuated, or to attend to the appropriate thing. Yet the dialectical truth of music is that identity is always preserved amidst fluctuating degrees of uncertainty. This can be drawn as a diagram whereby whatever category is identified – rhythm, notes, or whatever else is counted within Shannon entropy – contains within it its own uncertainty of effective identification.

The uncertainty of identification must be managed through the identification of a category, and this process – a metaprocess of categorical identification – interferes with other processes. In effect it codifies particular events in particular values. Yet the values which are codified will be challenged or changed at some point.

The coordination of uncertainty produces a generative dynamic which can under certain circumstances produce a symmetry. But the critical question is why the symmetry which is produced is common. An answer may lie in the origin of the cell.

I’ve got a diagram to show this. The fundamental feature of an improved mechanism to explain the emergent dynamics of musical innovation is that there is a dual transduction process: an inner transduction process which maintains balance within the identity of an entity, and an outer transduction process which maintains the relation between the identity and its environment. It should be noted that it is precisely this mechanism that exists in cellular communication through the production of protein receptors on the cell surface and the interaction between cells on the outside.

10 Communicative Musicality and relative entropy

Schutz’s understanding that musical communication as a shared experience of time now requires some reflection. If music is the shared experience of time, then what can an approach to understanding of entropy contribute to our understanding of time? The intersubjective experience of music where a shared object of sound provides a common context for the articulation of personal experience and consequently the creating of a ground of being for communication. In this can facilitate the sharing of words, of codified aspects of communication which Luhmann talks about. The remarkable thing with music is that the symmetry breaking of one individual is remarkably like the symmetry breaking of another individual. The choices for advanced emergent structures do not appear to be arbitrary. This may well be because of the synchronic structure of the materiality of sound. However it may go deeper than this. It may be that the essential physical constitution of the cell carries with it some pre-echoes of the physics of sound, which may at some point be united in fundamental physical mechanisms (probably quantum mechanical). There is an empirical need to search for the biological roots of music’s contact with biology. The work in communicative musicality provides a starting point for doing this.

Sunday, 15 July 2018

Uncertainty in Counting and Symmetry-Breaking in an Evolutionary Process

Keynes, in his seminal "Treatise on Probability" of 1921 (little known to today's statisticians, who really ought to read it), identified a principle which he called "negative analogy": the principle by which some new difference is codified and confirmed by repeated experimental sampling.

"The object of increasing the number of instances arises out of the fact that we are nearly always aware of some difference between the instances, and that even where the known difference is insignificant we may suspect, especially when our knowledge of the instances is very incomplete, that there may be more. Every new instance may diminish the unessential resemblances between the instances and by introducing a new difference increase the Negative Analogy. For this reason, and for this reason only, new instances are valuable." (p. 233)
This principle should be compared to Ashby's approach to a "cybernetic science": the cybernetician "observes what might have happened but did not". The cybernetician can only do this by observing many descriptions of a thing, so as to observe the "unessential resemblances" and introduce "a new difference". What both Keynes and Ashby are saying is that the observation of likeness is essentially uncertain.

The issue is central to Shannon information theory. Information theory counts likenesses. It determines the surprisingness of events because it treats each event as an element in an alphabet. It can then calculate the probability of that event and thus establish some metric of "average surprisingness" in a sequence of events. Although in the light of Keynes's thoughts on probability this seems naïve, Shannon's equation has been extremely useful - we owe the internet to it - so one shouldn't throw the baby out with the bathwater.

But the Shannon index, which identifies the elements of the alphabet, is actually a means by which uncertainty is managed in the process of calculating the "surprisingness" of a message. This can be shown in the diagram below, derived from Beer's diagrams in Platform for Change:

The beauty of this diagram is that it makes explicit that the Shannon index is a "creative production" of the process of uncertainty management. It is a codification or categorisation, which means that essentially it only has meaning because it is social. In turn, we have to consider an environment of other people categorising events, and an environment which produces many examples of messages that might be analysed. Two people will differ in the ways they categorise their events, which means that the uncertainty dynamic in counting elements is fluid, not fixed:

There is a tipping-point in the identification of indexes where some current scheme for identifying differences is called into question, and a new scheme comes into being. New schemes are not arbitrary, however. Some difference in the examples that are provided gradually gets identified (as Keynes says: "Every new instance may diminish the unessential resemblances between the instances and by introducing a new difference increase the Negative Analogy") but the way this happens is somehow coherent and consistent with what has gone before. 

I wonder if this suggests that there is an underlying principle of evolutionary logic by which the most fundamental principles of difference from the very earliest beginnings are encoded in emergent higher-level differences further on in history. A new difference which is identified by "negative analogy" is not really "new", but an echo of something more primitive or fundamental. Shannon, of course, only treats the surface. But actually, what we might need is a historicisation of information theory.

Let's say, for the sake of argument, that the foundational environment is a de Broglie-Bohm "pilot wave": a simple oscillation. From the differences between the codification of a simple oscillation, higher level features might be codified, which might then give way to the identification of new features by drawing back down to fundamental origins. The symmetry-breaking of this process is tied to the originating principle - which could be a pilot wave, or some other fundamental property of nature.  

So what might this mean for Shannon information? When the relative entropy between different features approaches zero, the distinction between the features collapses. This may be a sign that some new identification of a feature is about to take place: it is a collapse into the originating state.

Each level of this diagram might be redrawn as a graph of the shifting entropies of features at each level. A basic level diagram can draw the entropy of shifting entropies. A further level can draw the entropy of the relations between the entropy of shifting entropies, and so on.
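That recursion might be sketched in Python as follows (the window size, number of bins, and the input series are all arbitrary assumptions for illustration): discretize a numeric series into bins, take its entropy per window, and feed the resulting series back into the same function for the next level up.

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    symbols = list(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in Counter(symbols).values())

def next_level(series, window=4, bins=4):
    """Discretize a numeric series into bins, then return its entropy per window.
    Applying this repeatedly yields entropy, entropy of entropies, and so on."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0          # guard against a flat series
    binned = [min(int((v - lo) / width), bins - 1) for v in series]
    return [entropy(binned[i:i + window])
            for i in range(0, len(binned) - window + 1, window)]

# An invented first-order series (e.g. the pitch entropies of 64 time chunks).
level0 = [math.sin(i / 3) for i in range(64)]
level1 = next_level(level0)    # entropy of the series, per window
level2 = next_level(level1)    # entropy of the entropies
print(len(level1), len(level2))
```

Each application of the function is one layer of the diagram; the series shrinks at each level, which is itself suggestive of Schenker's movement from foreground to deep structure.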

We may not be able to see exactly how the negative analogy is drawn. But we might be able to see the effects of it having been drawn in the evolutionary development of countable features. Surprise has an evolutionary hierarchy. 

Thursday, 5 July 2018

Seven Problems of Pointing to the Future of Education (with our hands tied behind our back) and Seven suggestions for addressing it

The theme of "research connectedness" is common in today's universities. It's a well-intentioned attempt to say "University isn't school". Unfortunately, due to a combination of factors including marketisation, modularisation and learning outcomes, university has become increasingly like school in recent years. "Research connectedness" attempts to remedy the trend by introducing more inquiry-based learning and personalised curricula. All of this stuff is good, and it's something which I have been involved with for a long time. It's also at the heart of what I'm doing at the Far Eastern Federal University in Russia on "Global Scientific Dialogue". But I can't help thinking that we're still missing the point with all these new initiatives. There are (at least) seven problems:

Problem 1: Universities see the curriculum as their product. However, the product of learning is the student's understanding which is arrived at through conversation. The university sells courses and certificates; it does not sell "the potential for arriving at an understanding".

Problem 2: Learning outcomes do not measure student understanding of a subject. They establish the validity of a student's claim to meet a set of criteria written by a teacher (or a module author). What they really measure is the student's understanding of the assessment process.

Problem 3: Learning outcomes codify the expectations of teachers with regard to the way that student performance will be assessed in a subject. By definition, they demand that the teacher knows what they are doing. In research, it is often the case that nobody quite knows what they are doing (Einstein: "If we knew what we were doing, we wouldn't call it research!")

Problem 4: Modules are aggregates of learning outcomes; that is, sets of expectations. Students study many modules, and have to negotiate many different sets of expectations from many different teachers. There is no space for how different teachers' understandings and expectations differ, whether there is coherence, or how incoherence might lead to fundamental problems in the student's understanding.

Problem 5: Inevitably, the only way to cope is to behave strategically. "How do I pass?" becomes more important than "What do I understand?". Suppressing "What do I understand?" in some cases may lead to mental breakdown.

Problem 6: An inquiry-based module from the perspective of strategic learners appears to be the worst of all possible worlds: "basically, they don't teach you anything, and you have to find your own way". Since even inquiry-based modules will have learning outcomes, a strategic approach may work, but result in very dissatisfactory learning experiences ("I didn't learn anything!")

Problem 7: Students are customers in an education market. Whilst learning outcomes codify teacher expectations, learner expectations are increasingly codified by the financial transaction they have with the university, and "student satisfaction" becomes a weapon that can be used against teachers to force them to align their expectations with the learners'.

What can we do?  Here are seven suggestions:

1. The product of the university must be student understanding. Certificates, modules and timetables are epiphenomena.

2. Understanding is produced by conversation. The fundamental function of the university is to find ways of best coordinating rich conversations between students and staff.

3. The curriculum is an outmoded means of coordinating conversation. It is a rigid, inflexible object in a fast-changing, uncertain world. The means of coordinating conversation needs to become a much more flexible technology (indeed, this is where investment in technology should be placed, not in VLEs or e-Portfolios, which merely uphold the ailing curriculum).

4. Traditional assessment relies on experts, which necessitates hierarchy within the institution. This hierarchy can be transformed into a "heterarchy" - a flat structure of peer-based coordination. Technologies like Adaptive Comparative Judgement, machine learning and other tools for collaboration and judgement-making can be of great significance here.

5. Transformation of institutional hierarchy can produce far greater flexibility in the way that learners engage with the institution. The "long transactions" of assessment (i.e. the 14 week period "I've done my assessment, you give me a mark") can be broken-up into tiny chunks, students can genuinely roll-on, roll-off courses, and new funding models including educational subscription and assessment services explored.

6. The university needs to investigate and understand its real environment (it is not a market - it is society!). The environment is uncertain, and the best way to understand uncertainty is through making exploratory interventions in society from which the university can learn how to coordinate itself. Generosity towards society should be a strategic mission: free courses, learning opportunities, community engagement should be done for the purpose not of "selling courses", but for the strategic seeking of future viability.

7. To put student understanding at the heart of what the university is, is also to place shared scientific inquiry as the underpinning guide. The scientific discourse is hampered by ancient practices of publication and status which ill-suit an inherently uncertain science. The university should free itself from this, and embrace the rich panoply of technology we have at our disposal for encouraging scientists to communicate their uncertainty about science in an open and dialogic way.

Saturday, 30 June 2018

Ground State

I haven't done any improvisation for ages. It is really a kind of spiritual practice for me - one that I am understanding more deeply now, as I am looking at work on "Communicative Musicality". I like Trevarthen's idea of "synrhythmia" and "amphoteronomy". It seems plausible to me that the vibrations of sound resonate with the circadian rhythms of physiology.

John Torday suggests that the physiological mechanisms and vibrations which underpin consciousness are themselves reflected in the action of cellular calcium pumps, which may well unlock the primal origins of cells in pre-history and the "implicate order" in Bohm's quantum mechanics. Music affects us profoundly - might it be because the whole universe and the wholeness of history is collapsed within the cellular organisation of our physiology? What a thought that is! I would like it to be true...

Tuesday, 26 June 2018

Tara McPherson on UNIX, technical componentisation and Feminism

I came back from Berkeley with a haul of books, partly thanks to the wonderful Moe’s Bookstore, and to a recommendation from a Japanese biology professor for a book of essays about current work on "communicative musicality", which I didn’t know about. In answer to the topic of my paper in Berkeley, “Do cells sing to each other?”, the answer is yes, and we are only just beginning to understand the aesthetic dimensions of communication which underpin biological self-organisation, and knowledge about which will, I believe, transform the way we think about human communication and learning. I also picked up a copy of Tara McPherson’s “Feminist in a Software Lab”, which my wife drew my attention to in another bookshop window.

It’s a beautiful book. McPherson is one of the lead figures in the digital humanities, and her book concerns underpinning critical issues within the most basic technologies we use. Unlike a lot of critical work about technology, which tends to be written by people who are not so comfortable with a command prompt, McPherson understands the world of UNIX kernels, cron jobs and bash scripting from the perspective of a practitioner. She also understands the technical rationale behind people like Eric Raymond, mention of whom caused such uproar among feminist critics at this year’s OER conference. But because she understands the technics, she can see beyond the surface to deeper problems in the way we think about technology, and to where Raymond’s deeply unpleasant politics is connected to a rationale for software development which very few dispute. She cites Nabeel Siddiqui who, on a SIGCIS listserv exchange about “Is UNIX racist?”, says:
“Certain programming practices reflect broader cultural ideas about modularity and standardization. These ideas also manifest in ideas about race during the Civil Rights movement and beyond… Computation is not simply about the technology itself but has broad implications for how we conceive of and think about the world around us… The sort of thinking that manifests itself in ‘color-blind’ policies and civil rights backlash have parallels with the sort of rhetoric expressed in Unix Programming manuals.” 
McPherson adds “this thinking also resonates with structures within UNIX, including its turn to modularity and the pipe.” With regard to education, she comments:
“Many now lament the overspecialization of the university; in effect, this tendency is a result of the additive logic of the lenticular or of the pipeline, where “content areas” or “fields” are tacked together without any sense of intersection, context or relation.” 
She quotes Zahid Chaudhury saying
“hegemonic structures tend to produce difference through the very mechanisms that guarantee equivalence. Laundry lists of unique differences, therefore, are indexes of an interpretive and political desire, a desire that often requires recapitulation to the familiar binarisms of subordination/subversion, homogeneity/heterogeneity, and increasingly, immoral/moral” 
This connection urgently needs to be made. The lack of diversity in tech is a problem – but it is underpinned by an approach to rationalist thinking which has gone unchallenged and which frames the way we think about science and software, pedagogy and the organisation of education – and, most importantly, diversity itself. Misogyny and racism are built into the genotype of techno-rationalism. This helps to explain how simply increasing diverse representation doesn’t really seem to change anything. Something deeper has to happen, and McPherson points to where that might be.

It is right to focus critique on the component orientation of modern software. We rationalise our software constructions as recursive aggregations of functional components; we replace one system with another which is deemed to have “functional equivalence”, all the time obliterating the difference between the village post office and an online service. Having said this, component orientation seems to help with the management of large-scale software projects (although maybe it doesn’t!), and facilitates the process of recombination which is an important part of many technical innovations. Yet McPherson also points to the fact that this separation and otherness is also a creation of boundary and distinction, and those distinctions tend to accompany distinctions of race and gender.
Through her Vectors project, McPherson has been probing at all this. She has enlisted the support of some powerful fellow travellers, including Katherine Hayles, whose work on cybernetics and the post-human is equally brilliant.

It’s rather easy these days to adopt a critical stance on technology – from the perspective of race, gender, sexuality, and so on. That’s because, I think, there’s so much injustice in the world and many people are hurting and angry. But critical intelligence demands more than the expression of outrage – that, after all, will be componentised by the system and used to maintain itself whilst it pretends to embrace diversity. Critical intelligence demands a deeper understanding of more fundamental biological, ecological, physical and social mechanisms which find expression in our technology.

McPherson is an advocate for making things – not just talking about them. If we all need to learn to code (and I am very sceptical about the motivation for government initiatives to do this), it is not because we all need to become workers for Apple or Microsoft. It is because we need a deep understanding of a way of thinking which has overtaken us in the 20th and 21st century. It’s about mucking-in and talking to each other.

Saturday, 23 June 2018

The Presence of the Past and the Future of Education

I've been in Berkeley for the last few days at the Biosemiotics gathering (http://biosemiotics.life). It's a long story as to how an educational technologist becomes interested in cell to cell signalling, but basically it involves cybernetics, philosophy, music and technology. In fact, all the things that this blog is about.

In addition to the biosemiotics conference, I went to Los Angeles to meet with Prof. John Torday of UCLA whose work on cell signalling (see http://www.thethirdwayofevolution.com/people/view/john-s.-torday) follows a different path to that of the biosemioticians like Terry Deacon. Deacon's work on the role of constraint in epigenesis (see https://www.amazon.co.uk/Incomplete-Nature-Mind-Emerged-Matter/dp/0393049914/ref=sr_1_1?ie=UTF8&qid=1529740533&sr=8-1&keywords=incomplete+nature) impressed me hugely because he was basically saying something that philosophers had been arguing for a long time: absence is causal. Actually, Bateson got there first (also at the conference was Bateson scholar Peter Harries-Jones, whose work on bio-entropy is very important), but Bateson made a fundamental distinction between the Jungian Pleroma and Creatura - between the non-living inanimate world which obeys the 2nd law of thermodynamics, and the living world, which works against entropy, producing "neg-entropy", or "information".

Torday goes beyond Bateson, and suggests a deep connection between pleroma and creatura, between matter and consciousness. To do this, he cites the quantum mechanics of David Bohm whose hidden variables, or pilot waves, presents a fundamental originating mechanism for what Bohm calls an "implicate order". Going beyond Bateson is no mean feat. I'm convinced that this is right.

Torday has been steadily producing empirical evidence in his work on the physiology of the lung and the treatment of asthma. There have been significant medical breakthroughs which can only be explained through his new perspective on cell signalling.

Put most basically, cells organise themselves according to the ambiguity in their environment. Since the environment is central to cellular organisation, changes to the environment become imprinted in cell structures, where environmental stress causes fundamental functional changes to organisms. This helps to explain Stephen Jay Gould's exaptation, or pre-adaptation, by which the swim bladder of the fish evolves into a lung.

But it's not just lungs and swim bladders. Consciousness itself may also be the result of a similar process. The primeval past of evolutionary development, from the big bang (or whatever was at the beginning) to the present, is enfolded in our being.

Torday and I talked a lot about education. The pursuit of truth is really a pursuit of the fundamental ground state - of what Bohm calls the implicate order. The truth resonates, and Bohm himself argued that through music we could glimpse something of the implicate order which we lose sight of in other aspects of intellectual life. But we also see it in love, justice, and mathematics.

I'm optimistic because I think that in the end we have the truth on a spring. Right now, it's stretched almost to breaking point... I'm experiencing some of the direct consequences of this myself at the moment. But truth will return - although springs, like cells, have hysteresis, so everything which has been remains present in everything that comes after. This should serve as a warning to those who pursue self-interest, greed and oppression.

At the biosemiotics conference at Berkeley I got everyone to make music. They loved it, partly because they had to engage with each other in making sounds and listening to each other. The implicate order is in our communicative practice - it can't be abstracted away. In the end, when things right themselves again, we will teach our students differently, and we will use our technology to transform the ways we organise deep conversations into what Bohm called "dialogue". Fundamentally, we will dance again, because conversation is dancing - it is, as I mentioned at Anthony Seldon's excellent HE Festival the other week, "con-versare"... to "turn together". 

Sunday, 10 June 2018

Information theoretical approaches to Ecology in Music

Information theory provides an index of "surprise" in a system. It concerns the difference between what is anticipated and what is unexpected. So in a string of events which might read A, A, A, A, A there is a strong anticipation that the next event will be A. The Shannon formula reveals this because the probability of A in this string is 1, and log 1 is 0, making the degree of surprise 0. But if the string reads A, A, A, A, Z we get a much higher number: the probability of A is 4/5, the probability of Z is 1/5, and their logs (base 2) are -0.32193 and -2.32193. Multiplying these by the probabilities and negating the sum (Shannon's H = -Σ p log₂ p) gives:

-((4/5 × -0.32193) + (1/5 × -2.32193)) =

-(-0.25754 + -0.46439) = 0.72193 bits
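This arithmetic can be checked in a few lines of Python; note that Shannon's H is the negated sum of p·log₂(p), so the surprise value comes out positive:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """H = -sum(p * log2(p)) over the symbol probabilities of s."""
    probs = [n / len(s) for n in Counter(s).values()]
    return -sum(p * math.log2(p) for p in probs)

print(shannon_entropy("AAAAA"))  # 0.0 -- no surprise at all
print(shannon_entropy("AAAAZ"))  # roughly 0.7219 bits
```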

The problem with this approach is that it sees a sequence of characters as a single description of a phenomenon that can be treated independently from any other phenomenon. But nothing has only a single description: there are always multiple descriptions of anything. This means that there are multiple dimensions of "surprise" which must be considered together when doing any kind of analysis - and each dimension of surprise constrains the surprising nature of other dimensions.

A musical equivalent to the A, A, A, A, A might be seen to be
But is this right? By simply calculating the entropy of the note C, this would give an entropy of 0. And so would this...
What if the Cs continued for hours (rather like the B-flats in Stockhausen's "Stimmung") - is that the same? No. 

A better way to think about this is to think about the interacting entropies of multiple descriptions of the notes. How many descriptions are there of the note C? Well, there are descriptions of the timbre, the rhythm, the volume, and so on. And these will vary over time, both from note to note, and from time t1 to time t2.

I've written a little routine in Python to pull apart the different dimensions in a MIDI file and analyse it in time segments for the interactions between the entropies of the different kinds of description (I'll put the code on GitHub once I've ironed-out the bugs!). 

Analysing the midi data produces entropies over time sections, which look a bit like this (using 2-second chunks):
These values for entropy for each of the dimensions can be plotted against one another (one of the beauties of entropy is that it normalises the "surprise" in anything - so sound can be compared to vision, for example). Then we can do more with the resulting comparative plots. For example, we can spot where the entropies move together - i.e. where it seems that one entropy is tied to another. Such behaviour might suggest that a new variable could be identified which combines the coupled values, and that the occurrence of that new variable can then be searched for and its entropy calculated. This overcomes the fundamental problem with Shannon in that it seems tied to a predefined set of  variables. 
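One simple, admittedly crude, way of spotting entropies that "move together" is to correlate the entropy series of two dimensions across the time chunks. The series below are invented for illustration; a strong correlation flags candidate dimensions for a combined variable:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented entropy series for three descriptions over six 2-second chunks.
pitch_H  = [0.8, 1.2, 1.9, 1.7, 1.1, 0.9]
rhythm_H = [0.7, 1.1, 2.0, 1.6, 1.0, 0.8]   # tracks pitch closely
volume_H = [1.5, 0.4, 1.3, 0.2, 1.6, 0.5]   # moves independently

print(pearson(pitch_H, rhythm_H))  # close to 1: candidates for a combined variable
print(pearson(pitch_H, volume_H))  # much weaker coupling
```

Correlation is only one possible index of coupling here; mutual information between the discretized entropy series would be closer in spirit to the Shannon framework, at the cost of more data per chunk.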

Comparing the interaction of entropies in music can be a process of autonomous pattern recognition - rather like deep learning algorithms. But rather than explore patterns in a particular feature, it explores patterns in surprisal between different features: the principal value of Shannon's equations is that they are relational. 

The point of pursuing this in music is that there is something in music which is profoundly like social life: its continuous emergence, the ebb and flow of emotional tension, the emergence of new structure, the articulation of a profound coherence, and so on. David Bohm's comment that music allows us to apprehend an "implicate order" is striking. I realised only recently that Bohm's thought and the cosmological thought of the composer Michael Tippett might be connected (Tippett only became aware of Bohm very late in his life, but expressed some interest in it). That's my own process of seeking cosmological order at work!

Tuesday, 5 June 2018

Screens, Print and the Ever-changing lifeworld

I'm writing some music at the moment. I'm using the self-publishing book tools provided by Blurb (http://blurb.co.uk) to help me focus on the always laborious process of studying and playing the written notes and gradually improving them. It seems to be working. My initial not-very-good notes sit on my piano in a beautiful book. I play at them, cross things out, and make adjustments, which I then feed into the score until I reach the stage of producing the next printed version.

This process isn't the same as simply printing-off pages. The Blurb book arrives a few days after ordering and it looks beautiful. The pages are bound together, which means that the ordering of the flow of the music is tied into the form of the book. In other words, the form of the book constrains the form of the music as I originally wrote it. The constraint is useful because it means that I have to work with what's there, chipping away bits and pieces.

On a computer screen anything is possible. Any mistake can be made, erased, remade, re-erased, etc. The computer presents unlimited possibilities. And that can be a problem in creative work. Unlimited possibilities = increased uncertainty in making decisions about what to do. The computer presents an ever-changing lifeworld.

As human beings (and indeed, as animals), we desire a manageably stable relationship with our environment. It is this primal force which sits behind Lorenz's 'imprinting' and Bowlby's 'attachment'. It starts with proximity to the parent, and transforms into proximal relationships to objects such as toys and teddy bears, and later I think into attachments to ideas - where some of those ideas are our own creations. This primal force is something which is destabilised by computers - and particularly by the AI-driven social media which is ever-changing.

I noticed that Dave Elder-Vass wrote about our 'attachment' to online services (although he never mentions Bowlby) in his recent "Profit and Gift in the Digital Economy". His instinct is right, but what he calls attachment is, I suspect, a clinging-on to some kind of stability. Facebook is like Harlow's wire-frame "mother": as it changes, we are compelled to follow. But as we do so, we are taken back to that primal stage of imprinting when we were babies. In adult life, however, we learn to create our own environment with concepts, artefacts and tools. Higher learning is an important stage of development in enabling us to do this.

The important point is that the adult life of declaring new concepts and ideas entails acts of communication which connect something inside us (a psychodynamic process) to something in our environment (a communicative process). The balance between the inner process and the outer process is a sign of health in the individual's relation to the world. So what if the communicative dimension is replaced with a constant stream of visual disruptions which demand the maintenance of proximity towards them? How do these inner world phenomena get expressed? How is the balance between inside and outside maintained?

I think the answer is, it isn't. There's something stupid about the way the continually shifting phenomena of the online world mean that the outer-world stability necessary for personal growth is never allowed to form. The reason is partly to do with the corporate business models of the social media companies: they need an ever-increasing range of transactions with their customers in order to justify their existence and maintain their value. This corporate model necessitates damaging the mental health of users by destabilising their lifeworld. The obsession with social media may be a kind of PTSD: might we see lawsuits in the future?

So what of print and my music writing? My book of notes arrived a few days after I ordered it, and it stays with me. I continually glance through it, thinking about changes and improvements, and scribbling all over it. But the book is stable. It becomes my attachment object, and since it is stable, I can coordinate the flow between my inner processes and my outer processes.

I encouraged a friend who is currently writing up his PhD thesis to send his draft document to Blurb to get it printed: "You need multiple descriptions of the thing you are working on in order to focus and develop your ideas". He did it, and it seems that his experience is very similar to my own.

There's something important about print. As the internet becomes ever more controlled by government and corporations, I wouldn't be surprised to see what is effectively the 3D printing of books become a major activity in the near future. People often talk about Stewart Brand's "Whole Earth Catalog" of the 1960s as being a proto-internet. But maybe the book itself is about to find a new lease of life for the sake of everyone's sanity!

Wednesday, 30 May 2018

Synthetic and Analytic Approaches to Technology and the Future of Education

Most research in almost all domains today has the form of a synthesis. Attempts are made to establish coherence between current knowledge, manifest phenomena, available theories and prevailing methodologies. Whether one is trying to reconcile relativity theory with quantum mechanics, to uncover the secrets of epigenesis, or to find out what the future of education holds, it's the same story. In the case of physics, the manifest phenomena are produced by experiments designed around the available models (the immensely complicated "standard model"). Experiments tend to confirm the model - which might mean it's right, but might also fail to tell us that something more fundamental is going on, from which vantage point we might look at things differently.

Major scientific breakthroughs rarely follow the synthetic path. We call them "Copernican" because they don't. We call them that because somebody comes along and asks an analytical question, not a synthetic one. They ask "What simple origin could be producing this complexity of theory, phenomena and methods?" Copernicus realised we were looking down the wrong end of the telescope, as did Galileo, Newton, Einstein and quite a few others.

My colleague Peter Rowlands has drawn the relationship between synthetic and analytic science like this (from the perspective of physics). I find it a powerful diagram:

So what of the future of education? The synthetic approach is to seek the manifest phenomena, use the available methodologies, assess the current knowledge and try to fit it with available theories. In education, of course, none of this is coherent: there is no coherent theory, there is no agreed methodology (although some have become normatively established - largely through PhD programmes), and when it comes to reporting manifest phenomena, all education seems able to manage is responses to questionnaires, interviews and test scores. Is a synthetic approach going to work? I doubt it. It will tell us what we already know.

So what of an analytical approach to education? Where would we start? The starting point is to speculate that there must be a simple originating principle behind the rapidly increasing complexity that we find ourselves in - something behind the babel of theories, interviews and methods. Freud saw it in terms of psychodynamics; John Bowlby saw it in terms of attachment; Gordon Pask saw it in terms of conversation; Maturana and Varela saw it in terms of what they call "structural coupling"; Beer saw it in terms of "variety"; Piaget saw it in terms of assimilation and accommodation. And these principles are not limited to explaining social phenomena: they are connected to the explanation of physical phenomena too (particularly in the case of Beer, Maturana and Pask).

Analytic approaches are good at predicting the future. One of the most powerful analytic approaches to the future of technology is contained in Winograd and Flores's Understanding Computers and Cognition. There are no interviews, and their methods are a combination of critique of the status quo, and the seeking of a coherence between philosophy, speech act theory and cybernetics. But their predictions were right where so many others were pointing in the opposite direction. Edgar Morin's Seven Complex Lessons of Education for the Future (see unesdoc.unesco.org/images/0011/001177/117740eo.pdf) is another example. He too is well ahead of his time - even if the book is less than practical.

It's interesting that the concentration of scientific effort on synthesis can in large part be attributed to the current practices of scientific publication. In order to get past peer review, in order to play the citation game, etc., one has to synthesise knowledge. The citation is the mark of synthesis. An important step in moving towards a more analytic frame is to break the hold that publishers have on academic activity.

Saturday, 26 May 2018

Individuation and Higher Learning in Vladivostok (Paper for the Philosophy of Higher Education Conference)

Of all the things I am doing at the moment, a radical educational experiment in Russia has been by far the best. Weirdly, Seb Fiedler and I have had to travel to the other side of the planet to do something different. I'm going to San Francisco in a couple of weeks for a conference on biosemiotics, but it struck me that in the 60s, to do something radical, people went to California to escape the stiffness of the establishment. Now, the establishment is definitely in California (it's defined by California!)... so we fly 14 hours in the other direction... to Vladivostok! Not as warm in climate, but just as warm in terms of the people there. And when I think of the trouble Russians have to go to to get a visa to come and see me in the UK (they first have to fly 9 hours from Vladivostok to Moscow), it puts my 10-hour flight to San Francisco in perspective.

It's taken a while for me to articulate what the plan was in Russia. As I've written before, it's a course on systems thinking, but really, we are aiming to use technology to oil the connections between the inner world of learners and the outer world of communication. It's pretty much what psychotherapists do. Which leads me to think that Higher Learning is really about "Individuation" in a Jungian sense.

My 18-year-old daughter, who is eschewing university (at least for now) in disgust at it simply being "more school for which we have the privilege of paying" (she's quite right), has been pointing to the rise in mental health problems at university. "But they're doing this to their students!" She may be right. But we don't understand how or why. Except that I think it's got something to do with talking and listening.

The technological explosion of the last few years has exposed us to vastly increased variety in the sensory stimulation which reaches our minds, but the experience of increased variety is rarely talked about. Instead we may talk about Trump or cute kittens and giggle, but never talk about what is actually happening to us. So a lot is going in, and not enough is being intelligently exchanged in discourse to maintain a balance between the inner psychodynamic mechanism and social mechanisms. Internet porn is probably the most obvious example of where this is happening, but really it's everything from fake news to constant social media checking. AI may help alleviate the problem by facilitating deeper human connections between people, or it may exacerbate it. Either way, we have to wake up to what is happening, because AI is going to make it bigger.

The human result is unmanaged uncertainty in the psychodynamic process - which is a recipe for varieties of psychological problems. This Russian course is constructed to use the rich stimulation of the web - particularly in terms of the vast array of resources from all subjects - to get people talking about deeper mechanisms underpinning life and experience. It's a bit like an updated version of Marion Milner's "A Life of One's Own". From a technological point of view, it's simple. From a human perspective, it's been fascinating and rewarding.

Uncertainty, Objects and Technology in Education: Inverting the relation between content, process and conversation in a complex world

Mark William Johnson
Sebastian H.D. Fiedler, University of Hamburg
Svetlana Rodriguez Arciniegas, Far Eastern Federal University 
Maria Kirilina, Far Eastern Federal University

Computer technology has changed education and the world in a remarkably short time, and nobody seems to be certain exactly what has just happened. Uncertainty in educational practice has increased as people try to decide which tools to use, while confusion about institutional purpose, coupled with managerialism, metricisation and financialisation, has left scholars of higher education expressing concern about the state of universities and higher learning (Brown 2010; Collini 2017; Barnett 1990). In the face of market demands, university has become more like school. Defenses of ‘higher learning’ as providing necessary ‘unsettling’ (Barnett 1990) through presenting ‘troublesome knowledge’ (Meyer and Land 2006), giving students ‘epistemic access’ (Morrow 2009), or providing opportunities for personal transformation or individuation (Mezirow 1991) do not appear to have had mainstream impact on pedagogic practice. Such distinctions themselves raise questions about the status of the ancient academy in the face of a new world of communications technology which works very differently from the university’s slow rhythms, and students appear unwilling to be ‘troubled’ once they see themselves as customers. This is not the first time in history that humans have been faced with technical changes which render existing social structures no longer fit for purpose. The computer and its communication networks have disrupted the most basic foundation of human activity: the way we talk to each other. Our institutions of higher education have yet to find an effective way of reorganising themselves in response.

We present an argument based on information-theoretical analysis concerning the relationship between uncertainty in education and technological development. We argue that technological development creates uncertainty in the environment of existing institutions, and that the social change which sometimes follows technological development is a reaction to this increasing uncertainty. We contend that the institution of education is in a positive feedback loop with environmental uncertainty, which it is exacerbating with its current use of technology. This position, we argue, distinguishes itself from technological determinist arguments about the social effects of technology, whilst also avoiding the often equally problematic social constructivist position (Feenberg and Callon 2010; Smith 2010). Technology does not determine social change, but creates uncertainty by increasing the variety of options for acting.

According to the information theory of Shannon (Shannon and Weaver 1949), an increase in the number of options increases the maximum entropy of choice, so the selection problem of choosing a particular option to pursue becomes more difficult. Institutions - and the people within them - have to adapt to this increased uncertainty: sometimes by attenuating the technological possibilities (i.e. with new regulations to banish particular technologies), or sometimes by exploiting some aspects of a technology to reinforce existing institutional structures (e.g. the LMS’s amplification of the classroom). Recent developments in higher education have seen both of these reactions.
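The Shannon point can be illustrated numerically. The sketch below assumes nothing beyond Shannon's entropy formula: with n equally likely options, the entropy is maximal at log2(n) bits, so every doubling of the available options adds one bit of uncertainty to the choice.

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# With n equally likely options, entropy is at its maximum: H_max = log2(n).
# Each doubling of the available options adds one bit of uncertainty.
for n in (2, 4, 8):
    print(n, entropy([1 / n] * n))  # prints 1.0, 2.0 and 3.0 bits respectively
```

Attenuation (banning options) lowers this ceiling directly; amplification of existing structures channels choice into a few familiar options, skewing the distribution away from uniform and so reducing the entropy actually experienced.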

While the ancient academy developed its structures to manage a once stable environment of uncertainty concerning science and knowledge, the technologically-driven explosion of uncertainty renders its structures ineffective. In a sea of uncertainty, psychoanalytical and sociological work suggests that intersubjective engagement through conversation can still provide effective management of personal uncertainty through what Schutz calls the ‘pure we-relation’ (Schutz 1974), Luhmann calls ‘double contingency’ (Luhmann 1996) and Freudian psychoanalysis characterises as a ‘talking therapy’ (Freud 2016). The ‘tuning-in’ to the inner worlds of each other through conversation remains the most powerful mechanism for addressing uncertainty at the interface between the psyche and the social environment. Such a personalisation of uncertainty management, however, presents challenges to formal structures and practices in education which are tied to curricula and rigid assessment schemes.

The search for new ways of exploiting technology in organised learning conversations which do not contribute to the uncertainty feedback loop is urgent, since the pace of technologically-driven uncertainty is not going to slow. We report on an experiment at the Far Eastern Federal University in Russia where technical artefacts have been used in conjunction with activity coordination tools and flexible assessment strategies to put learner intersubjective engagement centre-stage and create a virtuous cycle between what we call, following Luhmann (Luhmann 1996), the management of ‘psychic uncertainty’ and ‘social uncertainty’.

Figure 1 shows a schematic diagram, drawing on the cybernetics of Stafford Beer (Beer 1995), of the experiment’s uncertainty management approach, where each individual ‘self’ or ‘Ego’ has both structure and uncertainty (contained in the large lower box) which are kept in balance by a process similar to Freud’s concept of ‘primary’ and ‘secondary’ process (Ehrenzweig 1968). This psychic uncertainty, which we relate to the Freudian ‘Id’, is managed by a metasystem (at the top): in this case, the individual’s ‘Superego’. The metasystem helps to determine communicative utterances, assisted by the presence of mediating technological artefacts. A virtuous cycle is theoretically possible where effective management of psychic uncertainty leads to powerful communications which in turn benefit psychic processes.

In the experiment, technological artefacts (videos, pictures), other objects (shells, rocks, trash, artworks) and visiting experts (artists) were mashed up in unusual combinations to stimulate conversation through coordinated activities. The process is designed to reflect the lived experience of exposure to a rich variety of online phenomena, but to bring the psychodynamic effects of this into conscious experience and conversation. We report on the results of a three-day pilot with 30 participants.
In conclusion, we argue that the full force of technology’s threat to education and society has yet to be felt. The nature of this threat is not the automation of human action; the threat lies in the pathological reaction of human institutions to uncertainty created by new technology. A good society manages its uncertainty. The conversational inversion of uncertainty management of the kind we report presents an opportunity to explore the ways technological artefacts - whether videos, AI, or virtual reality - can be used to drive a virtuous personal and convivial uncertainty management process.

Barnett, Ronald (1990). The Idea of Higher Education. McGraw-Hill Education (UK). isbn: 978-0-335-09420-2.
Beer, Stafford (1995). Platform for Change. Chichester; New York: Wiley. isbn: 978-0-471-94840-7.
Brown, Roger, ed. (2010). Higher Education and the Market. New York, NY: Routledge. isbn: 978-0-415-99169-8.
Collini, Stefan (2017). Speaking of Universities. London; New York: Verso. isbn: 978-1-78663-139-8.
Ehrenzweig, A. (1968). The Hidden Order of Art. Weidenfeld and Nicolson.
Feenberg, Andrew and Michel Callon (2010). Between Reason and Experience: Essays in Technology and Modernity. Cambridge, Mass: The MIT Press. isbn: 978-0-262-51425-5.
Freud, Sigmund (2016). Introductory Lectures on Psychoanalysis. CreateSpace Independent Publishing Platform. isbn: 978-1-5375-4930-9.
Luhmann, Niklas (1996). Social Systems. isbn: 978-0-8047-2625-2.
Meyer, Jan and Ray Land (2006). Overcoming Barriers to Student Understanding: Threshold Concepts and Troublesome Knowledge. Routledge. isbn: 978-1-134-18995-3.
Mezirow, J. (1991). Transformative Dimensions of Adult Learning. San Francisco: John Wiley & Sons. isbn: 978-1-55542-339-1.
Morrow, W. (2009). Bounds of Democracy: Epistemological Access in Higher Education. HSRC Press. url: http://repository.hsrc.ac.za/handle/20.500.11910/4739.
Schutz, A. (1974). Collected Papers I: The Problem of Social Reality. The Hague; Boston: Springer. isbn: 978-90-247-5089-4.
Shannon, Claude E. and Warren Weaver (1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press. isbn: 978-0-252-72548-7.
Smith, Christian (2010). What Is a Person?: Rethinking Humanity, Social Life, and the Moral Good from the Person Up. Chicago; London: University of Chicago Press. isbn: 978-0-226-76594-5.