Thursday, 26 July 2018

Dialectic and multiple description in Haydn

The slow movement of Haydn's piano sonata in C major, no. 50 is full of gestures which articulate a simple tonality, but which are so varied in their rhythm, range of the keyboard, dynamics, texture and lyricism that they maintain interest in what would otherwise be rather uninteresting music. But what Haydn actually does is anything but uninteresting: it is full of surprise, and is highly expressive. Where does the expression come from? This is Lang Lang playing it:



The contrast with the Bach sinfonia no. 9 (which I looked at a couple of posts ago) couldn't be greater. Yet I think the principle is the same. Haydn, like Bach, creates overlapping redundancies. But where Bach takes a chromatic harmonic structure and weaves motifs all the way through it, Haydn takes a relatively simple harmonic structure (I-V-I-IV-I-V-I...). Played as straight chords, these are not nearly as interesting as the Bach chorale chords with their diminished chords and exotic chromaticism. This means that the reduction of the Haydn is even more unsurprising than that of the Bach - a very low entropy.
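
As a crude illustration of this difference (the chord sequences below are invented stand-ins, not reductions of the actual scores), Shannon's formula gives a lower entropy to the simple diatonic progression than to the more chromatic one:

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy (in bits) of a sequence of symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Invented stand-ins: a simple diatonic reduction (Haydn-like) versus
# a more chromatic one (Bach-like).
haydn_like = ["I", "V", "I", "IV", "I", "V", "I", "V", "I"]
bach_like = ["i", "V", "i", "dim7", "iv", "V", "dim7", "N6", "i"]

print(round(entropy(haydn_like), 2))  # ~1.35 bits: few chord types
print(round(entropy(bach_like), 2))   # ~2.2 bits: more types, more surprise
```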

Haydn seems to add disorder through contrasting ideas: this is, after all, the composer who wrote a "representation of chaos" at the beginning of "The Creation", which does a similar thing. But like Bach, this is not disorder, it is redundancy. Basically, rhythm, articulation, range, figuration and dynamics are used to articulate the same basic simple tonality of the piece: one way of describing it follows another in often contrasting juxtaposition.

Each way of describing the tonality is not arbitrary, but unlike Bach, the redundancy of the added patterns does not run through the whole thing: it is episodic, frequently punctuated by silence, from which Haydn then surprises us again.

Silence is important in this process. Silence occurs when the parameters of the overlapping redundancies all tend towards the same thing. I suggested in my analysis of Bach that this shared movement of entropy between different parts was an indication of a new idea: a new countable thing. And indeed so it is in Haydn. It's just that the ideas are layered sequentially rather than on top of one another.

Each closure to silence creates uncertainty as to what is coming next. Something seems to have been defined that is stable - will it be continued, or will something different follow? Haydn plays with this uncertainty, and when we do get a lyrical melody (the falling melody over an undulating accompaniment) it feels like a relief.

In dialectical theory, the succession of musical ideas might be considered to be like a "thesis", "antithesis", "synthesis" triad. But really, those three elements are different descriptions of the same thing, their juxtaposition a way of accumulating redundancies on a journey towards a relationship between different descriptions whose interactions reveal the inner structure of the music, and which can close to silence without feeling the need for more. The tension that is established between different descriptions isn't "resolved" as such. It creates the conditions for the emergence of another description which is more fundamental than those which precede it. This new description coexists with its predecessors.

Such moments are felt emotionally. Emotion and meaning in music are like a lighthouse which guides the generation of a set of redundant but never arbitrary descriptions which eventually together reveal some kind of insight into what underpins the whole thing.

Freire and the Arbitrariness Problem in Learning Conversations: insight from music, maths and physics

If the cyberneticians were completely right about learning conversations, then conversations would have the form of a kind of Brownian motion: utterances would respond to utterances, learners and teachers may adapt to the utterances they hear, but the whole thing would be a kind of wandering in search of a point of resolution. This is why I think Pask believed that "Teach-back" was critical to the learning conversation - because without it, how could the learning conversation have any closure? So closure comes from a logical judgement as to whether the student has "taught back" what the teacher taught to them.

I don't think this is quite right, and more critically, it isn't what we see in aesthetic phenomena like music. The critical point is that utterances are not arbitrary. Learning conversations take a form which is found in their originating material. In music the originating material is in the initial musical ideas, the instruments, etc. In learning, the originating material is in the whole constitution of learners and teachers, and the subject domain. There is no originating point of a learning conversation which isn't bathed in personal biography, and histories and social structures which extend far beyond any individual life.

What appears to happen in music is that completion arises from the overlaying of multiple descriptions which unfold some originating idea. Eventually there is some kind of coordination between descriptions which means that a return to some originating state (silence) becomes a natural progression. This would suggest that those physicists who believe that everything comes from nothing (like Peter Rowlands) are right. Quantum mechanics holds some important messages for constructivists.

Composers say what they say by finding ways of generating multiple descriptions (redundancy) and layering redundancies over one another. But redundancies are not chosen arbitrarily. They emerge through a kind of symmetry-breaking process, where the originating principle of this symmetry breaking is implicit in the first instance.

Learning conversations are much like this I think. But what is the originating material of a symmetry breaking process in a learning conversation? It is the root of oppression within the individuals talking. Freire was right.

Closure does not arrive through teach-back - although this may be seen to be an epiphenomenon of closure. Closure arrives through the articulation of multiple descriptions of oppression. This may be what it is for something to "make sense" (c.f. Weick on sensemaking: "the ongoing retrospective development of plausible images that rationalize what people are doing").

What this means, most simply, is that the model of perturbation and response needs a historical dimension. The reflexive selection of utterances depends on the dynamic interplay between three dimensions:

  1. prediction based on what has just happened
  2. prediction based on a model of what is happening
  3. speculation on possible models of what might be happening, and what might be possible. 

Daniel Dubois's mathematical work on anticipatory systems identifies these three dimensions as "recursive", "incursive" and "hyperincursive". They are coexisting processes. Between them they produce elaborate fractal patterns like this:
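
Dubois's standard illustration of these three modes is the logistic map. Computed recursively, the next state depends only on the present; computed incursively, the next state depends on itself and can be solved algebraically; computed hyperincursively, the future state is defined implicitly and has two roots, so the trajectory branches at every step, which is what generates the fractal patterns. A minimal sketch (my paraphrase of Dubois's equations, not his code):

```python
from math import sqrt

A = 3.8  # growth parameter of the logistic map

def recursive_step(x):
    # x(t+1) = A*x(t)*(1 - x(t)): the future depends only on the present
    return A * x * (1 - x)

def incursive_step(x):
    # x(t+1) = A*x(t)*(1 - x(t+1)), solved algebraically for x(t+1)
    return A * x / (1 + A * x)

def hyperincursive_step(x):
    # x(t) = A*x(t+1)*(1 - x(t+1)): the future is defined implicitly,
    # giving two roots, so the trajectory branches at every step
    disc = A * A - 4 * A * x
    if disc < 0:
        return []  # no real future state
    r = sqrt(disc)
    return [(A - r) / (2 * A), (A + r) / (2 * A)]

xr = xi = 0.5
for t in range(5):
    print(t, round(xr, 4), round(xi, 4),
          [round(y, 4) for y in hyperincursive_step(xi)])
    xr, xi = recursive_step(xr), incursive_step(xi)
```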

When I first saw patterns like this, I thought it looked rather deterministic, too neat. But now I think that is not what the picture is showing. It is showing, in an idealised way, that the interaction between history, the present and the future produces moments of symmetry breaking, from which new descriptions of the same thing can emerge. Multiple descriptions come together and produce closure because the combination of past, present and future produces distinct points of "resolution" (perhaps in the big diamonds of the picture).

More importantly, the stratification of this diagram suggests that utterances are not arbitrary. They exist in, and articulate, a structure. Emancipation is the revealing of the structure through very careful articulation of levels of description: the production of redundancy. This is very similar to what composers do. It is also worth comparing to the stratification articulated in the quantum mechanical twin-slit experiment:

Stratification is not arbitrary. Learning conversations are not Brownian motion. They are layered accretions of redundant patterns - like music. 

Saturday, 21 July 2018

Redundancy and the Communication of Meaning in Music: Bach's 3-part Invention (Sinfonia) no. 9 in F minor

Hindemith chose Bach's F minor 3 part invention for analysis to demonstrate his theory of composition in "The craft of musical composition". It is a fascinating piece - one of Bach's most chromatic and expressive pieces of keyboard writing, and rather like other extraordinary musical moments (like Wagner's musical orgasm in Tristan), it raises the question "What is going on?". I like Hindemith's theory very much (although not as much as I like his music!), but his analysis sent me on my own analytical journey through the lens of information theory.

What happens in music, I believe, is the unfolding of a structure where multiple constraints are interwoven and overlaid. Information theory can provide some insight into this (as is discussed in a very recent paper from Loet Leydesdorff, myself and Inga Ivanova in the Journal of the Association for Information Science and Technology: https://onlinelibrary.wiley.com/doi/full/10.1002/asi.24052), and particularly the meaningfulness of the communication.

When considering music from the perspective of information theory, there are three fundamental problems to be overcome:

  1. Music has no object of reference. So how is meaning communicated without reference?
  2. Music emerges over time, producing novelty and unfolding a diachronic structure which appears to be linked to its synchronic structure. For this reason, music is not ergodic, unlike the use of letters in a language: its entropy over one period of time is not the same as its entropy over a different period of time. 
  3. Music's unfolding novelty is not arbitrary: novelty in music appears to be a symmetry-breaking process similar to that found in epigenesis, where both synchronic and diachronic symmetries gradually define structure.

The first page of Bach's music looks like this:
Here's a performance:



The piece is fugal and obviously in three parts. The texture is very bare, and this bareness seems to contribute to the expressiveness of the music. However, there is a harmonic structure which is articulated throughout the piece, and a reduction of the harmony, written as chords per beat, looks something like this:

This kind of harmonic reduction is very common in music analysis as a method for getting at the "deep structure" of music (particularly in Schenker). It is typical of Bach's music that the harmonic reduction is very much like a chorale (hymn). In trying to understand how Bach's music works, we can start by asking about the relation between the harmonic reduction and the finished piece. 

At first glance, from an information theory perspective, the block chords of the reduction seem to remove a considerable amount of entropy which exists in the movement of parts in the original. It does this by compressing the variety into single "beats", which taken as an entirety have an entropy of 0. However, the variety compression makes more apparent the shifting harmonies. Written in chord symbols, this is an extended I (tonic) - V (dominant) - I (tonic) movement, interspersed with diminished chords (which are harmonically ambiguous) and an oscillation between major and minor chords. But if one were to calculate the entropy of the harmony, it wouldn't be that great.

However, to say that the meaningfulness of Bach's music is arrived at by adding entropy is misleading. If entropy were added, we would expect increasing disorder. But if disorder is what is added, then it is unlikely that a coherent harmonic reduction would be possible, and furthermore it is unlikely that the piece would have any coherence.

The striking thing about comparing the harmonic reduction with the original is that it is not disorder (entropy) which is added. It is redundancy.

This is most obvious in the melodic motif which Bach uses throughout the piece:
This expresses a number of patterns which are represented throughout: the 3-quaver rhythm followed by a quaver pause; the interval jump of a third and then a fall of a second (except for moments of higher tension when he breaks this); and the phrasing with emphasis on the second note. 

In fugal writing, where each voice imitates others, the redundancy of patterns like this motif is enhanced. But it is interspersed with other ideas: the variety of the pattern is broken in the second bar, with a closing motif which also gets repeated throughout:


Shannon information theory, derived from Boltzmann's thermodynamics, measures uncertainty or disorder. It does this by identifying features which can be counted, and calculates the probabilities of their occurrences. 

On observing Bach's piece, the two motifs identified might be considered to be "features", and their occurrences over time calculated. Equally, the occurrences of individual notes might be calculated, or the harmonic sequence identified in the harmonic reduction. But somehow this seems to miss the point. When we do Shannon analysis of text, we can identify words, and usually those words have reference, and so the communication of meaning can be inferred from the patterns of the words and their references (this is how big data techniques like Topic Modelling work).

The second problem, about music's lack of ergodicity, is more serious still. Whatever is counted at the beginning is not what can be counted at the end. Novelty arises through the unfolding. Since Shannon's formula requires an index of features, how might it accommodate features which cannot be foreseen from the outset?

However, there are fundamental features for which the entropies can be calculated: notes, rhythm, intervals, harmony, etc. An emergent structure can be arrived at through the interaction between different constraints: music creates its own constraints. Every measurement of entropy of a particular feature is constrained by the measurements of entropies of other features: the choice of notes depends on the choice of harmony or rhythm, for example.

Seeing any measurement of entropy in this way means that any measurement of entropy is also an index of the constraints (or redundancies) that produce that measure of entropy. In this way, we do not need to engage with Shannon's formula for redundancy - which is good because it invokes the notion of "maximum entropy" which is a concept that is beyond measurement. 

Constraints work over time. The effect of one constraint on another results in changes to the entropies of different features. If one constraint changes at the same time as another, then there is some connection between them.

In considering the motif that Bach uses to structure his fugue, the intervals, the rhythm, the expression and the pattern of notes are all the same. That means that the entropies of each fundamental element relative to the others is tightly coupled. 

Does this mean that tight coupling between fundamental features is a sign of an emergent cell or motif? I think it might, and of course, if tight coupling is identified as a new idea, then the new idea becomes a new feature in the analysis, upon which its own entropy can be calculated relative to other features.
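
To make this concrete, here is one crude way such coupling might be operationalised (the streams of pitch, rhythm and interval symbols are invented for illustration, not transcribed from the sinfonia): compute each feature's entropy over a sliding window, and count how many features shift their entropy at each step.

```python
from collections import Counter
from math import log2

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def windowed(seq, size):
    return [entropy(seq[i:i + size]) for i in range(len(seq) - size + 1)]

# Invented parallel streams describing one voice: pitch, rhythm, interval.
pitches = ["F", "Ab", "G", "F", "C", "Db", "C", "Bb", "Ab", "G", "F", "E"]
rhythms = ["q", "q", "q", "r", "q", "q", "q", "r", "s", "s", "s", "s"]
intervals = [3, -1, -1, 0, 4, -1, -1, -1, -1, -1, -1, -1]

hp, hr, hi = (windowed(s, 4) for s in (pitches, rhythms, intervals))

# A crude coupling signal: how many features change entropy at each step.
for t in range(1, len(hp)):
    moves = sum(abs(h[t] - h[t - 1]) > 0.2 for h in (hp, hr, hi))
    print(t, moves)  # steps where all three move together suggest a new idea
```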

At the end of a piece, there is tight coupling between everything. Everything dissolves into silence. The important thing is that everything dissolves into what it came from. This is directly related to problem 3 above: novelty is not arbitrary. All novelty articulates different descriptions of the whole. Nothing (silence) is arrived at when those descriptions come together in the same way that the colour spectrum comes together to produce white light. 

Music is profound in its communication of meaning because it expresses something fundamental in our own biological structure. In music's unfolding broken symmetry, we recognise our own unfolding broken symmetry. This principle is behind what I think Alfred Schutz saw in his paper "Making Music Together". When education really works, it has exactly this same property expressed in the loving relations between human beings.

Tuesday, 17 July 2018

How is music meaningful? An Information Theoretical approach to analyse the communication of meaning without reference

These are notes for a paper on music. It is not meant to be in any way coherent yet!

1 Introduction

Over the centuries that music has been considered as a scientific phenomenon, focus has tended to fall on music’s relation to number and proportion, which has been the subject of scholarly debate since antiquity. In scientific texts on music from Pythagoras to the Enlightenment, attention concentrated on those aspects of sound which are synchronic, of which the most significant is harmony and the proportional relation between tones. Emerging scientific awareness of the physical properties of sound led scientists towards deeper analysis of the universe. This scientific inquiry exhibited feedback with music itself: deeper scientific understanding inspired by music, harmony and proportion led to advances in the manufacture of musical instruments and the consequent emergence of stylistic norms.

Less attention has been paid to music’s diachronic structure, despite the fact that synchronic factors such as timbre and harmony appear strongly related to diachronic unfolding, and that the diachronic dimension of music typifies what Rosen calls “the reconciling of dynamic opposites” at the heart of the classical style. In more recent years, the role of music in understanding psychosocial and biological domains has grown. Recent work around the area of “Communicative Musicality” has focused more on the diachronic unfolding alongside its synchronic structure, for it is here that it has been suggested that the root of music’s communication of meaning lies, and that this communication is, in Langer’s suggestion, a “picture of emotional life”.

Diachronic structures present challenges for analysis, partly because dialectical resolution between temporally separated moments necessarily entails novelty, while the analysis throws up fundamental issues of growth within complexes of constraint. These topics of emerging novelty within interacting constraints have absorbed scientific investigation in biology and ecology in recent years. Our contribution here is to point to the homologous analysis of innovation in discourse, and so we draw on insights into the dynamics of meaning-making within systems which have reference as a way of considering the meaning-making within music which has no reference, with the aim of providing insights into both. The value of music for science is that it focuses attention on the ineffable aspects of knowledge: a reminder of how much remains unknown, how the reductionism of the enlightenment remains limited, and how other scientific perspectives, theories and experiments remain possible. And for those scientists for whom music is a source of great beauty, no contemplation of the mysteries of the universe could be complete without considering the mysteries of music.

2 What do we mean by meaning?

Any analysis of meaning invites the question “what do we mean by meaning?” For an understanding of meaning to be meaningful, that understanding must participate in the very dynamics of meaning communication that it attempts to describe. It may be because of the fundamental problems of trying to identify meaning that focus has tended to shift towards understanding “reference”, which is more readily identifiable. There is some consensus among scholars focused on the problem that meaning is not a “thing” but an ongoing process: whether one calls this process “semiosis” (Peirce), or adopts Whitehead’s idea of meaning, or sees it as the result of the management of cybernetic “variety” (complexity). The problem of meaning lies in the very attempt to classify it: while reference can be understood as a “codification” or “classification”, meaning appears to drive the oscillating dynamic between classification and process in mental process. So there is a fundamental question: is meaning codifiable? For Niklas Luhmann, this question was fundamental to his theory of “social systems”. Luhmann believed that social systems operate autopoietically by generating and reproducing communicative structures of transpersonal relations. Through utterances, codes of communication are established which constrain the consciousness of individuals, who reproduce and transform those codified structures. Luhmann saw the different social systems of society as self-organising discourses which interact with one another. Thus, the “economic system” is distinct from the “legal system” or the “art system”. Meaning is encoded in the ongoing dynamics of the social system. An individual’s utterance is meaningful because the individual’s participation in a social system leads to coordinated action by virtue of the fact that one has some certainty that others are playing the same game: one is in-tune with the utterances of others, and with the codified social system within which one is operating. Luhmann borrowed a term which Parsons invented, inspired by Schutz’s work: “double contingency”. All utterances are made within an environment of expectation of the imagined responses of the other (what Parsons calls “Ego” and “Alter”).

The problem with this perspective is that meaning results from a kind of social harmony: meaning ‘conforms’. Fromm, for example, criticizes this attitude as the “cybernetic fallacy”: “I am as others wish me to be”. Yet meaningful things are usually surprising. An accident in the street is meaningful to all who witness it but is outside the normal codified social discourse. Even in ordinary language, one emphasizes what is meaningful by changing the pitch or volume of one’s voice for emphasis on a particular word, or by gesturing to some object as yet unseen by others: “look at that!”. Whatever we mean by meaning, the coordination of social intercourse and mutual understanding is one aspect, but novelty or surprise is the critical element for something to be called “meaningful”. It is important to note that one could not have surprise without the establishment of some kind of ‘background’.
However, there is a further complication. It is not simply that surprise punctuates a steady background of codified discourse: codified discourse creates the conditions for novelty. The effects of this have been noted in ecology (Ulanowicz) and biology (Deacon). The establishment of regularity and pattern appears to create the conditions for the generation of novelty in a process which is sometimes characterized as “autocatalytic” (Deacon, Ulanowicz, Kauffman). This, however, is a chicken-and-egg situation: is it that meaning is established through novelty and surprise against a stable background, or is it the stable background which gives rise to the production of novelty, which in turn is deemed to be meaningful?

This dichotomy can be rephrased in terms of information theory. This is not to make a claim for a quantitative index of meaning, but a more modest claim that quantitative techniques can be used to bring greater focus to the fundamental problem of the communication of meaning. Our assertion is that this understanding can be aided through the establishment of codified constraints through techniques of quantitative analysis in order to establish a foundation for a discourse on the meta-analysis of meaningful communication. This claim has underpinned prior work in communication systems and economic innovation, where one of us has analysed the production of novelty as the result of interacting discourses between the scientific community, governments and industries. Novelty, it is argued, results from mutual information between discourses (in other words, similarly codified names or objects of reference), alongside a background of mutual redundancy or pattern. Information theory provides a way of measuring both of these factors. In conceiving of communication in this way, there is a dilemma as to whether it is the pattern of redundancy which is critical to communication or whether it is the codified object of reference. Music, we argue, suggests that it is the former.

3 Asking profound questions

In considering music from the perspective of information theory, we are challenged to address some fundamental questions: 
1. What does one count in music?
2. From where does musical structure emerge?
3. What is the relation between music’s synchronic structure and its diachronic unfolding?
4. Why is musical experience universal?

4 Information and Redundancy in Music: Consideration of an ‘Alberti bass’

Music exhibits a high degree of redundancy. Musical structure relies on regularity, and in all cases of the establishment of pattern and regularity, music presents multiple levels of interacting pattern. For example, an ‘Alberti bass’ establishes regularity, or redundancy, in the continual rhythmic pulse of the notes that form a harmony. It also exhibits regularity in the notes that are used, and the pattern of intervals which are formed through the articulation of those notes. It also exhibits regularity in the harmony which is articulated through these patterns. Its sound spectrum identifies a coherent pattern. Indeed, there is no case of a musical pattern which is not subject to multiple versions of redundant patterning: even if a regular pattern is established on a drum, the pattern is produced in the sound, the silence, the emphasis of particular beats, the acoustic variation from one beat to another, and so on. Multiple levels of overlapping redundancy mean that no single musical event is ever exactly the same. This immediately presents a problem for the combinatorics of information theory, which seeks to count occurrences of like-events. No event is ever exactly ‘like’ another because the interactions of the different constraints for each single event overlap in different ways. The determination of a category of event that can be counted is a human choice. The best that might be claimed is that one event has a ‘family resemblance’ to another.

The Alberti bass, like any accompaniment, sets up an expectation that some variety will occur which sits within the constraints that are established. The critical thing is that the expectation is not for continuation, but for variety. How can this expectation for variety be explained?

In conventional information theoretical analysis, the entropy of something is basically a summation of the probabilities of the occurrences of the variety of different codified events which are deemed to be possible and can be counted. This approach suffers from an inability to account for those events which emerge as novel, and which could not have been foreseen at the outset. In other words, information theory needs to be able to account for a dynamically self-generating alphabet.
Considering the constraint, or the redundancy, within which events occur presents a possibility for focusing on the context of existing or potentially new events. But since the calculation of redundancy requires the identification of information together with the estimation of the “maximum entropy”, this estimation is also restricted in its ability to capture novelty.
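
One modest gesture towards such a calculation (a sketch only; it does not solve the deeper problem of recognising genuinely new features) is to let the index extend itself as the stream unfolds, recomputing entropy over the alphabet seen so far:

```python
from collections import Counter
from math import log2

def running_entropy(stream):
    """Entropy recomputed as the alphabet grows with each new symbol."""
    seen = Counter()
    out = []
    for symbol in stream:
        seen[symbol] += 1  # the index extends itself on first sight
        n = sum(seen.values())
        h = -sum((c / n) * log2(c / n) for c in seen.values())
        out.append((symbol, len(seen), round(h, 3)))
    return out

# A stream in which a genuinely new event ("X") appears midway.
for row in running_entropy(["A", "B", "A", "B", "A", "X", "A", "B", "X"]):
    print(row)
```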

However, we have suggested that the root of the counting problem is that nothing is exactly the same as something else. But some things are more alike than others. If an information theoretical approach can be used to identify the criteria for establishing likeness, or unlikeness, then this might be used as a generative principle for the likeness of things which emerge but cannot be foreseen at the outset. Since novelty emerges against the context of redundancy, and redundancy itself generates novelty, a test of this approach is whether it can predict the conditions for the production of novelty. In other words, it has to be able to distinguish between the dynamics of an Alberti bass which articulates a harmony for 2 seconds and one which articulates the same harmony for more than a minute with no apparent variation (as is the case with some minimalist music).

  • How do our swift and ethereal thoughts move our heavy and intricately mobile bodies so our actions obey the spirit of our conscious, speaking Self?
  • How do we appreciate mindfulness in one another and share what is in each other’s personal and coherent consciousness when all we may perceive is the touch, the sight, the hearing, the taste and the smell of one another’s bodies and their moving?
These questions echo the concern of Schutz, who in his seminal paper, “Making Music Together”, identifies that music is special because it appears to communicate without reference. Music occupies a privileged position in the information sciences as a phenomenon which makes itself available to multiple descriptions of its function, from notation to the spectral analysis of sound waves. Schutz’s essential insight was that musical meaning-making arose from the tuning-in of one person to another, through the shared experience of time.

This insight has turned out to be prescient. Recent research suggests that there is a shared psychology or “synrhythmia” and “amphoteronomy” involved in musical engagement. How then might this phenomenon be considered meaningful in the absence of reference, when in other symbolically codified contexts (e.g. language) meaning appears to be generated in ways which can be more explicitly explained through understanding the nature of “signs”?

5 Communicative Musicality and Shannon Entropy

Recent studies on communicative musicality have reinforced Schutz’s view that the meaningfulness of musical communication rests on a shared sense of time. The relation between music and time makes manifest many deep philosophical problems: music unfolds in time, but it also expresses time, where time in music does not move at the same speed: music is capable of making time stand still. If time is considered as a dimension of constraint, alongside all the other manifestations that music presents, then the use of time as a constraint on the analysis of music itself can help unpick the dynamics of those other constraints. While one might distinguish clock-time from psychological time, what might be considered psychological time is in reality the many interacting constraints, while clock-time is an imaginary single description which allows us at least to carve up experience in ways that can be codified.

Over time, music displays many changes and produces much novelty. Old patterns give way to new patterns. At the root of all these patterns are some fundamental features which might be counted in the first instance:
1. Pitch
2. Rhythm
3. Interval
4. Harmony
5. Dynamics
6. Timbre

So what is the pattern of the Alberti bass?

If it’s note pattern is C-G-E-G-C-G-E-G, and its rhythm is q-q-q-q-q-q-q-q and its harmony is a C major triad, and intervallic relations are 5, -3, 3, -5, 5, -3, 3, -5, over a period of time, the entropy for each element will be;
1. Pitch: -0.232
2. Rhythm: 1
3. Interval: 0.2342
4. Harmony: 1
5. Dynamics:
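
These figures are easy to verify with a few lines of code, applying the standard Shannon formula to the patterns listed above:

```python
from collections import Counter
from math import log2

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

notes = ["C", "G", "E", "G", "C", "G", "E", "G"]
rhythm = ["q"] * 8
intervals = [5, -3, 3, -5, 5, -3, 3, -5]
harmony = ["C major"] * 8

print(entropy(notes))      # 1.5 bits (C and E are rarer than G)
print(entropy(rhythm))     # 0.0 bits (no variety at all)
print(entropy(intervals))  # 2.0 bits (four equally likely intervals)
print(entropy(harmony))    # 0.0 bits (one unchanging chord)
```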


The important thing here is that novelty becomes necessary if there is insufficient variation between interacting constraints. If mutual redundancy is high, then it also means that there is very high mutual information. The phrasing of an Alberti bass is an important feature of its performance. For musicians it is often considered that the Alberti pattern should be played as a melody.

Consider the opening of Mozart’s piano sonata:

The playing of the C in the right hand creates an entropy of sound. Each sound decays, but the decay of C above the decay of the Alberti bass means that the context of the patterns of the other constraints changes. This means that the first bar can be considered as a combination of constraints.

The surprise comes at the beginning of the second bar. Here the pattern of the constraint of the melody changes because a different harmony is articulated, together with a sharp interval drop (a 6th), an accent, followed by a rhythmic innovation in the ornamental turn. Therefore the characteristic of the surprise is a change in entropy in multiple dimensions at once.
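
One way to formalise "a change in entropy in multiple dimensions at once" is per-event surprisal: how improbable the incoming event is in each dimension, given the counts so far. The encoding below is a toy stand-in for the Mozart opening, invented for illustration (the smoothing scheme is equally crude):

```python
from collections import Counter
from math import log2

def surprisal(history, event):
    """-log2 of the (add-one smoothed) probability of `event` given
    the counts in `history`; unseen events get a large finite value."""
    counts = Counter(history)
    total = sum(counts.values()) + len(counts) + 1
    return -log2((counts[event] + 1) / total)

# Invented parallel descriptions of bar 1 into bar 2, not a transcription:
melody = ["C5", "C5", "E5", "E5", "G5", "G5", "A4"]  # drop at the end
harmony = ["I", "I", "I", "I", "I", "I", "V"]
rhythm = ["held", "held", "held", "held", "held", "held", "turn"]

t = 6  # the downbeat of "bar 2" in this toy encoding
scores = [surprisal(s[:t], s[t]) for s in (melody, harmony, rhythm)]
print(scores)  # high values in several dimensions at once: the felt surprise
```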

It is not enough to surprise somebody by simply saying “boo!”. One says “boo!” with a raised voice, a change of tone, a body gesture, and so on. The stillness which precedes this is a manifestation of entropies which are aligned.

The intellectual challenge is to work out a generative principle for a surprise.

6 A generative principle of surprise and mental process

The differentiation between different dimensions of music is an indication of the perceptual capacity expressed through relations between listeners and performers. The articulation and play between different dimensions creates the conditions for communicating conscious process. If entropies are all common across a core set of variables, then there is effectively collapse of the differentiating principles. All becomes one. But the collapse of all into one is a precursor to the establishing of something new.
In music’s unfolding process a system of immediate perception works hand-in-hand with a meta-system of expectations (an anticipatory system). This can be expressed in terms of a system’s relation to its meta-system. Because the metasystem coordinates the system, circumstances can arise where the metasystem becomes fused to the system, so that differences identified by the system become equivalent to differences identified by the metasystem. In such a case, the separation between system and metasystem breaks down. The cadence is the most telling example of this. In a cadence, many aspects of constraint work together, all tending towards a silence: harmony, melody, dynamics, rhythm, and so forth.

Bach’s music provides a good example of the ways in which a simple harmonic rhythm becomes embellished with levels of counterpoint, which are, in effect, ways of increasing entropy in different dimensions. The melody of a Bach chorale has a particular melodic entropy, while the harmony and the bass imply a different entropy. Played “straight”, the music as a sequence of chords isn’t very interesting: the regular pulse of chords creates an expectation of harmonic progress towards the tonal centre, where all the entropies come together at the cadence:

Bach is the master of adding “interest” to this music, whereby he increases entropy in each voice by making their independent rhythms more varied, with patterns established and shared from one voice to another:

By doing this, there is never a complete “break-down” of the system: there is some latent counterpoint between different entropies in the music, meaning that even at an intermediate cadence, the next phrase is necessary to continue to unfold the logic of the conflicting overlapping redundancies.

This is the difference between Baroque music and the music of the classical period. In the classical period, short gestures simply collapse into nothing, for the system to reconstruct itself in a different way. In the Baroque, total collapse is delayed until the final cadence. The breaking-down of the system creates the conditions for the reconstruction of stratified layers of distinction-making through a process of autocatalysis. Novelty emerges from this process of restratifying a destratified system. The destratified system exhibits a cyclic process whereby different elements create constraints which reinforce each other, much like the dynamics of a hypercycle described in chemistry, or what is described in ecology as autocatalysis.

Novelty arises as a symptom of the autocatalyzing relation between different constraints, where the amplification of pattern identifies new differences which can feed into the restratification of constructs in different levels of the system.

This can be represented cybernetically as a series of transductions at different levels. Shannon entropy of fundamental features can be used as a means of being able to generate higher level features as manifestations of convergence of lower-level features, and consequently the identification of new pattern.

It is not that music resonates with circadian rhythms. Music articulates an evolutionary dynamics which is itself encoded into the evolutionary biohistory of the cell.

7 Musical Novelty and the Problem of Ergodicity

In his introduction to the concept of entropy in information, Shannon discusses the role of transduction between senders and receivers in the communication of a message. A transducer, according to Shannon, encodes a message into signals which are transmitted in a way such that they can be successfully received across a communication medium which may degrade or disrupt the transmission. In order to overcome signal degradation, the sending transducer must add extra bits to the communication – or redundancy – such that the receiver has sufficient leeway in being able to correctly identify the message.
Music differs from this Shannon-type communication because the redundancies of its elements are central to the content of music’s communication: music consists primarily of pattern. Moreover, music’s redundancy is apparent both in its diachronic aspects (for example, the regularity of a rhythm or pulse), and in its synchronic aspects (for example, the overtones of a particular sound). Music’s unfolding is an unfolding between its synchronic and diachronic redundancy. Shannon entropy is a measure of the capacity of a system to generate variety. Shannon’s equations and his communication diagram re-represent the fundamental cybernetic principle of Ashby’s Law of Requisite Variety, which states that in order for one system to control another, it must have at least equal variety. In order for sender and receiver to communicate, the variety of the sender in transmitting the different symbols of the message must be at least equal to the variety of the receiver which reassembles those symbols into the information content. The two transducers, sender and receiver, communicate because they maintain stable distinctions between each other: information has been transferred from sender to receiver when the receiver is able to predict the message. Shannon describes this emergent predictive capacity in terms of the ‘memory’ within the transducers. It is rather like the ability to recognise a tune on hearing the first couple of notes.

In using Shannon to understand the diachronic aspects of music, we must address one of the weaknesses of Shannon’s equation in that it demands an index of elements which comprise a message (for example, letters in the alphabet). In cases where the index of elements is stable, then the calculation of entropy over a long stretch of communication will approximate to entropy over shorter stretches since the basic ordering of the elements will be similar. This property is called ergodicity and applies to those cases where the elements of a communication can be clearly identified. Music, however, is not ergodic in its unfolding: its elements which might be indexed in Shannon’s equations are emergent. Music’s unfolding reveals new motifs, themes, moods and dynamics. None of these can be known at the outset of a piece of music.

In order to address this, some basic reconsideration of the relationship between order and disorder in music is required. In Shannon, order is determined by the distribution of symbols within constraints: Shannon information gives an indication of the constraint operating on the distribution. The letters in a language, for example, are constrained by the rules of grammar and spelling. If passages of text are ergodic in their order, it is because the constraint operating on them is constant. Non-ergodic communication like music results from emergent constraints. This means that music generates its own constraint.

In Shannon’s theory, the measure of constraint is redundancy. So to say that music is largely redundant means that music participates in the continual generation of its own constraints. Our technical and empirical question is whether there is a way of characterising the emergent generation of constraint within Shannon’s equations.

To address the problem of emergent features, we need to address the problem of symbol-grounding. Stated most simply, the symbol grounding problem concerns the emergence of complex language from simple origins. Complex symbols and features can only emerge as a result of interactions between simple components. A further complication is that complex structures are neither arbitrary nor are they perfectly symmetrical. Indeed, if the complex structures of music, like the complex structures of nature are to be described in terms of their emergence from simple origins, they are characterized by a “broken symmetry”. In this sense, musical emergence is in line with recent thinking in physics considering the symmetry of the universe, whereby very small fluctuations cause the formation of discrete patterns.

Symmetry-breaking is a principle by which an analysis involving Shannon entropy of simple components may be used to generate more complex structures. In the next section we explain how this principle can be operationalized into a coherent information-theoretical approach to music using Shannon’s idea of Relative Entropy.

8 Relative Entropy and Symmetry-Breaking

As we have stated, music exhibits many dimensions of constraint which overlap. Each constraint dimension interferes with every other. So what happens if the entropies of two dimensions of music change in parallel? This situation can be compared to the joint pairing of meta-system and system which was discussed earlier.
If the redundancies of the many different descriptions of sound (synchronic and diachronic) constrain each other, then the effect of the redundancy of one kind of description in music on another would produce non-equilibrium dynamics that could be constitutive of new forms of pattern, new redundancy and emergent constraint. This effect of one aspect of redundancy on another can be measured with Shannon’s concept of relative entropy.
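
For reference, the standard formulations (not specific to our argument): Shannon’s own “relative entropy” is the ratio of observed entropy to maximum entropy, from which redundancy is defined; in the modern literature the term usually names the Kullback-Leibler divergence, which measures the departure of one distribution from another:

```latex
R = 1 - \frac{H}{H_{\max}}, \qquad
D(p \,\|\, q) = \sum_i p_i \log_2 \frac{p_i}{q_i}
```

D(p‖q) is zero when the two distributions coincide and grows as one departs from the expectations encoded in the other. Read musically: a near-zero value between the distributions of two features marks the fused, cadence-like state described above.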

The effect of multiple constraints operating together in this way is similar to the dynamics of autocatalysis: where effectively one constraint amplifies the conditions for the other constraint. What is to stop this process leading to positive feedback? The new constraint, or new dynamic within the whole system is necessary to maintain the stability of the other constraints.

So what happens in the case of the Alberti bass? The relative entropy of the notes played, the rhythm, intervals, and so on would lead to collapse of distinctions between the components: a new component is necessary to attenuate the dynamic.

This helps explain the changes to the relative entropy in Mozart’s sonata. But does it help to explain minimalism? In the music of Glass, there is variation of timbre, and the gradual off-setting of the accompanimental patterns also helps to give the piece structure.

However, whilst the explanation for relative entropy helps to explain why novelty – new ideas, motifs, harmonies, etc. are necessary, it does not help to explain the forms that they take.
In order to understand this, we have to understand that the symmetry-breaking principle works hand-in-hand with the autocatalytic effects of the overlap of redundancy. Novelty is never arbitrary; it is always in “harmony” with what is already there. What this means is that novelty often takes the form of a change to one dimension, whilst other dimensions remain similar in constitution.
Harmony is another way of describing a redundant description. The melody of Mozart’s piano sonata is redundant in the sense that it reinforces the harmony of the accompaniment. The listener is presented with an alternative description of something that had gone before. The redundancy is both synchronic and diachronic.

From a mathematical perspective, what occurs is a shift in entropies over time between different parts.


The coincidence of entropy relations between parts over a period of time is an indication of the emergence of novelty. Symmetry breaking occurs because the essential way out of the hypercycle is the focus on some small difference from which some new structure might emerge. This might be because the combined entropies of multiple levels throw into relief other aspects of variation in the texture.

This would explain the symmetry-breaking of the addition of new voices, the subtle organic changes in harmony and so forth. However, it does not account for the dialectical features of much classical music. What must be noted, though, is that the common pattern before any dialectical moment in music is punctuation: a silence or a cadence. For example, in the Mozart example, the first four-bar phrase is followed by a more ornamental passage where, whilst the harmony remains the same, a semiquaver rhythm takes over in the right hand, in the process introducing more entropy into the diachronic structure.

Once again, within the dialectical features of music, it is not the case that musical moments are arbitrary. The dialectical moments which are chosen seem continually to surprise and yet relate to previous elements. Both in the process of composing music and in the process of listening to it, what is striking is that the shared experience between composer, performer and listener is of a shared profundity. Where does this profundity come from?

The concept of relative entropy is fundamental to all Shannon’s equations. The basic measure of entropy is relative to a notional concept of maximum entropy. This is explicit in Shannon’s equation for redundancy, where in order to specify the constraint producing a particular value of entropy, the scalar value of entropy has to be turned into a vector representing the ratio between entropy and maximum entropy. The measurement of mutual information, which is the index of how much information has been transferred, is a measurement of the entropy of the sender relative to the entropy of the receiver. The problem is that nobody has a clear idea of what maximum entropy actually is, beyond a general statement (borrowed from Ashby) that it is the measurement of the maximum number of states in a system. In a non-ergodic emergent communication, there is no possibility of being able to establish maximum entropy. Relative entropy necessarily involves studying the change of entropy in one dimension relative to the change of entropy in another. One section of music over a particular time period might exhibit a set of descriptions or variables, each of which can be assigned an entropy (for example, the rhythm or the distribution of notes played). At a different point in time, the relationship between the entropies of the same variables will be different. If we assume that the cause of this difference is the emergent effect of one constraint on another, then it would be possible to plot the shifts in entropy from one point to the next and study their relative effects, producing a graph.
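
A sketch of the kind of graph described here, with invented feature streams standing in for a real transcription (matplotlib is assumed for the plotting):

```python
from collections import Counter
from math import log2
import matplotlib.pyplot as plt

def entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def window_entropies(seq, size=8, step=4):
    return [entropy(seq[i:i + size])
            for i in range(0, len(seq) - size + 1, step)]

# Invented streams standing in for two descriptions of the same passage;
# both change character midway, but not in the same way.
notes = list("CGEGCGEG" * 3 + "CFAFCFAF" * 3)
rhythm = ["q"] * 24 + ["s", "s", "q", "q"] * 6

for name, seq in [("notes", notes), ("rhythm", rhythm)]:
    plt.plot(window_entropies(seq), label=name)
plt.xlabel("window")
plt.ylabel("entropy (bits)")
plt.legend()
plt.show()
```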

Such an approach is revealing because it highlights where entropies move together and where they move apart. What it doesn’t do is address the non-ergodicity problem of where new categories of description – with new entropies – are produced, how to identify them, and how to think about the effect that new categories of redundancy have on existing categories. Does the emergence of a new theme override its origins? Does the articulation of a deeper structure override the articulation of the parts which constitute it? However, the analysis can progress from relative entropy to address these questions if we begin by looking at the patterns of relative entropy from an initial set of variables (say, notes, dynamics, rhythm, harmony). If new dimensions are seen to be emergent from particular patterns of inter-relation between these initial dimensions, then a mechanism for the identification of new dimensions can be articulated. In effect, this is a second-order analysis of first-order changes in relative entropy.

A second point is to say that first-order, second-order and n-order dimensions of constraint will each interfere with each other over time. The experience of listening to music is a process of shifts between different orders of description. In Schenker’s music analytical graphs, this basic idea is represented as different layers of description from deep structure, middleground and foreground. A Shannon equivalent is relative entropy, relative entropy of relative entropy, relative entropy of relative entropy of relative entropy... and so on. In the analysis of music (for example, Herbert Brun) and in methods for the construction of musical composition (Xenakis), Shannon entropy has fascinated music scholars. Here we offer a new perspective drawing on the application of Shannon entropy to the analysis of scientific discourse. In this work, we have explored how mutual redundancy between different dimensions of communication in a discourse can be used as an index of meaning. In applying this approach to music, we consider the ways in which emergent mutual redundancy and relative entropy between different descriptions of music as it unfolds can identify similar patterns of meaning making in musical communication.

9 Common origins expressed through time

One of the problems of Shannon entropy, and the transduction which it essentially expresses, is that, as an aspect of Ashby’s law, it is fundamentally conservative: the Shannon communicative model manages variety, in the same way that the different levels of regulation within Beer’s viable system model work together to articulate the coherence of the whole. What Shannon entropy does not do is articulate a dynamic whereby new things are created. Yet in order to account for music’s continual evolution, and also to account for the shared recognition of profundity, we need an account of the generation of profundity from some common, and simple, origin.

Beer suggested a way of doing this in his later work, “Platform for Change”. Here the process of emergence is fundamentally about the management of uncertainty.
Uncertainty is an important metaphor for understanding music. After all, for any composer, the question is always “what should I do next?” The choices that are faced by a composer are innumerable. In biology, there is evidence that cellular organization is also coordinated around uncertainty (Torday).

Uncertainty is an aspect of variety: it is basically the failure of a metasystem to manage the variety of a system, or at least a failure to identify that which needs to be attenuated, or to attend to the appropriate thing. Yet the dialectical truth of music is that identity is always preserved amidst fluctuating degrees of uncertainty. This can be drawn as a diagram whereby whatever category is identified (rhythm, notes, or whatever else is counted within Shannon entropy) contains within it its own uncertainty of effective identification.

The uncertainty of identification must be managed through the identification of a category, and this process – a metaprocess of categorical identification – interferes with other processes. In effect it codifies particular events in particular values. Yet the values which are codified will be challenged or changed at some point.

The coordination of uncertainty produces a generative dynamic which can under certain circumstances produce a symmetry. But the critical question is why the symmetry which is produced is common. An answer may lie in the origin of the cell.

I’ve got a diagram to show this. The fundamental feature of an improved mechanism to explain the emergent dynamics of musical innovation is that there is a dual transduction process: an inner transduction process which maintains balance within the identity of an entity, and an outer transduction process which maintains the relation between the identity and its environment. It should be noted that it is precisely this mechanism that exists in cellular communication through the production of protein receptors on the cell surface and the interaction between cells on the outside.

10 Communicative Musicality and relative entropy

Schutz’s understanding of musical communication as a shared experience of time now requires some reflection. If music is the shared experience of time, then what can an approach through entropy contribute to our understanding of time? The intersubjective experience of music, where a shared object of sound provides a common context for the articulation of personal experience, creates a ground of being for communication. This can facilitate the sharing of words, of the codified aspects of communication which Luhmann talks about. The remarkable thing with music is that the symmetry breaking of one individual is remarkably like the symmetry breaking of another individual. The choices for advanced emergent structures do not appear to be arbitrary. This may well be because of the synchronic structure of the materiality of sound. However, it may go deeper than this. It may be that the essential physical constitution of the cell carries with it some pre-echoes of the physics of sound, which may at some point be united in fundamental physical mechanisms (probably quantum mechanical). There is an empirical need to search for the biological roots of music, and the work in communicative musicality provides a starting point for doing this.

Sunday, 15 July 2018

Uncertainty in Counting and Symmetry-Breaking in an Evolutionary Process

Keynes, in his seminal "Treatise on Probability" of 1921 (little known to today's statisticians, who really ought to read it), identified a principle which he called "negative analogy": the principle by which some new difference is codified and confirmed by repeated experimental sampling.


"The object of increasing the number of instances arises out of the fact that we are nearly always aware of some difference between the instances, and that even where the known difference is insignificant we may suspect, especially when our knowledge of the instances is very incomplete, that there may be more. Every new instance may diminish the unessential resemblances between the instances and by introducing a new difference increase the Negative Analogy. For this reason, and for this reason only, new instances are valuable." (p. 233)
This principle should be compared to Ashby's approach to a "cybernetic science": the cybernetician "observes what might have happened but did not". The cybernetician can only do this by examining many descriptions of a thing, observing the "unessential resemblances" and introducing "a new difference". What both Keynes and Ashby are saying is that the observation of likeness is essentially uncertain.

The issue is central to Shannon information theory. Information theory counts likenesses. It determines the surprisingness of events because it treats each event as an element in an alphabet. It can then calculate the probability of that event and thus establish some metric of "average surprisingness" in a sequence of events. Although in the light of Keynes's thoughts on probability this seems naïve, Shannon's equation has been extremely useful - we owe the internet to it - so one shouldn't throw the baby out with the bathwater.

But the Shannon index, which identifies the elements of the alphabet, is actually a means by which uncertainty is managed in the process of calculating the "surprisingness" of a message. This can be shown in the diagram below, derived from Beer's diagrams in Platform for Change:


The beauty of this diagram is that it makes it explicit that the Shannon index is a "creative production" of the process of uncertainty management. It is a codification or categorisation. That means that essentially, it only has meaning because it is social. That means in turn that we have to consider an environment of other people categorising events, and for the environment to produce many examples of messages which might be analysed. Two people will differ in the ways they categorise their events, which means that the uncertainty dynamic in counting elements is fluid, not fixed:

There is a tipping-point in the identification of indexes where some current scheme for identifying differences is called into question, and a new scheme comes into being. New schemes are not arbitrary, however. Some difference in the examples that are provided gradually gets identified (as Keynes says: "Every new instance may diminish the unessential resemblances between the instances and by introducing a new difference increase the Negative Analogy") but the way this happens is somehow coherent and consistent with what has gone before. 

I wonder if this suggests that there is an underlying principle of evolutionary logic by which the most fundamental principles of difference from the very earliest beginnings are encoded in emergent higher-level differences further on in history. A new difference which is identified by "negative analogy" is not really "new", but an echo of something more primitive or fundamental. Shannon, of course, only treats the surface. But actually, what we might need is a historicisation of information theory.

Let's say, for the sake of argument, that the foundational environment is a de Broglie-Bohm "pilot wave": a simple oscillation. From differences in the codification of a simple oscillation, higher-level features might be codified, which might then give way to the identification of new features by drawing back down to fundamental origins. The symmetry-breaking of this process is tied to the originating principle - which could be a pilot wave, or some other fundamental property of nature.



So what might this mean for Shannon information? When the relative entropy between different features approaches zero, the distinction between the features collapses. This may be a sign that some new identification of a feature is about to take place: it is a collapse into the originating state.
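
What "relative entropy approaching zero" might look like in practice can be sketched with the Kullback-Leibler divergence, using invented distributions for the features:

    from math import log2

    def relative_entropy(p, q):
        """Kullback-Leibler divergence D(p||q), in bits. Assumes p and q
        are distributions over the same outcomes, with q[i] > 0
        wherever p[i] > 0."""
        return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    feature_a = [0.7, 0.2, 0.1]    # invented distribution for one feature
    feature_b = [0.2, 0.5, 0.3]    # a clearly different feature
    feature_c = [0.69, 0.21, 0.1]  # nearly indistinguishable from feature_a

    print(relative_entropy(feature_a, feature_b))  # ~0.84 bits: distinct
    print(relative_entropy(feature_a, feature_c))  # ~0.0005 bits: collapsing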

Each level of this diagram might be redrawn as a graph of the shifting entropies of the features at that level. One level up, a diagram can plot the entropy of those shifting entropies; a further level can plot the entropy of the relations between those entropies, and so on.
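
A minimal sketch of this recursion, under the (arbitrary) assumptions of a fixed window size and of rounding the entropy values so that they become countable:

    from collections import Counter
    from math import log2

    def shannon_entropy(message):
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * log2(n / total)
                    for n in counts.values())

    def windowed_entropies(seq, window=4):
        """Entropy of each successive (non-overlapping) window."""
        return [shannon_entropy(seq[i:i + window])
                for i in range(0, len(seq) - window + 1, window)]

    def next_level(entropies, decimals=1):
        """Round the entropy values so that they form a new, countable
        alphabet, whose own entropy can then be measured."""
        return shannon_entropy([round(h, decimals) for h in entropies])

    signal = "AAABABABCBCABCCCAAABAAAB"    # an invented event stream
    level_1 = windowed_entropies(signal)  # entropies of local features
    level_2 = next_level(level_1)         # entropy of the shifting entropies
    print(level_1)
    print(level_2)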

We may not be able to see exactly how the negative analogy is drawn. But we might be able to see the effects of it having been drawn in the evolutionary development of countable features. Surprise has an evolutionary hierarchy. 

Thursday, 5 July 2018

Seven Problems of Pointing to the Future of Education (with our hands tied behind our back) and Seven Suggestions for Addressing Them

The theme of "research connectedness" is common in today's universities. It's a well-intentioned attempt to say "University isn't school". Unfortunately, due to a combination of factors including marketisation, modularisation and learning outcomes, university has become increasingly like school in recent years. "Research connectedness" attempts to remedy the trend by introducing more inquiry-based learning and personalised curricula. All of this stuff is good, and it's something which I have been involved with for a long time. It's also at the heart of what I'm doing at the Far Eastern Federal University in Russia on "Global Scientific Dialogue". But I can't help thinking that we're still missing the point with all these new initiatives. There are (at least) seven problems:

Problem 1: Universities see the curriculum as their product. However, the product of learning is the student's understanding, which is arrived at through conversation. The university sells courses and certificates; it does not sell "the potential for arriving at an understanding".

Problem 2: Learning outcomes do not measure student understanding of a subject. They establish the validity of a student's claim to meet a set of criteria written by a teacher (or a module author). What they really measure is the student's understanding of the assessment process.

Problem 3: Learning outcomes codify the expectations of teachers with regard to the way that student performance will be assessed in a subject. By definition, they demand that the teacher knows what they are doing. In research, it is often the case that nobody quite knows what they are doing (Einstein: "If we knew what we were doing, we wouldn't call it research!")

Problem 4: Modules are aggregates of learning outcomes; that is, sets of expectations. Students study many modules, and have to negotiate many different sets of expectations from many different teachers. There is no space for considering how different teachers' understandings and expectations differ, whether they cohere, or how incoherence might lead to fundamental problems in the student's understanding.

Problem 5: Inevitably, the only way to cope is to behave strategically. "How do I pass?" becomes more important than "What do I understand?". Suppressing "What do I understand?" in some cases may lead to mental breakdown.

Problem 6: From the perspective of strategic learners, an inquiry-based module appears to be the worst of all possible worlds: "basically, they don't teach you anything, and you have to find your own way". Since even inquiry-based modules have learning outcomes, a strategic approach may work, but it results in very unsatisfying learning experiences ("I didn't learn anything!")

Problem 7: Students are customers in an education market. Whilst learning outcomes codify teacher expectations, learner expectations are increasingly codified by the financial transaction they have with the university, and "student satisfaction" becomes a weapon that can be used against teachers to force them to align their expectations with the learners'.

What can we do?  Here are seven suggestions:

1. The product of the university must be student understanding. Certificates, modules and timetables are epiphenomena.

2. Understanding is produced by conversation. The fundamental function of the university is to find ways of best coordinating rich conversations between students and staff.

3. The curriculum is an outmoded means of coordinating conversation. It is a rigid, inflexible object in a fast-changing, uncertain world. The means of coordinating conversation needs to become a much more flexible technology (indeed, this is where investment in technology should be placed, not in VLEs or e-Portfolio, which merely uphold the ailing curriculum).

4. Traditional assessment relies on experts, which necessitates hierarchy within the institution. This hierarchy can be transformed into a "heterarchy" - a flat structure of peer-based coordination. Technologies like Adaptive Comparative Judgement, machine learning and other tools for collaboration and judgement-making can be of great significance here.

5. Transformation of institutional hierarchy can produce far greater flexibility in the way that learners engage with the institution. The "long transactions" of assessment (i.e. the 14-week period of "I've done my assessment, you give me a mark") can be broken up into tiny chunks, students can genuinely roll on and off courses, and new funding models, including educational subscription and assessment services, can be explored.

6. The university needs to investigate and understand its real environment (it is not a market - it is society!). The environment is uncertain, and the best way to understand uncertainty is through making exploratory interventions in society from which the university can learn how to coordinate itself. Generosity towards society should be a strategic mission: free courses, learning opportunities and community engagement should be pursued not for the purpose of "selling courses", but for the strategic seeking of future viability.

7. To put student understanding at the heart of what the university is, is also to place shared scientific inquiry as the underpinning guide. The scientific discourse is hampered by ancient practices of publication and status which ill-suit an inherently uncertain science. The university should free itself from this, and embrace the rich panoply of technology we have at our disposal for encouraging scientists to communicate their uncertainty about science in an open and dialogic way.