Friday 14 October 2022

The Structure of Entropy

One of the things I've been doing recently in my academic work is examining the ebb and flow of experience as shifts in entropy across different dimensions. It began with a paper with Loet Leydesdorff for Systems Research and Behavioral Science on music (https://onlinelibrary.wiley.com/doi/full/10.1002/sres.2738?af=R), continued with a paper on the entropy of student reflection and personal learning (https://www.tandfonline.com/doi/abs/10.1080/10494820.2020.1799030), and most recently with a paper on the sonic environment for Postdigital Science and Education.

I have been fascinated by the visualisations and entropy graphs of different phenomena, partly because they provide a way of comparing shifts in the entropy of heterogeneous variables on the same scale: the entropy of sound frequencies can be considered alongside the entropy of words, and alongside the entropy of things happening in video. The principal feature of this is that the flow of experience is a counterpoint of different variables, and the fundamental theoretical question I have asked concerns the underlying mechanism which coordinates the dance between entropies.
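
To make that concrete, here is a minimal sketch in Python (not the actual pipeline from any of the papers) of how heterogeneous variables can be put on one scale: discretise each variable, compute Shannon entropy over a sliding window, and normalise by the maximum possible entropy so every variable lands between 0 and 1. The pitch and word-length data are purely hypothetical stand-ins.

```python
import numpy as np

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def windowed_entropy(values, n_bins=16, window=50):
    """Normalised (0..1) entropy of `values` over a sliding window."""
    edges = np.histogram_bin_edges(values, bins=n_bins)
    symbols = np.digitize(values, edges[1:-1])  # exactly n_bins symbols
    max_h = np.log2(n_bins)  # entropy if all bins were equally likely
    return np.array([shannon_entropy(symbols[i:i + window]) / max_h
                     for i in range(len(symbols) - window)])

# Hypothetical example: a pitch track and a word-length stream, both
# reduced to the same normalised entropy scale so their "dance" can be
# compared directly.
pitch = np.random.normal(440, 30, 1000)    # stand-in for dominant pitch
word_lengths = np.random.poisson(5, 1000)  # stand-in for a text variable
h_pitch = windowed_entropy(pitch)
h_words = windowed_entropy(word_lengths)
```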

Another way of talking about this dance is to say that entropy has a "structure". Loet Leydesdorff commented on this in conversation at the weekend after I shared some recent analysis of music with him (see below). Interestingly, to talk of the structure of entropy is to invite a recursion: there must be an entropy of structured entropy. Indeed, Shannon's equation is surprisingly flexible in being able to shed light on a vast range of problems. 
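
The recursion can be made concrete: take the entropy series itself as data and compute its entropy in turn. This continues the sketch above (it assumes the windowed_entropy() helper and the h_pitch series from the previous snippet):

```python
import numpy as np

# First-order: the windowed entropy of the pitch stream (from the sketch above).
h_series = h_pitch
# Second-order: discretise that entropy series and take its entropy in turn -
# the "entropy of structured entropy".
edges = np.histogram_bin_edges(h_series, bins=16)
symbols = np.digitize(h_series, edges[1:-1])
_, counts = np.unique(symbols, return_counts=True)
p = counts / counts.sum()
h_of_h = -np.sum(p * np.log2(p))
print(f"entropy of the structured entropy: {h_of_h:.2f} bits")
```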

To understand why this might be important, we have to think about what happens in the flow of experience. I think one of the most important things that happens (again, I owe this to Loet) is that we anticipate things: we build models of the world so that we have some idea of what is going to happen next. These anticipatory models work with multiple descriptions of the world - there is "mutual redundancy" between the different variables which represent our experience - and I think Loet is right that this mutual redundancy produces an interference pattern which is a kind of fractal. It makes sense to think that anything anticipatory is fractal: in order to anticipate, we must be able to identify a pattern from past experience and map it onto possible future experience at a different scale. There is further circumstantial evidence in machine learning: convolutional neural networks work in essentially this way, extracting patterns at multiple scales from past examples and projecting them onto new input.
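
For two variables, the overlap between descriptions is just the mutual information, I(X;Y) = H(X) + H(Y) - H(X,Y); Leydesdorff's mutual redundancy generalises this to three or more dimensions, where the signed term can flip and indicate redundancy rather than synergy. Here is a rough sketch of the two-variable case, with hypothetical data throughout:

```python
import numpy as np

def entropy_of(symbols):
    """Shannon entropy (bits) over rows of a symbol array."""
    _, counts = np.unique(symbols, return_counts=True, axis=0)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Two correlated descriptions of the "same" experience (hypothetical).
x = np.random.randint(0, 8, 1000)             # e.g. binned pitch
y = (x + np.random.randint(0, 3, 1000)) % 8   # a correlated second description
h_x, h_y = entropy_of(x), entropy_of(y)
h_xy = entropy_of(np.column_stack([x, y]))    # joint entropy H(X,Y)
print(f"I(X;Y) = {h_x + h_y - h_xy:.2f} bits of shared description")
```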

Fractals are self-segmenting: the distinction between patterns at different orders of scale emerges from the self-referential dynamics which produce them. At certain regular points, the interference between different variables produces "nothing" - a gap in the pattern which demarcates it. In the paper on music, I suggested that this production of nothing is related to the production of silence, and to the way music plays with redundancies (another way of producing nothing) so as to eventually construct the anticipation that a piece is going to end.
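
An illustrative toy (my own, not from the paper) is the Cantor construction: a purely self-referential rule which produces "nothing" - gaps that demarcate the pattern at every order of scale:

```python
# Repeatedly deleting the middle third of each interval: the gaps are not
# imposed from outside but produced by the rule's own self-reference.
def cantor(intervals, depth):
    if depth == 0:
        return intervals
    next_level = []
    for a, b in intervals:
        third = (b - a) / 3
        next_level += [(a, a + third), (b - third, b)]  # middle third becomes a gap
    return cantor(next_level, depth - 1)

# Print the first few levels of the pattern at a resolution of 81 steps.
for level in range(4):
    segments = cantor([(0.0, 1.0)], level)
    print("".join("#" if any(a <= i / 81 < b for a, b in segments) else " "
                  for i in range(81)))
```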

I made this video last week about a Haydn piano sonata as a way of explaining my thinking to Loet:


The entropy graph I displayed here uses a Fast Fourier Transform to analyse the frequency content of the sound, extracting the dominant pitch, the richness of the texture and the volume, and then calculates the entropy of each of those variables over time. The graph illustrates the "structure of entropy" - and of course, eventually everything stops.
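
For those curious, here is a simplified sketch of the kind of analysis behind the graph (not the exact script I used): per audio frame, an FFT yields the dominant pitch, a crude richness measure (how many frequency bins carry significant energy - a stand-in for whatever richness measure one prefers), and the volume; the entropy of each stream can then be computed as in the earlier sketch.

```python
import numpy as np

def frame_features(frame, sr):
    """Dominant pitch (Hz), crude richness, and RMS volume of one frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1 / sr)
    pitch = freqs[np.argmax(spectrum)]                  # dominant frequency
    richness = np.sum(spectrum > 0.1 * spectrum.max())  # bins above a threshold
    volume = np.sqrt(np.mean(frame ** 2))               # RMS level
    return pitch, richness, volume

# Hypothetical input: one second of a 440 Hz tone at 22.05 kHz.
sr = 22050
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 440 * t)
n = len(audio) // 1024 * 1024
frames = audio[:n].reshape(-1, 1024)   # non-overlapping 1024-sample frames
features = np.array([frame_features(f, sr) for f in frames])
print(features[:3])                    # rows of (pitch, richness, volume)
```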

I think learning and curiosity are like this too. Learning too is full of redundancy, and the entropy of learning has a similar kind of dance to music. Indeed, sound is one of the key variables in learning (this is what my recent PDSE paper is about). But it's not just sound. Light is also critical - it is striking that our computer screens basically produce patterns of light, and yet there is so little research on light's impact on learning. The entropy of light and the entropy of sound can be related in exactly the same way that I explore the entropy of frequency in this video.
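
The same move applies to light: treat each video frame as a grid of brightness values and compute the entropy of its distribution exactly as for sound. A sketch with synthetic frames standing in for decoded video:

```python
import numpy as np

def frame_entropy(frame, n_bins=32):
    """Shannon entropy (bits) of a frame's brightness distribution."""
    counts, _ = np.histogram(frame, bins=n_bins, range=(0, 255))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical frames: flat grey (low entropy) vs. noise (high entropy).
flat = np.full((120, 160), 128)
noisy = np.random.randint(0, 256, (120, 160))
print(frame_entropy(flat), frame_entropy(noisy))  # ~0 bits vs. ~5 bits
```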

As to what structures the dance of entropy, I think we have to look to our physiology. It is as if there is a deeper dance going on between our physiology and our interactions with our environment. What drives that? It's probably deep in our cells - in our evolutionary history - but something drives us to shape entropies in the way we do. 
