I've been experimenting a bit more... The interesting thing here is the role of harmony. Now, surely one might think that harmony is a "multiple" line because it has multiple voices. But it can be seen as merely a new aspect of a single sonority.
This is half-composed - a bit too busy in places, but interesting what can be done...
One of the features of music composition which has fascinated me recently is the phenomenon of heterophony: the playing of a single melodic line by many voices which meander around that line, providing different versions of it.
Heterophony generates noise which feeds the line, which in turn generates more noise. The line becomes continuous and self-referential. The line swirls and gains new degrees of freedom, like a knot. In swirling it creates a space of interaction. Totality is in a single note. We move into a structured nothing with the perception of a line or a plane. This may be the essence of being human. And it means that everything that happens is inevitable, because it exists as a possibility within totality. The totality of perception is nothing - we can only hold onto a thread of our part of that totality.
AI may one day be able to do better than this - to offer something more total (although it could never be totality itself). And AI is essentially heterophonic, as I have mentioned before: Improvisation Blog: AI and Heterophony
This improvisation isn't so much a dance, but I think the combination of timbres and gestures is in reality the unfolding of a single line which knots itself and gradually unknots itself, in the process constructing and demarcating time.
I gave this presentation on Wednesday (the day Trump won the election) to Liverpool University's Music Theory group (see http://www.chromatic-harmony.com/theoryclub/).
Present were some of the key intellectual figures who have been important in my journey, not just my thinking about music, but about perception, AI and physics: Peter Rowlands, whose physics has fundamentally changed my outlook on perception, alongside John Torday, whose biology informs a much deeper integration between physics and physiology, which explains what curiosity is, and Bill Miller who has worked with John on cellular consciousness. Also there was Michael Spitzer, whose book "The Musical Human" treads a path into music and evolution to which I am very sympathetic, although perhaps now I would say, "we need to think about the physics!"
This integrates with the AI work that I did, particularly on perception in AI, where I learnt a huge amount from my Liverpool colleague David Wong (who couldn't make the presentation). David and I are developing these ideas further, which has led to a medical diagnostic company, but also to a slew of new thinking about the role of AI in society.
There are so many avenues to explore from this, but one of the most fascinating came from Peter Rowlands, who said that "music and mathematics are fundamentally 'abstract patternings'". I had a conversation with Peter afterwards about whether this is the deep connection between maths and music: it's not that music is mathematical (which is often how we think of it, particularly with composers like Bach), but that mathematics is musical: a mathematical proof is a perceptual journey, in much the same way as I describe music.
Seymour Papert was on to this, I think, when he pointed out that the root of the word "mathematics" is the Greek "mathematikos", which literally means "disposed to learn". I don't think that's a million miles away from "disposed to going on a journey of perception".
The really fascinating thing here is the primacy of statistics in the study of perception - the essence of Gustav Fechner's work. Statistics is an outlier in mathematics because it is rarely presented as logic, but as fact from which calculations are made. Where this "fact" comes from is quite mysterious - how and why does the probability density function arise, with its π and e and square roots? The "central limit theorem" will be the typical answer - but that only goes so far, because among the conditions of the central limit theorem is "finite variance": well, what makes it finite? That may be a question for biology.
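To see why that "finite variance" condition matters, here is a minimal sketch of my own (not from the presentation): averages of a finite-variance distribution settle down as samples grow, while averages of the Cauchy distribution, which has infinite variance, never settle at all - the central limit theorem simply doesn't apply to it.

```python
import numpy as np

rng = np.random.default_rng(0)

def spread_of_means(sampler, n_per_mean, n_means=2000):
    """Std. deviation of many sample means: shrinks like 1/sqrt(n) if variance is finite."""
    means = sampler((n_means, n_per_mean)).mean(axis=1)
    return means.std()

uniform = lambda shape: rng.uniform(-1, 1, shape)   # finite variance
cauchy = lambda shape: rng.standard_cauchy(shape)   # infinite variance

# Finite variance: the spread of the means shrinks as each mean uses more draws.
print(spread_of_means(uniform, 10), spread_of_means(uniform, 1000))

# Infinite variance: averaging does not tame the Cauchy distribution;
# the mean of n Cauchy draws is itself Cauchy-distributed, however large n is.
print(spread_of_means(cauchy, 10), spread_of_means(cauchy, 1000))
```

The first pair of numbers shrinks by roughly a factor of ten; the second pair stays wild no matter how much averaging is done.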
But then, machine learning is statistical. It is all about statistics and recursion. And when we say "we don't know why it works", what we're really saying is "we don't really understand the ontology of the statistics". What I am suggesting in my presentation is that the ontology of the statistics may be even more profound than the ontology of mathematics as we conventionally understand it, or even the ontology of logic. I think this thought has been with me for most of my life.
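A toy illustration of "statistics and recursion" (my own, not from the presentation): Welford's online algorithm folds one observation at a time into a running mean and variance. Gradient-descent learning has the same recursive shape - a running statistical estimate nudged by each new datum.

```python
def welford_update(state, x):
    """One recursive step: fold a new observation x into (count, mean, M2)."""
    count, mean, m2 = state
    count += 1
    delta = x - mean
    mean += delta / count          # nudge the running estimate toward the new datum
    m2 += delta * (x - mean)       # accumulate squared deviations incrementally
    return count, mean, m2

state = (0, 0.0, 0.0)
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    state = welford_update(state, x)

count, mean, m2 = state
print(mean, m2 / count)  # running mean 5.0, population variance 4.0
```

The statistics never exist "all at once"; they are constituted entirely by the recursion - which is perhaps why their ontology is so slippery.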
I'm in Copenhagen for the first time in almost a year. It's nice to see friends, but it's also letting me reflect on what's happened in the intervening time. I was last here from the 27th to the 29th of November. Nothing here has really changed, except that the notorious Niels Bohr Building has been officially opened. I have to say, it's not a building that inspires me in any way... Copenhagen generally has a slightly weird "cold industrial" look about it, although the centre is nice...
I prefer to sit in the local cafe which is much nicer.
It was good to catch up with people in the department, and I went to a fun "improv night" in which a former colleague and friend was performing.
Looking back, I think coming to Copenhagen for a year or so was important for me to do, although I left a well-paid senior management job in Liverpool to do it. But Liverpool was not a nice place. Copenhagen at least allowed space to think about what was happening to education. Although the work was very messy, it may yet be important.
Today I've been teaching Danish teachers about AI. All very interesting, and nice people. My heart, however, is firmly in Manchester, and the extent to which that is the case has really dominated my thoughts while I've been here. Last time I was here I wasn't quite sure, and now I am. What happened in the intervening period was really critical in shaping the person I am now.