Friday, 30 September 2016

E-learning and Scientific Communication

With the technologies we have today, it is possible to communicate on a large scale in a much richer way than has ever been available to us before. Fundamentally, the power of our new media lies in their potential variety of expression, or (more technically) their maximum entropy: the maximum possible surprisingness which can manifest itself through the medium. Text - particularly the text of academic papers - has a much lower maximum entropy.
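To make 'maximum entropy' concrete (this is the standard result from information theory, added here as a gloss rather than part of the original argument): a medium which can distinguish between $N$ possible messages has a maximum entropy of

    $H_{\max} = \log_2 N$ bits,

attained when every message is equally likely. A richer medium distinguishes more states, so its ceiling on surprise is higher; this is the sense in which text sits below video.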

I mention academic papers because I find it strange that academics remain transfixed by the academic paper as the 'gold standard' of intellectual communication. There are important reasons why it ought not to be. Not least among them is the fact that today's science is not the science of certainty and objectivity for which the academic paper was originally conceived by the scientific societies of the 1660s as a means of communication. Today's science is a science of contingency, complexity and uncertainty. Communicating uncertainty through a medium designed to communicate certainty is surely going to lead to problems. And indeed it does... the 'marketisation' of education may be the most devastating manifestation of this epistemological misalignment.

With video, one can express one's uncertainty - which, in a science of uncertainty, is a very important thing to communicate: in the end, the point of communicating science is the coordination of understanding and action. As part of the FIS discussion (http://fis.sciforum.net/fis-discussion-sessions/) about academic publishing, I produced this:

The point of making a video is trying to convey honestly the uncertainties of knowledge and understanding. It is important to use a communication medium which affords this. The academic paper encourages people to hide, posture, and so on. Our educational market encourages people not even to care about 'communicating' but merely to posture, and acquire the status markers of publication. In the FIS discussion, a number of people have expressed pessimism about "human nature" in the sciences - that the ego-driven posturing will always win out. But I can't help wondering if this ego behaviour wasn't the product of the means of communication (the paper) as well as the epistemological model. If scientists used a more revealing technology to communicate, we would see, I think, different kinds of scientific behaviour.

Another important reason for thinking about scientific communication is that scientific communication is what Universities are fundamentally about. In recent years this has been forgotten - even in the most elite institutions. The market-driven focus is now on teaching students, with endless speculation about the 'best' pedagogy (whatever that means - it is all speculation, because nobody can see learning). So we end up in a very confused place. "Teaching" in universities involves preparing people for the labours of scientific communication - which still means academic papers, conference presentations, etc. - even when the science and the epistemology now concern uncertainty and complexity. Educational technologists are enlisted to produce resources that encourage learners to develop themselves in ways which turn them into copies of the 17th-century Enlightenment scientist. This is a bit crazy.

The universities of the 1700s changed fundamentally within the space of 100 years or so (Bacon's "Advancement of Learning" of 1605 castigated Cambridge's curriculum, and by the 1700s, its Aristotelian ways had pretty much disappeared). What changed them? It was the transformed practices in experiment and communication among scientists, from the invisible college to the Royal Society.

Our universities today are in a mess - this is a very bad time in education. University managers think they can determine the future of Universities. But in the end, the future of Universities is always led by scientific communities. When those communities change the way they communicate, everything else in the education system changes alongside. I believe much of what we consider typical of a University today will have disappeared in 100 years, just as the once-unquestionable supremacy of Aristotelian doctrine in the scholastic university was swept away. The abandonment of the academic paper (certainly in its current form) and the adoption of new ways of communicating uncertainty will lead the way in this.

The reason I think this will happen is that our epistemology of uncertainty cannot successfully communicate itself through a low-variety medium. It demands richness, aesthetic power, and emotional connection. The Newtonian, Lockean doctrine of the scientist as dispassionate observer cannot be right; complexity science will eventually disarm it.

There are some simple questions to ask: Do scientists really communicate with one another today? Is citation an adequate indicator of how well we understand each other? Are conferences any better for scientific communication? (I'm sorry, your time is up - you have to stop). If papers and conferences are no good for scientific communication, what actually works? What can we do better?

Probably as a first step, we have to realise that science isn't possible without communicating. 

Sunday, 25 September 2016

Big Data and Bad Management

There's been a lot of stuff in the news recently about the threats posed by Big Data, AI, etc. "Computers will take our jobs!" is the basic worry. Except nobody seems to notice that the only jobs that seem bullet-proof are those of the managers who decide that other people's jobs should be replaced with computers. It is bad management we should worry about, not technology.

No computer is, or will ever be, a "match" for a single human brain: brains and computers are different kinds of things. Confusing brains and computers is an epistemological error - a "mereological fallacy" (the reduction of wholes to parts), a Golem-like mistaken belief in the possibility of 'mimesis'.

Ross Ashby, who studied brains closely for his entire career, was aware that the brain was a highly effective variety-absorbing machine. Its variety reduction is felt in the body: often as intuition, instinct or a 'hunch'.

Computers, by contrast, count. They have to be told what to count and what to ignore. In order to get the computer to count, humans have to attenuate the variety of the world by making distinctions and encoding them in the computer's software. If the computer does its job well, it will produce results which map uncertainties relative to the initial criteria for what can and cannot be counted. Knowledge of these uncertainties can be useful - it can help us predict the weather, or help translate a phrase from one language to another. But it is the hunches and instincts of human beings which attenuate the computer's world in the first place.

Stafford Beer tells the story of Ashby's explanation for accepting without a moment's hesitation the invitation to move to the US and work with Heinz von Foerster in Illinois. Ashby explained to Beer:
Years of research could not attain to certainty in a decision of this kind: the variety of the options had been far too high. The most rational response would be to notice that the brain is a self-organizing computer which might be able to assimilate the variety, and deliver an output in the form of a hunch. He [Ashby] had felt this hunch. He had rationally obeyed it. And had there been no hunch, no sense of an heuristic process to pursue? Ross shrugged: ‘then the most rational procedure would be to toss a coin’
Our biggest threat is bad management, which feeds on bad epistemology. The great difficulty we have at the moment is that our scientific practices of Big Data, AI and so on, are characterised by complexity and uncertainty. Yet we view their outputs as if they were the 'objective' and 'certain' outputs of the classical scientist. Deep down, our brains know better.

Tuesday, 20 September 2016

Status Scarcity and Academic Publishing


Update: 25/9/16: A more complete version of this post is here: https://www.academia.edu/s/b6902593d0/the-status-of-scientific-publication-in-the-information-age?source=link

A published academic paper is a kind of declaration: the board of such-and-such a journal agrees that the ideas expressed in the paper are a worthy contribution to its discussions. It is, in effect, a licence to make a small change to the world. Alongside the licence come other prestige indicators which carry real value for individuals: in today's academia, publications help to secure the position of academics in universities (without them, they can lose their jobs). Beyond publication itself, citations serve as further 'evidence' of the approval of a community. Fame and status as a "thought leader" come from many citations, which in turn bring invitations to give keynotes at conferences, impact of ideas, secondary studies of an author's ideas, and so on. Fundamentally, there is a demarcation between the star individual and the crowd. Publication counts because it is scarce: approval for publication is a declaration of scarcity.

Publication in some journals is more scarce than in others. The lower the probability that a paper will be accepted for publication in a journal, the greater the status associated with that journal. High-ranking journals attract more citations because they are seen to be more authoritative. Journals acquire status by virtue of their editorial processes and the communities they represent. The scarcity declarations made to an author reflect, and serve to enhance, the journal's status.

With scarcity comes economics. Access to work published in high-ranking journals has a greater value than access to work published in less highly ranked journals, or work published for free. Since academic job security is dependent on acceptance by the academy, and since the means of gaining acceptance is to engage with the scholarship in high-ranking journals, publishers can demand a high price for access to published work. This is passed on to students in Universities, and access to intellectual debate is concentrated within Universities whose own status is enhanced by their position as a gateway to high-ranking scholarship.

Moreover, Universities employ academics whom they expect to publish in high-ranking journals. The status of individual academics is enhanced through publication in high-ranking journals; the status of journals is enhanced by their maintenance of the scarcity of publication; and the University declares scarcity in access both to well-published academics and to high-ranking journals. Successful publication increases job security because it reinforces the scarcity declaration made by the institution.

A third layer has recently emerged which reinforces the whole thing. The measurement of status through league tables of universities and, indirectly, of journals has introduced an industry of academic credit-worthiness to which institutions are increasingly coerced into submitting themselves. Not to be listed in the league tables is akin to not being published in high-ranking journals.

In the end, students and governments pay for it all. The money is split between the Universities and the publishers.

The problems inherent in this model can be broken down into a series of 'scarcity declarations':

  • The declaration of scarcity of publication in journals for authors
  • The declaration of scarcity of access to journals by institutions
  • The declaration of scarcity of status of institutions through league tables
  • The declaration of scarcity of intellectual work within the universities

How has this situation evolved over history? How has technology changed it?

Before the Royal Society published its transactions (generally considered to be the first academic journal), publication was not considered something that scientists ought to do. The publication of scientific discoveries was frequently cryptic: an assertion of the priority of the individual, without giving anything away in terms of the specific details of the discovery, which might then be 'stolen' by other scholars. So Galileo's famous anagrams were a way of declaring that "Galileo has made a discovery" without necessarily saying what it was.

The possession of knowledge was the key to enhancing status in the medieval world - so scientists became 'hoarders' of knowledge. It is perhaps rather like some university teachers today who are unwilling to have their lectures videoed, fearing that if their performance in class were captured in a way that could be infinitely replayed and reused, their jobs would be threatened because they would no longer be required to lecture. Equally, many academics today are resistant to blogging because they don't want to 'give their ideas away'. The medieval scholar was much like this.

In an age of printing, knowledge-hoarding became increasingly difficult to defend. Enhancing one's own status within an institution increasingly necessitated reaching out to a larger readership in other institutions. Publication practice gradually took on the form in which we now know it. One of the best examples is the Royal Society's publication of its history (two years after its foundation!). This was subject to considerable and well-documented bureaucratic process and editorial control: the 'history' was a declaration of the institution's own status, and through it the Society sought to preserve its own distinctness.

The Royal Society's practices of peer review marked a change not only in scientific practice and epistemology, but also in the democratisation of intellectual status acquisition. Publication and admittance to the academy were technically available to all. The status of observation and experiment supported the democratic movement: the noteworthiness of the experiment and its results were more important than the status of the individual. Science was the gateway to truth - the uncovering of certainties in nature. We tend to see this epistemological shift occurring alongside the shift in communicative practice. But fundamentally the technologies of communication and the scientific epistemology were probably interconnected - the technology brought about new epistemologies.

This is an interesting perspective when we come to the internet. If we live in what some call an 'information society', is it a surprise that information frames a new scientific epistemology? The contrast between our information world and the world of the Royal Society lies in the certainty that was assumed to lie behind scientific discovery. Uncertainty rather than certainty is the hallmark of modern science - whether in the probabilistic modelling of economics or of patterns in DNA, the analysis of big data, the investigation of quantum fields or the study of ecologies. And information itself is, at least from a mathematical perspective, a measure of uncertainty. So we move from the certainty of the Royal Society and the democratisation of academic publication to the uncertainty of information science - and yet we retain the publication model of the 17th century.
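Shannon's formula makes that last remark precise (the standard equation, quoted as a gloss on 'information is a measure of uncertainty'): for a source whose messages occur with probabilities $p_i$, the information is

    $H = -\sum_i p_i \log_2 p_i$

which is zero when one outcome is certain, and maximal when all outcomes are equally likely. On this measure, information just is uncertainty.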

This publication model is in trouble. Journals struggle to get reviewers, publishers have become over-powerful, education is increasingly unaffordable. Meanwhile Universities have adopted practices which have reduced their running costs, employing cheap adjunct lecturers who can barely afford to eat, whilst increasing their revenues. Consequently the ecology of scholarship is increasingly under threat. It is curious that in a world where knowledge is abundant, universities have maintained their scarcity (evidenced by rapidly rising fees), and publishers - whilst coming under attack for their practices - largely operate with the same models that they did in the 18th century. These are all signs of education in crisis.

There have been attempts to address this crisis. In the early 2000s, the realisation of the technological abundance of knowledge suggested that it might be possible to bypass the institution altogether. Guerrilla tactics to open up closed journals have appeared, with Sci-Hub being the most famous example. New models of peer review have been introduced, along with new models of open-access publishing. But as one part of the status problem is addressed, a different aspect of the same problem opens up: open-access publishing is often little more than the opportunity for an author to buy increased chances of citation.

But the journal paper itself seems outdated. Video is a much more compelling medium for advancing intellectual arguments and engaging with an audience. Why do we not present our ideas in video? On YouTube it is artists rather than academics who have harnessed the power of video for coordinating understanding. An uncertain world requires not the presentation of definite results and proofs, but rather the determination and coordination of the constraints of understanding. In an uncertain world, knowledge and teaching come together. And there are other means of coordinating understanding through online activities.

Sunday, 18 September 2016

Student Rent Strikes - Revisiting the political power of an un-mortgaged society?

Inspecting the looming world of financialised housing in 1959, Aneurin Bevan gave a speech to the Labour Party conference:
I have enough faith in my fellow creatures in Great Britain to believe that when they have got over the delirium of the television, when they realize that their new homes that they have been put into are mortgaged to the hilt, when they realize that the moneylender has been elevated to the highest position in the land, when they realize that the refinements for which they should look are not there, that it is a vulgar society of which no decent person could be proud, when they realize all those things, when the years go by and they see the challenge of modern society not being met by the Tories who can consolidate their political powers only on the basis of national mediocrity, who are unable to exploit the resources of their scientists because they are prevented by the greed of their capitalism from doing so, when they realize that the flower of our youth goes abroad today because they are not being given opportunities of using their skill and their knowledge properly at home, when they realize that all the tides of history are flowing in our direction, that we are not beaten, that we represent the future: then, when we say it and mean it, then we shall lead our people to where they deserve to be led!
One of the most interesting things about the property boom and the mortgage crisis is that few young people can afford to take out a mortgage sufficient to buy a house. Of course, this deprives the money-lenders of the opportunity to control the young and tie them into financial servitude for 25 years or more. Although for the young who believe they deserve the same standard of living as their parents (but can't get it) this may seem terrible, it also provides the young with political power - which they have yet to realise.

The atomised, mortgaged, property-owning individual was (and is) politically disenfranchised not only through the mortgage itself, but also through an impaired ability to organise into a political force. The collapse of heavy industry, and of the unions which were once so powerful, meant that there was no longer a single target against which to strike collectively to hold elites to account.

So heavy industry has gone. Massed labour has gone... to be replaced with mass university education. The student rent strike (see https://www.theguardian.com/education/2016/sep/17/uk-university-students-rent-strike-rising-cost-accommodation) is exactly the same kind of phenomenon as the organisation of mass political power in the past. Rent hurts students on a day-to-day basis. It means they can't eat properly or go out in the evening. I think the rent strike is likely to succeed - in London, it has already started to show results (http://www.independent.co.uk/student/student-life/accommodation/ucl-rent-strike-resolved-student-accommodation-in-london-a7120421.html). Of course, Universities may threaten legal action, etc. But against everybody? I doubt it - there are too many vested interests in the students being there - and Universities without students aren't Universities. The interesting thing is: if the rent strike is successful, what next? What, when students rediscover the power of self-organisation and political action, will follow?

What about a "fees strike"? This is more difficult. Fees are paid through loans taken out by the student, directly to the University. The student never sees the money, and have no power to withhold it. All they can do is leave, which would also mean not getting their qualifications. I'm not entirely sure that mass exodus as a political threat is completely out of the question (who knows - particularly with dwindling prospects for graduates, and the fact that a student who's studied for a year knows that the rest is more of the same), but the question over rent will raise a lot of questions not just about student finance, but social power.

Friday, 9 September 2016

Gordon Pask: "A Discussion of the Cybernetics of Learning Behaviour" (1963)

At #altc this year (which I didn't attend) there was a keynote given by Lia Commissar (@misscommissar) about the brain and learning. By coincidence I stumbled across a volume in Stafford Beer's archive at Liverpool John Moores library, edited by Norbert Wiener, on "Nerve, Brain and Memory Models", from 1963. It followed a Symposium on Cybernetics of the Nervous System at the Royal Dutch Academy of Sciences in April 1962. There is a stellar list of contributors:
W. R. Ashby, V. Braitenberg, J. Clark, J. D. Cowan, H. Frank, F. H. George, E. Huant, P. L. Latour, P. Mueller, A. V. Napalkov, P. Nayrac, A. Nigro, G. Pask, N. Rashevsky, J. L. Sauvan, J. P. Shade, N. Stanoulov, M. Ten Hoopen, A. A. Verveen, H. Von Foerster, C. C. Walker, O. D. Wells, N. Wiener, J. Zeman, G. W. Zopf
There is a long paper by Gordon Pask called "A discussion of the cybernetics of learning behaviour" which I thought would be relevant to the current vogue for everything 'neuro' in education. There are many other things there too, including a fascinating paper by Ashby and Von Foerster on "The essential instability of systems with threshold, and some possible applications to psychiatry". There is also a record of the conversation with Wiener afterwards. 

I've quoted the opening of Pask's paper below because it is an excellent summary of the neuroscience of the time. It was surprisingly advanced; in many ways today's emphasis on MRI scanning technologies has meant that the field has become somewhat homogenised. One of the reasons I'm interested is that the models of the brain taken up by Stafford Beer in his Viable System Model very much belong to this period: what effect would a more up-to-date understanding of the brain have had on his thinking?

But Pask's contribution on Learning Behaviour is also interesting because it presents a very early (and rather formal) version of what became conversation theory. He relies quite heavily on Robert Rosen's work ("Representation of biological systems from the standpoint of the theory of categories" (1958) - Bulletin of Mathematical Biophysics; "A logical paradox implicit in the notion of a self-reproducing automaton" (1959), same journal). His championing of Ashby's approach to the brain is, I think, very important.

From "A discussion of the cybernetics of Learning Behaviour" - Gordon Pask, 1962

1.2 The approach of cybernetics

Some cybernetic models are derived from a psychological root, for example, Rosenblatt's (1961) perceptron and George's (1961) automata stem largely from Hebb's (1949) theory. Others, such as Grey Walter's (1953) and Angyan's (1958) respective tortoises, have a broader behavioural antecedent.

On the other hand, neurone models, like Harmon's (1961) and Lettvin's (1959), are based upon facts of microscopic physiology and have the same predictive power linked to the same restrictions as an overtly physiological construction.

Next, there are models which start from a few physiological facts such as known characteristics or connectivities of neurones and add to these certain cybernetically plausible assumptions. At a microscopic level, McCulloch's (1960) work is the most explicit case of this technique (though it does not, in fact, refer to adaptation so much as to perception) for its assumptions stem from Boolean Logic (Rashevsky (1960) describes a number of networks that are adaptive). Uttley (1956), using a different set of assumptions, considered the hypothesis that conditional probability computation occurs extensively in the nervous system. At a macroscopic level, Beurle (1954) has constructed a statistical mechanical model involving a population of artificial neurones which has been successfully simulated, whilst Napalkov's (1961) proposals lie between the microscopic and macroscopic extremes.

Cyberneticians are naturally concerned with the logic of large systems and the logical calibre of the learning process. Thus Willis (1959) and Cameron (1960) point out the advantages and limitations of threshold logic. Papert (1960) considers the constraints imposed upon the adaptive process in a wholly arbitrary network, and Ivahnenko (1962) recently published a series of papers reconciling the presently opposed idea of the brain as an undifferentiated fully malleable system and as a well structured device that has a few adaptive parameters. MacKay (1951) has discussed the philosophy of learning as such, the implications of the word and the extent to which learning behaviour can be simulated; in addition to which he has proposed a number of brain-like automata. But it is Ashby (1956) who takes the purely cybernetic approach to learning. Physiological mechanisms are shown to be special cases of completely general systems exhibiting principles such as homeostasis and dynamic stability. He considers the behaviour of these systems in different experimental conditions and displays such statements as 'the system learns' or 'the system has a memory' in their true colour as assertions that are made relative to a particular observer. 

Sunday, 4 September 2016

Levels of Constraint in Music and Recursive Distinguishing

A while ago I wrote a computer program which analysed the MIDI signals coming from my piano as I improvised, making a range of entropy calculations on the fly using a sliding-window technique (so the entropy was relative to a window of recent events): see http://dailyimprovisation.blogspot.co.uk/2015/09/entropy-and-aesthetics-some-musical.html. I was fascinated to watch the numbers shift as I played, and to observe my emotions correlating with these numbers. I have some reservations about the sliding-window idea, but it was pragmatic and certainly interesting. I ought to write it up.
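Something like the following minimal sketch captures the sliding-window idea (an illustration only, not the original program: the window size, and the use of raw MIDI note numbers as the counted symbols, are assumptions):

    import math
    from collections import Counter, deque

    def shannon_entropy(symbols):
        # Shannon entropy (in bits) of a sequence of symbols
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    class SlidingEntropy:
        # Tracks the entropy of the most recent `window` events
        def __init__(self, window=16):
            self.events = deque(maxlen=window)

        def push(self, event):
            self.events.append(event)
            return shannon_entropy(self.events)

    # Illustrative use: note numbers arriving from a MIDI source
    tracker = SlidingEntropy(window=8)
    for note in [60, 64, 67, 72, 67, 64, 60, 61]:
        print(note, round(tracker.push(note), 3))

Repetitive material keeps the number low; a sudden excursion pushes it up - roughly the fluctuation I was watching as I played.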

In my last post I wrote about information and Shannon entropy in the Schenker graphs. As one moves up the levels from foreground to background, I think there is decreasing uncertainty and increasing constraint. However, what I didn't say is that I think prolongation - which is the fundamental concept in Schenker - exists because of the inter-relationship between levels: it is the dynamics between different kinds of constraint which produce the levels in the first place. Each level only exists because of the constraint relations it has with harmony, rhythm, etc. There is no melody without rhythm, no harmony without melody, no music without the whole thing. I hinted at this in a rather weird article I wrote for Kybernetes a few years ago using Beer's VSM as a tool to think with (see http://www.emeraldinsight.com/doi/abs/10.1108/03684921111160304). And the "whole thing" is always an out-of-reach thing-in-itself. The analyst (Schenker in this case) brings constraints to bear on the music.



Since Bach's first prelude is Schenker's most famous (and simplest) example, what is its entropic structure? Well, using the technique in my software, there is little rhythmic information: basically it's all semiquavers. That means there is high constraint. There's also little information in the notes that are used: the broken chords repeat themselves each bar. If we just looked at the chords, then of course there is difference, as the harmonic scheme revealed by the chords unfolds: that carries more information. If we look for motifs, the accompaniment breaks down into the first rising broken chord, followed by the repetition of the last three notes of the rising chord (and this pattern repeats). If we look at intervals, something interesting happens when the accompaniment uses 2nds rather than 3rds and 4ths: that is a difference.

If I were simply to look for notes that are different, then the chromatic notes appearing later on are striking - they move the music on. We could also look at the entropy of register: the bass goes lower (as in so many Bach preludes and fugues). We know we are coming to the end when the patterns are broken (when the left hand only plays once a bar and the right extends its idea over the whole bar).

What I notice most about this kind of approach is the fact that counterpoint is fundamentally an overlapping constraint: something is kept the same whilst something else is changed. What happens at the end? The constraints in a variety of different dimensions all come together.

Information theory is powerful, but basically it is simply about counting. My program counts things it has been told about: rhythm, harmony, intervals, notes, registers, etc. As the music unfolds, there are new things to count which might not be immediately obvious when the music starts: motifs, articulations of form, and so on. As a musician, I can use my ears (what's that?) to guide the selection of the things that might be counted. But I'm curious as to whether the selection of things-to-be-counted might be arbitrary: what matters is not the distinctions that are made and the counting which occurs; it is the relations between the entropies of the different counted distinctions.
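To make the point concrete (a toy sketch with invented data, not my analysis software): the same stream of events can be counted under several distinctions at once, and it is the comparison between the resulting entropies that is informative:

    import math
    from collections import Counter

    def entropy(seq):
        # Shannon entropy (in bits) of any sequence of hashable symbols
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # One stream of (note, duration) events, counted under three distinctions
    events = [(60, 0.25), (64, 0.25), (67, 0.25), (72, 0.25),
              (62, 0.25), (65, 0.25), (69, 0.25), (74, 0.25)]

    notes     = [n for n, _ in events]
    rhythms   = [d for _, d in events]                    # all semiquavers
    intervals = [b - a for a, b in zip(notes, notes[1:])]

    for name, view in [("notes", notes), ("rhythms", rhythms), ("intervals", intervals)]:
        print(name, round(entropy(view), 3))

The rhythmic distinction yields zero entropy (total constraint, as in the Bach prelude), whilst notes and intervals carry more information: what matters is the pattern of relations between these figures, not any one of them.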

A machine-learning algorithm, for example, makes distinctions about features which might be counted to identify a class of object. The important point is that the algorithm can be consistent in identifying new forms of regularity. Some of them will be insignificant in their relations to others. But other distinctions will be highly significant in their relations to others.

There's something going on at the moment in cybernetics called 'recursive distinguishing' developed by Louis Kauffman and Joel Isaacson (see http://homepages.math.uic.edu/~kauffman/RD.html; this is particularly interesting: https://dl.dropboxusercontent.com/u/11067256/JSPSpr2016.pdf). Shannon's information equations are a crude instrument for doing this kind of stuff, and I'm increasingly aware of the need for some form of recursive measurement. It helps me to blog this - this is all very speculative! The important thing in information theory is counting, and identifying things to count. Shannon measures 'surprise', but surprise arises over time. Time unfolds like music: it refers to itself (actually that's tricky - but that's another post on 'what is self-reference?')... what that means practically is that there are continually emerging new categories of things-to-count. But the specific categories of things to count are not important; what matters are the relations between the constraints identified in counting whatever it is we count (in fact, 'to count' is itself a relation).

It's late. I need to think about this.

Saturday, 3 September 2016

An Anatomy of Surprise in Music, Drama and the Classroom

When something surprising happens - be it some kind of accident, a transformation, a joke - we can rationalise it and identify the causes of the surprise. Accidents are a classic example: we look for the 'factors' which caused the accident - learning about causal factors like a slippery floor or a blind bend in the road might help avoid similar accidents in future. Nice surprises like a joke's punchline, or the climax in a play or a piece of music, can also be analysed for the factors which produce them.

However many factors we identify, we can never enumerate all the 'factors' or angles which lead to a surprise. I think that's another way of saying that there are many possible stories that might be told about it. In music, for example, there are stories about the harmony, melody, rhythm, dynamics, structure, tempo, and so on. In drama, there is the back story of the characters, the situations they find themselves in, the material environment, the possessions and artefacts (like a murder weapon) to which they have access, the objects of their affection, and so on. This also applies to the classroom - which is the scene of a kind of drama. Each dimension viewed in its own terms will produce its own pattern of surprises - though of course what "on its own terms" means needs unpacking. The surprising event will occur at a point of coordination between many different factors, some of which we only become aware of after the event.

In discussion with one of my PhD students the other day, we talked about layers of cloud overlapping one another, where the surprise is the sun breaking through (in Manchester it is a surprise!). I like the metaphor, but it makes the business of identifying the different layers look too easy - clouds stratify neatly into distinct layers.

In drama, music or the classroom there are different descriptions, and the different descriptions have different structures. There are also layers of descriptions. In a Schenkerian analysis of a piece of music, for example, there are descriptions of a large-scale structure of 'prolongation' of a fundamental tonality; there is a description of key melodic features (and to some extent rhythm - although this is not Schenker's strong point), and there is a description of the surface with more of the detail exposed. The climax occurs at the point of coordination or overlap between the layers.

There is much less "information" or "uncertainty" conveyed at a deep level than at the surface level. The difference between the deep structure and the surface is an increase in constraint. The deep level points to clear moments of structural articulation which, even if one didn't accept the terms of the analysis, one would still be able to hear. The middle layer has less constraint, but exhibits some attenuation in stripping out superfluous features. The foreground layer exhibits the least constraint and is closest to the actual music taken in totality, without any specific distinctions being made. The meaningfulness of the Schenker graph is a dynamic of oscillation between high and low constraint.

This brings me on to these diagrams by Gordon Pask from his research into teaching machines. These graphs demonstrate the different strategies learners take in negotiating new concepts. The Y-axis of each is labelled according to the different variables A, B, C, D (which can be taken to be different concepts) and their combinations. Single concepts (A, B, C, D) are at the top - these represent high constraint and relatively low information, in the focus on individual concepts of the topic. At the bottom is the label ABCD (no commas), which represents the integration of the concepts A, B, C, D. This represents lower constraint and higher information: a master of a subject is able to produce a variety of interpretations, representations and performances of it.
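The lattice of labels Pask uses can be enumerated mechanically (a toy sketch; only the four concept names come from the description above, everything else is illustrative):

    from itertools import combinations

    concepts = "ABCD"
    # From single concepts (top of Pask's graphs, high constraint)
    # down to the full integration ABCD (bottom, low constraint)
    for r in range(1, len(concepts) + 1):
        level = ["".join(c) for c in combinations(concepts, r)]
        print("level", r, ":", ", ".join(level))

Each row of output corresponds to one level of the Y-axis labelling, from A, B, C, D at the top to ABCD at the bottom.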


But what of surprise here? I think the point is that as one journey into concepts takes place, it interacts with another journey into concepts: there is a conversation. Surprises occur between people when one person does something that the other doesn't expect. The behaviour of the other person does not fit the model held by the person who is surprised.

Pask is interesting in his analysis because he codified the levels of description as degrees of complex interactions. He created a situation (and technology) where the combinations of specific concepts could be monitored - this is what we should be doing with learning analytics!

When we look at drama, is it possible to analyse the 'concepts' that each character has, and to consider the dynamics of the conversations they have with one another and see where they have to adjust their models of each other? When each character has to change their conceptual model, they have learnt something, and the result of the learning is to change the dynamics of the conversation. This is the moment of surprise.

In music, we "learn something" in the moment of climax, or some other event that surprises us. We can describe the rhythm, melody, dynamics, etc, and surprise will be some kind of "coming together" of different descriptions. What if each description is like Pask's A, B, C and D? In music though, we start at ABCD (at the bottom). At different moments, we focus on different 'concepts' or aspects. As the thing progresses, the aspects change, and indeed, what might be called "A" or "B" changes. What we learn at the moment of surprise is something about the composer, and maybe something about the performer.

The 'model' of music we create on listening is very much like the model we create of other people in conversation. What this is, fundamentally, is a pattern of constraint relations: a map of the relations between aspects which are highly focused (for example, the rhythm in Bolero) and the unconstrained totality, one which can take us from the unconstrained totality back to details.

Thursday, 1 September 2016

Stafford Beer's Obituary for Ross Ashby: "Requisite Ross"

I am currently in the process of making a deep study of the work of Ross Ashby, and I came across this obituary written by Stafford Beer in his library (which is archived at Liverpool John Moores University), tucked away on sheets of typescript inside a copy of Ashby's book "Design for a Brain". It is extremely illuminating. 
It was meant to be published in a journal of the American Society for Cybernetics called 'Forum' in (or around) 1972, when Ashby died. I would have thought Beer would have kept a copy, but it isn't in the archive, so it's pretty inaccessible.

Requisite Ross - by Stafford Beer

‘Someone is boring me. I think it is me’. Thus once remarked the Welsh poet Dylan Thomas. Often when talking to people about Ross Ashby I am myself assailed by this feeling; and I am quite serious in saying that I know that it is Ross himself who sets it up in me. It is open to you to interpret that remark in various ways.  The least challenging is to say if you will (though I shall not) that obviously the focussing of memory on a man who hated effusion would prompt the suggestion not to be too effusive. If you were to say this, then I reckon that Ross would applaud your use of Occam’s Razor – and then come at you with a merciless scalpel about the ‘obviously’ you used.

Ashby was a man of such precise and incisive thought processes that he did indeed operate as a surgeon of the intellect, whereas he was far too gentle and sensitive a person to have been the neurosurgeon that this ‘brain man’ of our great affection might otherwise have become. Perhaps he did not always realize that some people have even less relish for the dissection of their treasured notions of what-it’s-all-about than for the dissection of their prefrontal lobes. At least you get a general anaesthetic for that, and not a shot of Ashby’s Special – which could evoke instant hypersensitivity. Ashby’s British compatriots are especially ingenious (so I being me have also observed) in finding ways in which to accommodate the most irrefutable evidence that their model is wrong. It would be culpable to deny the evidence: they do not. It would be absurd to alter the model: all right-thinking people know it to be right. The basic trick is to acknowledge all aspects of the effort that has been made, and its great importance – ‘when the time is ripe’. Of course the time is never quite ripe; meanwhile, certain (undetectable) adjustments to the model are understood to have been made. And so on.

Most sadly, this lack of recognition in Britain of his discoveries hurt Ross deeply: he felt that his important new concepts (as they were and remain) were being spurned (as they were and still are) both by The Establishment, and by those engaged in managing affairs. He often spoke to me of his outrage at the impregnability of the first, and of his simple amazement at the incomprehension of the second. So I would engage him in discourse about the pathology of those very mechanisms of viability that he himself had disclosed, and beg him not to suffer any hurt himself; it was for the loss of the scientific and practical advances which he had conceived and made possible – but which he blamed himself for not managing to effect. Well, he gave us who study his works all those ideas free of charge. Let us accept that gift and handle it impeccably, for it was passed to us (I have been seeking to show) with exceptional innocence.

At the start I expressed, though I did not explain, the difficulty I have in speaking of Ross. After many false starts to this note, I managed to get going by firing both barrels of the only critical gun at my disposal, one after the other. Negative effusion, you see. Suddenly (surprise, surprise) they too have just turned themselves into a twenty-one gun salute.

Ashby did not want personal plaudits; and, as I said, he hated effusion – although without doubt he recognized and also returned love. It was, in these circumstances, a risky course on which I embarked in the British University where I teach. I nominated W Ross Ashby for a Doctorate in Science Honoris causa. Now this was after his return from the United States as Professor Emeritus. I wrote an encomium, as required, which I dared not show him. He would have his chance to turn down the honour when it was offered. He would certainly be cross with me, but might accept it for the sake of cybernetics; and that might in turn secretly ameliorate his hurt. The selection committee met, announced the year’s honorary degrees, and departed. I heard nothing at all. So I made enquiries. I was told that Ross’s name did not appear in ‘Who’s Why’. Perhaps you have experienced the feeling that you don’t know whether to laugh or cry. It was exactly this kind of twaddle that the honour was intended to put down in principle – and to lay to rest for ever in his case.

It did not at this point occur again to me that I might after all be doing just the wrong thing. I was incensed for Ross’s reputation; he was sitting unbeknownst of these entire developments at his home in Bristol, and visiting nearby Cardiff as an Honorary Professorial Fellow. Moreover, I was angry with my own incompetence. The next year, then, his name went forward again. This time it was accompanied by a huge dossier. There were letters of support from every cybernetician in the world whose name would count. In addition, there were Nobel Laureates, leading scientists from other disciplines, top managers…. For all that I can now remember, I may have cited the Queen. At any rate, this was a dossier to wring the heart of Genghis Khan. Just a few weeks before the selection committee met again, the said W Ross Ashby died. Despite a certain austerity of manner behind which the shyness hid, Ross usually laughed a lot. And I can hear his laughter now – this being the first he has heard of these well-meant fiascos.





You will have noticed that I am treating the chance to write about my friend anecdotally. Others, more archival in their scholarship than I, will surely write brilliantly about his scientific achievements. I believe in the oral tradition. Anyone can gain access to the official story; but I had the privilege that it is now up to me to answer for. Is this for reasons of self-indulgence, or for the dubious delights of gossip-mongering, or what? The answer lies in the what, as I hope you have recognized by now. Hearken then again:

I have another good and true friend, to whom I mentioned the project of this special edition of Forum. One of the issues about which he and I always vehemently disagreed is the status of the Law of Requisite Variety (which, in and out of season, I reference as Ashby’s Law – and I hope that you will too). For, argues this friend, it is a mere tautology. ‘Only variety can absorb variety.’ Always suspect the word ‘mere’. Consider: the entire corpus of mathematics is either tautologous – or wrong. Wrong is wrong. Tautologies are right, and that’s a start. What’s wrong cannot be (directly) useful. Were it not so, mathematics would never be useful – but they are. And no-one calls them ‘mere’ (except as do I in too many cases of misconceived OR).

Well, this other friend with whom I disagreed about Ashby's Law and to whom I mentioned this very edition of Forum, became quite angry. He asked me not to talk to him any more about this issue or about the contribution which you are now reading. Was this because of his hostility to Ashby's Law? Now, it was not. It was because (my friend said) a scientist had the right to express himself as he chose; and he should therefore be judged by his published and authenticated works. It was at best superfluous, certainly impertinent, and potentially damaging to talk anecdotally about the man himself. But, I argued, published works suffer variety attenuation by the rules of the publication game: especially if an author be too diffident to challenge or to circumvent those rules. Only variety can absorb variety, after all – so if we do not have it, we must needs generate it. Was it not possible that the full flowering of the recipient understanding could be amplified by the injection of variety concerning the further nature of the author? Was that author capable of jokes? Did he customarily employ the full pelt of dramatic irony? Were his mathematics suspect? Did he ever publish things (such as on television, where there is a permanent record) on an off-day? And if, as in Ross's case, he were capable of learning the clarinet at retirement age, would or would not that throw light on his control of his own development and innovative qualities? No (the answers were tetchy by now) it would not.

Of the hundreds of concocted or reported examples of the relevance of Ashby's Law that I have published over twenty-five years, this true story is the one offered here for the serious Ashbean connoisseur, because of its tail-eating involution. It seems that laws which express mere tautologies ought to be disobeyed in the very process of declaring them tautologous and therefore not susceptible to being disobeyed, and that variety need not actually be requisite since everyone already knows that it must be. The connoisseur should feel the patina of the inside surfaces of this Klein Bottle, and 'nose' the bouquet of its inaccessible wine……




I cannot be sure when first we met; but Ross Ashby and I were seeing each other regularly in the second half of the 'fifties. I was in the Sheffield steel industry then, and he was in Bristol – where also were two prominent cyberneticians. There was Frank George at the University, then alarming a psychology department oriented primarily towards monkeys with biscuit-tins-full of electronics simulating neuronal systems (he is now Professor of Cybernetics at Brunel University in Uxbridge). There was Grey Walter at the Burden Neurological Institute, who was the world authority on electroencephalography – but who was experimenting also with cybernetic tortoises of his own invention. Ashby himself, who was later to become Director of the Burden, was Director of Research (he was a psychologist) at Barnwood House Hospital in Gloucester – which is where he worked when he wrote both his books.

It is hard to remember that Ross was a generation ahead of me. It did not feel like that; he would not allow it; he chopped off my awe at the knees. THAT was something he TAUGHT me about. It is more useful to confess to that than to acknowledge that he drew my attention to Bourbakian algebraic topology – although that was his (earlier) doing too, and very useful it proved to be.

Ashby’s later papers involving this kind of mathematics will take many years to elucidate. A large number belong to the public domain (through the microfiches of the BCL publications made available through the University of Illinois, Urbana). There are in addition, it seems, 7188 pages of notebooks so far undisclosed… It is predictable that many future doctoral theses lie, inanimately suspended, in this yet-fecund soil.

Then what can here be said, in this short and anecdotal memoir, about Ashby’s view of algebraic topology? The answer is: he wore it round his neck.

Meticulous in intellect, meticulous in dress: and, unobtrusively beneath his tie, he wore a thin gold chain – consisting of a triple loop. It was in fact a topological knot, and one that fascinated him. I shall tell you how to make it; because he liked to demonstrate its subtle properties, and because I am holding in my other hand than the hand that holds my pen the nylon ropes that Ross himself strung together as the exemplar for the jeweller who made the golden chain. It was kindly passed to me by Mrs Ashby; and if any reader has ever seen me wearing a triple string of wooden beads, it is the copy that I made of Ross’s knot.

Make three loops of string – just circles. Flatten the second, and pass it through the first. Its two loops hang down like a bloodhound’s dewlaps. Cut the third loop once, pass the end through each of the pendant loops, and rejoin. You now have two completely independent circles, connected by the central circle. If you next shake all this out, you have the triple necklace – in which the crossovers are unobtrusive. If you spread it out flat, and move the loops around and try to understand what’s happening, you may get into something of a trance. At any rate, that’s all the algebraic topology that you will get from me right now – I trust with Ashby’s blessing.



I think it was in 1957 that I invited Ross Ashby and Grey Walter out to lunch in Bristol. Both wore beards, but there the similarity ended. Ross had on a black jacket and striped trousers – or maybe not: the point is that he wore the most formal of his uniforms. Grey was wearing a suit that appeared to have been fabricated out of a green billiard baize – together with a string tie (think of the date; think of England!). I can still recall large chunks of the conversation, and so probably can the other diners nearby, who were taking in the scene open-mouthed, as if it were some sort of cabaret.

I approach the most difficult aspect of these recollections: the way in which Ross Ashby handled ‘awkward’ situations. This luncheon was certainly one, as all three of us expected it to be. The powerful magnetism exerted by these two older men had opposite polarity. Both behaved with the utmost courtesy, and I was beginning to learn how Ross would operate… [...] – we stood together to represent Britain in several international cybernetic scenes.

To provide details of any of these affairs would be journalistic, indiscreet and – in one word – uncouth. I just want to record that Ross Ashby would never ‘take advantage’; that he was perplexed by skullduggery (which he was certainly far too clever not to recognize); that he was endlessly gentle and endlessly tenacious – the most dire combination of all loving souls. Earlier, I used the word ‘innocent’, an operational synonym for which is ‘look out’.

Such impeccable behaviour took it out of him; and although he was wiry, he seemed to acknowledge his physical limits. I was three times with him when he simply stopped. On the most dramatic of these occasions, he not only stopped – but vanished. We had been in a negotiation together, and suddenly I was alone; bogus excuses had to be made… ‘sudden recall’ and so on. Having no idea of the truth, but unworried for him, I limped on: two days later he reappeared – in a café that we had been frequenting. He had driven his car deep into the forest, and locked himself in to play the clarinet. He had had enough, but enough. Ashby knew Ashby’s law. It is astonishing: few people can thus assimilate what they have had occasion to know.

Ross was not only honest to the threshold of pain, he was extremely sensible as well. That being so, I realized that he is not going to allow me much longer….



Late in 1960, a group of Heinz von Foerster’s friends were together in the evening at Heinz’s home in Urbana, Illinois. A complicated ballet ensued, the choreography of which I do not altogether remember. At the precisely proper – the balletic – moment, Heinz offered Ross a Chair in BCL. He quietly accepted, without a moment’s pause, and asked to telephone his wife back home in Bristol. It was the middle of the night: thank goodness that the sun moves from East to West. Everyone concerned was totally astonished – Mrs Ashby, I think I may say, especially. And so he changed his life: for the vitally important years 1961–1970, W Ross Ashby MD was Professor in the Department of Biophysics and Electrical Engineering at the University of Illinois in Urbana. It couldn’t have happened to a nicer psychiatrist.

We walked back alone together to the Faculty Club, where we had adjacent rooms, across the campus under a full moon. We were strolling quietly and relaxed. I told him that I was amazed at his instant decisiveness. He asked me why. I talked about his scientific acumen, his meticulous methodology, his exactitude: I had expected him to ask for a year to consider, to evaluate the evidence for and against emigration. Surely his response had been atypically irrational?

He stopped in his tracks and turned to me, and I shall never forget his TEACHING me at that moment. No, he said calmly. Years of research could not attain to certainty in a decision of this kind: the variety of the options had been far too high. The most rational response would be to notice that the brain is a self-organizing computer which might be able to assimilate the variety, and deliver an output in the form of a hunch. He had felt this hunch. He had rationally obeyed it. And had there been no hunch, no sense of an heuristic process to pursue? Ross shrugged: ‘then the most rational procedure would be to toss a coin’. I wrote in his Times obituary about this judgment that ‘the first comment came from a man who knew as much about the computer-in-the-skull as anyone alive, the second from a man devoid of self-delusion’.


Someone is boring me. I think it’s me.

All right Ross. That’s it.