Friday 27 October 2023

Computer Metaphors and Human Understanding

One of the most serious accusations levelled against cognitivism is that it imposed a computer metaphor on the natural processes of consciousness. At the heart of the approach is the concept of information as conceived by the engineers of electronic systems in the 1950s (particularly Shannon). The problem with this is that there is no coherent definition of information that applies to all the different domains in which one might speak of information: from electronics to biology, to psychology, to philosophy, theology and physics.

Shannon information is a special case, and unique in the sense that it provides a method of quantification. Shannon himself, however, made no pretence of applying it to phenomena other than the engineering situation he focused on. But the quantified definition contains concepts other than information - most notably redundancy (which Shannon, following cyberneticians including Ashby, identified as a constraint on transmission) and noise. Noise is the reason the redundancy is there: Shannon's whole engineering problem concerned distinguishing signal from noise on a communication channel (e.g. a wire).
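For readers who haven't met Shannon's measures, here is a minimal sketch (the message below is made up purely for illustration) of how entropy and redundancy are quantified for a stream of symbols: redundancy is simply the fraction of the maximum possible entropy that the structure of the message leaves unused.

```python
# A minimal sketch of Shannon's quantities, using a made-up message.
# H is the average information per symbol; redundancy is the fraction of
# capacity "spent" on predictable structure - which, as Shannon showed,
# is exactly what lets a receiver recover a signal from noise.
from collections import Counter
from math import log2

message = "the cat sat on the mat the cat sat"

counts = Counter(message)
n = len(message)
probs = [c / n for c in counts.values()]

H = -sum(p * log2(p) for p in probs)      # entropy (bits per symbol)
H_max = log2(len(counts))                 # maximum entropy for this alphabet
redundancy = 1 - H / H_max                # Shannon's relative redundancy

print(f"entropy      = {H:.3f} bits/symbol")
print(f"max entropy  = {H_max:.3f} bits/symbol")
print(f"redundancy   = {redundancy:.2%}")
```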

Shannon was involved with the establishment of cybernetics as a science. He was one of the participants at the later "Macy conferences", where the term "cybernetics" was defined by Norbert Wiener (actually, it may have been the young Heinz von Foerster who was really responsible for this). Shannon would have been aware that other cyberneticians saw redundancy, rather than information, as the key concept of natural systems: most notably, Gregory Bateson saw redundancy as an index of "meaning" - something which was also alluded to by Shannon's co-author, the mathematician Warren Weaver.

But in the years that followed the cybernetic revolution, it was information that became the key concept. Underpinned by the technical architecture first established by John von Neumann (another attendee of the Macy conferences), computers were constructed on a principle that separated processing from storage. This gave rise to the cognitivist separation of "memory" from "intelligence".

There were of course many critiques and revisions: Ulric Neisser, for example, among the early cognitivists, came to challenge the cognitivist orthodoxy. Karl Pribram wrote a wonderful paper on the importance of redundancy in cognition and memory ("The Four Rs of Remembering"; see karlpribram.com/wp-content/uploads/pdf/theory/T-039.pdf). But the information processing model prevailed, inspiring the first wave of Artificial Intelligence and expert systems from the late 80s to the early 90s.

So what have we got now with our AI? 

What is really important is that our current AI is NOT "information" technology. It produces information in the form of predictions, but the means by which those predictions are formed is the analysis and processing of redundancy. This is unlike early AI. The other thing to say is that the technology is inherently noisy. Probabilities are generated for multiple options, and somehow a selection must be made between those probabilities: statistical analysis becomes really important in this selection process. Indeed, in my own involvement with AI development in medical diagnostics, the development of models (for making predictions about images) was far less important than the statistical post-processing that cleaned the noise from the data and increased the sensitivity and specificity of the AI judgement. It will be the same with ChatGPT: there the statistics must ensure that the chatbot doesn't say anything that will upset OpenAI's investors!
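To give a sense of what that post-processing looks like, here is an illustrative sketch (with made-up scores and labels, not the actual diagnostic pipeline): the choice of decision threshold applied to the model's raw probabilities is what determines the sensitivity and specificity of the final judgement.

```python
# An illustrative sketch (hypothetical data, not a real diagnostic pipeline):
# given model probabilities and ground-truth labels, sweep a decision
# threshold and report sensitivity and specificity. The post-processing
# choice of threshold matters as much as the model that produced the scores.
scores = [0.95, 0.80, 0.62, 0.55, 0.40, 0.30, 0.22, 0.10]   # model outputs (made up)
labels = [1,    1,    1,    0,    1,    0,    0,    0]      # 1 = disease present

def sensitivity_specificity(threshold):
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s <  threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s <  threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

for t in (0.25, 0.5, 0.75):
    sens, spec = sensitivity_specificity(t)
    print(f"threshold {t:.2f}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```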

Information and redundancy are two sides of the same coin. But redundancy is much more powerful and important in natural systems, as has been obvious to researchers in ecology and the life sciences for many years (notably the statistical ecologist Robert Ulanowicz, the economist Loet Leydesdorff, Bateson, Terry Deacon, and others). It is also fundamental to education - but few educationalists recognise this.

The best example is the Vygotskian Zone of Proximal Development. I described a year or so ago how the ZPD is basically a zone of "mutual redundancy" (here: "Reconceiving the Digital Network: From Cells to Selves" on researchgate.net), drawing on Leydesdorff's description. ChatGPT underlines the point: Leydesdorff's work is of seminal importance in understanding where we really are in our current phase of socio-technical development.

Nature computes with redundancy, not information - and this is computation quite unlike the computation we do with information. This is not to leave Shannon behind, though: in Shannon, what happens is selection. Symbols are selected by a sender, and interpretations are selected by a receiver. The key to the ability to communicate is that the complexity of the sending machine is equivalent to the complexity of the receiving machine (a restatement of Ashby's Law of Requisite Variety - see the Wikipedia entry on Variety (cybernetics)). If the receiver doesn't have the complexity of the sender, there will be challenges in communication. With such challenges - whether because of noise on the channel or because of insufficient complexity on the part of the receiver - it is necessary for the sender to create more redundancy in the communication: sufficient redundancy can overcome a deficiency in the receiver's capacity to interpret the message.
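A toy demonstration of that last point (my own sketch, not an example from Shannon): if each bit of a message is simply repeated several times and the receiver takes a majority vote, the added redundancy compensates for a noisy channel that would otherwise corrupt a sizeable fraction of the message.

```python
# A toy demonstration: send bits over a noisy channel with and without
# redundancy. Repeating each bit and taking a majority vote at the receiver
# is the crudest possible redundancy, yet it already shows how redundancy
# compensates for noise on the channel.
import random
random.seed(1)

def noisy(bits, p_flip=0.1):
    """Flip each bit with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def send_plain(bits):
    return noisy(bits)

def send_with_redundancy(bits, repeat=5):
    received = []
    for b in bits:
        votes = noisy([b] * repeat)
        received.append(1 if sum(votes) > repeat // 2 else 0)
    return received

message = [random.randint(0, 1) for _ in range(10_000)]
errors_plain = sum(a != b for a, b in zip(message, send_plain(message)))
errors_redundant = sum(a != b for a, b in zip(message, send_with_redundancy(message)))

print(f"error rate without redundancy: {errors_plain / len(message):.3%}")
print(f"error rate with 5x repetition: {errors_redundant / len(message):.3%}")
```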

One of the most remarkable features of AI generally is that it is both created with redundancy and capable of generating large amounts of redundancy. If it couldn't, its capacity to appear meaningful would be diminished.

For many years, working with Leydesdorff, I have been fascinated by the nature of redundancy in the construction of meaning and communication. Music provides a classic example of redundancy in communication - there is so much repetition - which we analysed here: onlinelibrary.wiley.com/doi/full/10.1002/sres.2738. I've just written a new paper on music and biology, to be published soon, which develops these ideas, drawing on the importance of what might be called a "topology of information" with reference to evolutionary biology.
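To make the repetition point concrete, here is a small sketch of my own (the tune and the measure are chosen purely for illustration; this is not the analysis in the paper): for a highly repetitive melody, the entropy of the next note given the previous one is far lower than the unconditioned entropy, which is to say that most of the melody is redundant.

```python
# A small sketch of the repetition point: conditional entropy of the next
# note given the previous one is far lower than the unconditioned entropy
# for a repetitive tune, i.e. most of the melody is redundant.
from collections import Counter
from math import log2

melody = list("CDECCDECEFGEFGGAGFECGAGFECCGCCGC")  # a highly repetitive tune

def entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

H_next = entropy(Counter(melody[1:]))                 # H(next note)
H_prev = entropy(Counter(melody[:-1]))                # H(previous note)
H_joint = entropy(Counter(zip(melody, melody[1:])))   # H(previous, next)
H_cond = H_joint - H_prev                             # H(next | previous)

print(f"H(next note)            = {H_next:.2f} bits")
print(f"H(next note | previous) = {H_cond:.2f} bits")
print(f"redundancy from repetition = {1 - H_cond / H_next:.0%}")
```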

It's not just that the computer metaphor doesn't work. The metaphor that does work is probably musical.

1 comment:

Ib said...

Wonderful! And love the shift to music. Thanks!