Friday 9 October 2020

Mutual Information, Mutual Redundancy and the Cell

The information-theoretical measures of mutual information and mutual redundancy, both derived from Shannon's equations, have the same kind of organic feel to them that was originally displayed in Ashby's homeostat and many other cybernetic devices. This organic quality may have application in the design of new kinds of communication networks which operate on a cellular and ecological basis rather than through a "node-arc" model.

Mutual information is defined as the overlap between the entropies of two phenomena - the extent to which one entropy can be coordinated with the entropy of the other. It is thus a measure of the similarity in the degrees of disorder of two systems. This "similarity in degree of disorder" turns out to be particularly useful in calculating the extent to which information from a source has been transferred to a receiver, which may develop, over time, the capacity to predict the information produced by the source. Thus mutual information can also be considered a measure of the "transfer" of information.

Its calculation can be simplified to the entropy of A, plus the entropy of B, minus the entropy of A and B together: I(A;B) = H(A) + H(B) - H(A,B).
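As a minimal sketch of this calculation (pure Python; the paired observations and the function names here are entirely hypothetical, invented for illustration):

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (in bits) of a distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical paired observations of two phenomena, A and B
pairs = [("x", 1), ("x", 1), ("x", 2), ("y", 2), ("y", 2), ("y", 1)]

H_A  = entropy(Counter(a for a, b in pairs).values())
H_B  = entropy(Counter(b for a, b in pairs).values())
H_AB = entropy(Counter(pairs).values())

print(H_A + H_B - H_AB)   # I(A;B) = H(A) + H(B) - H(A,B)
```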

Most of the time, mutual information in two dimensions, calculated like this, produces a positive result. Indeed, it has been shown that two-dimensional mutual information cannot be negative. Yet under certain circumstances, when Shannon's equations are extended to more dimensions, negative values do appear, and these have prompted much speculation as to what they mean.

In three dimensions, mutual information is more likely to be negative.
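This can be seen directly by extending the same alternating sum of entropies to three variables. A sketch, reusing the entropy helper and the Counter import from above; the XOR-style data is a standard textbook case, not anything from this post:

```python
def interaction_information(triples):
    """Mutual information in three dimensions, via the alternating sum of
    entropies. Unlike the two-dimensional case, this can be negative."""
    def H(key):
        return entropy(Counter(key(t) for t in triples).values())
    return (H(lambda t: t[0]) + H(lambda t: t[1]) + H(lambda t: t[2])
            - H(lambda t: (t[0], t[1])) - H(lambda t: (t[0], t[2]))
            - H(lambda t: (t[1], t[2])) + H(lambda t: t))

# XOR-like dependence: any two variables look independent, but the third
# is fully determined by the other two - the result here is -1.0 bits
triples = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(interaction_information(triples))
```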

These information-theoretical measurements can be related to three fundamental features of cellular organisation. Mutual information can be considered to represent the degree of self-organisation within a cell. Mutual redundancy concerns the overlap in the pattern of constraint between a cell and its environment. A cell also requires energy from its environment, and this can be represented by the extent to which its range of possible actions can be expanded through interacting with its environment (its maximum entropy).
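To make the third of these measures concrete: maximum entropy is the entropy a system would have if every state available to it were equally likely, and the gap between that and the entropy it actually displays is one simple, Shannon-style way of expressing redundancy. A sketch continuing the code above (with the caveat that mutual redundancy in the multi-dimensional sense is a more involved construction than this):

```python
def max_entropy(n_possible_states):
    """Entropy if all possible states were equally likely: log2(n)."""
    return math.log2(n_possible_states)

def simple_redundancy(counts, n_possible_states):
    """Unused capacity: the gap between maximum and observed entropy."""
    return max_entropy(n_possible_states) - entropy(counts)

# A "cell" observed in 3 of 8 possible states; expanding its repertoire
# of possible actions (raising n_possible_states) raises its maximum entropy
print(max_entropy(8))                     # 3.0 bits
print(simple_redundancy([5, 3, 2], 8))    # 3.0 minus the observed entropy
```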

In the information-theoretical analysis of economic activity, the three measures of mutual information, mutual redundancy and maximum entropy can be used to measure the level of innovation in an economy. However, this high-level calculation depends on lower-level processes involving groups of individuals within institutions. If the high-level organisation of the economy can be seen as an "organism", might the low-level communications of individuals within the economy be seen as constituent "cells"?
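As a hedged sketch of how such a measurement might be set up - the firm records and category labels here are entirely invented, though real studies of this kind classify economic actors along dimensions such as technology, geography and organisation:

```python
# Hypothetical firm records: (technology class, region, organisational size)
firms = [("IT", "north", "small"), ("IT", "north", "small"),
         ("biotech", "south", "large"), ("biotech", "south", "large"),
         ("IT", "south", "large"), ("biotech", "north", "small")]

# Reuses interaction_information() from the three-dimensional sketch above.
# A more negative value is read as greater synergy - a shared reduction of
# uncertainty - among the three dimensions of the economy.
print(interaction_information(firms))
```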

Examining this from the perspective of education is interesting. Educational "cells" are not individuals. They are conversations involving a number of people (just as cells contain many interacting components) - and conversations display exactly the same features of mutual information, mutual redundancy and maximum entropy. Importantly, however, conversations have a history. The way a conversation develops depends not just on its own history, but on the history of its components: the personal biographies of a cell's components will play an important role in the development of the conversation.

When academics talk about this "cell-like" communication structure, it is sometimes related to the structure of terrorist groups like the IRA, the mafia, or the French Resistance. It is a principal characteristic of a Clandestine Cellular Network (see https://en.wikipedia.org/wiki/Clandestine_cell_system). Thinking about terrorist groups highlights the importance of a recursive structure in cells: the personal biographies of terrorists and freedom fighters are often tied to emotional trauma in individual histories, and that trauma is instrumental in the growth of the larger communication cell.

But going deeper still, the "cells" of conversations depend on biology - real cells. These too interact on the same principles: mutual information in their self-organisation; mutual redundancy in their engagement with their environment; maximum entropy in their gaining of energy and information from the environment. These cells too have a history which will determine the direction of their development: cells have "hysteresis", bearing the marks of previous stages of evolution.

Information theory is important to this because it provides a way of asking: are the patterns of organisation - between mutual information, mutual redundancy and maximum entropy - related? Are the patterns of a cell related to the patterns of a conversation? Are the patterns of a conversation related to the patterns of an economy?

A mathematical-empirical foundation for asking these questions is important: it allows us to take measurements, make predictions and run simulations. It feels like a different kind of science - one that takes organisation, history and communication together at multiple levels, and across phenomena. My interest is in exploring new ways in which these equations might lead to a re-formation of educational structures using technology.
