Saturday, 7 September 2019

Information Loss and Conservation

One of the ironies of any "information system" is that it discards information. Quite simply, anything which processes large amounts of data to produce an "answer", which is then acted on by humans, is attenuating that data in various ways. Often this attenuation follows latent biases: in the humans requesting the information, in the datasets being processed, or in the algorithms themselves. Bias is itself a form of attenuation, and the biases which have recently been exposed around racial prejudice in machine learning highlight the fundamentally dangerous problem of loss of information in organisations and society.

In his book "The Human Use of Human Beings", Norbert Wiener worried that our use of technology sat on a knife-edge: it could be used either to destroy us or to save us from ourselves. I want to be more specific about this knife-edge. It is whether we learn to conserve information within our society and institutions, rather than using technology to accelerate the process of information destruction. With the information technologies of the last 50 years - their latency (which means all news is old news) and their emphasis on databases and information processing - loss of information has appeared inevitable.

This apparently "inevitable" loss of information is tacitly accepted by all institutions, from government downwards. Given the hierarchical structures of our institutions, we can only deal with "averages" and "approximations" of what is happening on the ground, and we have little capacity for assessing whether we are attenuating out the right information, or whether our models of the world are right. To think this is not inevitable is to think that our organisations are badly organised - and that remains an unthinkable thought, even today. Beyond this, few organisations run experiments to test whether the world they think they are operating in is the world they actually operate in. Consequently, we see catastrophe involving the destruction of environments, whether corporate (the banking crisis), social (Trump, Brexit), scientific (university marketisation), natural (global warming), or economic.

Of course, attenuation is necessary: individuals are less complex than institutions, and institutions are less complex than societies. Somehow, a selection must be made of what, among the available information, is important. But selection must be accompanied by a process of checking that the model of the world created through these selections is correct. So if information is attenuated from environment to individual, the individual must amplify their model of the world and of themselves in the environment. This "amplification" can be thought of as a process of generating alternative descriptions of the information they have absorbed. Many descriptions of the same thing are effectively "redundant" - they are not strictly necessary - but the capacity to generate multiple descriptions of the world creates options and flexibility for managing the complexity of the environment. Redundancy creates opportunities to make connections with the environment - like creating a niche, or a nest - rather in the same way that a spider spins a web (a classic example of amplification).
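To make that concrete, here is a toy sketch in Python (the numbers are invented, purely for illustration): a single summary statistic attenuates two quite different situations onto the same "answer", while a handful of redundant descriptions keeps them distinguishable.

```python
# Toy illustration (invented data): attenuation vs. redundant description.
from statistics import mean, pstdev

# Two quite different "situations on the ground".
steady  = [50, 50, 50, 50, 50, 50]
erratic = [0, 100, 0, 100, 0, 100]

# Attenuation: a single summary statistic. Both situations collapse to the
# same "answer", and the difference between them is irrecoverably lost.
print(mean(steady), mean(erratic))  # 50 50

# Redundancy: several descriptions of the same data. Each extra description
# is "unnecessary" on its own, but together they keep the situations distinct.
def describe(xs):
    return {"mean": mean(xs), "spread": pstdev(xs), "lo": min(xs), "hi": max(xs)}

print(describe(steady))   # {'mean': 50, 'spread': 0.0, 'lo': 50, 'hi': 50}
print(describe(erratic))  # {'mean': 50, 'spread': 50.0, 'lo': 0, 'hi': 100}
```

Nothing in the second set of descriptions is individually necessary, but together they preserve exactly the variety that the lone average throws away.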

The problem we have in society (and, I believe, the root cause of most of our problems) is that the capacity to produce more and more information has exploded. This has produced enormous, unmanageable uncertainty, and existing institutions have only been able to mop up this uncertainty by asserting increasingly rigid categories for dealing with the world. This is why we see "strong men" (usually men) in charge in the world. They are rigid, category-enforcing uncertainty-mops. Unfortunately (as we see in the UK at the moment) they exacerbate the problem: it is a positive-feedback loop which will collapse.

One of the casualties of this increasing conservatism is the capacity to speculate on whether the model of the world we have is correct or not. Austerity is essentially a redundancy-removal process in the name of "social responsibility". Nothing could be further from the truth. More than ever, we need to generate and inspect multiple descriptions of the world that we think we are living in. It is not happening, and so information is being lost, and as the information is lost, the conditions for extremism are enhanced.

I say all this because I wonder if our machine learning technology might provide a corrective. Machine learning can, of course, be used as an attenuative technology: it simplifies judgement by providing an answer. But if we use it like this, then Wiener's worst nightmares will be realised.

But machine learning need not be like this. It might actually be used to help generate the redundant descriptions of reality which we have become incapable of producing ourselves. Machine learning is, after all, a technology which works with redundancy - multiple descriptions of the world - from which it derives an ordering of judgements about the things it has been trained on. While it can be used to produce an "answer", it can also be used to preserve and refine this ordering - particularly if it is closely coupled with human judgement.
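A minimal sketch of the difference, in plain Python (the categories and scores are made up, not taken from any real system): the same model output can either be collapsed to a single answer, or kept as a ranked ordering of judgements for a human to inspect.

```python
import math

def softmax(scores):
    # Turn raw model scores into a probability distribution over categories.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores from some trained classifier, for one input.
categories = ["loan approved", "refer to human", "loan denied"]
probs = softmax([2.1, 1.9, 0.3])

# Attenuative use: collapse the whole judgement to one "answer".
print(max(zip(probs, categories))[1])  # loan approved

# Conservative use: preserve the ordering of judgements for human inspection.
for p, cat in sorted(zip(probs, categories), reverse=True):
    print(f"{cat}: {p:.2f}")
# loan approved: 0.50 / refer to human: 0.41 / loan denied: 0.08
```

The single answer discards the fact that "refer to human" was nearly as plausible as approval; the ordering keeps it, and keeps the human in the loop.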

The critical issue here is that the structures within a convolutional neural network are a kind of fractal (produced through recursively seeking fixed points in the convolutional process between different levels of analysis), and these fractals can serve the function of what appears to be an "anticipatory system". Machine learning systems "predict" the likely categories of data they have not seen before. The important point is that, whatever we think "intelligence" might be, we can be confident that we too have some kind of "anticipatory system", built through redundancy of information. Indeed, as Robert Rosen pointed out, the whole of the natural world appears to operate with anticipatory systems.
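Stripped of the neural-network machinery, the anticipatory pattern itself can be sketched in a few lines of Python (the readings are invented, and this is the simplest possible predictive model - a sketch of the pattern, not a claim about how convolutional networks work): a system fits a model to what it has already absorbed, and acts on the value it expects next, ahead of the observation arriving.

```python
# A minimal anticipatory sketch (invented readings): fit a trend to past
# observations and act on the *expected* next value, ahead of its arrival.

def fit_line(ys):
    # Least-squares slope and intercept for evenly spaced observations.
    n = len(ys)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

history = [10.0, 12.1, 13.9, 16.2, 18.0]        # what the system has absorbed so far
slope, intercept = fit_line(history)
anticipated = slope * len(history) + intercept  # the value it expects next

print(f"anticipated next reading: {anticipated:.1f}")  # ~20.1
```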

We think we operate in "real time", but in the context of anticipatory systems, "real time" actually means "ahead of time". An anticipatory system is a necessary correlate of any attenuative process: without it, no natural system would be viable. Without it, information would be lost. With it, information is preserved.

So have we got an artificial anticipatory system? Are we approaching a state where we might preserve information in our society? I'm increasingly convinced the answer is "yes". If it is "yes", then the good news is that Trump, Brexit and the bureaucratic hierarchy of the EU are all the last stages of a way of life that is about to be supplanted by a very different way of thinking about technology and information. Echoing Wiener: IF we don't destroy ourselves, our technology promises a better and fairer world beyond any expectations we might allow ourselves to entertain right now.

