Wednesday, 5 June 2013

Beyond Understanding Computers and Cognition

Winograd and Flores's 1986 book "Understanding Computers and Cognition" (really just Flores's, since he wrote much of it as his PhD after being imprisoned in Chile when Pinochet overthrew Allende and killed off the Cybersyn project) is still a landmark in the technology literature. It was the book that argued that the future of computing lay not in data processing and AI but in communication. Accurate predictions in technology are difficult, and this was a particularly good one, considering the web was still some way off. How did they make it? Philosophy, cybernetics and biology were the main ingredients in what remains a rollicking read (I've recommended it to a number of students, who've loved it too). Anyone who thinks technology is a deathly topic of inquiry (as I did when I first came across it) should read it.

They were right. Then. But now? Things have a habit of coming full circle. Their first sentence was "Computers are everywhere." What would we say now? First of all, ubiquity has made "computers" almost disappear. But there's something else. "Data processing is everywhere!" That's where we are now. There isn't a single thing we do online that doesn't get analysed and fed into some kind of decision-support system. And the algorithms are getting better and better.

I wrote a few days back about how Discrete Wavelet Transforms are facilitating the analysis of continuous data, from video mining to medical applications that listen to the sound of bowel movements (or anything else). This is basically 'fingerprinting' technology. YouTube can do its video recognition because the wavelet transform produces a highly compressed key that is effectively unique to each video. Given a searchable databank of keys, and a simple way of applying the algorithm to a clip so that it reveals a searchable part of the key, it is easy to see how they do it.
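To give a flavour of the idea (this is a toy sketch of my own, not YouTube's actual method): a Haar wavelet transform compresses a signal into a handful of coarse coefficients, and the signs of those coefficients make a short binary key that survives small distortions of the original.

```python
# Toy wavelet fingerprinting sketch (illustrative only, not any real system).

def haar_dwt(signal):
    """Full 1-D Haar decomposition; length must be a power of two.
    Returns [overall average] followed by detail coefficients, coarsest first."""
    coeffs = []
    s = list(signal)
    while len(s) > 1:
        avgs = [(s[i] + s[i + 1]) / 2 for i in range(0, len(s), 2)]
        diffs = [(s[i] - s[i + 1]) / 2 for i in range(0, len(s), 2)]
        coeffs = diffs + coeffs  # finer details accumulate at the end
        s = avgs
    return s + coeffs

def fingerprint(signal, bits=8):
    """Signs of the coarsest coefficients -> a compact binary key."""
    c = haar_dwt(signal)
    return tuple(1 if x >= 0 else 0 for x in c[:bits])

def hamming(a, b):
    """Number of differing bits between two keys."""
    return sum(x != y for x, y in zip(a, b))

clip = [3, 5, 4, 8, 7, 9, 6, 2]
noisy = [x + 0.2 for x in clip]      # mild distortion of the same 'clip'
other = [9, 1, 8, 2, 7, 3, 6, 4]     # a different 'clip'

print(fingerprint(clip) == fingerprint(noisy))          # True: keys match
print(hamming(fingerprint(clip), fingerprint(other)))   # large distance
```

The point is that near-identical inputs land on the same key, so matching a clip against a databank becomes a cheap lookup rather than a comparison of raw data.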

With text mining, techniques learnt from genetic sequencing are similarly "fingerprinting" our utterances. Facebook and Google know more about us than we think we have revealed. The vast fingerprinting and trawling of patterns of human text engagement means that reasonable prediction becomes a reality. There are, of course, many applications of this which could be great. Add to this the semantic networks which exploit the data mining, continually adapting new inferences and rules, and there's some impressive stuff going on. But it's all rather Orwellian, isn't it?
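The borrowing from sequencing can be sketched simply (again, my own toy example, not any company's pipeline): genomics chops DNA into overlapping k-mers to compare sequences, and the same trick on text - overlapping character k-grams, or "shingles" - lets near-duplicate utterances be matched by set overlap.

```python
# Toy k-mer fingerprinting of text, in the style of sequence comparison
# (illustrative sketch only).

def shingles(text, k=3):
    """The set of overlapping character k-grams of a text."""
    t = text.lower()
    return {t[i:i + k] for i in range(len(t) - k + 1)}

def jaccard(a, b):
    """Overlap of two fingerprints: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

s1 = shingles("the quick brown fox")
s2 = shingles("the quick brown dog")
s3 = shingles("completely unrelated words")

print(jaccard(s1, s2) > jaccard(s1, s3))  # True: closer texts share more k-mers
```

Two texts that say nearly the same thing share most of their shingles, so their Jaccard score is high; unrelated texts share almost none. Scale that up across billions of utterances and "reasonable prediction" starts to look mechanical.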

Winograd and Flores would say "This isn't AI". Maybe. But it's bloody clever all the same. And it's new. It depends on the fact that we have now wired ourselves up to each other in a way that even Flores would have been astonished by.

I think we need to look at computers afresh. Our world has changed. Super-powerful inference engines exhibit a kind of 'agency': information, after all, is constraint. But I put 'agency' in inverted commas because I don't think it really is agency (here I disagree with the socio-materialists and the Actor-Network Theory people). At least, it's not the agency of a machine. Agency, I think, requires conscience - and computers can't have that.

The technology question is, more than ever before, a political question. The agency of the machines is the agency of the people behind the machines. Who are they? Who votes for them? What are they up to? How can we get rid of them if we don't like them?

What they are up to should concern all of us. Their motivation is one problem. But their methods are dangerous. The apparent cleverness of the data analysis increasingly takes us into a 'positive' world where all that matters is what can be shown on a network graph. What about all the stuff in between? What about the fact that whatever I just tweeted, it wasn't quite what I really thought - just an approximation. An abstraction. Not the reality. Yet whatever has been concretely launched into cyberspace is "the truth". We're losing our grip on the space between things. It's like analysing thousands of Bach fugues to determine his 'rules', and then asserting that "Bach was following rules". [of course, he didn't write thousands!]

Whether these are good people or bad people, their naivety is akin to recklessly drilling for oil at the bottom of the ocean. But the real problem is that they act outside the regulations upheld by the people we vote for. The people acting through machines are a new elite whom we do not vote for. This is a challenge for government and democracy.

Bringing technology into the political domain may be the most significant challenge we face in the 21st century.
