Tuesday 1 December 2015

Re-understanding "Understanding Computers and Cognition"

I will always regret that nobody told me about Winograd and Flores's "Understanding Computers and Cognition" when I was a teenager first encountering computers. As it was, I read it in 2002 at the recommendation of Oleg Liber, and it transformed my perspective not only on technology, but on art, education, emotion and meaning. It provided a framework for a deeper understanding of technology and its relation to human organisation - an understanding which is fundamental to the exploitation of computers in education. I am grateful for this, although in the years that have passed, through numerous projects involving technology and education, my enthusiasm for the cybernetic theory and the phenomenological and analytical philosophy (Heidegger and speech act theory) that underpinned Winograd and Flores has waxed and waned. But now that we live in an age when our teenagers cannot remember a world without the internet, it is a book that demands study even more urgently.

Winograd and Flores's (really, it's Flores's book) real achievement is that they asserted that computers were about communication, not data processing and AI (which was the dominant view in 1986) - and they were proved spectacularly right a few years later with the advent of the web. It's notoriously hard to make technological predictions: they showed the way to do it - with cybernetics and philosophy!

But that was 1986. What would they say if they were to write it now? Their theoretical ground is slow-moving - but there has been movement - most notably from John Searle, whose speech act theory they relied on most heavily, using it to construct their "conversation for action" model. In recent years, Searle has thought very deeply about "social reality" - something his younger self would have dismissed as an epiphenomenon of speech acts. His recent work remains language-based, but he acknowledges the existence of social institutions, presidents, money and armed forces as something more than an individual construction. Social reality is constituted by special kinds of speech act called 'status function declarations': declarations by powerful individuals or institutions about states of affairs - networks of rights, responsibilities, obligations and commitments - upheld by a 'collective intentionality' which plays along with the declaration. So we have 'money' (the status function declaration "I promise to pay the bearer..."), university certificates, company declarations, laws, and so on.
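Searle compresses this into the formula "X counts as Y in context C". As a way of fixing the idea, here is a toy Python sketch of my own (nothing like this appears in Searle, or in Winograd and Flores) of a status function declaration that holds only for as long as a community collectively recognises it:

    from dataclasses import dataclass, field

    @dataclass
    class StatusFunction:
        """Searle's "X counts as Y in context C", held up by recognition."""
        x: str                # the brute fact, e.g. "this piece of paper"
        y: str                # the status it counts as
        context: str          # the institution within which it holds
        declared_by: str      # whoever has the standing to declare it
        recognisers: set = field(default_factory=set)

        def recognise(self, agent):
            """An agent 'plays along' with the declaration."""
            self.recognisers.add(agent)

        def in_force(self, community):
            """The status holds only while the whole community upholds it."""
            return set(community) <= self.recognisers

    # Money as a status function declaration:
    note = StatusFunction(
        x="this piece of paper",
        y="a promise to pay the bearer ten pounds",
        context="the UK monetary system",
        declared_by="the Bank of England",
    )
    note.recognise("alice")
    note.recognise("bob")
    print(note.in_force({"alice", "bob"}))    # True: collective intentionality
    print(note.in_force({"alice", "carol"}))  # False: carol doesn't play along

The interesting part is the last line: the declaration by itself does nothing without the collective intentionality that sustains it.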

We also now have software, online networks, educational technologies, web services, systems interoperability, Twitter, Facebook, porn, trolls and Tinder (to name a few!). How do status functions and collective intentionality relate to these? The complexity of these new technological forms of "social reality" makes me think that Winograd and Flores's original "conversation for action" diagram now needs to be rethought. They saw computers as ways of managing the social commitments we make to each other (commitment has been a key feature of Flores's work). But commitments are situated within a web of status function declarations which make up the social world. The speech acts people make in agreeing or declining to do something are much more nuanced than Winograd and Flores originally thought. Technologies now come with layers of commitments: to agree to use system x is to get sucked into a range of new status functions which aren't immediately visible on the surface. Teachers might initially think an e-portfolio is a good idea; but after experience with the e-portfolio system, they find that the commitments to its various sub-status functions conflict with other aspects of their practice, and so they find themselves either not doing what they originally committed to do, or having to rethink fundamental parts of their practice which they had not reckoned with at the outset. This can help to explain why thousands of people sign up for MOOCs, but so few complete them.
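To make the point concrete, here is a minimal Python sketch of the basic "conversation for action" loop - my own simplification, with invented state and act names, not the book's actual diagram. Each speech act moves the conversation between states:

    # The basic loop: A requests, B promises, B reports completion,
    # A declares satisfaction. States and acts here are illustrative only.
    CFA = {
        ("start",       "A:request"):             "requested",
        ("requested",   "B:promise"):             "promised",
        ("requested",   "B:decline"):             "closed",
        ("requested",   "B:counter"):             "negotiating",
        ("negotiating", "A:accept"):              "promised",
        ("promised",    "B:report-completion"):   "reported",
        ("reported",    "A:declare-satisfied"):   "closed",
        ("reported",    "A:declare-unsatisfied"): "promised",
    }

    def run(acts):
        state = "start"
        for act in acts:
            state = CFA[(state, act)]  # KeyError: act unavailable in this state
        return state

    # The happy path closes the loop:
    print(run(["A:request", "B:promise",
               "B:report-completion", "A:declare-satisfied"]))  # "closed"

What a flat diagram like this hides is exactly the layering described above: in the e-portfolio case, the teacher's "B:promise" is not one commitment but a bundle, silently signing them up to the system's workflows, data structures and audit trails - sub-status functions that only surface later, in conflict with the rest of their practice.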

As our technology becomes more complex and our institutions become more technocratic, the accretion of layers of status functions within the technology demands an ever-shifting compliance. The problem is that critical engagement with the technology - where we seek appropriate technical solutions to real social problems - can lose out to slavish human adaptation to the technical machinery, with a consequent loss of responsibility-taking and autonomy: we let the technology create the problems it can solve. The result is a conflicted self, torn between human needs and technical requirements. The later Heidegger (whom Winograd and Flores ignore, concentrating on his earlier work) had a rather bleak name for this: "enframing".
