Thursday 29 April 2021

Real "Digital" vs Education's idea of "digital": Some reflections of computational thinking

Digitalization is (once again) the hot topic in education. Amid concern that students leave university without digital skills, educational policy is focusing on "instilling digital skills" from primary school upwards. In Europe and the US, this is labelled "computational thinking", and it is closely related to the (rapidly failing) drive to push computer science in schools.

Rather as with the STEM agenda, to which it is of course related, there is a difference between education's idea of "digital" and the real world of "digital" happening in institutions and companies, where skills are definitely needed.

What is the real world of digital? Perhaps the first thing to say is that there is no single "real world". There are dispositions shared by technical people working in a variety of different environments, and there are vast differences between the kinds of environments and the levels of skill involved. For example, Python programming to analyse data is one thing; using tools like Tableau is another. There are the hard-core software development skills involved in enterprise system development with various frameworks (I'm currently banging my head against Eclipse, Liferay and Docker), and then there are those areas of skill which relate to the sexier things in technology that grab headlines and make policymakers worry that there is a skills gap - AI particularly.

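To make the lower end of that spectrum concrete, here is a minimal sketch of the kind of "Python to analyse data" work I mean - summarising a hypothetical CSV of exam results with pandas. The file name and column names are illustrative assumptions, not taken from any real system.

    # A minimal sketch, assuming a hypothetical "results.csv" with
    # "student" and "score" columns - purely illustrative.
    import pandas as pd

    df = pd.read_csv("results.csv")                  # load the hypothetical data
    summary = df.groupby("student")["score"].mean()  # average score per student
    print(summary.sort_values(ascending=False))      # best performers first

It is trivially simple - and that is rather the point: even the "data analysis" end of digital skill runs from a few lines of pandas to full statistical modelling.
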
So what do governments and policymakers really mean when they urge everyone towards "digitalization"? After all, engineering is pretty important in the world, but we don't insist on everyone learning engineering. So why computational thinking?

Part of the answer lies in the sheer number of areas of work where "digital" dominates. The thinking is that "digital skill" is like "reading" - a form of literacy. But is digital skill like reading and writing? Reading, after all, isn't merely a function which enables people to work. It is embedded in culture as a source of pleasure, conviviality and conversation. By contrast, "digital skill" is very pale and overtly functionalist in a way that reading and writing aren't.

The functionalism that sits behind computational thinking seems particularly hollow. These are, after all, digital skills to enable people to work. But to work where? Yes, there is a need for technically skilled people in organisations - but how many? How many software developers do we need? How many data analysts? Not a huge number relative to the workforce, I would guess. So what does everyone else do? They click on buttons in tracker apps that monitor their movements at work, they comply with surveillance requests, they complete mindless compulsory "training" so that their employers don't get sued, they sit on Zoom, they submit logs of their activities on computers in their moments of rest, they post inane comments on social media, and they end up emptied and dehumanized - the pushers of endless online transactions. Not exactly a sales pitch. Personally, I would be left wishing I'd done the engineering course!

A more fundamental problem is that most organisations have more technically skilled people than they either know about or choose to use effectively. This is a more serious and structural problem. People who are really good at "digital" (whatever that means) are creative, and the last thing many organisations (or many senior managers in them) want is creativity. They want compliance, not creativity. They want someone who doesn't show them up as being less technically skilled. And often they act to suppress creativity and those who have skills, giving them tasks well beneath their abilities. I don't think there's a single organisation anywhere where some of this isn't going on. Real digital skill is a threat to hierarchies, and hierarchies kick back.

Educational agendas like computational thinking are metasystemic interventions, alongside things like quality controls and standards, curricula, monitoring and approved technical systems. The point of a metasystemic intervention is to manage the uncertainty of the system. Every system has uncertainty because every system draws a distinction between itself and its environment - and there is always a question as to where that boundary should be drawn and how it can be maintained. The computational thinking agenda is an attempt to maintain an already-existing boundary.

Our deep problem is that the boundary between institutions, companies and other social organisations and their environments has been upheld and reinforced through the increasing use of technology. Yet it is technology in the environment of those institutions that drove them to reinforce their boundaries with technology in the first place. Technology is woven into the very fabric of the machine that maintains the institutions we have - institutions which have themselves used technology to avoid being reconstructed. To maintain their traditional boundaries, institutions must maintain their technologies. Therefore they need everyone to comply with and operate those technologies, and a few to enhance them. But how does this not end in over-specialisation and slavery? How does it create rewarding work and nurture creativity?

No education system and no teacher should be in the business of preparing people for servitude. So what's to be done?

The question is certainly not about digital "literacy". It is about emancipation, individuation and conviviality in a technological environment. Our technologies matter here - particularly (I think) AI and quantum computing. But they matter because they can help us redesign our institutions, and in the process discover ourselves. That, I suspect, is not what the policymakers want, because ultimately it will threaten their position. But it is what needs to happen.
