Monday, 18 January 2021

Computing Conviviality

Conviviality is a state of being between members of a group whose work within that group is felt by all to be meaningful, cohesive and enlivening. Classic descriptions of what it is like to engage in convivial work include the crop-mowing scene in Anna Karenina, where Tolstoy describes the aristocrat Levin's experience of sharing manual labour with his workers. For Tolstoy, aristocratic socialist that he was, the portrait of Levin was largely a self-portrait. But he understood that conviviality was something that related work and technology.

Conviviality with modern technology is hard to conceive of. We may all be on Facebook, or edit the odd shared spreadsheet, but each of us is pursuing personal goals and personal ambition: how can I look good by doing this? Will this get me promoted? Can I publish about this activity and become famous? This is particularly the tragedy of modern universities - even for academics who preach socialist principles, the flip side is often an ego which seeks to publish its excoriations of the system as a big-hitting journal paper.

Conviviality entails the loss of ego as far as possible; it entails service to the group over the climb to the top. But most importantly, conviviality entails collective technological work. This is where things get very complicated for conviviality in the 21st century.

Our computer technology allows each of us to transform our environment in remarkable ways. For Illich, whose study of conviviality is one of the most penetrating, the antithesis of conviviality was represented by the mechanical digger: unlike the shovel which required the collective effort of many to dig a hole, the JCB could do the work of 100 people in one go. Computers are not shovels; they are JCBs. 

Heidegger had an interesting way of talking about the fruits of human technological labour: it served, he argued, to reorder nature into what he called the "standing reserve" - the world as revealed to us as "instrumental". Through such reordering, a field could become a coalmine, or a river bank could become a point of crossing with a bridge. With computer technology, the standing reserve that is manipulated is often data: a disordered collection of data becomes categorised and compartmentalised; a stream of categorised data can coordinate new human action for a common purpose. In this way, technological work can perhaps be convivial - except that it rarely feels like it.

Heidegger's metaphor can be useful if we are to reconceive conviviality in the digital age. The problem is that computer tools are generally not very good tools for doing the jobs they are intended to do. This is partly because the design of tools is generally shaped by the work-as-imagined by the software developers, not the work-as-done by the actual users. In trying to engage in the work-as-done, users have to find ways of working around the work-as-imagined in order to solve their problems. Typically this involves many mouse clicks, the frustration that the interface doesn't make the "thing that you want it to do" easy, or that the system simply doesn't work as expected. These experiences apply particularly to educational technology. In my institution, where we've recently introduced Canvas and Teams, it is noticeable how much time is being spent by academics dealing with an interface which, whilst it is much better than Blackboard, still demonstrates the difference between work-as-imagined and work-as-done.

This requires both what David Graeber calls "imaginative labour", and an awful lot of tedious clicking - so the high-level intellectual work of academics is replaced with low-level interface work.

A possible homologue of Heidegger's river bank or empty field is the software designer's interface constraints. Although such constraints are the product of technology, and in some sense already "standing-reserve", they also present humans with natural challenges in the form of stress, boredom, and so on. One problem is that everybody has to build their own bridge through the interface - the scope for collectively building a bridge that everyone can use is constrained by barriers in the software, or by institutional barriers which stop people getting access to aspects of the software which might be changed for the better of all. A second problem is that even if it were possible to build a bridge across the software's "river bank" for all, there would still be a gap between "work-as-imagined" and "work-as-done" for new users of the system.

Conviviality requires co-design. Enid Mumford's work on co-designing systems with nurses in the 1980s and 90s has become very important to me recently: she certainly deserves much greater recognition (she was also an alumna of Liverpool University). For Mumford in the 80s, coding was sufficiently common to all forms of interface that she could engage nurses at quite a technical level in the functionality of the system, so that they could shape it to reflect work-as-done.

Today, the equivalent level of commonality is the Application Programming Interface. Good computer systems today are built of services, and good systems with good APIs make all the underlying services available to anyone who wants to access the data providing they have sufficient privileges. 
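To make this concrete: many REST APIs (Canvas's among them) return data a page at a time and advertise the next page in the HTTP Link header (RFC 8288). A small helper like the sketch below lets a script walk every page. The URLs in the example are made up, and real Link headers can be messier than this simple parser assumes:

```python
import re

def next_page_url(link_header):
    """Return the URL tagged rel="next" in an HTTP Link header, or None.

    A simplified parser: it assumes URLs contain no commas, which holds
    for typical paginated API responses but not for every legal header.
    """
    for part in link_header.split(","):
        m = re.search(r'<([^>]+)>;\s*rel="next"', part)
        if m:
            return m.group(1)
    return None

# A made-up example of the kind of header a paginated API returns:
header = ('<https://example.edu/api/v1/courses?page=2>; rel="next", '
          '<https://example.edu/api/v1/courses?page=10>; rel="last"')
print(next_page_url(header))  # -> https://example.edu/api/v1/courses?page=2
```

In practice a script would call this in a loop, fetching each page (with a suitable authorisation token) until the function returns None - which is exactly the kind of small, shareable bridge-building I have in mind.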

I've spent some time recently working with professional service staff and academics, using APIs to find effective and efficient solutions to the problems of the interfaces. While the solutions we have developed together can be redistributed to other staff, it is much better if the means for creating those solutions is shared, so that everybody understands how they can build bridges together. This is not to say that everybody should learn to program; it is to say that engaging with a deeper technical understanding of systems opens doors to deeper institutional discussions about the problems that academics and support staff face.

Conviviality occurs when we recognise the problems that each of us face, and relate them to our own problems, whilst recognising that working together can help us all. Coding creates a universal framework for articulating and sharing deep problems - it is not something that is done TO people: that merely creates a new set of constraints of "work as imagined". The magic only works if it is done WITH people. 

Friday, 8 January 2021

Recursion and Function

I attended an interesting talk before Christmas on music and structure at Liverpool's PhD music discussion group (which is great fun!). The talk was really about the "function" of musical chords, and how this function related to the structure of the music. What was being argued was that there was a kind of recursive relation between the deep structure and the function on the surface. This isn't a new idea, of course - Heinrich Schenker was the first to talk about this over 100 years ago - but it was interesting because it muddied the distinction between structure and function: function was something that is "done" by some component (a chord) which in aggregate articulates a structure. Or does it? Is it structure which determines the function of its surface components? The problem is that when "structure" is discussed, it is always composed of some fundamental unit (in this case, a "function"), yet one could not talk objectively about the structure if one considered a different function - a different way of carving up the structure.

Our understanding of "function" is rather confused more generally - not just in music. It does appear to be the case that music has some kind of recursive structure - units of homologous processes repeat at different orders of scale. It is a fractal in this sense. But fractals in nature are not made from simply repeating some kind of pattern; they emerge through interference of multiple variables at different orders of scale. It is the difference between an L-system fractal (see L-system - Wikipedia), which can produce something that looks like a tree, and a real tree. Photographic holograms provide the real clue: these are images of 3D objects encoded into a 2D pattern by virtue of the interference between the beams of laser light which are used to encode the image. The fractal pattern that emerges is actually an encoding of space and time, since space and time are the fundamental values in the frequency of light.
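The L-system point can be made concrete in a few lines of code. This is a minimal sketch (the rules are Lindenmayer's original "algae" system from the Wikipedia article, nothing to do with music): the self-similar pattern comes from blind, context-free substitution of a single rule, which is precisely what distinguishes it from a fractal that emerges through the interference of multiple variables.

```python
def lsystem(axiom, rules, generations):
    """Rewrite every symbol of the axiom according to the rules,
    once per generation, and return the resulting string."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Lindenmayer's "algae" system: A -> AB, B -> A
print(lsystem("A", {"A": "AB", "B": "A"}, 4))  # -> ABAABABA
```

The string lengths grow as Fibonacci numbers (1, 2, 3, 5, 8, ...): pattern, certainly, but pattern without interference.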

We might look at a hologram produced in this way and say that it is comprised of a number of features which "repeat". We might even argue that there are particular functions which repeat. And yet these are surface features which result from a deeper process of interference which we cannot fully comprehend. Ascribing "function" to some element is our way of dealing with this uncertainty. But it can lead us astray.

What is the process of interference that leads to the fractal in which we identify "functional units"?

Ultimately, I cannot see how it is other than physical and physiological. In other words, it involves (quantum) mechanics and cells. Music is clearly within this realm: vibrations in the air and physiological responses. What is remarkable, given the variety of physiological possibilities, is that there can be any agreement about function at all. This suggests to me that the social function of coordination and agreement serves in some way to establish coherence and pattern in the fractal between social groups - and that indeed, this may be fundamental to inter-human (or indeed, inter-organism, or inter-cell) coordination.  

We are given to believing that there are fundamental surface units of functionality which produce coordination. Yet we are continually reminded that this is not the case. From this, we tend to conclude one of the following:

  1. There is nothing fundamental in nature that unites us, or that can be harnessed to orchestrate our minds better: it's just "culture";
  2. There remains to be found some structural method which with sufficient force, vigour or control can mobilise collective action;
  3. It is the job of education to programme the young to reinforce the functions of existing ways of thinking, and to teach them that this is as good as it gets.

Point 1 will lead to destruction of our environment (as it is doing!). Point 2 leads to totalitarianism. Point 3 leads to the enslavement of education to those in power rather than an authentic inquiry into nature. 


Sunday, 3 January 2021

Covid-19, Science and Education

One of the best things I watched on terrestrial TV this new year was the Parliament channel's recording of a special meeting to discuss the scientific evidence on the new Covid-19 variant. It's not often we see scientists in an almost "native" mode, expressing the uncertainty around knowledge about a virus, and explaining this uncertainty as best they can to politicians whose job it is to formulate policy. You can watch it here: Parliamentlive.tv - Science and Technology Committee

The scientific rigour is impressive (and some of the questioning by the politicians is also very acute - did we inadvertently cause this mutation? will the vaccine work against it? - there are no definitive answers to any of this). But it also struck me that this is a group of people who are totally focused on their scientific niche, while the virus presents a systemic problem and there is no coherent way to connect the rigour of virology to the broader social questions - particularly those about children, viral transmission and schooling.

About schooling, Dawn Butler asked a pertinent question about whether schools should be closed. This was one of the more disappointing moments from the scientists. They fudged around the issue, saying things like "obviously not going to school damages children... so it's a balance".

Obviously? What do they mean by that exactly? Where is the rigour behind that assumption, comparable to the rigour they show in their dealings with the virus? Calum Semple from the SAGE committee was even saying on the radio the other day that the approach to education was a "whole system approach". Really? I don't know what his understanding of "whole system" is, but it doesn't look like something that systems theorists would recognise as "whole system". "Whole system approach" has become a kind of sop to fend off awkward questions: it is interesting how the establishment appropriates systems thinking, and then does the opposite.

We can't really have a whole system approach until we have some grasp of the relationship between natural systems and cultural systems. Our problem with education is that we conceive of it as being entirely cultural, and that our culturally-defined parameters and metrics which determine the success of education are divorced from natural biological and psychological factors. Covid is screaming at us that these things are not separable, and that we need better scientific theory in education and learning.

In the time of Piaget, there was little doubt that learning was connected to nature. The challenge was to find the best way to connect the natural processes with the cultural, institutional organisation of society's education system. Now hardly anyone talks about Piaget in any depth. Constructivism (if anything pedagogical is discussed at all) takes its place, and rather like "whole systems", the word has become divorced from its scientific roots, and splashed around as a badge of honour to mask pedagogical and institutional approaches to learning which are the very opposite of what Piaget was talking about. Constructive alignment anyone?

Today the government let it be known that "online learning was a last resort" for education. A last resort from what exactly? From the cultural system that we call education, which wants to maintain its convenient divorce from nature and science. As the world moves into a new era where a combination of working online alongside mass unemployment is going to bring intense stresses on existing social structures, are we really saying that face-to-face mass education which is little changed since 1900 is the way forwards? How does that prepare the kids for the world as it's becoming? The answer is, it doesn't: it prepares the kids for their parents' and teachers' world that we are leaving behind.

This is not to say that the online education we have today is good. It's obviously mostly terrible. But it's terrible because the computer is being used to reproduce the function of the old system. But a proper "whole system approach" would change the system. 

I don't think it's that difficult to imagine how things could be much better.  The effect of technology upholding ancient institutional practices and structures has been an increasing transactionalisation, increasingly mundane technological work for learners and teachers, and a fundamental loss of meaning in the activities of learners and teachers, from primary school onwards. Education risks meaninglessness and irrelevance. 

How do we put the meaning back into our teaching and learning activities? They must be reconceived around the manifest and fundamental uncertainties that face us all. Rather than drilling knowledge that everyone already knows into young minds, education could be a process of renewal where the questions that nobody knows the answer to are addressed by young and old together. And yes, you can teach maths like that! That was pretty much what was going on in the Science and Technology Committee.

Monday, 21 December 2020

The Technological Meaningfulness of Academic Work in the 21st Century

I was speaking to an academic colleague the other day who was trying to work out why her quizzes weren't displaying properly in the VLE. She'd spent many hours entering them and editing them, only to become frustrated that they didn't appear to display as she wished. We worked on the problem together and sorted it out (this is very much my preferred course of action in these circumstances), but I asked her how she felt as she was entering all this data. Of course, it's incredibly boring and tedious. I remarked that one of the effects of technology in education is the amount of low-level repetitive work it creates for people who feel they are meant to engage in high level intellectual work.   

A similar situation arose with a member of professional service staff who was trying to get some specific data from the system that the system would not provide through the interface. Basically, we wrote a little program together (our shared Python coding environments are brilliant for this kind of thing), and again solved the problem together. Here the repetitive low-level work was the alternative: without a bit of computational thinking and practice, a different system would have had to be used, demanding far more tedious work. Finally, I have been working with our medical school, the vets and the dentists in trying to synergise their educational approaches (they basically share a similar technical and pedagogical approach). Doing these things with people (not doing them to people) has become important to me, as has the process of working together to create synergies in activity. It's about bringing wholeness to the relationship between technical and intellectual work.
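For illustration only - the column names here are invented, not our system's actual export format - the little programs in question are usually no more than this: a few lines that summarise a raw export in a way the interface itself won't.

```python
import csv
from collections import defaultdict
from io import StringIO

def average_scores(csv_text):
    """Return each student's mean score from a raw CSV export."""
    totals, counts = defaultdict(float), defaultdict(int)
    for row in csv.DictReader(StringIO(csv_text)):
        totals[row["student"]] += float(row["score"])
        counts[row["student"]] += 1
    return {s: totals[s] / counts[s] for s in totals}

# A made-up fragment of the kind of export a VLE produces:
export = "student,score\nA. Smith,8\nA. Smith,6\nB. Jones,10\n"
print(average_scores(export))  # -> {'A. Smith': 7.0, 'B. Jones': 10.0}
```

The point is not the code itself, which is trivial, but that writing it together replaces hours of clicking and makes the means of solving the problem shareable.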

More broadly these situations have led me to reflect on the nature of academic work, and the relationship between high level and low level work. The distinctions are difficult: artistic work, for example, often features much repetition - indeed, it is characterised by this. But where that feels fulfilling and in a certain sense "high level" because it leads to self-expression, the tedious button-clicking of an interface definitely feels menial and low-level. So what's the difference?

I think looking at individual tasks in isolation is not helpful. Bridget Riley adding rows to one of her geometric paintings, or Lowry painting matchstick figures, is not a task that is disconnected from the overall artistic aims and ambition: it is entirely consistent with it.  The work has to be taken as a whole, and taken as a whole, the artist's "menial" tasks are consistent and coherent with the whole. So what does modern academic work  look like as a whole? 

I think this is at the root of what's happened to academia since the advent of technology (and possibly a bit before). The ideal of the academic institution is not individual academic labourers pushing out publications to raise their H-index, bidding for grants, etc, but the community of scholars - teachers and students - in the college working together for the wellbeing of each other and the pursuit of truth. When everyone works as one, the mechanism which selects what must be done is understood by all, and each responds in the knowledge that whatever task has to be performed - whether repetitive or deep and intellectual - is done because it is required for the common good. 

The point about the college is that it is a coherent whole. So what did technology do? It carved up the function of each individual and instead of it being a convivial activity, it became specialised and professionalised, given to a particular individual who could basically perform this function on behalf of everyone else. It's precisely what Illich says in "Tools for Conviviality" about the difference between the conviviality of the shovel, and the lack of conviviality in the JCB.

What is a convivial approach to technology in universities? It is certainly about doing things together, not doing things to people, or even for people. What's been so interesting with the current crop of technologies in institutions is that the skills for accessing and manipulating their data have become common: very often these are simple programming skills, and these skills, even if they are not known by everyone, can be communicated and their experience shared across a community. 

The technical essence of the Personal Learning Environment was a common toolset with universal skills which could be shared. What we didn't talk about so much was the fact that, because these skills were common and universal, effective convivial activity could be organised. (I do remember Oleg saying that the PLE was a way forwards to Illich's ideas of conviviality, but I couldn't see it clearly at the time.)

Today academic work is disaggregated, individualised, compartmentalised and specific functions are separated off with no connection to a common purpose. Moreover, the disaggregation is reinforced by tools which position themselves in "market segments" when in fact the whole thing is manipulating the same data. The one blessing that we have is that access to the core data underneath each of these specific functions has become easier through the APIs. Increasingly, I think we will see a single "back-end" for these systems - probably (so it seems at the moment) coordinated by Microsoft. 

But a deeper issue lies in the fact that our disciplines themselves have become separated from the technological and institutional context within which they organise themselves. Disciplines tend not to think of themselves technologically or institutionally - and yet almost all disciplines these days see at their frontiers issues of technology, uncertainty, institutional organisation and data. In fact, for many disciplines, if they were to examine the institutional and technological context of their own educational technology, they would find real-life and tangible examples of the very things they concern themselves with in an intellectual way. For example, medicine is increasingly going to become dominated by the kind of AI tools which sit behind many of the interfaces they use to organise and discuss their content. The same goes for Law (think Turnitin), Maths (learning analytics, convolutional neural networks), Psychology (AI), biology (bio-sensors, imaging), and so on.

Change requires a new meta-language of subject orientation with technology. Institutionally, maybe this can be organised through inter-disciplinary collaboration and engagement, where deep questions about disciplinary knowledge and technical engagement can be asked. All disciplines demand that certain competency criteria are met. But there are always many ways to do this. And doing it together, where the disruptions of technology can be felt in the very fabric of disciplines themselves, may provide a powerful way back to the kind of wholeness and integrity of academic life that the college once had, but within a new digital context.  The way forwards is synergy.

Friday, 18 December 2020

Bio-drama and new ways of teaching

For a few years now I've been exploring with John Torday how the many profound aporia we live with (what we seem to accept as "wicked problems" - climate change, inequality, homelessness, educational problems, health, geopolitics, etc) result from some gap in our understanding of how human consciousness came to be: to put it simply, it is Bateson's "gap between the way people think and the way nature works". In excluding the possibility of a deeper and more coherent narrative, we have grown to believe that our profound problems cannot but exist. But I am now asking a question that was once asked by Jiddu Krishnamurti to David Bohm - is it possible for humans to have no "problems" at all?

The root of most of our human problems lies in the way that our egos separate themselves from their natural origins - their origins in cellular, biological, evolutionary processes. From mechanisms of cellular self-organisation and cooperation (without which there would be no advanced biological forms at all), we establish an idea of self which is more often competitive, defensive or combative. Our social structures are designed on the model established by the nature-divorced ego, and encourage this divorce from nature. It is only when natural disasters occur - not just things like Covid, but earthquakes, floods, etc - that we are reminded of our origins in nature, and cooperation comes more to the fore (we are often surprised when it does, and praise the "heroics" of individuals doing what they were biologically programmed to do). We intellectually know that the split between ego and environment will kill us - but we seem powerless to intervene.

Artists have always understood this split. Shakespearean tragedies show how the ego is torn apart by exposing the fundamental rift between culture and nature, but how Shakespeare does this is the thing. The play is a form in time, conceived in the mind of the playwright to unfold its moments of tension and tragedy in the lived experience of all the audience and players. This time of unfolding has a structure which is tied up with the structure of the drama, and this is the playwright's art - conceiving of the unity of the diachronic and synchronic aspects of drama as a whole.

This uniting of the diachronic structure of life and its synchronic structure is something that can also be seen in cellular evolution. The principal mechanism by which it occurs is endosymbiosis - the absorption by the cell of aspects of its environment. As the process of endosymbiosis unfolds, the structure of the cell comes to reveal its history - rather like the rings in the trunks of trees, or the way ice cores record a history. Diachronic and synchronic are united.

If this is the stuff that we are made of - if each of us, even at birth, is all "history stuff" - then the power of Shakespeare makes sense. In its unfolding temporal structure it recapitulates a much deeper temporal unfolding which unites each of us to each other through our cells. It's not that King Lear's tragedy awakens specific agonies in us with regard to our encounters with politics or power; it is that in the lived experience of seeing this unfolding dramatic structure, we see "through" each other - in Alfred Schutz's words, we "tune-in" to the inner life of each other. The power of the play is the power of deep connection and mutual recognition - all the more potent for it being so old and yet so fresh.

What we lack in our educational structures is the ability to make a similar kind of connection between teachers and learners. Such a connection can be made, I think - but like a Shakespeare play, it must be constructed so that this can happen. If there is any utility in the concept of "learning design", then it is this intended purpose - that something is designed or constructed such that learners and teachers can perceive their shared biology and connection.

How can you teach maths like that? Or biology or physics? Or medicine or brain surgery? Perhaps a better question to ask is "What stops us teaching maths, etc, like that?". And the answer to that is the nature-divorced ego implicit in our social structures of education. And yet division and multiplication, cell boundaries, or the symmetries of quantum mechanics are all tied up in the processes which unify the diachronic and synchronic. We just lack a clear way of showing how - we lack a narrative and structure which reveals it. I think it's right, for example - as Louis Kauffman has argued (and more recently Steve Watson) - that the symbolic notation that we use as the basis of teaching maths takes us in the wrong direction if we wish to identify wholeness. There may well be better "iconic" ways of approaching mathematical thought - as Lewis Carroll or Charles Sanders Peirce knew.

I'm increasingly interested in exploring what novel iconic and dramatic approaches to learning might reveal, and how they might be organised with technology. For example, knots are a powerful metaphor which extends from mathematics to psychology and organisation theory. And there's wonderful software to explore them (see The KnotPlot Site).

It's something which I think does unite some of the better critical pedagogic thinking (Freire, Boal, Shor, etc) with deeper understanding of the relationship between mind and nature. 

Thursday, 10 December 2020

Networks and Biology: Wiring ourselves into a bad theory

The one thing that can be said about networks is that they are easy to draw. Anyone who's done "join the dots", or who has looked at a map, or studied physiology or neuroanatomy understands networks in their essence: a set of points joined together with lines. The join-the-dots pattern permeates the natural world like a kind of fractal motif. But what we see and what things actually are, are not the same. How would we know if networks actually exist?  

In order to know whether a network is real, we would have to be able to establish some kind of correlation between our observations of the network's structure (which is "the network"), its behaviour, and any changes we might make to that structure. Obviously, if the network is human-made, then the relationship between an electronic network's structure, how it behaves, and predictable outcomes in the light of changes to it would seem to be straightforward. But in complex artificial networks, such as those defined by machine learning models, predictability in the light of network change is elusive. We are strangely unbothered by this, because we see the same type of unpredictability in natural networks.

We may see in the brain an array of enormously complicated "networks", but beyond some very crude interventions which zap entire sections of "the network", there is little predictability in the effects of these interventions. So when we see little predictability in the AI webs our consciousness has made, we are inclined to imagine ourselves in the image of God, and satisfy ourselves that if fuzziness is good enough for our understanding of nature, it is good enough for our understanding of artificial intelligence. 

But this fuzziness should ring scientific alarm bells. Networks do not just spring from nothing. They emerge in nature from biological processes. To put it more directly, networks emerge from the dynamics of cells. Neurons are cells. Nerves are made from cells. Tree roots, fungal fibres and bacterial colonies are made from cells. The cell is the thing. The network is an epiphenomenon arising from the cell's behaviour.

This point is important when we think about our technology. If we designed our technology from the metaphor of the cell, rather than the metaphor of the network, we would have very different technology. And I am increasingly convinced that if we understood our existing networks with their mystical properties (like machine learning) from the perspective of cells, then their behaviour would be much less mysterious to us.

The main thing a cell must do is maintain a boundary between itself and its environment. It must maintain its internal environment and maintain balance with an ambiguous external environment, and it requires energy to perform these functions. It is through performing these functions that the cell establishes relations with other cells, from which the physical characteristics of a "network" might be seen to emerge. 

This mechanism drives the cell through processes of self-organisation with its environment: networks emerge as the cell seeks stability in its organisation relative to that environment. This stability can be achieved by absorbing features of the environment, so that the cell adapts and organises itself into increasingly complex life forms - a process Lynn Margulis called "endosymbiosis". Increasingly complex life forms in turn provide the cell with increased adaptability in the face of environmental challenge. These processes of endogenisation and adaptation are the basis for the epigenetic mechanisms which are exciting increasing interest in current empirical biology.

But endogenisation and adaptation mean that history and time are embedded in the structure of the cell, and in the networks it forms. Biological networks - like neural networks - are more like scar tissue, or the scree lines formed by geological events, than they are simple nodes and arcs. At each stage of organisation, the cell must maintain homeostasis and balance with its environment; at each stage it tends towards the conditions of its initial formation - conditions which are historically embedded in its own structure.

This is the "network" science we need. It is not a science of networks at all, but of dynamic processes of maintaining boundaries at all levels of organisation, from the brain and the liver through to consciousness, communication, technology and education. Behind the rigid visualisations of network dynamics on Facebook lie the scree lines and scar tissue of individual biographies and of biological history. 

Looked at this way, the way we think about our networks of human communication is a grotesque distortion of nature produced by a bad theory. Instead of cooperating and organising themselves, the bruised egos of individual nodes compete against one another, each node seeking to be the loudest or the best, while "clusters" of damaged souls reinforce pathological and explosive boundaries in politics. 

The basic point is that the homologue of the cell's boundary wall is not the person's skin; it is a dialogue's boundary. At a human level, we organise ourselves through communication - that is where our boundaries are formed. However, when locked into the network technologies of social media, the boundary walls are reinforced against the environment - there is little endogenisation, and hence little growth and development. It is the homologue of the cancer cell. 

There is an urgent question in technical design: whether it is possible to create a dialogical technology which can reproduce the organisational processes of the cell, including its endogenisation of the environment and its maintenance of self-organisation against an ambiguous environment. To do this requires a much less mystical view of nature and of things like machine learning. Such a view can be found if we jettison our obsession with the network and instead think about the commonalities between how we maintain our communicative boundaries and how a cell does it. 


Tuesday, 24 November 2020

An inevitable paradox of data protection?

One of the principal benefits of computer technology is that it records speech acts (in emails, tweets, messages in Teams, etc.). This serves an important function in human relationships: it enables each of us to track the commitments we make to each other, helping us to communicate and to anticipate potential breakdowns in organisation or communication. For individuals maintaining communication with each other, the record of speech acts is useful; but the fact that this data can be retrieved by others for whom it was not intended can distort power relations. 
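The idea of tracking commitments through recorded speech acts can be sketched as a minimal data structure. The names and fields below (`SpeechAct`, `CommitmentLog`) are invented for illustration, not any particular system's API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SpeechAct:
    """A recorded utterance carrying a commitment between two people."""
    speaker: str
    addressee: str
    commitment: str
    made_at: datetime
    fulfilled: bool = False

class CommitmentLog:
    """Minimal version of the record that email or Teams keeps implicitly:
    who promised what to whom, and what is still outstanding."""

    def __init__(self):
        self.acts = []

    def record(self, act):
        self.acts.append(act)

    def outstanding(self, speaker):
        # Commitments made but not yet discharged: the points at which
        # a breakdown in coordination might be anticipated.
        return [a for a in self.acts if a.speaker == speaker and not a.fulfilled]

log = CommitmentLog()
log.record(SpeechAct("ana", "ben", "send draft", datetime(2020, 11, 1)))
log.record(SpeechAct("ana", "ben", "review data", datetime(2020, 11, 3)))
log.acts[0].fulfilled = True
print([a.commitment for a in log.outstanding("ana")])
```

The power-distortion problem arises precisely because nothing in such a structure restricts `outstanding` (or any other query) to the two parties the speech acts were between.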

Anxiety that such threats by powerful information-holding elites to individuals might result in legal proceedings against those primarily responsible for protecting data and upholding GDPR legislation (managers in institutions) can lead those managers to seek a place of safety from litigation. However, this "place of safety" can conflict directly with the principal rationale and benefit of computer systems in the first place - that they allow speech acts and commitments to be recorded and managed. 

A friend told me that a German university has recently done precisely this: confronting the problems posed by GDPR, it first adopted a Microsoft solution (GDPR has been an open goal for Microsoft). Deeper reflection on the implications of an increased capacity for storing speech acts and monitoring commitments subsequently led the same management to determine that all text and video communications produced through teaching and learning should be deleted after two weeks. In effect, having created a technological environment within which learners and teachers can grow to understand each other through producing data, management then assaults learners and teachers with measures that sabotage the technology. 
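Mechanically, such a policy amounts to little more than a purge like the following sketch - the two-week window is taken from the anecdote above, and the record format is assumed for illustration:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(weeks=2)  # the deletion window described above

def purge(records, now):
    """Drop every recorded communication older than the retention window.
    'records' are (timestamp, text) pairs: a stand-in for the chat, video
    and email produced through teaching and learning."""
    kept = [(t, text) for t, text in records if now - t <= RETENTION]
    forgotten = len(records) - len(kept)
    return kept, forgotten

now = datetime(2020, 11, 24)
records = [
    (datetime(2020, 11, 23), "seminar chat"),
    (datetime(2020, 11, 2), "feedback on essay draft"),  # older than two weeks
]
kept, forgotten = purge(records, now)
print(len(kept), forgotten)
```

A few lines of code, but they institutionalise exactly the amnesia the rest of the system was built to prevent: the feedback a learner might return to a month later is gone.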

Is this an inevitable paradox produced by the way we have organised ourselves with technology?

The dimensions of the paradox are:

  1. Data is something "given" in the world - like an object or a created artefact; 
  2. Data counters amnesia. Records of conversation mean things to individuals, particularly when stored over a long period of time (although the locus of meaning is not "in" the data, but in relationships);
  3. Data can be rearranged, recombined, reorganised to produce other kinds of object. Consciousness works dynamically with processes of manipulation which result in new meaning arising. When data is analysed, something "given" is turned into something else in the world; 
  4. Databases are a technology for centralising access to data. They have emerged through self-organising, free market dynamics which were intended to distribute information, but instead they have produced concentrations of information and power;
  5. To protect individuals from this concentration of power, new legislation bears upon organisations to ensure that personal data is carefully controlled (GDPR);
  6. To uphold the commitment to GDPR, institutions are forced to massify their technology (enter Microsoft);
  7. Massifying the technology introduces new concerns about concentration of power, leading managers worried about litigation to drastically restrict the capacity of the technology to store personal data;
  8. Restrictions on the capability of technology to store data directly impact individuals' ability to coordinate actions with each other - they introduce amnesia.

While this "amnesia paradox" appears to be the result of pathology in institutional management (which it may be), more deeply it is probably the result of a technical architecture which tends towards centralisation, much as the HTTP protocol has tended towards centralisation and pathology.