Tuesday 24 November 2020

An inevitable paradox of data protection?

One of the principal benefits of computer technology is that it records speech acts (in emails, tweets, messages in Teams, etc.). This serves an important function in human relationships: it enables each of us to track the commitments we make to each other, helping us to communicate and to anticipate potential breakdowns in organisation or coordination. For individuals maintaining communication with each other, the record of speech acts is useful, but the fact that this data can be retrieved by others for whom it was not intended can distort power relations.

Anxiety that powerful information-holding elites might threaten individuals in this way can manifest in legal proceedings aimed at those primarily responsible for protecting data and upholding GDPR legislation (managers in institutions). This anxiety can lead those managers to seek a place of safety to protect themselves from litigation. However, this "place of safety" can conflict directly with the principal rationale and benefit of computer systems in the first place: that they allow speech acts and commitments to be recorded and managed.

A friend told me that a German university has recently done precisely this. In confronting the problems posed by GDPR, it first adopted a Microsoft solution (GDPR has been an open goal for Microsoft). Deeper reflection on the implications of this increased capacity for storing speech acts and monitoring commitments subsequently led the same management to determine that all text and video communications produced through teaching and learning processes should be deleted after two weeks. In effect, having created a technological environment within which learners and teachers can grow to understand each other through producing data, management then assaults learners and teachers with measures that sabotage the technology.

Is this an inevitable paradox produced by the way we have organised ourselves with technology?

The dimensions of the paradox are:

  1. Data is something "given" in the world - like an object or a created artefact; 
  2. Data counters amnesia. Records of conversation mean things to individuals, particularly when stored over a long period of time (although the locus of meaning is not "in" the data, but in relationships);
  3. Data can be rearranged, recombined, reorganised to produce other kinds of object. Consciousness works dynamically with processes of manipulation which result in new meaning arising. When data is analysed, something "given" is turned into something else in the world; 
  4. Databases are a technology for centralising access to data. They emerged through self-organising, free-market dynamics intended to distribute information, but have instead produced concentrations of information and power;
  5. To protect individuals from this concentration of power, new legislation (GDPR) bears upon organisations to ensure that personal data is carefully controlled;
  6. To uphold the commitment to GDPR, institutions are forced to massify their technology (enter Microsoft);
  7. Massifying the technology introduces new concerns about concentration of power, leading managers worried about litigation to drastically restrict the capacity of the technology to store personal data;
  8. Restrictions on the capability of technology to store data directly impact the ability of individuals to coordinate actions with each other: they introduce amnesia.

While this "amnesia paradox" appears to be the result of pathology in institutional management (and it may be), it is more deeply the probable result of a technical architecture that tends towards centralisation, much as the HTTP protocol has tended towards centralisation and pathology.
