Friday 11 January 2013

Datadrama

Nigel Howard's "drama theory" is highly relevant to data analytics. Any piece of data is the result of deliberation and decision. Behind each item of data there is (at least) a psychological drama, if not a real drama of events. Behind each item of data stand the characters who form the context within which an act is taken. Each decision is taken in the light of a set of expectations that certain events will follow; if it is a good decision, the events will be in accord with those expectations.

In Howard's methodology, things begin with a kind of 'agon' in which each character articulates their position relative to the others (see http://en.wikipedia.org/wiki/Drama_theory): what they want, and what their fall-back position is. When everyone is aware of everyone else, the characters work through their positions. This usually results in 'dilemmas' (in his earlier theory of meta-rationality, Howard called these 'paradoxes'), since the positions taken aren't tenable, threats aren't believable, and so on. This causes emotional and psychological change, with characters developing their positions as the drama unfolds. As the Wikipedia article describes:

a character with an incredible threat makes it credible by becoming angry and finding reasons why it should prefer to carry out the threat; likewise, a character with an incredible promise feels positive emotion toward the other as it looks for reasons why it should prefer to carry out its promise. Emotional tension leads to the climax, where characters re-define the moment of truth by finding rationalizations for changing positions, stated intentions, preferences, options or the set of characters.
There are specific 'dilemmas' or paradoxes, which Howard articulates (see the Wikipedia article). His argument is that resolution in the drama cannot be reached until all the paradoxes or dilemmas are dissolved.
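
By way of illustration only - this is nothing like Howard's formal notation, and all the characters, outcomes and preference orderings are invented - one might sketch a frame of characters with positions and fallbacks, and flag one simple kind of dilemma: a threat the threatener would rather not carry out.

    from dataclasses import dataclass, field

    @dataclass
    class Character:
        name: str
        position: str      # the outcome this character advocates
        fallback: str      # the outcome threatened if no agreement is reached
        preferences: list = field(default_factory=list)  # outcomes, best first
                                                         # (must rank every outcome used)

    def incredible_threats(characters):
        """Flag characters who prefer another's position to their own fallback:
        such a threat is incredible - one of the dilemmas to be dissolved."""
        found = []
        for c in characters:
            for other in characters:
                if other is c:
                    continue
                prefs = c.preferences
                if prefs.index(other.position) < prefs.index(c.fallback):
                    found.append((c.name, other.name))
        return found

    union = Character("Union", position="pay rise", fallback="strike",
                      preferences=["pay rise", "strike", "status quo"])
    firm = Character("Firm", position="status quo", fallback="lockout",
                     preferences=["status quo", "pay rise", "lockout"])

    # The Firm prefers the Union's position to its own threatened lockout,
    # so its threat is incredible: prints [('Firm', 'Union')]
    print(incredible_threats([union, firm]))

Dissolving that dilemma means either the Firm finding reasons to genuinely prefer the lockout, or changing its stated fallback - exactly the emotional work the quotation above describes.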

It is important to note that the dilemmas arise through the particular strategic and meta-strategic priorities of the actors. What informs the strategy-making (and meta-strategy-making) are the pieces of information or data which the actors either willingly release or unwittingly reveal. The willing release of data will be a strategic act whose consequences will have been considered. The unwitting release of data will be a by-product of some other kind of act.

A strategic act will depend on anticipating the reactions of others. It will depend on guessing the strategies and meta-strategies (the anticipations) of others.

A strategic release of information (the kind of data that gets analysed in data analytics) will be based on a calculation of how others will respond. However, it may be so calculated as to mislead others into thinking that the intentions behind the act (and so the strategy) are something other than they really are. An information leak can easily hide true intentions.
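
A minimal sketch of that calculation, with entirely invented acts and payoff numbers: the actor chooses what to release by first predicting the audience's preferred response to each possible act.

    # payoffs[(my_act, their_response)] = (my_payoff, their_payoff);
    # every act, response and number here is invented for illustration
    payoffs = {
        ("reveal", "cooperate"): (3, 3),
        ("reveal", "exploit"): (0, 4),
        ("conceal", "cooperate"): (2, 1),
        ("conceal", "exploit"): (1, 0),
    }

    def anticipated_response(my_act):
        """The response the audience is expected to prefer, given my act."""
        responses = [r for (a, r) in payoffs if a == my_act]
        return max(responses, key=lambda r: payoffs[(my_act, r)][1])

    def strategic_act():
        """Choose the act whose anticipated consequence is best for me."""
        acts = {a for (a, _) in payoffs}
        return max(acts, key=lambda a: payoffs[(a, anticipated_response(a))][0])

    print(strategic_act())  # 'conceal': here, revealing invites exploitation

The misleading release described above then has an obvious mechanism: if the actor can distort the audience's model of these payoffs, the audience's best-response reasoning is computed against the wrong game.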

Misinformation will lead to certain paradoxes. One piece of information, with an assumed strategy behind it, will contradict another piece of information. Those on the receiving end of such information will wonder what the originator can be thinking, and how they can be consistent.

What can happen here is that the apparent meaning of the data can be questioned in the light of what might be in the mind of the originator. "Maybe they said this to put us off the scent... Maybe they don't really care what happens, but want us to believe that they do!" and so on. Those on the receiving end of this stuff will construct a storyline which most closely fits.

What is really going on here is the business of identifying the kinds of constraints which might produce the assorted items of information. Is it madness? Is it self-interest? Is it a genuine mistake? And so on.

Analytically, this produces some interesting possibilities. Imagine we are presented with some data: email messages sent, policy decisions made, reports of discussions, and so on. Can we reconstruct the kind of psycho-drama that produced these events? This is a bit like psychological profiling. What it should reveal is the extent to which the overt meaning of speech acts, email messages, etc. matches the intentional meaning behind those acts and their intended purpose. Where there is a mismatch between the intentionality behind messages and their expressed meaning (the meaning that might initially be constructed by recipients), a refined view towards the interpretation of future messages is necessary, one which takes into account the reasons for the mismatch. The 'refined' view is actually a shift in the modelling of the originator of those messages - a more accurate identification of their absences.
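
A hedged sketch of this "storyline which fits" idea: score candidate intentions against the observed messages and keep the best-fitting one. The messages, the storylines and the crude keyword cues are all invented for the example; real analysis would need far richer machinery than substring matching.

    messages = [
        "We are fully committed to the merger.",
        "Budget for the merger team has been frozen.",
        "Staff are encouraged to pursue external opportunities.",
    ]

    # each candidate storyline lists surface cues it predicts (+1)
    # and cues that contradict it (-1)
    storylines = {
        "genuinely committed": {"committed": 1, "frozen": -1, "external": -1},
        "publicly committed, privately withdrawing":
            {"committed": 1, "frozen": 1, "external": 1},
    }

    def fit(name):
        """Sum the cue weights over every message containing the cue."""
        return sum(w for m in messages
                     for cue, w in storylines[name].items()
                     if cue in m.lower())

    best = max(storylines, key=fit)
    print(best, fit(best))  # the mismatch storyline fits these messages best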

Resolving the paradoxes means finding a "storyline which fits". But if we do arrive at something which fits, is there further analysis that might be done? Can the process be re-applied to an emergent narrative? Doing this might, I think, suggest new items of data to search for. It is rather like exploring a new theory of physics: the theory is an invention, but the invention suggests the existence of new physical phenomena - so we go looking for them.

Our tendency when analysing 'big data' is to take the high-level, global viewpoint. A drama-theory-inspired perspective would instead:

  • start from particular items of data
  • suggest a psycho-drama that might produce the data
  • identify the conflicts between the suggested psycho-drama and the overt meaning of the messages (identify where the anticipations of the utterer of the messages are different from the anticipations of the audience)
  • redefine the psycho-drama
  • with a refined psycho-drama, identify new possible items of data to support the theory
  • go and find them, together with randomly chosen new items of data which might challenge the theory.
So the trawl through the big-data set is a step-by-step exploration driven by suggested contexts for the production of the data.
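
Here is one runnable sketch of that loop. Every analytic step is a deliberately naive stand-in (word overlap, random sampling), and the corpus is invented; the point is the control flow, not the analysis.

    import random

    def suggest_drama(items):
        """Step 2: hypothesise a context - here, just the salient words."""
        return {w for item in items for w in item.split() if len(w) > 4}

    def find_conflicts(drama, items):
        """Step 3: items whose overt content shares nothing with the hypothesis."""
        return [i for i in items if not drama & set(i.split())]

    def refine(drama, conflicts):
        """Step 4: fold the conflicting material back into the hypothesis."""
        return drama | suggest_drama(conflicts)

    def explore(seed, corpus, rounds=3):
        items, drama = list(seed), suggest_drama(seed)
        for _ in range(rounds):
            drama = refine(drama, find_conflicts(drama, items))
            # step 5/6: fetch items the theory predicts...
            predicted = [d for d in corpus
                         if drama & set(d.split()) and d not in items]
            items += predicted[:2]
            # ...plus a randomly chosen item that might challenge it
            pool = [d for d in corpus if d not in items]
            if pool:
                items += random.sample(pool, 1)
        return drama, items

    corpus = [
        "press release affirms commitment",
        "budget frozen pending review",
        "merger committee disbanded quietly",
        "staff briefing cancelled again",
    ]
    print(explore(["press release affirms commitment"], corpus))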

To me, something like this puts the heart back into the data - something which I find sorely missing in most data analysis I encounter!

