In my previous posts, I have argued that different second-order cybernetics approaches vary according to fundamental theoretical orientations, whilst superficially appearing to be allied in their adherence to notions of reflexivity and observation [This is really a paper which I'm gradually chewing my way through!]. The differences between these approaches create conditions in which coordinating discussion among second-order cyberneticians is deeply challenging, and it is not uncommon for proponents of one theoretical perspective to accuse proponents of another of objectivism (which both ostensibly oppose) or universalism. The root of this problem, I have argued, lies partly in a failure to articulate the fundamental differences between approaches, but most importantly in a failure to critically articulate those principles which all second-order theories have in common, amongst which the principle of induction plays a central role. In further dissecting induction with regard to second-order cybernetics, I have argued, following Hume and Keynes, that the issue of analogies - things being determined to be the same, thus providing the conditions for adaptation - is one which can be used to differentiate approaches to second-order cybernetics. Without a grounded account of what counts as the 'same', second-order cybernetic discourse throws open the possibility of misunderstanding and incoherence.
Information Theory presents a quantifiable model which relates perceived events to an idealised inductive process. The application of information-theoretic approaches demands that the issue of analogy is explicitly addressed: the 'sameness' of events must be determined in order for one event to be compared to another and for its probability (and consequently its entropy) to be calculated. Whilst the use of information theory may (and frequently does) slip into objectivism, the opportunity it presents is for a coordinated critical engagement with those deeper issues which underlie second-order cybernetic theory. At the heart of this critical engagement is a debate regarding the distinguishing of events which are 'counted' (for example, letters in a message, the occurrence of subject keywords in a discourse, the physical measurements of respiration in biological organisms). Reflection concerning the identification and agreement of analogies entails a process of participation with the phenomena under investigation, as well as reflection on and analysis of the discourse through which agreement about the analogies established in that phenomenon is produced. At a deeper level, the extent to which any information-theoretical analysis could be interpreted in an objectivist way, the problems inherent in Shannon's information theory, and other aspects of self-criticism invite further reflection on how research results may be presented, interpreted, misinterpreted, and so on. But at the root of it all is the identification of analogies.
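To make the dependence on analogy concrete, here is a minimal sketch (my own illustration, not drawn from any of the authors discussed) showing that Shannon's uncertainty H is only defined once a decision has been made about which events count as 'the same'. Counting individual letters and counting vowel/consonant classes are two different analogy-decisions over the same message, and they yield different entropies:

```python
from collections import Counter
from math import log2

def shannon_entropy(events):
    """Shannon uncertainty H = -sum(p * log2(p)) over the distinguished
    event types. Which events are 'the same' is decided by the caller."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

message = "seeming"
# Analogy-decision 1: every letter is its own event type.
h_letters = shannon_entropy(message)
# Analogy-decision 2: events are only 'vowel' or 'consonant'.
h_classes = shannon_entropy("v" if c in "aeiou" else "c" for c in message)
```

The coarser analogy collapses distinctions and so yields a lower uncertainty: nothing in the formalism itself tells us which counting is the right one.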
Among the empirical applications of information theory which are explicitly aware of the cybernetic and reflexive nature of information theory's application, the statistical ecology of Ulanowicz, the discursive evolutionary economics of Leydesdorff and the synergetics of Haken present three examples whose contribution to a more coherent and stable second-order cybernetic discourse can be established.
Ulanowicz's statistical ecology uses information theory to study the relations between organisms as components of ecologies. The information component in this work concerns measurements of respiration and consumption in ecological systems. Drawing on established work on food chains, and also cognisant of economic models such as Leontief's 'input-output' models, Ulanowicz has established ways in which the health of ecosystems may be characterised through studying the 'mutual information' between the components, and has shown that the measurement of 'average mutual information' is particularly useful in assessing the health of an ecosystem. Calculations produced through these statistical techniques have been compared to the course of actual events, and a good deal of evidence suggests the effectiveness of the calculations.
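The 'average mutual information' of a flow network can be sketched as follows. This is a minimal illustration of the standard AMI formula over a matrix of flows between compartments; the three-compartment matrix is hypothetical, not real ecosystem data from Ulanowicz's studies:

```python
from math import log2

def average_mutual_information(flows):
    """AMI of a flow matrix T, where T[i][j] is the flow (e.g. carbon)
    from compartment i to compartment j:
        AMI = sum_ij (T_ij / T..) * log2(T_ij * T.. / (T_i. * T_.j))
    High AMI means flows are tightly channelled (strongly constrained)."""
    total = sum(sum(row) for row in flows)
    row_sums = [sum(row) for row in flows]          # total outflow per compartment
    col_sums = [sum(col) for col in zip(*flows)]    # total inflow per compartment
    ami = 0.0
    for i, row in enumerate(flows):
        for j, t in enumerate(row):
            if t > 0:
                ami += (t / total) * log2(t * total / (row_sums[i] * col_sums[j]))
    return ami

# Hypothetical three-compartment flow matrix (illustrative numbers only):
flows = [[0, 10, 2],
         [1, 0, 8],
         [6, 1, 0]]
ami = average_mutual_information(flows)
```

A perfectly articulated cycle (each compartment feeding exactly one other) maximises AMI, whilst evenly spread flows drive it to zero - which is why the measure serves as an index of how organised an ecosystem's exchanges are.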
Ulanowicz has, however, also considered the limitations of Shannon's equations. In particular, he has engaged with the criticism that Shannon's measure of uncertainty (H) fails to distinguish (in itself) between the novelty of events (which, by virtue of being low probability, have high uncertainty) and those events which confirm what already exists: in other words, those events which are analogous to existing events. Whilst building on his existing empirical work, Ulanowicz has sought to refine Shannon's equations so as to account for the essentially relational nature of the things that Shannon measures. In this regard, Ulanowicz has distinguished between the mutual information in the system (a measure of the analogies between events) and the 'flexibility' of a system: a measure of the extent to which the system proves itself adaptable to future shocks. In exploring the contrast between mutual information and novelty, Ulanowicz has suggested alternative ways of measuring novelty using Shannon's concept of redundancy.
Echoing earlier arguments by von Foerster, Ulanowicz argues that Shannon's redundancy measure may be more significant than his measure of uncertainty. The statistical measurements involving mutual information have provided grounds for a re-evaluation of the information uncertainty measures, and have stimulated further debate and development among statistical ecologists. At the heart of Ulanowicz's thinking is the connection between constraint and redundancy, and the possibility that a calculus of constraints presents a way of thinking about complex ecological phenomena which overcomes some of the deep problems of multivariate analysis. If patterns of growth are seen as flexible adaptive responses which emerge within constraints, and if those constraints may be measured using statistical tools, then a different causal orientation between variable factors and likely events can be established.
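The split between constraint and flexibility can be shown arithmetically. In the per-unit-flow form of Ulanowicz's decomposition, the total flow diversity H divides exactly into average mutual information (the organised constraint) plus a non-negative residual (the overhead, or flexibility held in reserve). The sketch below is my own illustration of that identity, using hypothetical flow values:

```python
from math import log2

def decompose(flows):
    """Per-unit-flow decomposition of a flow matrix:
        H (joint flow diversity) = AMI (constraint) + Phi (flexibility),
    where Phi = H - AMI is always non-negative."""
    total = sum(sum(row) for row in flows)
    row_sums = [sum(row) for row in flows]
    col_sums = [sum(col) for col in zip(*flows)]
    h = ami = 0.0
    for i, row in enumerate(flows):
        for j, t in enumerate(row):
            if t > 0:
                p = t / total
                h -= p * log2(p)                                    # diversity
                ami += p * log2(t * total / (row_sums[i] * col_sums[j]))  # constraint
    return h, ami, h - ami

# Hypothetical flows (illustrative numbers only):
h, ami, phi = decompose([[0, 10, 2],
                         [1, 0, 8],
                         [6, 1, 0]])
```

A system with high AMI relative to H is tightly organised but brittle; one with high Phi retains the redundant, uncommitted pathways from which adaptive responses to shocks can be drawn.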
Ulanowicz's ecological metaphor explicitly relates itself to Gregory Bateson's epistemology. Ulanowicz argues that his statistical project is a realisation of Bateson's project to articulate an 'ecology of mind': in this way, the statistical evidence of ecosystems can also provide evidence for ecological relationships in observing systems.
Leydesdorff's Triple Helix, like Ulanowicz's statistical ecology, uses Shannon's equations as a way of delving into an empirical domain. Leydesdorff's empirical domain is the study of discourse. Following Luhmann's second-order cybernetic theory, Leydesdorff argues for the possibility of a calculus of 'meaning' by studying the observable uncertainty within discourses. As with Ulanowicz, the principal focus has been on mutual information between discourses in different domains. Drawing on Luhmann's identification of different discourses, Leydesdorff has layered on a quantitative component, applying this idea to innovation activities in the economy. The essential argument is that the highlighting of mutual information dynamics within discourses implies deeper reflexive processes in the communication system. Like Ulanowicz, Leydesdorff suggests two mechanisms for producing this. On the one hand, the measurement of mutual information shows the coherence between discourses, whilst the measurement of mutual constraint or redundancy demonstrates the possible flexibility in the system. Following Ulanowicz, Leydesdorff identifies autocatalytic processes which generate redundancies within the discourse.
As in Ulanowicz's work, measurements of mutual redundancy introduce the possibility that multivariate analysis of complex interactions may be done simply through an additive calculation, avoiding the necessity to calculate (or guess) the causal power of individual variables. Calculations of mutual redundancy and mutual information together produce a rich picture of communication dynamics. Shannon's equations for mutual information produce fluctuating signed values where there are more than two domains of interaction; calculations for mutual redundancy always produce a positive value irrespective of the number of dimensions. The fluctuating sign of mutual information is an indicator, Leydesdorff argues, of the generation of hidden (redundant) options in a discourse.
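The sign fluctuation beyond two dimensions is easy to demonstrate. The following sketch (a toy example of mine, not Leydesdorff's data or software) computes the three-way interaction information I(X;Y;Z) from samples; a 'synergistic' relation (XOR) drives it negative, whilst a 'redundant' relation (three copies of one variable) drives it positive:

```python
from collections import Counter
from math import log2

def H(samples):
    """Shannon entropy of an empirical distribution (items may be tuples)."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def I3(triples):
    """Three-way interaction information:
    I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z).
    Unlike two-way mutual information, this quantity can be negative."""
    xs, ys, zs = ([t[i] for t in triples] for i in range(3))
    return (H(xs) + H(ys) + H(zs)
            - H(list(zip(xs, ys))) - H(list(zip(xs, zs))) - H(list(zip(ys, zs)))
            + H(triples))

# Synergy: Z = X xor Y; any two variables look independent, yet the
# three together are fully constrained. I3 comes out negative.
xor = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
# Redundancy: three copies of the same bit. I3 comes out positive.
copy = [(0, 0, 0), (1, 1, 1)]
```

It is this signed, fluctuating quantity that Leydesdorff reads as a symptom of redundancy generation in a discourse, and which motivates a separate, always-positive redundancy measure.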
Whilst the calculations of the Triple Helix have gained traction within a section of evolutionary economics, its application has tended to be econometric, with the techniques seen as a new kind of economic measurement. As with statistical ecology, objectivism potentially remains a problem. However, in Leydesdorff's empirical work there is a co-evolution of theory with empirical results. Whilst the Triple Helix is closely tied to economics, and in particular econometrics, the articulation of deep cybernetic theory about communication, and the inspection of that theory in the light of attempts to measure discourses, make the empirical investigative part of the Triple Helix a driver for further development in second-order cybernetic theory. Most impressive about the Triple Helix is the fact that, despite the apparent shallowness of measuring the co-occurrence of key terms in different discourses, convincing arguments and comparative analyses can be made about the specific dynamics of discourses, generating hypotheses which can then be tested against particular interventions in policy. The reflexive aspect of this activity concerns the deeper identification of the scope of what is claimed by the Triple Helix. For example, Triple Helix analysis shows Japan to be one of the most innovative economies in terms of the discourse between universities, government and industry. Does this mean that the Japanese economy is one of the most successful? What would that mean?
The Triple Helix deals with the problem of double-analogy in second-order cybernetics by using information theory to examine the talk between scientists, government and industry. Ultimately, analogies are detected between words which are published in scientific journals and which can be inspected in a relatively objective way. However, the analogies it highlights can be challenged in various ways: as (for example) the product of institutional processes whose discourse is not inspected (the processes of academic publication, for instance), or for making assumptions about the power and status of academic journals as opposed to other forms of discourse. The other side of the double-analogy relies on the social systems theory of Niklas Luhmann. Whilst this is a powerful theory, it remains essentially a metaphysical speculation. In this way, the Triple Helix's attempt to work towards a more coherent second-order cybernetics has to defend itself on the ground of its empirical distinctions, about which assumptions inevitably have to be made.
When the empirical grounding of a theoretical approach relies more on physical phenomena, there is at least some hope of a more effective coordination of discourse. Haken's synergetic theory grounds itself in the physical behaviour of photons in lasers. Using this as a starting point, Haken has sought to metaphorically extend these physical properties into other domains, from biology to social systems. Whilst Haken's work has unfolded in parallel to cybernetic theories (and there has been little contact between the two), he has made extensive use of Shannon's information theory to express mathematically the 'synergetic' processes which he argues underpin all forms of self-organisation.
Haken's physical observations were an important stage in the development of lasers. He realised that photons would coordinate their behaviour with each other if they could be contained within a domain of interaction for a sufficient amount of time. In the laser, photons are maintained within the domain of interaction through the use of mirrors, which only allow the escape of light at a particular energy level. Haken argues that similar dynamics under similar conditions produce self-organising behaviour in other domains too. In recent years, he and colleagues have analysed social phenomena such as the construction and development of cities in order to identify such dynamics.
I think Haken uses the observable analogies of the laser to describe a universalist principle. Whilst this can be a foundation for coordination of discourse about the physics of lasers, the identification of analogies in other domains is more problematic.