For physicists, entropy and time are concepts which are tied together. Eddington's "arrow of time" is most commonly identified with the increase of entropy (see https://en.wikipedia.org/wiki/Entropy_(arrow_of_time)).
Boltzmann's entropy, however, is a measure of uncertainty about the state of matter at a particular location. It is expressed statistically in his famous probabilistic equation. Basically, beneath all the interpretations laid upon it, it is counting.
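For the record, the equation itself (which I won't dwell on here) simply relates the entropy S to the number W of ways the matter could be arranged:

S = k log W

where k is Boltzmann's constant. The logarithm is a mathematical convenience; the substance of the formula is the count W.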
With Shannon's appropriation of Boltzmann's equation, the entropy measure could be used for counting the surprisal in communication: in the way that we might look for disorder among atoms, we can look for (and count) disorder among symbols. One of the key features of Shannon's approach is the idea of the 'selection' of symbols, where the number of symbols selected is an index of the communicating system's complexity and of the capacity requirements of the communication channel (this was Shannon's practical purpose in developing his theory).
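To make that counting concrete, here is a minimal Python sketch (mine, not Shannon's, and purely illustrative) of the entropy of a sequence of symbols:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)  # count the occurrences of each symbol
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: one symbol, no surprise
print(shannon_entropy("abab"))  # 1.0 bit: two equally likely symbols
print(shannon_entropy("abcd"))  # 2.0 bits: maximal disorder for four symbols
```

A message drawn from a single symbol carries no surprise; the more evenly the selections are spread across the alphabet, the higher the entropy, and the greater the capacity the channel requires.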
When we come to talk about human communication, selection at the level of symbols invites a more sophisticated idea of selection at the level of meaning. In Sociology, Niklas Luhmann developed this idea into an elaborate and complex theory of society. From here, entropy might be seen to provide the root of a theory of meaning. More profoundly, might we count to reveal meaning?
We need to take a step back. What was assumed in the physicist's conception of time in the first place? Phenomenologists would question the initial assumption about entropy because it fails to account for the perceiving subject's experience of entropy and time. They would also question the emphasis (so strong within physics) on the individual perceiving subject, when physical observations and scientific discoveries are not just individual, but social and intersubjective. These phenomenological issues were explored by Husserl, Bergson, Schutz, Merleau-Ponty and Heidegger (to name a few). It's worth noting that Bergson had an intriguing but rather unfruitful meeting with Einstein in 1922, which led to an extended debate (see https://muse.jhu.edu/article/193244/pdf, which sees their dispute as partly political).
I find Schutz's account of time interesting because it sees time as emergent not from the experience of growing physical disorder, but from the intersubjective experience between human beings: time always flows in a world of others. What may be important here is that meaning and mattering become prior to information. Also important are those activities we do together - learning, working, playing. Schutz gives an account of what these things might be which I find far richer than the formalism of the information theorists, because it suggests that educational practice is the most important thing that human beings need to understand properly. Schutz is also critical to the entropy debate because his work was of seminal importance to Talcott Parsons, whose work on "double contingency" was fundamental to the development of Luhmann's theory.
Is there a middle ground between the Einstein, Bergson, Schutz and Luhmann viewpoints? Well, entropy is counting. Counting produces descriptions. Descriptions are communicated between people. Communication entails a continual intersubjective apprehension of time. Is it this intersubjective apprehension of time that we call 'education'?