There's been a lot in the news recently about the threats posed by Big Data, AI, etc. "Computers will take our jobs!" is the basic worry. Yet nobody seems to notice that the only jobs that look bullet-proof are those of the managers who decide that other people's jobs can be replaced with computers. It is bad management we should worry about, not technology.
No computer is, or will ever be, a "match" for a single human brain: brains and computers are different kinds of things. Confusing brains and computers is an epistemological error - a "mereological fallacy" (ascribing to a part, the brain, capacities that belong only to the whole person), a Golem-like mistaken belief in the possibility of 'mimesis'.
Ross Ashby, who studied brains closely for his entire career, was aware that the brain was a highly effective variety-absorbing machine. Its variety reduction is felt in the body: often as intuition, instinct or a 'hunch'.
Computers, by contrast, count. They have to be told what to count and what to ignore. To get the computer to count, humans have to attenuate the variety of the world by making distinctions and encoding them in the computer's software. If the computer does its job well, it will produce results that map uncertainties, and those uncertainties relate back to the initial criteria for what could be counted and what couldn't. Knowledge of these uncertainties can be useful - it can help us predict the weather, or translate a phrase from one language to another. But it is the hunches and instincts of human beings which attenuate the computer's world in the first place.
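To make the point concrete, here is a minimal sketch (mine, not from the argument above; the lexicon, categories and function name are invented purely for illustration) of a program that can only count the distinctions a human has already written into it. Everything outside its hand-picked lexicon simply does not exist for it, so whatever numbers it reports are relative to that prior act of attenuation.

# Illustrative sketch: the program "sees" only the human-made distinctions in LEXICON.
from collections import Counter

# Human-made distinctions: which words count as signal at all (hypothetical example).
LEXICON = {"rain": "weather", "sun": "weather", "strike": "labour", "wage": "labour"}

def count_categories(text):
    """Count only the categories the human decided exist; ignore everything else."""
    counts = Counter()
    for word in text.lower().split():
        category = LEXICON.get(word)
        if category is not None:   # words outside the lexicon are simply not seen
            counts[category] += 1
    return counts

if __name__ == "__main__":
    report = "Rain delayed the strike while wage talks continued in the sun"
    print(count_categories(report))   # expected counts: weather=2, labour=2

The output looks like an "objective" measurement, but the categories "weather" and "labour", and the words admitted as evidence for them, were chosen by a person before the computer counted anything.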
Stafford Beer tells the story of Ashby's explanation for accepting without a moment's hesitation the invitation to move to the US and work with Heinz von Foerster in Illinois. Ashby explained to Beer:
Years of research could not attain to certainty in a decision of this kind: the variety of the options had been far too high. The most rational response would be to notice that the brain is a self-organizing computer which might be able to assimilate the variety, and deliver an output in the form of a hunch. He [Ashby] had felt this hunch. He had rationally obeyed it. And had there been no hunch, no sense of an heuristic process to pursue? Ross shrugged: 'then the most rational procedure would be to toss a coin'.

Our biggest threat is bad management, which feeds on bad epistemology. The great difficulty we have at the moment is that our scientific practices of Big Data, AI and so on, are characterised by complexity and uncertainty. Yet we view their outputs as if they were the 'objective' and 'certain' outputs of the classical scientist. Deep down, our brains know better.