by Benjamin Campbell
“These petrified relations must be forced to dance by singing their own tune to them!” – Karl Marx
In November, The Atlantic published an interview with Noam Chomsky on the state of contemporary cognitive science and “Where Artificial Intelligence Went Wrong.” Chomsky, a central figure in the “cognitive revolution,” lamented what appeared to be today’s reversion to the behaviorism that he so strongly critiqued over half a century ago.
This interview was timely, as I have recently likened Chomsky and the cognitive revolution to Immanuel Kant’s philosophical reaction against empiricism. Today, Chomsky expresses concern about the increasing reliance on advanced statistical techniques and data-mining approaches in machine learning, as though we have taken a step backwards to a naïve empiricism. In contrast to such modeling approaches, Chomsky invokes the perspective of the theoretical neuroscientist David Marr. Marr, in studying the visual system, famously suggested that one must study a system at multiple levels, including the highest computational level, which asks what it is that a system does and why. While Chomsky’s objection to contemporary theory might seem like a sharp rejoinder to today’s vulgar quants, in fact it merely raises the question: what if the ultimate computational goal of an organism is to model its environment?
Long before Chomsky’s generation launched the tradition of cognitive science, there existed a significant current of theoretical research known as cybernetics. The term was coined by Norbert Wiener from the Greek for “steersman.” As the polymath put it in The Human Use of Human Beings: “We are but whirlpools in a river of ever-flowing water. We are not stuff that abides, but patterns that perpetuate themselves.” Cybernetics was briefly dominant in the post-war era, but was soon displaced by A.I. approaches that seemed more practical and were thus better funded. Where A.I. would emphasize computation, internal representation, and symbolic logic, cybernetics stressed the connections between control, communication, information, and thermodynamics. Life was seen as a homeostatic process, existing in enclaves of negative entropy where organisms regulate their environment by navigating the surrounding streams of disorder. And as Roger Conant and the pioneering cyberneticist W. Ross Ashby would title their 1970 paper, “Every Good Regulator of a System Must Be a Model of That System.” Thus, even if we examine the computational goal of an organism from Marr’s top-level perspective, we eventually come back to the conclusion that, as a homeostatic process, the computational goal of an organism must be to model its environment.
Chomsky is correct that certain manifestations of machine learning and neuroscientific theory, including much of reinforcement learning theory, have the feel of a reversion to behaviorism. Yet a broader perspective would recognize that the last two decades have seen a qualitative paradigm shift in our understanding of the brain. It is not that data-modeling is a repudiation of Kantian “forms of thought,” but merely a recognition that the forms of thought cannot be considered independently of the empirical data that they process. Recall Hegel’s criticism: Kant was to be commended for making the forms of thought a matter of study, but “there soon creeps in the misconception of already knowing before you know.” Forms of thought are not merely a priori constructs through which the world is viewed; they both act on their content and are acted upon by that content, dynamically and dialectically.
This interdependence of form and content would find a more quantitative expression in principles of efficient coding, in which the brain learns dynamic representations that depend on the statistics of its inputs. This view has since led to approaches termed “Bayesian,” after Thomas Bayes, whose theorem grounds the subjectivist approach to probability. In this prominent view, the brain learns a statistical model to represent its environment, continually predicts that environment, and updates its internal model when these predictions are contradicted—a process known as Bayesian inference.
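Schematically, and without committing to any particular neural implementation, the update at the heart of this picture is just Bayes’ rule: for a hypothesis H about the environment and new sensory evidence E,

\[
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
\]

where the prior P(H) encodes the organism’s current model, the likelihood P(E | H) scores the evidence against that model, and the posterior P(H | E) becomes the prior for the next round of prediction.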
This now-familiar procedure of priors adjusted by new evidence to form posteriors bears an uncanny resemblance to that famous quasi-Hegelian triad: thesis, antithesis, synthesis. Today, however, Bayesian inference seems much more scientific than the historically obscure dialectic, raising the question of why anyone would concern themselves with Hegel out of anything other than antiquarian (or masochistic) interest. Importantly, however, while Bayesian models represent a significant advance over previous theory, they remain only approximations. A mere application of Bayes’ rule updates the prior belief in a proposition to a posterior belief in that proposition, but neglects to consider that the propositions themselves are dynamic. These forms of thought, through which we see the world, are never quite adequate to represent that world, leading to what Nicholas Georgescu-Roegen referred to as “qualitative residuals,” which manifest themselves as contradictions in our present understanding of the world. Thus, a Bayesian conception of the brain remains only an approximation to a dialectical understanding of the brain, and indeed a dialectical understanding of life, in which the organism comes to know its environment through a continual process driven by the resolution of contradictions between the world and the forms of thought used to represent it.
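To make this limitation concrete, consider a toy sketch (a hypothetical illustration, not any particular neuroscientific model): a Bayesian agent updates its beliefs over a fixed set of propositions about a coin’s bias, while the true bias lies outside that set. Belief dutifully concentrates on the least-wrong proposition, but Bayes’ rule can never mint a new one.

```python
# A toy illustration (hypothetical, not any specific neural model):
# Bayes' rule reallocates belief over a FIXED set of hypotheses.
# If the true process lies outside that set, no amount of evidence
# can introduce a new hypothesis -- a "qualitative residual" remains.

import random

random.seed(0)

# A fixed hypothesis space: three candidate biases for a coin.
hypotheses = {"fair": 0.5, "heads-biased": 0.8, "tails-biased": 0.2}

# The actual coin has bias 0.65 -- a possibility the model never entertains.
true_bias = 0.65

def update(belief, flip):
    """One application of Bayes' rule: prior times likelihood, renormalized."""
    posterior = {}
    for name, bias in hypotheses.items():
        likelihood = bias if flip == "H" else 1.0 - bias
        posterior[name] = belief[name] * likelihood
    total = sum(posterior.values())
    return {name: p / total for name, p in posterior.items()}

# Start from a uniform prior and observe 1000 flips.
belief = {name: 1.0 / len(hypotheses) for name in hypotheses}
for _ in range(1000):
    flip = "H" if random.random() < true_bias else "T"
    belief = update(belief, flip)

# Belief concentrates on the least-wrong hypothesis ("fair", the closest
# to 0.65 in likelihood terms), but the space itself never grows.
for name, p in sorted(belief.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.3f}")
```

In Hegelian terms, the agent can refine its beliefs within its given forms of thought, but only a qualitative change in those forms (here, a new hypothesis) can dissolve the contradiction between model and world.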
The fact that Hegelian philosophy speaks so clearly to biology should be unsurprising. Hegel was greatly influenced by the natural philosophy of his day, and as Frederick Beiser points out in his introductory Hegel, “the purpose of Hegel’s Science of Logic is indeed to develop a logic of life, a way of thinking to understand life.” Unfortunately, the correspondence between Hegelian philosophy and biology has been historically obscured by the degeneration of “Marxism” into farcical state ideologies. As a result, an anti-Hegelian scientism developed in the West, perhaps best exemplified by Jacques Monod’s tendentious and reductionist screed, Chance and Necessity. Recent decades, however, have seen a renaissance of Hegelian thought in philosophy, particularly in American pragmatism, and it seems inevitable that the intractability and absurdity of the present crises of capitalism will give rise to a renewed interest in Marxism. An understanding of the relation between the Hegelian dialectic and contemporary biology thus seems a necessary prerequisite for any real “consilience” of the two cultures, as well as any rebirth of the left.
Thus, contrary to Chomsky, I suggest we interpret the emerging conception of the brain not as a return to empiricism, but as analogous to the Hegelian advance beyond Kant. The work of the last two decades may then be considered as “revolutionary” as that of Chomsky and his colleagues, even if the fragmentary and atomized research of late capitalism makes it difficult to identify the contemporary Zeitgeist.
However, my analogy here remains incomplete, for just as Hegel critiqued Kant, so too was Hegel relentlessly critiqued by the “Young Hegelians” who followed him. I have thus far presented the brain as a passive observer coming to know its environment. To paraphrase the young Marx in the most famous of his Theses on Feuerbach: we have thus far only interpreted the world; the point is to change it!
December 2012