
University of Connecticut Commencement & Convocation

Text of Past Commencement Addresses

Graduate Ceremony Address: Noam Chomsky, May 16, 1999

Noam Chomsky, Professor of Linguistics, Massachusetts Institute of Technology, received an honorary Doctor of Letters degree during the ceremony

The Text:

There's no really natural transition to the topic I would like to say a few words about, namely the prospects for the brain and cognitive sciences, what is sometimes felt to be the last great mountain range that the natural sciences might hope to scale.

But there is perhaps a certain transition.

A forthcoming paper by one of the leading researchers in cognitive neuroscience, Randy Gallistel, argues - pretty convincingly, I think - that a long tradition of research and speculation is seriously off the mark in its beliefs about general learning processes and associative theories of learning. Rather, throughout zoology and experimental psychology it is becoming increasingly clear that learning mechanisms are computationally specialized for solving particular kinds of problems. That ranges from insect behavior, to shooting the winning basket for the Celtics, to what you and I are doing right now.

Gallistel also points out that this "modular view of learning" is "the norm these days in neuroscience." According to this view, in all animals learning is based on specialized mechanisms, "instincts to learn" in specific ways. These "learning mechanisms" can be regarded as "organs within the brain [that] are neural circuits whose structure enables them to perform one particular kind of computation."

They do that reflexively, unless the environment is "extremely hostile." Human language acquisition is instinctive in this sense, based on a specialized "language organ," which grows from an initial state that is an expression of the genes through various stages until it reaches a mature state that corresponds more or less to what we informally call "a language"; say, some variety of English.

And this transition does appear to take place as a virtual reflex, apart from "extremely hostile environments." It seems that normal knowledge of language can be closely approximated even by the deaf-blind, using only the extremely rudimentary information that is provided by placing a hand on the face of a person who is speaking, though the full story is somewhat more complex.

These are the specific topics of cognitive psychology that have been of particular interest to me for many years. I think the work of the past half century, including ground-breaking work that has been done right here, quite strongly supports the general thesis that Gallistel takes to be the norm in neuroscience. In this respect the brain is much like other complex systems of the body: the circulatory system, the digestive system, the kidney, the visual systems, and so on.

These are often called "organs of the body," but of course not with the implication that they can be removed leaving the rest intact. Rather, these are subsystems of a complex whole, with their own specialized properties, interacting in specific ways - organs in a useful but somewhat abstract sense. There seems every reason to believe that the human brain is structured along similar lines.

Gallistel also sounds a useful warning note: he points out that "We clearly do not understand how the nervous system computes," even "how it carries out the small set of arithmetic and logical operations that are fundamental to any computation." A great deal has been learned about several of the organs that enter into human thought and action, notably the language organ.

But enormous gaps of understanding remain. One of them is the gap between theories of the nature, growth, and use of the organ, and theories of the anatomy and physiology of the brain. It is for such reasons that the organs that enter into thought and action are often called "mental organs," to signal that the problem remains of unifying mental aspects of the world with other aspects, in this case, we presume, cellular aspects.

As Gallistel notes, the problem arises for all psychological processes. It shows up in many ways. Thus, a great deal has been learned about vision in recent years, but as another prominent neuroscientist recently pointed out, the ability to recognize "a continuous vertical line is a mystery that neurology has not yet solved." The word "yet" should sound another warning note: no one can guess what might be necessary even for this problem to be solved, let alone what changes in basic science might be required to relate mental aspects of the world to others.

Many leading figures in the brain and cognitive sciences are optimistic about the prospects, but not without recognizing the gaps that remain - "chasms" might be a better word. One of the grand old men of the field, Vernon Mountcastle, who is one of the optimists, observes that the study of higher mental faculties raises a serious question about the validity of the "long-standing dogma" of neuroscience that basic neural mechanisms are conserved in mammalian evolution.

The dogma may not be "universally true," he suggests, "especially if it applies to the human brain," which appears to have neuron types that differ from those of other mammals in biochemical mechanisms and patterns of connectivity. Still more seriously, we do not even know if these are the right mechanisms and patterns to explore in seeking to achieve unification of mental and cellular theories of the world.

It is generally assumed that the human species reached essentially its current state about 100,000 years ago, after very radical changes in the preceding several million years. These developments included a tripling of brain size and a great many structural changes long after the separation from the nearest surviving relatives, roughly 5 million years ago, which means a separation of twice that length in evolutionary terms. It is also assumed that whatever happened about 100,000 years ago probably involved the appearance of a language organ, and with it, presumably, many of the other distinctive properties of our curious species.

That's a flick of an eye in evolutionary terms.

Also intriguing is the apparent biological isolation of the human language faculty. Perhaps the closest analogues are in insects - the famous dance of the honeybees. But there is a good deal of controversy about the nature and function of these systems, the analogies at best are very weak, and there is of course no evolutionary relation.

The basic facts were observed by Darwin, who noted the radical distinction between human language and all known animal systems of communication. Human language, he pointed out, is infinite in its capacity for expression of thought, while other animals crucially lack that property, and as we now know, many other elementary properties of human language as well. The observation is in fact far older: it was noted by Galileo, and became a central part of the great scientific revolution of the 17th century - and that included the first great "cognitive revolution," perhaps the only one that really merits the term.

The 17th century scientific revolution reached its highest peak in the achievements of Isaac Newton. It is commonly held that Newton showed that the universe is an intricate mechanism, rather like the complex automata that captured the imagination of 17th-18th century thinkers, much as computers do today. But in fact what Newton demonstrated was exactly the opposite. Newton showed, much to his dismay, that the universe is not a mechanical device.

To quote a leading modern historian of physics, Newton showed that "a purely materialistic or mechanistic physics is impossible," that it is necessary to introduce into core natural science "incomprehensible and inexplicable facts" (Alexandre Koyré).

Newton regarded his own conclusions as an "absurdity," and spent the rest of his life trying to find some escape, as did many other leading scientists, in fact for centuries. But in vain.

It is common these days to ridicule those who still believe in the ghost in the machine. But that criticism mistakes the problem. Newton exorcised the machine; he left the ghost intact. The fact was understood by leading figures. 250 years ago David Hume recognized that "Newton seemed to draw off the veil from some of the mysteries of nature," but "he shewed at the same time the imperfections of the mechanical philosophy; and thereby restored [Nature's] ultimate secrets to that obscurity in which they ever did and ever will remain." The world is simply not comprehensible to human intelligence, at least in the ways that modern science had hoped and expected.

The classic scholarly study of the history of materialism describes Newton's achievements as the destruction of materialism or physicalism, in any serious sense of the terms. It reviews how the expectations and goals of the pioneers of the scientific revolution, and their materialist predecessors, were abandoned, and we gradually "accustomed ourselves to the abstract notion of forces, or rather to a notion hovering in a mystic obscurity between abstraction and concrete comprehension," a "turning-point" in the history of materialism that removes the doctrine far from those of the "genuine Materialists" of the 17th century and before, and deprives it of much significance.

These facts have considerable bearing on the study of mind and brain today. In the light of Newton's demolition of the concept of matter, many scientists came to recognize that John Locke must have been correct in suggesting that, just as the world has properties of attraction and repulsion and others that "we can in no way conceive motion able to produce," so "a faculty of thinking" might have been "superadded" to matter.

The term "matter" by then had lost any significance, referring merely to the world, with whatever strange properties it has, including Newtonian "absurdities" and more extreme ones that had to be accepted as true in later years.

By the end of the 18th century, Locke's tentative suggestion was appropriately rephrased by the famous chemist Joseph Priestley as a virtual truism: "the powers of sensation or perception and thought" are properties of "a certain organized system of matter"; properties "termed mental" are "the result [of the] organical structure" of the brain and "the human nervous system" generally.

Priestley of course had no idea how these properties arise from the nervous system. Rather, much like the properties of attraction, repulsion, chemical affinity, light, electricity and magnetism, and others, mental properties had to be postulated on the basis of experimental evidence, perhaps with the eventual hope of unification, but without any prior idea of the form that such unification might take.

Today, that traditional and virtually inevitable conclusion has been revived, now formulated as a major thesis that "Things mental, indeed minds, are emergent properties of brains," though "we do not yet understand" the principles that relate these emergent properties to those of cells. The word "yet" again reflects the prevailing optimism. But whatever speculations one may have about the prospects, the thesis is not new; it is a traditional one, a direct consequence of Newton's exorcism of the machine.

The history of chemistry provides revealing lessons for the study of mental aspects of the world, and the course it might take. Chemistry, of course, is hard science; right next door to core physics in the rather misleading standard hierarchy of "reducibility."

By the mid-18th century it was understood that "chemical affinity must be accepted as a first principle, which we cannot explain any more than Newton could explain gravitation, and let us defer accounting for the laws of affinity until we have established such a body of doctrine as Newton has established concerning the laws of gravitation" (English chemist Joseph Black).

That is pretty much what happened. Chemistry proceeded to establish a rich body of doctrine, "its triumphs . . . built on no reductionist foundation but rather achieved in isolation from the newly emerging science of physics" (a leading contemporary historian of chemistry, Arnold Thackray). That continued until very recently.

What was finally achieved 60 years ago, by Linus Pauling, was not reduction: rather, unification, something very different. A few years earlier, in 1929, Bertrand Russell, who knew the sciences well, observed that chemical laws "cannot at present be reduced to physical laws." But his phrase "cannot at present" was shown to be wrong. It turned out that chemical laws cannot in principle be reduced to physical laws, as physical laws were understood.

Physics had to undergo fundamental changes, mainly in the 1920s, in order to be unified with basic chemistry. Physics had to "free itself" from "intuitive pictures" and give up the hope of "visualizing the world," as Heisenberg put it, another long leap away from intelligibility in the sense of the scientific revolution of the 17th century.

As recently as 70 years ago, before the unification was achieved, chemistry was regarded by many prominent scientists as a calculating device, a way of organizing and predicting the results of experiments, without any reality, because it had not been reduced to core physics. The topic was hotly debated, in terms that are very similar to those that dominate much of contemporary thinking and debate in cognitive psychology and philosophy of mind.

By the 1930s, it was understood that the debate had been pointless, that chemistry was real in the only sense of "reality" we have: it was the best theory that could be constructed to understand chemical aspects of the world. These quite recent developments in the core natural sciences should, I think, be taken seriously in considering higher mental faculties and the "bodies of doctrine" that are being developed concerning them, language in particular.

The unification of biology and chemistry a few years after Pauling's discovery can be misleading. That was genuine reduction, but to a newly created physical chemistry; some of the same people were involved, notably Pauling himself. True reduction is not so common in the history of science, and need not be assumed automatically to be a model for what will happen in the future.

The study of human higher mental faculties might well follow the course of the investigation of mechanical, electromagnetic, optical, and chemical aspects of world, among others. The hope for reduction may once again prove illusory.

One cannot know, until we know, how unification might take place - if it ever does - and what form it will take; perhaps, once again, a radical reconstruction of what is misleadingly called "the more basic" science. Perhaps novel biochemical mechanisms and patterns of connectivity are involved, as Mountcastle suggests. Or perhaps more radical revisions, as often in the past.

In the last half century there has been intensive and often highly productive inquiry into the brain, behavior, and cognitive faculties of many organisms. The Holy Grail, of course, is human higher mental faculties. But it should be clear that this is the goal that is likely to be the most remote, probably by orders of magnitude, if only because of the complexity of the systems and their apparent novelty and biological isolation.

Another serious barrier to inquiry is that direct experimentation is excluded on ethical grounds - today, I should add; not long ago practices were quite different, in ways that we would now find extremely shocking. A lot is known about the human visual system, but that is because it is assumed to be rather like those of other mammals, including other primates.

And for these animals, we permit ourselves invasive experimentation, raising animals in controlled environments, and so on. From such experimentation, a great deal is learned, and the basic conclusions are reasonably assumed to hold for the human visual system as well.

But we know of no analogues to language and other human higher mental faculties in other organisms; and even if we did, we would presumably bar direct experimentation on them, as we do for humans. There is hope that new non-invasive technologies - brain-imaging techniques and others - may offer a way around this barrier to understanding. The prospects are exciting, and a lot has already been learned.

But despite much important progress in many areas, and justified excitement about the prospects opened by newer technologies, I think it is wise to be cautious in assessing what we know and what we might realistically hope to learn.

For the present, the study of language and other higher human mental faculties is proceeding much as chemistry did, seeking to "establish a rich body of doctrine," and sometimes succeeding, with an eye to eventual unification, but without any clear idea of how this might take place. Some of these bodies of doctrine are rather surprising in their implications.

Thus in the case of language, very recent work, some of the most important of it conducted here, is providing interesting grounds for taking seriously an idea that a few years ago would have seemed outlandish: that the language organ of the brain approaches a kind of optimal design, that it is in some interesting sense an optimal solution to the minimum design specifications the language organ must meet to be usable at all.

That is not what one expects to find in a highly complex biological organ. At the very simplest level, say cell division, or the structure of viruses, conclusions of this sort seem very reasonable, and even partially understood.

But it has been commonly assumed that evolution is a "tinkerer," in the phrase of Nobel Laureate Francois Jacob, doing the best it can with the materials at hand. If the language organ in fact does approach optimal design, that would suggest that in some unknown way, it may be the result of the functioning of physical and chemical laws for a brain that has reached a certain level of complexity. And further questions arise for general evolution that are by no means novel, but that have been somewhat at the margins of inquiry until fairly recently.

I'm thinking of the work of D'Arcy Thompson and Alan Turing, to mention two of the most prominent figures.

Perhaps I might add one final remark about the limits of understanding. Many of the questions that inspired the modern scientific revolution are not even on the agenda. These include issues of will and choice, which were taken to be the core of the mind-body problem - the problem that was undermined by Newton, when he showed that there were no bodies in any meaningful sense.

There has been very valuable work about how an organism executes a plan for some integrated motor action - how a cockroach walks, or a person reaches for a cup on the table. But no one even raises the question of why this plan is executed rather than some other one, apart from the very simplest organisms. Much the same is true even for visual perception, sometimes considered to be a passive or reflexive operation.

Recently two MIT cognitive neuroscientists published a review of progress in solving a problem posed in 1850 by Helmholtz: "Even without moving our eyes, we can focus our attention on different objects at will, resulting in very different perceptual experiences of the same visual field."

The phrase "at will" points to an area beyond serious empirical inquiry. It remains as much of a mystery as it was for Newton at the end of his life, when he was still seeking some "subtle spirit" that lies hidden in all bodies and that might, without "absurdity," account for their properties of attraction and repulsion, the nature and effects of light, sensation, and the way "members of animal bodies move at the command of the will" - all comparable mysteries for Newton. In the 17th century, the ordinary use of language was taken to be the prime illustration of this mystery, and the best proof of the existence of other minds. And for reasons we should not lightly dismiss.

For some of these mysteries, extraordinary bodies of doctrine have been developed in the past several hundred years, some of the greatest achievements of the human intellect. And there have been remarkable feats of unification as well. How remote the remaining mountain peaks may be, and even just where they are, we can only guess.

Within the range of feasible inquiry, there is plenty of work to be done in understanding mental aspects of the world, including human language. And the prospects are surely exciting. We would do well, however, to keep in some corner of our minds David Hume's conclusion about "Nature's ultimate secrets" and the "obscurity in which they ever did and ever will remain," and particularly the reasoning that led him to that judgement, and its confirmation in the subsequent history of the hard sciences.

These are matters that are sometimes too easily forgotten, I suspect, and that merit serious reflection - possibly, some day, even constructive scientific inquiry.

Thank you.