7

innate ideas

If the soul resembles blank tablets, truth would be in us as the figure of Hercules is in the marble, when the marble is wholly indifferent to the reception of this figure or some other. But if there were veins in the block which would indicate the figure of Hercules rather than other figures, this block would be more determined thereto, and Hercules would be in it as in some sense innate, although it would be needful to labour to discover these veins, to clear them by polishing, and by cutting away what prevents them from appearing. Thus, it is that ideas and truths are for us innate, as inclinations, dispositions, habits, or natural potentialities, and not as actions, although these potentialities are always accompanied by some actions, often insensible, which correspond to them.

G. Leibniz

In the Socratic dialogue Meno, Plato related how Socrates drew knowledge of geometry from a slave boy who had no prior training in mathematics. The boy could not produce his own solutions to geometrical problems drawn in the sand, but with some coaxing from Socrates he could recognize the correct answer when it was suggested to him. Plato believed that the boy was remembering what he had learned in a previous existence. Experience in this life, he thought, could not teach us anything of the eternal truths of geometry, since the ordinary, physical world in which we live is a shifting shadowland. Whatever we know, he reasoned, must have been learned before we entered this world of treacherous appearances and moving lines. He surmised that the slave boy’s understanding was attained prior to birth in the heavenly realm of ‘Forms’, where each and every object, be it a horse, a moral value or a fact of arithmetic, was an icon of Truth. Although the physical world could not show us such things, it could still remind us of them. Plato thought that this process of reawakening was what we normally perceived as the process of learning, for, as Socrates asked, how could we recognize the Truth when we saw it, unless in some way we already knew the answers to our questions?

It appears that Plato did not try very hard to solve this problem when he was writing the Meno, as there are many ways in which we can identify the object of a search even though we have never seen it before. For example, we may not know what the final piece of a jigsaw puzzle will look like, but we know that it will be the only one left at the end. Or we may have only a partial description of the object to begin with, such as the images of prehistoric monsters that inspired naturalists to mount an expedition to Komodo Island, in the Lesser Sunda Islands of what is now Indonesia, in 1912. They knew they had found the source of the legend when they came across a large lizard – the ‘Komodo Dragon’ – that had so far gone unrecorded by science. Sometimes we do not need a description at all, since the answer is defined as whatever lies at the end of a certain path, such as when we work through a long calculation. Most mortals do not punch 2,935 × 7,478 into their calculators and then remark ‘Just as I suspected’ when they see that the total is 21,947,930.

Two thousand years after Plato, the project of discerning innate knowledge began anew. Although many things could only be learned by experience, ‘rationalist’ philosophers such as René Descartes, Gottfried Leibniz and Baruch Spinoza hoped that the most important truths – their favourites being those concerning God, His methods and His wishes (that is to say, mathematics and morality) – could be discovered without leaving one’s armchair, via unaided contemplation. Men and women who have led full and varied lives are wont to accord experience rather more respect than it deserves, while those who have led sedentary lives often prefer to believe that enlightenment can be achieved without all that trouble. The position of the French thinker Descartes, the founder of rationalism, should not be put down merely to the way in which he spent most of his working life dozing in an airing cupboard or tucked up in bed during his extended lie-ins. Descartes may have been lazy, but he saw that the mind was more susceptible to certain thoughts than to others. Show children a distorted triangle or a line with a kink in it and they will describe them as imperfect instances of a triangle and a straight line – not perfect instances of a distorted triangle and a crooked line. We possess the ideas of these geometric forms even though there are no absolutely perfect triangles or lines in the world for us to have observed through experience. In today’s parlance, they are ‘hardwired’ in the human brain.

Descartes’s notion of innate ideas was one in a long history of philosophical attempts to insulate the self from the outside world, be it from moral fortune, environmental influences on our free will or, as in this case, cognitive fallibility. It also held out the possibility of a purer knowledge than that offered by the periodically unreliable senses. This was a picture of a fair world, in which the peasant can be as versed in holiness as the sage, since the most valuable truths are those inscribed on the walls of the birth canal. Moreover, from this viewpoint, it was inappropriate to doubt the beliefs we were born with, since if a notion did not come from our own experience it presumably came from a far better source – namely, the design of the Almighty.

It was therefore embarrassing when the supposed ‘will of God’ did not live up to expectations. There is an obvious reason why someone would want an idea to be innate: it absolves him or her of the obligation to justify it. The ‘empiricist’ philosophers of Britain argued that even instinctual beliefs would have to be checked against experience, as our prejudices are often mistaken. Innate ideas also did not seem to be evenly distributed. Descartes’s empiricist counterpart, the English philosopher John Locke, pointed out that supposedly universal truths are never agreed upon by all. Some individuals even doubted the existence of God. It counted against innate knowledge that it was not innate to every single individual, even if most of the heretics were children or madmen. This was not an argument against its existence as such, but it certainly harmed its moral and theological rationale – an altogether more effective way of casting doubt on such universal truths. Similar concerns hamper the debate in one of its modern incarnations, namely the nature–nurture question. The notion of the innate has threatened the equality of all people, since one individual clearly might possess more innate intellectual gifts than another. Once it was broached that innate advantages or impediments might be bestowed not by a benevolent deity but by cruel and indifferent Nature, the idea of innate knowledge was no longer so attractive.

Locke’s alternative has been found altogether too captivating in recent years. He proposed that the human mind at birth was a ‘blank slate’ upon which experience and learning made their impressions. On this thinking, although we seem to have the ideas of perfect triangles and straight lines, the fact that we do not find such flawless figures in nature means that we must be mistaken. As for the susceptibilities to certain ideas that Descartes cited, Locke maintained that it was absurd to suggest one possessed knowledge but could not comprehend it. If we allow that the mere capacity to come to know something can qualify as a form of knowledge, then we might as well say that every piece of possible knowledge is held innately, since I must have the capacity to learn something in order to go on to learn it. There was thus no sense in saying that knowledge was imprinted on the soul in a latent form, so that it might be activated as a child reached adulthood. This view was motivated by a particular understanding of learning capacities.

Plato had believed that the capacity to know a truth must be very similar to that truth, fitting it as a lock fits a key. However, the capacity might instead resemble formless clay at the outset, which could be given a number of shapes and configurations without tending towards one in particular rather than any other, as in Locke’s Blank Slate. No one would say that we have no innate capacities whatsoever – we need something innate in order to be human. What we want to know is to what extent these capacities have content – that is, how much is clay and how much keyhole? Today we have moved on from the question of how a capacity might resemble its object. We no longer talk of resemblances, as we do not think they are essential to representation. Binary computer code can represent anything we care to mention, but a portrait composed of zeros and ones would not win any art prizes (at least not any for photographic realism). While we cannot be sure that brains work in a similar fashion to computers, we at least know that a scholar’s brain is much the same shape as that of an illiterate. There is no mental clay within the skull that takes on the form of our studies. The debate over innate ideas now concerns mental processes rather than mental states or knowledge.
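
As a purely illustrative sketch of representation without resemblance, any word – ‘Hercules’, say – can be written out as binary digits and recovered again, though the string of noughts and ones bears no resemblance whatsoever to the hero (the snippet below is an invented example, not anything drawn from the thinkers discussed):

```python
# Illustrative only: text stored as binary digits looks nothing like
# what it represents, yet nothing is lost in the encoding.
def to_bits(text: str) -> str:
    """Write each byte of the UTF-8 encoding as eight binary digits."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Recover the original text from the space-separated digits."""
    return bytes(int(chunk, 2) for chunk in bits.split()).decode("utf-8")

encoded = to_bits("Hercules")
print(encoded)             # 01001000 01100101 01110010 01100011 ...
print(from_bits(encoded))  # Hercules
```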

The empiricist philosophy that came to end rationalist dogmas ultimately created its own dogma, as the need for experience and experiment was eventually put ahead of the results of those very processes. Locke’s account contained two components: the belief that knowledge came by experience, and the belief that experience began only at birth. We might well wonder how it could be otherwise, as there are no academies for embryos, let alone heavenly schools wherein the soul might be educated prior to conception. Writing in the seventeenth and eighteenth centuries before the discovery of genetic heredity, the empiricists could not have known that the previous life in which certain truths were learned was not one’s own pre-existence, but the lives of one’s ancestors.

The struggle between rationalist and empiricist thinking was concluded in the twentieth century due to the arbitration of one man, the American philosopher and left-wing political activist Noam Chomsky, now Professor of Linguistics and Philosophy at the Massachusetts Institute of Technology. Chomsky was born in Philadelphia in 1928 and was still a teenager when he began helping his father, a Hebrew scholar, to edit his works. His youthful support for an ethnic Jewish homeland endures, but he regularly professes to despise the state of Israel. Then again, he doesn’t really believe in states at all – describing himself as an ‘anarcho-syndicalist’. Chomsky is also one of the world’s most articulate and extreme critics of his own nation’s foreign policy, once claiming that by the standards of the Nuremberg trials, every post-war US president should have been hanged for war crimes. He began his political life campaigning against the Vietnam War, supporting students who dodged the draft. He shared a jail cell with the novelist Norman Mailer following the 1967 Pentagon protest, after which Mailer wrote of ‘a slim, sharp-featured man with an ascetic expression, and an air of gentle but absolute moral integrity’ who seemed ‘uneasy at the thought of missing class on Monday.’1 The description holds after forty years.

Professor Chomsky is famous for his willingness to receive guests, and unsolicited correspondents often receive detailed replies to their enquiries, running to several pages. The two guests who preceded my own visit to his office left dazed and teary-eyed after having their pictures taken with him. The same month, Chomsky had addressed the World Social Forum in Porto Alegre on the issue of globalization. A few days after our meeting, he turned up in the newspapers again after travelling to Turkey to condemn the persecution of the Kurds and defend his Turkish publisher from charges of disseminating separatist propaganda. When Chomsky flew in and asked to be tried alongside the publisher, the court dropped the case. It is no wonder that, according to the Arts and Humanities Citation Index, Chomsky is the most cited living author, and the eighth most cited of all time.

Chomsky prefers to keep his political and linguistic ideas separate, and this is perhaps just as well – for while his political views make him the darling of the left, his chief contribution to philosophy was to lay the first charge in the demolition of the left’s most cherished doctrine: the malleability of human nature. Syntactic Structures,2 Chomsky’s groundbreaking thesis on linguistics, was published in 1957, when he was just twenty-eight years old. His place in history was sealed two years later in his 1959 review of B. F. Skinner’s book Verbal Behavior, published in the journal Language. Chomsky’s work was a reaction to the empiricist nadir represented by the behaviourist psychology of Skinner.

According to Skinner, the mind could be thought of as a box that gave out only what was first put into it via the senses. All human behaviour, however elaborate, was a response to a stimulus. However, the stimuli Skinner cited – those that could actually be observed – seemed far too sparse to explain behaviour as rich as language. Chomsky noticed, for example, that children have a capacity to understand and articulate sentences that they have never heard before. Nonsense phrases such as ‘Colourless green ideas sleep furiously’ can be readily parsed and judged grammatically correct. He was also struck by how almost all children learn language at a very early age, before their other intellectual faculties are fully developed, and that they do so without being rigorously drilled as they are in mathematics or reading. Unlike adults mastering a second language, children do not acquire their mother tongue by instruction. The very thought that they might do so involves a paradox. When we are unsure of the meaning of a word, we can look it up in a dictionary, but this would be no use if we did not already understand the concept of meaning. This suggests that our first steps in language cannot be taken in this world. Any child attempting to bootstrap linguistic ability into a ‘blank slate’ of a mind would be in trouble, for most spoken, as opposed to written, language is relatively chaotic and ungrammatical. It is for this reason that newspaper interviews often result in misquotation – they would be unprintable without heavy editing. Most parents do not act like newspaper editors. They are happy to construe incomprehensible gurglings as their child’s first ‘words’ and, in the main, do not correct their offspring. And this is reasonable behaviour. There would be no point in telling an infant how to talk, because they would not understand the explanation.
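
As an illustrative toy – nothing like the transformational grammar Chomsky actually proposed, and with rules and category labels invented purely for the example – a handful of rewrite rules is enough to certify ‘Colourless green ideas sleep furiously’ as well-formed while saying nothing about what it means, and to reject the same words in reverse order:

```python
# A toy, invented context-free grammar: a small, finite set of rules
# licenses sentences the learner need never have heard before.
import itertools

RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Adj", "NP"], ["N"]],
    "VP":  [["V", "Adv"], ["V"]],
    "Adj": [["colourless"], ["green"]],
    "N":   [["ideas"]],
    "V":   [["sleep"]],
    "Adv": [["furiously"]],
}

def derives(symbol, words):
    """Return True if `symbol` can be rewritten as exactly `words`."""
    if symbol not in RULES:                      # terminal word
        return len(words) == 1 and words[0] == symbol
    for expansion in RULES[symbol]:
        # try every way of splitting `words` among the expansion's symbols
        for cuts in itertools.combinations(range(1, len(words)), len(expansion) - 1):
            parts = [words[i:j] for i, j in zip((0, *cuts), (*cuts, len(words)))]
            if all(derives(sym, part) for sym, part in zip(expansion, parts)):
                return True
    return False

sentence = "colourless green ideas sleep furiously".split()
print(derives("S", sentence))                                           # True: grammatical, though meaningless
print(derives("S", "furiously sleep ideas green colourless".split()))   # False: same words, no structure
```

The point of the sketch is only that a compact set of rules can license an unbounded number of novel sentences – exactly the productivity that a stimulus-and-response account struggles to explain.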

However, children manage to walk effortlessly through this seemingly impassable obstacle to learning. The Lockean model of the mind presented a general-purpose learning device operating from general rules for associating one sense impression with another. This was in part an attempt to mimic the success of physics in reducing events to a small number of consistent laws. But in the acquisition of language, general learning principles such as induction – where one moves, for example, from having observed a number of yellow lemons, to the conclusion that all lemons are yellow – would be little help if children want to be fluent talkers before they end up in retirement homes. A child is simply not given enough material to work with in order to attain these results in such a short period. Children also learn to speak at roughly the same rate, despite wide differences in intelligence, which one would not expect if learning ability were dependent upon a more general intellectual faculty.

Humans attempting to survive over the aeons have needed a learning faculty that is itself capable of learning – that is to say, one that can become streamlined for the specific tasks they find most important. At a certain point, any learning device worthy of the name will sacrifice a degree of capacity for content, just as, for instance, a proportion of the CPU inside most modern desktop PCs is given over wholly to graphics, sacrificing a degree of versatility for a degree of specialization. Such changes in our learning apparatus are likely to reinforce the needs and behaviour that led to them. For example, all teeth – whether canine or molar – can be used to eat every foodstuff. But if you had only canine teeth there would be a certain disposition towards meat-eating – for the simple reason that it would take a lot of effort nibbling away at seeds with a mouth full of pointy fangs. If what was once general-purpose becomes specific-purpose enough, then it might as well be said to come with its content ready-determined. In our own case, this content is a comprehension of how human languages work. A similar understanding might be programmed into a computer or one day genetically modified into an animal, but for humans language is a free gift. It is also a substantive gift, for human grammar is not the only possible structure a language might possess. Computer languages can express the same propositions as English or French using quite different principles.

Despite the work of Chomsky, the battle between rationalism and empiricism should not be deemed a victory for the followers of Descartes. It would be more accurate to say that rationalism was killed stone dead before its organs were found to be of use to other patients. The dogmas of empiricism were not exposed by rationalist thought, but by a more careful examination of the facts of experience. While empiricism failed as metaphysics, as a methodology it led to all the fruits of modern science – including the new science of linguistics, founded by Chomsky.

The discipline’s first steps consisted in identifying abstract grammatical features common to all languages on Earth. Chomsky himself proposed that such linguistic universals were the result of a common genetic heritage. Alternatively, the common structures may be owed to a shared cultural heritage. If all humans today are descended from a small group, then all languages are probably descended from a single prehistoric tongue. Although that tongue has since changed so much that speakers of one branch cannot understand those of another, certain general features (perhaps those that allow translation to take place) have remained. Even so, humans in unusual circumstances – such as twins locked in the attic by crazed parents – have demonstrated an ability to create a language all by themselves. It is difficult enough for nurturists to explain language learning given the limited stimulus of normal child-rearing. A complete absence of stimulus would appear to settle the argument in Chomsky’s favour.

Given humanity’s evolutionary origin, some form of innate ideas could be expected on grounds of efficiency. The same reasoning also explains why not all useful beliefs are innate. Although innateness can confer an advantage in terms of energy saved and early deaths avoided, still more energy is saved by not storing a belief innately when that belief is easily available to any creature with basic powers of reasoning. Certain beliefs might come bundled with the faculty of believing, but it would be a waste of energy to make noun forms innate if any algorithm worth its genes would soon alight on such a convenient way of speaking. A child needs to learn the arbitrary vocabulary and minor grammar that his parents use because these are not the sorts of things that it makes sense to hardwire from birth. The more detailed needs of language often change rapidly to match environmental changes, and the slow-moving human genome could never keep pace with such demands.

However, in a more or less stable environment there is great survival value in an innate capacity or propensity. It would save a lot of time, for example, if you did not have to learn how to see the world in three dimensions. As Daniel Dennett remarks on the alternative: ‘Skinnerian conditioning is a fine capacity to have, so long as you are not killed by one of your early errors.’3 It is obviously far better to weed out life-threatening or inefficient behaviour before it has a chance to be tested, although if grammar has a genetic basis then presumably such behaviour has indeed been tried out and removed from the gene pool in generations past. As Descartes and the rationalists suspected, the most important beliefs are indeed innate – but they are the beliefs that matter for survival in this world, rather than the next. If the higher truths of metaphysics and moral values are not among their number, we must remember that these beliefs were put there not by gods passing their time in mathematics and logic, but by primates picking peanuts and avoiding tigers.