----------------

History of Humankind

----------------

Origin and Evolution of the Cosmos
Formation of Planet Earth and the Origin of Life
History of Humankind (Speciation)
History of Humankind (Domestication)
History of Humankind (Civilization)
History of Humankind (Industrialization)
History of Humankind (The 20th Century)


Introduction: The First Humans (The History of Mankind: The First Humans, 1993) D. Johanson, G. Burenhult
What is Humankind?: The Evolution of Human Behavior (The History of Mankind: The First Humans, 1993) R. Fletcher
Human Origins: Our Earliest Ancestors (The History of Mankind: The First Humans, 1993) C. Groves
Towards Homo Sapiens: Habilines, Erectines, and Neanderthals (The History of Mankind: The First Humans, 1993) C. Groves
Modern People in Africa and Europe (The History of Mankind: The First Humans, 1993) G. Burenhult
The Rise of Art (The History of Mankind: The First Humans, 1993) G. Burenhult
Spreading Throughout the Globe (The History of Mankind: The First Humans, 1993) G. Burenhult
The Settlement of Ancient Australia (The History of Mankind: The First Humans, 1993) J. Peter White
Modern People in the New World (The History of Mankind: The First Humans, 1993) G. Frison
Just Another Species of Big Mammal (The Third Chimpanzee, 2006) J. Diamond
A Tale of Three Chimps (The Third Chimpanzee, 2006) J. Diamond
The Great Leap Forward (The Third Chimpanzee, 2006) J. Diamond
An Animal with a Strange Life Cycle (The Third Chimpanzee, 2006) J. Diamond
The Evolution of Human Sexuality (The Third Chimpanzee, 2006) J. Diamond
The Science of Adultery (The Third Chimpanzee, 2006) J. Diamond
Sexual Selection, and the Origin of Human Races (The Third Chimpanzee, 2006) J. Diamond
Why Do We Grow Old and Die? (The Third Chimpanzee, 2006) J. Diamond
Uniquely Human (The Third Chimpanzee, 2006) J. Diamond
Bridges to Human Language (The Third Chimpanzee, 2006) J. Diamond
Animal Origins of Art (The Third Chimpanzee, 2006) J. Diamond
Agriculture's Mixed Blessings (The Third Chimpanzee, 2006) J. Diamond
Alone in a Crowded Universe (The Third Chimpanzee, 2006) J. Diamond
Nothing Learned, and Everything Forgotten? (The Third Chimpanzee, 2006) J. Diamond
Rethinking the Human Revolution (Rethinking the Human Revolution, 2007) P. Mellars
The Origin and Dispersal of Homo Sapiens (Rethinking the Human Revolution, 2007) C. Stringer
A Constructionist Approach to the Evolution of Human Mental Capacities (Rethinking the Human Revolution, 2007) K. Gibson
Did Working Memory Capacity Power the Evolution of Modern Thinking? (Rethinking the Human Revolution, 2007) T. Wynn, F. Coolidge
The Social Brain and the Cultural Explosion of the Human Revolution (Rethinking the Human Revolution, 2007) R. Dunbar
Did Syntax Trigger the Human Revolution? (Rethinking the Human Revolution, 2007) D. Bickerton
Music and the Origin of Modern Humans (Rethinking the Human Revolution, 2007) S. Mithen
Down With the Revolution (Rethinking the Human Revolution, 2007) S. McBrearty
Evidence for the Origin of Symbolic Behaviour In and Out of Africa (Rethinking the Human Revolution, 2007) F. D'Errico, M. Vanhaeren
What Makes Us Human? (What Makes Us Human?, 2007) W. Bodmer
Imitation Makes Us Human (What Makes Us Human?, 2007) S. Blackmore
Memory, Time, and Language (What Makes Us Human?, 2007) M. Corballis, T. Suddendorf
Why Are Humans Not Just Great Apes? (What Makes Us Human?, 2007) R. Dunbar
The Hominid That Talked (What Makes Us Human?, 2007) M. Gentilucci, M. Corballis
Curiosity and Quest (What Makes Us Human?, 2007) C. Pasternak
Human Evolution and the Human Condition (What Makes Us Human?, 2007) I. Tattersall
Deep Social Mind in the Evolution of Human Nature (What Makes Us Human?, 2007) A. Whiten
Causal Belief Makes Us Human (What Makes Us Human?, 2007) L. Wolpert
The Cooking Enigma (What Makes Us Human?, 2007) R. Wrangham
Language: A Darwinian Adaptation? (The Evolutionary Emergence of Language, 2000) C. Knight, M. Studdert-Kennedy, J. Hurford
The Evolution of Cooperative Communication (The Evolutionary Emergence of Language, 2000) C. Knight
The Emergence of Phonetic Structure (The Evolutionary Emergence of Language, 2000) M. Studdert-Kennedy
The Emergence of Syntax (The Evolutionary Emergence of Language, 2000) J. Hurford
The Evolution of Social Organization (The Evolution of Culture, 1999) C. Key, L. Aiello
Symbolism and the Supernatural (The Evolution of Culture, 1999) S. Mithen
The Evolution of Language and Languages (The Evolution of Culture, 1999) J. Hurford
The First True Humans (The Dawn of Human Culture, 2002) R. Klein, B. Edgar
Humanity Branches Out (The Dawn of Human Culture, 2002) R. Klein, B. Edgar
Nurture or Nature Before the Dawn? (The Dawn of Human Culture, 2002) R. Klein, B. Edgar
First Tool-Users and Makers (The First Africans, 2008) L. Barham, P. Mitchell
The Problem of Modern Human Origins (The Evolution of Modern Human Diversity, 1996) M. Mirazon Lahr
The Human Revolution (Origins and Revolutions, 2007) C. Gamble
The Birth of History (A Global Human History, 2003) S. Mithen
The World at 20,000 BC (A Global Human History, 2003) S. Mithen
'The Blessings of Civilisation' (A Global Human History, 2003) S. Mithen
What Are Human Beings? (Humans Before Humanity, 1995) R. Foley
When Did We Become Human? (Humans Before Humanity, 1995) R. Foley
Footprints On The Sands of Time (African Exodus: The Origins of Modern Humanity, 1997) C. Stringer, R. McKie
The Sorcerer (African Exodus: The Origins of Modern Humanity, 1997) C. Stringer, R. McKie
The First True Humans (Human Beginnings in South Africa, 1999) H.J. Deacon, J. Deacon
Emergence of Modern People (Human Beginnings in South Africa, 1999) H.J. Deacon, J. Deacon
Our Distant Past (The Dominant Animal, 2008) P. Ehrlich, A. Ehrlich
Of Genes and Culture (The Dominant Animal, 2008) P. Ehrlich, A. Ehrlich
An Evolutionary Approach to the Origin of Mind (Human Evolution, Language and Mind, 1996) W. Noble, I. Davidson
The Origin of Symbol-Making (Human Evolution, Language and Mind, 1996) W. Noble, I. Davidson
The Naked Ape (How Human History Is Revealed In Our Genes, 2004) J. Relethford
Do You Know Where Your Ancestors Are? (How Human History Is Revealed In Our Genes, 2004) J. Relethford
Introduction and Conclusion (The Emergence of Culture, 2006) P. Chase
How is Human Culture Different? (The Emergence of Culture, 2006) P. Chase
From Social Interaction to Social Institutions (Why We Cooperate, 2009) M. Tomasello
Culture Is Essential (Not By Genes Alone, 2005) P. Richerson, R. Boyd
Nothing About Culture Makes Sense Except in the Light of Evolution (Not By Genes Alone, 2005) P. Richerson, R. Boyd
The Neanderthal Within (The 10,000 Year Explosion, 2009) G. Cochran, H. Harpending
The Evolution of Certain Novel Human Capacities (The Descent of Mind, 1999) P. Bloom
Primate Social Instincts, Human Morality, and the Rise and Fall of 'Veneer Theory' (Primates and Philosophers, 2006) F. de Waal
Ethics and Evolution: How To Get Here From There (Primates and Philosophers, 2006) P. Kitcher
On The Origin of the Human Mind (Evolution and the Human Mind, 2000) R. Dunbar
Approaches to Modelling Early Human Minds (Modelling the Early Human Mind, 1996) P. Mellars, K. Gibson
The Sapient Behaviour Paradox (Modelling the Early Human Mind, 1996) C. Renfrew
Symbolism, Language, and the Neanderthal Mind (Modelling the Early Human Mind, 1996) P. Mellars
The Biocultural Human Brain, Seasonal Migrations, and the Emergence of the Upper Palaeolithic (Modelling the Early Human Mind, 1996) K. Gibson
Evolution of the Symbolic Self (Evolution of the Psyche, 1999) J. Skowronski, C. Sedikides
Consciousness in Evolution (A Mind So Rare: The Evolution of Human Consciousness, 2001) M. Donald
The Triumph of Consciousness (A Mind So Rare: The Evolution of Human Consciousness, 2001) M. Donald
The Search for a Mental Rubicon (The Evolution of Cognition, 2000) E. Macphail
Climate, Culture, and the Evolution of Cognition (The Evolution of Cognition, 2000) P. Richerson, R. Boyd
Cultural Cognition (The Cultural Origins of Human Cognition, 1999) M. Tomasello
How Humans Became Intelligent (The Thinking Ape: Evolutionary Origins of Intelligence, 1995) R. Byrne
The Evolutionary Roots of Higher Cognition (The Evolution of Mind, 1998) D. Dellarosa Cummins
Reconstructing the Evolution of the Human Mind (The Evolution of Mind, 2007) E. Smith
How the Evolution of the Human Mind Might Be Reconstructed (The Evolution of Mind, 2007) S. Mithen
What Nonhuman Primates Can and Can't Teach Us About the Evolution of Mind (The Evolution of Mind, 2007) C. Stanford
Chimpanzee and Human Intelligence: Life History, Diet, and the Mind (The Evolution of Mind, 2007) J. Lancaster, H. Kaplan
The Hominid Entry Into the Cognitive Niche (The Evolution of Mind, 2007) H. Barrett, L. Cosmides, J. Tooby
Key Changes in the Evolution of Human Psychology (The Evolution of Mind, 2007) S. Mithen
Evolution of the Social Brain (The Evolution of Mind, 2007) R. Dunbar
Brain Evolution (The Evolution of Mind, 2007) G. Miller
General Intellectual Ability (The Evolution of Mind, 2007) S. Mithen
Deep Roots of Kin: Developing the Evolutionary Perspective From Prehistory (Early Human Kinship, 2008) J. Gowlett
From Ape Gestures to Human Language (Origins of Human Communication, 2008) M. Tomasello
Language and Revolutionary Consciousness (The Transition to Language, 2000) C. Knight
A Putative Role For Language in the Origin of Human Consciousness (The Evolution of Human Language, 2010) I. Tattersall
On the Evolution of Human Language (The Evolution of Human Language, 2010) P. Bingham
An Acorn Grows to a Sapling (Adam's Tongue: How Humans Made Language, How Language Made Humans, 2009) D. Bickerton
The Social Brain and the Distributed Mind (Social Brain, Distributed Mind, 2010) R. Dunbar, C. Gamble, J. Gowlett
A Technological Fix for 'Dunbar's Dilemma'? (Social Brain, Distributed Mind, 2010) L. Barham
Enhanced Working Memory and the Evolution of Modern Thinking (The Rise of Homo Sapiens, 2009) F. Coolidge, T. Wynn
Anatomy, Behavior, and Modern Human Origins (The Human Career: Human Biological and Cultural Origins, 2009) R. Klein
The Morphological and Behavioural Origins of Modern Humans (The Speciation of Modern Homo Sapiens, 2002) C. Stringer


Introduction: The First Humans, D. Johanson, G. Burenhult

The study of paleoanthropology concerns the origins of humankind—a very personal investigation of how we came to be Homo sapiens. The broad outlines of the last four million years of human evolution are fairly well known to paleoanthropologists. Every year, with each new discovery, with each novel interpretation, we are slowly beginning to fill in the details. Because of the vagaries of the archaeological and paleontological record, however, vital clues may never be found. One overriding observation is certain: Africa has played a critical role at every stage of the human career.

Hominids arose in Africa from an as yet unidentified ape-like ancestor some time between 4 and 10 million years ago, during the Miocene period. By 4 million years ago, a primitive, but erect, walking hominid known as Australopithecus afarensis made an appearance in the geological record of East Africa. This species apparently gave rise to two distinct branches of hominid evolution. One branch consisted of robust vegetarians, which became extinct about a million years ago. The other lineage was characterized by increasing brain size; the first known species is called Homo habilis. Current evidence suggests that 1.5 million years ago, Homo erectus arose and became the first hominid to leave Africa and begin to populate Eurasia. Although still very controversial, recent work suggests that anatomically modern humans first appeared in Africa more than 100,000 years ago. Anthropologists cannot agree on how many branches sprouted from the "mother" of us all, "Lucy", but today only one hominid survives—ourselves.

It is often said that space is the last frontier, but I believe that time is also one of the last frontiers—the distant past when our ancestors initiated the long evolutionary journey that led to Homo sapiens. This was a tortuous and unpredictable path. There was no grand plan that ensured that modern humans would evolve. At any point along the way, as was the case for the robust vegetarians, our ancestors could have become extinct. So far, we have survived the evolutionary journey, and today we have become the introspective species, a species that has the capacity to plan for the future while retaining the curiosity to ponder its past. Let us hope that we will use the enlightenment of the past to carefully and thoughtfully prepare the road for the future.

In the last few decades, research into our prehistoric past has undergone explosive developments. The biological and cultural evolution of humankind is being seen in a new light. Traditional and stereotyped ideas about different "cultures" and lines of evolution, and one-sided explanations of changes resulting from the spread of people, ideas, and other cultural elements, have been replaced by a deeper understanding of the fact that human cultural manifestations are the result of regional adaptations to the surrounding environment and its resources, to the continuously changing ecology of planet Earth. New findings and a greater degree of interdisciplinary cooperation have provided a better insight into environmental development and into the social and economic conditions our ancestors would have experienced. New and more accurate dating methods have in many cases revolutionized our thinking about considerable parts of our prehistoric past.

Compared to the vast span of time that human evolution embraces, our own individual lifetime is little more than the twinkling of an eye. Tool-making humans have existed on Earth for more than two million years. As a comparison, historic time comprises only some five thousand years—that is, about 0.2 percent of the time that humans have existed. The heyday of the Vikings, for instance, is only 30 generations away. During this long period, humans have adapted to—and evolved in—a variety of different ecological systems: from tropical rainforests and deserts to high mountains and icy tundras. The highly specialized economy of the Paleolithic big-game hunters, the rich subsistence of Europe's Mesolithic hunter-gatherers, and the appearance of farming cultures are all examples of this process of adaptation. It was not until this century that, as a result of the industrialization process, the majority of the world's population was no longer actively involved in producing food. In just a few generations, this alienation from our natural surroundings has created a profound and alarming ignorance of the vital natural balance. Short-term economic interests and employment policies destroy irreplaceable parts of the ecosystem, and we are often overconfident of modern technology's ability to artificially replace this natural balance.

One of the primary aims of prehistorians must be to spread knowledge of how the conditions of human life have changed over the millennia as a result of our remarkable ability to adapt to our surroundings. A more profound knowledge of human evolution paves the way for a better understanding of humans' role in the ecosystem and can in this way help to arrest the overexploitation of natural resources and the destruction of the environment.

What is Humankind?: The Evolution of Human Behavior, R. Fletcher

Humans attach moral values to caring for each other, but morals are probably a very recent development. To transmit moral values, we need language, and there is no evidence that our current form of language is much more than 50,000 years old. We were becoming human long before that, and our humanness is founded in our distant past, not uniquely created by our most recent forms of behavior.

Over the past two to three million years, human cultural behavior has become considerably more complex. As our ability to make things, using progressively longer sequences of actions, has evolved, we have also developed a much greater range of distinctively human behavior. The capacity to retain information in our mind and to retrieve this information has increased enormously over this time, and at some point allowed us to consciously know who we are. Our evolving culture creates new opportunities but also complex problems. Creating tools requires the ability to remember actions. Humans have long made simple tools and continue to do so, but we have developed the ability to carry out the increasingly complex sequences of actions required to make more and more elaborate stone tools. Similarly, while we made only the simplest windbreaks a million or more years ago, by the end of the most recent Ice Age, about 15,000 years ago, we were able to build elaborate huts from hundreds of interlocked mammoth bones. Within the last half-million years, the action sequences became complex enough for us to make and sustain fires. By remembering actions, we began, within the last 100,000 years, to recall the movements and gestures associated with our dead, turning meaningless corpses into remembered relatives. And within the last 50,000 years, our capacity to make such associations and to retain mental images has led to the ability to represent the outside world and the content of our minds in art. What we must strive to understand is how long all these things took. Our evolution began very slowly, and the majority of the changes and elaborations we recognize as distinctly human are, in archaeological terms, very recent.

We did not have to walk upright in order to be able to make and use tools, but it did free the hands to carry and manipulate objects much more readily. Primates are very dexterous. They are also playful and inquisitive. The making and use of tools by early hominids is not in itself surprising, since our nearest primate relatives possess this ability. What is different with hominids is that they began to manipulate durable materials. We might expect this from a creature living in less wooded country and camping on open ground by streams and lakes. The search for food in shallow water, scrabbling among pebbles for lizards and insects, or pushing aside dead branches and bits of rock to define a camping space brought hominids into habitual contact with durable materials. It is this habitual engagement with durable materials that sets us apart. Inevitably, the camp sites became marked by fragments of stone and food debris as hominids began to use naturally fractured rock, and then started to smash rocks to obtain sharp pieces. Eventually, the ability to repeatedly produce tools of similar form evolved. The earliest known tools of this kind have been found at Hadar, in Ethiopia. But we should not assume that stone tools gave us an immediate adaptive advantage. For a million years, hominids were no more successful as a species than monkeys or apes had been.

This relationship with durable materials, whether in the form of tools or debris on camp sites, began a profound transformation of our behavior. Several new factors were introduced into our social life. Significantly, territorial control could be signaled by inanimate objects, such as abandoned camp sites. These not only indicated the location of hominid groups throughout the landscape, they also served as a warning to newcomers that they might be trespassing. An inanimate, durable, cultural geography appeared, signaling the way in which hominids were spread across the landscape even when the individuals were no longer present. While this signaling was not at first deliberate, it would gradually have become so under selective pressure. Here, too, our behavior sets us apart from the higher primates, who mark territory primarily by active confrontation between individuals.

We believe that language emerged when our ancestors realized that the sounds or signals they made were a means of referring to features of the environment. Once this happened, then and only then could these sounds or signals be used, altered, and multiplied to refer to more and more such features. In this way, early humans discovered the symbolic possibilities of gestures used for communication. As a result, their behavior became more complex. They could remark upon their current behavior, conceive of other ways of behaving, recite past events, and plan future ones—and were thus able to bring their environment, including their social environment, under increasing control.

Hominid evolution is said to have involved an increasing capacity for visual control of the arms and hands, and thus improvement in one-handed, aimed throwing. This, of course, would have enabled hominids to point in order to indicate features of the environment—such as prey or predators. In turn, this could have led to tracking the movements of animals, and characteristic features of the animal being tracked could also have been signaled by hand and arm movements tracing or mimicking the animal's gait or outline. These silent maneuvers would have communicated the necessary information to other group members without alerting the prey to the group's presence.

The next step in the emergence of language-like behavior could have occurred when, in the act of making such signs, our ancestors left marks in mud or sand. The marks would have become visible as objects in the external world. The gesture as a visible record could then be seen as an independent entity, conveying information by itself. When this happened, the way was open for signs, both visible and audible, to be seen and exploited as symbols. In Europe, evidence of such symbols is not found earlier than about 36,000 to 32,000 years ago, in the form of three-dimensional figures which were clearly created using common conventions of representation and reference. The arrival of humans in Australia, which has been reliably dated to at least 50,000 years ago, is our earliest evidence for the use of language. The people who came to Australia would have to have crossed open seas, and to do this they obviously had to build seagoing craft. To have conceived, planned, and carried out such activity without the use of language would clearly have been impossible.

Being able to remember and to predict actions would have led to the development of the characteristic human trait of persistence. Clearly, the more information a hominid could remember as a basis for its actions—whether in making tools, social competition, or the search for food—the more advantaged it would be. For instance, between one and two million years ago, we may have hunted, but we were certainly opportunistic scavengers. By 100,000 to 50,000 years ago, we were successfully hunting big game, perhaps even in a coordinated way. Modern humans continue to hunt even when the prey is out of sight for many hours. The more information we have stored in our brains, the more persistent we are likely to be in order to achieve our end.

Alone among the animals, humans are attracted to fire and can control it. But we should not suppose that savanna animals are unfamiliar with fire or always avoid it. Grassland fires are frequent and extensive. Numerous insects and small animals die. Scavengers move in behind the fires to obtain food, and early hominids would presumably have sought food in the same way, picking through the ashes and moving charred sticks to get at food. They might well have carried a smouldering stick as a tool. But actually maintaining fires and having the capacity to create fire require quite elaborate sequences of actions. Near the underground lake of Escale, in France, possible remains of fires have been found, along with the debris of human occupation, dating back as far as 700,000 years ago. Most evidence of hearths, however, is found within the last 300,000 years.

The control of fire has several consequences. Food can be cooked, and wooden tools shaped and hardened. Controlled burning of grassland helps to drive animals toward hunters. Accidental, repeated fires in woodland increase the extent of open pasture, encouraging larger populations of herbivores, such as deer. Fires also signal the presence of humans. When Captain James Cook first approached the eastern seaboard of Australia, he knew the country was inhabited because he could see plumes of smoke from numerous camp fires. At night especially, a fire can indicate where people are living. By using fire, humans inadvertently provided additional signals, visible over considerable distances, about their location in the landscape. Meetings no longer depended entirely on chance, and the social world became more complex. Just as stone tools gave humans a new means of signaling, so too did fire, adding to their capacity to predict and control their world.

The next major change in human behavior was the ability to recognize that the dead body was once human. Up to 100,000 years ago, there is no evidence that hominids perceived their dead any differently from other kinds of dead meat. Hominid bones have been found scattered and broken among the rubbish of camp sites. This is consistent with the way other primates neglect the dead. A chimpanzee mother will carry her dead baby around for a day or two, but becomes increasingly less aware of it. At first she cuddles the corpse, but she is then likely to carry it by one leg or drag it along, until eventually she puts it down and forgets it. The body no longer emits signals indicating that it is a chimpanzee, and the mother does not have the conceptual capacity to remember the actions once associated with her offspring. After about 100,000 years ago, the Neanderthals began to interact with their dead, which they treated in many different ways. But we should beware of inferring that they thought as we do. The Neanderthals may be fascinating precisely because they perceived their dead in ways entirely beyond our experience. What if memory span varied among Neanderthals, as intelligence does among modern humans? The Neanderthal bodies found at Krapina, in the Balkans, were elaborately defleshed and burned, yet in the cave of Hortus, in the Pyrenees, they were just dumped in with the rubbish. In the cave of La Chapelle-aux-Saints, in southern France, an old man was buried in a deep pit.

Aggression in humans, as in animals, serves several different purposes. It can lead to destructive behavior and cause enormous difficulties, but sometimes it has a positive aspect. For example, we overcome physical and mental problems—any goal-directed activities blocked by an obstacle—by taking an aggressive approach to them. This point needs to be emphasized, for aggression is often considered solely as a negative force. Some people have even proposed that children should be brought up in such a way that they would lack any aggressive tendency at all. This would do great harm to the individuals concerned, rendering them defenseless. Among other things, aggression is necessary if people are to rebel against injustice and dictatorship.... When hominids began to bury their dead, we have the first indication that they were able to connect past actions with an inert body. No great memory capacity was required, probably no more than a few weeks. The Neanderthals usually interred complete bodies, presumably within a few days of death. We should not conclude from this that these early burials are evidence of a belief in an afterlife. That surely requires the combination of a developed capacity for memory and the ability to envisage a future stretching beyond one's own lifetime.

Once humans could consciously link past actions with an object, they possessed the basic capacity needed for artistic behavior. Linking an observed object with remembered characteristics leads to the ability to recall the characteristics of a person or animal without direct observation. Instead of seeing an object and merely remembering the past, humans could remember versions of the past and represent them as objective shapes, or see an object and shortly afterwards recall it to produce an image. This was probably an unusual aptitude—even today many people cannot do it well. At first these shapes and images were vague, uncertain, or simple. There is much dispute as to what constitutes the earliest recognizable art, such as a few pieces of polished and scratched ivory from Tata, in Hungary, and some scratches on a bone from La Ferrassie, in southern France, dating back to 50,000 years ago. By 30,000 years ago, we find small carvings of horses and simple engraved shapes which have generally been interpreted as images of vulvas. Over the next 10,000 to 15,000 years, art became more elaborate, both in technique and content. Among the cave art discovered at Lascaux, also in southern France, there are even images of fictitious creatures combining features of several animals. As well as trying to understand what the art meant, we can ask what purpose it served. Just as fire signaled location over great distances, so art allowed detailed messages to be transmitted through time. We no longer needed to remember all we had learned—we simply needed to know where to find the required information in the material records we could create.

What our ancestors did two to three million years ago was to commit us to a complex relationship with artifacts made of hard materials. In a myriad ways, these objects created social stresses and new signals and have successively affected the way we interact with other human beings in our communities, across space and time. Not only did these artifacts help to shape our behavior, developing our ability to predict and persist, they eventually gave our finite brains the material means to store, organize, and analyze potentially unlimited knowledge.

The earliest fossil evidence of creatures directly ancestral to humans belongs to the genus Australopithecus, who lived in East and South Africa two to four million years ago. Although this continues to be debated, some scientists studying these fossil bones note a marked difference in the body size of males and females. An adult female individual (named "Lucy" by her discoverers) has been reconstructed as about 1.1 meters (3 feet, 7 inches) tall and some 27 kilograms (59 pounds) in weight. A presumed male individual was reconstructed to perhaps 1.6 meters (5 feet, 3 inches) tall and about 50 kilograms (110 pounds) in weight. On the basis of this evidence, it has been suggested that early in our evolutionary history the pair-bonded, monogamous nuclear family common to many modern human societies, and the sex roles associated with it, had not yet evolved. During subsequent human evolution, the fossil evidence shows a gradual reduction in dimorphism, so that by the time individuals who looked like modern humans appear on the scene, some 100,000 years ago, dimorphism is similar to that in living humans. None of this tells us precise details about sex roles; but if our understanding of skeletal and social relationships is accurate, it implies that male-female relationships were different in our earliest prehistory. What the situation was, we do not know.

Personal ornaments are found about 30,000 years ago in Europe: shells with holes drilled in them, probably to be strung as bracelets and body ornaments, found in the graves of both sexes. Not long after, also in Europe, are found some of the remarkable remains of the plastic art of the time: incised or, sometimes, sculptured images of women on bone and ivory, and clay sculptures such as the well-known Venus of Willendorf. While some images of males are found in the famous cave art paintings or engraved on rock faces, most images are of women with exaggerated anatomical features in the form of pronounced breasts and buttocks. Scholars debate the significance of these objects. Were they simply accurate images of local women? Did they represent fertility images? Or were they intentionally erotic? By the time agriculture appeared, some 10,000 years ago, humans were living in organized groups that we would not find unusual.

Lacking specific evidence, scholars speculate about the ways by which male-female sex roles evolved in our evolutionary history. These conjectures have often projected modern notions onto the past and have followed our own social trends. For example, many scholars believed until recently that the evolutionary history of human social groups involved male and female division of labor, whereby males hunted and brought back the spoils to so-called home places, where they shared this food with the females, who waited for them, nurturing and caring for infants and young. A modification of this view proposed that males and females initially had separate feeding strategies and that a monogamous pair bond was not only a way of sharing resources and maximizing the survival potential of each sex but also had the benefit of ensuring paternity.

The dietary preferences and concerns of many modern societies can be identified in these scenarios. By contrast, recent researchers have pointed to the importance of women in modern, egalitarian hunting and gathering societies and have suggested this as a model for our prehistoric past. Here, men and women have overlapping activities and spheres of influence, in which the basic food items (vegetables, seeds, nuts, insects, small animals) are supplied by women, with men supplying much less material of daily importance. Other contemporary accounts not only focus on gender roles but also look at aspects of sexual behavior and emotion. Recognizing nonmonogamous lifestyles and reconsidering the nature of male and female sexuality, some investigators have suggested multiple sexual partners for both sexes, in the form of mild polygyny or serial pair-bonding, as a pattern accompanying our evolutionary history. Thus, the concerns of modern American and European society as to the place of monogamy and the nuclear family, and division of labor in terms of power relations, sharing, and cooperation in daily life, have all contributed to reconstructions of prehistoric sex roles. The accuracy of these notions when applied to prehistory is unclear. It may be that instead of looking down the long corridor of the past, we have been looking in the mirror.

Human Origins: Our Earliest Ancestors, C. Groves

As early as 1863, in an essay entitled "Man's Place in Nature", Thomas Huxley had concluded that the African apes—the chimpanzee and the gorilla—are more closely related to us than is the orang-utan. In his Descent of Man, published in 1871, Darwin argued that if the African apes are indeed more closely related to us than the Asian ones, our own origins are likely to have been in Africa. Though a number of authorities have from time to time argued otherwise, since the 1940s the consensus has been that Huxley and Darwin were right: the chimpanzee and gorilla are closer to us than is the orang-utan, and it is in Africa, not in Asia or elsewhere, that we should look for remains of our earliest ancestors. Traditionally, apes have been classified as belonging to a zoological family, Pongidae, separate from our own family, Hominidae. Increasingly, however, specialists have been inclined to include the Great Apes as well among the Hominidae, putting the orang-utan in one subfamily and humans, chimpanzees, and gorillas in another.

By the early 1960s, analytical techniques were available that allowed us to compare different primate groups in terms of their biochemistry. Immunological techniques were used at first, but these were gradually superseded by the more sophisticated techniques of protein sequencing and, eventually, analysis of the DNA itself. The answer was always the same. Chimpanzees, gorillas, and humans are very closely related indeed, orang-utans are more distantly related to us, gibbons are further away, and monkeys still further off. There is still disagreement as to whether chimpanzees are closer to humans or to gorillas (or whether all three are equally closely related), but there is no longer any doubt about the closeness of all three.

Many now believe that, taken overall, changes in the protein structure and DNA of living organisms occur fairly regularly over long periods of time. If it is known how different two species are in terms of one of their proteins, or in parts of their genome (the complete genetic material for any cell, which determines heredity), it is possible to calculate how long ago they shared a common ancestor. This concept is known as the molecular clock, and although it does not keep perfect time, it does set limits—and it tells us that our evolutionary line must have separated from the chimpanzee's between about seven and five million years ago.
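As a rough numerical illustration of how the molecular clock works (the figures below are illustrative assumptions, not values given in the text): if each lineage accumulates substitutions at an average rate r per site per year, two lineages that separated T years ago are expected to differ at a proportion of sites D of roughly 2rT, so the divergence time can be estimated as T = D / (2r). Assuming, for example, a pairwise sequence divergence of D = 0.012 (1.2 percent of sites) and a rate of r = 10^-9 substitutions per site per year, the estimate would be T = 0.012 / (2 x 10^-9) = 6 million years, which falls within the five-to-seven-million-year window cited above.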

The taxonomic group that includes humans and apes—the superfamily known as Hominoidea, or the hominoids—was established by about 20 million years ago. In the Early Miocene period (19 to 18 million years ago), there were at least 10 different species of apes in East Africa, large and small. The best-known belong to the genus Proconsul (named after a popular zoo chimpanzee of the 1890s called Consul), discovered in 1933 and now known from a nearly complete skeleton (and several partial ones) and dozens of jaws, teeth, and skull fragments. Studies of the remains have shown that Proconsul lived in trees, walked on all fours, lived on fruit, was probably tailless, and had large canine teeth. One species was smaller than a modern chimpanzee, another nearly as big as a gorilla.

Many authorities considered Proconsul a good candidate for the common ancestor of the Hominidae, the family comprising humans, chimpanzees, gorillas, and orang-utans (but not gibbons, whose evolutionary line was already separate). Others were unsure, pointing to features of the teeth, the jaw, and the limb bones that were not what we would expect to see in such a common ancestor. In the mid-1980s, a new fossil ape was discovered, contemporary with Proconsul. Named Afropithecus, it is still less well known than Proconsul but seems much more like what we would expect the common ancestor to have looked like. In a sense, it is a fossil that had to be invented before it was discovered. Another large ape, Kenyapithecus, has been identified from the Middle Miocene period (14 million years ago). Like its probable ancestor, Afropithecus, it had large canine teeth, but the face was shorter, and in other respects, too, it was more "advanced" in evolutionary terms towards the living Hominidae, whose last common ancestor it may well have been.

From about four million years ago, it is as if a curtain has suddenly been lifted. Instead of a few frustrating scraps of bone, we are confronted with an abundance of fossils. Key sites are Laetoli, in Tanzania, dating to between 3.75 and 3.5 million years ago; Hadar, in Ethiopia, dating to between 3.3 and 2.9 million years ago; and two sites in South Africa, Sterkfontein and Makapansgat, both between 3 and 2.5 million years old. The fossils found at these sites belong to the genus Australopithecus (meaning "southern ape"). Like apes, they had a small cranial capacity and a protruding jaw (a feature known as prognathism), but their canine teeth were much shorter and they walked upright. The first specimen was an infant, discovered by Raymond Dart in 1924 at Taung, in Cape Province, South Africa. Robert Broom discovered the rich site of Sterkfontein, while Dart himself excavated Makapansgat, and Mary Leakey, Tim White, Don Johanson, and others were involved in the discoveries further north. The earliest of these fossils, from Laetoli, have been given the name Australopithecus afarensis. They consist of the jaws and teeth of some 24 individuals, the partial skeleton of a juvenile, and some fossil footprints. The jaws show canine teeth much smaller than those of apes, but rather larger and more pointed than our own, and the dental arcades are neither parabolic like those of modern humans nor rectangular like those of apes. The footprints are contentious, but they seem, on most assessments, to indicate creatures that walked on two legs, but had slightly divergent great toes and lateral toes that are long relative to humans but somewhat shortened relative to apes.

After Australopithecus africanus disappears from the archaeological record about 2.5 million years ago, there is a gap in the record of half a million years broken only by a few rather uninformative scraps of fossils recovered from deposits in the Omo Valley of southern Ethiopia. The next important fossils we have date from two million years ago and come mainly from two Rift Valley sites: Olduvai Gorge, in Tanzania, and Koobi Fora, in Kenya. The thick deposits accumulated at these two sites in the course of at least a million years have yielded abundant remains of skulls, jaws, teeth, and parts of skeletons, giving us a picture of the changes that occurred over this period. Unexpectedly, the picture is one of diversity. At both sites, two different prehuman species lived side by side from at least 2 to about 1.5 million years ago, and at the lower levels of Koobi Fora (known as the Upper Burgi Member), there is a third contemporary species as well.

The two Olduvai species are quite distinct. There is a small, lightly built one and a larger one with enormous premolar and molar teeth. The small one has a higher, more rounded braincase, with an average cranial capacity of 650 cubic centimeters (40 cubic inches), ranging in four specimens from 590 to about 700 cubic centimeters (36 to 43 cubic inches); a lightly built face with smaller, narrower cheekteeth; and the beginnings of a protruding nose. The large one has a smaller braincase—the average cranial capacity is 515 cubic centimeters (31 cubic inches), ranging in five specimens from 500 to 530 cubic centimeters (30 to 32 cubic inches)—and a foreshortened but heavily buttressed face, with tiny front teeth but huge cheekteeth and enormously developed chewing muscles, commonly giving rise to a crest on top of the head where they attached (known as the sagittal crest). Both walked upright, with the foramen magnum even further forward than in Australopithecus—as far forward as in modern humans—but both still had short legs and long arms. There is no doubt that the small, lightly built one is Homo. In every respect it is more "modern" than Australopithecus, more like ourselves. The very earliest specimen of Homo is a scrap of skull from Chemeron, near Lake Baringo, dated as being 2.5 million years old. The Olduvai species is known as Homo habilis. The large, robustly built type has traditionally been considered to be a late survival of Australopithecus, but most authorities now recognize it as something rather different and call it Paranthropus (sometimes, affectionately, "the Nutcracker"). The Olduvai species is called Paranthropus boisei... There seems little doubt that the habilines made simple stone tools. The earliest stone tools were found at Hadar and are 2.6 million years old. From two million years ago, artifacts are found in the archaeological record in their thousands, and wherever we find traces of their makers, they are members of the genus Homo—beginning with the habilines.

Over the course of the last two million years, the hominid brain has become bigger. Although the brain does not fossilize, it leaves some blurred indications of the convolutions of the cortex on the inside of the skull. By making a latex endocast, it is possible to study something of the shape of the cortex. It has been suggested that the cortex of australopithecines was very similar to that of chimpanzees, while the earliest habiline skull, known as KNM ER-1470, which is at least 1.8 million years old, already shows signs of some of the distinctive features of the human brain, particularly in the regions said to be associated with speech. But this theory does not attempt to explain how or why spoken language became distinct from the noises made by other apes. The suggestion is simply that language appeared as a result of the human brain becoming bigger. A second theory that has been put forward is that language was necessary for early hominids to have organized their actions sufficiently to make stone tools. But others have pointed out that the earliest tools that have been found, from the Oldowan period, did not require any more organization or technical skill than tools made by modern chimpanzees.

Shelter, the use of fire, and meat-eating are often considered fundamental to early hominids' ability to move out of Africa and successfully colonize new territory in more temperate and seasonal parts of the world. But much of the claimed evidence for these aspects of behavior has recently been brought into question. For instance, a stone circle dated to about 1.8 million years ago at Olduvai Gorge, in present-day Tanzania, said to be the remains of a human shelter, was in an area where crocodiles would most likely have eaten any hominids who rested there. Similarly, claims for the existence of a bough hut at Terra Amata, in southern France, 230,000 years ago rest on the evidence of nothing more than four stains in the sand. And although evidence of fire has been claimed from sites such as Chesowanja, dating back to 1.4 million years ago, and Zhoukoudian, dating back to 500,000 years ago, a recent assessment suggests that none of the claims earlier than Terra Amata is reliable—and even 230,000 years ago, it is doubtful whether hominids could regularly make fire. Furthermore, although meat has probably been an important part of the hominid diet since the genus Homo began to emerge, it is not clear how early hominids obtained meat. Early sites such as Torralba and Ambrona, in Spain, where large deposits of animal bones have been found, seem more likely to have been scavenging areas than places where hunted animals were butchered. There seems to be no good evidence that hominids built shelters, regularly made and used fire, or hunted systematically earlier than 125,000 years ago.

It seems, then, that language is not necessary to account for a number of early features of the archaeological record. It is necessary, however, to account for events that occurred around the world from about 60,000 years ago: the colonization of Australia and later of the Arctic and the Americas; the beginnings of art; the fact that ritual and convention became regionalized and localized; the beginnings of gender roles and power structure; and the start of agriculture. More than this we cannot say at present.

About 1.6 million years ago, a new species appeared in East Africa. The first specimen discovered was a complete skull, ER-3733, from Koobi Fora (KBS Member). Other, less complete skulls, as well as other bones, have been found there since. In the mid-1980s, a nearly complete skeleton, WT-15000, was discovered at Nariokotome, on the other side of Lake Turkana from Koobi Fora. So we now know a good deal about this new species, which for the moment we can call simply the Turkana Newcomer. Bernard Wood has proposed that the species should be called Homo ergaster, and this is probably correct.... They had projecting brow ridges, a short face, a rather angular skull, and the merest beginnings of a projecting nose. They also had long legs and a much more modern skeleton than the australopithecines or habilines. The skeleton known as WT-15000 was that of a boy about 12 years old. If he had survived into adulthood, he would have been 180 centimeters (6 feet) tall. Very clearly, the Turkana Newcomers were directly ancestral to later members of the human stock. Equally clearly, they were descended from the habilines—in fact, there is one habiline specimen, ER-1805, that some authorities prefer to place along with the Newcomers. These people made stone tools, at first not very different from those made by Homo habilis. Did they make fire, hunt big game, speak? We do not know—the evidence is equivocal. What they did do is replace the habilines. In some way, they were just that much better at—what? At being human, or "nearly human", we suppose.

The earliest traces of humans found outside Africa appear a little more than a million years ago. The best-known fossils of this period belong to a species called Homo erectus. In Java, the earliest specimens are about a million years old, the youngest only 100,000 years old. In China, they range from at least 800,000 to 230,000 years old. Like the Turkana Newcomers, Homo erectus had large brow ridges, but these are different in form: straight and thick, flaring out to the sides. The cranial capacity is larger, ranging from 750 to 1300 cubic centimeters (46 to 79 cubic inches), with some evidence from both Java and China that it increased over time. The braincase was low, flat, and angular, with thickened bone along the midline and at the back. There are some differences between fossils found in Java and China: the Java skulls have a flat, receding forehead, while the Chinese skulls have a convex forehead, and there are other slight differences. They are generally considered to be two different subspecies: Homo erectus erectus (Java) and Homo erectus pekinensis (China). The forehead shape of the Java fossils is the more primitive type, and the earliest of the China fossils, from Gongwangling, is, in fact, similar to the Java type. The earliest subspecies of all was excavated from levels at Olduvai dating to about 1.2 million years ago, and this primitive race, Homo erectus olduvaiensis, is held by some to be the only record of Homo erectus in Africa. If so, it evolved in Africa, then migrated elsewhere, and died out in its homeland. Others consider the Turkana Newcomers to be early representatives of Homo erectus, and others again include later African fossils in the same species.

The earliest representatives of our own species, Homo sapiens, are known from two sites in Israel. Fossils found at Qafzeh have been dated by the thermoluminescence technique to 91,000 years ago, although a technique known as electron spin resonance analysis (ESR) suggests an even earlier date. Those found at Skhul are dated by ESR to 80,000 years ago. However, two sites in South Africa, Border Cave and Klasies River Mouth, may be equally old. Like modern humans, they have a high, rounded, shortened braincase, a rounded forehead, and a straight face with a chin. The brow ridges are smaller than in more primitive species, and the limb bones are long and straight.

If Homo sapiens evolved in Africa between 130,000 and 120,000 years ago, they had probably begun to spread out into Eurasia by about 90,000 years ago, or a little earlier. By 68,000 years ago, our species was in China. By 50,000 years ago, they were in Australia (which they had to reach by crossing open water, as Australia was never connected to Asia by dry land). And by 36,000 years ago, they were in western Europe, where we know them as the Cro-Magnons. It seems, however, that they did not reach the Americas until 15,000 to 12,000 years ago, although there is much controversy about this. If the regional continuity model, rather than the replacement model, is correct, then these dates simply record when modern humans evolved independently in different areas. Wherever Homo sapiens were found—in Africa, Europe, East or Southeast Asia, or Australia—the earliest people tended to resemble present-day peoples of the same region, but with one difference: they were bigger and more "robust". At the end of the Pleistocene period, people everywhere rapidly became slightly smaller-boned, with smaller teeth. This is puzzling. It was at one time suggested that once people began to practice agriculture, they did not need such big teeth, but the same development took place even in people who remained hunter-gatherers, as in Australia. Perhaps, as the climate became warmer, more succulent foods became available, and it was simply easier to exist with smaller teeth and less chewing effort. The changes were small, but we simply do not know why they occurred.

The Great Apes are not only closely related to us anatomically, they also have very similar biochemistry to ours. A study carried out in the 1970s showed that humans and chimpanzees have nearly 99 percent of their DNA (the material of heredity) in common—a pretty amazing statistic. Given that they are so similar to us in terms of both anatomy and genetics, might we not expect them to be similar psychologically as well—particularly in terms of those features we think of as being uniquely human, such as tool-making, intelligence, self-awareness, and even language? Should we not expect to find these qualities at least in a rudimentary form?

The use of stone tools has characterized human (or, at first, protohuman) activity from 2.6 million years ago. It has been known for a long time that Great Apes in zoos and laboratories show a certain inventive flair in regard to mechanical aids. During the First World War, Wolfgang Koehler found that the chimpanzees in his laboratory on the Canary Islands could not only use sticks to get food that was out of reach, but could join different-sized sticks together, and pile boxes on top of each other, to reach food that was high up. Though gorillas are less dexterous, some orang-utans have developed extraordinary toolmaking skills. In the London Zoo, an orang-utan manufactured a wooden replica of the key to its cage and let itself out. Another, in the 1960s, was shown how to work stone, and made itself a sharp-edged flake to cut the string around a box containing food. In the 1970s, the intriguing discovery was made that chimpanzees learn to recognize themselves in mirrors. Monkeys, in contrast (like dogs and even elephants), react to their reflection as if it were another individual, even though they can come to understand the general concept of a mirror and use it to find hidden objects, as well as recognizing cage mates in it. Like chimpanzees, orang-utans and gorillas can also learn to recognize their own reflection. Does this mean that, like humans and unlike other animals, the Great Apes have a concept of self?

All this work on the mentality and intelligence of apes reminds us that we, as members of the human species, are part of the natural world. Even our special abilities (those we think of as being uniquely human) are not qualitatively but quantitatively different from those of our nearest nonhuman relatives. When we start speculating on the origin of various characteristically human forms of behavior, we have to remember that we did not evolve directly from animals that acted purely by instinct and lacked all traces of a humanlike intellect.

Did life on the savanna become so complex that we developed big brains to cope with it? Did early humans' way of life—the cooperation needed to hunt big game, or the need to outsmart lions to scavenge their prey, or the need to calculate where the most productive plants were likely to be ripening, or the requirements of food sharing, or the need to make tools—require us to have greater intelligence? Before speculating on such things, it is as well to recall that the Great Apes are already more intelligent than other primates, including gibbons and monkeys, and to ask ourselves why this should be so. Chimpanzees and orang-utans, and some populations of gorillas, live on fruit, and because they are all very large, they have an energy conservation problem. They certainly make calculations, both about the likelihood of fruiting in particular parts of the forest and about each other's motives. Their high intelligence also seems to enable them to be physically lazy. Is this what brainpower is really all about?

Perhaps, then, the question is not why are we so intelligent, but what is it that apes do that our ancestors did more of? In addition, there is certainly a great deal of serendipity involved, different aspects of our ancestors' anatomy and psychology seeming to pre-adapt us for full humanity: neoteny, the evolutionary process by which the head remains juvenile in appearance (small jaws, large brain) but continues to grow; upright posture, which frees the hands for tool use; head balance, with the larynx repositioned as if ready for articulate speech; a mobile shoulder, with the arm already adapted for throwing; and intelligence and sociability, through which social traditions develop into culture. Humans could not have evolved from any creatures other than apes.

Towards Homo Sapiens: Habilines, Erectines, and Neanderthals, C. Groves

A number of anatomical changes occurred during the period that preceded the erectines—that is, about 1.5 million years ago. Brain size increased, hips and thigh bones became more and more adapted to bipedalism, and there was a reduction in sexual dimorphism—that is, size difference due to sex. The oldest fossils of Homo have a brain size of little more than 500 cubic centimeters (30 cubic inches), but apart from that the difference between the new genus and Australopithecus was not particularly striking. They all grew to roughly the same height, 1 to 1.3 meters (3 to 4 feet), and weighed 40 kilograms (88 pounds) on average. All of them were bipedal and thus moved freely on two legs. Early Homo had a slightly more rounded skull and were probably less ape-like than other hominids. The greatest anatomical difference was the appearance of the teeth, especially the reduced premolar and molar width, but the signs of wear on preserved teeth show that all species fed mainly on seeds and plants, especially fruits. Moreover, anatomical studies indicate that early Homo probably spent a great deal of time in the trees and for this reason was less "human" than previously assumed. It has turned out that the greatest difference was the mental capacity. Habilines were the first hominids to make stone tools.

The first tool-making technique—which, as far as we know at present, is entirely linked to early Homo—existed between 2.5 and 1.5 million years ago and is distinguished by the use of pebbles from riverbeds as raw material. By means of another, smaller stone, flakes were struck off from both sides of the core. This bifacial flaking procedure is usually called the chopping-tool technique, and the resulting industry was named the Oldowan after the site of its first discovery—Olduvai. Even though this technique has sometimes been considered simple, it nevertheless reveals a sound knowledge of the nature of the raw material, of how to strike the stone to get a suitable flake, and, not least, of the final result to be achieved after a long series of strokes in a given succession.

Today, most experts agree that human evolution resulted from the same sorts of pressures as the evolution of other animal species, and very often it is obvious that these processes of evolution occurred at the same time. Clearly, global climatic alterations, and the ecological changes that followed, played a crucial part in these processes. About five million years ago, the Antarctic ice sheet started to grow substantially, whereas the corresponding glacial period of the Arctic did not begin until about 2.5 million years ago. During these two Ice Ages, the average temperature on Earth dropped markedly. In Africa, as in other parts of the world, this meant great changes in both flora and fauna. Vast tropical rainforest regions disappeared and were replaced by savanna, and parts of the fauna became extinct or changed through adaptation to the new environment. These great ecological changes can be traced back to both of the glacial periods. The first one resulted in the development of the australopithecines—perhaps the separation of the human line itself—and it was surely not an accidental occurrence that the latter Ice Age coincided with the appearance of the genus Homo and the rise of tool use.

As we have seen, a number of different protohuman species lived side by side during this period, but, as far as we know at present, australopithecines never manufactured or used stone tools. While australopithecines in the course of time became extinct, the Homo groups survived and evolved into modern humans. But what was the biological difference between nonhuman hominids and early humans? This is a controversial issue, but one of the basic differences is human females' total lack of estrus periods—that is, mating seasons. They are, unlike many other mammals, always sexually receptive, almost independently of the menstrual cycle, although chimpanzees, especially pygmy chimpanzees (bonobos), also have very little sexual cyclicity. This evolution of human sexuality can perhaps be linked to a gradual reduction of body hair, resulting in increased skin sensitivity and a strengthening of female sexual signals. For example, the growth of the breasts is not necessary for the production of mother's milk or for breastfeeding, but is instead related to visual sexual stimulation of males. A change in food composition, with a changeover to a diet consisting of more meat, together with accompanying changes in social organization, has been suggested as one reason for this evolution.

It seems clear, then, that hunting behavior, the lack of mating seasons, the distribution of food within the group, and family structure are factors that are intimately associated with each other and that probably were of crucial importance in the subsequent evolution of humans. An increasingly marked disposition toward living in couples, or perhaps small polygamous groups, which also created a basis for a more rigid division of work between the sexes, may have helped to reduce conflict within groups... The use of stone tools made possible the exploitation of foodstuffs previously inaccessible to hominids. When processing meat, entrails, and hides, sharp-edged flakes were a great advantage, especially in competition with predators such as hyenas and lions. Large amounts of meat could be cut loose from a dead animal in a short time, something that would have been impossible if only hands, teeth, and wooden objects were used. But detailed knowledge of the eating habits of early Homo is still very limited.

The period between 2.5 and 1.5 million years ago was a crucial and formative phase in the evolution of humans—mentally, technologically, and economically, as well as socially. The pressure from ecological competition reinforced early human characteristics in early Homo and at the same time led to the extinction of other protohominid species. The number of hominids in central East Africa may have been equivalent to the number of baboons living in the same region today—in other words, a very large number of individuals in mutual competition. The ever-increasing brain size meant that infants were born with brains considerably smaller than those of adults, which facilitated childbirth. This in turn resulted in a considerable prolongation of the period in which children were dependent on their mothers, which involved important changes in the social organization and the division of work between the sexes. Habilines probably lived in small groups or bands, much like present-day hunter-gatherers, but their social organization was more similar to that of chimpanzees. Only with the appearance of Homo erectus some 1.6 million years ago did a more human social structure develop.

When the erectines came on the scene, entirely new characters appeared in human evolution, with abilities and driving forces that made our ancestors spread outside Africa for the first time. This demanded totally different ways of ecological adaptation. The cold climate and trying environment further north meant that humans had to use fire and wear well-adapted clothing to be able to keep warm during the winter. Above all, the migration of Homo erectus into northern regions shows that people were now able to adjust to considerably harsher ecological situations, where the supply of food varied markedly during the different seasons and where hunting became increasingly important, especially during the winter. Many of the edible plants withered in the autumn, and it was necessary to store nonperishable foodstuffs such as nuts, bulbs, and tubers. Physically, the erectines were more similar to modern humans than to habilines. The greatest difference was probably the shape of the head and face, which still had strikingly primitive features—a sloping forehead, very heavy brow ridges, and a receding chin. The muscles at the nape of the neck were extremely well developed. Brain size increased over time from 775 to 1,300 cubic centimeters (47 to 79 cubic inches), which on average is equivalent to about 70 percent of that of modern humans. Fully adapted to an upright gait and equipped with a muscular and stocky body of between 1.5 and 1.8 meters (about 5 and 6 feet) in height, Homo erectus must have given the impression of being very strong and powerful.

Today, most experts agree that the erectines slowly evolved from the habilines in central East Africa, from where they spread north across the Old World. The oldest fossils have been found in East Turkana, in Kenya—sometimes referred to as the "Turkana Newcomers"—and date back some 1.6 million years. A million years later, Homo erectus and its sister species, Homo heidelbergensis, occupied all of Eurasia, from the Atlantic coast in the west to China and Java in the east. It was never really a matter of migration. Hunter-gatherers move across vast areas in search of food, and an increasing population meant that groups split up and new territories were occupied. At a pace of 20 kilometers (12 miles) per generation, a distance of 14,000 kilometers (9,000 miles), or roughly the distance between Nairobi and Beijing, was covered in 20,000 years. Even with much shorter movements, this natural, successive spread was enough for the erectines to occupy these vast areas in just a few hundred thousand years. As colder and darker regions of Europe and Asia became populated, skin color became lighter to allow the rays of the sun to penetrate the skin to produce vitamin D, and the protecting fat layer, as well as the sweat glands, adapted to the new climatic situation. The big question is why these groups of people were forced to leave the always well-laid African table.
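The quoted pace works out neatly if one generation is taken to be a little under 30 years. The short sketch below is only a back-of-the-envelope check of that arithmetic; the generation length used is an assumption for illustration, not a figure given in the text.

```python
# Back-of-the-envelope check of the dispersal pace quoted above.
# The 20 km per generation and 14,000 km figures come from the passage;
# the generation length is an assumed value, not stated in the text.

distance_km = 14_000          # roughly Nairobi to Beijing
pace_km_per_generation = 20   # dispersal pace per generation, as quoted
years_per_generation = 28.5   # assumed generation length (illustrative)

generations = distance_km / pace_km_per_generation   # 700 generations
years = generations * years_per_generation           # about 20,000 years

print(f"{generations:.0f} generations, roughly {years:,.0f} years")
```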

As we have seen, Homo itself, and, later, the erectines, evolved on the savannas of tropical Africa. The great climatic fluctuations that prevailed between five and one million years ago intensified about 900,000 years ago, and the global climate was influenced by glacial periods alternating with warmer interglacials. Consequently, the African vegetation was characterized by savanna alternating with rainforest. To be able to survive these climatic upheavals, humans had to adapt in different ways, either by moving or by occupying new climatic zones. The latter implied, among other things, the ability to alternate vegetable foodstuffs with a meat diet. Obviously, the Sahara Desert played an important role in this process. During periods of higher rainfall, populations from the south entered the virgin soils in the north, and during drier periods they were forced to leave. In some cases, the retreat southwards may have been cut off, and for that reason a northward expansion toward the Mediterranean coast and southwestern Asia was necessary. A marked increase in the number of big land animals demonstrably took place in Europe about 700,000 years ago, when elephants, hoofed animals, hippopotamuses, and a series of predators such as lions and leopards migrated north from Africa. It is probable that the causes that lie behind these migrations also lie behind the contemporaneous appearance of humans outside Africa.

To sum up, the erectines—of which Homo erectus is the best known species—first appeared in Africa, and a number of finds from Lake Turkana, Chesowanja, and Olduvai date back to between 1.6 and one million years ago. The Asian finds, on the other hand, are all of a later date. Ban Mae Tha, in Thailand, is one of the oldest known sites in Southeast Asia, at an age of 700,000 years, whereas the erectus finds from Zhoukoudian, in China, date back to between 460,000 and 230,000 years ago. Other Chinese finds from Lantian, Jenjiawo, and Gongwangling have proved to be somewhat older than the earliest layers of Zhoukoudian, and have been dated to 600,000 years ago. For the Java humans, there are potassium-argon dates of between 900,000 and 600,000 years ago, and these are supported by fission-track dates which go back to a little over a million years ago. Erectine groups, it seems, entered Europe at roughly the same time as they entered Asia. The oldest date in Southwest Asia has been obtained at Ubeidiya, in modern-day Israel, at an age of 700,000 years, although there are no diagnostic human remains from there, while the oldest known find in western Europe has been uncovered in Italy, at Isernia La Pineta, southeast of Rome; stone tool finds show that humans lived there some 730,000 years ago. However, on the basis of a recent find of a lower jaw beneath the city of Dmanisi, southwest of Tbilisi, in the former Soviet republic of Georgia, it has been claimed that humans had already spread outside Africa 1.8 million years ago.

The period between 300,000 and 40,000 years ago was an important transitional period between erectine and sapient stages and was characterized by a series of physical and technological changes. Brain size increased from 1,100 to about 1,400 cubic centimeters (67 to 85 cubic inches), and at the same time, face and bodily constitution more and more resembled those of modern humans. The tool technology was refined, and during the Neanderthal era the first signs of ritual life and religious beliefs appeared. In addition to the handaxe technology of the Acheulean tradition, a distinctive flake technology arose, which is named the Clactonian, after the site of its first discovery—Clacton-on-Sea, east of London. The finds of fossil humans from this important transitional period are still few, but in Europe some remains have been found that seem to document the emergence of Neanderthal features. A young adult woman found at Swanscombe, in England, lived about 225,000 years ago and had a brain volume of 1,325 cubic centimeters (80 cubic inches). Another woman, found at Steinheim, in Germany, is slightly older. However, the most important finds from this period so far discovered are from Arago, in the French Pyrenees, and date back at least 200,000 years. Undoubtedly, these were precursors of the classic European Neanderthals, who appeared for the first time about 130,000 years ago. Similar fossils have been uncovered at Bilzingsleben, in eastern Germany (more than 300,000 years old), and at Petralona, close to Thessaloniki, in Greece. The finds from Swanscombe, Steinheim, and Arago show that people with an almost modern brain capacity lived in Europe and probably also in large parts of Asia and Africa between 300,000 and 200,000 years ago. In South and East Asia, populations of the same type replaced Homo erectus at about the same time: remains from Dali and Jinniushan, in China, and Hathnora, in India, are clearly like Steinheim or Petralona. But Homo erectus lingered on in Southeast Asia until 100,000 years ago, while in Europe and West Asia the descendants of these populations, the enigmatic Neanderthals, were just round the corner.

For a long time it has been clear that the physical appearance of European Neanderthals in particular strongly differed from that of anatomically modern humans. Their brain was actually larger than ours on average. They had a considerably bigger face, with a heavy brow ridge and a remarkably robust nose. Their lower jaw was massive, and they had a receding chin. The teeth, too, were considerably larger and were placed in a U-shaped curve, not in a parabola shape, like ours. Their head was supported by short and very robust muscles at the nape of the neck. The Neanderthals reached only about 1.6 meters (5 feet, 3 inches) in height, but were extremely muscular. In spite of the physical differences, modern humans were for a long period of time considered to be lineal descendants of the Neanderthals. Only with the work of French paleontologist Marcellin Boule at the beginning of the twentieth century was it suggested that these differences were too great for the Neanderthals to be the ancestors of modern humans.

The Neanderthals evolved between 200,000 and 100,000 years ago and are usually associated with the so-called Mousterian culture, named after a site at Le Moustier, in the Dordogne, in France. Since anatomically modern humans made their appearance in Europe about 40,000 years ago, there clearly was no time for a transition from Homo neanderthalensis to Homo sapiens sapiens. Boule, instead, suggested that the Neanderthals became extinct during the last Ice Age and were replaced by the new immigrants. We now know that the last of the Neanderthals lingered on until 35,000 years ago, so that there was a brief period of coexistence.

However, a series of new discoveries led to conclusive reinterpretations of the relationship between Neanderthals and Homo sapiens sapiens. On Mount Carmel, close to Haifa, in Israel, and situated on the Mediterranean coast, several caves have been known for some time to contain important finds of fossil humans—for example, Mugharet es-Skhul, Mugharet et-Tabun, Kebara, and Jebel Qafzeh. At Qafzeh, the remains of an early form of modern humans have been found, people who lived in the area some 92,000 years ago, whereas Tabun contained the remains of a Neanderthal form from about 120,000 years ago, and Neanderthals at Kebara lived only about 60,000 years ago—in other words, the Neanderthal dates bracket the period of the modern people at Qafzeh. In Skhul, remains about 80,000 years old have been found. All of these Levantine humans can be linked to the Mousterian tradition of the Middle Paleolithic, even those with modern traits.

If Neanderthals and people of modern type actually coexisted in the Middle East for some 60,000 years, and overlapped for about 5,000 years in Europe, then, obviously, one cannot be descended from the other. Most likely they both evolved from Homo heidelbergensis: the Neanderthals in the temperate zone of Europe and/or the Middle East, from ancestors such as Petralona or Arago; Homo sapiens in Africa, from precursors such as Kabwe or Bodo. In fact, a whole series of transitional remains from Homo heidelbergensis to Homo sapiens is now known, dating to approximately 130,000 to 120,000 years ago: Omo, Ngaloba, Jebel Irhoud, and Eliye Springs. The earliest representatives of our species emerged from Africa about 100,000 years ago and coexisted with Neanderthals for many millennia, until something—perhaps the chance invention of the Upper Paleolithic stone technology—gave them an advantage and they were able to spread further and replace the unfortunate Neanderthals altogether.

Thinking about the Neanderthals has gone through several phases. In the 1860s, they were regarded as our ancestors. Boule then exaggerated the differences between them and us, not realizing that the skeleton he studied, known as the "Old Man of La Chapelle-aux-Saints", was deformed with arthritis. William Straus and A.J.E. Cave pointed this out in 1952, and once again the Neanderthals were given a place in our family tree, though perhaps as ancestors of modern Caucasians (Europeans, Middle Easterners, and Indians) only. Now, with new dating methods such as thermoluminescence and electron spin resonance, we have a fresh perspective, and they are now generally seen as our cousins rather than our ancestors, though very like us in many features. The discussion is once again directly linked to the original home of humans—Africa.

The Neanderthals were the first humans to really adapt to the cold northern climate. On the whole, their evolution took place during a warm period, the last interglacial. However, the classic tool technology of the Neanderthals—the Mousterian, named after the legendary rock shelter at Le Moustier, in the Dordogne—did not appear in Europe until the last Ice Age, about 70,000 years ago, although Mousterian-like industries were being used in the Middle East as early as about 120,000 years ago. They became an Ice Age people with all that this implied in terms of arctic survival and economic flexibility.

Like their predecessors, the Neanderthals roamed over extensive territories and used seasonal settlements during different times of the year. Presumably, big-game hunting (mainly deer and reindeer) played an increasingly important part in their subsistence. But above all, the cold climate forced people to adapt their diet to the cycle of the seasons, and the increasingly rich assemblages of different kinds of stone tools with different functions may be looked upon as a result of this. Storage of food was a vital necessity during parts of the year. Cave openings and rock shelters, so-called abri, were commonly used as dwellings, and although open-air settlements are known in many places, it is not unfounded to apply the term "cave people" to the Neanderthals... As we have seen, there was great anatomical variation within the Neanderthal population. Remains of classic Neanderthals are limited to western Europe, while those of Southwest Asia show less extreme features. It is believed that this physical diversity was the result of climatic adaptations, since the European Neanderthals really were the only ones living in typically Ice Age surroundings.

The appearance of the first burials shows that humans' ability to think in the abstract and to communicate orally had reached an advanced level. Whether a burial reflected notions of a "kingdom" of the dead or whether it was just a way to show grief at the loss of a family member, the burial ceremony reveals the presence of ritual conceptions and long-term thinking that previously had been impossible. Burial customs, such as sprinkling the dead with red ocher or depositing grave goods, bear witness to a world of magic thinking that lies behind the most final of manifestations—that of dying... Lately, however, many experts have questioned the Neanderthal burials and the validity of the evidence. Excavations of Paleolithic sites are difficult enterprises, and the stratigraphy is generally hard to interpret. Later activities at the site, as well as falling debris, often make it impossible to determine what actually took place at the time the body was deposited. Some have even stated that there is, in fact, no evidence whatsoever that Neanderthals buried their dead and that the finds are the result of coincidences and later disturbances. Contrary to this, it may be said that there is no evidence that they didn't... As far as the archaeological material is concerned, it is extremely difficult to interpret notions of the supernatural or the presence of magical or religious systems. In any case, however, it is no exaggeration to say that all these finds indicate that the Neanderthals developed complex ideological and social behavior which later on was also to become characteristic of modern humans.

Modern People in Africa and Europe, G. Burenhult

The most important differences between modern humans and Neanderthals are much more profound than the way they made and used tools. American anthropologist Richard Klein summarizes them as follows: "Initially, their behavioral capabilities differed little from those of the Neanderthals, but eventually, perhaps because of a neurological change that is not detectable in the fossil record, they developed a capacity for culture that gave them a clear adaptive advantage over the Neanderthals and all other nonmodern people." In other words, they developed new, more flexible forms of social organization. They built new types of settlements, developed a rich ceremonial and ritual life, and began to express themselves through the medium of art. Richard Klein adds: "The result was that they spread throughout the world, physically replacing the nonmoderns, of whom the last to succumb were perhaps the Neanderthals of western Europe." The fact that these people simultaneously and rapidly replaced the Neanderthals all over Europe indicates, according to Klein, that we are dealing with a rapidly expanding population that can only be explained if modern humans migrated to Europe from elsewhere.

This theory is strongly supported by new finds in South Africa. Five notable sites—Klasies River Mouth Cave, Border Cave, Equus Cave, Florisbad, and Die Kelders Cave—have yielded evidence that is vital to our understanding of the origins of modern humans. Richard Klein considers Klasies River Mouth Cave, about 160 kilometers (100 miles) east of Cape Town, to be the most significant of these sites. Large numbers of fossilized human bones have been found there, some between 115,000 and 80,000 years old, and they have a strikingly modern appearance. These relatively modern humans, therefore, lived in Africa at the same time as the Neanderthals flourished in Europe. Although the differences in tool technology between the Middle and Upper Paleolithic periods were as great in Africa as they were in Europe, the humans who inhabited South Africa during the Middle Paleolithic period looked much more like modern humans than the European Neanderthals did.

The South African finds indicate that modern humans evolved in Africa over a very long period of time, perhaps more than 200,000 years. To the older and more archaic stage of this course of events belongs, for example, the find of "Rhodesia Man" from Kabwe (formerly Broken Hill), in Zambia, while the above-mentioned South African fossils represent a more advanced stage of human evolution. But if the cradle of Homo sapiens is to be found in Africa, who was the ancestor? Homo erectus? Many experts don't think so, suggesting that another, as yet unclassified, human species existed in Africa. Others are of the view that this species was the 1.5 million-year-old Turkana Newcomer, Homo ergaster. The evolutionary process was probably slow and steady. The African savanna landscape probably provided ideal conditions for humans, and the tropical environment offered plentiful supplies of good-quality food. This might have brought humans together to live in larger groups, with a more selective diet.

These modern human beings, according to Richard Klein, did not migrate outside Africa until fairly late, and they subsequently replaced other, more archaic human types all over the so-called Old World. Anthropologist William Howells has named this the "Noah's Ark" hypothesis, because it implies that Homo sapiens sapiens originated in one single area of Africa. The opposing, more traditional view of human development is called the "candelabra" model. (These models are also known as the replacement theory and the regional continuity theory, respectively.) Recently, the replacement theory has been strongly supported by modern genetic research, which has been able to tell us for how long the various human races have been separated from each other and how they are mutually related. Using the mitochondrial DNA technique (mtDNA), Allan Wilson and Mark Stoneking, of the University of California, at Berkeley, and Rebecca Cann, of the University of Hawaii, have concluded that all living humans are descended from a common first mother in South Africa who lived about 200,000 years ago. This is known as the "Eve" theory. It has been calculated that the descendants in the female line of all other mothers became extinct over the course of 50,000 generations, leaving only one set of matrilineal descendants. These results are consistent with the archaeological finds at such sites as Klasies River Mouth Cave.

The same DNA technique shows that as modern humans spread across the globe, they rarely interbred with existing, but more archaic, human beings, such as the Neanderthals. The "Eve" theory has not remained unchallenged, of course, but most evidence does indicate that modern humans spread rapidly from Africa to the rest of the world. If Africa is indeed the original home of modern humans, how and why did they spread into Europe and Asia? Finds from the caves of Mount Carmel, in present-day Israel, indicate that this migration occurred somewhat before 100,000 years ago. The only previous obstacle had been the Sahara Desert, but higher rainfall during this period transformed it into an area of verdant plains, with lakes and streams. A plentiful supply of game and edible plants would have made northern migration not just possible but very attractive. Southwest Asia was a natural first stopping place, and this explains the Mount Carmel finds. But during the last Ice Age, which began about 75,000 years ago, this Levantine area became much drier. Food would have become scarce, and this may well have forced the humans who had settled there to move further north, towards the richer tundra and steppe regions of Europe. At the same time, modern humans were also spreading across Asia.

In the part of Africa south of the Sahara Desert, people were using much the same tools between 200,000 and 40,000 years ago, at much the same periods, as they were in Europe. In other words, the Middle Paleolithic period—in Africa, the Middle Stone Age—corresponds fairly closely in both these parts of the world. The Upper Paleolithic, or Late Stone Age, differs markedly from the previous period, but finds from 40,000 to 20,000 years ago are few. More recent finds, dating from 20,000 years ago or less, are much more common, and these show that people were hunting big game, especially buffalo and wild pig, to a greater extent than in earlier times and that scavenging had become much less important. Fishing had also become very important, which it had not been during the Middle Paleolithic. The appearance of microliths indicates that bows and arrows were in use, and from finds of microliths suitable for mounting in rows on sickles made of bone, antler, or wood, we know that plant gathering was also becoming more important.

Longer or shorter periods of glaciation are a natural part of our planet's history. The most recent Ice Age epoch, called the Pleistocene, began about 2.5 million years ago. During the Pleistocene, there was a fluctuating pattern of cold, glacial periods, known as glacials, interrupted by warm periods, known as interglacials. The impetus behind these ice-sheet rhythms appears to have been the three solar insolation cycles, governed by the Earth's tilting and its orbit around the sun. The most recent Ice Age began about 115,000 years ago and ended about 10,000 years ago, when the present interglacial was initiated. It was in this period that Homo sapiens, or modern human beings, developed. They had to adapt to quite rapid climatic changes which had a huge impact on the geological and other features of this planet.

Forty thousand years ago, we were in a milder part of the last glacial period. But Europe had already experienced three cold spells since the last interglacial, and the world looked very different from today. The sea level was 50 meters (about 165 feet) lower, the mountainous parts of North America and Eurasia were glaciated and surrounded by windswept tundras, and the vegetation and climate zones were much farther south than they now are. And the Big Chill was still to come! Twenty-five thousand years ago, the ice caps started to accumulate so much snow that large parts of northwest Europe, North America, and alpine areas such as the Andes, the Alps, and parts of central Asia gradually disappeared under huge sheets of ice. These continental ice sheets reached their greatest extent 20,000 years ago, when the sea level dropped to 120 meters (400 feet) below the present level and parts of today's continental shelf areas were dry land. Land bridges were formed in many areas where today there are sounds, including the English Channel, Bering Strait, and some of the sounds between Southeast Asia and Australia. Most of Europe that was not actually under ice was virtually barren, with only tundra or steppe vegetation, and exposed to wind and cold. The average temperature was about 8 degrees Celsius (14 degrees Fahrenheit) lower than today.

About 15,000 years ago, temperatures began to rise throughout the globe. The melting ice caused the oceans to expand, drowning former land bridges. Forests grew in areas that had been tundra—earlier in North America than in Europe—and the highly productive grasslands and half-open woodlands that developed supported a wide range of animals. In low latitudes, such as in North Africa, the climate became moister. Here, former deserts changed into savannas and woodlands, not only supporting a diverse range of plants and animals but also being very favorable for humans.

About 11,000 years ago, the solar insolation reached a peak in the temperate regions, and a new interglacial was in sight. But in Europe and parts of North America, a glacial climate suddenly took hold. During this so-called Younger Dryas event, which lasted about 500 years, the North Atlantic polar front migrated from Iceland to the Bay of Biscay; summer temperatures dropped by 5 to 10 degrees Celsius (9 to 18 degrees Fahrenheit); the ice sheets that had started to melt began to increase and advance; the tundra and permafrost extended once again; and many animals and plants were forced southward. This rather enigmatic period ended almost as abruptly as it began, about 10,000 years ago. Its ending marks the division between the Pleistocene and the Holocene. With the start of this new interglacial, vegetation once again spread to the north. At high latitudes, this meant a change from glacial and periglacial tundra to temperate woodlands, while at low latitudes, it meant a change from arid desert steppe to humid, tropical woodlands. It was a period of dramatic environmental changes to which humans had to adapt.

These new theories about the origins of modern humans have changed the way we look at Europe in the context of evolution. Once considered a center of physical and cultural development, it must now be regarded as a backwater, a marginal, stagnating region without further importance. The classic finds of human fossils dating from the interglacial period that occurred between 300,000 and 200,000 years ago, at Swanscombe, in England, and Steinheim, in Germany, have recently been supplemented by new finds. These include the Petralona skull from Greece, dated to between 400,000 and 300,000 years ago, and the Arago skull from the French Pyrenees, which is about 200,000 years old. All are now classified, along with their African contemporaries, as Homo heidelbergensis. Although the earlier finds were considered to be the ancestors of modern humans and were called "pre-sapiens", these new finds indicate that all of these people were most likely the ancestors of the European Neanderthals. It was these "pre-sapiens" finds that provided the framework of the traditional "candelabra" model of human evolution.

When Homo sapiens sapiens appeared in Europe about 40,000 years ago, during the last glacial, they did so in the form of the Cro-Magnon people, who were named after a site in Les Eyzies, in the Dordogne, in France. They found the climate considerably harsher than the North African one they had come from, but, gradually, they adapted. Little by little their skin color became lighter to facilitate the absorption of the necessary vitamin D from the weak sunshine of the north. The arctic climate meant totally different requirements for human survival. At that time, Europe was very different from the continent of today. The whole Scandinavian peninsula, as well as large parts of northern Germany, England, and Ireland, was covered with ice sheets a kilometer (3,000 feet) thick. As a result, the sea level was much lower than it is today. South of the ice edge lay widespread tundra plains with a rich variety of animals, including reindeer, horses, aurochs, bison, deer, mammoths, and rhinoceroses. Lions, leopards, and wolves competed with humans for the game. England and Ireland were part of the continental landmass, and large parts of the Bay of Biscay and the North Sea were drained. The climate was more hospitable in southern France and the Iberian peninsula, with summer temperatures close to 15 degrees Celsius (59 degrees Fahrenheit). Here, food was more varied and plentiful, with a range of plant foods as well as fish and other marine foods.

The remains of Neanderthal-like individuals excavated at Hahnofersand, near Hamburg, in Germany, and St Cesaire, in France, may indicate that surviving Neanderthals lived side by side with the new immigrants. The fossils at both these sites have been dated to 36,000 years ago, and thus postdate the appearance of Homo sapiens sapiens in Europe. Neanderthals and modern humans may have interbred in some places, and so the original population may have been assimilated to some extent. But basically the newcomers took over completely. Time simply did not allow for any evolution from Neanderthals to Homo sapiens sapiens, and the physical differences were in any case too great. As Richard Klein says, "compared to their antecedents, Upper Paleolithic people were remarkably innovative and inventive, and it is this characteristic more than any other that is their hallmark".

With the end of the glacial, about 10,000 years ago, the climate rapidly became warmer and wooded areas increased. Some animals adapted to the new conditions or migrated to other areas, but many others died out. Many species disappeared earlier, in the later stages of the glacial. Such extinctions were even more marked elsewhere, notably in the Americas. No evidence exists for a similar pattern of extinctions from previous glacials, and no one theory has yet emerged that satisfactorily explains it. Humans alone could not have caused these extinctions on such a vast scale. Nor is it likely that so many species simply failed to adapt to the new conditions, given that they appear to have survived similar changes following previous glacials. Many other theories founder on misinterpretations of fossil evidence. The reasons for these extinctions remain mysterious. We can only speculate that they were caused by some as yet unknown combination of factors, including climatic change and human activity.

British archaeologist Paul Bahn has suggested that the way of life of the Upper Paleolithic people in southern France and the Pyrenees was probably very similar to that of present-day reindeer hunters and herders in Siberia. These people are seasonally nomadic, and as well as hunting, they keep domesticated animals to provide milk and to use as beasts of burden. Bahn suggests that the reindeer would have moved to pastures in several different directions during their seasonal migrations from the Dordogne: toward the Atlantic coast on the Bay of Biscay, the Pyrenees, and perhaps also the Alps. Analysis of the bone material from the abri settlement of Abri Pataud, in Les Eyzies, has shown that this settlement was occupied exclusively during late autumn, winter, and early spring. Abundant finds of cockle and mussel shells in almost every inland settlement lend support to the theory that humans followed the herds as they headed for the Atlantic coast. Bahn is therefore open to the suggestion that Magdalenian hunters may have tamed part of the reindeer herds. It has long been wondered whether they may have tamed another important herd animal—the horse.

Notwithstanding this, there is some evidence to suggest that the big-game hunters of the Upper Paleolithic period did exercise some degree of control over horse and reindeer herds, and perhaps also mountain goats. How is this to be reconciled with the large-scale drives and mass killings considered so characteristic of the period? These hunting methods would have required a joint effort by a very large number of hunters, who would have had to force the herds over a precipice or into a narrow gorge where the animals could be easily brought down. But to drive a galloping herd of horses on foot is clearly no easy task. Perhaps these Upper Paleolithic hunters were indeed the first horsemen.

Whatever future research may tell us about Ice Age humans' control over animals, there was clearly great variation in living conditions and the availability of food across Eurasia. While big-game hunting was crucially important across the extensive tundra along the ice edge, the milder climate of southwestern Europe produced a plentiful and reliable supply of a variety of foods. As noted earlier, cockle and mussel shells have been found in most settlement sites in this area, indicating that marine foods were eaten, at least during some parts of the year. The numerous images of salmon, sole, and other saltwater species found among cave art would seem to suggest that fishing was an important means of subsistence. In some areas, fish may even have been the staple food, encouraging people to adopt a more settled way of life. Plant foods were readily available in western Europe, though lacking further east. All these factors were vitally important in promoting population growth, which, in turn, led to new forms of social organization and ceremonial life.

How was society organized within and between different groups of people during this period? Within modern traditional societies, the two main factors determining group size are the ability to survive and the ability to live in peace. The ability to survive is drastically reduced if the group is too small. A lone individual rarely survives for more than a year, whereas a group of five can continue for up to a generation (about 30 years). A group of about 25 has a good chance of surviving for perhaps 500 years, assuming it is in regular contact with other groups, not least for intermarriage. On the other hand, the risk of conflict within groups increases with the number of people. Again, it turns out that 25 is a reasonable average number, and ethnographic studies among present-day hunter-gatherers have shown that most of them live in groups of between 20 and 70. This is true of the Australian Aborigines, the Kalahari Bushmen, and the Andamanese, as well as the Birhor of northern India. American anthropologist Robert Carneiro has observed that when the Yanomami Indians of South America form a group of more than 100 people, aggressive tendencies become so great that the group has to split into two halves. For survival, a number of groups or bands must be organized in larger units or tribes, and here there are always certain limits. To avoid the problem of inbreeding, there must be at least 475 people in the larger group. In fact, most known tribes of hunter-gatherers have about 500 people, with a maximum of about 800. It seems likely that similar conditions prevailed in southwestern Europe during the Magdalenian phase.
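Purely for illustration, the band and tribe figures cited above can be related to one another with a little arithmetic; the sketch below simply derives the implied number of bands per tribe, a figure not stated in the text itself.

```python
# Illustrative arithmetic relating the group sizes cited in the text.
# Band size (~25) and tribe size (~500, minimum ~475) come from the passage;
# the number of bands per tribe is derived here, not stated in the source.

band_size = 25        # typical hunter-gatherer band, as cited
min_tribe = 475       # minimum tribe size cited to avoid inbreeding
typical_tribe = 500   # typical tribe size cited

bands_per_tribe = typical_tribe / band_size   # about 20 bands
print(f"A tribe of {typical_tribe} corresponds to roughly "
      f"{bands_per_tribe:.0f} bands of about {band_size} people each "
      f"(minimum viable tribe: {min_tribe}).")
```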

A study of 76 skeletons from the Upper Paleolithic period in Europe and Asia has shown that less than half the people reached the age of 21, only 12 percent were more than 40, and not one single woman reached 30 years of age. Many skeletons showed signs of malnutrition, rickets, and other deficiency diseases. Significantly, many bore traces of injury resulting from physical violence. Life during the Ice Age was undoubtedly a relentless struggle for survival, and the evidence of social organization and ceremonial life that has survived from this period reflects a society under heavy pressure as people competed for limited resources.

There is a good deal of evidence to show that certain people had higher status than others, and these people—possibly in the form of shamans—probably conducted rituals and ceremonies. Finds of a number of magnificently ornate burials provide, perhaps, the most persuasive evidence of status in this period. At Sungir, near Moscow, remains have been found of two adults and two children, the man and children buried in clothes decorated with thousands of beads of ivory and animal teeth and accompanied by ornately carved weapons and other objects suggesting a high status. The remains of magnificently adorned children have also been excavated from a site known as Grotte des Enfants, on the Italian Riviera.

Because these children were aged between 7 and 13, they cannot have attained high rank in their society by their own efforts. Their elaborate adornments have been interpreted as the first examples of hereditary status, evidence that some families in Paleolithic society were ranked higher than others. American writer John Pfeiffer summarizes: "A great deal of effort went into these burials, and into the appropriately elaborate ceremonies that must have accompanied them. Such honors are not for everyone, only for special people, indicating the beginnings of formal social distinctions. The burying of young children suggests further developments. A leader who earns his position by actual deeds needs time to win recognition as hunter or shaman. He must keep proving himself and when he can no longer do so he is replaced by someone else who can. But the existence of children buried with high honors before they are old enough to do anything outstanding raises the possibility of status by heredity rather than achievement."

The Rise of Art, G. Burenhult

The earliest images found are more than 30,000 years old. As far as we know at present, the Neanderthals never expressed themselves in art. The image as a means of expression belongs exclusively to modern humans, Homo sapiens sapiens, who appeared in Europe about 35,000 years ago in the form of the people we know as the Cro-Magnon. There is nothing to indicate that Neanderthals were less able to express abstract thought in the form of imagery, but for some reason they never did. What happened at the start of the Upper Paleolithic? Why did Ice Age humans suddenly need to express themselves in images? And why did they find their way, at evident risk of being killed, through extremely difficult passages to the deepest parts of caves? Why are these caves to be found almost exclusively within an area in southern France and northern Spain? Were the cave paintings created by one person, or did entire communities find their way into the silent sanctuaries to perform their cult ceremonies? Who created the paintings? Were they men or women, or perhaps children? No preliminary stages of paintings or carvings have been found, and this has been interpreted to mean that the cave art was created by a limited group of selected individuals—a sort of priesthood, in the form of medicine-men or shamans. Another possible explanation is that people trained on perishable materials, such as hides or wood, outside the caves.

To understand what led up to the beginnings of art, we have to look far beyond the art itself. The Upper Paleolithic period was a time of dramatic changes. The appearance of a new human species in Europe is sensational enough, but these newcomers were to introduce a number of social and technological innovations that radically changed conditions of life within a short period. The population grew markedly, and nomadic family groups began to gather in larger units and for longer periods than in earlier times. The archaeological record shows that goods were interchanged over greater distances, indicating a growing network of relationships between groups.

The most conspicuous change that occurred at this time was the development of a new and stylistically more complex tool technology. The stone tools of the Neanderthals consisted of a few similar types. Prepared, tortoise-shaped pieces of flint provided the raw material, from which flakes were struck off. These flakes were then retouched to make scrapers and points of different kinds. (This is known as Levalloisian technology.) Homo sapiens sapiens developed a more advanced blade technology, and were able to produce long, thin tools, such as blades in the shape of a laurel leaf. Many new types of tools were developed, including flint tools with double functions, such as retouched flakes where one end was shaped as a scraper, while the other served as a burin. The large variety of burins that have been found from this period clearly indicates that they served a range of specialized purposes.

But tools and works of art are not the only evidence of profound social change in the Upper Paleolithic period. New needs and traditions are reflected even more clearly in a series of spectacular finds which indicate that the people of this time had developed a rich ceremonial life based on complex concepts and rituals. For the first time, evidence of regular burials appears, the bodies dotted with red ocher, many dressed in magnificent clothing and adornments, and accompanied by sets of tools. At Sungir, some 200 kilometers (125 miles) northeast of Moscow, four well-preserved burials dated to between 25,000 and 20,000 years ago have been found—a man, a woman, and two adolescent children. The man had been buried together with blades of ivory, and was dressed in a headband and a number of necklaces carrying some 3,000 beads of mammoth ivory. The cranium of a female had been placed on the grave in the course of his burial ritual. In a double grave in which a girl and boy were buried head to head, more than 10,000 ivory beads were found, together with rings, ornaments, the teeth of arctic fox, and 16 weapons in the form of spears, spearthrowers, and daggers. Similar burials from this period have been found at a number of other European sites, including Grimaldi, in Italy, and La Madeleine, in France.

Together with the different kinds of art, both portable art objects and cave art, these burials clearly indicate that the people of this time felt the need to communicate abstract information by means of symbols. In the whole history of human evolution, this is the first evidence we have of the need to show group affiliation and social status—while the child burials are the first indication that status may have been hereditary. For this reason, it has been suggested that the Upper Paleolithic period may mark the beginning of the end of the totally egalitarian society. As the population grew and these new and more complex patterns of social organization emerged, people clearly had a greater need to communicate both within their own group and with outside groups. The development of images and symbols can be directly linked to this growing need for communication, and it has been suggested that language may have undergone a parallel development at this time.

Cave art, with its naturalistic images of animals, arose during a later stage of the Upper Paleolithic period. The first expressions of art in Europe consist of symbols of female sexuality. As early as 35,000 years ago, Cro-Magnon people carved images of vulvas on rock and other surfaces. Some millennia later, about 29,000 years ago, the first portable art appeared in the form of the famous Venus figurines, small female figures with a characteristically stylized form. They were to dominate artistic expression for nearly 10,000 years. Most of these female figures have exaggeratedly swelling breasts and buttocks, while the head and legs taper off into a less defined shape, clearly regarded as being of minor importance. They have been found over extensive areas, indicating widespread contacts and a common system of rituals throughout widely scattered communities during this period. Similar-looking Venus figurines have been found in great numbers from southern Russia in the east to the Atlantic coast in the west, a distance of more than 2,000 kilometers (1,200 miles). The most important sites include Dolni Vestonice, in the Czech Republic, Kostenki, in Russia, Willendorf, in Austria, and Brassempouy and Lespugue, in France.

Two explanations have been put forward for this emphasis on female genitals in the artistic and ritual life of the Cro-Magnon people and for the distinctive characteristics of the Venus figurines. First, we know from the skeletons of this period that Cro-Magnon women were generally less robust than their predecessors, the Neanderthals, and had a considerably narrower pelvic opening. This may have resulted in more difficult childbirth and, as a result, a high death rate among mothers and babies. Second, it is not unlikely that the rapidly growing population led to increasing conflict, and conflict within traditional societies usually involves competition for women. In any case, women's vital role in ensuring the continued existence of a society whose survival was coming under increasing pressure may very well have given rise to a cult centered on women. It was not until about 23,000 years ago that cave art appeared. When it did, it was concentrated in one main area—the Franco-Cantabrian region of southern France and northern Spain.

The appearance of the first human expressions of art is one of the most evident signs of the fact that Homo sapiens sapiens possessed mental capacities superior to those of their predecessors—the ability to communicate through symbols. But this abstract world of symbols also reveals the modern human's need of religious and ritual systems, emanating from a changed subsistence and thus a changed social organization.

One might expect that the first symbols of the Upper Paleolithic big-game hunters would be linked to the most essential part of Ice Age survival—the game animals that constituted the main source of food across much of the European tundras. Instead, the world of images was centered on sexuality and fertility—another vital part of the struggle for survival: securing the continuity of the group. The oldest known figures consist of carved vulva depictions, which may be linked to the Aurignacian tradition and which date back about 30,000 years. They have been found on rocks at, for example, Abri Blanchard, Abri Castanet, and La Ferrassie, in the Vezere Valley, in the Dordogne. But the famous Venus figurines, which have been found over a very large area, became the most characteristic kind of object in this world of beliefs. They were made out of a number of different kinds of materials, such as mammoth ivory, antler, bone, stone, and clay, and they all share the same standardized design: exaggeratedly swelling breasts and buttocks, and many of them appear to be pregnant. Most of them are naked and equipped with marked genitals. With a few exceptions, their heads are rudimentary and most often shaped only as little knobs, and, likewise, the swelling thighs taper off to poorly marked feet. The fertility symbolism is evident—the important thing was reproduction, fertility, and pregnancy.

But not all of these little fertility goddesses—perhaps even depictions of the Mother Goddess herself—have been depicted as pregnant. American archaeologist Marija Gimbutas has pointed out that probably not even the classic Venus figurines from Willendorf, in Austria, and Lespugue, in France, are pregnant. Breasts and buttocks are the focus of attention, and, moreover, their hands are placed over their breasts. Others, like those from Kostenki, in Russia, and the famous limestone bas-relief from Laussel, in France, have their hands placed over their abdomen and may be interpreted as being pregnant. In these figures, by contrast, breasts and buttocks are not particularly marked. The remarkable Venus tradition belonged primarily to the Gravettian phase, between 29,000 and 22,000 years ago, a period of increasing cold and advancing glaciers and ice sheets. The figurines had a standardized appearance over a distance of more than 2,000 kilometers (1,200 miles), from the Atlantic Ocean in the west to Russia in the east, and this bears witness to far-reaching contacts and intensive communication between the Upper Paleolithic big-game hunters along the Eurasian ice edge.

It has been estimated that about 20,000 years ago, between 2,000 and 3,000 people lived in what is now France, while the population of the rest of Europe, including Spain, cannot have exceeded 10,000. In the heart of the French area, at Les Eyzies, on the Vezere River, between 400 and 600 people lived side by side, at the same time, under the protection of four or five rock shelters known as abri. Similar gathering places located much farther apart are known further east—for example, at Dolni Vestonice, in the Czech Republic, and at Sungir and Kostenki, in Russia. The evidence of increased mass killings of single species of animals also suggests that the population was growing substantially during this period—although it is usually difficult to determine whether the bone deposits in these kill sites are the result of a single hunt or have accumulated over a longer period from regular killings. Some sites in eastern Europe have yielded the remains of close to 1,000 mammoths. At Solutre, in eastern France, a kill site has been found with the bones of perhaps as many as 100,000 horses, which were either driven to their death over the cliffs or herded into the natural trap formed by the narrow pass below to be slaughtered.

Paleolithic art has sometimes been called "animal art", and it is true that the vast majority of paintings, carvings, and reliefs depict game animals, such as reindeer, horses, mammoths, bison, woolly rhinoceroses, deer, ibex, and aurochs (wild cattle). Occasionally, cave lions, bears, fish, and birds appear. But there are also some images of humans, often dressed in animal hides and equipped with hooves, horns, or other animal attributes. These probably depict shamans during cult ceremonies. There are many images of genitals, mainly vulvas, with or without the female body. In some caves, there are numerous schematic symbols or signs, often in the form of standardized, geometrical figures. Different types of figures predominate in different regions. Schematic symbols, for instance, are much more common in the southern region—the Pyrenees and the caves of southern Spain, such as La Pileta, near Malaga.

One of the great areas of debate in the interpretation of cave art has been whether the different images should be seen as parts of compositions, or whether each constitutes an individual, self-contained ritual. A problem is posed by the fact that there are many instances of images painted or carved over an earlier image. These superimposed images can be interpreted in a number of different ways. The new image may have been placed over the old one to make use of the existing supernatural power or, equally likely, to destroy it. Alternatively, the earlier image may have been regarded as unimportant and only happened to be at the site of a later ceremony. The same difficulties apply when trying to interpret the many puzzling signs and symbols which most often appear on or alongside the animals. The animals are depicted in greatly varying sizes, even on the same section of rock surface, and are placed among each other in a seemingly formless muddle, with no perspective or horizon line. With the notable exception of the famous galloping black horses of Lascaux, it would seem that the concept of composition, in a traditional artistic sense, cannot be applied to Paleolithic art. However, modern research has revealed that cave art reflects a much more sophisticated understanding and use of symbolism than previously thought to be possible among the hunters of the Upper Paleolithic.

Our new knowledge of the social and economic background to the rise of Paleolithic art does not, however, satisfactorily explain the individual peculiarities of the various groups of images, with their elements of sexual symbolism, fertility cult, hunting magic, shamanism, and totemism. The total picture is far too complex for a single explanation, and much of the substance of the rituals no doubt changed during the many thousands of years the cave art tradition persisted. However, recent research has revealed that many caves were used as ceremonial gathering places. A number of pieces of evidence clearly point to this. Images of humans are comparatively rare in cave art, but a striking number of them combine human and animal features, and often the features of several animals. The most famous example is the "sorcerer" from Les Trois Freres, in the Pyrenees, a male figure with the antlers of a stag, a nose like the beak of a bird of prey, and staring, owl-like eyes. The figure also has a horse's tail and unnaturally short forelimbs ending in bear-like paws, with claws. His genitals are abnormally placed, under the tail. This strangely hunched, or crouched, figure seems to be engaged in a ceremonial dance.

Shamanism is the dominant element in the religion of most known arctic and subarctic hunter-gatherers, including the present-day Inuit (or Eskimos) and the reindeer hunters of northeastern Asia. Shamans are men or women who have a special relationship with the spiritual world and are called upon in times of sickness and other troubles to mediate with the spirits on the community's behalf. For example, when a shortage of game animals threatens the community's survival, the shaman enters a trance and sends out his or her soul to find out why the spirit who controls the animals is withholding them, and to persuade the spirit to send more animals. Shamans are also called upon to cure sickness (which in many such societies is believed to result from the breaking of a taboo rather than from natural causes). It seems likely that similar beliefs were important to the big-game hunters of the Ice Age, and that shamans—or some equivalent—may have conducted ceremonies in the caves.

As is the case with many archaeological findings, we may never understand the precise significance of cave art, with its many different images. Mythological symbols cannot simply be read like a book. Nevertheless, we are closer than ever before to understanding its function within Paleolithic society. The images in the caves are unique social documents, a kind of prehistoric encyclopedia, in which the different entries together reflect the need for communication, identification, and cohesion within a rapidly expanding and changing society. Contemporary evidence of ceremonial gathering places in the open as well as in caves suggests that some communities in this period had developed more accessible forms of ritual. Perhaps the most famous is the ceremonial center at Mezhirich, southeast of Kiev, in the Ukraine, built of 70 tonnes (almost 70 tons) of mammoth bones from 200 animals. Art and ritual as forms of human expression would seem to represent one of the key ways in which people came to terms with a new, more demanding, and socially more complex way of life. The world of beliefs, and the ceremonies and rituals this engendered, were a means of binding the society together, protecting it, and preserving its values—ultimately, a strategy for survival.

Spreading Throughout the Globe: Towards New Continents, G. Burenhult

We know that modern humans first appeared in Africa between 200,000 and 150,000 years ago. From there they spread outward to the rest of the continent, to Europe, and to parts of Asia. Southwest Asia seems to have provided a natural passage for these pioneers. Human remains have been found in the Levantine area dating to about 100,000 years ago, whereas Europe was first populated by Homo sapiens sapiens only about 40,000 years ago. These modern humans spread into unknown territory slowly but steadily. It was never a question of deliberate migration. Scattered groups extended their hunting grounds by only a few kilometers (2 or 3 miles) per generation, but this was enough for them to populate the world in some tens of thousands of years.
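
The arithmetic behind this claim is easy to sketch (the figures below are illustrative assumptions rather than measured values): taking a range extension of roughly 3 kilometers per generation and a generation length of about 25 years gives

$$\frac{3\ \text{km}}{25\ \text{years}} \approx 0.12\ \text{km per year}, \qquad 0.12\ \text{km/year} \times 50{,}000\ \text{years} \approx 6{,}000\ \text{km},$$

which is on the order of the distance from the Levant across Asia. A slow, generation-by-generation expansion is therefore entirely sufficient to carry people across continents within tens of thousands of years.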

The characteristics these modern humans acquired were to influence cultural evolution throughout the world. They were remarkably adaptable, which meant that they could inhabit areas that had been inaccessible to earlier hominids. For the first time, people settled in the arctic regions of Eurasia, adapting to the most difficult ecology on Earth. Ten to fifteen thousand years before Cro-Magnon humans entered Europe, their cousins in Southeast Asia had crossed 90 kilometers (56 miles) of open water by some form of sea-craft and reached present-day Australia and New Guinea. Some millennia later, groups of people took advantage of the low sea level during the last glacial to walk or paddle across present-day Bering Strait, entering the continent of North America. In the space of a few tens of thousands of years, modern humans opened up new worlds—in the north, the northeast, and the southeast.

While cultural evolution in western Asia generally corresponded with that in Europe, East Asia developed characteristics of its own. Human fossils found there, dated to between 200,000 and 100,000 years ago—the period during which the Neanderthals were evolving—look very different from those found in the west. As American anthropologist Richard Klein says, "At a time before 50,000-40,000 years ago when western Asia was variably occupied by Neanderthals perhaps derived from Europe or by very early moderns arguably derived from Africa, eastern Asia seems to have been occupied by a distinctive human type(s) that was neither Neanderthal nor modern". The lack of modern excavations makes this period in East Asia difficult to evaluate. We have known for a long time that the blade tools that appeared in Europe with Homo sapiens sapiens were apparently not introduced or developed in East Asia. Instead, the flake and chopper tools used by Homo erectus survived there for more than 300,000 years, to as late as about 10,000 BC. Correspondingly, there seemed to be no evidence in eastern Eurasia of the cultural developments that took place in Europe during the Upper Paleolithic period—a wider and more sophisticated use of antler and bone, the rise of art, and evidence of a rich ritual life, with complex burial practices.

A number of important new finds indicate, however, that modern humans who had developed advanced stone and bone tools and a complex ritual and artistic life settled the southern, eastern, and southeastern parts of Asia during the very early phases of the Upper Paleolithic period. The remarkable fact that Homo sapiens sapiens populated Australia and New Guinea at least 50,000 years ago certainly points to this, but this is not the only evidence. Fossils of modern humans found at Liujiang, in China, have been dated to 67,000 years ago. At Batadomba lena, a cave in the southwestern part of Sri Lanka, settlement layers dating back to about 29,000 years ago have yielded the remains of modern humans, together with small, technically sophisticated stone tools (so-called geometric microliths) and bone tools. Apparently, sophisticated stone tools existed in Southeast Asia, but they were not widespread and not as standardized as those found further west. Because they existed side by side with more traditional tools, it would appear that they were developed locally. Consequently, the big, as yet unanswered, question is why Homo sapiens sapiens took new technology with them to the west but not to the east. Between 35,000 and 20,000 years ago, Upper Paleolithic big-game hunters spread over the vast tundras of northeastern Siberia for the first time and soon became the first humans to set foot in America. Siberian stone tools differed from their contemporary European equivalents, being made from different raw materials and influenced by the blade cultures of the west and the flake cultures of the southeast. As time went on, this Siberian cultural tradition spread south and east to Mongolia, China, Korea, and Japan.

During the last glacial, northeastern Siberia had such low levels of rain and snow that ice sheets and glaciers like those in northern Europe never formed. The hunters who inhabited these immense, frozen, and treeless expanses had to cover vast territories in pursuit of game and other food. There were few caves and rock shelters for protection, so they had to build huts that could withstand the severe cold. They also needed effective fireplaces that would allow them to maintain fires almost continuously, and close-fitting clothes of fur and hide. Antler and bone implements were crucial aids to these enterprises. The mammoth became a sought-after game animal, because it provided food, hides, and large quantities of bones that could be used as fuel, building material, and tools.

During two phases of the last Ice Age—from 50,000 to 40,000 years ago, and again from 25,000 to 14,000 years ago—the present Bering Strait was drained, as were large parts of the Arctic Ocean in the north and the Bering Sea in the south. This territory, often called Beringia, connected Alaska with Siberia's Chukchi Peninsula, making it possible for humans and animals to cross from one continent to the other. It is unlikely that humans migrated into North America during the first phase, as there is no evidence of human settlement in northeastern Asia from this period. It is also very unlikely that people crossed the Arctic Ocean in the intermediate period, between 40,000 and 25,000 years ago. A boat trip across Bering Strait would have been a very difficult undertaking indeed at that time. The warm climate and waters of Southeast Asia were much more conducive to seafaring, as we know from the fact that people crossed the Sunda Strait to reach present-day Australia and New Guinea.

The most likely time for people to have crossed Bering Strait is clearly between 25,000 and 14,000 years ago. This corresponds with the known spread of modern humans into the arctic regions of Europe and with the earliest finds of big-game hunters' settlements within the Mal'ta and Dyukhtai traditions of northeastern Siberia, which date back to between 18,000 and 15,000 years ago. The oldest known tools on both sides of Bering Strait exhibit a similar microlithic blade technique: those within the Dyukhtai tradition, in Siberia, and those found in sites such as Bluefish Caves (c. 13,000 BC), Dry Creek (9000 BC), and Akmak (c. 8000 BC), in Alaska. In spite of intensive efforts to find older signs of human occupation in Alaska, there are at present no reliable finds older than those from Bluefish Caves, going back 15,000 years. Tools of a similar age have been excavated from several Upper Paleolithic sites in the Chukchi Peninsula, including Kurupka, Puturak, and Ul'khum, which undoubtedly housed the ancestors of the first Americans.

At the end of the last Ice Age, between 15,000 and 8,000 years ago, the climate changed dramatically throughout the world. These changes decisively altered the patterns of human life and, in time, led to the birth of agriculture. As the climate in western Europe improved rapidly and the ice edge retreated, the herds of reindeer and horses, so important to the big-game hunters of the Magdalenian period, moved north. With the spreading forests came totally different forms of subsistence. Some groups of people followed the animals north, while others adjusted to the new conditions where they were. This meant a change from big-game hunting to fishing, beachcombing, hunting small game, and—an activity that was to become increasingly important—gathering plants. For the big-game hunters, the desolate tundras of Scandinavia provided a short-lived refuge for their several-thousand-year-old way of life.

These big-game hunters gradually spread into the vast tundras of northern Europe, ranging of necessity over large areas. Population was sparse, and large areas in the west which today are covered by the North Sea were part of the reindeer hunters' territory. About 13,000 years ago, reindeer hunters from the Federmesser cultures were the first to move into the ice-free regions of southern Scandinavia. Reindeer skeletons and antlers have been found in Denmark and southern Sweden, but the age of the bones indicates that these kill sites resulted from short visits, perhaps of only a few weeks. This is almost certainly why so few settlements from this time are known.

Since the vast majority of people in western and northern Europe hunted reindeer, a similar range of tools is found throughout these areas until about 10,000 BC, when, in response to climatic changes, the forests spread north. This led to an increase in both big and small game and also made plants a more important source of food, and new tools were gradually developed to suit local conditions and the new subsistence patterns. With a vastly richer ecosystem, smaller areas of land were able to support larger groups of people, leading to rapid population growth, and clear seasonal settlement patterns developed within different regions. Coastal settlements became more common, although we know very little about these, as most coastlines that existed during the late Ice Age, like those of the early postglacial period known as the Holocene, have long since been submerged by the rising seas.

Big-game hunting did not, of course, disappear entirely as a way of life. The hunters who spread to southern Scandinavia probably moved further north along the ice-free coastal areas of western Scandinavia, where they established various hunting-gathering economies that survived unaltered for thousands of years in the form of the Fosna culture. In the extreme north, scattered groups of reindeer hunters from the tundra steppes of eastern Europe reached the ice-free coasts of the Kola Peninsula and Nordland, establishing the so-called Komsa culture, which survived until about 2000 BC—well into the late Neolithic period. In most of Europe, however, a new era was just around the corner. Mesolithic peoples were showing a remarkable ability to adapt to a diverse range of environments and ecosystems, and in time these successful hunter-gatherers would adopt herding and farming as an increasingly necessary part of their subsistence.

Recent genetic evidence indicating that all living humans trace their descent to a single hypothetical woman ("Eve") who lived in Africa less than 250,000 years ago is of immense importance. Even more recently several new lines of work have come together to support this picture and are beginning to give us a revolutionary new insight into our origins. Genetics has provided one "family tree". The various human populations are not characterized by the simple presence or absence of particular genes, but have different frequencies of different genes. By analyzing a huge body of genetic data, L.L. Cavalli-Sforza has recently produced the family tree shown in the chart. The populations he studied are listed down the middle. Most are ethnically defined, but a few are linguistically defined, and these are shown in italics. The interrelationships are shown at left, calibrated against the scale of genetic distance (or difference) between the various populations (see the caption). African populations are distantly related to all the other groups, a factor that supports an African origin for all living humans.
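
How a "family tree" can be read out of gene-frequency data is worth a brief, hedged illustration. In outline, pairwise genetic distances are computed between populations, and the populations are then clustered so that the most similar pairs join first; the branch points of the resulting tree sit at increasing genetic distance. The sketch below shows that idea only in miniature, using average-linkage clustering and invented placeholder names and numbers; it is not Cavalli-Sforza's data or his exact method.

```python
# Minimal sketch (not Cavalli-Sforza's actual method or data): given a matrix of
# pairwise genetic distances between populations, average-linkage clustering
# joins the most similar populations first, producing a tree of the kind
# described above. Population names and distance values are placeholders.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, dendrogram
import matplotlib.pyplot as plt

populations = ["Population A", "Population B", "Population C", "Population D"]

# Hypothetical symmetric distance matrix: smaller values = genetically more similar.
distances = np.array([
    [0.00, 0.05, 0.20, 0.22],
    [0.05, 0.00, 0.21, 0.23],
    [0.20, 0.21, 0.00, 0.08],
    [0.22, 0.23, 0.08, 0.00],
])

# Condense the square matrix and cluster; the linkage result encodes the tree.
tree = linkage(squareform(distances), method="average")

# Draw the tree with population labels; branch depth reflects genetic distance.
dendrogram(tree, labels=populations, orientation="left")
plt.xlabel("genetic distance")
plt.tight_layout()
plt.show()
```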

Are distant echoes of a linguistic "big bang" still reverberating? Could it be that the speech of all of us reflects, however distantly, a single original "protoworld" language, spoken by the first modern humans in Africa? Most linguists believe that such a language existed, but do not accept that any trace of it persists to this day. Archaeology is also coming up with evidence, in two main ways. First, it is obviously not possible to excavate a language, but it may be possible to detect in the archaeological record the type of symbolic thought processes without which a spoken language of modern complexity could not exist. Art and other evidence of symbolic thought appear only about 35,000 years ago, and tool types start to become stylized at about the same time. There is also some (controversial) evidence, derived from studies of the lower parts of skulls, that Neanderthals would not have been able to make all the complex sounds we can make. William Noble and Iain Davidson have recently suggested that language as we know it is no more than about 50,000 years old—an unexpectedly short time span. If this is so, it seems just possible that linguistic traces of a "protoworld" language could have survived into the present.

Second, archaeology can provide fossil evidence of our origins. New finds, and new dating methods, currently support the hypothesis that modern humans originated in Africa. The earliest dated finds of modern humans are African, going back more than 100,000 years. These people were living at the same time that Neanderthals were living in Europe. Dates from finds in Israel show that modern humans were present there 90,000 to 100,000 years ago. The next dates we have for modern humans are 67,000 years ago, for human remains found at Liujiang, in China, and about 50,000 years ago, when Australasia was colonized. Europe was colonized about 35,000 years ago, the Americas about the same time or later. This pattern indicates that humans did indeed originate in Africa and spread from there to the rest of the world. These three lines of evidence—genetics, linguistics, and archaeology—can, therefore, be brought together to tell a single, coherent story. All three are controversial and need much more detailed testing before the story can be accepted as fact. Only two things are certain. First, by the time you read this in print, significant new developments will have taken place. And second, if these three lines of evidence are not disproved by future work, we are on the threshold of a colossal breakthrough in our understanding of ourselves.

The Settlement of Ancient Australia: The First New World, J. Peter White

The people who settled in Greater Australia unquestionably came from Southeast Asia. Several different lines of evidence point to this. First, we may be certain that Homo sapiens did not evolve in Greater Australia. No primates (apes, monkeys), or even more distant human relatives, are found east of Java, Sumatra, and Borneo. The permanent water barrier of Wallacea kept early people out, as it did other recent mammals. All human remains found in Australia belong to Homo sapiens sapiens, our own modern species. No earlier forms have been found. This suggests that only modern people developed the cultural ability to cross water barriers by means of boats or rafts.

Second, Southeast Asia is the closest landmass from which people might have come. There were no sophisticated sailing craft 50,000 years ago, and people could not have paddled in bark canoes or on bamboo rafts from India, China, or Africa and survived. The most likely craft would therefore be outrigger canoes or single-log hulls with outriggers. The island chains of what is now Indonesia offered a pathway. Indeed, the tropical, generally calm, waters of these areas would have provided a kind of sheltered nursery, where, for the first time in human history, people could learn safely to exploit the sea and its resources. Finally, humans have been in Southeast Asia for at least the last million years. Skeletal remains of earlier hominids, Homo erectus, have been found in Java, as well as the remains of more modern, but still ancestral, humans. We do not know the precise routes people followed when they came to Greater Australia. Every move from island to island was probably made by a few people traveling to the next piece of visible land, or land they inferred to exist from bushfire smoke, clouds, or bird movements. Almost certainly, there was no large-scale, deliberate migration. On the other hand, it seems likely that different parts of Greater Australia (New Guinea and Australia, for instance) were settled by different groups of people. Today, New Guineans and Australians are more closely related than either is to anyone else in the world, but they are still two identifiable groups, although similar enough to suggest a common origin. Although it is unlikely, each group may have descended from a single boatload of people, whose descendants intermarried relatively little and whose gene pool was little affected by later comers.

We cannot say exactly when people came to Greater Australia. Because people of the time could not make long sea voyages, they probably traveled when sea levels were lower, island areas were more extended, and sea crossings were shorter. One period when the sea level was lower was between about 55,000 and 50,000 years ago. At these ages, the reliable tool of radiocarbon dating is stretched to its limits. The fact that many human sites throughout Greater Australia have been dated to between 28,000 and 37,000 years ago, but that there are very few older sites, may mean only that we are at the limits of the technique. Or it could mean that by that time the population had grown large enough to leave traces in the archaeological record. Other radiometric dating techniques (most of which involve measuring the ratios of certain radioactive isotopes in specimens) are being used, and if we accept current scientific claims, they show that humans came to Greater Australia at least 50,000 years ago.
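
A rough decay calculation shows why dates in this range push radiocarbon dating to its limits (a simple sketch using only the roughly 5,730-year half-life of carbon-14):

$$\frac{N}{N_0} = \left(\frac{1}{2}\right)^{t/5730}, \qquad t = 50{,}000\ \text{years} \;\Rightarrow\; \frac{N}{N_0} = \left(\frac{1}{2}\right)^{\approx 8.7} \approx 0.002.$$

Only about a quarter of one percent of the original carbon-14 survives in a sample of that age, so even very small amounts of contamination by younger carbon can distort the measurement, which is why sites older than about 40,000 to 50,000 years are so hard to date reliably by this method.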

By 30,000 years ago, the entire continent of Greater Australia had been settled by descendants of Asian tropical foragers, who crossed sea barriers and discovered and settled environments ranging from the familiar to the subtemperate, from rainforest to near-desert. Even the desert core was probably occupied, although sparsely. These people were the direct ancestors of most of today's Aborigines.

The colonizing of Tasmania in late Pleistocene times represents a surprising and important chapter in the story of the spread of anatomically modern humans throughout the globe. Archaeological evidence now being won from limestone caves in southwest Tasmania shows that humans had traversed the vast continent of Greater Australia, from the tropical north and the islands of Melanesia to the glacial south, at least 35,000 years ago, demonstrating a remarkable ability to exploit the entire range of ecological habitats.

The combined archaeological evidence shows that these Tasmanian Ice Age people had developed a technology and a social system that allowed them to exploit the southwest region for more than 25,000 years. Clearly, they were no mere puppets of environmental change. But about 12,000 years ago, almost all the caves were abandoned. Why this happened we don't know, but it has been suggested that with the rapid change in climate at the end of the Pleistocene period the zone was covered in unproductive temperate rainforest, which drove out the game animals—and the humans who had hunted them for more than a thousand generations.

Modern People in the New World, G. Frison

For more than half a century, archaeologists have been probing the geological deposits of the late Pleistocene period in their quest to identify the first humans to set foot in North America. These efforts have been both frustrating and rewarding. Although much information has come to light, the identity of the first migrants and their Old World ancestors, the entry route they used, and the conditions under which they arrived are still vigorously debated. To date, the earliest cultural complex in North America that all archaeologists recognize is known as Clovis. (A cultural complex refers to a group of distinctive cultural artifacts found in association with each other and presumably used by a single population, perhaps over several generations.) It appeared somewhere between 12,000 and 11,000 years ago, just before the last of the large mammals (or megafauna) of the Late Pleistocene period became extinct. Two other cultural complexes, Folsom and Goshen, appeared not long after Clovis, and the surviving evidence of these three cultures constitutes our knowledge of the early North American Paleoindian hunters.

At the time of the last glacial maximum, about 18,000 years ago, huge ice sheets covered nearly all of Canada and extended south of the present-day Great Lakes in the eastern United States. So much water from the oceans was frozen in glaciers around the world that sea levels dropped, exposing a continental shelf that included a large, unglaciated landmass known as Beringia connecting northeast Asia and present-day Alaska. Beringia was a flat, well-vegetated landmass capable of supporting not only the giant animals of the late Pleistocene era but also human predators crossing into the area from the west. As the glaciers melted, sea levels began to rise, gradually submerging the exposed continental shelf. By 12,000 years ago, the Laurentide ice sheet had retreated to the east and the Cordilleran ice sheet to the west, resulting in an open corridor between them that stretched south from present-day Yukon across Canada to Montana. Usually referred to as the "ice-free corridor", this is believed by many prehistorians to have been the route the earliest big-game hunters took to reach the Great Plains of North America. Others think they entered from Alaska, south along the northwest coast.

Significant climatic and environmental changes took place between 12,000 and 10,000 years ago. When the Clovis Cultural Complex first appeared, between 12,000 and 11,000 years ago, winters were warmer and summers were cooler, so seasonal extremes of temperature were less marked than at present. Vegetation was different: tall grasses, for example, covered the short-grass plains of today. The giant animals of late Pleistocene times were on the edge of extinction. They included mammoths, mastodon, horses, camels, and giant sloths. Large bison would survive for another 3,000 years. Perhaps even more impressive were the carnivores, such as the short-faced bear—twice the size of a present-day grizzly bear—and the American lion and American cheetah—again, both bigger than their present-day African counterparts. There were many small animals as well. The collared lemming, for example, a tiny creature that can survive only in cold environments, lived around the margins of the glaciers. It can still be found today in arctic glacial environments.

Some time after about 11,000 years ago, another environmental shift occurred. Seasons became more marked, with long periods of sunlight and warmer temperatures, while snowfall and annual rainfall declined. These changes culminated about 10,000 years ago, at the beginning of the Holocene period, when climatic conditions were similar to those we know today. Although many archaeologists are convinced that humans were present in North America during pre-Clovis times, all agree that Clovis tools and weapons are the earliest found to date that would have enabled people to hunt the large animals present at that time. This strongly suggests that the Clovis hunters were related to the Upper Paleolithic hunters of the Old World, who had pursued and killed mammoths, bison, reindeer, and other large animals for many thousands of years previously.

When the people we now know as the Clovis Cultural Complex began to spread south of the continental ice sheets, they emerged into an area populated by large animals that had not previously been exploited by human predators. Clovis people developed superior types of tools and weapons and became efficient hunters. There was no competition for food resources, so they had little, if any, need to defend their hunting territories. They lived in small bands, and when resources diminished, they could simply move on to a new area. This could explain why the Clovis culture spread so rapidly and widely over the ice-free areas of North America, and why similar tools and weapons are found over vast distances. In contrast to later Paleoindian groups, the Clovis hunters were not forced to adopt specialized subsistence strategies that would have restricted their mobility and required them to modify their tools and weapons to suit local and regional conditions. Evidence of Clovis occupation—in the form of their distinctive fluted projectile points and variations of these weapons—has been found in mammoth kill sites across the whole of North America.

The most significant, if least dramatic, event in the history of the Americas occurred when the first human footprint appeared in the New World. No one knows exactly when this happened, or where. We do not know what these Paleoindians wore, looked like, spoke, or thought. We do not know when they left their Asian homeland, or what conditions they experienced along the way. And yet there remains no reasonable doubt that the first Americans did indeed travel from Asia during the late Pleistocene. Biology, language, and archaeology all point to an Asian homeland. It is the timing and conditions surrounding their arrival that remain unknown.

The relatively conservative estimate [that Clovis is the earliest cultural complex in the New World, established some time between 12,000 and 11,000 years ago] remains reasonable, because despite decades of concerted research, no undisputed evidence of pre-Clovis occupation has been uncovered anywhere in the Western Hemisphere. Considerable nonarchaeological evidence also supports this position. Joseph Greenberg's recent reanalysis of American Indian languages suggests that there were three waves of migrants into the New World and that the earliest took place about 12,000 years ago: these were the people of the Clovis Complex. Other investigators, independently analyzing human tooth morphology and blood genetics, have come to the same conclusion.

But considerable controversy surrounds Greenberg's broad-brush linguistic reconstructions, and numerous skeptics question the relevance of the dental and genetic evidence in this prehistoric context. Moreover, although it is still controversial, archaeological evidence emerging from a number of sites suggests that people arrived considerably earlier. Many archaeologists have begun to acknowledge, if sometimes only privately, that people might well have arrived in the New World as early as 40,000 years ago.

Numerous sites throughout North and South America have provided tantalizing suggestions of pre-Clovis occupation, but none has yielded ironclad proof acceptable to all archaeologists. Some of the best evidence comes from excavations at Meadowcroft Rockshelter, a remarkably well-stratified site in southwestern Pennsylvania. Here, James Adovasio and his colleagues have documented a sequence of more than 40 radiocarbon dates in near-perfect stratigraphic order. The oldest cultural date is a little more than 19,000 years ago, and the oldest stone artifacts appear to date from between 15,000 and 14,000 years ago. Evidence of early human habitation found in the various occupation levels consists of firepits, stone tools and by-products of tool-making, a wooden foreshaft, a piece of plaited basketry, and two human bone fragments. Although many archaeologists consider the evidence from Meadowcroft to be conclusive, others remain unconvinced. The stone implements are scarce and small, and don't tell us much. They are also disturbingly similar to much later artifacts. Surprisingly, there are no remains of the giant animals known to have existed in the Pleistocene era, and the plant remnants recovered are clearly of types that grow in temperate zones, whereas, for part of this time, the ice front should have been less than 75 kilometers (47 miles) to the north.

Another leading pre-Clovis candidate is Monte Verde, an open-air residential site in southern Chile. Excavator Tom Dillehay and his colleagues have unearthed four distinct zones of buried cultural remains. The foundations and fallen pole-frames of close to 12 residential huts have been excavated, with fragments of skin (perhaps mastodon) still clinging to the poles. Abundant plant remains have been found in the deposits, as well as numerous shaped stone tools, including several grooved bola stones. Dillehay argues that the upper layers at Monte Verde contain "well-preserved and clear, conclusive evidence" of human habitation about 13,000 years ago. Even more controversial are the deep layers, where remains associated with possible cultural features and several fractured stones have been radiocarbon-dated to 33,000 years ago.

These controversial findings suggest not only that humans arrived in the New World much earlier than previously thought, but also that the earliest Americans were not the glamorous big-game hunters associated with the sophisticated Clovis Complex and its elegant stone tools. Rather, the plant and animal remains from Monte Verde suggest that they were hunter-gatherers who lived mainly on wild plant foods and shellfish. They may also have scavenged, and hunted slow-moving mammals such as seals, but these would have been secondary activities. Yet, despite the findings from Meadowcroft, Monte Verde, and numerous other sites, we have no unequivocal, indisputable archaeological evidence that the New World was inhabited before Clovis times. The debate rages on, and until more substantial evidence comes to light, claims as to the identity of the first Americans are likely to be based, as one skeptic has put it, as much on psychological as on archaeological grounds.

The hunting groups known as the Folsom Cultural Complex appeared about 10,900 years ago, close on the heels of Clovis, and survived for about 600 years. Folsom remains are not found over such wide areas as Clovis, being confined to the Great Plains, the southwest, the central and southern Rocky Mountains, and several intermontane basins partly or entirely within the Rocky Mountains. These people did not hunt mammoths but mainly targeted a now-extinct species of bison, along with occasional pronghorn and mountain sheep. A few camel bones, usually in the form of tools, have been found in some Folsom sites, but there is little evidence to suggest that camels were hunted. In 1934, amateur archaeologists discovered fluted projectile points identical to those found earlier at the Folsom site, in New Mexico, at a site in northern Colorado now known as the Lindenmeier site. To this day, Lindenmeier remains the largest and most complex Folsom site known. The large number and variety of tools, weapons, and other artifacts found here shows that the Folsom groups were very skilled at making and using stone tools. A small but significant number of bone tools and decorative items show that they were equally adept at working bone. Their flake tools, made from carefully selected materials, exhibit a wide variety of skillfully prepared edges, points, and corners that equal or surpass those of any other Paleoindian cultural complex in the archaeological record of the New World. Microscopic analysis of wear patterns on tool edges suggests that these tools were used to work bone, wood, hide, and possibly other materials as well. Folsom projectile points are crafted extremely skillfully, and reflect a knowledge of flake technology equal to that known anywhere else in the world.

Paleoindian bison hunters on the plains of North America were familiar with their hunting territory and the day-to-day habits of the animals, and they had developed the best weapons known anywhere in the world during that period. The communal hunt was an important social event as well as a means of acquiring food. The entire band, or even several bands, gathered at a designated location not only to provision the group with meat but also to perform numerous social obligations and related activities that served to reinforce the solidarity of the group and helped to ensure its continued existence. Communal bison hunts were events in which the chances of failure were always present, so the supernatural was called upon to reinforce the chances of success. The religious leader, or shaman, performed the necessary rituals involved in calling in the animals and ensuring that the spirits of the dead animals were properly treated. In this way, future hunting success would not be imperiled, since the general belief among hunting societies such as these was that the animals made themselves available for the benefit of humans, but only as long as they were accorded the proper respect through the performance of established rituals. Repeated failures could be blamed on the shaman, but he could protect himself by claiming that the failure was due to someone in the group neglecting to observe the proper rituals.

Many Paleoindian sites on the plains bear witness to as much as 2,000 to 3,000 years of repeated use, owing to topographic features that in their natural form or with slight modification formed traps for animals. Since the teeth of bison are known to erupt at certain ages, especially in young animals, and the calving season is restricted to a short time in early spring, we know from the fossil evidence that many Paleoindian bison kills were carried out in late autumn and winter. Evidence from some kill sites on the Northern Plains indicates that surplus meat was temporarily frozen and placed in protected caches for use as needed.

What happened to the giant animals of the late Pleistocene period that were present, but apparently on the verge of extinction, when the Clovis hunters arrived? This is a fascinating question. Unless evidence is uncovered showing that highly efficient hunters were present in North America in pre-Clovis times, we cannot attribute their demise to human predation alone. Although the Clovis culture was widespread in North America, there is very little to suggest that these people could have hunted these animals to extinction—although they may well have delivered the coup de grace in some cases, in particular to mammoths. Nor does it seem likely that either horses or camels were seriously hunted by humans in North America, while bison survived as the main prey of Paleoindian hunters on the Plains for at least 3,000 years after the Clovis hunters disappeared. The large-scale animal extinctions at the end of the Pleistocene period have yet to be satisfactorily explained and probably resulted from several contributing factors.

Paleoindian studies involve many and varied disciplines, but in the final analysis, the surviving evidence must be interpreted within a framework of human behavior. The archaeological record shows that Paleoindians developed the ability to make different types of stone tools and weapons. However, weapons (especially projectile points) changed in form over time, and it is on the basis of these changes that archaeologists are able to identify different cultural complexes. Tools do not show as much change, and are therefore a less reliable guide to time. Because of this, the enduring stone projectile point has become the "guide fossil" for North American archaeologists. This, along with stratigraphy and radiocarbon dating, has allowed us to establish a chronology for the various Paleoindian groups that inhabited North America.

We cannot project ourselves back in time and observe Paleoindian groups at first hand, but we can use our knowledge of recent hunter-gatherer societies at a similar cultural stage in different parts of the world to shed light on the past. For example, Inuit (or Eskimo) groups who hunt caribou and sea mammals and the historic record of the bison hunters of the North American Plains provide good bases for comparison and allow us to look at the Paleoindians as human societies rather than archaeological sites. We must remember, however, that we can compare such groups only in general terms. On this basis, we can conjecture that Paleoindian communities consisted of small groups of nomadic Homo sapiens sapiens concerned with problems of day-to-day survival. They lived in close harmony with a harsh and unforgiving environment, where a single mistake in the everyday quest for food could mean death or disablement and even result in the family starving to death. They had to compete directly with large predators and scavengers for food, and they had to protect stored surpluses from these and other dangers. For example, rodents burrowing into a food cache from below could have the same end result as a grizzly bear tearing the cache apart from the outside.

The mainstay of the Paleoindian economy was hunting, which was a male-centered activity. Women butchered the meat from the kill, prepared it for consumption, and gathered plant foods. While the latter was a less prestigious activity than hunting, plants were an important part of the everyday food supply. Paleoindians lived in small bands, the only political leadership being provided by the male who claimed the greatest charisma by virtue of being the most accomplished hunter and provider. Bands ranged in size from 20 to 50 individuals made up of 4 to 10 nuclear families. For most of the year, the band fragmented into smaller single or multiple family groups to exploit the available food resources more efficiently. Communal hunting or a windfall in the form of surplus food would have brought the entire band or even several bands together. The wide-ranging sources of the stone used to make the flake tools found at the Lindenmeier site, in northern Colorado, suggest that more than one band may have gathered there to take part in communal bison hunting.

Bands were territorial, and resources within the territory were exploited systematically, although boundaries were less distinct than those defining modern-day states or countries. Bands were exogamous—that is, members took partners from outside the band—and this involved crossing territorial boundaries. In hunting societies such as these, it was the women who moved to the husband's residence, since it was vitally important to survival that the intimate knowledge of the hunting territory be passed on from father to son. Hunting groups such as these had a special relationship to the animal world. Hunting magic and ritual dominated most animal hunting activities, especially where the chances of failure were high, as in the case of a communal bison kill. These people believed that animals made themselves available to humans, but that a well-defined measure of respect was expected in return. The animal spirits had to be treated appropriately at every stage of the hunting process, and if this was not done, the animals would no longer make themselves available. The shaman, or medicine man, was present at communal hunts to ensure that the proper rituals were observed. Shamans also had a role in curing sickness, which was generally believed to result from breaking taboos, rather than from natural causes.

These Paleoindian hunting societies survived for thousands of years, and one of the secrets of their enduring success was cooperation. No matter who killed the animal or gathered the food, all members of the group shared, since not every hunter or gatherer could be successful at every attempt. When food was in short supply, sharing was even more important. It was considered reprehensible to hoard food, and in these kinds of societies it was next to impossible to do this without being found out. The people most admired were those who were the best providers and who shared the most. Storing food in caches was quite different from hoarding and was strictly a short-term measure to provide for periods of extreme cold or other times when it was not possible to go out and search for food. Caching may also have been more common in the Arctic, and was not as necessary for groups living in warmer areas.

It is difficult to imagine humans, however acclimatized they may have been, surviving the winter in the colder regions without adequate clothing and shelter. The archaeological record tells us very little about this aspect of Paleoindian life, and very few sites offer clues to the nature of winter living quarters. What evidence there is suggests that they lived in simple structures, perhaps similar to the tipis of the North American Plains Indians, consisting of hides of large animals stretched over a conical framework of poles. Such shelters are, in fact, remarkably warm in winter when well insulated with snow and heated with small fires. Even so, it is difficult to imagine them surviving subzero temperatures without adequate clothing, especially footwear, and we know that they possessed the tools to make such clothing. For example, eyed bone needles not unlike the metal ones of today have been recovered from Folsom sites.

Ideally, the archaeologist hopes to find sites with undisturbed cultural levels, containing characteristic projectile points along with organic material that can be radiocarbon-dated. Unfortunately, this rarely happens. But we should never lose sight of the fact that Paleoindian archaeology deals with people, much like ourselves, who managed to survive under very difficult environmental conditions, to raise families, and to maintain the continuity of human populations from one generation to the next. It is much more than the mere study of artifacts of stone and bone.

Just Another Species of Big Mammal, J. Diamond

A zoologist from Outer Space would immediately classify us as just a third species of chimpanzee, along with the pygmy chimp of Zaire and the common chimp of the rest of tropical Africa. Molecular genetic studies of the last half-dozen years have shown that we continue to share over 98 percent of our genetic program with the other two chimps. The overall genetic distance between us and chimps is even smaller than the distance between such closely related bird species as red-eyed and white-eyed vireos. Thus, we still carry most of our old biological baggage with us. Since Darwin's time, fossilized bones of hundreds of creatures variously intermediate between apes and modern humans have been discovered, making it impossible for a reasonable person to deny the overwhelming evidence. What once seemed absurd—our evolution from apes— actually happened.

Yet the discoveries of many missing links have only made the problem more fascinating, without fully solving it. The few bits of new baggage we acquired—the 2 percent difference between our genes and those of chimps—must have been responsible for all of our seemingly unique properties. We underwent some small changes with big consequences rather quickly and recently in our evolutionary history. In fact, as recently as a hundred thousand years ago that zoologist from Outer Space would have viewed us as just one more species of big mammal. Granted, we had a couple of curious behaviors, notably our control of fire and our dependence on tools. But those behaviors would have seemed no more curious to the extraterrestrial visitor than would the behaviors of beavers and bowerbirds. Somehow, within a few tens of thousands of years—a period that is almost infinitely long when measured against one person's memory but is only a tiny fraction of our species' separate history—we had begun to demonstrate the qualities that make us unique and fragile. What were those few key ingredients that made us human? Since our unique properties appeared so recently and involved so few changes, those properties or at least their precursors must already be present in animals. What are those animal precursors of art and language, of genocide and drug abuse?

It's restating the obvious to mention that we feed our children after the age of weaning, instead of leaving them to find food on their own; that most adult men and women associate in couples; that most fathers as well as mothers care for their children; that many people live long enough to experience grandchildren; and that women undergo menopause. To us, these traits are the norm, but by the standards of our closest animal relatives they are bizarre. They constitute major changes from our ancestral condition, though they don't fossilize and so we don't know when they arose. For that reason they receive much briefer treatment in books on human paleontology than do our changes in brain size and pelvis. But they were crucial to our uniquely human cultural development, and merit equal attention.

Our large brain was surely prerequisite for the development of human language and innovativeness. One might therefore expect the fossil record to show a close parallel between increased brain size and sophistication of tools. In fact, the parallel is not at all close. This proves to be the greatest surprise and puzzle of human evolution. Stone tools remained very crude for hundreds of thousands of years after we had undergone most of our expansion of brain size. As recently as forty thousand years ago, Neanderthals had brains even larger than those of modern humans, yet their tools show no signs of innovativeness and art. Neanderthals were still just another species of big mammal. Even for tens of thousands of years after some other human populations had achieved virtually modern skeletal anatomy, their tools remained as boring as those of Neanderthals.

These paradoxes sharpen the conclusion drawn from the evidence of molecular biology. Within that modest percentage of difference between our genes and chimpanzee genes, there must have been an even smaller percentage not involved in the shapes of our bones but responsible for the distinctively human attributes of innovation, art, and complex tools. At least in Europe, those attributes appeared unexpectedly suddenly, at the time of the replacement of Neanderthals by Cro-Magnons. That is the time when we finally ceased to be just another species of big mammal.

A Tale of Three Chimps, J. Diamond

For centuries it's been clear approximately where we fit into the animal kingdom. We are obviously mammals, the group of animals characterized by having hair, nursing their young, and other features. Among mammals we are obviously primates, the group of mammals including monkeys and apes. We share with other primates numerous traits lacking in most other mammals, such as flat fingernails and toenails rather than claws, hands for gripping, a thumb that can be opposed to the other four fingers, and a penis that hangs free rather than being attached to the abdomen. Already by the second century A.D., the Greek physician Galen deduced our approximate place in nature correctly when he dissected various animals and found that a monkey was "most similar to man in viscera, muscles, arteries, veins, nerves, and in the form of bones."

Depending on what anatomical characters one considers most important and how one interprets them, biologists differ as to whether we are most closely related to the orangutan (the minority view), with chimps and gorillas having branched off our family tree before we split off from orangutans, or whether we are instead closest to chimps and gorillas (the majority view), with the ancestors of orangutans having gone their separate way earlier. Within the majority, most biologists have thought that gorillas and chimps are more like each other than either is like us, implying that we branched off before the gorillas and chimps diverged from each other.

Let's pause to let some of the implications of these momentous numbers sink in: The gorilla must have branched off from our family tree slightly before we separated from the common and pygmy chimpanzees. The chimpanzees, not the gorilla, are our closest relatives. Put another way, the chimpanzees' closest relative is not the gorilla but humans. Traditional taxonomy has reinforced our anthropocentric tendencies by claiming to see a fundamental dichotomy between mighty man, standing alone on high, and the lowly apes all together in the abyss of bestiality. Now, future taxonomists may see things from the chimpanzees' perspective: a weak dichotomy between slightly higher apes (the three chimpanzees, including the "human chimpanzee") and slightly lower apes (gorillas, orangutans, gibbons). The traditional distinction between "apes" (defined as chimps, gorillas, etc.) and humans misrepresents the facts.

It would be absurd to think that a human hallmark such as art, language, or aggression depends on a single gene. Behavioral differences among individual humans are obviously subject to enormous environmental influences, and it's very controversial what role genes play in such individual differences. However, for those behaviors that differ consistently between chimps and humans, genetic differences are likely to be involved, even though we can't yet specify the genes responsible. For instance, the ability of humans but not chimps to speak surely depends on differences in genes specifying the anatomy of the voice box and the wiring of the brain. A young chimpanzee brought up in a psychologist's home along with the psychologist's human baby of the same age still continued to look like a chimp and didn't learn to talk or walk erect. But whether an individual human grows up to be fluent in English or Korean is independent of genes and dependent solely on childhood linguistic environment, as proved by the linguistic attainments of Korean infants adopted by English-speaking parents.

At present, all we can say with confidence is this: much of our DNA is junk; at least some of the 1.6 percent that differs between us and chimps is already known to be junk; and the functionally significant differences must be confined to some as-yet-unidentified small fraction of 1.6 percent....While we don't know which particular genes or nucleotide bases are the crucial ones accounting for our observed differences from chimps, there are numerous precedents for one or a few genes having big impacts.

Somewhere along the scale from bacteria to humans, we have to decide where killing becomes murder, and where eating becomes cannibalism. Most people draw those lines between humans and all other species. However, quite a few people are vegetarians, unwilling to eat any animal (yet willing to eat plants). And an increasingly vocal minority, belonging to the animal-rights movement, object to medical experiments on animals—or at least on certain animals. That movement is especially exercised about research on cats and dogs and primates, less concerned about mice, and generally silent about insects and bacteria. If our ethical code makes a purely arbitrary distinction between humans and all other species, then we have a code based on naked selfishness devoid of any higher principle. If our code instead makes distinctions based on our superior intelligence, social relationships, and capacity for feeling pain, then it becomes difficult to defend an all-or-nothing code that draws a line between all humans and all animals.

The Great Leap Forward, J. Diamond

For most of the many millions of years since our lineage diverged from that of apes, we remained little more than glorified chimpanzees in how we made our living. As recently as forty thousand years ago, western Europe was still occupied by Neanderthals, primitive beings for whom art and progress scarcely existed. Then there came an abrupt change, as anatomically modern people appeared in Europe, bringing with them art, musical instruments, lamps, trade, and progress. Within a short time, the Neanderthals were gone.

That Great Leap Forward in Europe was probably the result of a similar leap that had occurred over the course of the preceding few tens of thousands of years in the Near East and Africa. Even a few dozen millennia, though, is a trivial fraction (less than 1 percent) of our long history separate from ape history. Insofar as there was any single point in time when we could be said to have become human, it was at the time of that leap. Only a few more dozen millennia were needed for us to domesticate animals, develop agriculture and metallurgy, and invent writing.

To place human evolution in a time perspective, recall that life originated on Earth several billion years ago, and that the dinosaurs became extinct around sixty-five million years ago. It was only between six and ten million years ago that our ancestors finally became distinct from the ancestors of chimps and gorillas. Hence human history constitutes only an insignificant portion of the history of life. Science-fiction films that depict cavemen fleeing from dinosaurs are just that, science fiction. The shared ancestor of humans, chimps, and gorillas lived in Africa, to which chimps and gorillas are still confined, and to which we remained confined for millions of years. Initially, our own ancestors would have been classified as merely another species of ape, but a sequence of three changes launched us in the direction of modern humans. The first of these changes had occurred by around four million years ago, when the structure of fossilized limb bones shows that our ancestors were habitually walking upright on the two hind limbs. In contrast, gorillas and chimps walk upright only occasionally, and usually proceed on all fours. The upright posture freed our ancestors' forelimbs to do other things, among which toolmaking proved the most important. The second change occurred around three million years ago, when our lineage split into at least two distinct species. The third and last of the big changes that began to make our ancestors more human and less apelike was the regular use of stone tools. This is a human hallmark with clear animal precedents.

While early humans ate some meat, we don't know how much meat they ate, or whether they got the meat by hunting or scavenging. It's not until much later, around 100,000 years ago, that we have good evidence about human hunting skills, and it's clear that humans then were still very ineffective big-game hunters. Hence human hunters of 500,000 years ago and earlier must have been even more ineffective.

Although Peking Man may have already used fire hundreds of thousands of years earlier, Neanderthals were the first people to leave undisputed evidence of fire's regular use. Neanderthals may also have been the first people who regularly buried their dead, but that's disputed, and whether it would imply religion is a matter of pure speculation. Finally, they regularly took care of their sick and aged. Most skeletons of older Neanderthals show signs of severe impairment, such as withered arms, healed but incapacitating broken bones, tooth loss, and severe osteoarthritis. Only care by young Neanderthals could have enabled such older Neanderthals to stay alive to the point of such incapacitation.

The scene that the human world presented from around 100,000 to somewhat before fifty thousand years ago was this. Northern Europe, Siberia, Australia, oceanic islands, and the whole New World were still empty of people. In Europe and western Asia lived the Neanderthals; in Africa, people increasingly like us moderns in their anatomy; and in eastern Asia, people unlike either the Neanderthals or Africans but known from only a few bones. All three of these populations were, at least initially, still primitive in their tools, behavior, and limited innovativeness. The stage was set for the Great Leap Forward. Which among these three contemporary populations would take that leap?

The evidence for an abrupt rise is clearest in France and Spain, in the Late Ice Age around forty thousand years ago. Where there had previously been Neanderthals, anatomically fully modern people (often known as Cro-Magnons, from the French site where their bones were first identified) now appear. Had one of those gentlemen or ladies strolled down the Champs-Élysées in modern attire, he or she would not have stood out from the Parisian crowds in any way. As significant to archaeologists as the Cro-Magnons' skeletons are their tools, which are far more diverse in form and obvious in function than any in the earlier archaeological record. The tools suggest that modern anatomy had at last been joined by modern innovative behavior.

Several types of evidence testify to the effectiveness of Late Ice Age people as big-game hunters. Their sites are much more numerous than those of earlier Neanderthals or Middle Stone Age Africans, implying more success at obtaining food. Numerous species of big animals that had survived many previous ice ages became extinct toward the end of the last Ice Age, suggesting that they were exterminated by human hunters' new skills.

Improved technology now allowed humans to occupy new environments, as well as to multiply in previously occupied areas of Eurasia and Africa. Australia was first reached by humans around fifty thousand years ago, implying watercraft capable of crossing water gaps as wide as sixty miles between eastern Indonesia and Australia. The occupation of northern Russia and Siberia by at least twenty thousand years ago depended on many advances: sewn clothing, whose existence is reflected in eyed needles, cave paintings of parkas, and grave ornaments marking outlines of shirts and trousers; warm furs, indicated by fox and wolf skeletons minus the paws (removed in skinning and found in a separate pile); elaborate houses (marked by postholes, pavements, and walls of mammoth bones), with elaborate fireplaces; and stone lamps to hold animal fat and light the long Arctic nights. The occupation of Siberia and Alaska in turn led to the occupation of North America and South America around eleven thousand years ago.

The evident aesthetic sense reflected in Late Ice Age trade in ornaments relates to the achievements for which we most admire the Cro-Magnons: their art. Best known, of course, are the rock paintings from caves like Lascaux, with stunning polychrome depictions of now-extinct animals. But equally impressive are the bas reliefs, necklaces and pendants, fired-clay ceramic sculptures, Venus figurines of women with enormous breasts and buttocks, and musical instruments ranging from flutes to rattles.

What happened when invading Cro-Magnons met the resident Neanderthals? We can be certain only of the end result: within a short time, no more Neanderthals. The conclusion seems to me inescapable that Cro-Magnon arrival somehow caused Neanderthal extinction....My guess is that events in Europe at the time of the Great Leap Forward were similar to events that have occurred repeatedly in the modern world, whenever a numerous people with more advanced technology invades the lands of a much less-numerous people with less-advanced technology. For instance, when European colonists invaded North America, most North American Indians proceeded to die of introduced epidemics; most of the survivors were killed outright or driven off their land; some of the survivors adopted European technology (horses and guns) and resisted for some time; and many of the remaining survivors were pushed onto lands that Europeans did not want, or else intermarried with Europeans.

Thus, we have a tentative picture of anatomically modern people arising in Africa over a hundred thousand years ago, but initially making the same tools as Neanderthals and having no advantage over them. By perhaps sixty thousand years ago, some magic twist of behavior had been added to the modern anatomy. That twist (of which more in a moment) produced innovative, fully modern people who proceeded to spread westward from the Near East into Europe, quickly supplanting Europe's Neanderthals. Presumably, those modern people also spread east into Asia and Indonesia, supplanting the earlier people there, of whom we know little. Some anthropologists think that skull remains of those earlier Asians and Indonesians show traits recognizable in modern Asians and Aboriginal Australians. If so, the invading moderns may not have exterminated the original Asians without issue, as they did the Neanderthals, but instead interbred with them. Two million years ago, several protohuman lineages had coexisted side by side until a shakedown left only one. It now appears that a similar shakedown occurred within the last sixty thousand years, and that all of us alive in the world today are descended from the winner of that shakedown. What was the last missing ingredient whose acquisition helped our ancestor to win?

The identity of the ingredient that produced the Great Leap Forward poses an archaeological puzzle without an accepted answer. It doesn't show up in fossil skeletons. It may have been a change in only 0.1 percent of our DNA. What tiny change in genes could have had such enormous consequences? Like some other scientists who have speculated about this question, I can think of only one plausible answer: the anatomical basis for spoken complex language.

Given the capability for symbolic communication using sounds, why have apes not gone on to develop much more complex natural languages of their own? The answer seems to involve the structure of the larynx, tongue, and associated muscles that give us fine control over spoken sounds. Like a Swiss watch, all of whose many parts have to be well designed for the watch to keep time at all, our vocal tract depends on the precise functioning of many structures and muscles. Chimps are thought to be physically incapable of producing several of the commonest human vowels. If we too were limited to just a few vowels and consonants, our own vocabulary would be greatly reduced.

It's easy to appreciate how a tiny change in anatomy resulting in capacity for speech would produce a huge change in behavior. With language, it takes only a few seconds to communicate the message, "Turn sharp right at the fourth tree and drive the male antelope toward the reddish boulder, where I'll hide to spear it." Without language, that message could not be communicated at all. Without language, two protohumans could not brainstorm together about how to devise a better tool, or about what a cave painting might mean. Without language, even one protohuman would have had difficulty thinking out for himself or herself how to devise a better tool....If the missing ingredient did consist of changes in our vocal tract that permitted fine control of sounds, then the capacity for innovation would follow eventually. It was the spoken word that made us free.

I've argued that we were fully modern in anatomy and behavior and language by forty thousand years ago, and that a Cro-Magnon could have been taught to fly a jet airplane. If so, why did it take so long after the Great Leap Forward for us to invent writing and build the Parthenon? The answer may be similar to the explanation why the Romans, great engineers that they were, didn't build atomic bombs. To reach the point of building an A-bomb required two thousand years of technological advances beyond Roman levels, such as the invention of gunpowder and calculus, the development of atomic theory, and the isolation of uranium. Similarly, writing and the Parthenon depended on tens of thousands of years of cumulative developments after the arrival of Cro-Magnons—developments that included the bow and arrow, pottery, domestication of plants and animals, and many others. Until the Great Leap Forward, human culture had developed at a snail's pace for millions of years. That pace was dictated by the slow pace of genetic change. After the leap, cultural development no longer depended on genetic change. Despite negligible changes in our anatomy, there has been far more cultural evolution in the past forty thousand years than in the millions of years before.

An Animal with a Strange Life Cycle, J. Diamond

In some respects, we're greatly different from apes. Here are some obvious differences whose functions are well understood: Human babies continue to have all food brought to them by their parents even after weaning, whereas weaned apes gather their own food. Most human fathers as well as mothers, but only chimpanzee mothers, are closely involved in caring for their young. Like seagulls but unlike apes or most other mammals, we live in dense breeding colonies of nominally monogamous couples, some of whom also pursue extramarital sex. All these traits are as essential as large brain cases for the survival and education of human offspring. That's because our elaborate, tool-dependent methods of obtaining food make weaned human infants incompetent to feed themselves. Our infants first require a long period of food provisioning, training, and protection—an investment much more taxing than that facing the ape mother. Hence human fathers who want their offspring to survive to maturity have generally assisted their mates with more than just sperm, the sole parental input of an orangutan father.

We are unusual in having sex mainly in private and for fun, rather than mainly in public and only when the female is able to conceive. Ape females advertise the time when they are ovulating; human females conceal it even from themselves. While anatomists understand the value of men's moderate testis size, an explanation for men's relatively enormous penis still escapes us. Whatever their explanation, all these features, too, are part of what defines humanity. Certainly, it is hard to picture how fathers and mothers could cooperate harmoniously at rearing their children if human females resembled some primate females in having their genitalia turn bright red at the time of ovulation, becoming sexually receptive only at that time, flaunting their red badge of receptivity, and proceeding to have sex in public with any passing male. Thus, human society and child rearing rest not only on the skeletal changes mentioned in Part One, but also on these remarkable new features of our life cycle. Unlike the case with our skeletal changes, however, we can't follow through our evolutionary history the timing of each of these life-cycle changes, because they leave no direct fossil imprint. As a result, they receive only brief attention in paleontology texts despite their importance.

Enormous cultural influences obviously operate on our motivation for providing child care or seeking extramarital sex, and there is no reason to believe that genes contribute significantly to differences among individual people in these traits. However, genetic differences between humans and the other two chimpanzee species probably do contribute to the consistent differences in many life-cycle traits between all human populations and all chimpanzee populations. There is no human society, regardless of its cultural practices, whose men have chimpanzee-sized testes and whose women forgo menopause. Among those 1.6 percent of our genes that differ between us and chimps and that have any function, a significant fraction is likely to be involved in specifying traits of our life cycle.

Mate selection is a decision of major consequence in the human life cycle, because married couples share parental responsibilities as well as sexual involvement. Precisely because care of human children demands such heavy and prolonged parental investment, we have to select our coinvestor much more carefully than does a baboon. Nevertheless, it turns out that we can find animal precedents for our procedure in choosing sex partners, by going beyond primates to rats and birds. Those mate-selection criteria of ours prove relevant to the vexed question of human racial variation. Humans native to different parts of the globe vary conspicuously in external appearance, as do gorillas, orangutans, and most other animal species occupying a sufficiently extensive geographic range. Some of the geographic variation in our appearance surely reflects natural selection molding us to local climate, just as weasels in areas with winter snow develop white fur in winter for better camouflage and survival. But I'll argue that our visible geographic variability arose mainly through sexual selection, as a result of our mate-choice procedures.

The Evolution of Human Sexuality, J. Diamond

The evolution of human sex organs has been intertwined with that of human tool use, large brains, and child-rearing practices. Our progress from being just another species of big mammal to being uniquely human therefore depended on the remodeling not just of our pelvises and skulls, but also of our sexuality.

The reasons that human infants are totally incompetent at food gathering are actually twofold: mechanical and mental. First, making and wielding the tools used to obtain food requires fine finger coordination that children take years to develop. Just as my four-year-old sons still can't tie their own shoelaces, four-year-old hunter-gatherer children can't sharpen a stone axe or build a dugout canoe. Second, we depend on much more brainpower than do other animals in acquiring food, because we have a much more varied diet and more varied and complicated food-gathering techniques....Weaned human infants can't support themselves because they lack these mechanical and mental skills. They need adults to teach them, and they also need adults to feed them for the decade or two that they are being taught.

At least officially, human pairing is more or less monogamous in most modern political states, but is "mildly polygynous" among most surviving hunter-gatherer bands, which are better models for how mankind lived over the last million years. By "mildly polygynous," I mean that most hunter-gatherer men can support only a single family, but a few powerful men have several wives. Polygyny on the scale of elephant seals, among which powerful males have dozens of wives, is impossible for hunter-gatherer men, because they differ from elephant seals in having to provide child care. The big harems for which some human potentates are famous didn't become possible until the rise of agriculture and centralized government let a few princes tax everyone else in order to feed the royal harem's babies.

In a monogamous species, every male can win a female, but in a very polygynous species most males languish without any mates, because a few dominant males have succeeded in rounding up all the females into their harems. Hence, the bigger the harem, the fiercer is the competition among males and the more important it is for a male to be big, since the bigger male usually wins the fights. We humans, with our slightly bigger males and slight polygyny, fit this pattern....Because competition for mates is fiercer in polygynous than monogamous species, the polygynous species also tend to have more marked differences between males and females in other respects besides body size. These differences are the secondary sexual characteristics that play a role in attracting mates.

For animals, copulation is a dangerous luxury. While occupied in flagrante delicto, an animal is burning up valuable calories, neglecting opportunities to gather food, vulnerable to predators eager to eat it, and vulnerable to rivals eager to usurp its territory. Hence copulation is something to be accomplished in the minimum time required to do the job of fertilization. In contrast, human sex, as a device to achieve fertilization, would have to be rated a huge waste of time and energy, an evolutionary failure. Had we retained a proper estrus cycle like other mammals, the wasted time could have been diverted by our hunter-gatherer ancestors to butchering more mastodons. By this results-oriented view of sex, any hunter-gatherer band whose females advertised their estrus period could thereby have fed more babies and outcompeted neighboring bands.

The Science of Adultery, J. Diamond

For men, the minimum effort needed to sire an offspring is the act of copulation, a brief expenditure of time and energy. The man who sires a baby by one woman one day is biologically capable of siring a baby by another woman the next day. For women, however, the minimum effort consists of copulation plus pregnancy plus (throughout most of human history) several years spent nursing—a huge commitment of time and energy. Thus, a man potentially can sire far more offspring than can a woman. A nineteenth-century visitor who spent a week at the court of the Nizam of Hyderabad, a polygamous Indian potentate, reported that four of the Nizam's wives gave birth within eight days, and that nine more births were anticipated for the following week. The record lifetime number of offspring for a man is 888, sired by Emperor Moulay Ismail the Bloodthirsty of Morocco, while the corresponding record for a woman is only sixty-nine (a nineteenth-century Moscow woman specializing in triplets). Few women have topped twenty children, whereas some men easily do so in polygynous societies. As a result of this biological difference, a man stands to gain much more from extramarital sex (EMS) or polygamy than does a woman—if one's sole criterion is number of offspring born.

The other sexual asymmetry relevant to mating strategy involves confidence that one really is the biological parent of one's putative offspring. A cuckolded animal, deceived into rearing offspring not its own, has thereby lost the evolutionary game while advancing the victory of another player, the real parent. Barring a switch of babies in the hospital nursery, women cannot be "cuckolded": they see their baby emerge from them. Nor can there be cuckoldry of males in animal species practicing external fertilization (i.e., fertilization of eggs outside the female's body). For instance, some male fish watch a female shed eggs, then immediately deposit sperm on the eggs and scoop them up to care for them, secure in their paternity. However, men and other male animals practicing internal fertilization— fertilization of eggs inside the female's body—can readily be cuckolded. All that the putative father knows for sure is that his sperm went into the mother, and eventually an offspring came out. Only observation of the female throughout her whole fertile period can absolutely exclude the possibility that some other male's sperm also entered and did the actual fertilizing.

Surveys comparing men with women in various cultures scattered around the world typically purport to find the following differences: men are more interested in EMS than are women; men are more interested than women in seeking a variety of sexual partners for the sake of variety itself; women's motives for EMS are more likely to be marital dissatisfaction and/or a desire for a lasting new relationship; and men are less selective in taking on casual female sexual partners than women are in taking on casual male partners.

Adultery laws provide a clear example of how men have dealt with these dilemmas. Until recently, essentially all such laws—Hebraic, Egyptian, Roman, Aztec, Moslem, African, Chinese, Japanese, and others—were asymmetrical. They existed to secure a married man's confidence in his paternity of his children, and for no other purpose. Hence these laws define adultery by the marital status of the participating woman; that of the participating man is irrelevant. EMS by a married woman is considered an offense against her husband, who is commonly entitled to damages, often including violent revenge or else divorce with refund of the bride price. EMS by a married man is not considered an offense against his wife. Instead, if his partner in adultery is married, the offense is against her husband; if she is unmarried, the offense is against her father or brothers (because her value as prospective bride is reduced).

Asymmetric adultery laws, tattooing of wives after insemination, virtual imprisonment of women, genital mutilation of women: these behaviors are unique to the human species, defining humanity as much as does invention of the alphabet. More exactly, they are new means to the old evolutionary goal of males' promoting their genes. Some of our other means to this goal are ancient ones shared with many animals, including jealous murder, infanticide, rape, intergroup warfare, and adultery itself.

In short, we evolved, like other animals, to win at the contest of leaving as many descendants as possible. Much of the legacy of that game strategy is still with us. But we have also chosen to pursue ethical goals, which can conflict with the goals and methods of our reproductive contest. Having that choice among goals represents one of our most radical departures from other animals.

Sexual Selection, and the Origin of Human Races, J. Diamond

People tend to marry a person who looks like the parent or sibling of the opposite sex. That's because we begin already as children to develop our search image of a future sex partner, and that image is heavily influenced by the people of opposite sex whom we see most often. For most of us that's our mother (or father) and sister (or brother), plus close childhood friends.

Racial variation has characterized humans for at least the past several thousand years, and possibly much longer. Already around 450 B.C., the Greek historian Herodotus described the Pygmies of West Africa, the black-skinned Ethiopians, and a blue-eyed, red-haired tribe in Russia. Ancient paintings, mummies from Egypt and Peru, and bodies of people preserved in European peat bogs confirm that people several thousand years ago differed in their hair and faces much as they do today. Origins of modern races can be pushed back still further, to at least ten thousand years ago, since fossil skulls of that age from various parts of the world differ in many of the same respects that modern skulls from the same regions differ.

Darwin despaired of imputing human racial variation to his own concept of natural selection. He finally dismissed the attempt with a succinct statement: "Not one of the external differences between the races of man are of any direct or special service to him." When Darwin came up with a theory that he preferred, he termed it "sexual selection" to contrast with natural selection, and he devoted an entire book to explaining it. The basic notion behind this theory is easily grasped. Darwin noted many animal features that had no obvious survival value but that did play an obvious role in securing mates, either by attracting an individual of the opposite sex or by intimidating a rival of the same sex. Familiar examples are the tails of male peacocks, the manes of male lions, and the bright red buttocks of female baboons in estrus. If an individual male is especially successful at attracting females or intimidating rival males, that male will leave more descendants and will tend to pass on his genes and traits—as a result of sexual selection, not natural selection. The same argument applies to female traits as well.

For sexual selection to work, evolution must produce two changes simultaneously: one sex must evolve some trait, and the other sex must evolve in lockstep a liking for that trait. Female baboons could hardly afford to flash red buttocks if the sight revolted male baboons to the point of their becoming impotent. As long as the female has it and the male likes it, sexual selection could lead to any arbitrary trait, just as long as it doesn't impair survival too much.

Could human breast shape and skin color similarly be the outcome of sexual preferences that vary arbitrarily from area to area? After 898 pages of his book Darwin convinced himself that the answer to this question was a resounding "yes." He noted that we pay inordinate attention to breasts, hair, eyes, and skin color in selecting our mates and sex partners. He noted also that people in different parts of the world define beautiful breasts, hair, eyes, and skin by what is familiar to them. Thus, Fijians, Hottentots, and Swedes each grow up with their own learned, arbitrary beauty standards, which tend to maintain each population in conformity with those standards, since individuals deviating too far from the standards would find it harder to obtain a mate.

Just as in animals, sexual selection had a big effect in molding the external traits by which we pick our mates. For us humans those traits are especially the skin, eyes, hair, breasts, and genitals. In each part of the world those traits evolved in lockstep with our imprinted aesthetic preferences to reach different, somewhat arbitrary end points. Which particular human population ended up with any given eye or hair color may have been partly an accident of what biologists term the "founder effect." That is to say, if a few individuals colonize an empty land and their descendants then multiply to fill the land, the genes of those few founding individuals may still dominate the resulting population many generations later. Just as some birds of paradise ended up with yellow plumes and others with black plumes, so some human populations ended up with yellow hair and others with black hair, some with blue eyes and others with green eyes, some with orange nipples and others with brown nipples.

I've argued that much of our variability is a by-product of a distinctive feature of the human life cycle: our choosiness with respect to our spouses and sex partners. I don't know of any other wild animal species in which eye color of different populations can be green, blue, gray, brown, or black, while skin color varies geographically from pale to black and hair is either red, yellow, brown, black, gray, or white. There may be no limits, except those imposed by evolutionary time, on the colors with which sexual selection can adorn us. If humanity survives another twenty thousand years, I predict that there will be women with naturally green hair and red eyes—and men who think such women are the sexiest.

Why Do We Grow Old and Die?, J. Diamond

As language evolved, far more information became available to us to pass on than previously. Until the invention of writing, old people acted as the repositories of that transmitted information and experience, just as they continue to do in tribal societies today. Under hunter-gatherer conditions, the knowledge possessed by even one person over the age of seventy could spell the difference between survival and starvation for a whole clan. Our long life span, therefore, was important for our rise from animal to human status.

Obviously, our ability to survive to a ripe old age depended ultimately upon advances in culture and technology. It's easier to defend yourself against a lion if you're carrying a spear than just a hand-held stone, and easier yet with a high-powered rifle. However, advances in culture and technology alone would not have been enough, unless our bodies had also become redesigned to last longer. No caged ape in a zoo, enjoying all the benefits of modern human technology and veterinary care, reaches eighty. We'll see in this chapter that our biology became remolded to the increased life expectancy that our cultural advances made possible. In particular, I'd guess that Cro-Magnon tools weren't the sole reason why Cro-Magnons lived on the average longer than Neanderthals. Instead, around the time of the Great Leap Forward our biology must also have changed so that we aged more slowly. That may even have been the time when menopause, the concomitant of aging that paradoxically functions to let women live longer, evolved.

For many or most species, males suffer greater accidental mortality than females, partly because males put themselves at greater risk by fighting and bold displays. This is certainly true of human males today and has probably been so throughout our history as a species: men are the sex most likely to die in wars against men of other groups, and in individual fights within a group. Correlated with this greater accidental death rate of men, men also age faster and have a higher nonaccidental death rate than women. At present, women's life expectancy is about six years greater than that of men; some of this difference is because more men than women are smokers, but there is a sex-linked difference in life expectancy even among nonsmokers. These differences suggest that evolution has programmed us so that women put more energy into self-repair, while men put more energy into fighting. Expressed another way, it just isn't worth as much to repair a man as it is to repair a woman. But I don't mean to denigrate male fighting, which serves a useful evolutionary purpose for a man: to gain wives and to secure resources for his children and his tribe, at the expense of other men and their children and tribe.

Since transmitting one's genes to the next generation is what drives evolution, other animal species rarely survive past reproductive age. Instead, Nature programs death to coincide with the end of fertility, because there is then no longer any evolutionary benefit to gain from keeping one's body in good repair. It's an exception in need of explanation to realize that women are programmed to live for decades after menopause, and that men are programmed to live to an age when most men are no longer busy siring babies. But the explanation becomes apparent on reflection. The intense phase of parental care is unusually protracted in the human species and lasts nearly two decades. Even those older people whose own children have reached adulthood are tremendously important to the survival of not just their children but of their whole tribe. Especially in the days before writing, they acted as the carriers of essential knowledge. Nature has programmed us with the capacity to keep the rest of our bodies in reasonable repair even at an age when the female reproductive system itself has fallen into disrepair.

Human female menopause probably resulted from two other distinctively human characteristics: the exceptional danger that childbirth poses to the mother, and the danger that a mother's death poses to her offspring. Recall the enormous size of the human infant at birth relative to its mother: our big seven-pound babies emerging from hundred-pound mothers, compared to little four-pound gorilla babies emerging from two-hundred-pound gorilla mothers. As a result, childbirth is dangerous to women. Especially before the advent of modern obstetrics, women often died in childbirth, whereas mother gorillas and chimps virtually never do.

A hunter-gatherer mother with several children was gambling the lives of those children at every subsequent childbirth. Since her investment in those prior children increased with their age, and since her own risk of death in childbirth also increased with her age, the odds of her gamble's paying off got worse and worse as she got older. When you already have three children alive but still dependent on you, why risk those three for a fourth? Those worsening odds probably led through natural selection to menopausal shutdown of human female fertility, in order to protect a mother's prior investment in kids....Thus, the longer life span of modern humans as compared to that of apes does not rest only on cultural adaptations, such as tools to acquire food and deter predators. It also rests on the biological adaptations of menopause and increased investment in self-repair. Whether those biological adaptations developed especially at the time of the Great Leap Forward or earlier, they rank among the life-history changes that permitted the rise of the third chimpanzee to humanity.

Uniquely Human, J. Diamond

If genetically specified features were our sole distinctions, we wouldn't stand out among animals, and we wouldn't now be threatening the survival of ourselves and other species. Other animals, such as ostriches, walk erect on two legs. Others have relatively large brains, though not as large as ours. Others live monogamously in colonies (many seabirds), or are very long-lived (albatrosses and tortoises). Instead, our uniqueness lies in the cultural traits that rest on those genetic foundations and that in turn give us our power. Our cultural hallmarks include spoken language, art, tool-based technology, and agriculture. But if we stopped there, we'd have a one-sided and self-congratulatory view of our uniqueness. The hallmarks I just mentioned are ones that we're proud of. Yet the archaeological record shows the introduction of agriculture to have been a mixed blessing, seriously harming many people while benefiting others. Chemical abuse is a wholly ugly human hallmark. At least it doesn't threaten our survival, as do two of our other cultural practices: genocide, and mass exterminations of other species. We're uncomfortable about whether to regard these as occasional pathological aberrations, or as features no less basic to humanity than the traits we're proudest of.

All of these cultural features that define humanity are seemingly absent in animals, even in our closest relatives. They must have arisen sometime after our ancestors parted company from the other chimpanzees around seven million years ago. Furthermore, while we have no way of knowing whether Neanderthals spoke or indulged in drug abuse and genocide, they certainly didn't have agriculture, art, or the capacity to build radios. Hence these latter traits must be very recent human innovations of the last few tens of thousands of years. But they couldn't have arisen from nothing. There had to have been animal precursors, if we could only recognize them. For each of our defining cultural traits, we need to ask: What were those precursors? When in our ancestry did the trait approach its modern form? What were the early stages of its evolution like, and can those stages be traced archaeologically? We're unique on Earth, but how unique are we in the universe?

Among our unique cultural traits, art is perhaps the noblest human invention. There seems to be a gulf separating human art, supposedly created just for pleasure and doing nothing to perpetuate our genes, from any animal behavior. Yet paintings and drawings created by captive apes and elephants, whatever the motives of those animal artists, look so similar to work of human artists that they have fooled experts and have been bought by art collectors. If one nevertheless dismisses those animal artworks as unnatural productions, what is one to say about the carefully arranged colored bowers of normal male bowerbirds? Those bowers play an unquestioned crucial role in passing on genes. I'll argue that human art also had that role originally, and often still does today. Since art, unlike language, does show up in archaeological deposits, we know that human art didn't proliferate until the time of the Great Leap Forward.

Agriculture, another human hallmark, has an animal precedent but not precursor in the gardens of leaf-cutter ants, which lie far from our direct lineage. The archaeological record lets us date our "reinvention" of agriculture to a time long after the Great Leap Forward, within the last ten thousand years. That transition from hunting and gathering to agriculture is generally considered a decisive step in our progress, when we at last acquired the stable food supply and leisure time prerequisite to the great accomplishments of modern civilization. In fact, careful examination of that transition suggests another conclusion: for most people the transition brought infectious diseases, malnutrition, and a shorter life span. For human society in general it worsened the relative lot of women and introduced class-based inequality. More than any other milestone along the path from chimpanzeehood to humanity, agriculture inextricably combines causes of our rise and our fall.

While animal precursors can be identified for all of our hallmarks, they still rank as human hallmarks because we're unique on Earth in the extreme degree to which we've developed them. How unique are we in the universe? Once conditions suitable for life exist on a planet, how likely are intelligent, technologically advanced life forms to evolve? Was their emergence on Earth practically inevitable, and do they now exist on innumerable planets circling other stars?

There is no direct way to prove whether creatures capable of language, art, agriculture, or drug abuse exist elsewhere in the universe, because from Earth we can't detect the existence of those traits on planets of other stars. However, we might be able to detect high technology elsewhere in the universe if it included our own capacity to send out space probes and interstellar electromagnetic signals. I'll conclude this part by examining the ongoing search for extraterrestrial intelligent life. I'll argue that evidence from a quite different field—studies of woodpecker evolution on Earth—instructs us about the inevitability of evolving intelligent life, and therefore about our uniqueness, not only on Earth but also in the accessible universe.

Bridges to Human Language, J. Diamond

Human language origins constitute the most important mystery in understanding how we became uniquely human. After all, language lets us communicate with each other far more precisely than can any animals. It lets us lay joint plans, teach one another, and learn from what others experienced elsewhere or in the past. With it, we can store precise representations of the world within our minds, and encode and process information far more efficiently than can any animals. Without language we could never have conceived and built Chartres Cathedral—or V-2 rockets. These are the reasons that I speculated that the Great Leap Forward (the stage in human history when innovation and art at last emerged) was made possible by the emergence of spoken language as we know it.

Between human language and the vocalizations of any animal lies a seemingly unbridgeable gulf. As has been clear since the time of Darwin, the mystery of human language origins is an evolutionary problem: how was this unbridgeable gulf nevertheless bridged? If we accept that we evolved from animals lacking human speech, then our language must have evolved and become perfected with time, along with the human pelvis, skull, tools, and art. There must once have been intermediate languagelike stages linking Shakespeare's sonnets to monkeys' grunts. Darwin diligently kept notebooks on his children's linguistic development, and he reflected on the languages of "primitive" peoples, in the hope of solving this evolutionary mystery. Unfortunately, the origins of language prove harder to trace than the origins of the human pelvis, skull, tools, and art. All those latter things may survive, and can be recovered and dated, but the spoken word vanishes in an instant.

The average human has a daily working vocabulary of around a thousand words; my compact desk dictionary claims to contain 142,000 words; but only ten calls have been distinguished even for vervets, the most intensively studied mammal. Animals and humans surely do indeed differ in vocabulary size, yet the difference may not be as great as these numbers suggest. Remember how slow our progress has been in distinguishing vervet calls. Not until 1967 did anyone realize that these common animals had any calls with distinct meanings. The most experienced observers of vervets still can't separate some of their calls without machine analysis, and even with machine analysis the distinctness of some of the suspected ten calls remains unproven. Obviously, vervets (and other animals) could have many other calls whose distinctness we haven't yet recognized. There's nothing surprising about our difficulties in distinguishing animal sounds, when one considers our difficulties in distinguishing human sounds. Children devote much of their time for the first several years of their lives to learning how to recognize and reproduce the distinctions in the utterances of adults around them.

I would be surprised if wild chimp and gorilla vocabularies did not eclipse those reported for vervets and comprise dozens of "words," possibly including names for individual animals. In this exciting field in which new knowledge is being added rapidly, we should keep an open mind on how large the vocabulary gap is between apes and humans. The remaining unanswered question concerns whether animal vocal communication involves anything that could be considered grammar or syntax. Humans don't just have vocabularies of thousands of words with different meanings. We also combine those words and vary their forms in ways prescribed by grammatical rules (such as rules of word order) that determine the meaning of the word combinations. Grammar thereby lets us construct a potentially infinite number of sentences from a finite number of words.

Humans, but not vervets, possess grammar, meaning the variations in word order, prefixes, suffixes, and changes in word roots (like they/them/their) that modulate the sense of the roots. A second difference is that vervet vocalizations, if they constitute words at all, stand only for things that one can point to or act out... Still another difference between human and vervet vocalizations is that ours possess a hierarchical structure, such that a modest number of items at each level creates a larger number of items at the next higher level. Our language uses many different syllables, all based on the same set of only a few dozen sounds. We assemble those syllables into thousands of words. Those words aren't merely strung haphazardly together but are organized into phrases, such as prepositional phrases. Those phrases in turn interlock to form a potentially infinite number of sentences. In contrast, vervet calls cannot be resolved into modular elements and lack even a single stage of hierarchical organization.

Because the earliest written languages of five thousand years ago were as complex as those of today, human language must have achieved its modern complexity long before that. Can we at least recognize linguistic missing links by searching for primitive peoples with simple languages that might represent early stages of language evolution? After all, some tribes of hunter-gatherers retain stone tools as simple as those that characterized the whole world tens of thousands of years ago.

Actually, it turns out that there is no correlation between linguistic and social complexity. Technologically primitive people don't speak primitive languages, as I discovered on my first day in the New Guinea highlands among the Fore people. Fore grammar proved deliciously complex, with postpositions like those of the Finnish language, dual as well as singular and plural forms like those of Slovenian, and verb tenses and phrase construction like no language that I had encountered previously. I already mentioned the eight vowel tones of New Guinea's Iyau people, whose sound distinctions proved imperceptibly subtle to professional linguists for years. Thus, while some peoples in the modern world retained primitive tools, none retained primitive languages. Furthermore, Cro-Magnon archaeological sites contain lots of preserved tools but no preserved words. The absence of such linguistic missing links deprives us of what might have been our best evidence about human language origins. We are forced to try more indirect approaches.

Chomsky was convinced that children learning their first language would face an impossible task unless much of language's structure was already preprogrammed into them. Chomsky concluded that we are born with a "universal grammar" already wired into our brains to give us a spectrum of grammatical models encompassing the range of grammars in actual languages. This prewired universal grammar would be like a set of switches, each with various alternative positions. The switch positions would then become fixed to match the grammar of the local language that the growing child hears. However, Bickerton goes further than Chomsky and concludes that we are preprogrammed not just to a universal grammar with adjustable switches, but to a particular set of switch settings: the settings that surface again and again in Creole grammars. The preprogrammed settings can be overridden if they conflict with what a child hears in the local language around it. But if a child hears no local switch settings at all because it grows up amidst the structureless anarchy of a pidgin language, the Creole settings can persist.

Now let's pull together all these animal and human studies to try to form a coherent picture of how our ancestors progressed from grunts to Shakespeare's sonnets. A well-studied early stage is represented by vervet monkeys, with at least ten different calls that are under voluntary control, are used for communication, and have external referents. The calls may function as words, explanations, propositions, or as all of those things simultaneously. Scientists' difficulties in identifying those ten calls have been such that surely more await identification, but we still don't know how large the vervet vocabulary really is. We also don't know how far other animals may have progressed beyond vervets, because the vocal communications of the species most likely to have eclipsed vervets, the common and pygmy chimps, have yet to be studied carefully in the wild. At least in the laboratory, chimps can master hundreds of symbols that we teach them, suggesting that they have the necessary intellectual equipment to master symbols of their own.

A further step toward Shakespeare is exemplified by two-year-old children, who in all human societies proceed spontaneously from a one-word to a two-word stage and then to a multiword stage. But those multiword utterances are still mere word strings with little grammar, and their words are still nouns, verbs, and adjectives with concrete referents. As Bickerton points out, those word strings are rather like the pidgin languages that human adults spontaneously reinvent when necessary. They also resemble the strings of symbols produced by captive apes whom we have instructed in the use of those symbols.

From pidgins to Creoles, or from the word strings of two-year-olds to the complete sentences of four-year-olds, is another giant step. In that step are added words lacking external referents and serving purely grammatical functions; elements of grammar such as word order, prefixes and suffixes, and word root variation; and more levels of hierarchical organization to produce phrases and sentences. Perhaps that step is what triggered the Great Leap Forward discussed earlier in this book. Nevertheless, creole languages reinvented in modern times still give us clues to how these advances arose, through the Creoles' circumlocutions to express prepositions and other grammatical elements. Thus, animal communication and human language once seemed to be separated by an unbridgeable gulf. Now we have identified not only parts of bridges starting from both opposite shores, but also a series of islands and bridge segments spaced across the gulf. We are beginning to understand in broad outline how the most unique and important attribute that distinguishes us from animals arose from animal precursors.

Animal Origins of Art, J. Diamond

Supposedly, art is the noblest distinctively human attribute—one that sets us apart from animals at least as sharply as does spoken language, by differing in basic ways from anything that any animal does. Art ranks as even nobler than language, since language is really "just" a highly sophisticated advance on animal communication systems, serves an obvious biological function in helping us to survive, and obviously developed from the sounds made by other primates. In contrast, art serves no such transparent function, and its origins are considered a sublime mystery.

To appreciate that our art must have some animal precursors, recall that it's only about seven million years since we branched off from our closest living relatives, the chimpanzees. Seven million years sounds like a lot on the scale of a human lifetime, but it is barely 1 percent of the history of complex life on Earth. We still share over 98 percent of our DNA with chimps. Hence art and those other features that we consider uniquely human must be due to just a tiny fraction of our genes. They must have arisen only a few moments ago on the evolutionary time clock.

Modern studies of animal behavior have been shrinking the list of features once considered uniquely human, such that most differences between us and so-called animals now appear to be only matters of degree. For example, I described in the preceding chapter how vervet monkeys have a rudimentary language. You may not have considered vampire bats allied with us in nobility, but they prove to practice reciprocal altruism regularly (toward other vampire bats, of course). Among our darker qualities, murder has now been documented in innumerable animal species, genocide in wolves and chimps, rape in ducks and orangutans, and organized warfare and slave raids in ants.

As absolute distinctions between us and animals, these discoveries leave us few characteristics besides art, which we managed to dispense with for the first 6,960,000 of the 7 million years since we diverged from chimps. Perhaps the earliest art forms were wood carving and body painting, but we wouldn't know it because they wouldn't have remained preserved. The first preserved hints of human art consist of some flower remains around Neanderthal skeletons, and some scratches on animal bones at Neanderthal campsites. However, the interpretation that they were arranged or scratched intentionally is in doubt. Not until the Cro-Magnons, beginning around forty thousand years ago, do we have our first unequivocal evidence for art, surviving in the form of the famous cave paintings at Lascaux, statues, necklaces, and flutes and other musical instruments.

Perhaps we can now answer the question why art as we know it characterizes us but no other animal. Since chimps paint in captivity, why don't they do so in the wild? As an answer, I suggest that wild chimps still have their day filled with problems of finding food, surviving, and fending off rival chimp groups. If wild chimps had more leisure time plus the means to manufacture paints, they would be painting. The proof of my theory is that it actually happened: we're still 98 percent chimps in our genes.

Agriculture's Mixed Blessings, J. Diamond

Recent discoveries suggest that the adoption of agriculture (plus animal husbandry), supposedly our most decisive step toward a better life, was actually a milestone for the worse as well as for the better. With agriculture came not only greatly increased food production and food storage, but also the gross social and sexual inequality, the disease and despotism, that curse modern human existence.

For most of our history, all humans had to practice a primitive life-style termed "hunting and gathering": they hunted wild animals and gathered wild plant food. That hunter-gatherer life-style is often characterized by anthropologists as "nasty, brutish, and short." Since no food is grown and little is stored, there is (according to this view) no respite from the time-consuming struggle that starts anew each day to find wild foods and avoid starving. Our escape from this misery was launched only after the end of the last Ice Age, when people began independently in different parts of the world to domesticate plants and animals. The agricultural revolution gradually spread until today it is nearly universal and few tribes of hunter-gatherers survive.

From the progressivist perspective on which I was brought up, the question "Why did almost all our hunter-gatherer ancestors adopt agriculture?" is silly. Of course they adopted it because agriculture is an efficient way to get more food for less work. Our planted crops yield far more tons per acre than do wild roots and berries. Just imagine savage hunters, exhausted from searching for nuts and chasing wild animals, suddenly gazing for the first time at a fruit-laden orchard or a pasture full of sheep. How many milliseconds do you think it took those hunters to appreciate the advantages of agriculture? The progressivist party line goes further and credits agriculture with giving rise to art, the noblest flowering of the human spirit. Since crops can be stored, and since it takes less time to grow food in gardens than to find it in the jungle, agriculture gave us free time that hunter-gatherers never had. But free time is essential for creating art and enjoying it. Thus ultimately it was agriculture that, as its greatest gift, enabled us to build the Parthenon and compose the B Minor Mass.

Among our major cultural hallmarks, agriculture is especially recent, having begun to emerge only ten thousand years ago. None of our primate relatives practices anything remotely resembling agriculture. For the most similar animal precedents, we must turn to ants, which invented not only plant domestication but also animal domestication....Agriculture grew from human behaviors, and from responses or changes in plants and animals, leading without plan toward domestication. For example, animal domestication arose partly from people's keeping captive wild animals as pets, partly from wild animals' learning to profit from the proximity of people (e.g., wolves following human hunters to catch crippled prey). Similarly, early stages of plant domestication included people's harvesting wild plants and discarding seeds, which were thereby accidentally "planted." The inevitable result was unconscious selection of those plant and animal species and individuals most useful to humans. Eventually, conscious selection and care followed.

As I explained at the outset of this chapter, we're accustomed to assuming that the transition from the hunter-gatherer life-style to agriculture brought us health, longevity, security, leisure, and great art. While the case for this view seems overwhelming, it is hard to prove. How do you actually show that lives of people ten thousand years ago got better when they abandoned hunting for farming? Until recently, archaeologists couldn't test this question directly. Instead, they had to resort to indirect tests, whose results (surprisingly) failed to support the view of agriculture as an unmixed blessing.

Another indirect test of the progressivist view is to study whether surviving twentieth-century hunter-gatherers really are worse off than farmers. Scattered throughout the world, mainly in areas unsuitable for agriculture, several dozen groups of so-called "primitive people," like the Kalahari Desert Bushmen, continued to live as hunter-gatherers in recent years. Astonishingly, it turns out that these hunters generally have leisure time, sleep a lot, and work no harder than their farming neighbors. For instance, the average time devoted each week to obtaining food has been reported to be only twelve to nineteen hours for Bushmen; how many readers of this book can boast of such a short work week? As one Bushman replied when asked why he had not emulated neighboring tribes by adopting agriculture, "Why should we plant, when there are so many mongongo nuts in the world?"

While farmers concentrate on high-carbohydrate crops like rice and potatoes, the mixture of wild plants and animals in the diets of surviving hunter-gatherers provides more protein and a better balance of other nutrients. The Bushmen's average daily food intake is 2,140 calories and 93 grams of protein, considerably greater than the RDA (Recommended Daily Allowance) for people of their small size but vigorous activity. Hunter-gatherers are healthy, suffer from little disease, enjoy a very diverse diet, and do not experience the periodic famines that befall farmers, dependent on few crops. It is almost inconceivable for Bushmen, who utilize eighty-five edible wild plants, to die of starvation, as did about a million Irish farmers and their families during the 1840s when a blight attacked potatoes, their staple crop.

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Many modern cases illustrate how improved childhood nutrition leads to taller adults: for instance, we stoop to pass through doorways of medieval castles built for a shorter, malnourished population. Paleopathologists studying ancient skeletons from Greece and Turkey found a striking parallel. The average height of hunter-gatherers in that region toward the end of the Ice Age was a generous five feet ten inches for men, five feet six inches for women. With the adoption of agriculture, height crashed, reaching by 4000 B.C. a low value of only five feet three for men, five feet one for women. By classical times, heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the heights of their healthy hunter-gatherer ancestors.

There are at least three sets of reasons to explain these findings that agriculture was bad for health. First, hunter-gatherers enjoyed a varied diet with adequate amounts of protein, vitamins, and minerals, while farmers obtained most of their food from starchy crops. In effect, the farmers gained cheap calories at the cost of poor nutrition. Today just three high-carbohydrate plants—wheat, rice, and corn—provide more than 50 percent of the calories consumed by the human species. Second, because of that dependence on one or a few crops, farmers ran a greater risk of starvation if one food crop failed than did hunters. The Irish potato famine is merely one of many examples.

Finally, most of today's leading infectious diseases and parasites of mankind could not become established until after the transition to agriculture. These killers persist only in societies of crowded, malnourished, sedentary people constantly reinfected by each other and by their own sewage. The cholera bacterium, for example, does not survive for long outside the human body. It spreads from one victim to the next through contamination of drinking water with feces of cholera patients. Measles dies out in small populations once it has either killed or immunized most potential hosts; only in populations numbering at least a few hundred thousand people can it maintain itself indefinitely. Such crowd epidemics could not persist in small, scattered bands of hunters who often shifted camp. Tuberculosis, leprosy, and cholera had to await the rise of farming, while smallpox, bubonic plague, and measles appeared only in the past few thousand years with the rise of even denser populations in cities.

Besides malnutrition, starvation, and epidemic diseases, farming brought another curse to humanity: class divisions. Hunter-gatherers have little or no stored food, and no concentrated food sources like orchards or herds of cows. Instead, they live off the wild plants and animals that they obtain each day. Everybody except for infants, the sick, and the old joins in the search for food. Thus there can be no kings, no full-time professionals, no class of social parasites who grow fat on food seized from others. Only in a farming population could contrasts between the disease-ridden masses and a healthy, nonproducing elite develop. Skeletons from Greek tombs at Mycenae around 1500 B.C. suggest that royals enjoyed a better diet than commoners, since the royal skeletons were two or three inches taller and had better teeth (on the average, 1 instead of 6 cavities or missing teeth). Among mummies from Chilean cemeteries around A.D. 1000, the elite were distinguished not only by ornaments and gold hair clips, but also by a fourfold lower rate of bone lesions stemming from infectious diseases. These signs of health differentials within local communities of farmers in the past appear on a global scale in the modern world.

While giving rise to class divisions for the first time, farming may also have exacerbated sexual inequality already in existence. With the advent of agriculture, women often became beasts of burden, were drained by more frequent pregnancies, and thus suffered poorer health. For example, among the Chilean mummies from A.D. 1000, women exceeded men in osteoarthritis and in bone lesions from infectious disease....As for the claim that agriculture laid the foundations of art by providing us with leisure time, modern hunter-gatherers have on the average at least as much free time as do farmers. I grant that some people in industrial and farming societies enjoy more leisure than do hunter-gatherers, at the expense of many others who support them and have far less leisure. Farming undoubtedly made it possible to sustain full-time craftsmen and artists, without whom we would not have such large-scale art projects as the Sistine Chapel and Cologne Cathedral. However, the whole emphasis on leisure time as a critical factor in explaining artistic differences among human societies seems to me misguided.

With the advent of agriculture an elite became healthier, but many people became worse off. Instead of the progressivist party line that we chose agriculture because it was good for us, a cynic might ask how we got trapped by agriculture despite its being such a mixed blessing. The answer boils down to the adage "Might makes right." Farming could support far more people than hunting, whether or not it also brought on the average more food per mouth. (Population densities of hunter-gatherers are typically one person or less per square mile, while densities of farmers average at least ten times higher.) Partly, this is because an acre of field planted entirely in edible crops produces far more tons of food, hence lets one feed far more mouths, than an acre of forest with scattered edible wild plants. Partly, too, it's because nomadic hunter-gatherers have to keep their children spaced at four-year intervals by infanticide and other means, since a mother must carry her toddler until it's old enough to keep up with the adults. Because sedentary farmers don't have that problem, a woman can and does bear a child every two years. Perhaps the main reason we find it so hard to shake off the traditional view that farming was unequivocally good for us is that there's no doubt that it meant more tons of food per acre. We forget that it also resulted in more mouths to feed, and that health and quality of life depend on the amount of food per mouth.
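
Since the argument turns on food per mouth rather than food per acre, here is a minimal numerical sketch of that point. Only the roughly tenfold density contrast comes from the text; the yield figures are hypothetical numbers of my own, chosen purely to show how total output and per-capita output can move in opposite directions.

```python
# Illustrative numbers only: the tenfold density contrast is from the text,
# while the yield figures are hypothetical, chosen to show that total food per
# square mile can rise sharply even as food per mouth falls.
forager_density = 1       # persons per square mile (from the text)
farmer_density = 10       # "at least ten times higher" (from the text)

forager_yield = 5         # hypothetical "food units" per square mile per year
farmer_yield = 40         # hypothetical: far more total food per square mile

print(forager_yield / forager_density)   # 5.0 food units per mouth
print(farmer_yield / farmer_density)     # 4.0 food units per mouth
```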

As population densities of hunter-gatherers slowly rose at the end of the Ice Age, bands had to "choose," whether consciously or unconsciously, between feeding more mouths by taking the first steps toward agriculture, or else finding ways to limit growth. Some bands adopted the former solution, unable to anticipate the evils of farming, and seduced by the transient abundance they enjoyed until population growth caught up with increased food production. Such bands outbred and then drove off or killed the bands that chose to remain hunter-gatherers, because ten malnourished farmers can still outfight one healthy hunter. It's not that hunter-gatherers abandoned their lifestyle, but that those sensible enough not to abandon it were forced out of all areas except ones that farmers didn't want. Modern hunter-gatherers persist mainly in scattered areas useless for agriculture, such as the Arctic and deserts.

Hunter-gatherers practiced the most successful and long-persistent life-style in the career of our species. In contrast, we are still struggling with the problems into which we descended with agriculture, and it is unclear whether we can solve them. Suppose that an archaeologist who had visited us from Outer Space were trying to explain human history to his fellow spacelings. The visitor might illustrate the results of his digs by a twenty-four-hour clock on which one hour of clock time represents a hundred thousand years of real past time. If the history of the human race began at midnight, then we would now be almost at the end of our first day. We lived as hunter-gatherers for nearly the whole of that day, from midnight through dawn, noon, and sunset. Finally, at 11:54 P.M. we adopted agriculture. In retrospect, the decision was inevitable, and there is now no question of turning back. But as our second midnight approaches, will the present plight of African peasants gradually spread to engulf all of us? Or will we somehow achieve those seductive blessings that we imagine behind agriculture's glittering façade, and that have so far eluded us except in mixed form?
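
The arithmetic behind this clock image is easy to check. The sketch below is mine, not Diamond's: it simply converts "years before present" into clock time at the stated scale of one hour per hundred thousand years (so the whole day covers 2.4 million years, which is what places agriculture at 11:54 P.M.).

```python
# One clock hour = 100,000 years, so the 24-hour "day" spans 2.4 million years.
YEARS_PER_HOUR = 100_000
DAY_YEARS = 24 * YEARS_PER_HOUR          # 2,400,000 years from midnight to midnight

def clock_time(years_ago):
    """Convert 'years before present' into a time on the 24-hour clock."""
    elapsed_hours = (DAY_YEARS - years_ago) / YEARS_PER_HOUR
    hours = int(elapsed_hours)
    minutes = round((elapsed_hours - hours) * 60)
    return f"{hours:02d}:{minutes:02d}"

print(clock_time(10_000))   # 23:54 -- agriculture adopted at 11:54 P.M.
print(clock_time(40_000))   # 23:36 -- roughly the Great Leap Forward
```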

Alone in a Crowded Universe, J. Diamond

On Earth, we certainly are unique. No other species possesses language, art, or agriculture of a complexity remotely approaching ours. No other species abuses drugs. But we've seen in the last four chapters that, for each of those human hallmarks, there are many animal precedents or even precursors. Similarly, human intelligence arose directly from chimpanzee intelligence, which is impressive by the standards of other animals though still far below ours. Isn't it likely that some other species on some other planets have also developed such widespread animal precursors to the level of our own art, language, and intelligence?

Alas, most human hallmarks lack effects detectable at a distance of many light-years. If there were creatures enjoying art or addicted to drugs on planets orbiting even the nearest stars, we'd never know it. But fortunately there are two signs of intelligent beings elsewhere that might be detectable on Earth: space probes and radio signals. We ourselves are already becoming effective at sending out both, so surely other intelligent creatures have mastered the necessary skills.

If many or most stars have planetary systems, and if many of those systems include at least one planet with conditions suitable for life, and if life is likely eventually to evolve where suitable conditions exist, and if about one percent of planets with life include an advanced technical civilization—then one estimates that our own galaxy alone contains about a million planets supporting advanced civilizations. But within only a few dozen light-years of us are several hundred stars, some (most?) of which surely have planets like ours, supporting life. Then where are all the flying saucers that we would expect? Where are the intelligent beings that should be visiting us, or at least directing radio signals at us? The silence is deafening. Something must be wrong with the astronomers' calculations. Yet astronomers know what they're talking about when they estimate the number of planetary systems, and the fraction of those likely to be supporting life; I find these estimates plausible. Instead, the problem is likely to lie in the argument, based on convergent evolution, that a significant fraction of biotas will evolve advanced technical civilizations. Hence let's scrutinize more closely the inevitability of convergent evolution.
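
This chain of "ifs" amounts to a back-of-the-envelope multiplication. The sketch below shows one way such a figure can be assembled; only the "about one percent" fraction for advanced civilizations comes from the text, and every other number is an illustrative assumption of mine, not Diamond's input.

```python
# A Drake-style chain of multiplications. Only the final one-percent figure is
# from the text; the other numbers are illustrative assumptions chosen to show
# how an estimate on the order of a million civilizations can be reached.
stars_in_galaxy = 4e11             # assumption: a few hundred billion stars
frac_with_planets = 0.5            # assumption: half have planetary systems
frac_suitable_for_life = 0.001     # assumption: planets with conditions suitable for life
frac_life_evolves = 0.5            # assumption: life does evolve where conditions allow
frac_advanced_civilization = 0.01  # from the text: about one percent

civilizations = (stars_in_galaxy * frac_with_planets * frac_suitable_for_life
                 * frac_life_evolves * frac_advanced_civilization)
print(f"{civilizations:,.0f}")     # 1,000,000
```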

I mentioned early in this chapter that the existence of radios on the one planet known to us seemed at first to suggest a high probability of radios' evolving on other planets. In fact, closer scrutiny of Earth's history supports exactly the opposite conclusion: radios had a vanishingly low probability of evolving here. Only one of the billions of species that have existed on Earth showed any proclivities toward radios, and even it failed to do so for the first 69,999/70,000ths of its seven-million-year history. A visitor from Outer Space who had come to Earth as recently as the year 1800 would have written off any prospects of radios' being built here.

In reality, vanishingly few animals on Earth have bothered with much of either intelligence or dexterity. No animal has acquired remotely as much of either as have we; those that have acquired a little of one (smart dolphins, dexterous spiders) have acquired none of the other; and the only other species to acquire a little of both (common and pygmy chimpanzees) have been rather unsuccessful. Earth's really successful species have instead been dumb and clumsy rats and beetles, which found better routes to their current dominance.

It was an extremely unlikely fluke that we developed radios at all, and more of a fluke that we developed them before we developed the technology that could end us in a slow stew or fast bang. While Earth's history thus offers little hope that radio civilizations exist elsewhere, it also suggests that any that might exist are short-lived. Other intelligent civilizations that rose elsewhere probably reversed their own progress overnight, just as we now risk doing....Yes, out there are billions of galaxies with billions of stars. Out there must be some transmitters as well, but not many, and they won't last long. Probably there are no others in our galaxy, and surely none within hundreds of light-years of us. For practical purposes, we're unique and alone in a crowded universe. Thank God!

Nothing Learned, and Everything Forgotten?, J. Diamond

The first indications that our ancestors were in any respect unusual among animals were our extremely crude stone tools that began to appear in Africa by around two and a half million years ago. The quantities of tools suggest that they were beginning to play a regular, significant role in our livelihood. Among our closest relatives, in contrast, the pygmy chimpanzee and gorilla don't use tools, while the common chimpanzee occasionally makes some rudimentary ones but hardly depends on them for its existence.

Nevertheless, those crude tools of ours did not trigger any quantum jump in our success as a species. For another million and a half years, we remained confined to Africa. Around a million years ago we did manage to spread to warm areas of Europe and Asia, thereby becoming the most widespread of the three chimpanzee species but still much less widespread than lions. Our tools progressed only at an infinitely slow rate, from extremely crude to very crude. By a hundred thousand years ago, at least the human populations of Europe and western Asia, the Neanderthals, were regularly using fire. Yet in other respects we continued to rate as just another species of big mammal. We had developed not a trace of art, agriculture, or high technology. It's unknown whether we had developed language, drug addictions, or our strange modern sexual habits and life cycle, but Neanderthals rarely lived beyond age forty and hence may not yet have evolved female menopause.

Clear evidence of a Great Leap Forward in our behavior appears suddenly in Europe around forty thousand years ago, coincident with the arrival of anatomically modern Homo sapiens from Africa via the Near East. At that point, we began displaying art, technology based on specialized tools, cultural differences from place to place, and cultural innovation with time. This leap in behavior had undoubtedly been developing outside Europe, but the development must have been rapid, since the anatomically modern Homo sapiens populations living in southern Africa 100,000 years ago were still just glorified chimpanzees as judged by the debris in their cave sites. Whatever caused the leap, it must have involved only a tiny fraction of our genes, because we still differ from chimps in only 1.6 percent of our genes, and most of that difference had already developed long before our leap in behavior. The best guess I can make is that the leap was triggered by the perfection of our modern capacity for language.

Although we usually think of the Cro-Magnons as the first bearers of our noblest traits, they also bore the two traits that lie at the root of our current problems: our propensities to murder each other en masse and to destroy our environment. Even before Cro-Magnon times, fossil human skulls punctured by sharp objects and cracked to extract the brains bear witness to murder and cannibalism. The suddenness with which Neanderthals disappeared after Cro-Magnons arrived hints that genocide had now become efficient. Our efficiency at destroying our own resource base is suggested by extinctions of almost all large Australian animals following our colonization of Australia fifty thousand years ago, and of some large Eurasian and African mammals as our hunting technology improved. If the seeds of self-destruction have been so closely linked with the rise of advanced civilizations in other solar systems as well, it becomes easy to understand why we have not been visited by any flying saucers.

At the end of the last Ice Age around ten thousand years ago, the pace of our rise quickened. We occupied the Americas, coincident with a mass extinction of big mammals that we may have caused. Agriculture emerged soon thereafter. Some thousands of years later, the first written texts start to document the pace of our technical inventiveness. They also show that we were already addicted to drugs, and that genocide had become routine and admired. Habitat destruction began undermining many societies, and the first Polynesian and Malagasy settlers caused mass exterminations of species. From A.D. 1492 onward, the worldwide expansion of literate Europeans lets us trace our rise and fall in detail.

Within the last few decades we have developed the means to send radio signals to other stars, and also to blow ourselves up overnight. Even if we don't blunder into that quick end, our harnessing of much of the Earth's productivity, our exterminations of species, and our damage to our environment are accelerating at a rate that cannot be sustained for even another century. One might object that, if we look around us, we see no obvious sign that the climax of our history will come soon. In fact, the signs become obvious if one looks and then extrapolates: starvation, pollution, and destructive technology are increasing; usable farmland, food stocks in the sea, other natural products, and environmental capacity to absorb wastes are decreasing. As more people with more power scramble for fewer resources, something has to give way.

So what is likely to happen? There are many grounds for pessimism. Even if every human now alive were to die tomorrow, the damage that we have already inflicted on our environment would ensure that its degradation will continue for decades. Innumerable species already belong to the "living dead," with populations fallen to levels from which they cannot recover, even though not all individuals have died yet. Despite all our past self-destructive behavior from which we could have learned, many people who should know better dispute the need for limiting our population and continue to assault our environment. Others join that assault for selfish profit or out of ignorance. Even more people are too caught up in the desperate struggle for survival to be able to enjoy the luxury of weighing the consequences of their actions. All these facts suggest that the juggernaut of destruction has already reached unstoppable momentum, that we too are among the living dead, and that our future is as bleak as that of the other two chimpanzees.

Despite all the grounds I've mentioned for being equally cynical about humanity's future, my view is that our situation isn't hopeless. We are the only ones creating our problems, so it's completely within our power to solve them. While our language and art and agriculture aren't quite unique, we really are unique among animals in our capacity to learn from the experience of others of our species living in distant places or in the distant past. Among the hopeful signs, there are many realistic, often-discussed policies by which we could avoid disaster, such as by limiting human population growth, preserving natural habitats, and adopting other environmental safeguards. Many governments are already doing some of these obvious things in some cases.

Rethinking the Human Revolution, P. Mellars

At the most basic level it seems to me that the notion of a behavioural revolution — as a description of patterns of change in human behaviour — can be assessed from two very different standpoints: either in terms of the patterns (and above all the speed) of the behavioural changes in question; or alternatively in terms of the perceived consequences of these changes for the subsequent, long-term development of the societies involved. With this initial ambiguity in what we individually conceive of as a 'revolution', it is hardly surprising that no clear consensus on the use or appropriateness of this concept has emerged in studies of either the modern 'human revolution', or for that matter in those of later prehistoric or historic societies.

Almost regardless of whether we choose to define revolutions by one or other of the two different criteria referred to above (i.e. in terms of the documented patterns and processes of change, as opposed to the eventual consequences of the changes) we cannot escape the issue of how and why significant, accelerated episodes of behavioural change should occur in human cultural development. I am making the assumption here that the notion of behavioural 'revolutions' that would be most acceptable to the broadest range of current opinion would in fact be in terms of significantly accelerated episodes of change, and it seems to me that this is what has been most widely accepted in earlier discussions of processes such as the 'Neolithic', 'Urban' or 'Industrial' revolutions — or at the least, significant inflection points in the overall patterns and trajectories of behavioural change.

The central point to recognize is that particular, significant episodes of behavioural change or innovations would lead, inexorably and inevitably, to two specific reactions or responses on the part of the societies involved: 1. A significant behavioural or technological change would inevitably generate new opportunities for further behavioural changes or innovations, which were ultimately directly dependent (or 'contingent') on the preceding innovations or changes; and, 2. Many of these changes would be likely, in one way or another, to create new 'pressures' or 'stresses' in the original behavioural systems, which would 'require' some kind of further (technological, economic, social etc.) 'adaptations' to cope with these new pressures or stresses — as well as with the new opportunities for additional behavioural change... Specific examples of these significant inflection points in human development are not difficult to find in many different periods of world prehistory. In the case of the 'Neolithic Revolution', for example, it is easy to visualize how certain significant innovations — such as the emergence of crop cultivation or animal domestication, or the emergence of more large-scale sedentary communities — could have initiated an almost endless succession of changes in technology, demography, social organization, or indeed religion and ideological patterns, which were both made possible by the preceding technological innovations, and arguably essential to cope with attendant new economic, social or demographic pressures on the societies involved — i.e. the whole process of the endlessly debated 'Neolithic Revolution'.

When viewed from a Eurasian perspective, the whole concept of the 'Upper Palaeolithic revolution' is of course still entirely valid, as an empirical description of the remarkable changes in almost all aspects of human behaviour which have been repeatedly and consistently identified and reaffirmed across all regions of western Eurasia, extending from the Atlantic coasts of western Europe into at least the central parts of Asia....There is now ample evidence (at least in the opinion of most researchers) to show that virtually the whole pattern of radical behavioural changes as reflected in the archaeological records of the classic Middle-to-Upper Palaeolithic transition in Eurasia is due entirely to the replacement of one human population (that of the Eurasian Neanderthals) by the new, intrusive populations of biologically and behaviourally modern humans, from an ultimately African source. The archaeological situation in western Eurasia in other words reflects simply a 'before and after' scenario reflecting not in situ cultural or evolutionary processes, but simply a fairly rapid and abrupt replacement of one human population by another.

[There is] clear evidence for at least eight or nine significant technological and other behavioural features, which contrast sharply with those so far documented from earlier MSA sites, and show in several respects some obvious similarities to those which have always been used to define the classic early Upper Palaeolithic technologies of Europe and western Asia. Briefly, these can be summarized as follows:

1. The appearance of new techniques of blade production, involving both 'soft hammer' techniques of percussion and, above all, the production of carefully controlled new 'bladelet' forms.
2. The appearance of classic forms of end-scrapers and burins, effectively identical to those which characterize both the ensuing African Later Stone Age industries and the early Upper Palaeolithic industries in Eurasia, and apparently implying new patterns of both skin-working and bone-working technology.
3. The proliferation of a range of small, geometrically-shaped 'segment' forms in the Howiesons Poort assemblages, which clearly formed parts of complex, multi-component hafted tools.
4. Arguably, the appearance of increased 'imposed form' and 'style' in the shaping of the Still Bay bifacial 'leaf-point' forms.
5. The appearance of relatively complex and extensively shaped bone tools — best represented at present by the wide range of simple awls and more elaborate, polished bone points.
6. The occurrence of at least 65 imported and deliberately perforated shells of Nassarius kraussianus in the Still Bay levels at Blombos Cave, clearly representing personal ornaments of some kind.
7. The presence of two pieces of red ochre in the Still Bay levels at Blombos Cave incised with relatively complex (and repeated) criss-cross 'design' motifs — at present the earliest unambiguous 'art' motifs known world-wide.
8. The complex pattern of closely superimposed hearth deposits documented throughout the Howiesons Poort levels at Klasies River and apparently Boomplaas Cave and Diepkloof, suggesting both increased 'intensity' of occupation patterns in these levels and possibly more socially 'structured' occupation areas.
9. Finally (and more controversially), indications of apparently new subsistence practices, including the systematic exploitation of marine fish resources and possibly the deliberate fire-manipulation of 'Fynbos' vegetation communities, to increase the growth of underground plant foods.

As noted above, most if not all of these features are in their basic respects remarkably similar to those that characterize the classic Upper Palaeolithic technologies in western Eurasia, and it would seem from the available archaeological evidence that most of these features appear essentially as some form of broadly related 'package' of developments, concentrated largely if not entirely within the period from c. 80,000-60,000 BP.

What gives these recent archaeological observations even greater force is the fact that there is now increasing evidence from recent DNA studies for what Forster & Matsumura have recently described as a 'remarkable expansion' of the L2 and L3 mitochondrial lineages in Africa, dated (broadly) once again to between c. 60,000 and 80,000 BP, and apparently involving the expansion of these two lineages initially from one or more small geographical regions to the rest of the African continent....Again, the evidence points to a rapid expansion of populations centred initially in one limited region of Africa, which subsequently expanded both numerically and geographically to most other regions of Africa, apparently through a 'wave front' process of dispersal which to a large extent either replaced or largely assimilated the pre-existing populations in other parts of Africa.

It was presumably this process of rapid population expansion and dispersal that led eventually (probably between c. 60,000 and 50,000 BP) to the dispersal of a small sub-group of the African L3 mitochondrial lineage across the mouth of the Red Sea and into the adjacent areas of Arabia and India — where technologies showing some striking resemblances to the African Howiesons Poort industries (and including typically African ostrich-eggshell beads) have recently been documented. It is reasonable to suggest that it could have been precisely this range of new technological and other behavioural innovations reflected in the later African MSA technologies which fuelled the widespread dispersal of these populations not only over the rest of the African continent, but eventually into a range of sharply contrasting environments in western Eurasia. If this is indeed the case (as all the current evidence would seem to suggest) then to see the patterns of technological and cultural developments in the later MSA industries of Africa as some significant behavioural 'revolution' would hardly be stretching the combined archaeological and demographic evidence too far.

If this scenario is accepted, then the question of what factor (or factors) could have initiated this process of rapid technological and demographic development during the later (Still Bay and Howiesons Poort) stages of the southern African MSA sequence becomes, arguably, the most central issue in the current 'Human Revolution' debate. In this context I have suggested that we could be confronted by two fairly stark alternatives. On the one hand, we could follow Richard Klein's lead and propose some kind of major cognitive or neurological 'mutation' in the relevant African populations, which provided a range of new cognitive capacities for new forms of symbolic, linguistic and strategic forward-planning behaviour — though in the scenario discussed here at around 80,000 BP, and not at the much later date of c. 40,000 BP, as Klein himself has suggested.

Or alternatively we could look for some more prosaic, purely 'processual' explanation, involving some kind of rapid succession of interrelated behavioural innovations and adaptations, perhaps stimulated by the economic and demographic pressures imposed by the rapid succession of climatic and related environmental changes which marked the transition from oxygen-isotope stage 5 to stage 4 in the marine and ice-core climatic records, possibly combined with the effects of the Mount Toba volcanic 'super-eruption' in Sumatra around 74,000 BP. Until the precise pattern of climatic and environmental changes over this time range has been defined more clearly in southern Africa itself, this scenario will inevitably remain difficult to model or to evaluate in specific cultural-adaptive terms. Clearly the task of identifying 'prime-movers' or specific 'cause and effect' processes should not be underestimated.

My suggestion, in short, is that what we see in the African 'Human Revolution', as discussed above, may be something closely analogous to the 'Neolithic Revolution' as reflected in the archaeological records of the Near East and elsewhere at around 13,000-10,000 BP — a point which has already been made forcibly in another context by Ofer Bar-Yosef. That is, a process which was ultimately dependent on the prior existence of the essential neurological and cognitive capacities for new patterns of behaviour, but which can only be accounted for at the specific time and place in which the behavioural 'revolution' itself occurred in terms of some specific combination of 'processual', cultural-intensification factors (whether climatic, demographic, social, or simply dependent on some specific behavioural invention or innovation) which provided both new opportunities and/or new pressures and stresses towards complex, multivariate cultural change. This, I suggest, may well be exactly what happened in southern Africa between c. 80,000 and 60,000 BP (i.e. broadly synchronous with the OIS 5/4 climatic transition) to initiate the much more culturally, technologically and symbolically complex pattern of behaviour we see in the later MSA and subsequent LSA archaeological records of southern Africa from this point onwards. It is probably also, as I have suggested above, what initiated — and made possible — the closely ensuing Out-of-Africa dispersal, and the rapid colonization of the rest of the occupied world.

Personally, I still find the evidence for a significant acceleration and increased complexity of the archaeological evidence in southern Africa from c. 80,000 BP very difficult to set aside, and it is impossible not to be impressed by the broad similarities in many aspects of these developments to those which mark the transition from the Middle to the Upper Palaeolithic periods in western Eurasia — unless of course subsequent discoveries in southern Africa produce some unexpected surprises. Whether the ultimate causes of these changes and increased complexity were neurological, climatic, demographic, technological or 'symbolic', is effectively beyond the resolution of the available biological, climatic and archaeological evidence. But if this process led to the rapid expansion and dispersal of one small, regional African population both to the rest of Africa, and eventually to the colonization of Eurasia — as I have suggested above — then at least the consequences of this process would indeed be 'revolutionary', almost regardless of how this term is defined!

The Origin and Dispersal of Homo Sapiens, C. Stringer

While in 1987 it was still possible to argue that Africa played no special role, it is now generally agreed that the basic anatomy of Homo sapiens was present there by at least 150 ka. The African fossil record between 100-250 ka has been expanded by new discoveries, and there has also been parallel progress in deciphering the behavioural record of the Middle Stone Age. The extension of early modern humans to the Levant by 100 ka has been confirmed by further dating analyses on the Skhul and Qafzeh material, but evidence has also emerged that Neanderthals were in the region at around the same time, and it seems probable that potential overlap of the emerging modern and Neanderthal clades was a consistent feature of the region during the later Middle Pleistocene, as well as the early Late Pleistocene.

In the Far East and Australasia, there remain many unanswered questions about modern human origins, and the level of our ignorance has been highlighted by the remarkable discoveries from Flores. Dating the dispersal of Homo sapiens in these regions is unfortunately still an inexact science. Early modern fossils from China, such as Liujiang and the lost Upper Cave Zhoukoudian material, may date from more than 70 ka and about 30 ka respectively, or alternatively could be much more recent — direct dating, if ever practicable, would be invaluable. However, the crania concerned are generally acknowledged to be quite distinct from present regional samples, and it is unclear whether this indicates replacement events by the ancestors of present populations, or rapid late Pleistocene evolution of modern regional features. (I tend to favour the former scenario.)

For the European picture, new discoveries and detailed palaeoclimatic records potentially promise an examination of the dispersal of modern humans and the period of Neanderthal extinction in unprecedented detail. But such research is still limited by the precision with which we can date and correlate palaeontological records, archaeological records and other events. The wide application of accelerator radiocarbon dating has purified the fossil record and provided an approximate framework for study of the period between 25-45 ka BP, when late Neanderthals and early modern humans may have co-existed in western Eurasia. But while accelerator radiocarbon dating gives good precision, it has serious problems of accuracy compared with calendar years during this period, and despite improved techniques and the advent of calibration curves, we still cannot reliably map human populations in relation to each other and the landscape at a finer resolution than several millennia, preventing progress on many outstanding questions. The first appearance of modern humans in Europe can currently only be confirmed at >35 ka, based on radiocarbon dates for the Oase site, and while directly dated moderns of probable Aurignacian association are still younger than this age, the likelihood is that in terms of real years, modern humans were in Europe more than 40,000 years ago, based on corrections to radiocarbon chronologies.

There remain many unresolved questions about the first appearance of modern humans in Europe, and two of the most fascinating are whether there were pre-Aurignacian dispersals, and whether the first arrivals were significantly more archaic than the people we know so far from the Aurignacian and Gravettian. And on the subject of Neanderthal extinction, we must beware of monolithic explanations, since this was a long-term process over a wide geographical area, although in western Europe I do consider that the predominant factor was a combination of the arrival of modern humans and climatic instability.

African fossils from 100-250,000 years ago show a great deal of anatomical variation, and we are only sampling a tiny part of the whole continent's populations at present (there is no significant record from much of central and western Africa, for example, but we know people were there from lithic evidence). Did the early modern morphology evolve gradually and spread outwards from, say, East Africa, replacing more archaic forms? Or could there have been an African version of multiregionalism, with modern genes, morphology and behaviour coalescing from various populations across the continent? Better samples and dating of the records, and continuing genetic analyses, are needed to help resolve these fundamental questions, but there is growing molecular evidence of deep divisions within African populations that suggests extensive periods of fission and fusion. It is my feeling that many 'ancient' DNA markers being picked up outside of Africa and enlisted to argue for gene flow from extra-African archaics, could eventually turn out to have been carried from Africa in modern human dispersals, followed by subsequent major frequency changes between Africa and the outside. Whether such special pleading can be sustained will only be resolved when African genetic diversity has been mapped and analysed in the required depth and breadth.

It has been argued that major behavioural changes occurred in Africa around 50 ka, catalyzing the global dispersal of modern humans and their behaviour. Thus morphological and behavioural evolution may have been decoupled, with 'morphological modernity' evolving before 'behavioural modernity'. Others have argued that 'modern' behavioural features do occur across a broad spectrum of Middle Stone Age industries in Africa that date well before 50 ka. This may suggest, instead, a gradual assembly of the package of modern human behaviours in Africa during the Middle Stone Age, perhaps beginning beyond 200 ka.

It may well be that the predominance of Africa in modern human origins was fundamentally a question of its larger geographical and human population size, giving greater opportunities for morphological and behavioural innovations to both develop and be conserved, rather than the result of a unique evolutionary pathway. Perhaps 'modernity' was not a package that had a unique African origin in one time, place and population, but was a composite whose elements appeared at different times and places, and were then gradually assembled to assume the form we recognize today.

Many changes were still to come, but the basic reshaping of cranial form to the 'modern' pattern had occurred in Africa by 150 ka, although the underlying factors remain unknown. Regarding behavioural change, factors such as improved adaptations, larger social networks and relative stability brought increased population survival, greater social complexity and continuity. Did population size in Africa then pass a critical threshold, allowing technological changes to accumulate and behavioural change to accelerate, and is that threshold the one we recognize as 'The Human Revolution'?

A Constructionist Approach to the Evolution of Human Mental Capacities, K. Gibson

Major quandaries confront those who would chart the evolution of the human mind. Did, for example, modern human intellectual and linguistic capacities arise abruptly with the sudden origin and spread of a major genetic mutation or did they emerge in a gradual and stepwise fashion? Irrespective of how our intellectual capacities arose, when did they reach their modern form — with the emergence of anatomically modern humans, with the onset of the Upper Palaeolithic, or at some other earlier or later period? Does each of our linguistic and intellectual capacities reflect the functioning of a separate genetically-determined neural module, or do our abilities reflect the emergence of general neurological capacities that cross-cut behavioural domains? Qualified 'experts' support virtually all combinations of these widely diverging scenarios. Hence, we find ourselves confronted with numerous competing, often incompatible evolutionary models. How can we choose among them?

The multidisciplinary nature of this endeavour exacerbates our confusion. To be considered viable, any evolutionary model must be compatible with modern data and theory in numerous pertinent disciplines. Yet, none of us can possess genuine expertise across all relevant areas. This chapter focuses on material pertinent to modelling cognitive evolution from the areas of primate neuroanatomy, brain development and primate behaviour, and places that information within the contexts of current findings in the field of evolutionary developmental biology (evo-devo). It suggests that modern human mental capacities reflect, in part, expanded mental constructional skills across varied behavioural domains, that these skills evolved in a gradual or stepwise fashion, rather than by a sudden mutational event, and that they require enhanced processing capacities in multiple brain regions, rather than in one or a few mental modules. In addition, humans possess motor constructional skills which subserve their intellectual and linguistic endeavours, and they possess infantile behaviours which help channel developing intelligence in specific human-like directions.

It is likely that pre-Upper Palaeolithic human populations already possessed mental constructional and linguistic capacities essentially comparable to our own. Although selection may have continued to refine, enhance and assure the development and expression of these capacities well into modern times, environmental impacts on developmentally plastic juvenile brains are likely to have been of far greater importance to the changing manifestations of human cognitive capacities than major genetic 'mutations', at least since the emergence of anatomically modern humans.

Western religious and philosophical traditions have long held that humans are qualitatively different from and superior to other animals. These views have permeated discussions of human evolution with expressions such as: man is the only animal that makes tools, uses symbols, possesses syntax, imitates, has culture, lies, or possesses certain neurological structures. Such views encourage archaeologists and palaeoanthropologists to search for the sudden emergence of fully developed human behaviours at specific points in time. They date, however, to times when little was known about the cognitive and communicative capacities of other animals. In recent decades, human uniqueness views have faced repeated challenges from the findings of primatologists and other animal behaviourists.

We now know that our closest phylogenetic kin, chimpanzees and bonobos, can make tools, use symbols, use 'rules' to combine two symbols into meaningful units, comprehend some English grammatical constructions, transmit learned information through social means, deceive and exhibit numerous other behaviours once presumed to be unique to our species. Such findings are difficult to reconcile with classic human qualitative uniqueness views, because they imply that the rudiments of many supposedly uniquely human behaviours were already present in the last common ancestor of humans and great apes. Consequently, they mandate a different kind of evolutionary model, one that both accepts the continuity of great ape and human cognition and also explains why human achievements in so many cognitive domains far exceed those of their ape brethren.

One model suggests that ape achievements in varied cognitive domains fall short of those of their human relatives, because humans have greater hierarchical mental constructional capacity. The term, hierarchical mental constructional capacity refers to the ability to break perceptions, motor actions and concepts into fine units, and to then combine and recombine these units into higher order, often novel, constructs. New constructs can then stand alone or be incorporated as subunits of still higher order constructs. Although mental constructional capacities can be found in varying degrees in other animals, humans construct objects, sentences, social concepts, and motor acts that incorporate more information and exhibit greater levels of hierarchical embeddedness than do the constructions of the great apes.

It is not tool-making per se but, rather, the ability to construct a seemingly infinite variety of complex objects from interchangeable subcomponents that distinguishes human from ape tool behaviour. The capacity to incorporate previously manufactured subcomponents into new more complex artefacts also explains why human tool-making, but not animal tool-making exhibits a ratcheting effect. Human technology, but not ape technology, can become more complex with time because humans, but not apes, can incorporate elements of previous technologies into new inventions....This may reflect a failure to mentally construct images of compound tools in advance, to mentally construct an action plan for creating such tools, and/or a lack of understanding of how to pre-shape objects in order to create good 'fits'.

The ability to comprehend others' thoughts, perceptions and intentions, a capacity that is sometimes called theory of mind, serves as the basic cornerstone of human social intelligence. It is essential for both Machiavellian 'political' manipulation and for the social cooperation in pursuit of shared goals so characteristic of modern human societies. In the first year of life, human children demonstrate an understanding of what others are looking at and engage in teasing and other behaviours that demand some understanding of others' thoughts, but they become increasingly adept at mind-reading throughout the maturational period at least to the point of late childhood or early adolescence.

Humans are certainly surpassed by many other animals in strength and speed, and they fall short of most apes in arboreal locomotor skills and in pedal manipulative capacity. It is doubtful, however, whether any animal exceeds humans in the ability to construct novel body postures and rapid, smoothly produced, sequences of novel postures, such as those that are used in dance, swimming, gymnastics, some complex tool-making-using endeavours, mime and gestural sign languages.

Human phonemic production is very much a motor-constructional process in that each phonetic sound reflects a specific combination of discrete postures of the lips, larynx, uvula, tongue tip, tongue body and tongue root. Articulate speech involves smooth, rapid, transitions between these postures and must be integrated with movements of the respiratory system. Only some birds and some cetaceans appear to match humans in their abilities to create novel speech-like vocalizations. Great apes lack speaking capacities entirely. This lack remains to be fully explained. We do not, for instance, yet know whether apes can produce the same range of fine-grained tongue and uvular movements as can humans or whether they have the motor-constructional capacity to create rapidly changing novel combinations of tongue, lip, laryngeal and uvular movements. We do know, however, that apes do not construct the full range of speech sounds or anything resembling human speech sequences. We also know that they do not coordinate respiratory movements and vocalizations in the human manner.

Conscious planning is also very much a mental constructional endeavour in that it involves envisioning a sequence of actions and desired results. It is often hierarchical, as for example, when a cook envisions the production of a meal as a series of coordinated action units, each demanding a subset of actions: e.g. setting the table, preparing the soup, tossing the salad, roasting the meat, and boiling the potatoes. Great apes certainly plan some actions in advance, but none has exhibited planning skills in the object manipulation and production realm as complex as those needed to produce a human-like cooked meal or a complex constructed tool. None engages in behaviours that require advanced anticipation and planning of socially-coordinated endeavours, such as the advanced planning of seasonal migrations, seasonal festivals or the seasonal capture of migratory animals such as caribou or fish.

Neural connectionist models postulate that many complex behaviours, including the generation of rapid sequential movements and the mental construction of relationships between perceived objects and actions, require the existence of a number of neural networks firing in parallel, hence, expansions in neural tissue.... Finally, a number of human neurological disorders are associated with decreased size of specific neural structures, while, in other instances, enhanced capacities correlate with enlargement of specific structures.

An examination of the skills needed for human mental constructional tasks suggests that they reflect enhanced processing capacities of many brain regions, including those regions known to have exhibited the greatest enlargements in human evolution: neocortex (including the frontal, parietal, and temporal lobes), cerebellum, basal ganglia and hippocampus. For example, in order to mentally construct relationships between various concepts, an individual must first be able to hold two or more concepts in mind simultaneously. At least two brain regions contribute to this activity, the cerebellum which mediates rapid shifts in attention and the prefrontal lobes which provide working memory capacity, i.e. the capacity to hold a number of items in mind simultaneously....Speech, tool-use-making, dance and other behaviours that require the smooth, rapid, semi-automatic execution of motor sequences require well-developed procedural memory capacities. These are provided by interactions between the basal ganglia, the cerebellum, and portions of the frontal lobes.

The range of neurological capacities needed for complex mental constructional tasks, the large number of brain areas which contribute to these capacities, and the fact that some capacities, such as working memory, attention shifting and procedural memory, contribute to human skills across a number of behavioural domains strongly suggest that the evolution of human intelligence was not primarily a matter of the addition of domain-specific neural modules. Rather, a number of general-purpose, cross-domain neural-processing capacities expanded in the human lineage. That expansion helps account for the human superiority over the apes in language, tool-making, social intelligence, planning capacity, dance, gymnastics, and other behavioural domains. To the extent that increases in brain size and overall information-processing capacity account for increased human mental constructional abilities, one might expect that these skills increased in the hominin lineage gradually in accordance with advancing brain size.

Modern evolutionary developmental biologists emphasize that the initial response of organisms to new environmental challenges is nearly always phenotypic, rather than genotypic, change. Selection for successful phenotypes results in concomitant selection for genes that facilitate their development and, thus, increase their prevalence in the population, a phenomenon known as the Baldwin effect. In some cases, phenotypes that were initially environmentally induced may actually become genetically fixed in a population (genetic assimilation).

These concepts from the field of evolutionary developmental biology imply that our hominin ancestors probably began using rudimentary symbolic communication, making stone tools, and otherwise moving in human-like behavioural directions prior to the evolution of species-specific genes predisposing to those behaviours. As these human-like behaviours became increasingly prevalent, environmental exposure at young ages would have helped further channel developing brains in human-like directions. Once these behaviours proved adaptive, selection would also have favoured those chance mutations that facilitated their expression and development. These may have included genes for the enhancement of any of the capacities that contribute to human mental- and motor-constructional abilities, such as genes that favour working memory, articulate speech, or procedural learning capacities. Alternately, they may have included genes that assure appropriate environmental input to developing human brains, such as genes that alter parental behaviour towards the young and genes that assure that infants, themselves, generate or seek relevant inputs.

By Oldowan times, hominins, with ape-sized brains, were making stone tools. This does not necessarily indicate that these early tool-makers were cognitively advanced over the apes. It does, however, indicate that a major behavioural shift in a human-like direction had already occurred. In consequence, these early hominins may already have been facing behaviourally-induced selective pressures that favoured human-like cognitive and motor skills not only with respect to tool-making, but also with respect to other skills that may enhance tool-making and tool-using endeavours, such as imitation, teaching, and rudimentary symbolic communication.

Both Neanderthals and early anatomically modern humans appear to have been constructing objects from diverse manufactured parts. Both applied hierarchical mental-construction skills to their tool-making endeavours and were clearly more human-like than ape-like in their object construction skills. No direct evidence exists pertaining to the presence or absence of language in these early peoples. To the extent, however, that mental constructional capacities can be applied across behavioural domains, the simplest hypothesis is that hominins who could construct tools could also construct and communicate ideas and language-like utterances although not necessarily as articulately and fluently as modern humans. One would expect that they would also have possessed more human-like than ape-like social intelligence and planning capacities.

The technological and artistic accomplishments of Neanderthals and early modern humans, however, paled in comparison to those of the later European Upper Palaeolithic peoples who, beginning about 35,000 to 40,000 years ago, exhibited greatly increased technological sophistication, increased production of symbolic art and jewellery and the first evidence of social stratification. This evidence has convinced many scholars that language or other distinctively human cognitive capacities arose just prior to the Upper Palaeolithic. This view, however, seems increasingly unlikely given the emerging evidence from Africa of advanced tool types, engraved objects and the symbolic use of ochre long prior to Upper Palaeolithic times. Surely, the humans who reached Australia by boat more than 40,000 to 60,000 years ago would also have had modern intelligence and linguistic capacities? Indeed, given that all modern human populations have spoken languages and exhibit similar intellectual abilities when reared in similar environments, the only viable hypothesis is that these capacities were already present prior to the time the earliest modern humans left Africa to spread throughout Eurasia.

Throughout human history, significant inventions have ushered in major technological, subsistence and social changes. No evidence indicates that genetic mutations for new cognitive, creative or linguistic capacities preceded or directly accompanied any of these inventions. Very possibly, the Upper Palaeolithic simply represents one of the earliest technologically-induced cultural revolutions. Certainly, a number of innovations that may have occurred during Upper Palaeolithic times would have had major reverberating effects on technology, subsistence and society.

To state that the pre-Upper Palaeolithic peoples possessed essentially modern mental capacities, as is hypothesized here, is not to imply that selection for cognitive, linguistic and social skills ceased at any specific point in time. Genetic variation in behavioural traits surely existed in Palaeolithic peoples, just as it continues to exist today. Selection may still be acting in favour of genes that facilitate the ability of individuals to function within society and to adapt to modern technological trends. This does not mean, however, that new genes are now emerging that endow some humans with entirely new cognitive and behavioural capacities not present in the rest of us or that any such genes have emerged in recent times. Exposure at young ages to computers, the internet, and video games is far more likely to change the phenotypic manifestations of human intelligence in our grandchildren than is genetic change. Such environmental impacts on developing brains have surely been of major importance throughout our evolutionary history and cannot be discounted as an additional contributing factor to technological and social changes in Palaeolithic times.

Rudiments of many behaviours once considered to be uniquely human can be found in our closest phylogenetic kin, the great apes. Human cognitive and linguistic achievements exceed those of the apes, however, because humans possess expanded mental constructional skills which they apply to varied cognitive and motor behavioural domains. Human infants and adults also possess species-typical behaviours which help channel infantile and juvenile brain development in linguistic, technical and human-like directions. These mental constructional capacities and infantile behaviours reflect the functions of a number of neural structures that are enlarged in humans as compared to apes. Evidence from the archaeological record suggests that mental constructional capacities evolved in a gradual or stepwise manner beginning possibly as early as Oldowan times and that modern constructional capacities were already present in pre-Upper Palaeolithic times.

Did Working Memory Capacity Power the Evolution of Modern Thinking?, T. Wynn, F. Coolidge

Working memory is a theoretical construct initially proposed in 1974 by Alan Baddeley to explain certain kinds of experimental results in human memory research. A classic example of a working memory problem is the 'reading-span' test. A subject is asked to read a series of sentences, each 14-16 words long, and to remember the final word of each sentence. The maximum number of final words remembered is the subject's reading span. This test reveals several of the important components of working memory. First, there is a processing component (reading) and a memory component (terminal words); the test is not simply a short-term memory test. But performance on the reading-span test also requires an ability to perform in the presence of distraction, and this is clearly a matter of attention, not storage. The essence of working memory is the capacity an individual has to hold and manipulate information in active attention. Working memory is much more than recall. It is, in a real sense, what one can 'hold in mind' and process at the same time. This attentional component clearly distinguishes working memory from simple measures of recall. Indeed, Baddeley himself has recently opined that a better label would have been working attention. The emphasis on attention is clear in all recent treatments of working memory.
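As a concrete illustration of the scoring logic just described, the following sketch simulates a simplified reading-span score; the sentences, recall lists and the `reading_span` helper are invented for exposition and are not part of Baddeley's (or any published) protocol.

```python
# A minimal sketch of reading-span scoring. The sentences, recall lists and
# the scoring rule are illustrative assumptions, not an actual test protocol.

def reading_span(trials):
    """Return the largest set size at which every final word was recalled.

    `trials` is a list of (sentences, recalled_words) pairs; each trial
    presents progressively more sentences, and the subject must read each
    sentence (processing) while retaining its final word (storage).
    """
    span = 0
    for sentences, recalled in trials:
        targets = [s.split()[-1].strip('.') for s in sentences]
        if all(word in recalled for word in targets):
            span = max(span, len(sentences))
    return span

trials = [
    (["The hunter followed the wounded deer into the forest."], ["forest"]),
    (["The fire had burned down before the storm arrived.",
      "She carried the water from the river back to the camp."], ["arrived", "camp"]),
    (["The old man told the children a story about the flood.",
      "Nobody remembered where the trail crossed the river.",
      "The tools were left behind when the group moved south."], ["flood", "river"]),
]

print(reading_span(trials))  # 2: the three-sentence trial exceeded storage under processing load
```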

Modern humans express considerable within-population variability in working memory performance. Kane and others have provided support that individual differences in working memory capacity may be related to general intelligence (also known as Spearman's g) and, more specifically, fluid intelligence (Cattell's gF). Citing the work of Kyllonen & Christal, they note a strong positive correlation (.90) between working memory and gF. This latter aspect of intelligence is thought to be the ability to solve novel problems, and depends less on schooling and acculturation than crystallized intelligence (gC), which is more dependent on formal schooling and acculturation.

Comparative research with non-human primates indicates that human working memory capacity significantly exceeds that of even our nearest relative. It is clearly one of the cognitive developments that accompanied human evolution. As palaeoanthropologists we are faced with three questions: 1. When did significant developments occur? 2. In what evolutionary context did they occur? 3. Why? We suspect that working memory capacity is a theme that could be followed through the course of human evolution. We have chosen, initially at least, to examine the palaeoanthropological evidence for the final step in that evolution — the enhancement of working memory (EWM) that enabled modern thinking and problem solving. We do not think that this final enhancement need have been dramatic, or that more archaic varieties of Homo sapiens were simpletons. Instead, we suggest that a small but significant enhancement of working memory yielded profound long-term advantages for populations that possessed this characteristic in significant frequencies.

The enhancement of working memory was a change in degree; modern Homo sapiens exhibit a greater working memory capacity than Homo erectus, and probably even than archaic members of the species. However, there is no known connection between working memory and encephalization, and the neural changes associated with enhanced working memory leave no recognizable landmarks on the gross anatomy of the brain, so even a superbly-preserved endocast would fail to reveal enhanced working memory.

We cannot make a truly compelling case for EWM prior to about 28,000 years ago. As few discussions of the evolution of the modern mind invoke such a late date, we are left with a conundrum. There are three possible solutions: 1. Absence of evidence is not evidence of absence. The archaeological record is either too poor, or of the wrong things, to document enhanced working memory reliably, and its apparent late appearance reflects differential preservation more than evolutionary developments. 2. Enhanced working memory appeared with the first anatomically modern humans, and enabled progressive cultural change. Owing to the ratchet effect, innovative cultural developments were at first few and far between, but the pace quickened toward the end of the Pleistocene. 3. The archaeological record is accurate. Enhanced working memory evolved late in human evolution, and powered the dramatic developments in human culture that emerge after 50,000 years ago.

The current hypothesis of an enhancement of working memory has several advantages over alternative hypotheses concerning the emergence of the modern mind: 1. Working memory has been extensively investigated and has voluminous experimental support. 2. Working memory has been shown to have a strong correlation with the 'g' of general intelligence as measured by intelligence tests and with fluid intelligence gF. 3. Working memory has clearly established implications for language. 4. Working memory capacity can be linked to innovation. 5. Working memory has integrated the vast literature on executive functions of the frontal lobes by subsuming these functions into the 'central executive' component of Baddeley's working memory model. 6. The power of the theory, and the variety of experimental conditions in which it has been applied, make it a good source from which to generate archaeologically visible attributes.

We have proposed that a relatively simple genetic mutation about 100,000 years ago (more or less), probably in general working memory capacity or one of its subsystems, was the final evolutionary development that modernized the human mind. We are not the first to propose a genetic mutation as the cause of modern thinking. Mithen proposed that a genetic mutation may have led to a neural reorganization beginning about 100,000 years ago and ending about 30,000 years ago. This neural reorganization, whose substrate he did not specify, occurred without an increase in brain size yet resulted in 'cognitive fluidity', a seamless blending of various aspects of intelligence and knowledge. Klein & Edgar also proposed a sudden neural change about 50,000 years ago that promoted the modern human ability to innovate, and they also hinted that one essential aspect of this change might have been the ability to produce and comprehend rapidly spoken phonemic language.

The essential argument against the genetic-change hypothesis is that a single gene mutation could not possibly result in a major reorganization of the brain. We too would argue vehemently against a single gene hypothesis....Thus, we are not arguing that some single dominant gene somehow miraculously reorganized the human brain. One possibility for our concept of enhanced working memory is that it was created through the very common process of genetic mutation, by which nearly all modern human and animal behaviour was created (at least indirectly, since natural selection acts on variations in behaviour as a result of genetic transcription errors). However, we are not arguing that it was a single dominant genetic mutation, but that it may have been a single additive genetic mutation that somehow influenced working memory capacity.

Thus, the old tired mantra, 'modern human behaviour cannot be due to a single dominant gene mutation' has some grain of truth to it. However, there are still many vaguely understood patterns and forms of genetic transmission that could support our enhanced working memory hypothesis. At some point, genetic transmission critics must also own up to the fact that at some place in time, there were demonstrable behavioural and neural substrate differences between ancient human types and moderns. The likelihood that some form of genetic transmission changed human potential now appears not only possible but highly probable. Also, it is important to recognize that all complex human behaviour has arisen as a result of positive selection upon a background of genetic mutations. There would be no evolutionary change without them.

There appears to be a general consensus among palaeoanthropologists that the ascendancy of modern human behaviour was somehow tied to symbol use. And many, perhaps most, now think that symbol use was a cognitive development as much as, or even instead of, a cultural development. However, advocates of a cognitive/symbolic root to modern behaviour rarely specify exactly what cognitive ability is responsible. To be sure, there are references to language, symbolic mediation, and social marking, but these are little more than assertions that modern minds generate modern behaviour. They are cognitive arguments in only a very general sense.

We aver that it is not only possible, but necessary, for palaeoanthropologists to apply more sophisticated models. Our particular approach has been to explore the implications of a working memory model for modern thinking. We do not pretend that working memory is the only component of modern thinking. However, we do maintain that it is a very important component and, moreover, that it can explain much that has often been loosely attributed to symbol use. Of course, it will not do simply to reject or ignore the symbolic argument. Instead, the working memory model must be explicit about its implications for symbolic systems, language, and social interaction.

Cognitive scientists and linguists are now in general agreement that language is a complex communication system that is based in multiple neural systems or networks. There is considerable disagreement about how many discrete components there are, which is most important, and how they relate structurally, functionally, and developmentally to other neural systems but, despite identification of genes such as the FOXP2, few if any neurolinguists believe that language as a whole is inherited simply. However, individual components may be inherited simply, and we contend that working memory is one of these. But how does working memory relate to other neuro-linguistic components, and what effect would enhancement of working memory have on language as a whole?

An enhancement of working memory would have had a significant impact on the grammatical component of the language faculty. Indeed, the impact of an expanded working memory on grammar and syntax is far easier to specify than the impact of grammar on problem solving. Recall that at the behavioural level working memory is the amount of information held in and manipulated by attention. An enhancement of working memory would increase the sheer number of words, or phrases if the phrases could be remembered and held as chunks, available for processing. This alone would provide the potential for increased complexity in communication, a fact appreciated by most linguists.

Recursion is the mechanism in grammar that enables a speaker to use an entire phrase as an object of a higher level phrase, e.g. 'He said that she said'. It is this feature that supplies native speakers of a language with the ability to produce, in principle, an infinite number of meaningful sentences. In practice, the size of this 'infinity' is constrained by several practical limitations, one of which is working memory. The number of recursions must be held and processed in attention if they are to be understood. 'Ron said that Harry said that Hermione said that Hagrid said that Dumbledore wants to see you', is a grammatically-correct sentence, but one that just about exhausts the capacity of working memory to analyse. Add two more levels of recursion and few native speakers could keep track....Perhaps the simplest interpretation of the effect enhanced working memory had on linguistic communication is to conclude that it enlarged the recursive capacity of language. An enhancement of working memory would yield immediate results in the length and complexity of sentences.
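The load that each added level of recursion imposes can be made concrete with a small sketch; the `embed` helper and the sample names are illustrative only, not anything proposed by Wynn & Coolidge.

```python
# A toy illustration (not from Wynn & Coolidge) of how each added level of
# recursion lengthens the structure that must be held in attention at once.

def embed(speakers, message="Dumbledore wants to see you"):
    """Nest 'X said that ...' clauses around a final message."""
    sentence = message
    for speaker in reversed(speakers):
        sentence = f"{speaker} said that {sentence}"
    return sentence

for depth in range(1, 6):
    names = ["Ron", "Harry", "Hermione", "Hagrid", "Ginny"][:depth]
    s = embed(names)
    print(depth, len(s.split()), s)

# Each level adds a clause that stays 'open' until the final message arrives,
# so comprehension load grows with depth even though the grammar never changes.
```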

Grammar enables speakers to encode more information in fewer words; indeed in many languages verb inflections can carry a tremendous amount of relevant information — person, tense, mode, transitivity, etc. It is this productivity and efficiency that has always made language such a unique and powerful communication system. So isn't it rash to claim that working memory, which has clear antecedents in mammalian cognition, should have equal billing in our understanding of the evolution of the modern mind? We think that it is not rash, and indeed contend that an enhancement of working memory is a better hypothesis than an enhancement of grammar for understanding the final step to modern cognition. Our major reason for this conclusion is that working memory encompasses more than just the production and decoding of utterances. It is the cognitive locus of immediate problem-solving, and much of this is non-verbal. Perhaps the most important non-verbal component is visuospatial, where shape and location information is held for processing. Much that we term 'creative' thought is based on the processing of images. And it is the central executive that processes all of this information, enabling such critical analytical procedures as analogy and thought experiment.... In sum, the implications of an expanded working memory for language are fairly clear. The effects of an enhanced grammatical ability would not be nearly as far reaching.

So what cognitive ability is required for the invention and maintenance of symbolic culture? As much as we would like to conclude that enhanced working memory is the answer, we cannot. Certainly the ability to hold more things in mind would open up possibilities for symbolic life, but would not in and of itself force the issue. Yes, enhanced working memory underpins innovation, analogical thinking, and abstraction, but none of these appears sufficient to explain the emergence of symbolic culture.

We do not mean to suggest that beads, or ochre, or engraved bones, might not be acceptable bits of evidence for modern behaviour (though we do believe the case has never been made convincingly). However, they cannot stand as evidence for modern cognition unless one can specify the cognitive abilities they require. Association with modern humans is not alone a strong enough argument for modern cognition. The record of pigment use and bead production may well document an independent evolution of symbol use. Barham has suggested that discussion of the evolution of symbol use be dissociated from assumptions about modernity and cognition. We heartily agree.

We do not claim that the evolution of working memory is the answer. But it was certainly an important component. The problem-solving abilities that are so important to the success of modern humans are not specifically entailed by symbol use or language. They are very much entailed by working memory capacity, and any discussion of the emergence of modern cognition must take this into account.

The Social Brain and the Cultural Explosion of the Human Revolution, R. Dunbar

I want to argue here that the more remarkable aspect of this sea-change in human evolution was not so much the tools and the artwork, but the world of the imagination that, then and now, defines the human condition. The inevitable problem for the archaeologist, of course, is that this aspect of human behaviour is less directly visible in the material record than the more technological aspects of hominin behaviour on which discussion has hitherto focused. The temptation will always be to shy away from grappling with this aspect of humanity simply because it is easier to focus on that which is physically present. I want to argue, instead, that, since it is this 'world of the imagination' that is really the essence of what it is to be human, we must think more seriously about finding imaginative ways of recognizing the signatures of this world in the archaeological record.

The social brain hypothesis argues that the principal pressure selecting for increased brain size has been sociality, with instrumental intelligence (e.g. tool manufacture and use) being a by-product, albeit in the case of humans a particularly significant by-product. The claim is that sociality has been the key evolutionary innovation that made primates as evolutionarily successful as they have been. In practice, the focus of the social brain hypothesis has been on neocortex evolution. This has principally been because, in terms of primate brain evolution, it is neocortex size that has increased out of all proportion to the rest of the brain. In non-primate mammals, the neocortex accounts for between 10 and 40 per cent of total brain volume, but in primates it accounts for between 50 and 80 per cent. For modern humans, four-fifths of total brain volume is neocortex, and all the rest accounts for a very modest amount. In other words, when we ask what has driven primate brain evolution, what we are really asking is what has driven primate neocortical evolution. However, since the neocortex is also the locus for most of the more sophisticated aspects of human cognition, it is perhaps significant that it is this component of the brain that has expanded out of all proportion.

It is the business of social bonding that imposes the cognitive constraint that is so demanding of brain-based computing power. Indeed, there is now a growing body of evidence to confirm this: for example, male mating strategies, the size of grooming cliques and alliances, the frequency of social play and the frequency of tactical deception have all been shown to correlate with relative neocortex size in primates.

If primate (and hence hominin) brain/neocortex evolution has been driven by social forces, this gives us a very different perspective on hominin evolutionary history. Traditionally, Palaeolithic archaeologists have focused on tool production and use. This has, perhaps, been understandable, since prior to the Upper Palaeolithic Revolution (irrespective of when and where it began) the archaeological record consists almost exclusively of lithic tools. However, the question posed by the social brain hypothesis is whether the tools themselves are something of a red herring: the substantive changes were in fact in the social domain.

Since behaviour seldom leaves an archaeological trace, and the mind states that lie at the core of the social brain hypothesis are even less likely to do so, this necessarily creates something of a dilemma for archaeologists, given that they can only ever deal with phenomena that leave a material trace in the fossil record. One response might be to ignore the phenomenon as beyond the scope of the discipline. However, doing so would, I want to argue, seem to have the unfortunate consequence of writing off most of what it means to be human. Primate societies are implicit social contracts established to allow their members to solve the ecological problems of survival and reproduction more effectively than they could do on their own. Primate societies work as effectively as they do in this respect because they are based on deep social bonding that is cognitively expensive. Thus it is the computational demands of managing the complex interactions between group members that has driven neocortex evolution.

The social brain has come to lie at the centre of what it means to be a (modern) human. In particular, a strong case can be made for the claim that it underpins just those aspects of culture that we might see as being at the heart of the human experience, namely religion and culture in the literary sense. This is so both because these are activities that seem to be especially demanding in cognitive terms, and because ultimately they involve interactions between people. Not only are tool-making and tool-use less cognitively demanding than navigating one's way through the minefields of the social world, but a case can be made that tools play only a modest part in the everyday lives of modern humans because these lives are primarily about the experiences of the social world.

Apes seem to represent something of a pinnacle in this respect [capable of achieving second-order intentionality], since it is widely accepted (and there is at present no evidence to suggest otherwise) that the best all other species of animals can manage is first order: they know the contents of their own minds, but cannot make the extra leap required to use that knowledge as a basis of imagining another individual's state of mind. Thus, what modern humans can achieve in this respect is three orders above what even the best animal contenders can manage.

The fact that social cognitive skills seem to be a function of cortical volume, and that brain matter is energetically so expensive, raises the fundamental question as to why modern humans (in particular) should need such demanding capacities. While social cognitive capacities like theory of mind clearly play a fundamental role in allowing us to negotiate the complexities of our social world, it is difficult to see when we might need to resort to more than third-order intentionality in most everyday circumstances.

However, there are two human activities that do appear to be very demanding in terms of social cognition, namely story-telling and religion. It is perhaps easiest to see this in respect of story-telling. For a story to be interesting, it really needs to involve at least three characters and their respective mind states. Anything less than this and it becomes a simple narrative.... Story-telling is an important activity for modern humans because it provides an activity that plays an important role in social bonding, especially in small-scale societies. It may well be that stories told around the camp fire lack the sophistication of Shakespeare's constructions, but it is clear that even for a story to be sufficiently interesting to grab the audience's attention, fifth-order intentionality is probably the minimum requirement for the story-teller.
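The notion of orders of intentionality can be made concrete with a toy sketch that simply counts the mental-state clauses nested around a proposition; the `nest` helper and its example agents are my own illustration, not Dunbar's formalism.

```python
# A toy way of counting orders of intentionality: each mental-state verb
# wrapped around the proposition adds one order. Agents and verbs are invented.

def nest(minds, proposition="the hunt will begin at dawn"):
    """Wrap mental-state clauses around a proposition; order = number of minds involved."""
    statement = proposition
    for agent, verb in reversed(minds):
        statement = f"{agent} {verb} that {statement}"
    return len(minds), statement

minds = [("I", "intend"), ("you", "believe"), ("the chief", "wants"),
         ("the hunters", "think"), ("the spirits", "expect")]

order, s = nest(minds)
print(order, s)
# 5: 'I intend that you believe that the chief wants that the hunters think
# that the spirits expect that the hunt will begin at dawn' -- roughly the
# level of nesting a story-teller (or a shared religious belief) must sustain.
```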

The key point here is the fact that both story-telling and religion require those involved to work in a virtual mental world, one that they cannot see and touch through immediate experience. They have to be able to imagine that the world they live in could be other than as they immediately experience it. Their functional significance, however, lies in their capacity to bond large groups of individuals, allowing them to function as coordinated entities despite the fact that the members might be dispersed over very considerable distances. My claim here is that it is this capacity, and these two activities in particular, that we should identify with what it is to be a modern human.

Fifth-order intentionality is unlikely to have appeared prior to the appearance of anatomically modern humans. Homo erectus populations were clearly limited to third-order intentionality, and all but a handful of the archaic humans to fourth order (and even these cases — all late Neanderthals — are questionable, since there is some evidence to suggest that Neanderthal frontal lobes may have been smaller than their cranial volumes would imply).

The evolutionarily most parsimonious explanation is that fifth-order intentionality evolved because religion and, perhaps, story-telling were crucial mechanisms for bonding large social groups, and without these mechanisms in place it would have been impossible for early modern humans to have maintained the temporal and spatial coherence of their large dispersed social groups.

My aim here has been to present at least a prima facie case for arguing that we should attribute the time of origin of the modern human mind to some time around 200 kya (i.e. the appearance of anatomically modern humans). This is substantially earlier than archaeologists have previously considered plausible, even on the most charitable interpretations of the material culture record. Inevitably, therefore, archaeologists are likely to worry about how we can test such predictions from the material culture record. One solution, of course, is to insist on the primacy of the artefact record, and so argue that the only uncontroversial evidence for the modern mind is the shift in style and quality of tools and other artefacts that begins to appear after about 100 kya in Africa. My concern, and in a sense my challenge, is to worry that if we adopted this solution we might seriously underestimate the true historical depth of the modern human mind, and that might be just as seriously remiss.

Of course, the relative paucity of the archaeological record prior to 100 kya does not help — although the time depth of sites in Africa is being pushed increasingly further back with each passing decade. More importantly, perhaps, my suggestion should encourage us to think more creatively about the archaeological record in an effort to identify archaeological signatures that we can read with confidence. Since the social brain hypothesis (along with its various implications like religion and story-telling) is strictly social in nature, it is some facet of the social world that we need to focus on. Tools per se may have little or nothing to tell us in this respect, although the extent of trading networks over which tools and raw materials are exchanged may do so. What we thus need, perhaps, is better data on raw material sourcing and evidence for the geographical range and dispersion of cultural similarities in artefact design.

Did Syntax Trigger the Human Revolution?, D. Bickerton

Until recently, the notion persisted that, around 50 kya, there occurred an event or process commonly referred to as the 'Human Revolution' or 'great leap forward', during which all the major cognitive traits characteristic of modern humans emerged rather suddenly. Indeed, it has been claimed that only some mutation affecting the human brain around that time could have caused such a development.

Over the last few years, however, it has become apparent that the 'great leap forward' was an artefact resulting from a Eurocentric sampling of the fossil record. Studies of sites in Africa have shown that almost all of the innovations that cluster between 50 and 30 kya in Europe and the Middle East can be found at one site or another in Africa at much earlier periods. This has led many to conclude that characteristically human cognitive capacities (CHCCs) evolved gradually over a period of two or three hundred thousand years, a view highly compatible with a strictly gradualist view of Darwinian evolution.

[One plausible scenario is that] characteristically human cognitive capacities (CHCCs) developed quite rapidly (in evolutionary terms), probably around the time (150 kya) when genetic and some morphological indicators suggest that the speciation of anatomically modern humans occurred. However, while it may make novel artefacts possible, acquisition of CHCCs does not mandate their creation. Rather, new possibilities now opened will be exploited only as and when some selective pressure or cultural development requires that they be exploited. Consequently, the timescales of CHCCs and human technology may be quite distinct. On the other hand, it seems counterintuitive to suppose, as Deacon does, that CHCCs could have lain dormant (at least as regards their expression in the archaeological record) for several hundred thousand years. Humans and their immediate ancestors are niche-constructing animals par excellence, and that they would have totally ignored novel capacities in their development of new niches is as unlikely as that those capacities would have instantaneously triggered a full flood of new inventions.

Let us assume that, at approximately 2 mya, some form of proto-language (a mode of communication distinct from animal communication systems, in that it contained truly symbolic units and was capable of predication) began to develop. What, precisely, would have been the effect of such a medium on human cognitive processes? It would have enabled a much more efficient storage and sorting of memories. Moreover, this storage and sorting could be shared among group members. As the number and functions of proto-language units increased, the existing capacities of a social, foraging species would be correspondingly enhanced. However, it is unclear that any significant novelty would have emerged without further development of the proto-language.

In what follows, I will try to distinguish between three kinds of thinking, which we may distinguish as 'thought 1' (pre-linguistic thinking), 'thought 2' (thinking after the emergence of proto-language) and 'thought 3' (thinking after the emergence of language). Of these, only 'thought 3' would be adequate for the construction of a sustained train of thought.

To understand how the human capacity for sustained thinking developed, we need to go back to the beginning, to the last common ancestor (LCA) of chimpanzees and humans. We can assume that this animal (along with modern apes and other cerebrally well-developed species, such as dolphins and parrots) had a rich conceptual structure, but that this structure could only be triggered reactively. That is to say, the animal could have thoughts that would, in human terms, correspond to 'That's a lion!' (reacting to an external stimulus) or 'I'm hungry' (reacting to an internal stimulus). It could not, however, have thoughts equivalent to 'Hungry lions are dangerous', because while its conceptual system contained a class of lions, it had no means to activate that class as a concept in its own right — it could only activate the concept of a member of that class, and then only when there occurred some external phenomenon that it interpreted (rightly or wrongly) as indicating the physical presence of a lion. In other words, the LCA (along with contemporary apes and probably australopithecines) was capable only of 'thought 1'.

How would this have been affected by the development of a proto-language? Although there is considerable disagreement about the precise nature of the proto-language, holders of very different viewpoints agree that it would not have had any kind of systematic internal structure. Either it would have been holophrastic ('The whole thing means the whole thing', in Wray's terms) or, as will be assumed here, it consisted of short, unstructured strings of single units (whether orally or manually produced is of no importance in this context) roughly corresponding, in terms of semantics, to the individual words of modern-human languages.

The creation of a proto-language would have enabled its possessor, for the first time, to think about things in their physical absence, without the automatic triggering of a response (flight, aggression, watchful waiting, or whatever) that a signal from a pre-linguistic communication system might have occasioned. This new capacity would have been limited, early on, by a relative scarcity of symbols, which would only gradually have been remedied.

Proto-language users would have remained in the state described as 'thought 2'. 'Thought 2' would have suffered from limitations similar to those of proto-language; just as, with the one, it would not have been possible to talk about very much for very long, so with the other it would not have been possible to think about very much for very long. Certainly the long trains of thought necessary for the creation of behavioural and technological innovations would have been impossible.

What differentiated this system [thought after language] from the system that generated proto-language lies in the fact that it is hierarchical rather than purely linear. As pointed out nearly half a century ago by Simon, hierarchical structures are always more stable, more robust and will support more complexity than structures assembled by piecemeal addition. In other words, instead of adding B to A and C to B and D to C..., the new system would have combined A and B, then combined C with [AB], then D with [[AB]C], and so on.
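The contrast Bickerton draws between piecemeal addition and hierarchical combination can be sketched schematically; the code below is only an expository toy, not a model of syntax, and the bracketing convention is an assumption of mine.

```python
# A schematic contrast (my illustration, not Bickerton's) between piecemeal
# linear chaining and hierarchical combination of the same four units.

units = ["A", "B", "C", "D"]

# Linear: each unit is simply appended to the growing string, one at a time.
linear = units[0]
for u in units[1:]:
    linear = linear + u                     # A, AB, ABC, ABCD -- a flat sequence

# Hierarchical: each step embeds the previous result as a single constituent.
hierarchical = units[0]
for u in units[1:]:
    hierarchical = f"[{hierarchical}{u}]"   # [AB], [[AB]C], [[[AB]C]D]

print(linear)        # ABCD
print(hierarchical)  # [[[AB]C]D]

# The bracketed form makes each intermediate result a reusable chunk, which is
# why hierarchical structures tolerate more complexity than flat chains.
```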

A perhaps even more crucial distinction between the two types of system lies in their timing. A proto-linguistic mode is inescapably slow: because it has no internal structure, any message, regardless of whether it is uttered or merely thought, has to be assembled as well as expressed in a strict linear order, one concept or word after another. A linguistic mode can perform simultaneous sub-assemblies. Take a sentence like 'The man you saw yesterday knows the woman you are meeting this afternoon'. The two phrases, 'the man you saw yesterday' and 'the woman you are meeting this afternoon', can be assembled simultaneously. This may be critical in showing why novel and complex artefacts could not have been produced by earlier hominins. It would appear that such a system is a necessary pre-requisite for the kind of thinking that gives rise to innovation.

Accordingly it may be assumed, unless and until some reliable counter-evidence emerges, that development of the syntax of language was a necessary and possibly a sufficient pre-requisite for the emergence of true modernity and the CHCCs that have given our species the unusual position in nature it now holds. If it was indeed sufficient (which is of course the null hypothesis, and incidentally one that satisfies Occam's Razor) an interesting consequence, one that would surely have pleased Darwin, follows. It is that CHCCs are not in fact novel capacities, but merely capacities common among primates (and possibly other taxa) that have been supercharged by the development of language in general and syntax in particular.

Is it possible to determine the precise onset of the syntactic capacity? One way to do this is to look at the archaeological record for signs of artefacts whose form and function had to be imagined in some detail before they could be produced.... The conclusion, on the basis of presently-available evidence, must then be that there was in some sense a 'human revolution', a clear distinction in terms of cognitive and creative power that developed quite rapidly in evolutionary time, exclusively in Africa, and long before the previously supposed date of the 'great leap forward'. Indeed, there is no reason to suppose any significant difference between the contemporary human mind and that of 100 kya. In the absence of any other convincing cause for this development, we may assume that it resulted from the emergence of syntax, an agency just as powerful in its manipulation of thought as in its manipulation of words. We may think of ourselves as 'the Symbolic Species', but 'the Syntactic Species' perhaps more accurately distinguishes us from all the species that preceded us.

Music and the Origin of Modern Humans, S. Mithen

On one hand, this is surprising: music is universally found within present-day human societies and those documented in the historic and ethnographic records; people in all cultures invest a great deal of time, energy and resources in singing and dancing, whether participating themselves or simply observing. As such, we cannot pretend to have understood modern human origins, or to be undertaking an appropriate range of studies, without addressing why and how we became a musical species.

In his classic 1964 work The Anthropology of Music, Alan Merriam described the many functions that music provides, ranging from aesthetic enjoyment and entertainment to validating social norms, communication and supporting religious rituals. The last of these is important because although music is universally found in human societies, its role is highly variable with the only common feature appearing to be its use in religious activities. There is unquestionably a very close connection between music and religion and on that basis one might initially assume that they originated at the same time — it is indeed difficult to imagine how any form of religious activity could exist without the involvement of music in some manner, whether that is song or dance. That would, however, be a false assumption as the anthropological and archaeological records provide evidence that activity we should classify as musical, if not music itself, originated early in human evolution and long before we have any traces for religious behaviour. I will outline this evidence, much of which is necessarily circumstantial, and in doing so also reflect on the relationship between the evolution of music and language.

By 'music' I am simply referring to sound that involves variations in pitch, rhythm and melody, whether made by a voice or by the use of material culture. There are many similarities and overlaps between music and language. Both exist in three modes: as audible sound, as gesture and in written form. Both emanate from the human body but can be manipulated and transformed by material culture, whether loudspeakers or musical instruments. Both are hierarchical, combinatorial systems with language using words as the building blocks of phrases, and music using pitches. Musical styles and languages can be grouped into families; they evolve and patterns of descent and blending can be reconstructed. There is a musicality to spoken language — prosody — which becomes particularly exaggerated when we speak to young children, while poetry and song are explicit combinations of spoken language and music.

Just as there are similarities, there are also significant differences. The most important is that language is efficacious at conveying information because words have shared meanings. Moreover, words are combined by rules of grammar which are also shared by a linguistic community and which add a secondary level of meaning — hence 'man bites dog' is different in meaning from 'dog bites man', even though the words are the same. Music does not convey information in the same manner, if it conveys information at all. The units of music — pitches — do not have shared meanings and while musical styles have rules, these do not impose additional levels of meaning in the manner of a linguistic grammar (they cannot, because there is no meaning in the first place). In this regard, describing the complex of rules of a musical style as 'grammar' is unhelpful.

While primate calls may be quite unlike human language, the continuities with human music are more readily apparent — monkeys and apes make extensive use of variations in pitch, rhythm and melody. Richman has described this for geladas which chatter together in groups using a wide range of rhythms and melodies to facilitate their communication and manipulate the emotions of other individuals. Perhaps the most musical of non-human primates are the gibbons of southeast Asia. Males and females of each of the twelve species sing alone before and after mating, while the mated pairs of all but two species sing duets. Their songs are species specific and appear to be largely a product of biological inheritance rather than of learning. Quite why mated pairs sing together remains unclear: this may be to strengthen their pair bond, advertise their paired status or as a means of territorial defence.

It is reasonable to assume that vocal communication by the common ancestor to the chimpanzee and modern humans that lived 5-6 million years ago was similar to that of the chimpanzee today. We can then ask how this is likely to have changed in light of what we know about the anatomy and lifestyles of the australopithecines and early Homo. First we should note that the smaller teeth and jaws of early hominins, and especially early Homo, in comparison to extant apes would have changed the shape and volume of the final section of the vocal tract and increased the potential for movement by the tongue and lips. This would have increased the number and diversity of oral gestures, which Studdert-Kennedy argues form the fundamental units of speech. While more diverse, it seems unlikely that these fundamentally differed from those of modern apes.

Changes in hominin lifestyles are likely to have created selective pressures for enhanced vocal communication. With regard to subsistence, early hominins lived on relatively open savannas in comparison to chimpanzees and the common ancestor, and were more dependent upon meat. This lifestyle would have increased predator risk and hence we might expect this selective pressure to have enhanced the number and type of predator alarm calls. This is likely to have included the ability to modulate the loudness of calls, so that communication could occur without attracting the attention of potential predators. For the same reason, we should expect there to have been an increased use of gesture.

Changes in social life would have also created selective pressures for an expansion of vocal communication. A key development is likely to have been an increase in group size over that observed among the apes today. This would have arisen as a means to reduce predator risk in relatively open habitats. Aiello & Dunbar argue that the larger cranial volume of early hominins, especially early Homo, in comparison to non-human primates today is indicative of living in larger groups. They suggest that physical grooming by one individual of another would no longer have been sufficient to service social relationships — there was simply not enough time to groom all the necessary individuals. As a consequence, they argue, 'vocal grooming' was selected as 'an expression of mutual interest and commitment that could be simultaneously shared with more than one individual'.

While Aiello & Dunbar describe vocal grooming as providing the origins of language, it seems far more relevant to the origins of music because they prioritize its tone and emotional impact over its information content. We should, therefore, think of early hominin social vocalizations as elaborations of the type of rhythmic and melodic utterances used by gelada monkeys. Expressing and inducing emotion may have been just as important as demonstrating social commitment within these hominin groups. Those individuals who were able to express their emotions vocally and, more importantly, induce emotions in others and hence manipulate their behaviour, would have been at a reproductive advantage. Such vocal expressions may have functioned to diffuse social tensions or increase the likelihood of social cooperation.

Bipedalism would have had numerous consequences for the nature of communication. One of these is the positioning of the larynx. This is found much lower in the human throat than that of the chimpanzee, which allows a wider array of sounds to be produced. Aiello explained that this is most likely a consequence of bipedalism, as this requires the spinal cord to enter the brain case from below rather than behind to enable an upright stature. To do so, it reduces the space between the spinal cord and the mouth — the space where the larynx is found in chimpanzees. This space was further reduced in capacity by changes in the hominin face and dentition. A consequence of these anatomical developments was that the larynx had to become positioned lower in the throat, which then had the incidental effect of lengthening the vocal tract and increasing the diversity of possible sounds it could produce.

My use of the term music in this chapter refers not just to sound but also to movement — dance as well as song. Standing or walking on two legs requires that the centre of gravity is constantly monitored and small groups of muscles are frequently recruited and changed to correct its position; the movement of legs has to be integrated with that of arms, hands, trunk and head in order to maintain a dynamic balance. As such, bipedalism required a larger brain and more complex nervous system to attain the relatively high degree of sensorimotor control. These will then have allowed the possibility for movement other than for locomotion itself; movement that was expressive, making use of rhythm and melody — movement that we would characterize as dance.

Those mothers who "were able to communicate with their infants in a music-like manner and hence support their emotional development and enculturation would have gained a reproductive advantage. The physical possibility of doing so had arisen from bipedalism which had provided an enhanced number and diversity of oral and physical gestures. Mother-infant interaction would, however, have just been one context in which music-like abilities would have been selected. Another is likely to have been male-female interaction with regard to mate choice... This was Charles Darwin's explanation for how the human capacity for music evolved. Having already argued that male birdsong had evolved by a process of female choice — 'The true song ... of most birds and various strange cries are chiefly uttered during the breeding-season, and serve as a charm, or merely a call note, to the other sex' — Darwin proposed that the same had occurred in human evolution: 'we may assume that musical tones and rhythm were used by our half-human ancestors, during the season of courtship'.

This sexual selection argument for the evolution of music was elaborated by Miller who argued that singing and dancing constituted a package of indicator traits for those choosing mates, predominately females. They demonstrated fitness, coordination, strength, agility, self-confidence and so forth. As such, a female seeking 'good genes' for her offspring would have been attracted to males whose song and dance revealed such characteristics as these were desirable features for her children to inherit. By this process of mate choice, Miller argued, the capacities for song and dance would have been selected for in successive generations contributing to the capacity for music that we have today, whether or not that is still used for male display.

Berlin has shown a strong tendency for the words that people use for small animals to make use of the vowel sound [i] (e.g. bee, fly), created by making a small oral cavity, and those for large animals to use the vowels [e, a, o, u] (e.g. mammoth, elephant), created by a large oral cavity. Berlin's work is important because it challenges one of the most fundamental claims of linguistics: that of an arbitrary link between an entity and its name. He has shown that the names of animals often reflect the inherent properties of the animals concerned, whether it be the sounds they make (as in onomatopoeias), their size or the way they move....Whether or not hominins had names for animals or birds, I suspect that their communication systems were similarly moulded by the sounds and sights of the natural world.

One of the key roles of music in the modern world is social bonding. The majority of music-making is done by groups, whether church choirs or football crowds. Singing and dancing together promotes feelings of group identity. It does so not only by strengthening bonds between individuals, but also by excluding others — those who sing or dance to another tune or anthem. Quite why music is so effective at building such bonds remains unclear. It may relate to the release of the hormone oxytocin into the brain which is believed to facilitate social bonding. My own view is that group music-making serves to develop feelings of trust towards others that provide the emotional foundation for cooperation.

The need for cooperation would have provided a further selective pressure for music-like communication combining with those relating to mother-infant interactions, mate choice and information transmission about the natural world. The changes in hominin anatomy, principally relating to bipedalism, would have enabled such selective pressures to become realized in the evolution of a communication system with music-like qualities in terms of making extensive use of rhythm and melody. While this communication system might be described as 'proto-language', whether it would have had entities equivalent to words and a structure equivalent to grammar is contentious. Theories about the nature of proto-language fall into two categories: compositional and holistic theories. The essence of compositional theories is that proto-language consisted of words with limited, if any, grammar — as championed by Bickerton.

The transformation of such proto-language into language required the evolution of grammar — rules that define the order in which a finite number of words can be strung together to create an infinite number of utterances, each with a specific meaning. Compositional theories of proto-language have dominated studies of language evolution for the past decade; they have been highly influential but may have led us in the wrong direction for understanding the evolution of language. Alternative views have recently emerged which fall into the category of 'holistic' theories, such as those of Wray and Arbib. Wray argues that the precursor to language was a communication system composed of 'messages' rather than words; each hominin utterance was uniquely associated with an arbitrary meaning, as are the majority of words today, and indeed those of a Bickertonian-like proto-language.... The communication system that I have proposed above, one that makes extensive use of rhythm and melody, is more compatible with a holistic rather than a compositional theory of proto-language. Indeed, the same communication system could be equally described as proto-music, and music remains as a holistic means of communication today.
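The difference between holistic and compositional signalling can be illustrated with a small sketch; the invented sound strings, the `compose` helper and the word-order rule are assumptions for exposition, not Wray's or Bickerton's formalisms.

```python
# A toy contrast (my assumption of how the distinction might be rendered)
# between holistic and compositional signalling.

# Holistic: each whole utterance maps to a whole meaning; nothing recombines.
holistic = {
    "tebima": "give that to her",
    "mutapi": "share meat with the children",
}

# Compositional: a small lexicon plus an ordering rule yields new messages.
lexicon = {"give": "GIVE", "meat": "MEAT", "her": "HER", "children": "CHILDREN"}

def compose(verb, obj, recipient):
    # verb-object-recipient order stands in for a (minimal) grammar
    return f"{lexicon[verb]} {lexicon[obj]} {lexicon[recipient]}"

print(compose("give", "meat", "her"))        # GIVE MEAT HER
print(compose("give", "meat", "children"))   # GIVE MEAT CHILDREN -- never uttered before

# The holistic system can only reuse stored wholes; the compositional one
# generates utterances its speakers have never heard.
```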

Symbolic artefacts are those that have been either shaped or decorated so that they have a meaning that is quite arbitrary to their form. Pieces of art are obvious examples, especially those that are non-representational. A problem which archaeologists face is that modern hunter-gatherers, indeed modern humans in general, frequently attribute symbolic meanings to entirely unmodified found objects and to natural features of the landscape. We can never be sure that the Neanderthals did not do the same. But the absence of any objects that have been intentionally modified and lack a feasible utilitarian or other non-symbolic interpretation suggests that we should err on the side of caution. The connection between symbolic artefacts and spoken language is simple, but to some contentious: if Neanderthals were able to use words — discrete utterances with symbolic meanings — they would have also been able to attribute symbolic meanings to objects. Objects that have symbolic meanings provide an invaluable aid to social interaction — we use them continually and are entirely surrounded by them.

It seems unlikely that Neanderthals would have had the capacity to produce symbolic objects but did not discover or choose to use that capacity for more than 200,000 years while living in the most challenging of environments, and often on the very edge of survival itself. The absence of symbolic objects must imply the absence of symbolic thought, and hence symbolic utterances. Without these, by definition, there was no compositional language. A second argument against the idea that Neanderthals possessed language is the immense stability of their culture. The tools they made and the way of life they adopted at c. 250,000 years ago were effectively no different to those at the moment of their extinction, just under 30,000 years ago. As we know from our own personal experience and from a moment's reflection on human history, language is a force for change: with it we can exchange ideas so that our technology is improved and new ways of living are introduced. So if the Neanderthals had possessed language, then how could their culture have remained so stable and so limited in scope?

If there was ever a population of humans that needed to invent bows and arrows, the means for storing food, needles and thread, and so forth, it was the Neanderthals. But all of these only came with modern humans, who then went on to invent farming, towns, civilization, empires and industry. In contrast, the Neanderthals had immense cultural stability, and so we are most likely looking at a species that had a holistic rather than compositional form of communication.

Wray uses the term 'segmentation' to describe the process whereby hominins began to break up holistic phrases into separate units, each of which had its own referential meaning and could then be recombined with units from other utterances to create an infinite array of new utterances. This is the emergence of compositionality, the feature that makes language so much more powerful than any other communication system. Wray suggests that segmentation may have arisen from the recognition of chance associations between the phonetic segments of the holistic utterance and objects or events to which they related. Once recognized, these associations might then have been used in a referential fashion to create the new, compositional phrases.
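Wray's segmentation scenario can likewise be sketched as a toy procedure that looks for a chance overlap of sound and meaning across two holistic utterances; the invented forms and the `chance_association` helper are illustrative assumptions only.

```python
# A minimal sketch (my illustration of Wray's segmentation idea, with invented
# sound strings) of how a chance sound-meaning overlap could be reanalysed as a word.

utterances = [
    ("kumapi", {"give", "her", "meat"}),    # holistic call: 'give her the meat'
    ("tebapi", {"throw", "him", "meat"}),   # holistic call: 'throw him the meat'
]

def chance_association(utts):
    """Find a sound segment and a meaning element shared by both utterances."""
    (s1, m1), (s2, m2) = utts
    shared_sounds = [s1[-n:] for n in range(2, min(len(s1), len(s2)) + 1)
                     if s1[-n:] == s2[-n:]]
    shared_meaning = m1 & m2
    return (max(shared_sounds, key=len, default=None), shared_meaning)

print(chance_association(utterances))
# ('api', {'meat'}): the recurring segment 'api' can now be used referentially
# for 'meat' and recombined into new, compositional utterances.
```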

It may only have been within the earliest Homo sapiens communities in Africa that people began to adopt specialized economic roles and social positions, that trade and exchange with other communities began, and that 'talking with strangers' became an important and pervasive aspect of social life. Such developments would have created pressures to exchange far greater amounts of information than was previously necessary in the socially intimate, undifferentiated groups of Early Humans. Only then would there have been the need for generalization, in the manner that Kirby describes within his simulation.

The archaeological record of Africa relating to the appearance of Homo sapiens certainly suggests that such social developments occurred. The dilemma, of course, is whether we are dealing with cause or effect: one might argue that the development of economic specialization and exchange relations between groups were a consequence of compositional language that enabled the necessary communication to be undertaken. My guess is that we are dealing with strong feedback between the two — they 'boot-strapped' each other to create rapid changes in both society and communication. The kick-start to such developments may have been a chance genetic mutation — the second possibility for why segmentation of holistic utterances only occurred in Africa with the advent of modern humans. This may have provided the ability to identify phonetic segments in holistic utterances which had previously been absent.

Some aspects of language are dependent on the possession of the specific gene FOXP2, the modern human version of which seems to have appeared in Africa soon after 200,000 years ago. Perhaps the process of segmentation was dependent upon this gene in some manner that has yet to be discovered. Indeed, it may be significant that those members of the KE family who were afflicted by a faulty version of the FOXP2 gene had difficulties not only with grammar but also with understanding complex sentences and judging whether a sequence such as 'blonterstaping' is a real word. These difficulties seem to reflect a problem with the segmentation of what would have sounded to them like holistic utterances. So perhaps it was only with the chance mutation of the FOXP2 gene to create the modern human version that segmentation became possible. Alternatively, there may have been other genetic mutations at a similar date that enabled the transition from holistic phrases to compositional language, perhaps by the appearance of a general-purpose statistical learning ability.

If Wray is correct and compositional language evolved from holistic proto-language relatively recently by a process of segmentation, then the most likely time and place for this is between 200,000 and 70,000 years ago in Africa, and for this process to be related to Homo sapiens. It may have taken such a length of time — about 4500 generations — for compositional language to have become the dominant form of communication. This would explain the initially slow and then dramatic behavioural changes associated with Homo sapiens that McBrearty & Brooks document in the Middle Stone Age archaeological record from Africa. Language will not only have enhanced communication. It will also have changed the nature of thought by enabling cognitive fluidity — the integration of ideas and stores of knowledge that had previously been isolated in separate cognitive domains. Cognitive fluidity provides the capacity for analogical and metaphorical thought which, as I argued in The Prehistory of the Mind, provides the foundations for art, science and religion, all of which we see for the first time in the Middle Stone Age of Africa.
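
The figure of about 4500 generations is simple arithmetic, assuming a generation length of roughly 29 years (the generation length is an assumption supplied here for illustration; the text gives only the total):

\[ \frac{200{,}000 - 70{,}000\ \text{years}}{\approx 29\ \text{years per generation}} \approx 4{,}500\ \text{generations}. \]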

Music is used today for a wide variety of functions, including the expression of emotions, building social bonds, acculturating and supporting the development of infants, and as a means of sexual display. All of these functions can find their roots in the evolutionary history of our species, because long before compositional language had evolved, communication would have made extensive use of rhythm and melody to achieve these same ends. We are a musical species today because musicality has pervaded the lives of our ancestors ever since the first members of our genus appeared — and probably for long before. Unlike compositional language, visual art or religious thought, music is one universal feature of modern humans for which there was a long and gradual evolutionary history and which we share with both our close and our distant relatives within the genus Homo.

Down With the Revolution, S. McBrearty

The European Upper Palaeolithic is a wondrous thing. European archaeologists have rightly been fascinated by its richness, inventiveness and sheer volume: elegant and variable flint tools, human burials with striking ornaments, superb and enigmatic art objects, painted caves with skillfully rendered depictions of extinct Ice Age animals, and even individual portraits of people dead for tens of thousands of years. The Middle to Upper Palaeolithic transition in Europe is equally riveting. It is the last time, except perhaps on the island of Flores 18,000 years ago, when human beings actually beheld a member of another species of their own genus.

These phenomena provide a source of justified and never-ending fascination, but what have they got to do with the origin of human beings, language or symbolic thought? Nothing much. No more than the equally stunning material culture of the Natufian, of Angkor Wat, of Tudor England, of Great Zimbabwe, of Beijing's Forbidden City, or of twentieth-century Manhattan. But by a series of historical accidents, the European Upper Palaeolithic became the model for what it means to be human. I will argue here that this practice hampers our understanding of past events and reveals the persistence of a regrettable attitude of European superiority. The European Upper Palaeolithic is important as a local archaeological phenomenon, but has no relevance for explaining how, when or why sophisticated behaviour, advanced cognition, or spoken language originated.

When I was an undergraduate, it was still possible to entertain the possibility that Neanderthals, the sole inhabitants of Europe for most of the Middle and Later Pleistocene, were the ancestors of Homo sapiens. The dramatic difference between Middle and Upper Palaeolithic material culture, and the abruptness of the change between them, was seen as the sudden appearance of the human capacity for language and symbolism, the flipping of a cognitive switch about 40 kya that suddenly illuminated where previously was darkness. As chronology became better known, it became obvious that there simply was not enough time to accomplish the transformation of Neanderthals into Homo sapiens, and no morphologically intermediate fossils were discovered. Neanderthal anatomy and genetics have confirmed their distinct evolutionary trajectory, and highlight the fact that the ancestors of Homo sapiens must be sought elsewhere.

The oldest securely dated specimen as yet ascribed to our own species is the Omo 1 partial skeleton from the Kibish Formation, Ethiopia, dated to c. 195 kya. If fossils such as Florisbad are included in Homo sapiens, then our species has a time depth of at least 260 kya. Somewhat younger are the three crania from the Herto Member of the Bouri Formation, Ethiopia, dated to c. 160 kya. The oldest of the human skeletal material from Klasies River that provoked such debate in the 1980s dates to perhaps 110 kya. Clearly both fossil and genetic evidence support an African origin for Homo sapiens. The European Middle to Upper Palaeolithic transition is sudden and dramatic because it represents a replacement of the indigenous Neanderthals by incoming populations of Homo sapiens.

The record of the struggle to define what it means to be human stretches back to the dawn of the written word, and forms most of the basis for the world's major religions, not to mention the discipline of anthropology. The use of the term 'human' in palaeontology and archaeology is now generally confined to references to Homo sapiens. Humans are contrasted to non-humans, proto-humans or archaic humans. A 'Human Revolution', then, should distinguish the humans from the rest. The 'Human Revolution' is said to be signalled by the appearance of 'modern' human behaviour. The vernacular meaning of the term 'modern' of course varies with context. 'Modern European history', for example, refers to the entire post-medieval period, whereas 'modern art' denotes a specific twentieth-century aesthetic. As it is used in discussions of the 'Human Revolution', 'modern behaviour' is equated with technological sophistication, cognitive acuity and symbolic behaviour. The word human in this context implies that these developments are what made some hominins human, and that others are not really human.

Klein and others insist that, although fossils that indisputably represent Homo sapiens are present by at least 100 kya in Africa, modern behaviour did not appear there until c. 40 kya. This implies that early African Homo sapiens, like the Neanderthals, were not really human. The statement that early Africans, while indistinguishable anatomically from modern people, were not really human resembles a racist pronouncement so strongly that it compels close examination. Others are troubled by the denial of humanity to Neanderthals. The critical difference here is that Neanderthals belong to a different species from modern people, whereas early Africans do not.

Some authors propose that the entire human species experienced a simultaneous, punctuated, genetically encoded event that resulted in a behavioural breakthrough that included the capacity for language. As in Europe, they see this event occurring about 40 kya, a date which corresponds in Africa to the beginning of the Later Stone Age (LSA). Because evidence in support of sophisticated behaviour in Africa in the preceding Middle Stone Age (MSA) and even earlier has continued to accumulate, a behavioural revolution at 80-60 kya has recently been proposed. A sudden increase in technological, cognitive, communicative and social complexity, driven by genetic, environmental or purely adaptive factors, is thought to have permitted the geographic expansion of Homo sapiens outside Africa. Either reading of the record supports the impression that the earliest Homo sapiens in Africa were behaviourally primitive.

The essential divide between humans and the rest of nature and the attribution of innate superiority to humans is an essential part of the Judeo-Christian tradition. In Genesis, humans are the result of unique acts of creation, and are given dominion over the beasts. The wide gulf between humans and the rest of the animal kingdom is exaggerated in the sciences by the fact that behavioural comparisons can be undertaken only with living taxa. Any trace of linguistic or symbolling facility that arose early in the genus Homo is absent from chimpanzees, and extinct species of Homo are not available for observation. Unhappily, in many archaeological scenarios, early members of our species in Africa are not considered truly human until they begin acting like early Europeans or, worse, until they actually set foot in Europe.

Alison Brooks and I pointed out that many of the behaviours thought to be unique to the European Upper Palaeolithic in fact could be found in the record of the African MSA. We argued that behavioural change in Africa was gradual, not sudden, and sophisticated behaviour appeared there very early, in fact hundreds of thousands of years earlier than predicted by either 'Revolution' model. The evidence in the African MSA for behaviours usually considered characteristic of 'modern' human behaviour is reviewed in our 2000 paper, and I will only briefly outline it here. But it is worth reiterating that the record of behavioural change begins with the Acheulean to MSA transition at c. 300 kya, not with the MSA to LSA transition at 40 kya. Thus the first appearances of many of these behaviours fall into the time span of fossils such as Florisbad and Ngaloba. These developments were summarized in a much-reproduced diagram, which is presented here as Figure 12.1, amended to reflect developments since 2000.

[Figure 12.1: Behaviour innovations (figure not reproduced here)]

In our 2000 paper, Brooks and I discussed what these traces reveal about ecological adaptations, technology, and economic, social, and symbolic behaviour. We argued that African MSA technology shows logic and inventiveness, and that ecological and economic aspects of the record reflect human innovation, and abstract thought in the form of systematic planning depth, conceptualization of the future, and formalized social relationships among individuals and groups. We further argued that these features demonstrate a capacity to imbue aspects of experience with meaning, to communicate abstract concepts, and to manipulate symbols as a part of everyday life. The interplay of cultural processes and anatomical responses that resulted in the modern human adaptation therefore has African roots that penetrate deep into the Middle Pleistocene. This evidence shows that the mental capacity for sophisticated behaviour was present in Africa in the earliest Homo sapiens, that these behaviours arose by normal processes of innovation, and that their traces in the archaeological record accumulated sporadically over the course of the next 300,000 years.

Brooks and I argued that much of the proliferation of 'Upper Palaeolithic' extractive technology of the LSA can be seen as intensification in response to increased population or declining resources. Certainly it is a waste of effort to invent clever tools and to spend time extracting nutrition from tricky or problematic resources if preferred foods are plentiful and readily available, as they were in the MSA. The switch from eland to buffalo at some South African sites in the LSA can likewise be seen as a response to a decline in eland numbers. There was no merit in hunting dangerous animals in the MSA when docile ones were there for the taking. Brooks and I argued that the situation facing LSA peoples was in fact brought about by the success of preceding MSA hunters and fishers.

Some archaeologists have claimed to observe a sudden, late 'behavioural revolution' in Africa. It is imagined to have occurred at the MSA to LSA transition at c. 40 kya, the same time depth as the Middle to Upper Palaeolithic transition in Europe. The supposed contrast in behaviour between the MSA and the LSA is artificially enhanced in a number of ways: discounting evidence with dates that are 'too early', insisting that all items of material culture must appear together as a package in order to be meaningful, and comparing late time-restricted LSA assemblages with time-averaged MSA assemblages that span tens of thousands of years. The effect is further exaggerated by attributing normal economic and cultural change to an intellectual breakthrough at the MSA-LSA transition. Partly in response to accumulating evidence for behavioural sophistication in Africa that predates 40 kya, and partly in response to findings in molecular genetics, several investigators have now proposed a slightly earlier 'revolution' in Africa dating to 80-60 kya.

Population dispersal is thought to be motivated by inbreeding avoidance, changing geography, or more frequently, habitat depletion and the resulting inter- or intragroup competition. Dispersal rates and patterns are affected by body size, generation length, fecundity, diet, population density, predator density, landscape carrying capacity, habitat diversity, resource patchiness, and aspects of the social system, especially those regarding mating and territorial defense. Dispersal is assumed to incur costs in either survival or fecundity, and these costs increase with distance travelled. Dispersal incurs increased mortality risks from unfamiliar habitat, passage through areas of high predator densities, or the absolute cost of increased movement, but populations often experience growth when entering new territory. Increased longevity, decreased time between births, or increased offspring survivorship can increase individual fecundity. These in turn can be positively influenced by factors such as increased foraging efficiency, decreased competition, decreased predator pressure, or enhanced disease resistance. While increased intelligence or improved technology might be proposed to enhance any of these factors, the success of invasive species is commonly due to either r-selection or tolerance of wide variance in environmental conditions.

Pathogens may have aided Homo sapiens in their replacement of archaic hominins. In some parts of the New World after initial European contact, imported infectious diseases, including smallpox, measles, and viral influenza, reduced Native American populations by more than 90 per cent in as little as five years. In the model of Zubrow that assumes interacting populations of stable size, Neanderthal mortality exceeding that of Homo sapiens by only two per cent can account for the extinction of the Neanderthals in only 30 generations, or 1000 years. None of this evidence supports a 'cognitive revolution' at either 80 kya or 40 kya. The items of material culture whose sudden appearance is cited as evidence for the mental leap forward in this time interval in fact appeared individually much earlier in the record.
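
As a rough back-of-envelope illustration of how such a small differential compounds (this is not Zubrow's demographic simulation, and it assumes, purely for illustration, that the two per cent excess mortality applies year on year):

\[ N_t \approx N_0 (1 - 0.02)^t, \qquad (0.98)^{1000} \approx 2 \times 10^{-9}, \]

so over the thousand years cited, even a very large starting population is driven effectively to zero; a differential that looks negligible in any single year is anything but negligible once it compounds.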

Like gunpowder in a later age, the bow and arrow may have given dispersing African populations a competitive edge, but like gunpowder, the bow and arrow underwent a long period of experimentation and development; its invention did not require a human genetic mutation. Dispersal from Africa can be expected to have resulted from the interplay of population growth, normal migration among foragers, and ecological change driven by Milankovitch cycles. Mundane factors such as greater fecundity or disease immunity also may have played an important role. Mellars and others ask why Africans did not enter Europe until 40 kya. One might also ask, why did Asians not enter the New World until the late Pleistocene? Why did Europeans fail to enter the New World until the fifteenth century AD? Why were there no human footprints on the moon until 1969? Not due to a genetic mutation or cognitive enhancement, but for reasons of environment, demography, technology, historical contingency, and in the last two cases, motivation and finance.

It is acutely revealing that special human qualities, particularly advanced cognition, are invoked as a mechanism for the migration out of Africa. I see the importance attached to the 'time lag' between the appearance of Homo sapiens and their 'escape' from Africa, and the insistence that it required superior intellect, as an attempt to paint non-Africans in the most flattering conceivable light. The research seems to seek to identify not what makes us human, but what makes non-Africans special. The assumption seems to be that superior intelligence was required to leave Africa. The inevitable corollaries are that: 1) not much intelligence is required to survive there; 2) those who remain there are not particularly intelligent; and 3) once intelligence was acquired, the smart humans lost no time in making tracks to escape Africa.

The term 'Human Revolution' is a serious misnomer. The concept as it was originally envisioned was predicated upon a mistaken interpretation of the archaeological and evolutionary records. It was built upon the belief that the Middle to Upper Palaeolithic transition represented rapid in situ change, rather than a population replacement. Despite the repeated demonstration that this original reconstruction of events was in error, archaeologists have not jettisoned the idea of a revolution, but continue to seek a 'human revolution' of some kind, somewhere. This is seen as a single extraordinary moment that defines what it is to be human and explains all or most of subsequent events in prehistory. The quest for this 'eureka moment' reveals a great deal about the needs, desires, and aspirations of archaeologists, but obscures rather than illuminates events in the past. It continues to put Europe on centre stage, casting it either as the arena where the actual events of human origins were enacted, or as the yardstick by which human accomplishments elsewhere must be measured.

Now, to accommodate recent genetic and archaeological discoveries, a new version of the human revolution story is presented, this time in Africa or the adjacent regions of the Arabian Peninsula at 60-80 kya, rather than in Europe at 40 kya. Like its predecessor, it is seen as the product of an imagined human cognitive advance, perhaps caused by a genetic mutation. This new model now sets the entry into Europe, or at least the exit from Africa, as the test of true humanity, with earlier African and Levantine populations of Homo sapiens, despite their demonstrations of symbolic behaviour, now cast as 'failed' migrants who lacked the special qualities that such a migration required. Like its predecessor, this new 'revolution' can instead be explained by normal processes of invention and migration undertaken by early African populations of Homo sapiens, who had shared the capacity for advanced cognition at least since the origin of our species more than 200,000 years before. The European Upper Palaeolithic record shows that people ultimately of African origin responded to their new environment in interesting and arresting ways, not that they underwent a cognitive reorganization when they set foot on European soil, nor that the Africans 'left behind' never achieved the goal of true humanity.

The continued search for a 'human revolution' is the playing out of a number of well-established themes in the European intellectual endeavour: to provide simple answers to complex questions, to establish a clear gulf between humans and the rest of nature, and to set Europeans apart from their African ancestry. The perceived gulf between humans and the rest of the natural world is so fundamental to Western thought that it is not likely to disappear in response to this or any other rational presentation of the facts. But I would hope that the continued pernicious assertion of European superiority will give some pause. Geneticists must seek to understand archaeological data as diligently as some archaeologists have sought to understand genetic research. And at the very least, I would urge those who study the European Upper Palaeolithic or the Middle to Upper Palaeolithic transition to abandon the term 'Human Revolution'. It implies a swift, in-place transformation of Neanderthals into Homo sapiens, which is not in accord with the vast body of research findings of the last twenty years. The term conflates 'human' with 'European,' and implies that those outside Europe were not really human. As Paul Veyne has observed in another context: 'A social class proud of its superiority sang hymns to its own glory'.

I predict that those who argue for a very recent origin for language or advanced cognition, or for an origin for these things outside Africa, will need to revise their ideas as ever more ancient symbolic artefacts come to light in Africa. It is clear that the best place to look for these objects is in Africa since that is where early humans lived. But the tendency instead has been to shift the criteria for inclusion in humanity, with the result that it is always denied to early Africans.

Each of us chooses the research questions that we pursue for a variety of reasons: intellectual, personal, and practical. Perhaps it is inevitable that biologists and archaeologists will continue to search for a 'magic bullet' to answer the profound questions that have troubled human beings throughout history. But archaeologists have a special obligation to distinguish a marvelous, unique, specifically European prehistoric heritage from the defining characteristics of all of humanity.

Evidence for the Origin of Symbolic Behaviour In and Out of Africa, F. D'Errico, M. Vanhaeren

The scenario promoted by the large majority of the papers published eighteen years ago in the Human Revolution volume was that behavioural modernity resulted from a sudden change, taking place in Europe 40,000 years ago, coinciding with the arrival in this region of Modern populations bearing the Aurignacian culture. The losers in this scenario were the Neanderthals, considered by many contributors as inherently incapable of crossing the Rubicon that leads to modernity, or just able to briefly touch, thanks to acculturation, the opposite riverbank before ineluctable drowning. Widely disseminated in research articles, textbooks and novels, the Human Revolution scenario became a dominant paradigm in prehistoric research and the way Western civilization conceived the origin of modern human cultures.

The idea, set out by a number of contributors to the Human Revolution volume, that our species emerged in Africa at least 100,000 years ago was not perceived as challenging the alleged more recent European cognitive shift, as little evidence existed at the time for the emergence of behavioural innovations in Africa prior to 40 kya. For many scholars the Human Revolution scenario ceased to be a hypothesis to test against the empirical evidence and became a postulate that did not require further investigation. The aim of Palaeolithic archaeology became, in the minds of some, that of documenting this sudden change or modelling what a pre-modern cognition could have been, and how it may have interacted with our 'modern' mind at the moment of contact. The few scholars arguing for a different approach remained dissonant voices within this general consensus.

In the last decade a new model seems to have replaced the Human Revolution scenario. This new scenario tends to equate the biological origin of our species with the origin of modern behaviour. The agenda behind this model can be summarized as follows. Genetic data suggest that modern humans come from Africa. The process that produced our species in Africa must have granted it a number of advantages — syntactical language, advanced cognition, symbolic thinking — that favoured its spread throughout the world, determined its eventual evolutionary success and the extinction of pre-modern human populations. Are we to consider, as was the case with the Human Revolution, this new scenario as the ultimate axiom? Archaeologists adopting this stand try to identify and document in the African Middle Stone Age the emergence of cultural innovations that can be interpreted as the behavioural outcome of the speciation.

One can wonder, however, whether archaeology could not do more than just provide supporting evidence for hypotheses shaped by other disciplines. This is particularly so considering that results provided, for example, by palaeoanthropology or genetics are also not straightforward and models accepted today as established facts may soon be challenged by new discoveries. If we see archaeology as an independent discipline we should be able to assess issues which deal with cultural and behavioural change on primarily archaeological grounds and consider the link between biological and behavioural change as a matter of inquiry, not an assumption to be meekly accepted. In this case we will feel empowered to analyse the archaeological record in search of behavioural changes not only in Africa but also in Eurasia, the alleged realm of pre-modern populations, and contrast results with those of other disciplines.

Underlying the Out of Africa model for the origin of modern behaviour is the view, well exemplified by the McBrearty & Brooks graph, that the emergence of each of these new features marked a definite and settled threshold in the history of mankind, and that the accumulation of these innovations contributed, as with genetic mutations, to create human societies increasingly different from those of their non-modern contemporary counterparts. However, documenting and dating the occurrence of these innovations in various regions may reveal their presence at times and places incompatible with the Out of Africa model. It may also show a discontinuous pattern, with innovations appearing and disappearing, or being associated in ways that do not match the expected trend. Finally, in-depth cross-cultural analysis of the behavioural traits used as archaeological proxies for modernity may show that they can hardly be attributed a univocal evolutionary significance. Each of them may play different functions in different societies, and it may well be the identification of these functions, and the comprehension of the specific needs they have satisfied in each case, that is relevant here. The aim of the archaeology of behavioural modernity should be to document the complex historical processes at work in and out of Africa and to use the resulting chronicle to identify long-term trends that can be contrasted to those offered by other disciplines.

In this paper, we focus on the origin, history and evolutionary significance of personal ornaments, an element of the archaeological record which is playing an important role in the human origin debate. Beads have many different functions in human societies, all eminently symbolic. Through the display of beads individuals project meaning onto the members of the same or neighbouring groups by means of a shared symbolic language. This makes personal ornaments an unambiguous hallmark of cultural modernity. Little is known, however, of the mechanism or mechanisms that prompted our ancestors to use personal ornaments and the role these early beadworks played in the creation and maintenance of early symbolic traditions. To answer these questions we have been conducting research on the oldest evidence for personal ornament use in Africa and Eurasia.

The African evidence for an early use of personal ornaments has for a long time been underestimated. The uncertain chronology of a number of sites yielding potentially old ornaments is probably the main reason for this. However, a discovery from southern Africa and the reappraisal of material found decades ago in the Near East and North Africa are definitely favouring an old chronology for the emergence of this behaviour. We know now that the oldest securely dated personal ornaments from sub-Saharan Africa come from Blombos Cave, Cape Province, where 41 Nassarius kraussianus shell beads bearing human-made perforations and traces of use were found associated with a Still Bay assemblage dated by OSL and TL to c. 75,000 BP. Two perforated Nassarius gibbosulus marine gastropods from the cave site of Skhul, Israel, another perforated shell of the same species from Oued Djebbana, Algeria, and four perforated Glycymeris sp. bivalves from Qafzeh, Israel, may indicate that personal ornaments were used in these regions 25 ka earlier than in sub-Saharan Africa.

Evidence for early beadworks in Asia is still scant, and in many areas of the globe (West Africa, India, China, Far East Asia) ornaments are not reported from sites older than 20 kya and may be relatively rare or absent until 10 kya. The site of Mandu Mandu, Western Australia, yielded 22 Conus sp. shell beads dated to c. 32 kya, but only future research will establish whether the absence of earlier beads is due to a lack of data or to the absence of long-lasting bead-making traditions. In Europe the question of the origin of beadworking traditions is intimately intertwined with that of the tempo and mode of the Middle to Upper Palaeolithic transition, since beads are associated not only with the Aurignacian, generally considered as the product of modern humans, but also with other Early Upper Palaeolithic (EUP) cultural traditions of more ambiguous authorship.

Personal ornaments can be instrumental in investigating cultural interactions between EUP technocomplexes. Ethnographic studies have shown that beadwork, like body painting, scarification, tattooing, garments, and headdresses, is perceived by the members of traditional societies as a powerful indicator of their identity, enhancing within-group cohesion and fixing boundaries with neighbouring groups. Ethnographic studies also indicate that the ethnic dimension of beadwork is conveyed through the use of distinct bead types and/or by particular combinations and arrangements on the body of bead types shared with one or more neighbouring groups.

Research conducted in the last few years on the early use of personal decoration has dramatically changed our view of the origin of this behaviour. Modern humans from northern and southern Africa, as well as from the Near East, used similar shells as beads very early in time. The argument, used against the significance of the Blombos beads, that a single site cannot be considered as evidence for an early acquisition of a general capacity becomes untenable, and ongoing research seems to indicate that more instances of early bead use in Africa will soon increase the number of sites where bead-working traditions are attested. The new discoveries, however, do not imply that personal ornaments were made and used everywhere in Africa after 100 kya. Many African sites younger than 75 kya and excavated using modern standards have yielded no personal ornaments, and evidence for early beadwork in areas of the globe that are supposed to have been colonized as early as 60 kya by AMH is absent or scant before 30 kya or later. Only future research will establish whether this scarcity is due to the lack of investigation, to taphonomic reasons or to the absence of long-lasting bead-making traditions.

The apparent disappearance and re-appearance in the archaeological record of this behaviour fits well with the hypothesis that the production of ornaments is, as with other modern traits, more the result of historical contingencies than the crossing of a cognitive Rubicon. This is also suggested by the different forms and functions that early bead-making traditions take in different regions of the globe. A remarkable difference appears in this respect between the African and the European record. Ostrich eggshell beads are the only bead type found at late MSA and early LSA sites over an area covering a large part of sub-Saharan Africa. Cross-cultural survey of personal ornament use in traditional societies suggests the main, though certainly not exclusive, function of MSA beads was, as for known San societies, that of exchange media used in gift-giving systems, the role of which is that of strengthening networks of social and economic relationships.... Their role was likely that of reinforcing social networks, perhaps between coastal and inland human groups, rather than highlighting cultural boundaries between them.

In contrast, dozens if not hundreds of different bead types characterize the Early Upper Palaeolithic of Europe, the Near East and Siberia from its very beginning. Our results identify clear regional trends in the bead type used. For this reason the Early Upper Palaeolithic ornaments instead best fit an interpretation as integrated markers of ethnic, social and personal identity. Their ethnic dimension is suggested by regional patterns in bead-type association not explained by raw material availability. Use as social markers may be indicated by recurrent occurrence, within a region, of different types found in significantly different proportions. Presence of unique bead types may reflect use of ornaments by an individual to highlight his/her particular social role. Thus, the reasons that have stimulated the creation of bead-work traditions in the two areas, and probably in other areas of the globe, seem to have been different. In Africa beads had the function of reinforcing reciprocity networks ensuring the survival of hunter-gatherer groups in moments of stress. Beads were used in Europe to strengthen affiliation to a group and visualize social and individual roles within the group.

In sum, personal ornamentation seems to have emerged around the world at different times and for different reasons. Its presence attests to the modern character of the human cultures involved; its absence perhaps does not necessarily imply a different cognition. And if one accepts Neanderthal authorship for the Chatelperronian, this hallmark of behavioural modernity cannot be seen as reflecting cognitive innovations inaccessible to other fossil species. Even if it were demonstrated, which is far from being the case, that the use of personal ornaments by Neanderthals was the result of an 'acculturation', this would in fact reinforce rather than dismiss the modern character of their cognition, as it would show their ability, as observed in many historical instances among modern human populations, to incorporate external stimuli and reshape those influences in order to make them an integral part of their culture.

What Makes Us Human?, W. Bodmer

For a biologist, who is a geneticist interested in evolution, the obvious explanation for what makes us human must lie within the genetic differences that distinguish Homo sapiens from other species, especially chimpanzees. The data now available on DNA sequences of many species, including the complete DNA sequences of humans, chimpanzees, and several other mammalian species, already are enough to place Homo sapiens in the 'chimpanzee family', and separated even from the other great apes. Though the human and chimpanzee sequences are very similar, sharing perhaps as much as 99 per cent of their sequence, that still leaves plenty of room for a large number of functionally significant differences. Even if only 1 per cent of protein coding genes show such a difference, that still means there are as many as 250 genes whose difference may contribute to the greater cognitive ability of humans.
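
The arithmetic behind the figure of 250 is straightforward, assuming roughly 25,000 protein-coding genes in the human genome (the gene count is an assumption supplied here; the text gives only the percentage and the resulting number):

\[ 0.01 \times 25{,}000 = 250\ \text{genes}. \]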

Every species has, more or less by definition, a unique DNA sequence signature that distinguishes it from every other species. Our challenge, however, is to discern those human features which make Homo sapiens qualitatively distinct from all other species, especially with respect to its cognitive and related abilities. There is no such thing as 'a gene for' any particular characteristic. What must be sought is the variation in the sequences of those genes, or the versions of the genes, that determine, or contribute to determining the particular characteristic differences we see between humans and their nearest relatives. So the challenge is to find those genetic variations that are unique to humans and which really matter for establishing the key features that make us human and in some way qualitatively different from all other animals.

Until we have answers from such biologically oriented studies, we must continue to seek the answers as to what makes us human from more general observations on human and animal behaviour, in the most general sense. Furthermore, we cannot assume that the whole answer lies simply in the genes, though it surely must be the evolved genetic differences that contain the potential for human cognitive uniqueness and other associated distinctive human features.... We must ultimately seek the explanation for the evolution of the differences that led to an enormously increased cognitive ability in the conventional actions of natural selection. However, there is clearly a huge number of attributes that are a by-product of this increase, such as musical and mathematical ability, which cannot all be the direct result of natural selection.

As I have already pointed out, a huge increase in cognitive ability is the most obvious underlying common feature to almost all the attributes that have been suggested to make us human. These include, in particular, language and speech which have enabled a considerable increase in the rate and efficiency of cultural evolution. Paleontological data clearly suggest that increasing brain size has been a major feature of the evolution of the human brain. However, it is clear that while an increase in brain size may be a necessary requirement for increase in cognitive abilities, it is not sufficient. There must be many increases in the complexity of brain function, at the level of cellular changes and interconnections, that have made the ultimate increase in cognitive abilities possible.

The ability to cook follows from the discovery of how to make fire. Darwin argued that "The art of making fire ... is probably the greatest discovery, excepting language, ever made by man". While fire may well have been a unique discovery, language surely must have evolved over a period of time, and speech required the evolution of changes in the anatomy of the larynx, probably dependent on the evolution of bipedalism. The ability to make fire probably also depended on the evolution of bipedalism and subsequently, as a consequence of freeing the hands, on the evolution of the flexible thumb.

It is clear from archeological evidence that Neanderthals shared many features with Homo sapiens that are not found in any other species. They may well have had language, they had stone tools, they probably cooked their food at least to some extent, and, following Robin Dunbar, they may well have had a level of cognitive ability that allowed them to develop some form of religion.

It has generally been assumed that group selection is a weak force in evolution, as compared to the effects of natural selection on the individual. However, I believe that the widespread existence of cooperation and altruism between unrelated individuals in human societies may mean that group selection has played a much larger role in hominid evolution than is generally assumed. Robin Dunbar comments that religion "is a particularly effective way in which one can try to create a sense of belonging, 'groupishness'". That sense may be very important for the survival and success of groups in competition with other groups, and so could naturally be a basis for the evolution of religion. This evolution would be cultural, and not biological, though it is the preceding evolution of cognitive ability that makes this cultural evolution possible.

Pivotal changes, whether in biological or cultural evolution, such as the enlargement of the brain or the discovery of fire, allow subsequent rapid evolutionary changes to take place. That seems to me to be the reason why the pace of both biological and cultural evolution is not uniform. Pivotal changes will be uncommon and their occurrence largely determined by chance. The key feature of cultural evolution is horizontal transmission within generations, rather than just vertical transmission from generation to generation. The horizontal cultural transmission process in humans, which is largely dependent on their superior cognitive abilities, is enormously more rapid than conventional biological evolution. This undoubtedly is the major determinant of the extraordinarily rapid development of human society over the last few thousand years, which are hardly a tick in the usual time frame of the clock of biological evolution. It is especially during this process that I believe that group selection has played a much more important role than is usually assumed.

Much of the extreme development of human culture, such as music, science, mathematics, and literature, may simply be a byproduct of our superior cognitive abilities, which were selected, not to make music or solve complex mathematical problems, but for our better survival and adaptation to rapidly changing environmental conditions. Culture is nowadays largely passed on from generation to generation through education, and language, in one form or another, remains the main vehicle for cultural transmission.

Imitation Makes Us Human, S. Blackmore

To be human is to imitate. This is a strong claim, and a contentious one. It implies that the turning point in hominid evolution was when our ancestors first began to copy each other's sounds and actions, and that this new ability was responsible for transforming an ordinary ape into one with a big brain, language, a curious penchant for music and art, and complex cumulative culture. The argument, briefly, is this. All evolutionary processes depend on information being copied with variation and selection. Most living things on earth are the product of evolution based on the copying, varying and selection of genes. However, once humans began to imitate they provided a new kind of copying and so let loose an evolutionary process based on the copying, varying and selection of memes. This new evolutionary system co-evolved with the old to turn us into more than gene machines. We, alone on this planet, are also meme machines. We are selective imitation devices in an evolutionary arms race with a new replicator. This is why we are so different from other creatures; this is why we alone have big brains, language and complex culture.

Fundamental to all evolutionary processes is that some kind of information is copied with variation and selection. As Darwin first pointed out, if you have creatures that vary, and if most of them die, and if the survivors pass on to their offspring whatever it was that helped them survive, then those offspring must, on average, be better adapted to the environment in which that selection took place than their parents were. It is the inevitability of this process that makes it such an elegant and beautiful explanation of the origins of biological design. But it should not be confined to biology. Universal Darwinism is the idea that the same principles apply to any system which has the three requisites - variation, selection and heredity. With these in place you must get evolution. Dennett calls this the 'evolutionary algorithm', a simple mindless procedure which produces 'Design out of Chaos without the aid of Mind'.
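
A minimal sketch in code may make Dennett's point concrete. Nothing in the loop below knows where it is going; it only copies strings with occasional errors and keeps whichever copy happens to match an arbitrary target best, yet the target is reliably reached. This is a toy illustration of the variation-selection-heredity principle, not anything taken from the chapter itself; all names, parameter values and the target phrase are invented for the example.

# Toy illustration of 'variation, selection and heredity' producing design:
# strings are copied with occasional errors (variation), and the copy closest
# to an arbitrary target is kept each generation (selection); keeping copies at
# all is the heredity. Not a model of any real biological or memetic system.

import random
import string

ALPHABET = string.ascii_uppercase + " "
TARGET = "DESIGN OUT OF CHAOS"          # arbitrary string used only for selection

def mutate(parent, rate=0.02):
    """Copy a string, changing each character with a small probability (variation)."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def fitness(candidate):
    """Count characters matching the target (the selection criterion)."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def evolve(pop_size=100, max_gens=1000):
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)   # start from 'chaos'
    for gen in range(max_gens):
        offspring = [mutate(parent) for _ in range(pop_size)]   # heredity + variation
        parent = max(offspring, key=fitness)                    # selection
        if parent == TARGET:
            return gen, parent
    return max_gens, parent

if __name__ == "__main__":
    generations, result = evolve()
    print(f"reached '{result}' after {generations} generations")

Run repeatedly, the loop typically arrives at the target within a few hundred generations, although no step in it 'knows' where it is going; this is the sense in which a mindless copy-vary-select procedure can yield design without the aid of mind.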

[The] literal meaning [of 'meme'] is 'that which is imitated'. So other examples include gestures and games, urban myths and financial institutions, scientific theories and complex technologies. Most of these are not simple memes but 'co-adapted meme-complexes' or 'memeplexes'; groups of memes that fare better together than they would individually, so they tend to stick together, and get copied and passed on together.... The skills you learn by yourself and for yourself are not memes, nor are your memories of places you have seen or people you know, nor are emotions that cannot be accurately conveyed to anyone else. But every word in your vocabulary, every story or song that you know, and every idea you got from someone else is, and when you combine these to make new stories or inventions to pass on then you have created new memes.

Central to the idea of memes is that because they are replicators evolution will happen for the benefit of the memes themselves rather than for their carriers or for anything else. As Dennett emphasised, the ultimate beneficiary of any evolutionary process is whatever it is that is copied. Everything that happens, and all the adaptations that appear, are ultimately for the sake of the replicators. This idea is what distinguishes memetics from related theories in sociobiology, evolutionary psychology and gene-culture co-evolution theory.

For memetics culture is not, and never was, an adaptation. Imitation was an adaptation, allowing individuals to learn from each other, but the memes it unintentionally let loose were not. Culture did not arise for our sake, but for its own. It is more like a vast parasite growing and living and feeding on us than a tool of our creation. It is a parasite that we cope with — indeed we and our culture have co-evolved a symbiotic relationship. But it is a parasite nonetheless.

From the viewpoint of a meme, the important question is "How can I survive and get copied?" and usually this means "How can I get a human to pay attention to me, remember me, and pass me on?" Answers will be very varied but the general principle is that some memes succeed because they are good, useful, true, or beautiful, while others succeed even though they are false, useless or even harmful. From the meme's point of view their value to us or our genes is irrelevant, while to us it is critical. We try to select true ideas over false ones, and good over bad, but we do it imperfectly, and we leave all kinds of opportunities for other memes to get copied - using us as their copying machinery. In other words, there is an evolutionary arms race between us and the memes that we find ourselves copying.

In fact viral memes may be in a minority, with most of our culture consisting of memes that work at least as much for us as we do for them. These include our languages, the built environment, transport systems, communications technology and scientific theories. Without memes we could not speak, write, enjoy stories and songs, or do most of the things we associate with being human. Memes are the tools with which we think, and our minds and cultures are a mass of memes.

Step back a bit and think about a whole city. It is a spreading mass of copied memes — housing estates expanding; roads, railways and bus routes growing; the whole thing gobbling up resources, using humans as the willing meme machines that do the work. Now step back a bit further and look at the whole planet. You might be looking down from an aeroplane at night, seeing those dense patches of lights, with curious streams of moving lights within them, or stretching out to other distant patches. They look like living creatures, and according to memetics that is precisely what they are. They were built on the basis of memes rather than genes, but the same principles apply. This new view is different indeed from most people's normal way of looking at the world. Its power lies in its ability to unify all creative processes, both biological and cultural, within the same Darwinian framework. Yet after more than thirty years memetics is still not a thriving science.

One of the mysteries of human evolution is why our brains are so big. These outsize organs are expensive to build, dangerous to give birth to, and use a lot of energy to run, even during sleep. So there needs to be a very good reason. Nearly all conventional theories start by assuming that the big brain was an adaptation (i.e. an advantage to human genes) but differ in what advantage it provided — for example it might have been implicated in complex social relationships, increased group size, gossip or more practical things like tool making. Memetics provides a completely different argument: that the increase in brain size was driven by and for the memes, as they transformed an ordinary brain into a meme machine. I have called this process memetic drive and suggested that it would naturally begin as soon as our hominid ancestors were capable of imitating with sufficiently high fidelity to create the first memes. It does not matter what these were - possibly new ways of hunting, or lighting fires, or wearing clothes — but whatever they were they would change the environment in which human genes were being selected and give an advantage to individuals who could copy them.

All other theories of the origins of language — and the question has been hotly debated for centuries — assume that language is an adaptation. For memetics language is not an adaptation but a parasite turned symbiotic partner; an evolving system in its own right that fed off the humans who selected, remembered and copied sounds. Let us suppose that our hominid ancestors began imitating the sounds each other made, perhaps the sort of sounds that other primates make — food calls, mating calls, danger signals, and so on. Other primates cannot imitate well, but these early humans would be able to start copying the nuances of each other's calls, perhaps imitating the more powerful individuals, or starting whole lineages of different copying trends. In a society where imitation was prized, the number of sounds being made would increase and soon people would have to choose which to copy. In other words, looking from the meme's eye view, there would be competition between the sounds. So which would be copied most?

A general principle is that higher fidelity replicators do better. There is nothing magic about this rule. It simply means (in this example) that sounds that are more accurately copied will tend to survive unchanged for longer, and so increase in the meme pool. Once again, individuals capable of high fidelity imitation will gain higher status, attract more desirable mates and so pass on any genes responsible for their superior copying ability. So fidelity of copying will generally increase.... In fact the redesign of the human larynx, throat and brain for language was quite dramatic and is one of the features that most distinguishes us from other apes. According to memetics this redesign was driven by pressure from the memes.
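
The force of this fidelity rule is easy to quantify (the error rates below are invented purely for illustration): if a call survives a single copying unchanged with probability $1-e$, then after $n$ copyings it survives unchanged with probability $(1-e)^n$.

\[ (0.99)^{50} \approx 0.61, \qquad (0.90)^{50} \approx 0.005, \]

so after fifty transmissions a high-fidelity call is still intact in most lineages, while a low-fidelity one has all but vanished from the meme pool in its original form.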

A human mind is a veritable factory for new memes. Every word in your vocabulary is a meme and you routinely mix them up to produce unique new sentences, but so are all the more complex ideas you come across. And if you are a creative person your new mixtures will be more interesting than other people's and will set off on their own with a chance of being copied again. This is, indeed, a creative process. This is all that is happening as I write these words. All my ideas about evolution and memes have come from taking old ones and putting them together in new ways. It is certainly a creative process but not, I think, one that requires a conscious creator inside my head.

What makes us human? In the beginning it was imitation and the appearance of memes. Now it is the way we work as meme machines, living in the culture that the memes have used us to build. Is it depressing to think of ourselves this way — as machines created by the competition between genes and memes, and in turn creating more genes and memes? I don't think so. We have got used to the idea that we need no God to explain the evolution of life, and that we humans are part of the natural world. Now we have to take a step further in the same direction and change yet again the way we think about ourselves, our consciousness and free will. But this is precisely what makes it so exciting being human - that as meme machines we can, and must, reflect on our own nature.

Memory, Time, and Language, M. Corballis, T. Suddendorf

Natural selection is inevitably oriented toward the future. Many of the characteristics that are selected have to do with the ways in which animals behave - how they feed, fight, flee, and mate, or how they construct niches for successful living.... Behaviours that increase reproductive fitness, ensuring that the individual's genes will be continued into future generations, will be selected at the expense of those that place future generations at greater risk, or that threaten reproduction itself.

In this chapter, we start by proposing a hierarchy of behavioural adaptations, culminating in an adaptation that may help explain behavioural characteristics unique to our own species. This adaptation is mental time travel, and is based on the recording of specific events in the past, and the imagining of specific events in the future. Mental time travel is responsible for many human attributes, such as the ability to individually plan our futures in detail, and perhaps for phenomena such as religious belief and ideas about life after death. We argue that our ability to transcend time may lie at the heart of another capacity generally regarded as uniquely human. That capacity is language.

Declarative memory provides more flexible adaptation to the world than does simple learning or nondeclarative memory. First, it provides an explicit and often detailed model of the world in which we live. We know precisely where we live, the neighbourhoods, the geographical areas in which we work, play, and travel. As humans, we have a huge array of facts at our disposal that enable us to make precise plans for the future, and meet different obligations and contingencies. Second, declarative memories can be voluntarily triggered top-down from the frontal lobes, rather than bottom-up through perception. They can therefore be brought into consciousness "off-line" for flexible planning and decision-making. We can contrast and compare different pieces of knowledge, and choose knowledge relevant to a particular activity, such as planning a career, or a vacation.

Over thirty years ago, Tulving drew the distinction within declarative memory between semantic memory, which is memory for facts, and episodic memory, which is memory for events. Roughly, semantic memory can be likened to a combined dictionary and encyclopaedia, while episodic memory can be likened to a personal diary. Of course part of our understanding of our personal past depends on semantic memory as well as on episodic memory, to make up what has been called autobiographical memory. The distinction between semantic and episodic memory has also been characterized as that between knowing and remembering, respectively. For example, most of us know when we were born but do not remember the event, though we may remember events of our first day at school. Episodic memory is perhaps the ultimate in flexibility, since it records the particularities of one's life, and allows the fine-tuning of personal events in the future.

Even without brain injury, people probably remember only a tiny fraction of actual past episodes, and events are often remembered inaccurately, even to the point that people will claim with some certainty to have remembered events that did not in fact happen. Given the unreliability of episodic memory, often a bane to courts of law, it seems clear that it did not evolve primarily to serve as a record of the past. Schacter suggests rather that its function is to build up a personal narrative, which may provide the basis for the concept of self, as well as a basis on which to ground future behavioural choices. If semantic memory provides knowledge about relatively constant aspects of one's environment, episodic memory picks out singular events that can augment one's decisions about how to behave in similar circumstance in the future.

However incomplete, episodic memories are located in time, and so provide our mental lives with the time dimension. Episodic memory, then, can be regarded as part of a more general capacity for mental time travel, allowing us not only to mentally relive events in the past, but also to imagine the future in episodic detail. Developmentally, this capacity to envisage future episodes appears to emerge in children at around the same time as episodic memory itself, between the ages of three and four. Patients with amnesia are as unable to answer simple questions about yesterday's events as they are to say what might happen tomorrow. Amnesia for specific events, then, is at least in part a loss of the awareness of time.

It has been proposed that episodic memory, and more generally mental time travel, are uniquely human. This has posed a challenge to animal researchers to show that these capacities exist in nonhuman species. Since the testing of episodic memory and mental time travel in humans typically involves language, it has proven difficult to design experiments to test episodic memory in non-linguistic species. Nevertheless there have been some concerted attempts to specify the necessary conditions for episodic memory, and then to test whether the conditions hold in a nonhuman animal.

Tool manufacture is sometimes taken as evidence for a sense of the future, especially when it is clear that a tool is manufactured on one occasion for later use. Again, birds seem to provide the most compelling evidence. For example, New Caledonian crows construct tools from pandanus leaves for the express purpose of extracting grubs from holes, and the design characteristics of these tools suggest a clear temporal distinction between construction and use. In contrast, there is little evidence that nonhuman primates make tools with any sense of future planning. Chimpanzees and other great apes may improvise objects for use as tools, such as stones used to crack open nuts, and they can be taught to use tools in specific ways, and even show culturally determined variation in how they use these tools, but there is little evidence that they construct tools for some given future purpose, although they may be capable of improvisation.

The evidence on mental time travel in nonhuman species remains scant, but might be taken to indicate some rudimentary ability to refer to specific past experiences or to imagine future ones. Convergent evolution may have produced various mechanisms designed to enhance future survival chances through cognitive capacities akin to mental time travel. Nevertheless the evidence pales beside the undoubted ability of humans to re-experience their past lives or to imagine the future in detail. The manufacture of tools by humans is clearly governed by design characteristics that are oriented toward future use, and there is a clear temporal separation between manufacture and use.

It is the planned manufacture of tools, perhaps, that provides the earliest indication of mental time travel in hominid evolution. Stone tool industries in our hominid forebears have been dated from about 2.5 million years ago in Ethiopia.... The true climb to humanity and humanlike cognition probably began with the larger-brained Homo erectus around 1.8 million years ago, and the somewhat more sophisticated Acheulian tool industry dating from around 1.5 million years ago. Apparently, these tools were made and kept for repeated future use. The Acheulian industry even persisted into the culture of early Homo sapiens some 125,000 years ago. Nevertheless, there was something of an increase in the sophistication of manufactured tools from about 300,000 years ago, and a more dramatic increase, as part of what has been termed the "human revolution," within the past 100,000 years. These developments may represent increasing sophistication in the understanding of time.

The sense of time, and selection for episodic memory, may well have evolved, perhaps gradually, with the global shift to cooler climate after 2.5 million years ago, when much of southern and eastern Africa probably became more open and sparsely wooded. This left the hominids not only more exposed to attack from dangerous predators, such as sabre-tooth cats, lions, and hyenas, but also obliged to compete with them as carnivores. The solution was not to compete on the same terms, but rather to establish what Tooby and DeVore called the 'cognitive niche', relying on social cooperation and intelligent planning for survival. We succeeded.... Accurate recording of specific events in a hostile environment may have been critical to effective and safe scavenging, and eventually to hunting as the manufacture of tools and weapons became more sophisticated. As Pinker put it, it became increasingly important to encode information as to "who did what to whom, when, where, and why".

The progression toward what might be called episodic cognition might be tracked by the increase in brain size associated with the emergence of the genus Homo.... [Average brain size] rose to some 612 cc in Homo habilis, 854 cc in early Homo erectus (the African forms of which are also known as Homo ergaster), and 1016 cc in later Homo erectus. After a period of stasis, there appears to have been a secondary increase from about 500,000 years ago, reaching about 1552 cc in the Neanderthals (Homo neanderthalensis), and a slightly smaller 1355 cc in Homo sapiens. This secondary increase might represent the arrival of a full sense of time, along with rituals associated with death and religion.

The human capacity that is most widely regarded as uniquely human is language. Though some argue that all that is unique about human cognition can be traced to language, we have argued that the evolution of mental content must have preceded the evolution of means to communicate such content. Language shares a number of critical properties with mental time travel, and may derive from the need to communicate about events that occur in other places and at other times.

Our imaginations are relentlessly generative, as we fantasize about what was, what might have been, and what hopefully will be. Indeed, the generative component allows us to understand time that precedes birth and that continues after death, giving rise to history, theories about the origins of the universe, and the concept of life after death. Hockett, who called it 'productivity', regarded generativity as one of the defining properties of human language. To Chomsky, generativity is the critical component of human language that distinguishes it from animal communication, and it further implies that language cannot be reduced to associative learning, or to a finite-state system.

As outlined earlier, episodic memory introduced the dimension of time into mental life, thereby vastly increasing demands on storage. A life lived in the present can be accommodated largely through instinct and learned adaptations. The time dimension is a characteristic property of language, in the form of tense, enabling us to talk about past, present, or future. Most languages have developed subtle variations on the simple notion of time as a continuum from the past to the future. In English there are as many as thirty different tenses. For example, the future perfect (e.g. he will have walked) illustrates our ability, not only to project ourselves mentally into the future, but also to imagine an event that will be in the past at some future date. This again illustrates the generative, recursive nature of both language and thought.

Given the commonalities between language and mental time travel, it seems reasonable to assume that they are linked. Logically, one might suppose mental time travel to have preceded language. It was during the Pleistocene, perhaps, that the mind became structured to record individual events in time, affording a more finely tuned capacity to adapt to future events. Nevertheless, survival undoubtedly depended also on cooperation between individuals, and the sharing of episodic information. This may have led to the evolution of a form of communication designed to refer to events that were separated in both time and space from the present. To some degree at least, then, the structure of events dictated the structure of sentences, involving nouns and verbs to designate objects and actions. Since events are located in space and time, language also incorporated spatial markers as well as tense.

Although mental time travel may have been a prerequisite for language, the two must also have coevolved. Gardenfors writes similarly that, in his view, "there has been a coevolution of cooperation about future goals and symbolic communication". Indeed, language itself adds to the capacity for mental time travel, since it provides a means by which people can create the equivalent of episodic memories in others, and therefore contributes to episodic thinking. By telling you what happened to me, I can effectively create an imagined episode in your mind, and this added information might help you adapt more effectively to future conditions. Through storytelling, there has no doubt been a long tradition in all cultures of creating potential scenarios that contribute to our ability to envisage future ones, both for ourselves and for others. Thus mental time travel and language can interact to vastly increase the storehouse of episodic information, leading also to the elaboration and refinement of semantic information.

Mental time travel and human language have been proposed as uniquely human capacities. The main theme of this chapter is that they are related, and indeed interdependent. The basic structure of language, we argue, is rooted in declarative memory, which permits the generation of specific episodes, whether past or future, actual or imagined. Language then allows this information to be shared. This does not mean, of course, that our language and thought consist entirely of episodic information, or the semantic knowledge gained from it, since language and thought both have undoubtedly expanded to include abstract thought, logic, and the like. Nevertheless it has been proposed that even processes as abstract as deductive logic may depend on 'mental models' that often involve visual images, which in turn may derive from the ability to imagine events.

In this chapter, we have merely sketched the idea that two characteristics that appear to be uniquely human might be fundamentally related. These characteristics are mental time travel, and language. Given that language seems so admirably designed to convey information about events that are located in space and time, we suggest that the relations between the two should be further explored.

Why Are Humans Not Just Great Apes?, R. Dunbar

We differ, as has so often been pointed out, by only about 2 per cent in terms of our DNA from chimpanzees, with whom we shared a common ancestor around 5-6 million years ago. But there are some anatomical oddities. We are bipedal, for example, and have unusually large brains. For a long time, these persuaded biologists to classify humans in a separate group among the apes. Conventional wisdom had it that we shared only a very deep ancestry with the other great apes, perhaps as long as 20 million years ago, long before even the orangutan's lineage branched off from its common ancestor with the other three great apes (the chimpanzee, bonobo and gorilla — or, strictly speaking, as it has now turned out, the two gorilla species). The human lineage, which included all the australopithecines from Africa, had a long and - we supposed - distinguished history that was separate from the other apes. After all, if nothing else, we walked upright and all the other apes walked on all fours.

But appearances can be deceptive. As the genetic revolution unfolded through the 1980s, it became increasingly obvious that, no matter how different we might appear from the other apes, our genetic make-up was rather similar. In fact, more than just rather similar: it was all but identical. By the end of that decade, our whole understanding of ape evolutionary history had been turned on its head. So far from being a separate evolutionary lineage with deep roots, we humans were in fact embedded within the great ape family. Indeed, we were not just embedded within the great ape family, we were kith and kin to the chimpanzees - genetically more closely related to the two chimpanzee species (the common chimpanzee and the bonobo) than any of the three of us was related to the gorilla. Indeed, it has since turned out that we are more closely related to each other than the two species of gorilla (the physically barely distinguishable eastern and western gorillas) are related to each other! The universally accepted position is now that the big split in the great ape family is not between humans and the other great apes, but between the Asian orangutan and the four (or should it be five?) species of African great apes (one of which is us humans). Humans are now, strictly speaking, firmly ensconced within the chimpanzee family.

So let me begin with the easier question: in exactly what ways are humans different from other species? I want to argue that the real differences lie in our capacity to live in an imagined world. In respect of almost everything else, including the basic features of our cognition — like memory and causal reasoning — we are all but indistinguishable from great apes. What we can do, and they cannot, is to step back from the real world and ask: could it have been otherwise than we experience it? In that simple question lies the basis for everything that we would think of as uniquely human. Literature depends on our capacity to imagine a set of events, a world that is different from the one we personally inhabit on a day to day basis. Science depends on our capacity to imagine that the world could have been different, and then to ask why it has to be the way it is. Curiosity about distant places has led us to explore the furthest reaches of the planet, and even to venture hesitantly out into space.

By contrast, all other creatures alive today seem to have their noses thrust so firmly up against the grindstone of everyday experience that they cannot step back far enough to ask these kinds of questions. The world is as it is, and that seems to be all there is to it. They cannot explore the world of the imagination in the way that is so fundamental and so necessary to everything we would naturally identify as being the core to the human condition - what it means to be human. But there is one aspect of our cultural life that merits particular comment in this context. And this is our capacity to contemplate alternative universes. As a result of this, we have developed theories about transcendent spirit worlds, about life before birth and after death, about beings who live in a parallel universe who can influence the world in which we live. In short, it is the world of religion. There is something genuinely odd about our apparent capacity to live our lives partly in an inner world that is half-connected with the real world. I would argue that this capacity has a particularly close association with our capacity to produce literature. Both share the curious feature of requiring us to imagine fictional worlds with such intensity that we can believe in their veracity.

Language is critical in this context, because if we do not communicate our ideas and thoughts about this inner world, it might as well not exist. This is not to say that language creates or gives rise to these inner worlds. Far from it: my point is only that language is necessary to communicate our discoveries about these inner worlds to each other. If we cannot share these experiences with others, we cannot build those communal intellectual edifices that we think of as the essence of human culture - dramas, histories, the rituals of religion, metaphysics and theologies. So, what is it about apes' minds that prevents them from doing so?

There is a growing consensus among both developmental and comparative psychologists that the phenomenon known as 'theory of mind' (the capacity to imagine another individual's 'mind states') is a critical difference between humans and other animals. It is widely accepted now that most animals (certainly higher animals) have knowledge of their own mind states. In terms of philosophy of mind, this is what is usually referred to as first order intentionality. 'Intentionality' is a general term covering those mind states that involve beliefs. They are evidenced by the use of words like believe [that something is the case], know, suppose, intend, want, desire, wish, etc. They imply that we have a certain level of awareness about the contents of our minds, that we are aware that we believe, in some intuitive sense, that something is the case.

This state of self-knowledge is the world inhabited by very young children. This is not necessarily the same thing as consciousness, but it is certainly a form of consciousness — and probably one that is shared with a wide range of birds and mammals, at the very least. However, at around the age of four years, children develop the capacity to reflect on others' minds as well as their own. This capacity is now referred to as 'theory of mind' (literally intended to mean that they have a theory about minds, a theory about other people's capacities to have beliefs). It may even be the key to self-consciousness in some formal sense: they can believe that they themselves believe that something is the case.

So the lesson for us is that the flights of fancy that we engage in when dabbling in literature, even when just telling stories around the camp fire, are far beyond the cognitive capacities of any other species of animal currently alive. Great apes might be able to imagine someone else's state of mind, and so they might even be able to construct a very simple story, but it could never be much more than a narrative involving one character. Only adult humans could ever intentionally produce literature of the kind that we associate with human culture. It is possible, of course, to produce stories with third and even fourth order intentionality (perhaps the cognitive equivalent of eight- and eleven-year-old children), but they would inevitably lack the sophistication of the stories told by the average adult, never mind those of a Shakespeare or a Moliere.

Humans are intensely social animals, just like all our monkey and ape cousins. We live in tightly bonded social groups, and it is this sociality that has been the secret of our success, just as it has been the secret of primates' evolutionary success as a family. Primate societies are implicit social contracts: their members, in effect, agree to cooperate in solving the everyday problems of life and death collectively. Doing so allows them to solve these problems more effectively.

We find that fifth order intentionality did not emerge before the appearance of our own species (usually known to archaeologists as 'anatomically modern humans' to differentiate them from people like the Neanderthals), and the genetic evidence suggests that they first originated about 200,000 years ago. The archaic humans who preceded us (among whom the Neanderthals are now numbered) would certainly have had fourth order intentionality, and that would surely have meant that they could have had religion in some more muted form. But full-blown religion, with its rituals and panoply of commonly held beliefs and commitments, would have sprung into life only with the emergence of modern humans, somewhere in Africa, a mere 10,000 generations ago. In the grand scale of human evolution stretching back to our last common ancestor with the apes some five and a half million years ago, that's not all that long ago.

I have tried to make three claims here. One is that humans really differ from other monkeys and apes in just one key respect, the world of the imagination. That capacity to live in a virtual world makes possible both literature and religion, perhaps the two most central aspects of culture that define our humanity. The second is that religion and story-telling evolved in order to enable our large and otherwise rather fragmented social groups to be bonded into effective communities so that they could do their intended job of fostering our survival and evolutionary success. Last, but perhaps by no means least, I have suggested that there is some indirect evidence that those capacities might have arisen rather late during the course of human evolution - indeed, that they explicitly demarcate and define the appearance of our particular species, modern humans.

The Hominid That Talked, M. Gentilucci, M. Corballis

It is commonly argued that the primary characteristic that distinguishes our species from other hominids is language. In this chapter, we argue that it was not language per se that distinguished Homo sapiens from our immediate, but now extinct, ancestors. Rather, it was the conversion from a primarily visual form of language to one that is conveyed primarily through the medium of sound. What distinguished our species was the power of speech. The argument derives initially from the question of how language evolved in the first place. Language is composed of symbols that bear little or no physical relation to the objects, actions, or properties they represent. How, then, did these abstract symbols become associated with aspects of the real world?

The capacity to convey the major part of the message entirely through the vocal medium, as on radio or telephone, may have evolved only in our own species, and may explain the so-called "human revolution" that has allowed our species to dominate the planet. It may also explain the demise of the Neanderthals, who in other respects seem to have been the cognitive equals of Homo sapiens. The questions of where, how, when, and why language might have been transformed from something resembling gesture language to vocal speech are considered in the following sections.

Although the connections between hand and mouth were probably well established in our primate forebears, fully articulate vocalization may not have been possible until fairly late in hominid evolution, and perhaps not until the emergence of our own species, Homo sapiens. As we have seen, there is little if any cortical control over vocalization in nonhuman primates, and it has proven virtually impossible to teach chimpanzees anything approaching human speech. Moreover, fossil evidence suggests that the alterations to the vocal tract and to the mechanisms of breath control necessary for articulate speech were not completed until late in hominid evolution, and perhaps only with the emergence of our own species, Homo sapiens, which is dated at some 170,000 years ago.

Selective changes to the vocal tract, breathing, and cortical control of vocal language suggest that there must have been selective pressure to replace a system that was largely based on manual and facial gestures with one that could rely almost exclusively on vocalization, albeit with manual accompaniments. This pressure must have been strong, since the lowering of the larynx to enable articulate speech increases the risk of choking to death. What, then, were the adaptive advantages associated with the switch from visible gestures to auditory ones? We think that there were several: "First, a switch to autonomous vocalization would have freed the hands from necessary involvement in communication, allowing increased use of the hands for such activities as grooming, carrying things, and the manufacture and use of tools. Indeed, vocal language allows people to speak and use tools at the same time, leading perhaps to pedagogy".

Second, speech allows communication over longer distances, as well as communication at night or when the speaker is not visible to the listener. Auditory stimuli also have greater attentional impact than visual stimuli, and at first grunts may have accompanied signing to attract attention rather than to convey information. Third, speech is much less energy-consuming than manual gesture. Anecdotal evidence from courses in sign language suggests that the instructors require regular massages in order to meet the sheer physical demands of sign-language expression. In contrast, the physiological costs of speech are so low as to be nearly unmeasurable. In terms of expenditure of energy, speech adds little to the cost of breathing, which we must do anyway to sustain life.

A possible scenario for the switch is that there was selective pressure for the face to become more extensively involved in gestural communication as the hands were increasingly engaged in other activities. Our hominid forebears had been habitually bipedal from some 6 or 7 million years ago, and from some 2 million years ago were developing tools, which would have increasingly involved the hands. The face had long played a role in visual communication in primates, and plays an important role in present-day signed languages. Consequently, there may have been pressure for intentional communication to move to the face, including the mouth and tongue. Gesturing may then have retreated partly into the mouth, so there may have been pressure to add voicing in order to render movements of the tongue more accessible — through sound rather than sight. In this scenario, speech is simply gesture half swallowed, with voicing added. Adding voicing to the signal could have had the extra benefit of allowing a distinction between voiced and unvoiced phonemes, increasing the range of speech elements.

The advantages of incorporating sound, at least to the point that the message can be conveyed without visible access to the speaker, may explain the so-called "human revolution", manifest in the dramatic appearance of more sophisticated tools, bodily ornamentation, art, and perhaps music, dating from some 40,000 years ago in Europe, and probably earlier in Africa. Freed from communicative duty, the hands were free to express themselves! These dates correspond reasonably closely to the estimated date of the mutation of the FOXP2 gene, which may have been the final step in the evolution of autonomous speech. This raises the possibility that the final incorporation of vocalization into the mirror system was critical to the emergence of modern human behavior in the Upper Paleolithic. It may also have been responsible for the eventual demise of the Neanderthals in Europe and Homo erectus in Asia. These species may well have had a form of language that was part gestural and part vocal, but the emergence of autonomous speech in our own forebears may have created a sufficient advantage, technological as well as communicative, to enable our own species to prevail.

The human revolution is commonly attributed to the emergence of symbolic language itself rather than to the emergence of speech. This implies that language must have evolved very late, and quite suddenly, in hominid evolution. Some have associated it with the arrival of our own species, Homo sapiens, about 170,000 years ago. Bickerton, for example, writes that "... true language, via the emergence of syntax, was a catastrophic event, occurring within the first few generations of Homo sapiens sapiens." Crow has similarly proposed that the emergence of language was part of the speciation event that gave rise to Homo sapiens. The association of the emergence of language with the human revolution suggests that language may have emerged even later, as proposed by Klein et al., although there is still debate over the extent and time frame of the human revolution.

It seems much more likely that language evolved incrementally, perhaps beginning with the emergence of the genus Homo from around 2 million years ago. Pinker and Bloom argue, contrary to earlier views expressed by Chomsky, that language evolved incrementally through natural selection, and Jackendoff has proposed a series of stages through which this might have occurred. In something of a change of stance for Chomsky, he and his colleagues have also highlighted a continuity between primate and human communication, again suggesting the gradual evolution of human language — although they do not consider the possibility that language evolved from manual and facial gestures, nor do they speculate as to precisely when the uniquely human component (what they call "faculty of language in the narrow sense") emerged in hominid evolution. If syntactic language evolved gradually over the past 2 million years, then it seems reasonable to suppose that it was already well developed by the time Homo sapiens appeared a mere 170,000 or so years ago. As we have seen, it now seems likely that the FOXP2 gene has to do with oral-motor control rather than with syntax.

One may question whether the switch to an autonomous vocal language could have brought about an effect as apparently profound as the human revolution. As noted above, speech would have freed the hands, enhancing pedagogy, which itself may be a uniquely human characteristic. More generally, changes in the medium of communication have had deep influences on our material culture. Without the advent of writing, and the later development of mathematical notation, for example, we would surely not have had our modern contrivances such as the automobile, or the supersonic jet. We suggest, then, that the switch from a manuo-facial to a vocal means of communication would have especially enhanced material culture, including the manufacture and use of tools — not to mention weapons of war. Indeed, it is primarily in material culture that the human revolution is manifest, whereas the earlier evolution of language itself may have been expressed in, and perhaps driven by, complex social interaction, or what has been called cultural cognition. The social component may be less visible in the archaeological record. The human revolution may therefore give a false impression of the evolution of the human mind itself.

In summary, language may well uniquely distinguish our species from any other extant species, but need not have distinguished us from the Neanderthals or other late species of the genus Homo. We suggest that what really distinguished us from all other hominids was the honing of language into a cost-efficient mode that freed the hands and arms for other activities, and allowed humans to communicate in a wider range of environmental contexts than is possible in a purely visual medium. That mode was speech.

Curiosity and Quest, C. Pasternak

Aristotle considered that 'All men by nature desire to know'. In my view, this is one of the characteristics that make us human. Curiosity about the world we live in leads directly to exploration, to quest. Yet surely animals are likewise curious, and also quest: for food and water, for shelter and a mate. How, then, do curiosity and quest distinguish us from all other animals, and in particular from our nearest relative, the chimpanzee? In this essay I will show that searching is indeed fundamental to living organisms — to plants and microbes as well as to animals — but that humans are endowed with four attributes that enable them to search more avidly than any other creature. Man's quest has encompassed the entire surface of the earth; he has been to the moon and observed the most distant stars. The chimpanzee has not ventured beyond his environment. Man dominates the globe; the chimpanzee faces extinction.

The reason why no animal has come close to humans in so far as gratification of innate curiosity is concerned is the lack of four attributes that humans have acquired over the past four million years or so. The first is the upright gait. After the lineages leading to modern chimpanzees and humans diverged around 6 million years ago, there emerged, in the Great Rift Valley of eastern Africa, apes that walked on their two legs alone, instead of on legs and knuckles.... Several potential evolutionary advantages for an upright gait may be mentioned. First, it doubles the area of land surveyed and therefore gives earlier warning of an approaching predator or prey. Second, it frees the hands to carry offspring or food, and to feel for objects in the dark. On the other hand, the evolution of bipedalism may be due simply to sexual selection by females, as I have argued for the case of art. Australopithecus females may have been attracted by the sight of their male counterparts standing upright, towering above them. Once they favoured males able to do this, the underlying genes would soon have begun to spread among the population.

Freeing the hands has an equally important consequence. It allowed the gradual appearance — over a million years or so - of a more delicate, flexible thumb. This enables the owner to use the precision grip with firmness, and it is this that allowed our first Homo ancestor (H. habilis, or 'handyman') to begin tool-making around 2 million years ago. The importance of the human hand, not just for sharpening stone implements, for making music or for drawing pictures, but for the entire development of sophisticated technology over the last ten thousand years, cannot be stressed enough. For good reason did the eighteenth-century philosopher Immanuel Kant call the human hand 'the exterior brain of man'. The flexible thumb, leading to an agile hand, then, is the second human attribute that is key to human quest.

The third attribute is the human voice box (or larynx). Unlike the upright gait or the mobile thumb, the appearance of the human larynx is difficult to date. This is because there are no bones — just muscle and cartilage — associated with it. The vocal cords themselves are no more than infoldings of the larynx. They vibrate, like the strings of a violin, to produce a huge range of sounds in modern humans, but no more than grunts and cries in the chimpanzee... A decade ago, Robin Dunbar proposed an attractive hypothesis: language arose through gossiping among small groups of humans, which had gradually replaced the grooming typical of chimpanzees and other primates. Speech is useless without hearing. It has recently been suggested that the development of the human ear enabled it to pick up human speech better than the chimpanzee ear. This appears to have happened by 350,000 years ago, according to fossils of human remains found in northern Spain. So it is possible that human hearing actually preceded human speech.... The fourth attribute that allows us to exercise our superior quality of quest is, of course, the brain. We have some three times as many cortical neurons (nerve cells in the outer layer of the brain, which actually makes up most of the human brain) as a chimpanzee. It is in the cortex that processes like thought and memory, reasoning and deduction, take place.

It is easy to see why an increased brain size, and therefore superior mental function, should have an evolutionary advantage. Of course the argument for sexual selection by females can be applied to all three attributes — the mobile thumb ('man the toolmaker'), the voice-box ('man the talker') and the brain ('man the clever one') — as much as to the upright gait. But why did this happen in the lineage leading to modern humans and not in that leading to modern chimpanzees?

I would like to stress that I do not consider any of the attributes I have mentioned — especially the agile thumb, the voice box and the larger number of cortical neurons - to underlie the essential difference between humans and chimpanzees on its own. Rather it is the combination of all three — hand, speech and brain — that allows humans to search more avidly than other creatures.

The exploratory drive of humans, their search for new lands on earth, is particularly striking. Whether out of mere curiosity, or out of need, or from harassment of one sort or another, successive species of Homo (the most recent being H. sapiens) wandered out of Africa and into Eurasia between 1.9 million and 100,000 years ago. This was followed by H. sapiens taking to the seas of southeast Asia in simple boats more than 50,000 years ago, to finish up in Australia. Others roamed across the frozen north from Siberia into Alaska some 12,000-16,000 years ago. These migrations, especially those across water, must have taken a lot of courage.

I believe that what makes us human is our innate curiosity and our never-ceasing quest: for beauty and novelty in art (more novelty and less beauty these days), for understanding the inside of our brain and the outside of our solar system, for ways to produce offspring without sex and then educating them without learning, for the means to avoid pain (ours) and the design of weaponry to end lives (theirs), for faster travel and slower ageing.... Our ability for actions that elude the cleverest chimpanzee is the result of inheriting a unique combination of characteristics: an upright gait, a mobile thumb, a voice box capable of speech, and a brain that contains three times as many neurons as that of a chimpanzee.

Human Evolution and the Human Condition, I. Tattersall

Just what is it about us human beings that makes us feel ourselves to be so different from the rest of the living world? Quite evidently, we have a whole host of physical peculiarities, mostly related to our large brains, small faces, and upright postures, that clearly distinguish us from our closest living relatives, the great apes. But without any doubt, the differences that most significantly differentiate us from them are the cognitive ones — although the behavioural gulf that separates us from the apes seems to be steadily narrowing, as these complex creatures are discovered to indulge in ever more activities that we had formerly thought to be unique to ourselves.

Still, we obviously relate to the world around us in a significantly more complex manner than they do; and even though it is impossible for us to imagine just how apes subjectively experience their lives — and therefore to know the degree to which the quality of their experience resembles ours - there are clearly major differences between apes and humans in our ways of processing information about the environment and about each other. To put this complex and difficult matter in a nutshell, we are symbolic creatures; they are not. Only human beings, as far as we know, mentally divide up the world around them into discrete entities to which names are given. And once we have generated mental symbols of this kind - whether they represent concrete objects or abstractions - we can put them together in new combinations, and ask questions like "what if?" In other words, we can and do constantly remake the world in our heads; and it is in this mentally reconstructed world that we human beings live, rather than in the world as presented directly by Nature — which is the one in which, to the best of our knowledge, all other living creatures live.

Trying to identify what makes us different by selecting particular behaviours to dwell upon doesn't really get us very far. And the reason for this is, of course, that all of those behaviours are effects, not causes. They are products of the human mind and of conscious choice; they are not determinants of what makes us human. As a result, if we want to know just what it is that makes us cognitively unique, we have to turn to the human mind itself, rather than merely to its expressions. The human mind is, of course, in turn the product of the physical brain. And if we could say just what it is about the human brain that makes it function in a different manner from anatomically not-too-dissimilar structures in other species, then we could truly specify, in physical terms, just what it is that makes us cognitively human. But at present, although new techniques of brain-imaging are teaching us more each day about which parts of the human brain are active in performing which behavioral functions, we still have no idea at all about how a mass of electrochemical signals in the human brain - and apparently in ours alone — gives rise to what each of us subjectively experiences as his or her peculiarly human form of consciousness. Given this fact - and the structural basis of consciousness may very well in the end be the very last biological problem left for the human intellect to solve — we are left with the investigation of the mind, and of how we acquired it, as the most profitable avenue toward specifying just what it is that makes us cognitively unique.

The hominid behavioural record begins essentially with the invention of crude stone tools — small sharp flakes chipped off a cobble held in one hand using a hammer stone held in the other — at around two and a half million years ago. Whoever the makers were, and this is not known for certain, they do not seem to have differed significantly from their nontechnological predecessors, small-bodied creatures with short legs and brains of not much better than ape size. Indeed, the creatures who represented the hominid family for the first few million years of its existence are often referred to as "bipedal apes". In the million years or so following the introduction of stone tools, average hominid brain sizes may have crept up a little, but there was no radical improvement in stone working technology. Indeed, even after tall hominids with substantially modern body proportions and at least some increase in brain size had made their unanticipated appearance at some time after two million years ago, stone tools of the early style continued to be made for several hundred thousand years.

It was not until about one and a half million years ago that a new kind of stoneworking was finally introduced in Africa. A larger piece of stone, typically around eight inches long, was carefully fashioned on both sides to a symmetrical teardrop shape, producing what is known as a "handaxe". At last, toolmakers seem to have been shaping stone to a "mental template" that they had in mind before knapping began, rather than, like their predecessors, simply going after an attribute - a sharp cutting edge - regardless of what the resulting implement looked like. Surely a cognitive leap, but apparently not one that was initially taken by a new kind of hominid, and one that may or may not have been accompanied by wider behavioral changes: the record simply doesn't tell us.

Only about 300,000 years ago was a significantly new stone working technology introduced. This was the "prepared core" technique, whereby a stone was fashioned with numerous blows, often from a "soft" hammer of bone or suchlike, until a final blow would detach a flake that needed little modification to become a finished tool. This is surely evidence of an advance in the ability to envisage the potential offered by stone as a material; but the wider ramifications of this cognitive advance remain entirely conjectural. The best-known practitioner of the prepared-core technique was Homo neanderthalensis, an endemic European species that flourished in isolation between about 200,000 and 30,000 years ago. The Neanderthals, as members of this species are popularly known, were fine craftsmen in stone, invented the burial of the dead, ground ochre, and looked after their kin; but although they left behind themselves an abundant material record of their lives, there is essentially nothing in that record that convincingly suggests that they possessed symbolic behaviours: claimed examples of Neanderthal art, decoration or notation are few at best, and all are arguable.

But it is emphatically not true of the Homo sapiens — the Cro-Magnons — who entered Europe about 40,000 years ago, and in a mere ten millennia drove the Neanderthals to extinction. The lives of the Cro-Magnons were thoroughly permeated by symbol. Not only were these people making spectacular animal art on the walls of caves well over 30,000 years ago, but they were producing elegant carvings and engravings, making notations on bone plaques, decorating utilitarian objects, and making music on vulture-bone flutes - an activity for which vulture bones, evidently possessing some deep symbolic meaning for the flute-makers, were entirely reserved. And if they made music, surely they sang and danced as well. Soon the Cro-Magnons were baking figurines in kilns and sewing tailored garments with the finest of eyed bone needles. These people, in other words, were us, possessed of language and of a sensibility entirely equivalent to our own. It is important to note, however, that they were not the first humans to show any material evidence of the modern symbolic sensibility - hints of this show up in Africa (the ultimate place of origin of the Cro-Magnons) as long ago as 75,000 years - but no other early people has left us such a dramatic material document of lives governed by symbol.

Yet the first human beings who looked exactly like us apparently did not behave in this way. The earliest putative Homo sapiens fossils in Africa show up in the 160-200,000 year range and are associated with quite crude stone tools, and nothing else. And around 100,000 years ago in Israel, we have good evidence that humans who were entirely modern in their bony skeletons were producing a material culture that was virtually indistinguishable from that of the Neanderthals with whom they shared their territory in some way for several tens of thousands of years at least.

What, then, happened to change things? It has been beguilingly suggested that around 50,000 years ago a human species with the same bony structure as us acquired a genetic neural novelty that enabled symbolic mental activity, and that, through enormous adaptive advantage, the population with this gene spread very rapidly and replaced the more archaic form throughout the Old World. More plausible, though, is the notion that the neural potential for symbolic thought arose as a by-product of the major developmental reorganization that signalled the emergence of Homo sapiens as a (highly) distinctive physical entity in the period between about 200,000 and 150,000 years ago. That biological potential then lay fallow, as it were, until it was released by a cultural stimulus of some sort.

In the human case, what might the cultural innovation have been? Most plausibly, it was the invention of language, which has the advantage in this connection of being a communal possession rather than a purely personal one (unless, of course, the essential function of language is, instead of communication, to be a conduit to thought). Language is the ultimate symbolic activity; indeed, at least from our modern perspective, symbolic thought seems impossible in its absence. Like thought itself, language involves forming mental symbols and combining and recombining them to come up with an infinity of new meanings and associations.... But whatever exactly it was that allowed human beings to discover their symbolic potential, it evidently followed some considerable time after the acquisition of the potential itself. But, once the horse was out of the stable, the rules of the game changed and the cultural history of humankind has subsequently been dominated by the discovery of new ways in which to use that potential.

The fact that [symbolic intelligence] did not come about via fine-tuning by natural selection not only clarifies why the cognitive mechanism that underpins our behavioural uniqueness is a generalized rather than a specialized one; it also explains a lot about the kinds of creatures we are.... If evolution had fine-tuned us for anything, surely it would be a great deal easier for us to specify the kind of creature we are. As it is, we are simultaneously many kinds of creatures: we all have a fish brain inside us, and a reptile brain, and a primitive mammal brain and probably nearly all of a chimpanzee brain too. But we have something else: an extra overlayering that makes symbolic thoughts possible — but symbolic thoughts that are also mediated by a lot of very ancient brain centres indeed. It is our combination of ancient emotional and intuitive brain processes with this additional symbolic element that makes us what we are; and the result is a structure, with an almost infinite range of potentialities, that no engineer would ever have dreamed of designing. This it is that gives us our free will — and the responsibility that goes with it.

Deep Social Mind in the Evolution of Human Nature, A. Whiten

I will propose that a diversity of kinds of evidence points to the evolution of a special social mentality as critical in the success with which our ancestors survived certain novel evolutionary bottlenecks - a success that can be fairly described as miraculous in the circumstances, but that may be crucial in understanding why human nature is as it is. Before I explain more fully what I mean by this, I need to indicate how we know any of these things about our past.

We now know much about the main stages of hominid evolution and its ecological context. When we put all the information together, it is, on the face of it, highly improbable that our ancestors should have survived the evolutionary obstacle course they were forced to run; and the solution through which they did so - the beginnings of the trajectory that led to present human nature - is no less a miraculous oddity in the context of the rest of nature.

The key findings that substantiate this rather expansive statement are as follows. First, the fossil record shows that the speciation that led to chimpanzees and humans occurred at a time of drastic environmental change in Africa. Forests which had supported several species of ape shrank towards the Congo basin, and were increasingly replaced by a broad arc of more thinly wooded grasslands. These provided for very large biomasses of grazing and browsing mammals. By contrast, primates were strongly adapted to arboreal habitats, and the ancestors of chimpanzees (and gorillas) remained restricted to the dwindling forested areas. Contrasting with this conservative response, our own ancestors adapted to the new, open environment. As apes, they were not initially suited to do so and had somehow to carve a new ecological niche in competition with a multitude of savannah-adapted species. In short, they had to become very peculiar apes — and they did. By 4 million years ago (Mya), some were already the first bipedal mammals.

This was already a peculiar innovation, but the real miracle was in the method of subsistence that was adopted. Hominids became serious carnivores and hunters, qualitatively and quantitatively different from anything seen amongst other primates. By 1.8 Mya, cut marks were being left on bones, and by 400 thousand years ago (Kya), long, straight spears were being made with a weight distribution as sophisticated as that of a modern javelin. What makes this so strange an evolutionary innovation is that this subsistence style was forged in competition with over a dozen species of formidable predators like the big cats - the "professional predators". By contrast with the claws, teeth and other adaptations of these animals, the small ancestral ape was physically an unlikely candidate for a savannah hunter. Themselves vulnerable prey for these predators, our ancestors nonetheless carved out a niche that also involved outmanoeuvring them, successfully competing with them at their own game - a highly improbable scenario. Yet it is what happened.

In the case of the early hominids, it has been suggested, the bodily limitations were countered by exploiting a "cognitive niche". Essentially, there was selection for intelligence, with hominids becoming able to outwit prey (and perhaps other predators) sufficiently well to hold their own against competitors whose hunting success relied on other kinds of adaptation.

I do not want to deny the cognitive niche account; indeed, it seems not only consistent with all the evidence about our past, but perhaps an inescapable inference from it. However, I want to argue that this niche is not adequately described in terms of intellectual engagement only with the physical world of hunting, gathering and early technology. What is missing is a suite of adaptations in the way the mind operated socially that were no less critical to early hominids' "miraculous" transition in subsistence style. These adaptations included forms of cultural transmission, mental coordination and cooperation which allowed the group to act as one large, super-intelligent forager/hunter, exploiting what would be better described as a distinctive "socio-cognitive niche".

Put most boldly, the claim is that humans are more social — more deeply social - than any other species on earth, our closest primate relatives not excepted. This may immediately strike the zoologist as unlikely: what about such species as ants which reach extraordinary heights of sociality? Surely in the size, density, intimacy and social structuring of their communities (including castes prepared to sacrifice their lives for their fellows) ants are truly "more social" than us? In these senses of "social", I suspect they are. However, by "deep" I am referring additionally to a special degree of cognitive or mental penetration between individuals. This is best explained in relation to each of four main elements which together make up deep social mind.

[Mind reading] refers not to telepathy, but to the evidence that adult humans are mentalists rather than behaviourists, predicting and explaining others' actions in relation to a complex system of "everyday psychology", key constructs in which are states of mind such as thinking, wanting and believing.... Things get more complex as third parties attempt to read what is going on in such exchanges, as when "Jane suspects that John wants Jill to think Jane doesn't believe her". The web of such multiple mental interpenetrations in a human community makes it deeply social in a sense unprecedented in the natural world.

By contrast with the recency of what we have learned through the research effort on mind-reading, human cultural variations and their transmission have long received massive attention in disciplines as diverse as cultural anthropology, and the study of language acquisition. What this work tells us is that the contents of mature human minds are massively shaped by cultural acquisitions — even the most extensive of the regional variations in traditions found amongst animals cannot really be compared on the same scale. Thus, human minds are again unprecedentedly deeply social, in that vast amounts of the knowledge they accumulate (both "knowing how" and "knowing that") are acquired socially, from other minds.

Like the phenomenon of culture, language is a long-standing candidate for what makes human nature distinctive. I do not want to quarrel with this, but instead emphasize that the function language serves so well, in comparison with forms of communication found commonly in the animal kingdom, is to permit "what one has in mind" (e.g. intentions, ideas and knowledge) to be transmitted to another mind. Some of these transmissions will form the basis for cultural transmission. Thus, although language is often presented as the primary, distinctive human capacity, it may make as much sense to see it as a tool through which the two deeply social functions already described — mind-reading and cultural transmission — operate particularly powerfully. Either way, language allows our minds to interconnect in intimate ways which count as deeply social in the sense I have been outlining.

Cooperation: In my initial exploration of the idea of deep social mind, I distinguished two different (although related) aspects of cooperation, each of which takes distinctive forms in humans. The first is essentially the coordination of which human groups are capable, such that the group as a whole acts in the manner of a well coordinated organism. Individuals' minds here operate in a deeply social manner to the extent that they are subjugated to, and organized towards, achieving overarching group goals.... A second aspect of human cooperation is related to that of coordination, but worth distinguishing from it; it refers to capacities which emerge in certain social contexts to act in egalitarian ways, such as distributing resources and power relations in an equitable way. Again, this reflects a "submerging" of individual social minds in relation to group-level processes, achieved in a unique, human mental fashion.

Egalitarianism. One of the most striking patterns that recurs in hunter-gatherers round the world is egalitarianism, defined by social equality. This is expressed in a number of ways. First, there is food sharing; gathered food, and particularly the meat gained by hunting, is typically shared through the whole group according to need. Thus, the mental state is one in which these foods are considered group resources. Secondly, monogamy is also typical, with the consequence for equality lying in reproductive output, contrasting with the variation that can occur in great apes and the many polygamous non-hunting-gathering human societies, where some (males in particular) may gain great reproductive superiority. A third aspect of egalitarianism is that social status tends also to express equality; formal leaders are rare. The equality is often held in place by the concerted efforts of the rest of the group to resist any tendencies towards leadership or self-aggrandisement, a social tactic referred to as reverse dominance or counter-dominance. This multifaceted egalitarianism is unusual, whether contrasted with the societies of great apes or with humans not living in hunting-gathering bands. Therefore, Knauft proposed that a "U-shaped curve" describes hominid social evolution, beginning with non-egalitarian apes, shifting to hunting-gathering egalitarianism and then becoming very hierarchical again in many modern societies during the past 10,000 years of settlement.

Coordination. Hunter-gatherers also cooperate by coordinating actions at many different levels, in ways unprecedented amongst the great apes. Examples occur within hunting bands, where at several junctures in a hunt there is communication and negotiation in interpreting tracks and signs, using differing roles in ambushes, and in butchery and retrieval of carcasses; but coordination is also important at the highest levels of organization of group activities, such as in sharing relevant information between hunters and gatherers, and pooling the resources gained by a variety of foragers and foraging practices. The combined effect of the many different manifestations of egalitarianism and coordination is that a hunting-gathering group acts like one great forager, formidable in the way it scours its range for a variety of food types subjected to "evolutionary surprise attacks". This, it seems, is a viable niche even in the midst of professional carnivorous competitors like lions.

Hunter-gatherers often have few material possessions, only those which can be carried by individuals living an intermittently nomadic lifestyle. Nevertheless, the tools and weapons they do have are important aspects of the way in which evolutionary surprise attacks are executed, and the techniques involved in both manufacture and use represent the outcome of generations of invention, social transmission of knowledge and incorporation of refinements in a continual "ratcheting-up" process of cultural evolution. There is also much social transmission of wisdom and knowledge about non-technological aspects of hunting-gathering ways, like prey-tracking. Thus, the mind that operates hunting strategies or gathering strategies is as sophisticated as it is, and survives as well as it does in its competitive environment, very much because of the depth of its assimilation of socially transmitted information.

The proposal was outlined earlier that the aspects of social mind described above act together as an adaptive complex (aptly labelled "deep social mind"), and that the ways their precursors did so during hominid evolution were critical to the novel form of hunting-and-gathering that managed to survive and prosper amongst its superficially better-equipped competitors.... Such effects existing in the past would have had the potential to create positive feedback loops favoured by natural selection, refining the underlying mental capacities to shape an important socio-cognitive component of what is now human nature.

Causal Belief Makes Us Human, L. Wolpert

In general, belief is about things that affect our lives. Belief is essential for making sense of the world and explaining the causes of events that are important for us. It is also about moral issues, good and evil actions and people. A characteristic of belief, unlike common knowledge, is that it is always graded with respect to our confidence in it: how true or false, how right or wrong, we judge it to be. One can think of causal belief as an explanatory tool for understanding the physical world. Humans cannot tolerate not knowing the causes of important events like death, illness, and climate change. Belief is a property of the brain, but what is the ultimate function of the brain itself? Just one, to control movement, so this must be at the core of any attempt to understand the origins of causal belief.

The ability to move is fundamental to animal life — not just for finding food and shelter but for escaping from enemies. And this is where brains come from. Getting the muscles to contract in the right order was a critical evolutionary advance and required the evolution of nerves. Here we find the nerve circuits that excite muscles in the right order and are the precursors of brains. The first advantage of the ability to move was probably dispersal and access to new habitats. Further advantages then opened up, such as finding food and avoiding danger. For the first time it became necessary to perceive the nature of the environment in order to decide when and where to move. There was a need for senses. Light-sensitive cells are present among single-celled organisms, so it is not too difficult to imagine light coming to control movement. Later came the eye, and other sensory systems that detected touch, sound, temperature, and odour. All these had and have but one function, to control movement. Organisms were thereby enabled to begin altering their immediate environment, for example by building nests. Emotions developed to help ready the organism for appropriate motor responses like flight or attack.

Thus I propose that consciousness (in the sense that we have an internal model of what we are doing and can decide how to behave) has only one ultimate function, and that is to control movement. There is no human or animal emotion that is not ultimately expressed as movement. In fact the argument is almost a truism: for what else is human behaviour? Sense organs have but one function, to help decide how to move. The evolution of the brain that gave us beliefs is no more than an expansion of the original circuits that controlled movement in our ancient animal ancestors. Once the brain developed it took on other functions such as those related to homeostasis, like hormonal release and temperature regulation.

The evolution of causal belief opened up a completely new set of movements that were the basis of technology and understanding the environment. Technology is a fundamental characteristic of humans, the ability to deliberately manipulate the environment to improve chances of survival. Early technology owed nothing to science, and science only began to have an influence as recently as the late eighteenth century. Technology was the result largely of imaginative trial and error. In order to practise technology a belief in cause and effect was absolutely essential, and how this belief arose is a fundamental step in human evolution.

Babies apparently already have clear concepts about physical causality. They perceive physical events according to three principles which may be genetically determined: moving objects maintain both connectedness and cohesion, they do not break up or fuse; they move continuously and do not disappear and reappear without other objects in the way; they move together or interact only if they touch. There are many experiments to support this - for example, infants clearly expect that for a moving block to make another one move, it must make contact with it. Young children thus perceive that certain objects have causal properties with a renewable source of energy or force, and this is a most sophisticated idea unique to humans.

Causal understanding is unique to humans - the weight of a falling rock clearly 'forces' the log to splinter. How did this ability to have causal beliefs evolve, given that animals do not have such beliefs? There are of course similarities between human cognition and that of other mammals, especially primates: primates remember their local environment, take novel detours, follow object movement, recognise similarities and have some insight into problem solving. They also recognise individuals, predict their behaviour, and form alliances. However, they have little understanding of the causal relationships between inanimate objects. They do not view the world in terms of underlying 'forces' that are fundamental to human thinking. They do not understand the world in intentional or causal terms. Non-human primates do not understand the causal relation between their acts and the outcomes they experience.

Chimpanzees and other apes are thus at the edge of causal understanding, as shown by their use of simple tools, such as trimming a grass reed to extract ants. All this evidence makes clear that while primates and some birds use simple tools, there is an almost total absence of causal beliefs in animals other than humans. In no case of stone tool use is there evidence that individual animals modified the structure of the stone in order to improve the tool's function.

So while animals like crows and monkeys have some understanding of tool use, they have a very limited capacity for refining and combining objects to make better tools. The tools chimpanzees use have a narrow range of functions and there is little evidence that they can think up new functions for the same tool. Compare this with the way humans use a knife for a whole variety of purposes. Another important difference is that chimpanzees are slow to pick up skills from other animals. In essence, chimpanzees lack the technical intelligence needed for manipulating and transforming physical objects. The general consensus seems to be that primates lack causal beliefs.

What served as the prime mover in the evolution of the human brain - technology or social behaviour - and what were the adaptive advantages that led to the evolution of causal beliefs? A key issue is the relationship between causal beliefs, tool use and language; there may have been a mutual positive synergy between all three, possibly linked by the development of tool use.... The evolutionary advantage of causal beliefs, obvious in young children as well as in all adults, may be related to the making of complex tools. I suggest the key factor is that one cannot make a complex tool without a concept of cause and effect. By complex I mean a tool that has a well characterised form for the use to which it will be put and, even more importantly, any tool made out of two pieces put together, like a spear with a stone head. It is only with causal beliefs that technology became possible, and it was technology - the ability to physically interact with the environment - that made life easier. Just consider things as simple as the basket and the wheel.

It is the technological path that we humans took that has separated us most profoundly from our primate ancestry and from our extant primate relatives. Our technological adaptation has been shaping our evolutionary trajectory in crucial ways for the past several million years.... Tool use was probably the most important adaptation in human evolution. There is even evidence that specific regions of the human brain are associated with tool use.

Robin Dunbar claims that there is a growing consensus that primate brain evolution has been driven principally by the demands of the social world, that is, by interactions with other members of the group in particular. He argues that human brain growth, language, and intelligent behaviour were evolutionary changes related to the increasing social complexity of hominid community life. This argument is partly based on the increase in size of that part of the brain, the neocortex, which correlates with social skills such as mating behaviour, grooming and social play. Dunbar found that changes in the size of the neocortex correlate with changes in social variables, such as group size, rather than with changes in the physical environment. For Dunbar, the evolution of language is intimately linked to its ability to facilitate the bonding of larger groups and cooperation within them. But without causal thinking about interactions of objects I find it hard to see how improved social understanding could have been a real advantage, or how it could have led to technology.

Human manipulative skills are much greater than those of apes, and this is genetically determined because it is an intrinsic property of the brain. It has been suggested that opposability of the thumb, and the associated wondrous dexterity, completely transformed our ancestors' relationship with external objects. This relationship could have promoted human consciousness as manipulation of objects became a self-conscious activity; once the individual becomes an agent operating on external objects in numerous different ways, causal beliefs are involved.

The use of fire was a major 'invention', but fire, and how to ignite it, presented a severe problem. Just how long ago humans began using fire is not clear, and estimates vary from 300,000 to 1.5 million years ago. Use of fire requires clear causal beliefs about how to ignite a fire and keep it going. The ability to ignite a fire may have evolved over many generations. Striking two flints against each other, or rubbing pieces of wood together to give a spark, would most likely have occurred by chance. It was the recognition that this would ignite a fire that was crucial. Before the discovery of ignition, fire had to be borrowed from natural sources.

No animal uses a container to carry food or water, though a captive chimp has been reported to use a coconut shell to carry water. Pots and bags are totally human. There have been many solutions to the problem of interacting with the environment in useful and quite simple ways, all of which require a concept of cause and effect. Consider, for example, 'simple' tools like digging sticks, which humans use in a complex way. Humans also spend hours tracking game, which clearly requires causal thinking. Planning ahead is essential, and trackers also needed an understanding of the environment they lived in, of both its animals and its plants.

What then was the crucial change in evolution that led to causal thinking and the ability to make tools and to interact with the environment - effectively the origin of technology? The relationship in evolution between tool using, causal thinking and language is an interesting but very difficult problem; one might have served to haul along the others. It is striking that tool use and language both appear in children at about eighteen months. All three involve what Calvin has referred to as stringing things together. Most theories see language as a help to learning how tools are used and made. But it is recognised that tools and language share some critical features - rule-governed behaviour and common sequencing neurology.

It is language that, in addition to causal beliefs, marks humans off from other species, and its role in human evolution and its relationship to tool use and causal beliefs is a central problem. Alarm calls by animals communicate but contain no real factual information; and while bees do communicate information they do not use a language. Unlike human language, animal communication systems are closed. People use language, not just to signal emotional states or territorial claims, but to shape each other's minds. Gestures may have been involved in the origin of language but on their own they are not a language. Gorillas have been observed to make some thirty different gestures, such as raising an arm before charging, putting hands on those of another, sucking their lower lip and then backing away, and poking another's body. Most elicit some response. Even so, the gestures never involve a third object, or point to objects distant from themselves. By contrast, their vocalisations are much less complex.

From an evolutionary point of view, language could not arise out of nothing. It had to evolve out of neural structures or cognitive abilities already present; motor control is an obvious possibility. But another possible origin relates to tool use and to having causal beliefs. It helps in thinking about causality if one has language, but having language itself requires causal thinking, for that is what, for example, many verbs relate to.... Thus causal thinking preceded and was an essential prerequisite for language development. So we are back to the importance of tools as the driving force. Perhaps the slow progress in tool complexity is related to language evolving only some 100,000 years ago. Language would help enormously with the construction and use of new tools, both in terms of cooperation and imaginative thinking.

A strong case can be made that a key step in human evolution that made Homo sapiens different from other primates was the acquisition of causal beliefs. Without such beliefs it would not have been possible for technology, which is the main driver of human evolution, to develop. Causal beliefs are essential for making complex tools and planning ahead, and all other mammals, including non-human primates, lack these abilities. That children develop causal beliefs during infancy shows that such beliefs have a strong genetic basis, though the neurological mechanisms involved are not yet known. The relationship between causal beliefs and the evolution of language, and changes in cognition and brain structure, remains a tantalising problem.

The Cooking Enigma, R. Wrangham

Coon wrote that cooking was "the decisive factor in leading man from a primarily animal existence into one that was more fully human." This perspective suggests that for humans, unlike the other 308 species, cooked food is a need rather than an option. Accordingly, our reliance on cooking results from certain features of human biology that have evolved in response to the control of fire, such as our small guts, small teeth, and slow life histories. From a dietary perspective, it means that humans are distinguished as much by what we do with our food as by the food sources themselves (whether meat, roots or grasses, for example). In short, this view sees Boswell's characterization of humans as the "cooking animal" as not only biologically but also evolutionarily significant.

The contrasting views on the role of cooking agree in at least one respect. Both acknowledge that cooking improves food. Some benefits vary across food types, such as reducing physical barriers, changing molecular structure, reducing toxin loads, and defrosting. Others appear to be consistent. For example, cooking leads to bursting of cells, making food molecules more available. It also tenderizes meat and softens plant foods, thereby making chewing easier. In addition, it reduces water content and increases the proportion of edible material. Exactly how these benefits translate into fitness has not been well established. However, current data suggest that they may lead to significant energetic savings.... By softening food and reducing meal size, therefore, cooking can be expected to reduce the cost of digestion, for example, by accelerating the digestive process. One measure of the rate at which foods are digested is the glycemic index, which assesses the rate of appearance of glucose in the blood following ingestion. As expected, the glycemic index is indeed consistently increased by cooking.

[Cooking] should increase the range of edible foods and therefore allow extension into new biogeographical zones. Other things being equal, it should also provide a more predictable food supply during periods of scarcity because it enables a range of otherwise inedible items to be used. It should have further effects by softening food. For example, it should lead to a greater ability of adults to provision infants, whose dentition is too immature to allow hard chewing, other than by giving milk. It should likewise cause a substantial drop in the amount of time that individuals spend chewing, with large consequences for the species' activity budgets.

In short, the adoption of cooking is expected to be accompanied by a series of large influences on various important biological systems, such as foraging behavior, digestive strategy, infant development, geographical range, and the regulation of social competition. The fact that cooking is a human universal, therefore, ought to be intensely provocative for students of human evolution because it raises the question of whether the practice of cooking has indeed influenced these and other systems.

On the one hand, cooking is absent among animals, universal in humans, and rich in potent biological consequences. It is therefore expected to have a strong impact on evolutionary biology. On the other hand, archaeological data place the acquisition of cooking at a time when nothing dramatic was happening in human evolution. The cooking enigma, therefore, is how cooking became a human universal without having visible effects on our evolutionary biology.

It is known that all human populations cook their food and that cooking consistently increases the palatability of food. It is also known that a diet of raw plant food creates substantial energetic problems for humans under even the best conditions. The implication is that humans are adapted to eating cooked food.

Humans are not known to be able to survive on raw food, which suggests that during our evolution, we became physiologically committed to eating foods of such high quality that in most circumstances they had to be cooked. However, important gaps in our knowledge concern whether humans can survive on a raw diet with sufficient meat, whether human guts are better adapted to raw meat or to cooked food, what effects cooking has on food quality, and the conditions under which cooking can be recognized archaeologically. First, although raw plant food is evidently a poor diet for humans, a sufficient inclusion of raw meat might, in theory, create an energetically adequate diet. The diets of Arctic foragers included a substantial component of raw meat, for example. Experiments are therefore needed to assess the optimal balance of meat and plants for humans to maximize caloric intake on a raw diet.

A major effect of cooking, accordingly, may be that by tenderizing or softening food, it shortens the digestive process and therefore both reduces the energetic costs of digestion and increases the rate at which total calories can be absorbed (if the individual maximizes the rate of ingestion). This prediction conforms to evidence that merely by eating a softer diet animals have such reduced energetic costs of digestion that they have significantly higher energy gain.

Evidence from raw-foodists indicates that under subsistence conditions humans would not survive long term on the kinds of raw foods that are available in the wild. This implies that, as indicated by our digestive systems, humans are adapted to dietary items of unusually high quality compared with other species and that humans' high diet quality normally comes from the food being cooked. Therefore, given that humans currently depend on cooking for their high-quality food, a key question is how long we have done so. There are three types of solution, each with its own puzzle. The Late solution suggests that cooking is recent, that is, probably less than 25,000 years ago. This has the merit of explaining why the advent of cooking did little to influence the course of human evolution. It faces the considerable difficulty, however, of explaining how humans could have exerted sophisticated control of fire for at least 250,000 years without using it to cook their food. The Late solution is therefore highly improbable.

The Sneak solution suggests that cooking has been practiced for at least 250,000 years without causing any dramatic changes in human body size, sexual dimorphism, tooth size, or gross morphology. This response to the archaeological evidence leaves unanswered the puzzle of how an adaptive change in diet with large apparent consequences for various biological systems occurred without leaving its mark on human anatomy. Although the Sneak solution is the standard answer to the cooking enigma, no serious attempts have been made to explain why cooking apparently had such small effects on human evolution. The Basal solution suggests that cooking has been practiced since the origin of H. erectus and was responsible for many of the morphological changes associated with the evolution of H. erectus. This fits the many indications that early Homo had an unusually high-quality diet (such as small teeth and jaws and long-distance locomotion). It faces the challenge, however, of understanding why archaeological sites before 500,000 years ago show no evidence of the control of fire that is sufficient to convince skeptics.

Language: A Darwinian Adaptation?, C. Knight, M. Studdert-Kennedy, J. Hurford

The central goal of Chomsky's work has been to formalise, with mathematical rigour and precision, the properties of a successful grammar, that is, of a device for producing all possible sentences, and no impossible sentences, of a particular language. Such a grammar, or syntax, is autonomous with respect to both the meaning of a sentence and the physical structures (sounds, script, manual signs) that convey it; it is a purely formal system for arranging words (or morphemes) into a pattern that a native speaker would judge to be grammatically correct, or at least acceptable. Chomsky has demonstrated that the logical structure of such a grammar is very much more complex and difficult to formulate than we might suppose, and that its descriptive predicates (syntactic categories, phonological classes) are not commensurate with those of any other known system in the world, or in the mind. Moreover, the underlying principle, or logic, of a syntactic rule system is not immediately given on the surface of the utterances that it determines, but must somehow be inferred from that surface - a task that may defeat even professional linguists and logicians. Yet every normal child learns its native language, without special guidance or reinforcement from adult companions, over the first few years of life, when other seemingly simpler analytic tasks are well beyond its reach.

To account for this remarkable feat, Chomsky proposed an innate 'language acquisition device', including a schema of the 'universal grammar' (UG) to which, by hypothesis, every language must conform. The schema, a small set of principles, and of parameters that take different values in different languages, is highly restrictive, so that the child's search for the grammar of the language it is learning will not be impossibly long. Specifying the parameters of UG, and their values in different languages, both spoken and signed, remains an ongoing task for the generative enterprise. By placing language in the individual mind/brain rather than in the social group to which the individual belongs, Chomsky broke with the Saussurean and behaviouristic approaches that had prevailed in anglophone linguistics and psychology during the first half of the twentieth century. At the same time, by returning language to its Cartesian status as a property of mind (or reason) and a defining property of human nature, Chomsky reopened language to psychological and evolutionary study, largely dormant since The Descent of Man.

As for the evolutionary debate, Chomsky has had little to offer other than his doubts concerning the likely role of natural selection in shaping the structure of language. This scepticism evidently stems, in part, from the belief (shared with many other linguists) that language is not so much a system of communication, on which social selection pressures might indeed have come to bear, as it is a system for mental representation and thought. In any event, Chomsky has conspicuously left to others the social, psychological and biological issues that his work has raised.

Our difficulties arise, according to Bickerton, because we have focused too heavily on communication instead of on more basic systems of underlying representation. Natural selection favours increasingly complex systems of perceiving and representing the world. This is because enhanced sensitivity to aspects of the environment predictably affords an animal advantages over its fellows. Eventually, however, curiosity, attention and long-term memory reach a point of development such that any further gain in knowledge of the world can come only from more complex representation, and this is what language provides. 'Language ... is not even primarily a means of communication. Rather it is a system of representation, a means for sorting and manipulating the plethora of information that deluges us throughout our waking life'.

How and when did the new representational system arise? According to Bickerton, the first step was taken by Homo erectus somewhere between 1.5 million and five hundred thousand years ago. This was the step from primate-style vocalizing into 'protolanguage', a system of arbitrary vocal reference that called only 'for some kind of label to be attached to a small number of preexisting concepts'. Bickerton's protolanguage is a phylogenetic precursor of true language that is recapitulated in the child, and can be elicited by training from the chimpanzee. Speakers (or signers) of a protolanguage have a referential lexicon, but essentially no grammatical items and no syntax. Bickerton justifies the concept of protolanguage as a unitary mode of representation, peculiar to our species, because it emerges, naturally and in essentially identical forms, through mere exposure to words. This happens not only in children under age two, but also in older children deprived of language during the 'critical period,' and even in adults obliged to communicate in a second language of which they know only a few words.

Pinker and Bloom... attributed the language module to unspecified selection pressures whose onset they traced to the Australopithecine stage. They exempted themselves from having to offer a more precise or testable theory by arguing that Darwinians need not address the emergence of novelty, being required only to provide evidence that a novel adaptation - once it has emerged - confers fitness. The two authors therefore by their own admission said 'virtually nothing' about language origins. They were satisfied with having established language as a biological adaptation, its evolution falling within the remit of standard Darwinian theory. We may easily suppose that the evolution of language is unproblematic since it seems so beneficial to all. Indeed, as Nettle has pointed out, Pinker and Bloom in their seminal paper clearly take this view: [There is] an obvious advantage to being able to acquire information second-hand: by tapping into the vast reservoir of knowledge accumulated by other individuals, one can avoid having to duplicate the possibly time-consuming and dangerous trial-and-error process that won that knowledge.

For a strategy to evolve, however, it must not only increase fitness, but also be evolutionarily stable. That is, there must be no alternative strategy which gives competitors higher fitness. In the case of information exchange, there are such strategies: individuals who deceive others in order to further their own interests, or who 'freeload' - enjoying the benefits of cooperation without paying the costs - will, under most circumstances, have higher fitness than those abiding by the social contract. In the light of what we know about the 'Machiavellian' manipulative and deceptive strategies of the great apes, it is far from self-evident that reliance on second-hand information would have been a viable strategy for early hominids. Or rather, unless there were additional mechanisms to guard against cheating on contractual understandings, it would seem that language could not have been adaptive.
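To make this logic concrete, a standard formulation may help; the stability condition below is Maynard Smith's textbook definition of an evolutionarily stable strategy, and the payoff scheme for information exchange is an invented toy illustration, not taken from the chapter. A strategy $S$ is evolutionarily stable if, for every alternative strategy $T \neq S$,

$$E(S,S) > E(T,S) \quad \text{or} \quad \big[\, E(S,S) = E(T,S) \ \text{and} \ E(S,T) > E(T,T) \,\big],$$

where $E(X,Y)$ is the payoff to an individual playing $X$ against an opponent playing $Y$. Suppose, hypothetically, that sharing information confers benefit $b$ on the hearer at cost $c$ to the speaker, with $b > c > 0$. With $S$ = "share honestly" and $T$ = "freeload" (accept information but never give it), a population of sharers yields $E(S,S) = b - c$, while a freeloader among them receives $E(T,S) = b$. Since $E(T,S) > E(S,S)$, honest information sharing is not an evolutionarily stable strategy unless something else - punishment of cheats, reputation, ritual commitment - alters the payoffs, which is exactly the gap identified here.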

Dunbar set out from the observation that primates maintain social bonds by manual grooming. Besides being energetically costly, this allows only one individual to be addressed at a time; it also occupies both hands, precluding other activities such as foraging or feeding. As group size in humans increased, multiplying the number of relationships each individual had to monitor, this method of servicing relationships became increasingly difficult to afford. According to Dunbar, the cheaper method of 'vocal grooming' was the solution. Reliance on vocalisation not only freed the hands, allowing simultaneous foraging and other activities, but also enabled multiple partners to be 'groomed' at once.

For Dunbar, the switch from manual to vocal grooming began with the appearance of Homo erectus, around two million years ago. At this early stage, vocalisations were not meaningful in any linguistic sense but were experienced as intrinsically rewarding, much like the contact-calls of geladas and other primates. Then from around four hundred thousand years ago, with the emergence of archaic Homo sapiens in Africa, 'vocalisations began to acquire meaning'. Once meaning had arrived, the human species possessed language. But it was not yet 'symbolic language'. It could enable gossip, but still fell short of allowing reference to 'abstract concepts'. Language in its modern sense - as a system for communicating abstract thought - emerged only later, in association with anatomically modern humans. According to Dunbar, this late refinement served novel functions connected with complex symbolic culture including ritual and religion.

For psychologist Merlin Donald and for neuroscientist Terrence Deacon, by contrast, the question of how humans, given their nonsymbolic primate heritage, came to represent their knowledge in symbolic form is the central issue in the evolution of language. The emergence of words as carriers of symbolic reference - without which syntax would be neither possible nor necessary - is the threshold of language. Establishment of this basic speech system, with its high-speed phonetic machinery, specialised memory system and capacity for vocal imitation - all unique to humans - then becomes 'a necessary step in the evolution of human linguistic capacity'. What selective pressures drove the evolution of the speech system? Donald starts from the assumption that the modern human mind is a hybrid of its past embodiments, still bearing 'the indelible stamp of [its] lowly origin'. Much as Bickerton takes the structureless word strings of modern pidgins as evidence for a protolanguage, Donald finds evidence for a prelinguistic mode of communication in the gestures, facial expressions, pantomimes and inarticulate vocalisations to which modern humans may have recourse when deprived of speech. 'Mimesis' is Donald's term for this analog, largely iconic, mode of communication and thought. The mode requires a conscious, intentional control of emotionally expressive behaviours, including vocalisation, that is beyond the capacity of other primates. We are justified in regarding mimesis, like Bickerton's protolanguage, as a unitary mode of representation, peculiar to our species, not only because it emerges naturally, independent of and dissociable from language, in deaf and aphasic humans unable to speak, but also because it still forms the basis for expressive arts such as dance, theatre, pantomime and ritual display. The dissociability of mimesis from language also justifies the assumption that it evolved as an independent mode before language came into existence.

Despite the current dominance of speech-based communication, we should not underestimate the continuing power of mimesis. Donald builds a strong argument for the necessity of a culture intermediate between apes and Homo sapiens, and for the value of a prelinguistic, mimetic mode of communication as a force for social cohesion. Homo erectus was relatively stable as a species for well over a million years, and spread out over the entire Eurasian land mass, its tools, traces of butchery and use of fire affording evidence of a complexity of social organization well beyond the reach of apes. Of particular importance for the evolution of language would have been the change in habits of thought and communication that a mimetic culture must have brought in its train. Mimesis, Donald argues, established the fundamentals of intentional expression in hominids, and laid the basis on which natural selection could act to engender the cognitive demand and neuroanatomical machinery essential to the emergence of words and of a combinatorial syntax as vehicles of symbolic thought and communication.

Language - including its distinctive representational level - is intrinsically social, and can only have evolved under fundamentally social selection pressures. Perhaps the most sophisticated, ambitious and elaborate presentation of this case was made by Terrence Deacon in his extraordinary book, The Symbolic Species, a work unique in its subtle meshing of ideas from the behavioural and brain sciences. Here, Deacon argues that language emerged concurrently with the emergence of social contracts. A contract, he observes, has no location in space, no shape or color, no physical form of any kind. It exists only as an idea shared among those committed to honouring and enforcing it. It is compulsory - one is not allowed to violate it - yet wholly nonphysical. How, then, might information about such a thing be communicated? Deacon's insight was that nonhuman primates are under no pressure to evolve symbolic communication because they never have to confront the problem of social contracts. As long as communication concerns only current, perceptible reality, a signaller can always display or draw attention to some feature as an index or likeness of the intended referent. But once evolving humans had begun to establish contracts, reliance on indices and resemblances no longer sufficed. Where in the physical world is a 'promise'? What does such a thing look like? Where is the evidence that it exists at all? Since it exists only for those who believe in it, there is no alternative but to settle on a conventionally agreed symbol. In Deacon's scenario, such a symbol would originally have been an aspect of the ritual involved in cementing the contract. Selection pressures associated with such novel deployment of ritual symbolism led to the progressive re-engineering and enlargement of the primate brain.

Deacon argues that the key contracts whose symbolic representation preadapted humans for linguistic competence were those through which human females, increasingly burdened by child care, managed to secure long-term commitment from males. This argument ties in closely with recent Darwinian theory premised upon potential male/female sexual conflict, and brings speculation about the origins of language into the domain of anthropology in its widest sense.... If Deacon is right, then his argument would add force to a growing contemporary awareness that language evolution must have been driven by strategies not just of cooperative males, but crucially of females. In any event, regardless of the fate of Deacon's detailed anthropological scenario, his work in 'putting it all together' has raised our collective sights, lifting us decisively to a new plane.

Many of the contributors to this book argue that linguistic communication emerges and varies as an expression of distinctively human coalitionary strategies. Such models acknowledge no incompatibility between the methodological individualism of modern Darwinism and the group-level focus of much social, cognitive and linguistic science.

The Evolution of Cooperative Communication, C. Knight

Where, previously, attention has focused on speech as the biological competence of individuals, here our themes are social. To study communication is inevitably to study social structure, social conflict, social strategies, social intelligence. Communication, as Robbins Burling observes in the next chapter, 'does not begin when someone makes a sign, but when someone interprets another's behaviour as a sign'. Reminding us of this elementary principle, Burling spells out the logical corollary: where the evolution of language is concerned, it is comprehension, not production, which sets the pace. Even a purely instrumental action, after all, may be read by others as a signal.

Where this has evolutionary significance, instrumental behaviour may then undergo modification in the service of novel, socially conferred, signalling functions. Chomsky's focus upon the innate creativity of the speaker has been enormously productive. But over evolutionary time, Burling points out, 'the only innovations in production that can be successful, and thus consolidated by natural selection, are those that conform to the already available receptive competence of conspecifics'. If Burling is correct, then that syntactical structure which so radically distinguishes speech from nonhuman primate signalling must have become progressively elicited and then consolidated by generations of comprehending listeners. First, conceptual complexity is 'read into' signalling by the attentive mind reader; subsequently, the signaller - given such encouragement - may succeed in externalising aspects of that complexity in the signal itself.

Dessalles sets out to delineate more precisely the cooperative social matrix in which speech must have evolved. With Dunbar, Deacon and many others, he posits an evolutionary background in which increasingly large, stable coalitions engage in group-on-group competition and local conflict. The decisive selection pressure is status-linked social inducement to provide information relevant to the concerns of one's own group. Dessalles accepts that such coalitionary activity amounts to cooperation, driven by strategies of reciprocal altruism which are a precondition for the evolution of speech.... In competing against the out-group, each coalition seeks to allocate internal status exclusively in return for relevance. Rather than displaying altruism, therefore, conversationalists - like contestants in any competitive board game - strive to win through linguistic 'moves' capable of earning status while diminishing the relative significance of rival contributions... Internally, signallers may now avail themselves of a novel opportunity - to compete in producing messages valued by other members of their group. As Dessalles concludes: 'Social status among humans is not extorted by brute force. It emerges from others' willingness to establish social bonds with you. The decision to become closer to somebody is taken according to definite criteria. Linguistic relevance may be an essential component of this choice'.

If we are seeking a primate precursor for speech creativity and combinatoriality, Knight suggests that the most convincing candidate is primate play. But if conversational speech - including, in the human case, humour - extends and develops the creative, combinatorial potential of immature primate play, then we must ask how the conditions for such creativity came to be extended into adulthood during the course of human evolution. For Knight, the key factor acting to deny animals freedom to play is reproductive competition and conflict. The onset of sexual maturity brings with it the Darwinian imperative to engage in potentially lethal sexual competition. In the primate case, this impinges upon life concurrently with sexual maturity, setting up anxieties, divisions and status differentials which permeate and effectively constitute adult sociality. If imaginative playfulness diminishes in frequency, it is because autonomous, freely creative expressivity is simply not compatible with a situation in which individuals feel anxious or externally threatened. Admittedly, adult primates - most notably bonobos - do sometimes play with one another. But as competitive stresses intensify, the dominant tendency is for play fights to give way to real ones. On a more general level, by the same token, involvement in shared make-believe yields to a more narrow preoccupation with the serious competitive imperatives of adult life.

Among humans, however, the transition to adulthood takes a different form. Human offspring go through an extended period of childhood followed by adolescence. During this extended period, the young are enabled to rely to a considerable extent on social as opposed to 'fend-for-yourself' provisioning. Hunter-gatherer ethnography demonstrates in addition that at a certain point, young adolescents become coercively incorporated into ritual coalitions. Rites of initiation - central to intergenerational transmission of human symbolic culture - may be viewed as a modality of animal play. In fact, they are spectacular 'pretend-play' performances, drawing on hallucinatory techniques such as trance, dance, rhythm, face painting and so forth. Whether or not genital mutilation is involved, the declared aim is to curb individualistic pursuit of sexual advantage. Bonds of coalitionary solidarity, typically modelled on sibling solidarity, are accorded primacy over sexual bonds.... Knight argues that with the emergence of Homo sapiens, the childhood significance of kinship indeed became preserved within adult sociality, overriding sexual bonds and thereby opening up a new social space within which language - an extension of the creativity of primate play - could now for the first time flower.

The Emergence of Phonetic Structure, M. Studdert-Kennedy

The properties that afford reliable transmission from speaker to hearer are also those that afford reliable transmission from one generation to the next, from adult speaker to child hearer/learner. It is this aspect, transmission across generations by learning, that has enabled language to evolve, in perhaps no more than some tens of thousands of generations, from inarticulate cry to articulate speech.... The child's perceptual, articulatory and cognitive capacities are the filter through which words must pass from one generation to the next. That is one reason why the ontogeny of words offers our best, perhaps our only, natural model of their phylogeny. Indeed, initial steps in the emergence of language have proved recalcitrant to evolutionary theory precisely because we have lacked, until recent decades, a reliable description of how the infant progresses from gurgling to babbling to spoken words.

Donald posits a preverbal mimetic stage of symbolic culture linking primate modes of episodic cognition with the purposive culture of verbal Homo sapiens. Mimesis is an analog mode of representing events or acts by means of bodily posture, expression and gesture, and is still a medium for much human communication. According to Donald, mimesis established 'the fundamentals of intentional expression' in hominid groups. On this view, the capacity to observe the meaning of a verbal symbol arose from its precursor in comprehending mimetic action - although the leap from iconic representation in mimesis to arbitrary representation in language is still a puzzle. Similarly, mimesis is said to have put in place 'the fundamentals of articulatory gesture' - although again the move from analog iconic mimesis to digital articulatory imitation still had to be made, presumably through differentiation of the vocal machinery.

Vihman and DePaolis find a functional parallel to mimetic communication in early caretaker/child interactions that set the scene for the move into language. They also find a plausible parallel to the likely lengthy process of evolving imitative capacity in what they term 'the articulatory filter' that seems to shape a child's early words. The filter is the perception-production link, rooted in proprioceptive and auditory feedback from early sound making and babbling, by which a child initially selects for imitation from the rich supply of adult words only those sound patterns that match phonetically the patterns it can already form. The child's articulatory filter parallels the evolutionary bottleneck of emerging imitative capacity through which early hominid language would have had to pass.

The carryover of child forms into adult language reminds us again that language development, like its evolution, is social, an extended process of adaptive interchange. Not only do learners adapt to language, but language adapts to learners. Language is then an epitome of its own evolution, a summary record of its passage through successive generations of learners. At the same time, the precise course of development varies across language communities. Thus, despite presumably universal articulatory constraints, languages differ in their phonology. Each language has come upon one of an indefinitely large number of solutions to the problem of adapting phonetic structure to the same finite vocal machinery.

Studdert-Kennedy proposes that a critical step in the evolution of the discrete phonetic structures that support the transmission of words was the evolution of a capacity for vocal imitation, unique among primates to humans. Imitating an utterance entails analysis of a sound pattern into its underlying articulatory components (gestures, segments, syllables), storage of the components for a shorter or longer period, depending on the interval between model and copy, and reassembly of the components in correct sequence. Notice that the meaning of the utterance plays no part in the process. Here perhaps, in the act of imitation, Studdert-Kennedy argues, is where phonetic form and semantic function were first dissociated in hominid communication. The dissociation was essential for an elaborated system of learned arbitrary reference, and its consequences ramified throughout what eventually became language. From it arose independent levels of phonetic representation and memory, prerequisite for displaced reference, for the production and comprehension of syntax, and even, many thousands of years later, for writing and reading. Arguably, then, vocal imitation was the point of breakthrough from Donald's analog mimesis into the discrete verbal symbolism that launched the entire linguistic enterprise. On such an account we would not postulate consonants, vowels and their descriptive features as axioms, but would derive them, no less than syllables, from prelinguistic perceptual and articulatory constraints on the imitative machinery.

The Emergence of Syntax, J. Hurford

One might have expected syntactic theorists to have been prominent in the literature on human evolution, given that what is most remarkable about humans is their capacity for syntactically complex language. But until very recently, syntactic theorists have kept away from evolutionary theorising. This avoidance has tended to apply to linguists in general; paradoxically, mainstream scholarly linguists have typically been outnumbered in speculation on the evolution of language by anthropologists, psychologists and palaeontologists (witness the list of contributors to Lock and Peters's massive handbook).

Why have linguists traditionally been so reticent in contributing to evolutionary debates? The very complexity of human languages, especially their syntactic components, of which linguists above all (and one might even say only linguists) have been fully aware, is a severe obstacle to theorising. Although evolutionary biology is a well-established field, one notices a similar reticence among many biologists to engage in evolutionary speculation, because biologists, above all, know how complex the explananda are. Just characterising the intricacies of human syntax has been work enough for linguists, let alone worrying about how it all might fit into an evolutionary scenario. But the time for turning to the evolution of such complexity had to come.

Pinker and Bloom argued that the innate human capacity to acquire language was likely to have been selected by orthodox Darwinian processes. Thus it seemed, at the beginning of the 1990s, that work on the evolution of syntax was set to take a decidedly adaptationist (and biological) course.... None of the chapters in this part argue for biological adaptation of brain structures as the central mechanism behind the emergence of complex syntax. None of them deny the existence of selective pressures, either, but adaptation is not their prime focus. Thus, Carstairs-McCarthy's central argument is that a particular feature of language, for a long part of its prehistory, was not well adapted. Derek Bickerton advances an explicitly exaptationist position: (much of) syntactic complexity is built on nonlinguistic structure that existed before. Alison Wray also argues explicitly that much of grammar is dysfunctional for day-to-day communicative activities. The computational models of Simon Kirby and Jim Hurford are consonant with Bickerton's view, as they assume mental representational structures which preexist communication, and which give rise to linguistic structures.
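Since the text only names these simulations, a brief sketch of what an iterated-learning model of this general kind looks like may be useful. The code below is purely illustrative and is not Kirby's or Hurford's actual model: the meaning space (agent/action pairs), the syllable inventory, the naive "split the signal in half" segmentation rule, and the bottleneck size are all invented for the sketch. What it does show is the shared architecture: meanings exist prior to any signals, each generation learns only from a sample of the previous generation's utterances, and learners must generalise or invent forms for meanings they never observed.

```python
import itertools
import random

# Meanings exist prior to communication: simple agent/action pairs stand in
# for pre-existing mental representational structure (illustrative only).
AGENTS = ["lion", "gazelle", "hunter"]
ACTIONS = ["runs", "sleeps", "eats"]
MEANINGS = list(itertools.product(AGENTS, ACTIONS))
SYLLABLES = "bdgklmnptvz"


def random_word(length=4):
    """Invent an arbitrary signal for a meaning with no remembered form."""
    return "".join(random.choice(SYLLABLES) for _ in range(length))


class Learner:
    """Memorises observed meaning->signal pairs; for an unseen meaning it
    reuses remembered sub-strings for the parts it knows (a crude stand-in
    for generalisation) or invents a new form."""

    def __init__(self):
        self.lexicon = {}      # whole meaning -> whole signal
        self.part_forms = {}   # meaning element (agent or action) -> sub-string

    def learn(self, observations):
        for meaning, signal in observations:
            self.lexicon[meaning] = signal
            half = len(signal) // 2
            # Naive segmentation: first half ~ agent, second half ~ action.
            self.part_forms.setdefault(meaning[0], signal[:half])
            self.part_forms.setdefault(meaning[1], signal[half:])

    def express(self, meaning):
        if meaning in self.lexicon:
            return self.lexicon[meaning]
        agent, action = meaning
        if agent in self.part_forms and action in self.part_forms:
            return self.part_forms[agent] + self.part_forms[action]
        return random_word()


def transmit(generations=20, bottleneck=6):
    """Pass the emerging language down a chain of learners, each of whom
    observes only a sample of the previous generation's utterances."""
    teacher = Learner()
    for _ in range(generations):
        shown = random.sample(MEANINGS, bottleneck)
        pupil = Learner()
        pupil.learn([(m, teacher.express(m)) for m in shown])
        teacher = pupil
    return teacher


if __name__ == "__main__":
    random.seed(1)
    final_speaker = transmit()
    for meaning in MEANINGS:
        print(meaning, "->", final_speaker.express(meaning))
```

In the published models, chains of this sort tend to converge on increasingly decomposable, re-usable signal systems as they pass repeatedly through the learning bottleneck; this toy version is meant only to illustrate the transmission set-up, not to reproduce that result.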

Carstairs-McCarthy draws attention to significant structural parallels between syllable structure and simple clause structure. It is suggested that the human capacity for mental representation of signals was constrained, at both the syllabic and the clausal level, and for many millennia, by the same structural straitjacket. For Carstairs-McCarthy, the step to modern syntactic structures came later, with a capacity to represent recursively embedded structures. Wray argues cogently that protolanguage is still with us. There has been a tendency to talk of Bickertonian protolanguage 'giving way to', or being wholly supplanted by, modern language. Bickerton concurs with Wray that such was not the case: 'there is no need to suppose that in catastrophic changes of state, one state supersedes or abolishes the other.... We need not imagine parents who spoke only protolanguage with children who spoke like us'. Wray's and Bickerton's views coincide on the implications of the high cost of processing grammatically articulated language.

Bickerton's implicit assumptions about the nature of the semantic structure preexisting modern language are quite different from Wray's explicit view of the semantics of protolanguage. Bickerton envisages discrete and distinct mental categories corresponding to predicates and their arguments; in protolanguage there could have been 'words' corresponding to argument concepts and others corresponding to predicate concepts. The step from protolanguage to language involved externalising the syntax of semantic representations (more or less like predicate logic, without quantifiers, for Bickerton), so that, roughly, the forms corresponding to predicates became verbs, the forms corresponding to arguments became nouns, and the larger forms expressing embedded propositions became subordinate clauses. Thus, for Bickerton, the seeds of reference and predication are already present in the semantics of the protolanguage.

Wray's view of the semantics of protolanguage is quite different, with messages being essentially holistic speech acts, centrally pragmatic in function, and without inbuilt reference and predication. For Wray, reference and predication emerge with the segmentation of protolanguage utterances into smaller meaningful parts. This illustrates another difference between Bickerton and Wray. One can identify two views of the move from protolanguage to language, which one can label 'synthetic' and 'analytic'. Bickerton takes the synthetic view. The original words of protolanguage had meanings which became the atomic constituents of the meanings of the larger utterances of full language. The original words of protolanguage were strung together to make the phrases and sentences of full language. Wray takes an analytic view. The original words of protolanguage had meanings which became the meanings of high-level constituents in full language. The original words of protolanguage were dissected into parts which came to express the atomic meanings of full language.

The Evolution of Social Organization, C. Key, L. Aiello

Cooperation is a fundamental characteristic of human social life. Cooperative bonds between mothers and offspring, husbands and wives, infants and grandmothers, sisters and brothers and unrelated friends of both sexes form the backbone of our social world. Language and ritual may have evolved to help us develop and maintain our relationships with individuals whose goals and desires may be very different from our own. Starting from its primate roots, this chapter explores the evolution of cooperation in humans. One of the most distinctive features of primates is their tendency to live in social groups. While group living must confer benefits on individual members, the costs are not slight. Direct costs result from competition between group members for food, mates and other resources. In addition, indirect costs arise as group members must coordinate and compromise their activities in order to maintain group cohesion. To balance this competition a high degree of cooperation is required. In many primate species, position in the social hierarchy and reproductive success are dependent upon the ability to establish and maintain cooperative alliances with other group members. For instance, pairs of subordinate male baboons form partnerships in order to steal females away from more dominant males.

Cooperation in humans involves a wider network of individuals and a greater diversity of behaviours than in any other single primate species. We develop vast cooperative networks that include nonkin as well as kin, and that cross the boundaries of both age and sex. Our network of friends and relatives is typically so large that Dunbar argues that our large brains have evolved to help us keep track of these relationships, and that we use language as an efficient method of forming and maintaining social bonds. In fact, language may be one of a number of cognitive mechanisms that have evolved to manage complex cooperative relationships.

Connor and Norris suggest that reciprocal altruism requires a 'theory of mind', the ability to infer the mental states of others and to act upon and manipulate their beliefs and desires. Certainly, of all nonhuman primates, chimpanzees are the most likely to possess theory of mind and to use this ability to deceive other individuals. But human theory of mind surpasses that of chimpanzees and may have evolved to further enable us to negotiate a social world in which the benefits of cooperation and the dangers of deception are profound. Humans are particularly good at reasoning about social interactions and at identifying and remembering the faces of people who are likely to be cheats. After just thirty minutes of interaction, humans are adept judges of which individuals are likely to be cooperative and which are likely to cheat.

The need to cooperate, and protect ourselves from individuals who could exploit our cooperation, appears to be part of our evolved psychology. Frank argues that cooperation is so important to us that emotions such as guilt and love, and social values such as trust and honesty, have evolved to prevent us from succumbing to the short-term temptation to cheat and thus protect the long-term advantages of cooperation.

Female—female cooperation is the easiest type of cooperation to establish, since females share a common goal, namely to provide their infants with the best possible resources. Females can exchange similar altruistic acts, such as suckling or provisioning of meat, which are easier to monitor than non-symmetrical exchanges of different goods or services. Cooperation between males and females is much more difficult to establish and is likely to be much less common than intra-female cooperation since the currencies of exchange are usually very different. For instance, Ligon suggests that males may exchange offspring care for mating rights from the female. But the female has no guarantee that the male will provide post-partum care for her offspring, and the male has no guarantee that he is the father of the female's offspring.... If males are investing in females and their offspring, it is to their advantage to try to target those in which they have a direct reproductive interest. Cooperation and reproduction may become closely associated as in chimpanzees and bonobos, where food exchange and copulation sometimes occur simultaneously.

Human cooperative behaviours are, at least partly, a legacy of our primate origins. Coalitions of females form the basic structure of many primate groups, especially Old World monkeys. Male care-giving is observed in 15 per cent of primate species, most notably the callitrichids and cebids. It is rare to find both intra-female cooperation, especially among nonrelatives, and male care in the same social group. Yet, this is exactly the pattern of cooperative bonds that we find in humans. In fact, humans have a unique social structure in which males and females form pair bonds which involve extensive amounts of cooperation, within the context of a large multi-male, multi-female group.

The structure of human social groups emerges from strong cooperative bonds between individuals of both the same sex and different sexes. The challenge, then, is to understand how this highly unusual social system may have arisen. The discussions above suggest that patterns of cooperation are dependent upon the energetic costs of reproduction of males and females. Thus, if it is possible to make predictions about the energetic costs of reproduction from fossil and archaeological material, it may then be possible to speculate on the evolution of cooperation in the ancestral hominids. As will be shown below, both postcranial and cranial evidence provide valuable clues as to how the energetic demands of the hominids may have changed over time.

The energetic cost of reproduction is defined as the sum of the energetic costs of every activity that contributes to the production of a single surviving offspring. For females these costs will primarily be associated with the production of gametes, gestation, lactation and child care. Male costs arise from the production of gametes, courtship, male-male competition and, in some instances, child care. For both sexes, body condition is also an important determinant of reproductive success. Since the costs of body maintenance are closely related to body size, the relationship between male and female energetic costs is largely related to differences in body mass. In species where males and females have similar body mass, the energy requirements of females are greater than those of males due to the demands of direct parental care. However, when males are at least 50 per cent larger than females, the energetic costs to the male of maintaining a larger body balance the female's energetic costs, so that the total cost of reproduction for each sex is more or less equal. In short, female energetic costs are highest, relative to male energetic costs, in species with the lowest levels of body mass dimorphism.
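To make the dimorphism argument concrete, a rough worked illustration may help (my own back-of-the-envelope sketch with invented numbers, not a calculation from Key and Aiello). Suppose maintenance cost scales roughly with body mass, and let direct care (gestation, lactation, child care) add a load borne only by the female:

    \[
    \text{Cost}_{\text{female}} \approx k\,m_{f} + C_{\text{care}}, \qquad
    \text{Cost}_{\text{male}} \approx k\,m_{m}.
    \]

If, for illustration, \(C_{\text{care}} \approx 0.5\,k\,m_{f}\), then a male 50 per cent heavier than the female (\(m_{m} = 1.5\,m_{f}\)) carries roughly the same total cost as she does, matching the 'balance' described above; but at 20 per cent dimorphism (\(m_{m} = 1.2\,m_{f}\)), as estimated below for Homo erectus, the female's total cost exceeds the male's by about a quarter. The particular constants are invented; the only point is that reduced dimorphism shifts the relative energetic burden towards females.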

Changes in sexual dimorphism in body mass can be determined from the fossil record. McHenry estimates that males were around 50 per cent larger than females in Australopithecus afarensis and 40 per cent larger in Australopithecus africanus. The most significant change in body mass dimorphism occurs with the appearance of early Homo. Homo erectus males were just 20 per cent heavier than females, indicating an important shift in the balance of energetic costs between the sexes. It is significant that the change in sexual dimorphism in Homo erectus is the result of changes in both male and female body size. Both sexes increased in size, with the greatest increase occurring in females, possibly in response to thermoregulatory demands. It seems unlikely that the decrease in sexual dimorphism was in response to a change in social system involving reduced intra-sex competition since this would be expected to involve changes in male body size alone. However, changes in sexual dimorphism that increased the relative energetic load on the female may have contributed to a change in social behaviour involving more cooperation both between females and between males and females.

The adaptive complex of an increase in brain size and a reduction in gut size, mediated by a change to an animal-based diet, implies a profound change in the energetic costs of reproduction for females. Firstly, an increase in brain size directly increases the energetic load on the mother, since the main period of brain growth occurs in utero and during the postnatal period prior to weaning. Secondly, a change to a diet with a high meat component requires that females provision their offspring until they have gained the necessary skills to acquire meat for themselves. The dual loads of extensive food-sharing between mother and offspring, and the training necessary for the offspring to find its own resources, would significantly increase the period of maternal investment beyond the weaning period.

In short, there is every reason to believe that the energetic loads on Homo females were much greater than had been the case for their smaller-bodied, more sexually dimorphic, smaller-brained ancestors. In the previous discussion it was argued that intra-female cooperation is most likely when female energetic costs are high. In particular, it was shown that female cooperation in the care and feeding of offspring is especially important in social carnivores. Similarly, the combination of high energetic costs and a shift to a meat-based diet in Homo seems very likely to have been accompanied by an increase in female—female cooperation. Hawkes et al. suggest that menopause and long post-menopausal lifespans may have evolved as part of such a cooperative strategy. They found that senior post-menopausal Hadza women play an important role in provisioning their daughters' offspring. The benefits of this are clear for the child, the mother and the grandmother. With more provisioning the child would be expected to have higher survival. The mother is relieved of some of the burden of providing food, reducing her energetic stress and shortening her inter-birth interval. Finally, the decreased mortality of the child and the increased fecundity of the mother equate to higher inclusive fitness for the grandmother. The changes in sexual dimorphism in body mass and an increase in brain size seen in Homo erectus are indicative of a fundamental change in energetic requirements. At the very least, it seems likely that Homo erectus females were highly cooperative, particularly with respect to infant care.

However, female energetic costs were not only increasing in absolute terms, but also with respect to male energetic costs. This would suggest that there would also be selection for increased paternal investment. In Homo erectus, this imbalance in male and female energetic costs may have been adequately buffered by female cooperation. However, between 500,000 and 100,000 years ago there was an exponential increase in brain size which would have escalated female energetic costs far beyond those found in Homo erectus. During this time period, the first unequivocal evidence of large game hunting also appears in the archaeological record, the earliest finds at Schöningen and Boxgrove dating to between 500,000 and 400,000 years ago. It seems likely that during this period there would have been strong selection for male cooperation, particularly for providing animal food for females and their offspring.

This chapter has examined the primate origins of human social structure, concentrating on the evolution of cooperation between females and between males and females. There has been much debate in the current literature regarding the function of cooperation in human societies. Male hunting behaviour has been the focus of much of this attention, in particular whether it is a reciprocal behaviour or a male mating strategy. However, whatever function may be ascribed to cooperation, it is clearly a central feature of human society, to such an extent that we have even evolved specialized psychological mechanisms to negotiate our complex social networks. The development of pair-bonding and paternal care within the context of large multi-male, multi-female groups placed unique cognitive demands upon our hominid ancestors, elevating our capacity for altruism, deception, culture, communication and knowing other people's minds beyond anything yet observed in nonhuman primates. We conclude that these human patterns of cooperation are the result of changes in the energetic costs of producing large-brained offspring for males and females in association with a change to an animal-based diet.

Symbolism and the Supernatural, S. Mithen

Humans are quite [unique in their belief in supernatural beings]. In both the past and the present a great deal of time and effort has been spent in worshipping gods and spirits. Fortunately, this effort has not been wasted as it has led to some of the most impressive cultural achievements of humankind including great architecture, art, music and literature; unfortunately, however, it has also resulted in some of the most appalling episodes of violence and suffering due to the intolerance that people have for those with different religious beliefs to their own. The material symbols involved in religious behaviour, especially those that represent supernatural beings, appear to capture the epitome of the human symbolic capacity. Not only does an image of a deity represent something that is not present in time and space; it represents something that could not be present. Hard, tangible objects, such as carvings in stone, are used to symbolize intangible ideas and concepts: people appear to have no difficulty in understanding such symbolic links.

As the propensity to have religious ideas appears to be both pervasive among human beings, and quite unique to our species, this type of thinking provides a clear challenge to those who believe that viewing the human mind as a product of biological evolution can throw light on the nature of human thought and behaviour. Other aspects of the mind are much more readily understood as products of evolution. Consider, for instance, thoughts about food choice and sexual partners. Few would challenge the notion that these have been influenced by our evolutionary past as they have direct and immediate consequences for reproduction and survival. When examining these types of behaviour precisely the same methods that are used to study food choice and mating patterns in nonhuman animals can be applied. This has indeed thrown considerable light on human behaviour. Perhaps the best example is the study of foraging behaviour by modern hunter-gatherers which has been tackled by using the theoretical and methodological approaches of foraging theory originally developed for non-human animals. In this regard a robust methodology exists for examining the food choice behaviour of human beings from an evolutionary perspective. But no such methodology exists for human religious behaviour.

In this chapter I will suggest some possible directions for research on the evolution of symbols which relate to belief in supernatural beings. I will consider how we might conceive of 'protosymbolic' activity by Early Humans, and how this may have a different cognitive basis to the symbolic capacities of modern humans. I will also address the role of material artifacts in the conception and transmission of religious ideas, especially those concerned with the supernatural. My argument will be that material artifacts function as anchors for these ideas in the mind, and without them the development of religious institutions and thought about the supernatural are severely constrained. My first step is to clarify the problem that religious thought poses to the evolutionary anthropologist.

Belief in the existence of supernatural beings is pervasive among living human communities, and has been among those documented throughout historical times. The prehistoric archaeological record with its tombs and effigies suggests that many of those people who lived before the time of writing had also believed in supernatural beings. Indeed, it could be argued that belief in the supernatural is universal among human groups - or at least has been until the emergence of atheism in the very recent past. As I have already noted, this widespread belief in the supernatural poses major problems to those who believe that many of the critical features of being human can be explained by recourse to evolutionary theory. An evolutionary approach has been characterized as assuming that our thought processes have been honed by natural selection to match the realities of the world — those individuals for whom this was the case would have benefited from increased reproductive success. From this perspective, those who prayed to gods for rain, rather than spending their time building irrigation channels, would hardly have managed to spread their genes in the next generation when under competition from those who set to work with the spade.

I have as yet made no attempt to offer any definitions of religious behaviour; this is, of course, fraught with difficulty. At most one can perhaps highlight four beliefs which are of greatest significance to that way of thinking which we call religious: (1) the belief in non-physical beings; (2) the belief that a non-physical component of a person may survive after death; (3) the belief that certain people within a society are likely to receive direct inspiration or messages from supernatural agencies, such as gods or spirits; (4) the belief that performing certain rituals in an exact way can bring about change in the natural world.

I think... material symbols are critical not just to the sharing of religious beliefs but in their conceptualization within the mind of an individual. As such, the archaeological record of material symbols (assuming that it is not biased by preservation and discovery) should provide a true record for the emergence not just of shared religious ideologies, but for the first emergence of religious ideas themselves. To support this assertion we need to consider the evolution of the human mind, with specific regard to how religious ideas may arise.

In contrast to the view that 'primitive thought' was originally religious in nature, the archaeological evidence suggests that religious ideas and ritual activities appeared relatively recently in human prehistory. Although the first members of Homo are present in the fossil record 2.5 million years ago, the first unambiguous evidence for religious ritual is only associated with the appearance of anatomically modern humans not more than 100,000 years ago, in terms of what appear to be ritualized burials in the caves of Skhul and Qafzeh in the Near East.... As much as 60,000 years elapsed before any symbols appear to have been manufactured. The first of these appears just 30,000 years ago at the cave of Chauvet in France, and Hohlenstein-Stadel in Germany. Both of these sites have produced images of half-man/half-animal figures which are likely to be representations of supernatural beings from an ice age mythology.

Other than ambiguous pieces of scratched bone, incised stones and ochre, the archaeological record prior to 50,000 years ago lacks artifacts of a symbolic nature. We must always entertain the possibility, of course, that a considerable amount of symbolic behaviour was being undertaken in the form of dance, song and artifacts made from organic materials. As such, there would be no trace of these left in the archaeological record. As is so frequently stated: the absence of evidence is not evidence of absence. So what can we do about this as archaeologists? My own position is simply to argue that it is inconceivable that such symbolic activities could have been present, but not have also been expressed in ways that did indeed leave an archaeological trace. It is, I think, most prudent to adopt a cautious and conservative interpretation of the archaeological record for symbolic activity. Otherwise there seems no constraint on attributing symbolic dances, songs and feather headdresses not only to Neanderthals and archaic H. sapiens, but also to H. ergaster and the Australopithecines.

The period from 60,000 to 30,000 years ago appears to mark some form of threshold in human cognitive development, in light of the changes in the archaeological record at that time which are apparent throughout the world, including the colonization of arid regions and technological developments, as well as the first representational art. It appears to me profitable to treat the Early Palaeolithic artifacts I have described as evidence for a 'protosymbolic' capacity, in the same manner as one might refer to chimpanzee language use in laboratory conditions as evidence for a 'protolanguage'. Just as chimpanzee 'language' has some similarities to human language but appears to be far too simple to be placed into the same category, so too do the symbolic artifacts of Lower and Middle Palaeolithic Early Humans appear far too simple to be placed into the same category as the symbols of modern humans. Chimpanzee language most probably derives from quite different cognitive capacities than human language — possibly no more than a capacity for associative learning. I suspect that Early Human symbolic behaviour also derives from associative learning, and the fact that it never went beyond body marking with ochre, or non-repeated incised lines, reflects the constraints on symbolic behaviour that arise when it has such a cognitive basis. The symbolic behaviour of modern humans, like their linguistic abilities, most likely derives from quite different cognitive abilities.

Early Palaeolithic body painting is a particularly good example of something that might be characterized as protosymbolic behaviour. Although a symbol is notoriously difficult to define, one essential feature is a degree of displacement between the signifier and the signified in terms of space and/or time. It is for this reason that we do not refer to facial expressions as symbolic in nature: although the muscular contractions of my face to produce a smile signify that I have a bodily sensation of happiness, there is no displacement between the smile and my body and consequently the smile is not truly symbolic. So if Early Palaeolithic body painting was used to exaggerate or draw attention to various bodily characteristics - such as the size of breasts or muscles, or the redness of lips - such body painting should not be described as truly symbolic as there is no displacement between the signifier and signified. Using the same ochre pigment for painting images on cave walls is quite different.

I would argue that without material symbols, there is a significant constraint on the extent to which religious ideas and conceptualizations of supernatural beings can be shared. To explain this, we need to consider the role of material objects in both the formation and the transmission of religious ideas. But let us first briefly consider the appearance and character of objects in the archaeological record which undoubtedly undertook this function. The very first art we possess appears to be intimately associated with religious ideas by containing images of what are likely to be supernatural beings. The earliest piece is a 28.1 cm high carving in mammoth ivory of a figure half man and half lion, from Hohlenstein-Stadel and dating to c. 33,000 years ago. Contemporary with this are the paintings in Chauvet Cave which include a half-human/half-bison figure. Such anthropomorphic figures persist in Palaeolithic art even as the major animal themes change from carnivorous/dangerous animals to herbivores... As we move beyond the Palaeolithic, anthropomorphic figures continue as a critical part of the archaeological record, such as the human/fish images from Lepenski Vir. During later prehistory, figurative images pervade prehistoric art and are most readily interpreted as images of supernatural beings.

In summary, there can be little doubt that after 30,000 years ago religious ideas, ritual activity and material symbols pervade all human societies. This date is, of course, somewhat arbitrary but is a time when anatomically modern humans would have been dispersed throughout most of the Old World and entered Australia, and unambiguous evidence for symbolic behaviour exists in Africa, Asia, Europe and Australasia. Even though the meanings associated with the figurative or abstract images that are found cannot be inferred, there can be little doubt that the majority of this art related to religious ideas. Why should there be such a compulsion to represent religious ideas in material form? And could religious ideas have been held by those pre-modern hominids who were responsible for the pit of bones and the Middle Palaeolithic burials at a time when material symbols appear absent? To answer such questions we must consider the cognitive origins of religious thought.

It is the human propensity to bring together knowledge that 'naturally' resides in quite separate cognitive domains — cognitive domains about material objects, living things, the human social world — that underlies the ability to create ideas about supernatural beings. When I say 'naturally' resides, I am referring to the notion that for the majority of human evolution, thought was of a domain-specific character, with limited, if any, integration of knowledge and ideas from different cognitive domains. That notion suggests that Early Human ancestors and relatives had at least three specialized cognitive domains, which I have referred to as 'social', 'technical' and 'natural history' intelligence. These are characterized as bundles of interacting mental models resulting in complex activity in each of those behavioural domains. As such, it was an excellent cognitive adaptation for living in complex natural and social environments, as is testified by the success of Early Humans in colonizing such large parts of the Old World for almost two million years of the Pleistocene.

Modern humans acquired what can be described as cognitive fluidity. This evolutionary transition to a cognitively fluid mind is similar to a transition that occurs during cognitive development within an individual, one that has been described in various ways, including as 'mapping across domains' and the emergence of 'representational redescription'. With regard to cognitive evolution, cognitive fluidity was of immense adaptive value. It allowed, for instance, technical and natural history intelligence to be integrated so that specialized hunting weapons could be designed. By combining elements of social and technical intelligence, items could be manufactured which conveyed social messages, such as beads and necklaces. This capacity for cognitive fluidity, the emergence of which is likely to be closely tied up with that of language and consciousness, gave Homo sapiens sapiens considerable adaptive advantage over other species of Homo who maintained a domain-specific mentality. Quite why H. sapiens sapiens alone evolved this cognitive fluidity remains unclear; it may be accounted for purely on the basis of historical contingency — a mutation that happened by chance in one member of H. sapiens sapiens rather than another hominid species.

If, as I described above, the essence of religious ideas is indeed the combination of elements from different natural categories, then this appears to be another product of cognitive fluidity - but one that need not have any adaptive value of its own. In other words, it is a spandrel of those ways of thinking that allow the development of more efficient foraging and social communication. If religious thinking and behaviour does involve costs, that is, if it is maladaptive, these are more than compensated for by the benefits that cognitive fluidity brings to other types of thinking. It is important to note here that cognitive fluidity may not have evolved in one go; my own argument has been that it was a two-stage process, with an initial integration of social and natural history intelligence, followed by one of technical intelligence. Only with this latter integration did the substantial cultural developments occur which are often referred to as the cultural explosion of the Upper Palaeolithic. Prior to that, anatomically modern humans maintained a Middle Palaeolithic material culture, even though they appear to have been behaving in markedly different ways to Early Humans, as evident from their burial and hunting patterns in the Near East and the functional aspects of their anatomy. Those behavioural changes derived, I believe, from the first stage of the cognitive transition, which saw the integration of social and natural history intelligence.

Although this left few direct archaeological traces, it may have been of far greater evolutionary significance than any cognitive changes around the start of the Upper Palaeolithic. If we follow the 'out of Africa' model for modern human origins, an integration of social and natural history intelligence may have provided the cognitive competitive edge which allowed modern humans to replace existing species of Early Humans in Africa, Asia and Europe. In some parts of the world this occurred many thousands of years before we see the widespread cultural developments which could only arise once technical intelligence was also integrated into a cognitively fluid mind. By being able to integrate ideas and knowledge from the two evolved domains of natural history and social intelligence, people could, for the first time, attribute human-like thoughts to animals, and believe that they shared ancestors with specific animal species. Such anthropomorphic thinking lies at the heart of religious ideas. A mapping could be created of the social world onto the natural world, and vice versa. But if the domain of technical intelligence remained isolated, such ideas were unable to find material expression.

Religious ideas... are subject to immense diversity and there can be no assumptions that other individuals will be able to grasp the ideas that one possesses. As a consequence the cultural transmission of religious knowledge is fundamentally different — fundamentally more difficult — to that of technical and social knowledge. Rather than being informal, it is often undertaken in the context of ritual: ordered sequences of action, rigidly adhered to which serve to maintain the fidelity of the ideas during cultural transmission. Without this, religious ideas would too readily become corrupted and dissipated. But even with the bulwark of ritual, religious ideas are 'winnowed' by the process of cultural transmission; those which survive are those which can most easily find an 'anchor' in the human mind.

If everything about a supernatural being violated what we understood about the natural world, people would have immense problems in grasping religious concepts. Such concepts would be impossible to communicate and share. But there is a second, and perhaps a far more significant, way in which religious ideas are anchored: they are represented in material form. Religious ideas that are represented in material form gain survival value for the process of cultural transmission. When translated into material symbols they become easier to communicate and comprehend as their material form provides a second anchor into the human mind. Representation in physical form provides a means whereby those features of supernatural beings that violate intuitive knowledge may themselves be anchored into the mind.

Durkheim argued that 'the principal forms of art seem to have been born out of religious ideas'. I think that this should be reversed: that religious ideas, or at least those which are shared, are born out of art, for without material symbols they cannot be sufficiently anchored into human minds. It is on this basis, therefore, that I feel we can be confident that ideas about supernatural beings did not exist within people's minds before material symbols were made.... Of course we must always be aware that material symbols may have been made from organic materials and simply did not survive in the archaeological record. But at present, we have no evidence that prior to 30,000 years ago material symbols did indeed exist. And without them, nor could shared ideas about supernatural beings.

The Evolution of Language and Languages, J. Hurford

Languages inhabit two distinct modes of existence, which have been called 'E-Language' and 'I-Language'. E-Language is the external observable behaviour - utterances and inscriptions and manifestations of their meanings. E-Language is regarded by some as so chaotic and subject to the vicissitudes of everyday human life as to be a poor candidate for systematic study. (E-Language corresponds to what Chomsky, in earlier terminology, called 'performance'.) Out of this blooming buzzing confusion the individual child distils an order internal to the mind; the child constructs a coherent systematic set of rules mapping meanings onto forms. This set of rules is the child's I-Language (where 'I' is for 'internal'). No two individuals' I-Languages have to be the same, although those of people living in the same community will overlap very significantly. But there will usually be at least some slight difference between the I-Language features prevalent in one generation and those prevalent in the next.

The evolution of languages in the sense just sketched is patently not biological, but socio-cultural. This kind of language evolution is the stock in trade of historical linguistics. Historical linguistics is a relatively mature discipline. It has accumulated vast amounts of theory and fact concerning how languages have changed over the last few thousand years. It has reconstructed in detail many of the protolanguages from which modern languages are descended. Examples are Proto-Indo-European, presumed to have been spoken somewhere in Eastern Europe about five thousand years ago, and Proto-Iroquoian, the ancestor language from which the modern American languages of the Iroquoian family, such as Mohawk, are descended. Historical linguists have catalogued many types of change that can occur in the evolution of individual languages, changes such as weakening and strengthening of the meanings of words, change of basic word order, loss of inflections, grammaticalization of lexical words (nouns, verbs, adjectives) into grammatical function words (articles, pronouns, auxiliaries), merger of phonemes, the emergence of novel phonemic distinctions, lowering, raising, fronting, backing and rounding of vowels, palatalization, glottalization, and so on.

Protolanguage, for Bickerton, was not blessed with the syntactic intricacies of modern languages, but only had very simple devices for stringing words together. We presume that, to a first approximation, all modern humans have the same biologically given aptitude for language acquisition. All the developments discussed by historical linguists, therefore, have taken place within constraints imposed by the modern genome. To be a possible modern language (such as modern German, Classical Latin or ancient Egyptian), a system has to be acquirable by a biologically modern human. Modern humans were preceded by various (sub)species for whom different, more limited, classes of systems were acquirable as their 'languages'. Bickerton's term protolanguage is a useful attention-focusing device, postulating that the class of 'languages' biologically available to Homo erectus was the class of protolanguages, defined quite roughly as systems for concatenating vocabulary with none of the complex syntactic dependencies, constituencies, command and control relations characterizing modern languages. A Homo erectus individual, even if somehow presented with modern linguistic experience, could not make of it what a modern child makes of it due to innate limitations. Bickerton likens this type of 'language' to that which intensively trained chimpanzees are capable of.
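The contrast Bickerton draws can be made concrete with a toy representation (a minimal sketch of my own, in Python; the structures are purely illustrative and are not Bickerton's formalism). A protolanguage utterance is modelled here as a flat string of words, while a full-language sentence has nested constituents that allow one proposition to be embedded inside another:

# A minimal sketch (hypothetical representations, not Bickerton's own formalism).
# Protolanguage: a flat string of words ordered by salience, with no constituency.
proto_utterance = ["meat", "give", "child"]   # roughly: "give meat (to the) child"

# Full language: nested constituents; a whole clause can be embedded in another,
# which is what subordinate clauses and long-distance dependencies require.
full_sentence = (
    "S",
    ("NP", "she"),
    ("VP", "said",
        ("S",  # embedded proposition: "the hunter gave the meat to the child"
            ("NP", "the hunter"),
            ("VP", "gave", ("NP", "the meat"), ("PP", "to", ("NP", "the child"))))),
)

def depth(node):
    """Embedding depth: a flat utterance bottoms out at 1, full syntax goes deeper."""
    if not isinstance(node, tuple):
        return 0
    return 1 + max(depth(child) for child in node[1:])

print(depth(("UTTERANCE", *proto_utterance)))  # 1: no internal structure at all
print(depth(full_sentence))                    # 6: hierarchical embedding

The only point of the sketch is that embedding depth, and the dependencies it makes possible, is exactly what the flat representation cannot express, however many words are strung together.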

Competence in a particular language is an acquired characteristic of an individual. Biological heredity, as of an innate language faculty, does not provide for the inheritance of acquired characteristics. In theory, a modern human language faculty could pass intact through thousands of years in a totally silent community (assuming the community itself could somehow survive); with the lifting of the vow of silence, the children of the new generation would be as ready as any others to acquire any language they were exposed to. This last point assumes, perhaps too strongly, that there would be no significant decay in the language faculty due to lack of any pressure of natural selection through linguistic behaviour.

A significant aspect of the environment into which a human child is born is the language of the community. The particular syntax, phonology and lexicon of the language is a historical creation of the child's cultural forebears. If the child is to prosper, he or she must be able to acquire this particular syntax, phonology and lexicon. But here we see an apparent paradox for the evolution of languages. Evolution means change, but it would seem that the requirement to acquire the language of one's community is a prescription for stasis rather than change. The paradox can be resolved by invoking the ideas of tolerance and intelligibility. A child does not need to learn to speak exactly like (one of) his or her parents; if the child acquires a syntax, phonology and lexicon permitting tolerable mutual intelligibility with the community he or she is born into, that child will prosper tolerably well. Fitting this picture, languages do indeed change very slowly, as we have seen, and stay well within the constraints of intergenerational intelligibility.

We assume that there were relevant mutations and recombinations in the evolution of the modern human language faculty. Accordingly, there must have been children who were born capable of acquiring a class of languages different from the class of languages acquirable by their parents. These 'transitional' children would have been presented with data (spoken utterances) produced from grammars of the old type, and internalized grammars of a new type, while still maintaining tolerable mutual intelligibility with the previous generation.... At some point an individual must have arisen who was capable of internalizing a grammar of a type that none of his or her ancestors (no matter what data they were exposed to) could possibly have internalized.

Biological adaptationist accounts of the human language faculty face the difficulty that the initial conditions providing the platform for the adaptation must be presumed to contain some unique factor or combination of factors. Otherwise, why should we only find language in one species? The focus of explanation shifts away from the general pervasive tendency of species to adapt to their environments towards some specific one-off circumstance that has occurred only once in history. Adaptation is still part of the picture, however. Selective pressure for individuals (or groups) to be better adapted to their environments undoubtedly played a part in the evolution of the language faculty.

Any circumstances invoked as explaining the emergence of Language have to be argued to be true. Where special brain structures are proposed as the crucial explanans, for example, one has to be able to verify that humans, and no other species, have just such structures. Or where special social arrangements of humans are invoked as the crucial significant factor, one has to be able to argue that these social arrangements did apply to humans at the relevant time, and not to other species. And in general, more realistically and more eclectically, for any set of circumstances proposed as individually necessary and collectively sufficient to explain the emergence of Language, one has to show that this combination of circumstances applies (or applied) to humans and to no other species. We have a long way to go.

I give below a brief survey of some traits which have been suggested as preadaptations for language.... There is seldom, if ever, any serious consideration of the relative chronology of the various proposed preadaptations. Thus, each of the 'preadaptations' reviewed below might be seen as the last and crucial step that gave us Language, or it might be one of an accumulation of necessary characteristics preceding that final step.

A capacity to attribute to other individuals versions of one's own beliefs and desires is evident in much modern linguistic behaviour. There could conceivably be quite elaborate communication systems whose use does not require a theory of mind on the part of its users, but human languages, and especially the pragmatic systems of inference used with them, are not such systems. The acquisition and use of human languages requires substantial inferential machinery about the likely intentions of others.

Bickerton has proposed a single catastrophic event precipitating the emergence of the modern language capacity. This is the appearance of a connection in the brain between the (hypothetical) component that processes understanding of complex social relations between individuals (who-did-what-to-whom) and the symbol-processing machinery that can already handle isolated words but not syntax. This proposal is one of the more extreme 'Big Bang' style proposals for the emergence of the language faculty.

[Mimesis] is an idea first put forward by Merlin Donald, who sums it up as follows. "Mimesis is a non-verbal representational skill rooted in kinematic imagination, that is, in an ability to model the whole body, including all its voluntary action-systems, in three-dimensional space. This ability underlies a variety of distinctively human capabilities, including imitation, pantomime, iconic gesture, imaginative play, and the rehearsal of skills. My hypothesis is that mimesis led to the first fully intentional representations early in hominid evolution, and set the stage for the later evolution of language."

Communication may arise, as Dawkins and Krebs claim, from an arms-race between mind-reading and manipulation. A view (with versions which may be either complementary or opposed to this 'Machiavellian' view) is that a certain degree of altruism and mutual cooperation is a prerequisite for the rise of complex communication systems, in particular where these can be used by one individual to convey factual information to another. It would seem that there is usually little immediate benefit to a speaker in 'giving' declarative information to another.

Everybody agrees that there is some connection between humans' abnormally large brains and their capacity for language, but nobody has been able to specify very precisely what this connection is. Deacon points out that in the two-million-year period in which brains have doubled in size, no clearly new structures have been added, although there has been warping of the proportions of the parts, with the frontal areas of cortex becoming more prominent. It is these parts which handle 'verbal short-term memory, combinatorial analysis, and sequential behavioral ability'.

Human vocal tracts differ significantly in shape from those of chimpanzees, allowing us to produce a range of distinct sounds that chimpanzees are not capable of. Lieberman is the most prominent exponent of this topic. Lieberman's work also argues that the Neanderthal vocal tract was incapable of articulating the range of modern human speech sounds. This view has recently been challenged by Arensburg and co-workers and by Duchin. Aiello briefly surveys some evidence that the human vocal tract was an early preadaptation, motivated by dietary changes in early hominids. Although the range of sounds available to modern humans is, by definition, characteristic of human language, it can be argued that this is a less crucial characteristic than some others (e.g. syntax). If we were capable of articulating fewer phonemes, we would have to use longer words. Perhaps there is some ideal trade-off between the capacity to make fine articulatory distinctions and the size of short-term memory buffers.
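The trade-off suggested in the last two sentences can be given a back-of-the-envelope form (my own illustration, not Hurford's). With an inventory of \(N\) distinct phonemes there are roughly \(N^{L}\) possible words of length \(L\), so keeping a vocabulary of \(V\) words distinct requires word lengths of at least

    \[
    L \;\gtrsim\; \frac{\log V}{\log N}.
    \]

For a vocabulary of around 50,000 words, an inventory of 40 phonemes needs words only about three segments long on average (\(\log 50{,}000/\log 40 \approx 2.9\)), whereas an inventory of 10 phonemes would need about five; fewer articulatory distinctions therefore mean longer words and a heavier load on short-term memory, which is the trade-off in question.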

Preadaptations, such as those just discussed, are enabling rather than forcing. Having a particular preadaptive trait simply makes certain later steps possible; preadaptations for language are not in themselves selected for by any measure of fitness involving language. By contrast, (neo-)Darwinian accounts tend to stress adaptations, which, by definition, are selected for. One must, of course, avoid the 'strict adaptationist' fallacy of assuming that every trait is adaptive; there are spandrels, accidental, non-functional aspects of morphology or behaviour. Lightfoot's position is that the formally interesting features of the language faculty, which give human languages their characteristic features (e.g. the syntactic principle of subjacency — see exposition below), are not particularly fitness-enhancing; the human language capacity is more complex than it needs to be, and even in places dysfunctionally complex. Such features as subjacency may indeed be, Lightfoot argues, just accidents (spandrels); but scientific methodology abhors accidents, and a powerful theory predicting the occurrence of such features would be preferable, if one could be found. One cannot be happy with a general stance of classifying any interesting phenomenon as a spandrel.

Language was undoubtedly instrumental in conferring on humans fitness across an unprecedentedly wide range of environments. Many environments are still no-go areas for humans, but we can survive and reproduce in a range greater than that of any other species. Our ability to communicate precise and complex messages to each other must have helped. This much is a broad truism; we can explore the matter of fitness in relation to Language, and languages, in more subtle ways. If we assume that the innate human language faculty, in all its specific detail, arose by natural selection, the central puzzle is the relation between intricate universal principles of grammatical structure and fitness. Clearly, the space between fitness and principles of grammar had to be bridged by some intermediate theoretical construct, such as expressive power.

Bickerton is among those who emphasize the role of (internal) representation over that of communication in any adaptive account of human language. 'In any account of the functional motivation of language, the question of whether it was the communicative or the representational aspects that contributed most to the adaptedness of language surely bulks too large to be ignored'. Superior mental representational power has been listed as a necessary precondition to language. If communication is envisaged in Saussurean terms of a meaning in one head (speaker) being recreated in another head (hearer), the two heads involved clearly must have the power to represent these meanings. I cannot convey an idea to you that I am unable to grasp myself. Powerful mental representational capacity, without there necessarily being any means to externalize it in utterances, is very probably adaptive in itself.

A radical alternative to the focus on the phylogenetic adaptation of humans to be better communicators or better conceptualizers is a focus on the linguistic adaptation of systems of communication to be replicable by human acquirers. This idea has been well expressed by Christiansen: "What is often not appreciated is that the selective forces acting on language to fit humans is [sic] significantly stronger than the selective pressure on humans to be able to use language. In the case of the former, a language can only survive if it is learnable and processable by humans. On the other hand, adaptation towards language use is one out of many selective pressures working on humans... Thus, language is more likely to have adapted itself to its human hosts than the other way round. Languages that are hard for humans to learn simply die out, or, more likely, do not come into existence at all. Following Darwin, I propose to view natural language as a kind of beneficial parasite — i.e. a nonobligate symbiont — that confers some selective advantage onto its human hosts without whom it cannot survive."

'The timing of the origin of language is anyone's guess'. This assessment is near the mark, if not wholly right. The nature of the dating problem is to fit a series of vaguely and controversially hypothesized stages in the evolution of language around a handful of approximate (and also controversial) dates for key non-linguistic events in human evolution. The three key dates usually mentioned are of two phylogenetic transitions and one cultural transition in Homo sapiens. The phylogenetic transitions are habilis to erectus around 1.7 million years ago and archaic Homo sapiens to anatomically modern sapiens sapiens (between 200,000 and 100,000 years ago). The cultural transition is the Upper Palaeolithic revolution in toolmaking (45,000-40,000 years ago), which I collapse here for convenience with the emergence of 'modern' art forms around the same time. The erectus-to-sapiens date is contested by multi-regional evolution theorists, who claim that there was no relatively sudden speciation event, but rather a long (perhaps one million year) period of interbreeding between more modern and more conservative varieties in various parts of the Old World. The revolutionary character of the changes in tool making around 40,000 years ago is also disputed by some.

As far as 'stages' in linguistic evolution are concerned, the most specific suggestion is Bickerton's, of a simple two-stage progression from protolanguage to full human language. Protolanguage is described as concatenation of vocabulary items according to pragmatic pressures (e.g. put the word for the most salient idea first), with no level of grammatical organization involving phrases or inflections or grammatical words such as determiners, auxiliaries or case-markers. It is like Tarzan-talk. Bickerton gives examples from pidgins, the efforts of trained apes, human children under two years of age and language-deprived adults. Bickerton suggests that Homo erectus spoke protolanguages. It is tempting to align Bickerton's step from protolanguage to full human language with the emergence of anatomically modern humans between 200,000 and 100,000 years ago. If there is a view which is held by more scholars than any other, on however flimsy grounds, it is probably that fully modern language came on the scene with the appearance of anatomically modern humans. But this currently conventional wisdom needs to be subjected to careful criticism as more evidence and arguments appear.

Another view associates the emergence of fully modern languages with the sudden marked improvement in stone tool technology around 40,000 BP. It is argued that what explains this technological explosion was the ability to describe to others, in language, the more complicated procedures needed for making the new improved tools. The theory relies on an impression of what might be learnable by mere observation and what tasks require linguistic instruction. If one accepts this view of the later emergence of modern languages, one has to ask what anatomically modern humans were doing for the preceding 60,000 years. A possible answer is that the socio-cultural transition from protolanguages to modern languages took 60,000 years; but this seems unlikely in the light of modern evidence from creolization.

The early story of the evolution of the human capacity for language involves the settling into place of a range of social, psychological and physiological preadaptations. Once all preconditions for language in humans were in place, it is likely that languages blossomed rapidly, starting before Homo sapiens sapiens' exodus from Africa, but also perhaps not achieving the full complexity of modern languages until after the expansion out of Africa.

The First True Humans, R. Klein, B. Edgar

We have suggested that human evolution was characterized by a series of short, abrupt steps or punctuations, separated by long periods with little or no change. So far, we have described a possible first punctuation, which occurred between 7 and 5 million years ago and produced bipedal apes, and a better-evidenced second event, which occurred between 3 million and 2 million years ago and produced the first stone tool makers. The abruptness of each step is debatable, but the stability that followed is patent. Thus, the anatomy of the bipedal apes changed little over intervals that lasted a million years or more. The anatomy of the earliest tool makers is poorly known, but they were probably equally conservative, judging by a remarkable lack of change in the tools they produced. They may have had larger brains than the bipedal apes, but they may also have retained an ape-like upper body form and a high degree of size difference between the sexes. If so, it's probable that they continued to rely heavily on trees for food and refuge and that they had an ape-like social organization that involved little or no cooperation between the sexes. When we know them better, we may decide that for all intents and purposes, they were "technological apes."

We turn now to a third step that occurred about 1.8 to 1.7 million years ago. It is more fully documented than its forerunners, and it was at least as momentous, for it produced a species that anticipated living people in anatomy, behavior, and ecology, save mainly for its smaller brain. With this caveat in mind, its members can reasonably be labeled the first "true humans," and this is how we will refer to them here. Early on, the first true humans authored a major advance in stone flaking technology, but thereafter, both their anatomy and their artifacts appear to have remained remarkably stable for a million years or more. In this respect, they were marching to the same drummer as their predecessors.

Fossils that date from after 500,000 years ago now indicate that sapiens evolved in Africa while erectus continued on largely unchanged in eastern Asia. In form and geologic age, ergaster is well positioned to be the ancestor not only of erectus but also of sapiens, and this is the view we adopt here. The ancestry of ergaster is murky, but it may have originated suddenly from habilis (or from one of the variants into which habilis may eventually be split) in adaptive response to a sharp increase in aridity and rainfall seasonality that occurred across eastern Africa about 1.7 million years ago.

On average, brain volume in ergaster was only about 900 cc, large enough to invent the new kinds of stone tools with which it is associated, but also small enough to explain why the tools then changed little over the next million years or so. Based mainly on dental development, the Turkana Boy was probably about 11 years old at time of death, but his stature compared more closely with that of a modern 15-year-old and his brain with that of a modern 1-year-old. The sum has led Walker to conclude that "While he may have been smart by ape standards, relative to [living] humans the Turkana Boy was tall, strong, and stupid." The same statement might apply equally well to everyone who lived between 1.8 million and 600,000 to 500,000 years ago, before a spurt in brain volume brought it much closer to the modern average.

Direct archeological evidence for new foods is lacking or ambiguous, but the choices are larger quantities of meat and marrow, greater numbers of nutritious tubers, bulbs, and other underground storage organs, or both. Cooking might also be implied, since it would render both meat and tubers much more digestible, but so far, persuasive fireplaces or hearths are unknown before 250,000 years ago, by which time ergaster had been replaced by more advanced species.

Given that ergaster was shaped for a hot, dry climate, we can speculate that it was also the first human species to possess a nearly hairless, naked skin. If it had an ape-like covering of body hair it could not have sweated efficiently, and sweating is the primary means by which humans prevent their bodies—and their brains—from overheating... The sexes [among the ergasters] differed no more in size than they do in living people. This stands in sharp contrast to the australopiths and perhaps habilis, in which males were much larger than females. In ape species that exhibit a similar degree of sexual size difference, males compete intensely for sexually receptive females and male-female relationships tend to be transitory and non-cooperative. The reduced size difference in ergaster may signal the onset of a more typically human pattern in which male-male competition was reduced and male-female relationships were more lasting and mutually supportive.

The first tool makers, the Oldowan people, mastered the mechanics of stone flaking, and they were very good at producing sharp-edged flakes that could slice through hides or strip flesh from bone. At the same time, they made little or no effort to shape the core forms from which they struck flakes, and to the extent that they used core forms, it was perhaps mainly to crack bones for marrow. For this purpose, core shape didn't matter very much. Ergaster, however, initiated a tradition in which core forms were often deliberately, even meticulously, shaped, and shape obviously mattered a lot.

East Asian Homo erectus shows that a human species had left Africa by 1 million years ago, and we believe that this species was Homo ergaster. But aside from the issue of the kind of people involved, we may also ask why they left and what route(s) they took. Unlike many other questions in paleoanthropology, these are relatively easy to answer. Archeology shows that about 1.5 million years ago, shortly after ergaster emerged in Africa, people more intensively occupied the drier peripheries of lake basins on the floor of the Great Rift Valley, and they colonized the Ethiopian high plateau for the first time. By 1 million years ago, they had extended their range to the far northern and southern margins of Africa. The Sahara Desert might seem to provide an impenetrable barrier to movement northward, but during the long Acheulean time interval, there were numerous periods when it was somewhat moister and more hospitable, and Acheulean people penetrated it readily.

As to how and why people expanded through Africa and beyond, they almost certainly did so automatically, simply because their physiology and technology allowed them to inhabit territories that no one had occupied before. A group on the periphery of the human range would periodically outgrow its resource base, and a splinter party would break off and set up shop in empty territory next door. Such a party probably rarely moved far, but given time, the splintering process would inevitably have brought people to the northeastern corner of Africa. From there, members of a breakaway group would have colonized the southwestern corner of Asia without even knowing they had left Africa. From southwestern Asia, the same process of population budding would inevitably lead other groups eastwards towards China and Indonesia or northwards and westwards towards Europe.

The first people to leave Africa crossed the border between what is now Egypt and Israel. It is not surprising, therefore, that Israel contains the oldest firmly documented archeological site outside of Africa. This occurs at 'Ubeidiya in the Jordan Rift Valley, where ancient lake and river deposits have provided nearly eight thousand flaked stones. The tools include hand axes and other pieces that closely resemble early Acheulean artifacts from Olduvai Gorge and other African sites. They have been bracketed in the interval between 1.4 and 1 million years ago by associated mammal fossils, paleomagnetism, and potassium/argon dating of an overlying lava flow.

Eastern Asia with its Homo erectus fossils shows that such a dispersal must have occurred by 1 million years ago. Europe may have been occupied equally early, but the oldest widely accepted evidence for human colonization is only about 800,000 years old. The evidence comes from the Gran Dolina, a cave at Atapuerca, near Burgos, Spain... Elsewhere in Europe, there is little or no indication that people were present before about 500,000 years ago, and it was perhaps only then that people gained a permanent foothold. Europeans at 500,000 to 400,000 years ago looked a lot like their African contemporaries, and they made similar Acheulean artifacts. They may thus signal a fresh wave of African immigrants.

By 600,000 to 500,000 years ago, people with larger, more modern-looking braincases had appeared in Africa, and for the moment, based in part on our reading of the artifactual record, we hypothesize that these people evolved abruptly from ergaster. They closely resembled Europeans of 500,000 to 400,000 years ago, and the Africans and Europeans together have sometimes been assigned to the species Homo heidelbergensis, named for a lower jaw found in 1907 in a sand quarry at Mauer near Heidelberg, Germany. It may have been heidelbergensis expanding from Africa about 500,000 years ago that brought the Acheulean tradition to Europe.

The difference between east and west in anatomy and artifacts might suggest that there was a telling difference in behavior or ecology, but so far there is no evidence for this. With regard to ecology, for example, we can say only that people everywhere subsisted partly on large mammals. Zhoukoudian Locality 1 is the most informative Chinese site, and it was literally filled with bones from a wide variety of species. Two extinct kinds of deer were particularly abundant, and this might mean that local erectus people were skilled deer stalkers. Against this, though, we note that the Locality 1 deposits also provided numerous fossilized hyena feces or coprolites and that many of the animal bones were damaged by hyena teeth. The conspicuous evidence for hyena activity means not only that hyenas could have introduced many of the animal bones, but it also suggests that hyenas successfully competed with erectus for living space. Based just on the Locality 1 evidence, we might conclude that as a predator or scavenger on other large mammals, erectus was less effective than hyenas.

Animal bones from broadly contemporaneous sites in Africa and Europe suggest that Homo heidelbergensis and its immediate successors were equally ineffective hunters. This is true even though heidelbergensis and erectus produced very different stone artifacts, and the ecological similarity serves to remind us that differences in stone artifacts between regions may say little about key aspects of underlying behavior. More important to this book, the apparent ecological similarity between heidelbergensis and erectus implies that they remained behaviorally alike even after they had diverged in anatomy. We will now show that Europe and Africa illustrate the same fundamental point: archeological (behavioral) residues remained strikingly similar on both continents, even as Europeans evolved into Neanderthals and Africans evolved towards modern humans. The pattern was broken only about 50,000 years ago, when the Africans developed the modern capacity for culture and then rapidly exported both their anatomy and their behavior to the rest of the world.

Humanity Branches Out, R. Klein, B. Edgar

The first permanent occupants of Europe were late Acheulean hand axe makers, who spread from Spain and Italy on the south to southern England on the north about 500,000 years ago. Occasional human fossils like those from Petralona, Greece, and Arago, France, suggest that the hand axe makers resembled their African contemporaries, and the Europeans probably descended from an expanding African population that brought the late Acheulean Tradition to Europe. For the sake of convenience, we assign this population and its first African and European descendants to the species Homo heidelbergensis.

Homo heidelbergensis shared many primitive features with Homo ergaster and Homo erectus, including a large, forwardly projecting face, a massive, chinless lower jaw with big teeth, large browridges, a low, flattened frontal bone (forehead), great breadth across the skull base, and thick skull walls. At the same time, it departed from both ergaster and erectus in its much enlarged brain, which averaged over 1200 cubic centimeters (cc) (compared to about 900 cc for ergaster and 1000 cc for classic erectus), in its more arched (versus more shelf-like) browridges, and in the shape of its braincase, which was broader across the front, more filled out at the sides, and less angular in the back. Like erectus, heidelbergensis probably evolved from ergaster, and in its anatomy and its geographic distribution, it is a plausible common ancestor for the Neanderthals (Homo neanderthalensis) who appeared subsequently in Europe and for modern humans (Homo sapiens) who evolved later in Africa.

The nature and timing of the shift from the early to the late Acheulean remain to be firmly established, but if the transition turns out to have occurred abruptly about 600,000 years ago, it could have coincided closely with a rapid expansion in brain size that biological anthropologists Chris Ruff, Erik Trinkaus, and Trent Holliday have detected. Their analysis suggests that between 1.8 million and 600,000 years ago, brain size remained remarkably stable at roughly sixty-five percent of the modern average, but not long afterwards it increased to about ninety percent of the modern value. If a spurt in brain size and associated changes in skull form sparked the appearance of heidelbergensis, its emergence 600,000 years ago would signal a punctuational event like the one that we previously proposed for ergaster more than a million years earlier. The analogy would be especially apt if future research confirms a link between heidelbergensis and late Acheulean technology to parallel the one that we have postulated between ergaster and the origin of the Acheulean Tradition.

UCLA neuroscientist Harry Jerison notes that the human brain is roughly six times larger than we would predict from the relationship between brain size and body size in other mammals. Even if we restrict the survey to monkeys and apes and scale them to human body size, human brains are about three times larger than we would expect. The fossil record suggests that whenever encephalization has occurred, it occurred rapidly, and the human brain illustrates the point especially well. It may actually have been the most rapidly evolving organ in the history of the vertebrates.
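A rough numerical illustration of the scale of this difference can be made with Jerison's widely cited mammalian baseline, in which expected brain mass scales with roughly the two-thirds power of body mass (both in grams), together with assumed round figures of about 65 kg body mass and 1,350 g brain mass for a modern human:

\[
E_{\text{expected}} \approx 0.12\,P^{2/3}, \qquad \mathrm{EQ} = \frac{E_{\text{actual}}}{E_{\text{expected}}}
\]

\[
E_{\text{expected}} \approx 0.12 \times (65{,}000\ \mathrm{g})^{2/3} \approx 195\ \mathrm{g}, \qquad \mathrm{EQ} \approx \frac{1{,}350\ \mathrm{g}}{195\ \mathrm{g}} \approx 7
\]

On this crude reckoning, a human brain is roughly six to seven times the size expected for a generic mammal of our body mass, the order of magnitude Jerison describes; restricting the comparison to monkeys and apes raises the expected value and brings the ratio down to about three.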

Jerison also notes that a major function of the brain, and more particularly of the cerebral cortex, is to build a mental image or model of the "real world," which in his words is "the brain's way of handling an otherwise impossible load of information and is the biological basis for mind." Brain expansion after 600,000 years ago presumably increased the amount of data that the human brain could process, and this in turn allowed the development of more sophisticated mental models. "Brains are, after all, information-processing organs," notes Jerison, "and [natural] selection for brain size must have been selection for increased or improved information-processing capacity."

We suggest below that the development of fully modern behavior about 50,000 years ago—"the dawn of human culture" to which the title of this book refers—may mark the development of fully modern language and that this development may have been rooted in yet another neurological shift. We emphasize the "may," because the human brain reached its nearly modern size not long after 600,000 years ago, and if a neurological change occurred 50,000 years ago, it was confined to brain structure. Unfortunately, fossil skulls, even ones that are much differently shaped than our own, reveal little about brain structure, and arguments for neurologically driven behavioral change after 600,000 years ago cannot be tested independently of the behavioral (archeological) evidence that suggests them.

Logic alone suggests that human expansion throughout Africa and to Eurasia by 1 million years ago required fire for bodily warmth, predator protection, and food preparation. Nonetheless, to demonstrate fire use beyond a shadow of a doubt, most archeologists would require fossil fireplaces, that is, circular or oval lenses of ash and charcoal, surrounded by stone artifacts and broken-up animal bones. This requirement is unfortunate, because most early human sites formed on ancient land surfaces in relatively dry tropical or subtropical environments where charcoal and ash do not last long. Caves provide better preservation conditions, but most caves older than 150,000 to 200,000 years have either collapsed or been flushed of their original deposits, so we have no option but to concentrate on "open-air" sites. Patches of burned earth at two such sites in eastern Africa may indicate human mastery of fire by 1.4 million years ago, but in each case, the burning might simply mark a tree stump or patch of vegetation that smoldered after a brush fire. Occasional charred bones that accompany 1.5-million-year-old artifacts at Swartkrans Cave, South Africa, present the same dilemma. The charring is indisputable, but the bones originated outside the cave, where they might have been naturally burned.

Africans and Europeans surely belonged to different evolutionary lineages. Yet, there is nothing at the various sites to suggest a significant behavioral difference, and on each continent, behavior appears to have been equally primitive by modern standards. Africans and Europeans remained behaviorally similar, and still primitive, until about 50,000 years ago, when the Africans added modern behavior to modern anatomy. For a brief period, Africans and Europeans then differed sharply in behavior, but the modern behavioral mode gave the Africans a competitive advantage, and they soon spread it throughout Eurasia. By 30,000 years ago, people everywhere were modern in appearance and they were once again similar in behavior.

Nurture or Nature Before the Dawn?, R. Klein, B. Edgar

The preceding Mousterian [period] has provided nothing to compare to the Aurignacian paintings, engravings, figurines, and beads. Together with an increase in stone-tool diversity and standardization and the first routine manufacture of standardized (formal) artifacts in bone, ivory, and antler, the art and ornaments underscore the great gulf that separated even the earliest Upper Paleolithic people from the preceding Mousterians. The contrast becomes even starker when we consider the remarkable monotony of the Mousterian over thousands or even tens of thousands of years and compare this to the rapid diversification in both utilitarian and non-utilitarian artifact types that occurred from the Aurignacian onwards. In the rate at which material culture changed and diversified, only the Upper Paleolithic recalls later prehistory and recorded history.

We are not the first to emphasize the contrast between the Upper Paleolithic and everything that preceded it, and where we speak of the "dawn of human culture," others refer to a "human revolution," a "creative explosion," "a great leap forward," or a "sociocultural big bang." Most authorities highlight European findings, but we have stressed even older evidence for the "dawn" in Africa. The African data are less abundant and spectacular, at least in part because the vagaries of preservation have left fewer relevant African sites and there have been fewer archeologists to seek them out. Related to this, archeologists have been accumulating relevant evidence in Europe since the 1860s, while the key African observations all postdate 1965. Yet, the "dawn" is just as real in Africa, and, equally important, it occurred there first. Spectacular as it is, the European Upper Paleolithic, beginning around 40,000 years ago, was simply an outgrowth of behavioral change that occurred in Africa perhaps 5000 years earlier. That said, we must now proceed to the most difficult question of all: what accounts for the "dawn"? The answer, as we shall see, is contentious, and it may always be that way.

Most archeologists who have tried to explain the "dawn" favor a strictly social, technological, or demographic cause. A small minority, of whom we may be the majority, favor a biological one. We'll outline two characteristic social or technological explanations first and then explain why we think our biological explanation is preferable. We stress at the outset that unlike the "dawn" itself, the explanation for it is more a matter of taste or philosophy than it is of evidence.

Randall White argues that "the rapid emergence of personal ornamentation [in the Upper Paleolithic] may have marked, not a difference in mental capacities between Cro-Magnons and Neanderthals, but rather the emergence of new forms of social organization that facilitated and demanded the communication and recording of complex ideas." In his view, either an increase in population density or a greater tendency for people to gather in large groups could have precipitated the underlying social transformation.

About 11,000 years ago, hunter-gatherers along the eastern margin of the Mediterranean Sea relied heavily on wild cereals (wheat, barley, and rye) and other plant foods, much as their forebears had in the preceding millennia. Their adaptation was stable, and it even allowed for a degree of sedentary life—permanent or semi-permanent hamlets from which the people could exploit abundant wild plants and an accompanying supply of gazelles and other wild animals. Then, starting about 11,000 years ago, climate turned suddenly and sharply colder and drier, and the downturn persisted for 1300 years, during what paleoclimatologists call the Younger Dryas period.

Wild cereals and other key food plants became much scarcer, and Bar-Yosef and other archeologists believe that the people responded by encouraging them to grow in nearby fields. To produce the next crop, they naturally selected seeds from those individual plants that grew best under their care, and in the process they transformed wild species that could grow on their own into domesticates that required human assistance. By 9500 years ago, they had added animals (sheep, goats, cattle, and pigs) to the repertoire of domesticates, and they were full-fledged farmers. The economic transformation encouraged human population growth, and for this reason alone, it also promoted changes in social and economic relations. As population density increased and world climate ameliorated after 9000 years ago, splinter groups broke off to seek new land and they eventually spread the new agricultural way of life westwards to Spain and eastwards to Pakistan.

Bar-Yosef suggests that like the agricultural revolution, the much earlier event we call the "dawn of human culture" involved the invention of new ways to obtain food and that this resulted in population growth and in new modes of social and economic organization. Splinter groups would again have carried the new adaptation from its core area, which in this case was probably eastern Africa. Like White and Bar-Yosef, other archeologists have proposed models in which the "dawn" followed naturally on a technological advance, a change in social relations, or both. Such explanations are attractive in part because they rely on the same kind of forces that historians and archeologists routinely use to explain much more recent social and cultural change. In regard to the "dawn," however, they share a common shortcoming: they fail to explain why technology or social organization changed so suddenly and fundamentally.

Population growth is an inadequate reason, first, because it too would have to be explained, and second, because there is no evidence that population was growing anywhere just prior to the "dawn." We have noted that the Africans who lived just before the "dawn" made MSA (Middle Stone Age) artifacts, while those who lived afterwards produced LSA (Later Stone Age) assemblages. In southern and northern Africa, the interval between 60,000 and 30,000 years ago that encompasses the MSA/LSA transition appears to have been mostly very arid, and human populations were so depressed that they are nearly invisible to archeology. Conditions for human occupation remained more favorable in eastern Africa, but so far, excavations and surveys here also fail to suggest a population increase in the late MSA. Neither the number of sites nor the density of occupation debris they contain increases conspicuously towards the LSA, which began between 50,000 and 40,000 years ago. And in Europe, populations grew only after the "dawn" arrived, not in anticipation of it.

Finally, there is no evidence that the "dawn" was prompted by a technical innovation comparable to the invention of agriculture. Archeology not only fails to reveal such an innovation, it suggests that the "dawn" actually marks the beginning of the human ability to produce such remarkable innovations. From an archeological perspective then, the "dawn" is not simply the first in a series of ever more closely spaced "revolutions," starting with agriculture and running through urbanization, industry, computers, and genomics, it was the seminal revolution without which no other could have occurred. This brings us to what we think was the key change that explains it.

In our view, the simplest and most economical explanation for the "dawn" is that it stemmed from a fortuitous mutation that promoted the fully modern human brain. Our case relies primarily on three circumstantial observations extracted from our preceding survey of human evolution. The first is that natural selection for more effective brains largely drove the earlier phases of human evolution. The neural basis for modern human behavior was not always there; it evolved, and we are merely using the available behavioral evidence to suggest when. The second observation is that increases in brain size and probably also changes in brain organization accompanied much earlier behavioral/ecological shifts. These include especially the initial appearance of stone artifacts 2.6 to 2.5 million years ago, the first appearance of hand axes and the simultaneous human expansion into open, largely treeless environments 1.8 to 1.6 million years ago, and possibly also the advent of more sophisticated hand axes and the first permanent occupation of Europe about 600,000 to 500,000 years ago.

Our third and final observation is that the relationship between anatomical and behavioral change shifted abruptly about 50,000 years ago. Before this time, anatomy and behavior appear to have evolved more or less in tandem, very slowly, but after this time anatomy remained relatively stable while behavioral (cultural) change accelerated rapidly. What could explain this better than a neural change that promoted the extraordinary modern human ability to innovate? This is not to say that Neanderthals and their non-modern contemporaries possessed ape-like brains or that they were as biologically and behaviorally primitive as yet earlier humans. It is only to suggest that an acknowledged genetic link between anatomy and behavior in yet earlier people persisted until the emergence of fully modern ones and that the postulated genetic change 50,000 years ago fostered the uniquely modern ability to adapt to a remarkable range of natural and social circumstances with little or no physiological change.

In our view, it is above all a quantum advance in the human ability to innovate that marks the dawn of human culture. The strongest objection to the neural hypothesis is that it cannot be tested from fossils. The connection between behavioral and neural change earlier in human evolution is inferred from conspicuous increases in brain size, but humans virtually everywhere had achieved modern or near-modern brain size by 200,000 years ago. Any neural change that occurred 50,000 years ago would thus have been strictly organizational, and fossil skulls so far provide only speculative evidence for brain structure. Neanderthal skulls, for example, differ dramatically in shape from modern ones, but they were as large or larger, and on present evidence, it is not clear that the difference in form implies a significant difference in function. There is especially nothing in the skull to show that Neanderthals or their contemporaries lacked the fully modern capacity for language.

Our readers sitting as jurors must still reach a verdict, and if we have presented our case capably, they will agree that anatomically modern Africans became behaviorally modern about 50,000 years ago and that this allowed them to spread to Europe where they rapidly replaced the Neanderthals. They will probably also accept the likelihood that modern behavior allowed modern humans of recent African descent to replace non-modern people in the Far East, although in this instance, we as prosecutors would understand if they asked for more evidence. Their only serious reservation, roughly akin to reasonable doubt in the legal system, may concern our argument for what prompted the emergence of modern human behavior about 50,000 years ago. The crux here is logic and parsimony, not evidence, and with the full sweep of human evolution in mind, we would appreciate feedback on just how persuasive our logic is.

First Tool-Users and Makers, L. Barham, P. Mitchell

It is important to distinguish between the using and making of tools in human evolution as these are conceptually different behaviours, and arguably require different cognitive abilities. Tool-users apply unmodified natural objects to do work. Tool-makers deliberately modify materials, and may make tools for use in making other tools. We know more about stone tool technology simply because stone survives better in the archaeological record, but this dichotomy applies equally to organic materials, with both having the potential to provide information on the extent of learned as opposed to innate behaviours. Examples of the learned ability to use tools occur in just a few non-primate species such as the Californian sea otter, bottlenose dolphin, and Egyptian vulture. Tool-making is even rarer among non-primates, with only the New Caledonian crow observed to shape leaves into a variety of tools for extracting insects from crevices. The ability of young crows to make leaf tools in captive isolation without input from peers or parents indicates an innate behaviour transmitted genetically rather than socially, though some details of shaping tools may be learned by observation. The ability of primates, in particular chimpanzees, to use and make tools is now well documented, and involves a degree of social learning surpassed only by humans. The shared ability of apes and humans both to use and make tools suggests an inheritance from a common ancestor, but in the case of hominins, we developed an absolute dependence on tools for our survival. Why hominins diverged in this significant respect from the apes remains a fundamental question in palaeoanthropology, and one that the African archaeological record can address when placed in an interdisciplinary context.

As consummate tool-users and -makers, we take these capacities for granted, but they appeared relatively late in the 7-million-year span of the hominin record. Africa provides the earliest archaeological evidence for systematic stone tool-making, with artefact assemblages recovered from the Gona region of Ethiopia in deposits radiometrically dated to 2.6–2.5 mya. The Gona flakes and cores represent a Mode 1 tool-making strategy, and the first appearance of what is also known as the Oldowan Industry, which encompasses the oldest stone tools in Africa and adjacent regions of Eurasia. It features deliberate flaking of stone to produce sharp cutting, chopping, and scraping edges, as well as tool use in the form of unmodified hammerstones and stone anvils used for pounding and grinding.

The basic questions of when, where, who, how, and why stone toolmaking emerged are the subjects of this chapter. Three related issues are explored: the primate heritage of tool use, the anatomical and cognitive requirements for stone toolmaking, and the adaptive advantages of toolmaking in the context of ecological, dietary, and social changes between 3.0 and 1.6 mya. We conclude with a geographical and chronological overview of the distribution of Mode 1 technologies across Africa and farther afield in Eurasia, where they reflect the broadening range of toolmaking hominins beyond Africa.

The common chimpanzee (Pan troglodytes) and, to a lesser extent, the bonobo (Pan paniscus) habitually use a broad range of natural objects as tools, make tools, and have regionally and locally distinct toolkits. The range of tools used and made exceeds those reported for capuchins and includes different contexts of use, such as social displays, defence against predators, and maintenance of personal hygiene. Chimpanzees also do something not seen among capuchin groups; they use two or more tools in a sequence (a toolset) to accomplish a task. Toolsets have been used to penetrate beehives and termite mounds to extract food, and both activities involve considerable forethought and an understanding of cause and effect. Such sequential thinking is also seen in the earliest Mode 1 assemblages from eastern Africa, which suggests that this cognitive capacity for goal-oriented, tool-based behaviours existed among the last common ancestors of apes and hominins.

Tomasello draws a clear distinction between human and chimpanzee cultures that has implications for reconstructions of the co-evolution of tool-making and cognition. He places childhood learning at the heart of a distinctively human ability to retain and transmit innovations over many generations. This ability is based on the human child's facility for cultural learning based on imitation, emulation and teaching by others, all of which allow the individual to absorb the technological and social heritage into which he or she is born. Through social learning, the human child and adult share a collective cultural inheritance that becomes a repository of knowledge and a basis for innovation. This inheritance is the foundation for a potential 'ratchet effect' in cultural evolution by which cumulative knowledge is retained but also transformed over time. Chimpanzees are less capable of learning new behaviours because they lack an understanding of the intention or goals of other tool-users, and instead focus on the physical act or environmental context of a behaviour, only rarely engaging in imitative learning and teaching. This vision of the distinction between chimpanzee and human social learning is controversial, and may have limited utility for distinguishing between early hominin as opposed to Homo sapiens tool-use, but it reinforces the concept that the social context in which learning takes place is of prime evolutionary importance.

In summary, behaviour patterns among living primates, in particular the great apes, provide the foundations for reconstructing the conditions in which tool-use arose and the underlying tool-using abilities of a hypothetical last common ancestor. Tool-assisted feeding probably developed among primates that lived in environments offering nutrient-rich foods that were not accessible using hands or teeth alone, such as termites, hard-shelled nuts, tubers, and large carcasses. Also needed was some minimum level of manual dexterity and the cognitive capacity to innovate. Increased neocortex size among primates, which equates with enhanced information processing and memory, correlates with rates of both innovation and social learning. Innovations are most likely to have taken hold among the most sociable communities where cultural selection retained new skills, especially those that enhanced learning during childhood. Social and physical environments that gave offspring the time to learn from others without being in competition with adults or at risk from predators will have been more conducive to the transgenerational transmission of innovations. An extended childhood and adolescence based on communal provisioning of offspring no doubt provided a stable foundation for social learning and innovation.

We now know that stone tool-making is at least 2.6 million years old, but how much older both tool-use and tool-making might be remains a matter of informed speculation. The evidence for capuchin technology suggests an underlying anthropoid capacity that long preceded the first stone tool-making. There is, however, another route to examining the possible existence of a pre-Oldowan technology, and that is through comparative anatomy of the hands of apes and hominins, as well as experimental observations of tool-use among apes and humans.

Hominin evolution can be characterised by the following trends based on the fossil record and on comparisons with our closest living relatives, the chimpanzees: a shift to habitual (obligate) bipedal locomotion, decreased tooth size, increase in brain size, decrease in skeletal robusticity and sexual dimorphism, an increase in life span, extended childhood growth and period of dependency on others, and an increased reliance on technology and culture. Since the early 1990s, the fossil evidence for the evolution of bipedalism has extended the time span of the earliest hominins to the late Miocene, ~6–5 mya, in close agreement with the estimated date of the last common ancestor of humans and chimpanzees based on genetic data.

Hominin Evolution

The earliest Pliocene hominin currently known is Ardipithecus ramidus (4.4 mya) (Ethiopia), a likely descendant of A. kadabba and also associated with woodland habitats. The phylogenetic relationship is unclear between this taxon and the slightly later and much better known hominins attributed to the genus Australopithecus. The number of species assigned to Australopithecus varies depending on taxonomic approach, with some researchers splitting the fossil record into a number of genera and species based on differing morphological traits (cladistic analysis). Others recognise considerable regional and chronological variation within species, preferring to encompass this diversity within fewer genera until the biological significance of specific traits is better understood and the fossil record expands.

The australopithecines are recognised here as having brains similar in size to chimpanzees, expanded molars and pre-molars (megadonty), small canines, and a wide distribution across Africa, and they are the likely root from which Homo evolved. Australopithecus anamensis (Kenya) is the earliest recognised representative (4.2–3.9 mya), and lived in a mosaic of woodland habitats and grassy floodplains. This species may be the ancestor of A. afarensis (3.6–2.9 mya), which is well known from the fossil record as a habitual biped, though able to climb trees, highly variable in its morphology, and wide ranging across eastern Africa and possibly farther south.

The definition of fossil Homo and the number of species in this genus continues to be debated, with arguments made in favour of expanding or reducing membership. The two earliest representatives, H. habilis and H. rudolfensis (~2.3–2.0 mya), have been argued to be morphologically and behaviourally closer to australopithecines than to H. sapiens, and conversely both are considered as plausible long-legged ancestors of H. erectus. The recent discovery of an upper jaw attributed to H. habilis at Koobi Fora, Kenya (~1.4 mya), not only extends the time range of this species, but also throws some doubt on its position as the ancestor of H. erectus, as both species seem to have co-existed, at least in this part of East Africa. The ancestor of H. erectus may have yet to be found. Uncertainty about the significance of character traits as taxonomic indicators also afflicts the status of H. erectus (~1.8–0.6 mya), which is viewed as either a single but regionally variable species that probably evolved in Africa then spread into Eurasia or an Asian species that developed distinctive robust features and that evolved from an African ancestor, Homo ergaster. We use H. erectus in preference to H. ergaster throughout this text and recognise the likelihood that regional subspecies co-existed.

Making stone tools is a deceptively simple act that involves a complex interplay between bone, muscle, and brain. A skilled knapper understands the flaking properties of stone, the relationships among angle, force, and placement of a blow, and the resulting length and thickness of a flake. These skills take time to learn, but also involve distinctive biomechanical features of human anatomy that are a likely consequence of the evolution of bipedalism. Apes use their hands for locomotion (e.g., climbing, brachiating, or walking as quadrupeds), as well as for manipulating objects. Humans, as terrestrial bipeds, have their arms and hands free from the constraints of locomotion and, compared with apes, have relatively short fingers and a long thumb. The increased relative length of the thumb gives humans greater dexterity, and a unique pad-to-pad form of the precision grip.... In particular, humans have the unique ability to cup the palm of the hand, which plays a central role in power grips. Cupping the hand has the added effect of increasing the area of skin in contact with an object, which means more sensory information can be relayed to the brain.

Given the distinctive form of the human hand with its relatively long opposable thumb and ability to form a deeply cupped palm, it should be possible to assess the toolmaking abilities of early hominins based on fossil hands. Dated to 2.6–2.5 mya, the oldest stone tools currently known come from sites along the northern Awash River and its tributaries at Gona in the Afar region of Ethiopia. Stone flakes and cores have been found associated with faunal remains, some of which have cut-marks from processing animal carcasses, but as yet no hominins have been found with the Gona tools.... In southern Africa, the Oldowan appears later, at 2.0 mya, postdating Australopithecus africanus (3.3–2.5 mya), but coeval with two possible contenders as tool-makers, Paranthropus robustus and early Homo sp.

In eastern Africa, the earliest dates associated with the genus Homo, at 2.4–2.3 mya, postdate the appearance of the Oldowan by at least 100,000 years. Assuming this age for the emergence of early Homo remains unchanged by future research, then stone tool-making was already established by the time our genus appeared and was contemporaneous with Australopithecus and Paranthropus in eastern Africa.

The australopithecine pattern of development resembles that of early Homo, whereas the morphology of the Paranthropus brain most closely resembles that of the great apes. The relative brain size of the australopithecines, including A. afarensis, is as large as, or larger than, that of chimpanzees and gorillas. These indirect indicators of cognitive ability, combined with the anatomical evidence for human-like grips, make the australopithecines prime candidates as early tool-makers and innovators of stone tool technology. Paranthropus, however, cannot be excluded as a possible stone tool-maker based on brain organisation and size alone. Brain size in this genus increased over time along with that of early Homo, and Paranthropus is found on Oldowan sites in both eastern and southern Africa.

Once the Oldowan industry makes its appearance, early Homo in the form of Homo habilis is found with stone tools in both eastern and southern Africa, supporting arguments for a behavioural and phylogenetic linkage between later australopithecines and early Homo. This lineal evolutionary sequence of tool-making from australopithecines to Homo is complicated by the association of Paranthropus with the Oldowan industry. Tool-making patterns need not be species-specific, and in the case of the early Oldowan, from 2.6 to 2.0 mya, there is the likelihood that multiple species of tool-makers coexisted, perhaps developing differing cultural traditions of tool use comparable to those seen today among chimpanzees and bonobos. This potential complexity in the archaeological record raises significant methodological issues of how to recognise traditions within the Oldowan, including those that may not be typical of the flake and core content of this industry. All these issues also apply to the later African archaeological record, but they first arise with the origins of the Oldowan.

The fundamental question of why stone tool-making developed when it did remains to be answered. If the capacity to use and make tools had evolved with the last common ape/hominin ancestor during the Miocene–Pliocene transition (8–5 mya) or perhaps much earlier with the common ancestors of Old and New World monkeys (35–30 mya), why did stone tool technology develop relatively late in the broad sweep of hominin evolution? There is the possibility that an earlier manifestation of stone tool-use exists, but has not been recognised by archaeologists because it does not closely resemble the Oldowan as currently understood. Much of the knapping by the bonobo Kanzi in experimental studies, for example, produced broken flakes and other fragments that would be difficult to distinguish from naturally fractured stone.

An example of what a pre-Oldowan assemblage might look like is provided by the chimpanzees of the Taï Forest, Ivory Coast, who are currently creating archaeological deposits by systematically cracking hard nuts using stone hammers and anvils. The deposits are characterised by an abundance of small fragments of shattered stone (<20 mm), a few flakes, no cores, and battered hammers and anvils. The rarity of flakes and lack of cores distinguish these nut-processing areas from Oldowan sites. Such non-Oldowan sites would also be difficult to recognise in the context of naturally occurring accumulations of rock fragments and battered stones found in river and lakeshore deposits associated with early hominin sites. Actualistic studies are needed to develop criteria for distinguishing between deliberate and unintentional (natural) patterns of surface damage on stone. The Taï Forest excavations are a salutary reminder of the possible diversity of archaeological sites that may have existed before, during, and after the Oldowan, but remain methodologically invisible.

For the time being, the archaeological evidence points to stone tool technology having developed first in eastern Africa by 2.6 mya, then spreading to southern and northern Africa and into Eurasia by 1.5 mya among hominins adapted to savanna habitats (mixed woodland and grasslands). If the period 2.6–2.5 mya marks a genuine threshold of technological innovation, then we must consider what factors stimulated a shift towards greater use of stone technologies — in particular, cutting and pounding tools — to the extent they become archaeologically visible as distinctive Mode 1 accumulations. Our hypothetical last common ancestor used tools as an adjunct to its food procurement strategies, but was not dependent on technology for accessing essential foods. This may also have been the situation among the early australopithecines such as A. afarensis. With the appearance of the Oldowan, the relationships among technology, diet, physiology, and social traditions may have changed for some species, including early Homo, in ways that created new dependencies and adaptive opportunities.

Stone tools are assumed to have played a significant role in the feeding and social strategies of early hominins, and explanations for the emergence of the Oldowan generally stress the adaptive value of flakes and hammerstones as tools for accessing meat and marrow. The systematic incorporation of animal protein into the diet of early hominins has long been regarded as the catalyst for the evolution of distinctive hominin traits, including large brain size, increased intelligence, increased body size, and delayed maturation with an extended childhood supported by food provisioning. Meat-eating has also been credited with the development of human forms of social organisation, including food sharing, nuclear families, base camps, and a sex-based division of labour.

If the dietary range of these [Australopithecus, Paranthropus, and Homo] hominins can be broadly reconstructed, then the possible contribution of stone tools can be assessed, along with the likelihood that one or all species might have benefited from this technology.... Available animal sources would have included reptiles, birds, and rodents, as well as medium and large grazers. Scavenging or hunting large grazers without effective weapons would have exposed hominins to direct competition with dangerous predators, making this a risky source of food. A flexible pattern of omnivory may thus have existed among Australopithecus, Homo, and Paranthropus, with animal inputs involving a small fraction of scavenged or hunted meat and marrow.

The shift to greater dietary flexibility seen with Paranthropus, the later australopithecines, and among early Homo coincides with the onset of global glacial cycles after 2.7 mya and the resulting selection pressures of increased aridity and environmental instability.... The stable carbon isotope evidence suggests these hominins were well-adapted omnivores whose unique dental, skeletal, and muscular features enabled them to process a diverse range of foods. Young paranthropines could have had a weaning advantage over the young of Australopithecus and early Homo by being able to consume difficult-to-access foods (i.e., hard seeds and bone) at a critical time in their development. As a consequence, they would have been less reliant on feeding support from adults, with implications for the structure of Paranthropus social groups and the life histories of individuals. The inherited ability to process tough foods would have lessened the need for the pounding, cracking, or grinding tools that form part of Oldowan assemblages. All three genera, based on the carbon isotope data, were probably omnivores able to consume plant and animal resources in a range of habitats. They may have coexisted in the same landscape at low population levels, each consuming similarly broad spectra of foods. The use of stone tools for processing carcasses as well as vegetable foods could have been part of their respective feeding strategies, though it seems Paranthropus would have had least need for heavy processing equipment such as stone hammers and anvils.

Given the obvious advantages of stone for cutting, scraping, and pounding (and for making other tools), we ask the question again - why did it take so long for hominins to develop this technology? The anatomical and cognitive capacity to make stone tools probably existed with A. afarensis, but this species became extinct by 2.9 mya, before the earliest currently known tools from Gona, Ethiopia.

[One] line of evidence comes from regional and local sources of environmental data that, when combined, show pronounced fluctuations in climate 3.4–1.8 mya. Looking more closely at the pattern of climate data reveals correlations among increased aridity, shifts in vegetation patterns, greater habitat variability, and the emergence of stone tool manufacture. A correlation is not necessarily causal, but climate change, by altering the distribution of plants, animals, and availability of surface water, can create adaptive stresses as well as new opportunities, depending on a species' ability to respond.... The australopithecines and paranthropines were dietary and landscape generalists, and we can infer from primate analogies and the fossil record that tool-using was part of their respective behavioural repertoires. These taxa were thus pre-adapted to the pronounced changes in climate and biogeography that preceded and coincided with the emergence of stone tool manufacture.

The periodicity and amplitude of the shifts between climate states (extremes of cool/dry and warm/humid) govern the distribution, stability, and variability of the plant, animal, and water resources that were essential for early hominins. The northwest African coastal pollen record shows a long, warm, moist period before 3.5 mya with increased aridity between 3.5–3.2 mya that heralds a trend towards drier conditions across the Sahara and Sahel. This trend towards aridity in North Africa culminated ~2.6 mya in association with the growth of ice sheets in the northern hemisphere, marking the start of global glacial cycles. In North Africa, the onset of glaciation was associated with strengthened trade winds, reduced sea-surface temperatures, and the desertification of the Sahara.

The changing abundance of three key ecologically sensitive taxa indicates significant shifts in vegetation cover that can be linked to global and regional shifts in climate. Forest-loving taxa (e.g. suids) dominate assemblages 4.0–3.2 mya, and taxa adapted to more open vegetation become more abundant after 2.5 mya. The rise of this bovid-dominated fauna marks a shift towards increasing seasonal aridity and corresponding loss of forest habitats. The transition from relatively closed forested environments to more open woodland and grasslands was marked by an increase in large-bodied terrestrial primates like Theropithecus, able to range widely to exploit more dispersed resources and defend themselves against the risk of predators in more open landscapes. These gradual trends underlie two episodic shifts in the fauna of the Omo ecosystem that took place ~2.8 and 2.5 mya and are linked to global climate. The onset of orbitally driven glacial cycles 2.5 mya corresponds with increased variability in the Omo fauna and the appearance of stone tools in the sequence.

Global climate change 2.8–2.5 mya has been correlated with large-scale pulses of speciation among African fauna, including hominins, but regionally specific and high-resolution databases are necessary if the causal links are to be made among climatic instability, habitat variability, and hominin responses. The South African cave-derived data lack the temporal resolution of East Africa's rift sequences. However, faunal and isotopic data from Makapansgat and the Sterkfontein Valley show a trend of increasingly arid conditions and a gradual shift from woodland savanna to more mosaic habitats from the mid- to late-Pliocene, characterised by woodlands of varying density and more open grasslands.

The apparent correspondence in timing between the onset of global glaciation, increased aridity, and habitat variability and the appearance of stone tools could be coincidental, but there are other developments in hominin morphological evolution that, taken together, highlight the mid-Pliocene as a time of significant transition. The interval 2.9–2.5 mya saw the evolution of the paranthropines as a lineage of megadont hominins and the large-toothed, but less dentally robust, A. garhi.... An evolutionary trend emerges after 2.6 mya, with the extinction of 'gracile' australopithecines (A. afarensis, A. africanus) and the evolution of even more megadont forms (P. aethiopicus, P. robustus, P. boisei) linked to the spread of more seasonal, arid habitats in which USOs [underground storage organs] were potential fallback foods.

For the paranthropines at least, it seems that tool-use was unnecessary for processing USOs - their robust dental apparatus did the job - but digging sticks to access roots and hammerstones to split open marrow-rich bones, or flakes to remove scraps of meat from carcasses, might still have played a role in their feeding strategies. Shortly after the appearance of stone tools, the fossil record reveals another trend, that of reduced or gracile masticatory apparatus and increased relative brain size. This trend starts with the earliest representatives of the genus Homo (2.4–2.3 mya), Homo rudolfensis and Homo habilis. Considerable uncertainty accompanies the morphological separation of these two taxa and even their attribution to the genus Homo rather than Australopithecus, but we retain the view that whether one or two species, they mark an adaptive shift towards the Homo lineage. Genetic evidence points to a mutation about 2.4 mya that inactivated the gene coding for large chewing muscles, and this may have had the knock-on effect of releasing the physical constraints on brain size created by the force of massive muscle attachments on the skull and its sutures. Stone pounding tools for processing USOs could have acted as surrogate molars and premolars, enabling those hominins with this mutation to coexist with the paranthropines as savanna omnivores. The release of constraints on encephalisation may be expressed in the increased brain size of early Homo notable after 2.0 mya, but had its roots in the complex interplay among environment, physiology, and behaviour. The evolutionary roots may also lie with the gracile australopithecines who, as potential ancestors of Homo spp., lacked sagittal crests to support large chewing muscles and had relatively reduced molars compared with the paranthropines. Whether a genetic release took place or a more subtle combination of factors selected for increased relative brain size, the fossil evidence points to a late Pliocene threshold.

The first peak in aridity ~3.5 mya neatly corresponds with the time span of A. afarensis, which, on anatomical grounds, had the ability to use and make tools and probably the dietary plasticity inferred for later australopithecines. We presume that its descendants, such as A. garhi, responded to increasingly heterogeneous and dry environments ~2.5 mya with the innovation of stone tool manufacture, perhaps derived from a nut-cracking tradition.

Tool-making was certainly not the only behavioural response to environmental pressures available to early hominins, but it is the one that leaves a mark on the landscapes of Africa and beyond. The distribution of Mode 1 sites in eastern, south-central, southern, and northern Africa between 2.6 and 1.8 mya shows tool-making hominins living in regions with increasingly open and seasonal habitats that emerged following the onset of global glacial cycles.

Early hominins adapted to the seasonality of savannas both anatomically (Paranthropus dentition) and behaviourally through tool-assisted access to meat, marrow, insects, and underground plant foods. Other behavioural developments such as food-sharing, pair-bonding, extended parenting, and the use of central gathering places in the landscape may also have enhanced individual and group survival. This typically African biome also extended eastwards across Asia as far as northern China (40°N) by ~3 mya, raising the possibility that australopithecines and paranthropines had extended their range far beyond currently known limits and entered Asia long before the appearance of H. erectus sensu stricto 1.8 mya. The broad expanse of this transcontinental 'savannahstan' could have supported grassland-adapted taxa, and it is not surprising that the earliest Mode 1 technologies found outside Africa are in seasonal savanna habitats, such as Dmanisi, Georgia (1.7 mya and found with early Homo fossils), Erq el-Ahmar, Israel (~1.8 mya), possibly and controversially at Riwat, Pakistan (~2 mya), and in China's Nihewan Basin (1.66 mya). Though Homo is assumed to have been the first tool-maker to spread from Africa, the possibility remains that even earlier hominins, as resource and landscape generalists, were part of this savanna biome. Flint artefacts found at Yiron, Israel, in the Levantine extension of the Rift Valley, may be 2.4 mya and, if so, they reflect a very early movement of Mode 1 makers out of Africa. Further research at Yiron is needed, however, to demonstrate a clear association of the tools with the dated volcanic deposits.

This chapter asked why stone tool-making appeared relatively late in hominin evolution. The answer lies in a combination of evolutionary processes, with global climate change being the most recent stimulus acting on a common hominin heritage of bipedalism, omnivory, and tool-use. The shift to global aridity starting roughly 3.5 mya with the build-up of northern hemisphere ice, and culminating with the start of orbitally driven glacial cycles at 2.5 mya, created both intermittent and sustained ecological pressures. Peaks of aridity interrupted a gradual drying trend followed by the onset of the rhythmic waxing and waning of glacial cycles. Climatic variability and increased seasonality altered the distribution of plant, animal, and water resources, creating strong selective forces that favoured ecological generalists over specialists. The early hominins — australopithecines and paranthropines — developed a range of morphological and behavioural responses to increasingly harsh and unpredictable conditions, with some species becoming extinct as well. Differing contexts of social learning combined with varying traditions of innovation and variations in neocortex size could have given some species greater behavioural plasticity than others. Stone tool technologies contributed to the ecological flexibility of at least one taxon by facilitating its ability to extract and process fallback foods in the context of seasonally dry habitats. The interaction of social intelligence, innovation, and brain size with the environment establishes an evolutionary framework for modelling the emergence of the human dependency on technology 2.6 mya. This is the framework on which the subsequent development of the human lineage was built, enabling it to cope in this new and climatically unstable world and, in the case of Homo, to thrive as the premier landscape generalists.

The Problem of Modern Human Origins, M. Mirazon Lahr

If a gradual and continuous view of modern human origins is assumed, then the regional differences between populations are related to the first establishment of widely dispersed regional hominid groups, whether modern or archaic, for it is assumed that regional continuity of occupation and form occurred. This view has been formulated into an evolutionary hypothesis called the Multiregional Model of modern human origins. It argues that there was no event associated with the appearance of a modern human form, but rather the development of regional variants of a single species which followed similar evolutionary trends because of extensive inter-regional gene flow. The establishment of regional variants would have occurred approximately one million years ago, when Homo erectus expanded out of Africa towards Southeast Asia, Eastern Asia, Central Asia and Europe, while the transition from archaic to modern would have occurred in parallel in all regions because of genetic diffusion. According to this view, a modern human morphology is secondary to regional diversity, and thus, the origin of human diversity precedes the origin of modern humans.

An alternative view would see the record as discontinuous and anachronic, and interprets the appearance of modern humans as a distinct localised event followed by replacement of archaic hominids. This view has also been formalised into an evolutionary model of origins, called the Out of Africa hypothesis. It argues that modern humans appeared in Africa in the late Middle to early Upper Pleistocene, and subsequently expanded to form regional populations which diversified from an already modern ancestral form. According to this view, modern human regional diversity is secondary to a modern morphology, and thus, the origin of modern humans precedes the origin of modern diversity.

The key to this argument lies in the role of gene flow between regional populations of archaic and modern hominids. Besides temporal regional continuity, a multiregional model of Pleistocene hominid evolution requires that large amounts of spatial continuity between regions took place. This large amount of genetic exchange between widely separated areas like Java, East Africa, Europe and China is necessary in order to maintain the temporal overall similarities and prevent local speciation occurring. Again, past gene flow cannot be directly measured, and traditionally two sources of inferential information have been used. The first of these is inferred from current genetic exchange between foraging groups like the Inuits, and transposed into the past. This method gives a good measure of possible gene flow in hunter-gathering groups, but does not take into account past demography and the opening and closing of routes in the past million years. Whether the world population of H. erectus had the necessary critical size to maintain gene flow patterns similar to any modern group is again debatable, and current interpretations of mtDNA data suggest not. The second source of evidence for inter-regional gene flow comes from the appearance of fossils and/or archaeological remains in one area which have strong affinities with another region of the world. These new forms are interpreted as resulting from gene flow from another regional hominid population, which acts to maintain the gradual multiregional change through time.

The issue of whether gradual and universal mechanisms characterise the evolution of modern populations also has a socio-political dimension. It has been argued that the view of punctuated and rapid events creating discontinuity of populations (i.e. population extinction) is a 'racist' view of prehistory, one that highlights the supremacy of one form over the other (be it modern humans over Neanderthals, or one modern population over a previously existing one). However, the question of evolutionary advantages of one group in relation to other closely related groups or even individuals is the whole basis of Darwinian thought. That the observed succession of forms in the fossil record represents the advantage of one form over another is the reality of evolution. This advantage may be biological, social, demographic, or even incidental, in the sense that a population may out-compete another only for the particular circumstances (environmental or demographic) of the moment in which they are competing for resources rather than for an absolute biological or social advantage. In such cases, the replacement mechanism may be reversed as circumstances change.

The evolutionary origin of modern humans has always been a controversial subject of study. Different theoretical assumptions and interpretations of the fossil record exist, and in recent years research has become polarised into two main opposing hypotheses. Both these hypotheses are concerned with explaining the origins of modern humans from an archaic hominid ancestor, from either a single or multiple geographical and biological sources of ancestry. However, the origin of modern human diversity, an issue that is intimately linked with the process of origins of modern humans as a whole, has received much less attention.

In evolutionary terms, the Multiregional Model offers an explanation for the evolution of modern human diversity. By proposing that a modern morphology developed from the regional archaic populations in several regions of the world, this model implies that the modern cranial form was superimposed on existing regional variations, and the differences between recent populations would be as old as the establishment of the archaic regional pattern. Therefore, this model sees regional differences as very ancient and very stable, and the process of modern diversification could be decoupled from that of modern origins. In contrast, the theoretical basis of the opposing view, i.e. a single and recent origin of all modern humans, implies that the origin of the species and that of the groups within the species were the result of different processes occurring at different times. Therefore, it is impossible to address the problem of the evolution of modern diversity without addressing the issue of modern human origins first, since the model of human origins accepted will determine the mode of diversification.

To explore the patterns and processes of the evolution of modern human diversity, either the multiregional or the single origin model has to be assumed with confidence. This book has thus begun with a review of the two opposing views on the origins of modern humans and the existing supporting evidence. The literature indicates that most lines of research, whether palaeontological, chronological, biogeographical, demographical or genetic, point to a single and recent origin of all modern humans. This evidence rests on the early dates of some African and Middle Eastern Homo sapiens fossils that antedate the appearance of modern people elsewhere, on the apparent contemporaneity of supposedly ancestral-descendant populations in some areas, on the patterns of faunal movements and associations with the hominid fossils suggesting the dispersal of early modern humans out of Africa, on reconstructions of past demographic parameters showing that gene flow could not have been maintained until a critical world population size was achieved late in the Pleistocene, and on extensive nuclear and mtDNA genetic studies which indicate the high probability of an African ancestry of all modern people in the recent past. In spite of this impressive body of evidence indicating a single source of ancestry of all modern people, the controversy is not resolved, for there are a number of cranial features that seem to link regional archaic populations with the recent modern inhabitants in those regions, and thus prove a multiregional pattern of descent.

Since the hypothesis of a single origin is supported by several lines of evidence, the solution to the problem lies in confirming or refuting the multiple archaic-modern regional lines of descent. The first section of this book examined these lines of descent in space and time, as well as the assumption that the particular features that reflect the multiregional evolutionary process (the 'regional continuity traits') can be taken to represent stable and independent phylogenetic markers. Therefore, the work presented in the first section of this book studied the problem of modern human origins from a different perspective from most other studies. It is argued that a re-evaluation of the character, temporo-spatial expression and phylogenetic validity of those traits proposed by the Multiregional Model as showing regional continuity through time would either substantiate or discard the morphological evidence for this model. A study of the regional distribution and anatomical relationships of these traits using a worldwide sample of modern human crania was carried out. Furthermore, using a worldwide cranial sample as representative of modern morphological variability, a standardised scoring technique was developed. This scoring system is based on the definition of the observed degrees of expression of a number of features into categorical grades, which in turn allowed quantification and statistical testing of their regional incidence.
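As a purely illustrative sketch of the kind of procedure described above (Lahr's actual traits, grade definitions and samples are not reproduced here), the approach of scoring a trait's degree of expression into categorical grades, tabulating its incidence by region, and then testing whether the regional distributions differ could look like the following Python fragment; the trait, regions and counts are hypothetical.

    # Illustrative sketch only: trait, grade definitions, regions and counts are
    # hypothetical, not Lahr's data. The general approach is the one described in
    # the text: categorical grades of expression, tabulated by region, then tested
    # statistically for differences in regional incidence.
    from scipy.stats import chi2_contingency

    # Hypothetical grades for one 'regional continuity trait':
    # 0 = absent, 1 = trace, 2 = moderate, 3 = marked.
    observed_counts = {
        "East Asia": [12, 30, 25, 8],
        "Australia": [10, 28, 27, 10],
        "Africa":    [15, 29, 22, 9],
        "Europe":    [14, 31, 23, 7],
    }

    table = list(observed_counts.values())            # rows = regions, columns = grades
    chi2, p, dof, expected = chi2_contingency(table)  # chi-square test of independence
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

    # A non-significant result would mean the trait's incidence does not distinguish
    # regions, i.e. it is a poor candidate for a stable regional phylogenetic marker.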

The re-evaluation of the morphological evidence for multiregional evolution clearly shows that the particular features that have been proposed as characterising the evolution of Chinese and Javanese H. erectus into modern Chinese and Australian aboriginal populations fail to do so on several counts. Firstly, in a number of cases the recent spatial distribution does not confirm certain features as typical of either Chinese or Australian Aborigines, either because their expression is not exclusive to those regions, or because they do not occur in these two populations at all. Furthermore, it was found that a number of supposedly Javanese-Australian features were not typical of the Ngandong sample, while in other cases these archaic hominids clearly show anatomical structures non-homologous with the modern condition. In terms of the temporal distribution of the 'regional continuity traits', it was not possible to identify a clear morphological lineage in the late Pleistocene modern humans of China, strongly rejecting a view of morphological stability over a period of a million years.

On the basis that the morphological evidence for regional lineages of descent from the early Pleistocene is partly incorrect and partly wrongly applied to evolutionary studies, the results obtained strongly refute a multiregional evolution of modern humans. In the absence of contradicting morphological evidence, a single and recent origin of all modern humans, as indicated by all other lines of evidence, can be assumed. Accepting a single and recent origin for the modern morphology implies that all the differentiation between modern populations occurred subsequent to this evolutionary event.

The time and rate at which different regional populations seem to have diversified during the late Upper Pleistocene suggest that this process [of variation in cranial form] was not synchronous throughout the world, showing varying levels of specialisation in relation to an ancestral morphology at any one time. Such differences are consistent with the view that different evolutionary mechanisms were acting on each population at any one time, reflecting the historical component in the determination of regional variation. In this context, the different levels of gene flow, selection and drift acting upon populations to create diversity would have been closely associated with the different geographical parameters of each group. Therefore, the specific circumstances affecting each population at different times had a historical component (in terms of the character and time of origin of the population) and a geographical component (in terms of the level of isolation in which that population is responding to the genetic mechanisms involved). These different components had a major role in determining the level of distinctiveness of populations.

The evidence of modern human diversification suggests that geographical differentiation was not only an integral part of the process, but that early geographical divisions account for the major divisions among humans today. However, the data further indicate that early geographical separation was not followed by gradual change at a regional level, but rather by a very dynamic process of population expansions and contractions, constantly altering the balance between stability and change. These patterns are best explained by a model of multiple dispersals from the ancestral source in Africa and from the secondary sources outside Africa, giving rise to more than one hierarchical chain of events. Depending on the level of ecological fragmentation determining the level of gene flow between groups, the process of multiple dispersals superimposed on the range of previous groups within each region will either create a mosaic of distinctive populations or relatively homogeneous morphologies with some clinal variations.

In terms of behavioural changes, the appearance and expansion of modern humans is not associated with a significant behavioural event, and it is possible to hypothesise that whatever changes in behaviour allowed a small population to survive the stringent conditions that drove it to reduce its numbers dramatically, and later allowed it to expand over wide geographical areas, had already occurred by the time a modern morphology appears in the fossil record. This implies that such a behavioural change was not reflected in the type of stone tool technology, for late archaics, early modern humans and Neanderthals all share a technological complex. The distinctive Upper Palaeolithic of the late Pleistocene was a localised event associated with the expansion of modern humans into Eurasia, and not with modern human origins per se. This implies that particular tool technologies are not associated with biological levels of organisation, and therefore their small stylistic variations most likely reflect the identity of the groups producing them. And lastly, it is clear that the origins of a modern morphology have to be dissociated from the process of skeletal gracilisation, which, although it characterises most modern populations at certain stages, never became a universal trend and was not part of the process of modern differentiation from an archaic source.

The data presented in this study support a single origin of modern humans in Africa in the late Middle to early Upper Pleistocene, followed by a process of multiple dispersals and early spatial differentiation throughout the Upper Pleistocene. This hypothesis rests on the integration of levels of morphological variation and distribution of archaeological remains, and is broadly consistent with statistical reconstructions of population relationships based on recent cranial and genetic data. However, the integration of different sources of prehistoric data is still very limited and faced with the problem of interpreting the patterns of cultural versus biological diversity, and those of recent versus past diversity. Discovering the specific history of each modern population, including all those whose existence we can only infer from the fossil record, remains the greatest challenge.

The Human Revolution, C. Gamble

When it comes to deciding what it is to be human and even more when we ask what constitutes humanity, our bodies are as important as our cultures, our minds as significant as our biology. But interpreting evolutionary clues, whether from bodies or culture, is never easy, nor are the results definitive. Answers to any question about ourselves will always be informed by contested issues such as race, intelligence, politics, gender and age. In such company, the concept of revolution is just another arena in which differences are defined, but an important one for establishing identity. At issue is the tempo of our appearance, slow or fast, and whether we — as humans — are a recent or ancient phenomenon. And the answer matters not only because it points to our capacity for change but also to what we think we share with others, past and present.

Identity, whether of self or society, is traditionally an exercise in drawing boundaries and the identification of revolutions has provided a useful hammer to drive in the historical posts. But there are different revolutions for different jobs. The Human Revolution draws the line between ourselves and the ultimate other, the outgroup which defines us and gives us meaning as a species and as humans rather than animals. But unlike the Neolithic Revolution, where social strangers live among or alongside us, with the Human Revolution we only glimpse those others in the rear-view mirror of prehistory. What I question is whether these revolutions are the correct conceptual tools for understanding change, because the outcome of our evolutionary journey has left humanity equally at home in an urban metropolis and a tropical rainforest, and in societies of millions or hundreds... In this chapter I will be looking in some detail at the notion of universal humanity and discovering how the concept of an anatomically modern human, the agent provocateur of the Human Revolution, arose as a scientifically meaningful adjunct to the study of Homo sapiens, ourselves, as a diasporic species with global coverage.

Although they disagree with Klein's recent Human Revolution, Sally McBrearty and Alison Brooks are nonetheless confident that fully modern human behaviour can be characterised in cognitive and cultural terms. Their scheme sets out what many archaeologists regard as a workable definition of humanity; workable in the sense that data do exist to back up the descriptions and so provide insights into the tempo as well as the geographical origin of change. But would anyone outside archaeology recognise those four cognitive skills — planning, symbolic behaviour, abstraction and innovation — as an adequate definition of what it is to be human? Even within the subject Francesco d'Errico regards their list as very limited and pre-judgemental of the outcome they want; namely that the Human Revolution has been overstated by judicious use of the European data and misrepresentation of the African evidence. Although they provide a check-list and a timetable it is McBrearty and Brooks' opinion that the unique behaviours of modern humans that would qualify them for the 'fully' prefix are to be discovered rather than prescribed. But here they become caught in the circularity common to all such archaeological exercises in origins research.

Is there an alternative? Elsewhere I have argued for a social explanation rather than either a cognitive re-organisation or genetic mutation to account for the differences in the archaeology. One striking feature of world prehistory after 60,000 years ago is that it is for the first time just that, a world prehistory. After this date Homo sapiens began their diaspora and in less than 1 per cent of the time since we last had a common ancestor with our closest primate relatives, the chimpanzees, we had migrated to the previously uninhabited islands and continents that before made up almost three-quarters of the Earth. From being a hominin long confined to a portion of the Old World we suddenly became global humans. Furthermore, for the first time in our evolutionary history we became a single species differentiated only by geographical variation. What we see in those ocean voyages, and the settlement of the seasonally cold interiors of continents, is a clear geographical signature that social life was fully released from the constraint of proximity that explains why most primates are not world travellers. What characterises social life in humans rather than hominids is our ability to extend social relations across space and through time, a theme I return to in Chapter 8. This is in marked contrast to all our primate cousins whose social life is based on local co-presence.

The point about the Human Revolution as outlined in the publication ['The Human Revolution' edited by Mellars and Stringer] was that it was recent. To emphasise its late appearance in human evolution it is also known as the symbolic revolution, when representational art appears for the first time; a revolution best summed up by the title of John Pfeiffer's review: The creative explosion. This revolution was associated by William Noble and Iain Davidson with the appearance of language, by Steven Mithen with a fully modern modular brain characterised by cognitive fluidity, by Lynn Wadley with evidence for external symbolic storage (art, ornamentation, style in lithics and the formal use of space) and by Robin Dunbar with the late appearance in human evolution of a theory of mind capable of the cognitive gymnastics implicated in religious beliefs.

The focus was very much on Europe but Klein in particular broadened the enquiry to include southern Africa so that twenty years later a full ten-point list, comparable to the one Childe proposed for the Urban Revolution, could be published.

Behavioural Traits List

But not everyone is convinced by the need for a revolution and the dissent has been greatest among those working in Africa. The attack is twofold. First, they have called into question the European-ness of the Human Revolution. The Upper Palaeolithic revolution, and the replacement of Neanderthals by modern humans is, they argue, a provincial concern and should not be extended out to the rest of the world. For example, Hilary Deacon claims that European criteria for cultural modernity, such as Upper Palaeolithic art and ornament, are irrelevant in the context of the African origins of modern behaviour. The African evidence he cites as critical includes, among other aspects, spatial rules of cleanliness, colour symbolism and the reciprocal exchange of artefacts. Second, the critics point out that Africa has consistently produced earlier examples of items in Klein's check-list than Europe or anywhere else for that matter.

Francesco D'Errico in particular has strongly criticised the definitions of behavioural modernity that African archaeologists use. He shows that most of the defining traits are seen among Neanderthals in Europe and the Near East. Following this demonstration he then develops a model of convergent evolution. He argues that common ecological pressures, combined with permeable cultural barriers, led to the separate development of those distinctive ornaments and visual culture around which so much of the debate over behavioural modernity revolves. To account for the European creative outburst he suggests that: "the new situation involving contact between anatomically modern people and Neanderthals and the consequent problems of cultural and biological identity... stimulated an explosion in the production of symbolic objects on both sides." But while he scores a double hit, disposing of both the Human Revolution and primacy for Africa, or any other geographical area for that matter, this model still falls short as an explanation for change. What exactly are those common ecological pressures that stretched across the Old World? And if barriers were permeable to this degree then why was the response to crank out more symbolic objects to keep up with the thoroughly modern Joneses? Arguing from the European evidence, Mellars unequivocally shows that such convergence is an impossible coincidence.

With the focus on Africa rather than Europe it does indeed seem that this was, as McBrearty and Brooks argue, 'the revolution that wasn't'. What they prefer is an extended version of continuity with the slow accretion of significant traits over a very long time-scale. Rather than a revolution they favour a gradual unfolding of modern traits and Africa, as the current evidence shows, is the place to look. But exactly how we will recognise modern behaviours, if we have not already specified what they are, is unclear. It seems we are to know modern humans not only by what they did but by what they chose not to do, as long as this was in Africa. McBrearty and Brooks' conclusion is that behavioural modernity is African in origin if it can be said to originate anywhere. But they are less forthcoming with an explanation other than that the pattern points to cultural intensification. But why and how is not addressed. Theirs is very much an argument based on timetables and does not deal with issues of change. The timetables are not disputed: Africa came first with many of the innovations by which modern humans are recognised. But this does not rule out, as was the case in Europe and the Near East, an explosion between 60,000 and 30,000 years ago in the frequency and ubiquity of these items, in what has been called an Upper Palaeolithic, rather than Human, revolution.

None of the arguments about a Human Revolution have impressed those archaeologists studying the issue from a Neolithic rather than Palaeolithic perspective. What happens before agriculture is still, it seems, irrelevant to the transformation of the species. The other, the outgroup to humanity, continues to include both anatomically and fully modern humans. And why? Because humanity is only made up of farmers. This is the sapient paradox that Colin Renfrew has drawn attention to and that Alasdair Whittle attributes to the horror of the vacuum. It arises from the understanding of anatomically modern humans apparently equipped with modern minds and language, but showing none of the haste to change society, economy and material culture that is so evident after the appearance of farming. Renfrew's judgement on the Human Revolution of 40,000 years ago is revealing: "After this momentous conjuncture (if such it was), looking at the question broadly and at a distance, there were few decisive happenings in human existence for another 30,000 years. Hunter-gatherer communities peopled much of the earth - what the biologists term an adaptive radiation. But there were few other profound and long-lasting changes, at any rate when the picture is perceived in very general terms, until the end of the Pleistocene period."... His argument is that sedentism and the built environment that came with agriculture allowed a much more varied relationship with the material world to develop. For Renfrew, this was the 'true human revolution' when new concepts of value developed and a symbolic explosion occurred that touched all aspects of social life.

The difference between the two models can be summed up as biology versus history. For multi-regionalists human diversity is an expression of evolutionary principles such as convergent or parallel evolution where comparable adaptations have evolved to meet similar selection pressures from the environment. Among those who follow recent origins, biology is still important but is under selection from the many contingencies that attended the history of our diaspora from Africa. The issue in this lively and often acrimonious debate, as Proctor points out, is about which standpoint accords sufficient dignity to early hominins such as Herto or Neanderthal so that palaeoanthropologists can continue to escape the charge of fuelling racist agendas. To form a judgement we need to consider not just our own palaeoanthropological categories, such as anatomically and fully modern humans, but the political move to recognise humans as universal subjects because of their evolutionary endowment and growth.

The Birth of History, S. Mithen

Human history began in 50,000 BC. Or thereabouts. Perhaps 100,000 BC, but certainly not before. Human evolution has a far longer pedigree - at least three billion years have passed since the origin of life, and six million since our lineage split from that of the chimpanzee. History, the cumulative development of events and knowledge, is a recent and remarkably brief affair. Little of significance happened until 20,000 BC - people simply continued living as hunter-gatherers, just as their ancestors had been doing for millions of years. They lived in small communities and never remained within one settlement for very long. A few cave walls were painted and some rather fine hunting weapons were made; but there were no events that influenced the course of future history, that which created the modern world. Then came an astonishing 15,000 years that saw the origin of farming, towns and civilisation. By 5000 BC the foundations of the modern world had been laid and nothing that came after - classical Greece, the Industrial Revolution, the atomic age, the Internet - has ever matched the significance of those events. If 50,000 BC marked the birth of history, 20,000-5000 BC was its coming of age.

For history to begin, people required the modern mind - one quite different to that of any human ancestor or other species alive today. It is a mind with seemingly unlimited powers of imagination, curiosity and invention. The story of its origin is one that I have already told - or at least tried to tell - in my 1996 book, The Prehistory of the Mind. Whether the theory I proposed — of how multiple specialised intelligences merged to create a 'cognitively fluid' mind - is entirely right, wrong or somewhere in between is not an issue for the history that I will now recount. All the reader must accept is that by 50,000 years ago, a peculiarly creative mind had evolved. This book addresses a simple question: what happened next?

The peak of the last ice age occurred at around 20,000 BC and is known as the last glacial maximum, or LGM. Before this date, people were thin on the ground and struggling with a deteriorating climate. Subtle changes in the planet's orbit around the sun had caused massive ice sheets to expand across much of North America, northern Europe and Asia. The planet was gripped by drought; sea level had fallen to expose vast and often barren coastal plains. Human communities survived the harshest conditions by retreating to refugia where firewood and foodstuffs could still be found.

Soon after 20,000 BC global warming began. Initially this was rather slow and uneven — many small ups and downs of temperature and rainfall. By 15,000 BC the great ice sheets had begun to melt; by 12,000 BC the climate had started to fluctuate, with dramatic surges of warmth and rain followed by sudden returns to cold and drought. Soon after 10,000 BC there was an astonishing spurt of global warming that brought the ice age to its close and ushered in the Holocene world, that in which we live today. It was during these 10,000 years of global warming and its immediate aftermath that the course of human history changed.

By 5000 BC many people throughout the world lived by farming. New types of animals and plants — domesticated species — had appeared; the farmers inhabited permanent villages and towns, and supported specialist craftsmen, priests and chiefs. Indeed, they were little different to us today: the Rubicon of history had been crossed - from a lifestyle of hunting and gathering to that of farming. Those who remained as hunter-gatherers were also now living in a manner quite different to that of their ancestors at the LGM. The remit of this history is to explore how and why such developments occurred - whether they led to farming or new types of hunting and gathering. It is a global history, the story of all people living upon planet earth between 20,000 and 5000 BC.

This was not the first time that the planet had undergone global warming. Our ancestors and relatives - the Homo erectus, H. heidelbergensis and H. neanderthalensis of human evolution - had lived through equivalent periods of climate change as the planet see-sawed from ice age to interglacial and back every 100,000 years. They had responded by doing much the same as they had always done: their populations expanded and contracted, they adapted to changed environments and adjusted the tools they made. Rather than creating history, they simply engaged in an endless round of adaptation and readaptation to their changing world.

Neither was it the last. In the early twentieth century AD, global warming began anew and continues apace today. Once again new types of plants and animals are being created, this time through intentional genetic engineering. Like these novel organisms, our modern-day global warming is a product of human activity alone - the burning of fossil fuels and mass deforestation. These have increased the extent of greenhouse gases in the atmosphere and may raise global temperatures far beyond that which nature alone can do. The future impacts of renewed global warming and genetically modified organisms on our environment and society are quite unknown. One day a history of our future times will be written to replace the multitude of speculations and forecasts with which we struggle today. But before that we must have a history of the past.

The people who lived between 20,000 and 5000 BC have left no letters or diaries that describe their lives and the events they both made and witnessed. The towns, trade and craftsmen had to be in place before the invention of writing could occur. So rather than drawing on written records, this history examines the rubbish that people left behind - people whose names and identities will never be known. It relies on their stone tools, pottery vessels, fireplaces, food debris, deserted dwellings and many other objects of archaeological study, such as monuments, burials and rock art. It draws on evidence about past environmental change, such as pollen grains and beetle wings trapped in ancient sediments. Occasionally it gains some help from the modern world because the genes we carry and the languages we speak can tell us about the past.

The World at 20,000 BC, S. Mithen

The world at 20,000 BC is inhospitable, a cold, dry and windy planet with frequent storms and a dust-laden atmosphere. The lower sea level has joined some land masses together and created extensive coastal plains. Tasmania, Australia and New Guinea are one; so are Borneo, Java and Thailand, which form mountain chains within the largest extent of rainforest on planet earth. The Sahara, Gobi and other sandy deserts are greatly expanded in extent. Britain is no more than a peninsula of Europe, its north buried below the ice, its south a polar desert. Much of North America is smothered by a giant dome of ice. Human communities have been forced to abandon many regions they had inhabited before the last glacial maximum, or LGM; others are amenable to settlement but remain unoccupied because any routes for colonisation have been blocked by dry desert and walls of ice. People survive wherever they can, struggling with freezing temperatures and persistent drought.

Soon after 2 million years ago the first human-like species appeared, one that archaeologists call Homo ergaster. This was the first of our ancestors that spread out of Africa. It did so with extraordinary speed, reaching Southeast Asia perhaps as long as 1.6 million years ago. Homo ergaster had at least two evolutionary descendants, H. erectus in eastern Asia and H. heidelbergensis in Africa. The latter dispersed into Europe and gave rise to the Neanderthals - H. neanderthalensis - at about 250,000 years ago. The Neanderthals were an evolutionary dead-end, as was H. erectus in Asia. Nevertheless, both of these were extremely successful species that lived through great swings of climate. It was during one especially harsh glacial period at c. 130,000 years ago that H. sapiens evolved in Africa - the earliest specimen being found at Omo Kibish in Ethiopia. This new species behaved in quite a different way to those that had preceded it: the archaeological record begins to show traces of art, ritual and a new range of technology, reflecting a more creative mind. H. sapiens rapidly replaced all existing human species, pushing the Neanderthals and H. erectus into extinction.

Soon after 30,000 BC H. sapiens was the only type of human left on the planet; it was found throughout Africa, Europe and across much of Asia. A remarkable thirst for travel took some of its members to the southernmost reaches of Australasia, which would become the future island of Tasmania. By then, however, the climate was heading to the depths of the last ice age: temperatures were plummeting; droughts were persisting; glaciers, ice sheets and deserts were expanding; sea level was falling. Plants, animals and people either had to adjust where and how they lived, or become extinct. How many people were alive on planet earth at the LGM? Taking into account the large areas of uninhabitable regions, the harsh climatic conditions that induced early mortality, and the fact that modern genetics has suggested that only 10,000 modern humans were alive 130,000 years ago, we can guess at a figure of around one million. But this really is a guess; trying to estimate past population sizes is one of the most difficult tasks that archaeologists face.

Unlike the global warming we face today, that which came after 20,000 BC was entirely natural. It was just the most recent switch from a 'cold and dry' to a 'warm and wet' period in the earth's history - from a 'glacial' to an 'interglacial' state. The ultimate cause of such climatic change lies in regular alterations in the earth's orbit around the sun... While these changes in the shape, the tilt and the wobble of the earth's orbit will alter the earth's climate, scientists think that they are insufficient in themselves to account for the immense magnitude and speed of past climate change. Processes happening on the planet itself must have substantially amplified the slight changes they induced. Several of these are known: changes in ocean and atmospheric currents, the build-up of greenhouse gases (principally carbon dioxide) and the growth of the ice sheets themselves (which reflect increasing amounts of solar radiation as they increase in size). The combined impact of orbital change and amplifying mechanisms has been the see-sawing of climate from glacial to interglacial and back every 100,000 years, often with an extraordinarily rapid switch from one state to another.

'The Blessings of Civilisation', S. Mithen

20,000 BC was a time of global economic equality when everyone lived as hunter-gatherers in a world of extensive ice sheets, tundra and desert. By 5000 BC, many were living as farmers. Some people grew wheat and barley, others rice, taro or squash. Some lived by herding animals, some by trade and others by making crafts. A world of temporary campsites had been replaced by one with villages and towns, a world with mammoths had been transformed into one with domesticated sheep and cattle. The path towards the huge global disparities of wealth with which we live today had been set. Many hunter-gatherers survived but their fate had been sealed when agriculture began. The new farmers, eager for land and trade, continued to disrupt hunter-gatherer lives. They were followed by warlords and then nation-states building empires in every corner of the world. Some hunter-gatherers survived until recent times by living in those places where farmers could not go: the Inuit, the Kalahari Bushmen and the Desert Aborigines. But even these communities are no more, effectively killed off by the twentieth century.

It is no coincidence that human history reached a turning point during a period of global warming. All communities were faced with the impact of environmental change - sudden catastrophic floods, the gradual loss of coastal lands, the failure of migratory herds, the spread of thick and often unproductive forest. And along with the problems, all communities faced new opportunities to develop, discover, explore and colonise. The consequences were different on each continent. Western Asia, for example, happened to have a suite of wild plants suited to cultivation. North America had wild animals that were liable to extinction once human hunting combined with climate change. Africa was so well endowed with edible wild plants that cultivation had not even begun there by 5000 BC. Australia likewise. Europe lacked its own potential cultivars but it had the soils and climate in which the cereals and animals domesticated elsewhere would thrive. South America had its vicuna and North Africa its wild cattle; Mexico its squash and teosinte, the Yangtze valley its wild rice.

Continents, and regions within continents, also had their own particular environmental history, defined by their size, shape and place within the world. The people who lived in Europe and western Asia had the most challenging roller-coaster ride of environmental change. Those living in the central Australian desert and the Amazonian forest had the least. The type of woodland that spread in northern Europe favoured human settlement, while that in Tasmania caused the abandonment of its valleys. The melting of the northern ice sheets caused the loss of coastal plains throughout the world with the exception of the far north, where precisely the opposite occurred when the land, freed from its burden of ice, rose faster than the sea.

Although the history of any region was conditioned by the type of wild resources it possessed and the specific character of its environmental change, neither of these determined the historical events that occurred. People always had choices and made decisions from day to day, albeit with little thought or knowledge of what consequences might follow. No one planting wild seed in the vicinity of Jericho or Pengtoushan, tending squash close to Guilá Naquitz or digging ditches at Kuk Swamp, anticipated the type of world that farming would create.

Human history arose from accident as much as by design, and the paths of historical change were many and varied. In western Asia, hunter-gatherers settled down to live in permanent villages before they began to farm, just as they did in Japan and on the Ganges plain. Conversely, plant cultivation in Mexico and New Guinea led to domesticated plants and farming long before permanent settlement appeared. In North Africa, cattle came before crops, just as vicuna came before quinua in the Andes. In Japan and the Sahara the invention of pottery preceded the start of farming, whereas it occurred simultaneously with the origin of rice farming in China; its invention in western Asia came about long after farming towns had begun to flourish.

Who could have predicted the course that history would take? At 20,000 BC, Southwest Europe set the cultural pace with its ice-age art; by 8000 BC it was an entirely undistinguished region. At 7500 BC, western Asia had towns housing more than a thousand people, but within a millennium itinerant pastoralists were making campsites within their ruins. Who would have imagined that the Americas, the last continent to be colonised, the last to begin a history of its own, would have become the most powerful nation on planet earth today, its culture pervading every corner of the world? Or that the very first civilisation would have arisen in Mesopotamia? Or that Australia would remain a land of hunter-gatherers while farming flourished in New Guinea?

While the history of each continent was unique, and has required its own specific mix of narrative and causal argument to explain, some forces of historical change were common to all. Global warming was one. Human population growth was another; this occurred throughout the world as people were freed from the high mortality imposed by ice-age droughts and cold and required new forms of society and economy irrespective of environmental change.

A third common factor was species identity. All people in all continents at 20,000 BC were members of Homo sapiens, a single and recently evolved species of humankind. As such, they shared the same biological drives and the means to achieve them - a mix of co-operation and competition, sharing and selfishness, virtue and violence. All possessed a peculiar type of mind, one with an insatiable curiosity and new-found creativity. This mind - one quite different to that of any human ancestor - enabled people to colonise, to invent, to solve problems, and to create new religious beliefs and styles of art. Without it, there would have been no human history but merely a continuous cycle of the adaptation and readaptation to environmental change that had begun several million years ago when our genus first evolved. Instead, all of these common factors combined, engaging with each continent's unique conditions and a succession of historical contingencies and events, to create a world that included farmers, towns, craftsmen and traders. Indeed, by 5000 BC there was very little left for later history to do; all the groundwork for the modern world had been completed. History had simply to unfold until it reached the present day.

What Are Human Beings?, R. Foley

This is the paradoxical position in which many find themselves: Humans are descended from something like an ape, and yet they are significantly, perhaps irrevocably different. Somehow most people manage to maintain these two, apparently mutually exclusive, views simultaneously. How the human mind manages this may be yet further evidence of the extent to which humans have developed a subtle and powerful brain. That brain is one of the symbols of the difference between humans and the rest of the animal kingdom. It is several times larger than it should be, and is clearly essential to human survival and evolutionary success. The modern human brain is approximately 1400 grams in weight. In general, brain weight is closely related to the overall body weight of an animal, and were humans to have the size of brain expected for an animal of our size, it would be about 500 grams.
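To make the comparison concrete, the ratio of observed to expected brain weight (the idea behind the encephalization quotient) can be computed directly from the two figures quoted above; the short Python sketch below uses only those figures and is not Foley's own calculation.

    # Minimal sketch using only the figures quoted in the text; the 'expected' value
    # would normally come from an allometric fit of brain weight against body weight
    # across mammals, whose exact constants are not given here.
    observed_brain_g = 1400.0   # approximate modern human brain weight
    expected_brain_g = 500.0    # brain weight expected for a mammal of our body size

    encephalization_quotient = observed_brain_g / expected_brain_g
    print(f"EQ ~ {encephalization_quotient:.1f}")   # about 2.8, i.e. several times larger than expected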

That large brain undoubtedly facilitates a wide range of distinctively human abilities. Most obvious is linguistic skill. Humans can utter a vast array of sounds, but more importantly, they can order them using diverse and flexible rules, and they can understand the meanings, overt and implicit, of those sounds. Humans can create images in any number of media, and, again, give order to them. They may create order in the material world as well, with technologies that may be brilliant in either their sheer simplicity - the boomerang - or their complexity - spacecraft capable of collecting data from distant planets and sending it back to Earth.

None of this would have occurred if all human activity were based on individual prowess. A million monkeys typing for a million years might produce the works of Shakespeare, but they would be unlikely to be able to manufacture, market and distribute those typewriters in the first place. That would require considerable co-operative behaviour and an organized society. Human society is based on complex networks of social and economic interaction - co-operation, competition, dependence, altruism, friendship and enmity.

To [Alfred Wallace] natural selection was simply not powerful enough to bridge the gap in behaviour between animals and humans. His solution was to formalize this gap, and to propose that while natural selection produced the basic diversity of life, and led to adaptation among plants and animals, the brain of humans was the product of divine intervention. This solution is one that is echoed in much later work on the evolution of human behaviour, and lies at the heart of the recent debate on sociobiology, where the tenets and principles of behavioural ecology were accepted for animals, but their application to humans was both questioned and rejected. Humans were a unique and special case, beyond the power of normal evolutionary processes. Darwin himself was less certain of this, and explicitly extended his theory to incorporate humans. In general terms this was done in The Descent of Man, which is principally a catalogue of similarities between humans and other animals, particularly primates, running the whole gamut of characteristics from the digestive tract to morality. However, Darwin recognized that the principal problem in applying evolutionary principles to humans lay in the realm of human behaviour.

It turns out that the co-founders of modern evolutionary theory had established the framework for much later debate about the nature and evolution of human behaviour. On the one hand, Darwin focused on the selective mechanisms by which behaviour could be treated within the same framework as morphological characteristics such as the shape of bones or the size of muscles, and was therefore a true evolutionary problem. Furthermore, behaviour in humans was on a continuum with that of other animals. Wallace, on the other hand, claimed that selection was too weak a process, and adaptation too much an agent of fine tuning, to be extended to all realms of organic diversity, and in particular, to complex and sophisticated behaviour. Much twentieth-century work on the evolution of behaviour, human and animal, played out in departments of zoology, psychology and anthropology, in the laboratory and in the field, has been a continuation of this debate. The status and standing of apes, angels and human uniqueness have risen and fallen as new results and ideas have emerged.

Along with the growth of knowledge of animal behaviour has come a greater understanding of the diversity of human life and, to some extent, a loss of confidence in the extent to which humans could be said to be on a pedestal above the swamp of animal brutishness. The camps of Dachau and Belsen, the millions killed in religious wars, the extent of poverty, famine and disease, and the almost boundless capacity of humans to do damage to each other at national and personal levels have, in the twentieth century, rather dented human self-esteem. Furthermore, as social anthropologists have revealed the richness and complexity of so-called primitive life, and to some extent implied that the simpler the society the greater the harmony and level of individual happiness, the more difficulty people have had holding on to the Victorian notion of a ladder of progress climbing closer and closer, hand in hand with technological and economic development, to the level of the angels.

One of the central problems is to decide how to characterize the human species and human features. Clearly, to explain why they should occur, we need to know what they are. The conceptual divide between apes and angels has acted as a barrier to coming to any such understanding. Apes and angels are really just ideals, and extremely nebulous ones at that. If evolution is a dynamic, shifting process, these static ideals are inadequate, and force both social and biological scientists into harder and more extreme positions, with a major obstacle to communication between them arising as a result. At one extreme, humans might be seen as nothing more than naked apes, and we can cheerfully and uncritically throw every evolutionary method at them in the hope of unravelling their mysteries. At the other, we may decide that in the process of becoming human a rubicon of evolution has been crossed that itself washes clean our evolutionary ancestry.... Instead humans should be defined in an all-embracing manner, and then the adequacy of Darwinian evolution as an explanation can be tested. Such a definition itself becomes both interesting and elusive once the relative security of the present species is abandoned for the unknown of the fossil past - the humans that lived before humanity.

Humans are bipedal, that is they walk in an upright manner on two rather than four limbs. Bipedalism is probably the most physically obvious human feature. It is unique among primates, and is an adaptation that has had a pronounced effect on the entire musculo-skeletal system.... Bipedalism may have made possible a number of other human features, in particular manual dexterity. Humans, along with other primates, have grasping, sensitive, five-fingered hands. Most primates are capable of high levels of manipulative dexterity, but this is found in its most extreme form among humans, who have thumbs that are capable of opposing virtually any of the other digits. Other anatomical features are also striking. Humans have very large brains for their body size, and faces that have become reduced and flat. The hair over most of our bodies is miniaturized (not absent, as is implied by the concept of the 'naked ape'), and the skin therefore exposed. Humans also have a potential for copious sweating.

The other distinctive characteristics of human structure lie in the reproductive organs and secondary sexual characteristics. Although modern humans are only moderately sexually dimorphic in terms of size - on average females are about 84 per cent the size of males - there are quite marked secondary characteristics. Females have, in the dry terminology of the comparative anatomists, pendulous breasts, and more rounded and fleshy buttocks. Physiologically they generally have large fat deposits that they can draw upon during periods of nutritional stress. Men are generally more hirsute, although this varies in extent from population to population. It is perhaps also reassuring to know that the male human penis is large compared with the other apes, although the testicles are, compared with a chimpanzee at least, not especially large.

Most would agree that it is not so much anatomy as human behaviour and mental ability that are the real marks of the species. A whole suite of behavioural characteristics can of course be found in humans and a claim made for their uniqueness. Not surprisingly many of these have been selected as the feature that made humans the way they are. Man the tool-maker, man the hunter, woman the gatherer, Homo economicus, Homo hierarchicus, Homo politicus, and Homo loquens, these are all sobriquets that have been designed to epitomize human nature. They, and several others, are all traits that have been used by various people to identify the driving force that underlies human nature.

To many, tool-making has been the decisive factor. Even a cursory glance at the world shows that humans depend on technology to an extraordinary extent. This is not just the case for urban, industrial peoples, but for all societies. Houses, food, weapons, games, all involve technology to some extent even if they are relatively simple in construction. It is hardly surprising, especially when humans are compared with other animals, that it has been suggested that this is the key trigger to human success. The basis for tool-making stems partly from the manipulative skills of dexterous hands and partly from the ability of the brain to co-ordinate and create actions that have technological consequences. The practical applications of this are obvious, from the simplicity of the wheel to the power of the nuclear reactor. The significance, though, is broader than just the tools themselves. What technology does is to allow humans to modify and create the world they live in. Technology can make a species the active component in the construction of the environment, in contrast to the fate of most species, which are generally seen as the passive recipients of the world into which they are born. The best they can do is to react to their environment in ways that maximize their chances of survival. If humans want a predator-free environment then they build a house into which predators cannot come. If they want a warm environment, a fire can be lit. Technology is the means by which the human world is created.

However, that humans alone are tool-users and tool-makers is no longer acceptable. From the lowly termite to the hammer-using chimpanzee, other animals clearly use tools too. More important perhaps, other species cannot be viewed as simply accepting their environmental lot. They are not just passive receivers, but like humans are actively affecting their habitats, food resources and shelters. If this is the case, then technology alone cannot be the trigger that set humanity on its course. What other characteristics might be important? However ineptly, all humans use technology in their daily lives. The same cannot be said for another feature which has attracted the attention of scientists concerned with human origins. Hunting as a means of survival is not pursued extensively today. Rather, it is confined to a few groups of indigenous foragers such as the Kalahari San, the Eskimos, and the Australian aborigines. Ten thousand years ago it would have been far more useful. Today, although agriculturalists will supplement their diet with hunted food where possible, by and large most animal food comes from domestic sources. Hunting, though, whatever the particular context, may have been an important element in human evolution.

Eating meat is often thought to be the reserve of either very strong animals or very clever ones. For small, defenceless bipedal humans it was intelligence that was required. More than this, hunting seemed to require co-operation between individuals (and therefore social organization) and language to co-ordinate activities. Early humans were not just hunters, but social hunters, and so hunting implies more than just the eating of meat.

When Did We Become Human?, R. Foley

The whole point of this book is to show that being a human and being a hominid are by no means the same thing. To ask the question 'when did we become human?' virtually traps one into the answer 'it depends upon what you mean by human'. And it is all too clear that different people will use different criteria to establish whether or not a particular population is or is not over the line into humanity.

On the one hand, there is a tradition of seeing humans as having a unique ancestry stretching far back into the remote past. While the exact length of time has varied as our perception and ability to measure the geological past has changed, it has been believed that humans are sufficiently distinct from other animals that they must have evolved independently for a long period of time. This perspective has in turn been fuelled by the quite natural desire of palaeontologists to find earlier and earlier evidence of human fossils - after all, no one would win the Nobel prize for finding the second oldest fossil. On the other hand there are those who have been impressed by the similarity of humans and other animals, especially the primates and the anthropoid apes. Where the long perspective sees differences, the short perspective is overwhelmed by the similarity. The inference to be drawn is that humans diverged only relatively recently from other animals, and therefore have only a short independent evolutionary history.

Darwin himself was essentially a long-chronology person. Although he had no fossil record and only a limited grasp of the extent of geological time, none the less he believed that human antiquity was great. His reasons for believing this are interesting. To understand them it is necessary to remember that at that time the chronology recognized and accepted by most educated people in Europe was excessively short - that is, the chronology based on biblical history. This suggested that the world was only about 6000 years old.

For [Darwin] to be able to convince people that an amoeba could evolve into a fish and a fish into an amphibian and an amphibian into a mammal, enormous periods of time were necessary. This was especially the case when he dealt with humans. Humans were, to the Victorian mind, completely unlike any other species, and therefore the obvious implication was that they must have evolved over a very long period of time. This meant that to make his case convincing Darwin had to emphasize the slowness, the gradualness and the longevity of the evolutionary process that produced humans. In this way he was responsible for the idea that the search for humans and their origins should take one further and further back into the fossil record.

It must be stressed that it was not just the inherited beliefs of Darwin that produced such support and interest in the long perspective. Clearly the evidence of anthropology, psychology and neurobiology, all of which seemed to stress differences between humans and other animals, appeared to underpin the idea that humans were radically different. It was a combination of this radical difference in structure and behaviour, in association with the belief that evolution must be gradual, that essentially provided the key to a widely held view that humans had taken a long time to evolve and were distinct from 'ordinary' animals far back in the past. That such a view was also more acceptable philosophically and in terms of compatibility with religious beliefs may well also have been a significant factor in the predominance of the long perspective.

The earth is estimated to be about 4.6 billion years old. Life (usually defined as chemical systems capable of replicating themselves) did not appear until 3.5 billion years ago, and multicellular plants and animals evolved about 750 million years ago. Vertebrates made their appearance 450 million years ago, and from then they colonized both land and sea. The early dinosaurs emerged about 200 million years ago and dominated the earth until their rapid disappearance about 65 million years ago. Although the mammals have their origins at least 150 million years ago, they did not spread and diversify until after the extinction of the dinosaurs, at the same time as the radiation of the flowering plants or angiosperms. The primates, the biological order to which humans belong, only evolved during the last 60 million years or so. Mammals, primates and humans belong only to the most recent evolutionary events, and in the largest perspective are mere newcomers, arriving only in the last few minutes and seconds of the evolutionary clock.

Turning to humans, we belong to a species known as Homo sapiens. As it happens, there are no other living species within this genus, although there are several extinct ones. Homo sapiens is usually placed within a larger grouping (a superfamily) known as the Hominoidea or the hominoids. This includes all the apes as well as ourselves, and is distinct from the monkeys. This reflects our close evolutionary relationship with the apes, an affinity that was recognized by Darwin himself. To return to the question of determining the closest living relative, this amounts to finding out when and from whom human ancestors diverged. That it was an ape - one of the chimpanzee, gorilla, orang utan or gibbons - is widely accepted.

The major divide among the great apes and humans lies not between humans and the great apes, but between humans and the African apes on the one hand and the Asian great ape, the orang utan, on the other. Chimpanzees, gorillas and humans are more closely related to each other than any of them are to the orang utan. The great apes are not what is termed a true or natural branch or clade in evolution. This means that rather than humans diverging from the apes before the apes themselves had split, the gibbons and the orang utan at least had already undergone independent evolution. Humans are just another type of African ape.

This discovery has become well established. It suggests that in the context of hominoid evolution as a whole, humans are a relatively recent rather than a very ancient lineage. However, there is an even more startling possibility. When the same molecular techniques are applied to humans and to the African apes it is very difficult to determine the branching sequence among them. Intuitively what might be expected is that humans diverged first and then the chimps and gorillas from each other. Most of the evidence, however, indicates that it is not possible to discern the sequence precisely. Where it is possible to determine the order of divergence it seems that gorillas separated first from the common ancestor of chimpanzees and humans, and only subsequently did humans and chimpanzees diverge.

Humans, therefore, are specifically African apes and seem to be closely related to chimpanzees, far more closely in fact than anyone had previously thought possible. Only a relatively small number of genes separate these two species, despite the enormous number of morphological differences. The question of which is our closest relative has been answered. While the answer is the chimpanzee, which is not surprising, it is important to note that it is specifically the chimpanzee and not the great apes as a whole.

Evolutionary change can be thought of in two ways. One is, as Darwin himself thought, the change that results from animals adapting to their environments as a result of natural selection. The rates of change would therefore not be constant, but would vary according to the intensity of the competition or the amount of environmental change, or any number of other factors. However, evolutionary change, as it is actually measured, is nothing more than the rate of genetic change, whatever the cause of those changes might be. Certainly if the rate of genetic change is a haphazard response to environmental changes, with periods of very fast change and very slow change alternating, then genetic distance alone cannot give information about the date of evolutionary events. If, though, the rate of genetic change is constant then the quantity of genetic distance between any two or more species should indicate not just the sequence of events (their relative position in time) but also their exact timing (an absolute position in time).
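
The logic of that last step can be written as a single expression. The notation below is a generic sketch of the molecular-clock argument rather than a formula from Foley's text: d is the measured genetic distance between two species, r an assumed constant rate of genetic change per lineage, and t the time since they diverged.

\[ d = 2rt \quad\Rightarrow\quad t = \frac{d}{2r}, \qquad \text{and, even with } r \text{ unknown,} \quad \frac{t_1}{t_2} = \frac{d_1}{d_2}. \]

Calibrating r against one divergence whose date is known from fossils turns relative distances into absolute dates; without calibration, the ratios of distances still give the relative order of branching events.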

The first living great ape to diverge was the orang utan, and the molecular estimates for this event are about 12 million years ago. It has already been established that it is virtually impossible to distinguish clearly the exact sequence of events amongst the African apes and humans, but it appears that these three lineages separated somewhere between 6 million and 8 million years ago, towards the end of the Miocene. If chimpanzees and humans diverged later than gorillas, then this is more likely to have occurred at the younger end of this time range. Humans, then, have been in existence for about 6 million years. This might provide one possible answer to the question, for that is the length of time that there has been an independent evolution separate from chimpanzees.

Human fossils were found first in Europe and then in South-east Asia. After a number of false trails, it was really only in the 1960s that Africa, the original favourite, began to come into its own. The story of these discoveries has been told many times, and the historical details are not particularly important here. What is more significant is that fifty years of intensive research by very large numbers of people have yielded more than 3000 fossils that can be assigned to the lineage leading to humans. Most significant of all, the vast majority of these are earlier than human fossils anywhere else in the world. The limestone caves of the Transvaal region of southern Africa and the Rift Valley in eastern Africa are the two regions from which the fossils come. While these localities may not be the sites of the actual origins of humankind, they provide an approximation, and a confirmation of an African origin.... Compared with some of the more dramatic finds from other sites it is not very exciting, but its significance comes from the fact that it is probably just over 5 million years old.

After 5 million years ago the fossil evidence for humans picks up considerably. Sites in Ethiopia dated to over 4 million years old, and a site in central Kenya, Tabarin, at just under 5 million years, are all supportive of the existence by then of a human lineage. From 3 million years ago the fossils are both more common and more complete. Where Lothagam and Tabarin provide only fragmentary evidence for human characteristics through the dentition, the material from Hadar in Ethiopia is far more convincing. In particular, a single specimen, known colloquially as Lucy and more formally as AL-288, found by Don Johanson in 1974, consists not just of dental and cranial fragments, but a nearly complete skeleton. Head, teeth, arms, back, hips and legs are all present. What this specimen shows is that by 3 million years ago there was in existence an animal that was for all intents and purposes bipedal. Upright walking is perhaps the most distinctive characteristic of humans, and of Lucy, with her elongated lower limbs, flared and rounded pelvis, and distinctive angle to the head of the femur. While there are clear differences when compared with a fully modern human, such as relatively long arms and curved phalanges, there is little doubt that this is part of the human lineage. By 3 million years ago, hominids were more bipedal than any other ape, clear evidence perhaps that by then something had become human, if bipedalism can be accepted as a key characteristic.

It can be argued, perhaps, that the answer to the question 'when did we become human?' is as follows: about 6 or 7 million years ago on the basis of the molecular evidence for when the human lineage parted company with the other African apes, or about 5 million years for sure, on the basis of the first palaeontological evidence, and certainly by about 4 million years ago, when clear evidence exists for that unique human feature, bipedalism. Does this add up to humanity?

The conclusion that must be drawn is that far from being a simple linear progression, the line leading to humans is itself made up of a number of branching events. Between 3 and 1 million years ago there were at least two distinct lines of human evolution - one leading to some megadontic specialists known as the robust australopithecines, and another leading to larger-brained forms, our own genus, Homo. Even later than 1 million years ago there may have been separate lines of development, one in Europe and Africa, one in Asia; and the neanderthals, a specifically European group, may be a separate line of development compared with anatomically modern humans. Given the probability that there have been a number of branching events, leading to distinct species in human evolution, not all of which could have led to modern humans, then the focus on the earliest appearance of the lineage as a whole may not be the most appropriate one. Perhaps it is the appearance of the genus Homo around 2 million years ago that is more critical, or even the appearance of our own species, Homo sapiens, that is the crucial event.

While more bipedal than any ape, Australopithecus afarensis was not a replica of the modern species. What is more, it is not until after about 1.7 million years ago, with the appearance of Homo erectus, that a more modern type of bipedalism can be securely established, and even then there are differences. Neanderthals, forms of ancient human that are chronologically and morphologically closest to modern humans, still exhibited some significant differences in their locomotor anatomy when compared with ourselves - their joints were robust, their limbs short, and their pelvis much wider and less deep from front to back, with the centre of gravity more posteriorly located.... Anatomy would seem to suggest, therefore, that while the basic upright body plan of humans was established as long ago as 3 or more million years, other anatomical systems which are strongly associated with being human do not occur until much later. Certainly on the basis of brain size, the term human should be confined to the genus Homo, if not to Homo sapiens itself.

The first technology, known as the Oldowan after the site of Olduvai in Tanzania where it was first described, existed for over a million years. Its successor, the Acheulean, characterized by large bifacially flaked axes, remained stable for about a million years as well. Even the later industries associated with the neanderthals, known as the Mousterian and consisting of systematically flaked cores which had been prepared prior to flaking, continued largely unchanged for well over 100,000 years. In contrast, the stone technology associated with modern humans was never in existence for more than 5,000 to 10,000 years, and generally was far more ephemeral. Furthermore, the rate of technological change among modern humans, continuing through to the present date, is one of accelerating rapidity.

There is a progressive trend towards modern humans, but none the less, even the most chronologically adjacent, the neanderthals, are significantly different. The inevitable conclusion is that the event of diverging from the African apes, the adoption of upright walking, even the establishment of technology, does not in itself provide evidence for humans in the sense understood today - flexible, slowly maturing, lightly built and highly intelligent creatures. Even the overall pattern of the fossil record, showing diverse trends, seems to undermine the notion of an ancient appearance of humans. The conclusion that has to be drawn is that becoming human and being a human being are two different things entirely.

From the chronological perspective that is paramount here, there are over 5 million years connecting apes and modern humans, and the connections are not straightforward. As far as can be told from the fossil evidence, everything along that path (with the possible exception of the oldest species, Australopithecus ramidus) is bipedal, but not in a form that compares exactly to the modern human's own form of locomotion. Brain size varies significantly, as do technology, behavioural flexibility, and growth patterns. In some cases the links lie more strongly with the apes, in others with modern humans. Given that evolution is an essentially continuous process of modification, some trends can be observed, but these tend to be relatively short-lived and confined to particular groups and lineages. There are few trends that flow smoothly from the first separation from the African apes to modern humans. Where among these trends, though, should the true origins of being human, of humans themselves, be placed?

Between the first humans at 5 million years ago and the first indisputably anatomically modern humans at about 100,000 years ago is a long period of time in which to bury a problem. More significantly, it is almost certain that most of the known types of fossil human are not directly on the path leading towards humanity in the strict evolutionary sense. Most of them may well be side branches and dead ends, leaving no trace in the modern world. To what extent, therefore, did they contribute to the process of becoming human?

It is not a question of apes and angels, or even apes and humans, but of distinct levels of differentiation. Humans and apes share a level of evolutionary relationship that unites them as members of the Hominoidea - the hominoids. Humans are hominoids in the same way that chimpanzees and gorillas are. This linkage is at the superfamily level. The next level down is the family. It is generally accepted that all the species that lie on the path that separates humans from the apes belong to the same family - the Hominidae.

We became hominids - distinct from the other apes - when an ancestor diverged from the chimpanzees and gorillas, characterized most probably by an upright posture. This was some time before 5 million years ago. We became human, though, when we achieved the distinctive patterns of anatomical structure and behaviour that can still be found today. This was some time between 150,000 and 100,000 years ago, as it was only then that our own species - Homo sapiens - evolved. What lies between, though, are a multitude of populations, groups, species that are distinctly hominid - neither apes nor angels. Some of them may be ancestors, most were probably not. However, their importance lies not in whether or not they were ancestors, nor in whether they were truly human, but in the fact that they were 'hominid' - that is, our closest evolutionary relatives. As such they provide the best clues to why and how there evolved a distinct type of hominid - humans. They were the 'humans' before humanity - although the conclusion of this chapter must be that they were not really human. It is these hominids, their distinctive characteristics, their patterns of evolution, their adaptations, behaviours and capacities, that provide the central evidence of the process of change from hominoid to human. The fact that they are all extinct is perhaps a disadvantage, but the task that lies ahead is to unravel the story of these hominids and how they indicate both what lies in the evolutionary space between ourselves and the other living species of animal, and also the reasons why one of them became human.

Footprints On The Sands of Time, C. Stringer, R. McKie

We triumphed in the end for a variety of reasons: social, cognitive, behavioral, and technological. In this last category, implements and creative techniques helped take our ancestors to nooks and crannies that would have been barred to them if they had begun their dispersal unaided. The remains of most of these tools—made of hides and wood—have long rotted away, of course. However, to judge from residual stone implements, and from the behavior of hunter-gatherers today, we can deduce they probably had warm clothing sewed together using carved bone and antler needles, water containers made from hides, boats and rafts made from fallen tree trunks or bamboo tied together, sophisticated foraging techniques, and the use of fire and smoke to burn clearings and trap prey. In these ways our ancestors were able to open up previously uninhabited lands, and conquer ones that were already peopled.

If there were primitive modern humans in Africa 100,000 years ago, why did they take so long to reach Europe, Asia, Australia, and the Americas? For instance, in Asia, there is little evidence of any Homo sapiens presence (apart from in the Levant) until about 40,000 years ago. We catch glimpses of their presence at contemporary sites like K'sar Akil in Lebanon and Darra-i-Kur in Afghanistan; and then in Sri Lanka about 30,000 years ago; in China, about 25,000 years ago; and in Japan about 17,000 years before present. So what happened in between? Where were our ancestors lurking and what were they doing? Puzzlingly, there are few good answers to these questions. Indeed, we have greater knowledge about more distant periods of our prehistory than this crucial, recent era: paleontologists studying this period would kill for a dated skeleton as well preserved as the 1.5 million-year-old remains of the Nariokotome boy. All we can say is that in Africa, the archeological record tells us that people were certainly living there between 40,000 and 80,000 years ago, although the associated fossils are disputed and scrappy. To find more information about this vital prehistoric time we therefore have to seek clues elsewhere, not from bones, but from genes, which we know can be every bit as informative as fossils.

Homo sapiens went through a near fatal numerical crash or isolation between 50,000 and 150,000 years ago and then bounced back at different times, rates, and places. Our African recovery seems to have begun first, perhaps 60,000 years ago, followed by Asia 50,000 years ago, and finally by peripheral areas such as Europe and Australia, around 40,000 years ago.

Time and chance—and not predestination to greatness—played a pivotal role in our emergence as global conquerors. It seems obvious once again that that mighty equalizer—climatic change, encountered so many times before in this book—played a dramatic role. About 150,000 years ago, a 60,000-year-old cold "snap" was peaking. Ice caps sprawled across the poles, bringing cooler, drier conditions to the rest of the planet. The Sahara Desert had expanded, virtually cutting off North Africa from the rest of the continent, while the Kalahari Desert swelled across the south, forming a second, almost impenetrable band. At the same time, central Africa's dense tropical forests shrank into separate western and eastern refuges, surrounded by grasslands that would have provided homes for humans. It may have been south of the then impenetrable Sahara that our species was forged. Then about 130,000 years ago, the climate switched back briefly into a warmer, moister mode. The deserts began to retreat and the forests to expand again, a situation that probably led to prototype modern humans' first tentative steps out of Africa into the Middle East 120,000 years ago and further into Asia by 80,000 years ago.

These intercontinental intruders were the first unequivocal representatives of Homo sapiens and they must have evolved in Africa's hinterland between 130,000 and 200,000 years ago—during the long spell of global cooling—from archaic human predecessors. These hominid newcomers spread and by 100,000 years ago had established themselves in terrain stretching from southern Africa to present-day Ethiopia and the Levant. But where did they come from originally? The remains of these people, and evidence of their behavior, are tantalizingly sparse, although we suspect that once again East Africa may turn out to be the key to our origins. However, we desperately need more evidence from across the continent to confirm this idea. What we do know is that this transition turned people with rather broad, long, and low braincases, with quite strong browridges, into individuals with higher, shorter, and narrower crania with smoother foreheads.

The reason for this physical metamorphosis is also a puzzle, though the reduction of skeletal strength gives an important clue—that our ancestors were developing a more energy-efficient lifestyle, with brain predominating over brawn for the first time in human evolution.... All we can say is that isolation and stress in those cold and dry days, about 150,000 years ago, were probably the triggers for this fundamental change in humanity.

Ironically all this fragmentation and environmental pressure may have been the stimulus for the final crucial changes that transformed these hominid bit players into masters of the planet. Forged in this bleak crucible, evolutionary pressures triggered alterations to our brains and social behavior and we were sent "ticking like a fat, gold watch" towards zoological stardom. Some scientists say they can already detect signs of innovations associated with these changes at Klasies, Border Cave, and Katanda around 100,000 years ago. There they have found remains of the use of that Upper Paleolithic perennial, red ocher, along with the signs of complex composite tools of wood and stone. However, other researchers believe these innovations appeared later, about 50,000 years ago, nearer the time of our big rebound from near extinction.

Now in the past, it has been assumed that these first African migrants must have been black, like so many people on the continent today. Only later did some members evolve lighter, white and brown skins, it was thought. However, Jonathan Kingdon has attempted a detailed reconstruction of early human dispersals in his book Self-Made Man and His Undoing, and concludes that our original skin color was probably a medium brown. According to him it was only later, as early modern humans moved along Asia's southern seaboard, settling along the coast as they progressed, that their appearance changed. They became dependent for food on a life that was governed by the sea. Selection would therefore have favored those who could stay out in the sun, those of the darkest color, when the tides required it. Hence black skin evolved for the first time, and the genes responsible began to spread across southern Asia, with some ending back in mankind's African homeland. "On present evidence, modern humans are likely to have begun with all the built-in advantages of a versatile light brown skin and only developed later the extremes of densely shielded (black) or totally depigmented skins," says Kingdon.

But to what degree did these ancient members of Homo sapiens look like people alive today? It is, after all, a specific prediction of the Out of Africa theory that racial characteristics are new and relatively unimportant facets of our species' anatomy. So can we detect evidence in skeletons to support this idea? Intriguingly enough, when we examine some of the oldest Homo sapiens relics, like those 100,000-year-old fossils from Qafzeh and Skhul, we find they do not show the kinds of differentiation that distinguish races today. Their skeletons are modern, as is the overall shape of their braincases, but they have unusually short, broad faces, with short, wide noses. Nor does the picture get any clearer when we move on to the Cro-Magnons, the presumed ancestors of modern Europeans. Some were more like present-day Australians or Africans, judged by objective anatomical categorizations, as is the case with some early modern skulls from the Upper Cave at Zhoukoudian in China. It is a confused picture and suggests that racial differences were still developing even relatively recently, and should be viewed as a very new part of the human condition. It is an important point, for it shows that humanity's modern African origin does not imply derivation from people like current Africans, because these populations must have also changed through the impact of evolution over the past 100,000 years.

Firstly, we moved from Africa to Asia about 100,000 years ago, and spread eastward until we reached New Guinea and Australia by about 50,000 years ago. A little later, having conquered the East, mankind also dispersed westward from Asia and drifted into Europe, eventually extinguishing Neanderthals there. Finally, at some point, Asian people made their way over Beringia and speedily down through the Americas, their progress unencumbered by the presence of other hominid competitors. By 30,000 years ago, modern humans had achieved an estimated breeding population of at least 300,000 individuals. We were then the only human species left on earth, probably the first time the bush of human evolution had been pruned to a single branch for more than a million years. The others had withered in the face of repeated cold shocks between 75,000 and 30,000 years ago. These plunged the oceans and then continents into a series of mini-Ice Ages, each lasting one or two millennia. Humanity's ailing nonsapiens branches must have suffered a slow attrition of numbers in the face of such climatic instability and as a result of more adaptable Homo sapiens' faster growing populations. The descendants of Java's Ngandong and China's Dali people may have gone under first, while those remarkable survivors, the Neanderthals, clung on in shrinking pockets, such as Zafarraya, until 30,000 years ago.

The first clear evidence of human occupation in the Americas comes in the form of Clovis spear points, the earliest of which have been reliably dated as being about 12,000 years old. These stone tools have been found across the United States (but not in Canada, which was mostly smothered in glaciers at the time) and are named after the town of Clovis, New Mexico, near the Texas border, where they were first uncovered. The Clovis people were probably some of the finest human hunters thrown up during mankind's evolution and appear to have been in incessant movement. They camped along rivers, beside streams, close to waterholes, and hunted elephant-like mammoths and mastodons, bison, horses, and enormous giant ground sloths, in competition with lions, giant wolves, and saber-toothed cats. They butchered their prey where it fell, and used lightweight tools made of fine stone points that are described as being fluted because they have a groove running their lengths. This channel was carved either to help bind the implement to a spear, which would have been thrown by hand, or to a shaft, which would have been propelled by a throwing stick or a bow. Whichever was the case, it proved to be an extremely effective technology.... The Clovis people were mighty hunters and highly effective colonizers, as can be judged from the fact that by 11,000 years ago, humans had spread to both coasts of America, and from the area we now call the Midwest to the tip of Patagonia in South America.

In fact, it is genetic evidence that provides the greatest challenge in our attempts to pinpoint the date of our first steps on to a continent that now so dominates life on earth. While the Clovis tools and other remains indicate dates around only 15,000 years ago for humanity's entry into the Americas, some archeologists suggest we may have arrived there as long ago as 35,000 years, an idea given support from genetic data gathered by Cavalli-Sforza and his team. His analyses of Native American blood and proteins indicate these are divergent enough to suggest the continent was settled thirty millennia ago, and in at least three different waves of immigration.

So was America first colonized by a few waves of wanderers who left Asia 15,000 years ago, or are their origins at least twice that age? It is still a baffling question. Nor do the anthropological headaches get any easier when we turn to Australia, one of the most mysterious of all human homelands. Australia is a vast stretch of land, still largely devoid of people. It has tropical forests in the north, an arid heartland, and cool woods. Until recently, it even had ice sheets in the south. In the past, it was filled with rich hunting territories that sported many species of large birds and strange marsupials: ten-foot-tall kangaroos; the diprotodon, a sort of browsing, rhinoceros-sized wombat; a lionlike marsupial carnivore; giant koala bears; deer-like marsupials; and a giant monitor lizard, the size of a horse. This unique fauna had started evolving in isolation from the rest of the world when Australia separated from South America and Antarctica more than forty-five million years ago. Then Homo sapiens appeared on the scene, our arrival coinciding more or less with the abrupt extinction of all this exotic fauna—though through which stage door, and at what moment, humankind chose to make its dramatic entrance, we can only guess. Certainly, their route could not have been an easy one. The islands of southeast Asia were mainly covered with thick jungles and even at the lowest sea levels of the Ice Age, with Tasmania, Australia, and New Guinea lumped together in a single continent, Asia was a considerable distance away. Men and women would have had to navigate many different journeys between islands separated—in some cases—by forty miles or more of open sea.

Why did Homo sapiens come to places like America and Australia in the first place? Can we envisage them peering across the snowy landscapes of Beringia or the open seas of southeast Asia, and wondering what lay beyond? Most probably they did not. The pressure to move was more likely to have been motivated by population growth. New generations needed new foraging territories because the land simply could not support high densities of hunter-gatherers. As Kingdon points out: "The movement or expansion of people over considerable distances is often imagined in individualistic terms, as if prehistoric groups were seized by the urge to explore or migrate. Such movements did not depend on individual wills; it was external events that imposed constant change and flux on human existence. A succession of bad years, incursions by aggressive neighbours, overpopulation, overhunting, the invention of a new and superior technique, fleeing from disease or fulfilling the prophecies of a shaman; all these and more could have triggered movement on and into the unknown."

The first people reaching the Americas were therefore unaware of the momentous voyage they had made. They would most likely have been following migrating herds of reindeer across Beringia. On the other hand, the first humans to reach New Guinea or Australia would have understood very rapidly that they were somewhere terrifyingly new, and that they probably could never return to their homeland. While the first Americans would have seen familiar plants and animals in Alaska and Canada, the first Australians really did arrive in a New World filled with strange creatures. Nor should we assume their journey to the continent was a simple one of island hopping at times of low sea levels. Such crossings may have taken place when seas were high. Rising waters would have shrunk habitats, increasing population pressures. To escape, groups probably set off for land they could see, but found themselves swept away by unforgiving shifts in tides and winds. Many of these ancient boat people perished. Nevertheless, some survived, to find themselves washed up on the shores of a strange land—on which they founded a whole new race of people.

The progeny of the people who found Australia 50,000 years ago, and the descendants of the tribes who poured down the Americas 12,000 years ago, as well as the heirs to all those other settlers of Europe, Africa, and Asia, share a common biological bond. They are all the children of those Africans who emerged from their homeland only a few ticks ago on our evolutionary clock. They may have diverged geographically since then, and developed superficial variations, but underneath our species has scarcely differentiated at all. We may look exotic or odd to our neighbors in other countries, but we are all startlingly similar when judged by our genes. Yet the issue of racial differences continues to dominate world affairs. Serbs fight Bosnians, Tutsi slaughter their Burundi neighbors, and blacks and whites keep an uneasy peace in downtown America. This divisive schism has been the source of untold misery for thousands of years. Yet our new evolutionary perspective offers us an opportunity to reexamine its roots and its implications.

The Sorcerer, C. Stringer, R. McKie

Certainly, something very special was happening to human society around this time [Paleolithic]. Before then, Homo sapiens was simply marking time culturally. For millennia upon millennia, we had been churning out the same forms of stone utensils, for example. But about 40,000 years ago, a perceptible shift in our handiwork took place. Throughout the Old World, tool kits leapt in sophistication with the appearance of Upper Paleolithic style implements. Signs of the use of ropes, bone spear points, fishhooks, and harpoons emerge, along with the sudden manifestations of sculptures, paintings, and musical instruments. As John Pfeiffer states in The Creative Explosion: "Art came, with a bang as far as the archaeological record is concerned." We also find evidence of the first long-distance exchange of stones and beads. Objects made of mammal bones and ivory, antlers, marine and freshwater shells, fossil coral, limestone, schist, steatite, jet, lignite, hematite, and pyrite were manufactured. Materials were chosen with extraordinary care: some originated hundreds of miles from their point of manufacture.

Did we bring the seeds of this mental revolution with us when we began our African Exodus, though its effects were so subtle they took another 50,000 years to accumulate before snowballing into a cultural and technological avalanche that now threatens to engulf Homo sapiens? Or did that final change occur later, and was it therefore more profound, and much speedier in its effects?

Many archeologists, linguists, anthropologists, and researchers in other fields have little problem over their preferred response. Only the former makes sense to them, though this acceptance has implications bristling with intellectual difficulties. Their reasons for disavowing the latter scenario are simple. If we accept that neurological or behavioral changes were responsible for the abrupt flowering of human culture only about 40,000 years ago, then we have to explain how this moment of transformation occurred more or less simultaneously across Africa, Asia, and Europe. As we have seen, the DNA studies by Harpending, Rogers, and others demonstrate that a whole variety of people—Turks, Sardinians, Australians, Japanese, Native Americans, and others—all went through sudden and rapid eruptions in populations at the same time that our mystical Cro-Magnon creative explosion was occurring.

Clear evidence for rises in human numbers and in artistic and technological sophistication has only been found in Europe, but then—paleontologically speaking—this is the world's most assiduously studied continent. Elsewhere, research has been patchy and inconclusive until recent tantalizing evidence first emerged to show other parts of the Old World were also going through the rigors of artistic and cultural upheaval; a social tumult which matches those peaks of mitochondrial DNA mutations detected by Harpending and the rest. We now know that Homo sapiens sailed boats from southeast Asia to Australia at least 50,000 years ago and although it is unlikely these hapless settlers were actually aiming for the land in question, the fact that they were nautically mobile in the first place indicates they possessed considerable sophistication. Then there was the use of red ocher, the practice of cremation, and the creation of paintings, engravings, and necklaces—all discovered in Australian sites that are 30,000 years old or more. Intellectual change was in the air, and was not restricted to one small part of one continent. It was blossoming across the populated world.

But how can we rationalize the fact that artistic and symbolic ferment was bubbling from Mungo to Lascaux? If we assume the requisite cognitive transformations were triggered after Homo sapiens embarked on its African Exodus, we must explain how it erupted in tribes in different parts of the world. If the Great Change occurred as late as about 40,000 years ago, then it must have developed, virtually simultaneously, in peoples who were living many thousands of miles apart. Either that or those new genes or behaviors appeared in one place and then spread like social wildfire across half the planet, a notion that can only be sustained by assuming our forebears were indulging in a lather of gene or cultural exchange. This has led many scientists to see it as improbable that this decisive event occurred so late in our prehistory, when, they say, all sorts of convoluted, tortured explanations must be dreamed up to account for its transmission.

Instead, these scientists argue that mental mutations, which had evolved much earlier, more than 100,000 years ago, when Homo sapiens was still confined to one small part of its African homeland, were responsible for later artistic and technological development. This decisive modification was taken round the world with those humans who embarked on our great exodus. (It also remained with those who stayed in Africa, of course.) Such an explanation is unaffected by a need to have mysterious flows of DNA or cultural mutations striking most of the world's peoples simultaneously. On the other hand, if neuronal remodeling was part of our African heritage, why did it take so long to manifest itself? Why, when we look at those populations at Amud, Skhul, Qafzeh, Kebara, and the other Levant sites, can we see precious little difference between the stone tools made by Homo sapiens and our Neanderthal cousins? If we were already bequeathed with our full, final neurological endowment, why did it not manifest itself then? We had begun our exodus, and by inference were kitted out with our complete intellectual baggage. We were, to all intents, the same sorts of people that we now encounter every day at home or work. So why did our sophistication not show then? Why the delay before it became apparent?

Now our evolution, as we have seen, can be traced back to those swings in climate that swept Africa over the past few million years, and in these variations we can track the founding of human intelligence. "The evolution of anatomical adaptations in the hominids could not have kept pace with these abrupt climate changes, which would have occurred within the lifetime of single individuals," says neurophysiologist William Calvin, of the University of Washington School of Medicine. "Still, these environmental fluctuations could have promoted the incremental accumulation of mental abilities that conferred greater behavioural flexibility."

In other words, our bodies could not change speedily enough, so our brains took the strain instead. We developed a plastic, adaptive approach to the world. The result was a doubling in the size of our crania, a process which began around two million years ago, when Homo habilis and then Homo erectus people started to gather round the lakes of eastern Africa to make their tools and plan their scavenging and foraging (and possibly hunting). Their brains had, roughly, the capacity of a pint pot. Then, slowly, we began to gain gray matter, at a rate of about two tablespoons' worth every 100,000 years. By the time this cerebral topping-up had finished, the human cortex had more than doubled in volume. "The two-millimeter-thick cerebral cortex is the part of the brain most involved with making novel associations," adds Calvin. "Ours is extensively wrinkled, but were it flattened, it would occupy four sheets of typing paper." In fact, this outer layer of gray matter accounts for about 80 percent of our total brain volume. Compared with a human's, a chimpanzee's cortex would fit on only one sheet of paper, a monkey's on a postcard, a rat's on a stamp.
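
As a rough consistency check on these figures (the volumes used here are assumptions for illustration, not values given in the text), take a pint pot as roughly 570 ml and a tablespoon as roughly 15 ml:

\[ 2 \times 15\ \text{ml per } 100{,}000\ \text{years} \times \frac{2{,}000{,}000}{100{,}000} = 600\ \text{ml}, \]

which, added to a starting capacity of about 570 ml, does indeed amount to roughly a doubling of cranial volume over the two million years since Homo habilis.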

According to Cosmides and Tooby: "Knowing that the circuitry of the human mind was designed by the evolutionary process tells us something centrally illuminating: that, aside from those properties acquired by chance or imposed by engineering constraints, the mind consists of a set of information-processing adaptations, designed to solve those problems that our hunter-gatherer ancestors faced generation after generation. The better we understand the evolutionary process, adaptive problems and ancestral life, the more intelligently we can explore and map the intricacies of the human mind." Such a strategy forms the core of evolutionary psychology, which tries to examine our conduct from the perspective of a hunter-gatherer with five million years of hard hominid evolution to its credit, and whose occasionally baffling actions and reactions can be best understood in this light. It is an approach that accepts a certain programmed response from the human mind, but it does not maintain an individual is necessarily a prisoner of his or her genetic heritage.

What evolutionary psychology teaches us is that our ancestors must have had to evolve a whole series of mental mechanisms—those "sets of information-processing adaptations" mentioned by Cosmides and Tooby—that were used to solve the problems of everyday Stone Age life: food selection, mate selection, communications, toolmaking, dealing with wild animals, and many more. "Think of the mind as a great Swiss Army knife," says Cosmides. "We had different mental blades for solving all sorts of problems." More than any other species, including our reconstructions of other hominids, Homo sapiens possesses a wide array of different mental tools that we use for dealing with the outside world. And because we have such a variety, we can react more flexibly and deal with issues which we would never have encountered in our evolution.

For Darwin, language was "an instinctive tendency to acquire an art," and so the subject was treated in Victorian days—as an art. Today, however, its study is very much the province of the scientist, from the neurophysician to the computer expert. To them, or at least most of them, language is a hereditary endowment, as Steve Pinker stresses. "Language . . . has been found in every one of the thousands of societies that have been documented by explorers and anthropologists," he says. "Within a society, all neurologically normal people command complex language, regardless of schooling." We are, in fact, a startlingly talkative species, so much so that the British phonetician D. B. Fry has remarked, tongue in cheek (but then he is a phonetician) that Homo loquens would be a far more appropriate name for our species than Homo sapiens. Certainly, our urge to communicate is generally more in evidence than our wisdom. It is estimated that in a normal day, a person may speak as many as 40,000 words, the equivalent of four to six hours of continuous speech.

The crucial point is that language is ubiquitous among humans, a facility that is acquired just from exposure to the speech of the people with whom children interact. It is a facet of the mind quite separate from general intelligence, for language can be handicapped even though intelligence is left intact—and vice versa. More importantly, it is a means of communication bursting with extraordinary evolutionary implications. "In an intelligent social species such as ours, there is an obvious adaptive benefit in being able to convey an infinite number of precisely structured thoughts merely by modulating exhaled breath," says Pinker. "Anyone can benefit from the strokes of genius, lucky accidents and trial-and-error wisdom accumulated by anyone else, present or past." In other words, it was the genetic capacity to speak a complex language that raised modern humans from the millennia-long doldrums we were sharing with the Neanderthals until 40,000 years ago. It gave us the power to take over the world.

Jane Goodall: "Of all the characteristics that differentiate humans from their non-human cousins, the ability to communicate through the use of a sophisticated spoken language is, I believe, the most significant. Once our ancestors had acquired that powerful tool, they could discuss events that had happened in the past and make complex contingency plans for both the near and the distant future. They could teach their children by explaining things without the need to demonstrate. Words give substance to thoughts and ideas that, unexpressed, might have remained, for ever, vague and without practical value. The interaction of mind with mind broadened ideas and sharpened concepts. Sometimes, when watching the chimpanzees, I have felt that, because they have no human-like language, they are trapped within themselves. Their calls, postures and gestures, together, add up to a rich repertoire, a complex and sophisticated method of communication. But it is non-verbal. How much more they might accomplish if only they could talk to each other."

Dunbar points out that if you look at primate brain dimensions, you find that they correlate neatly with group size. Gibbons have fairly small crania and live in family pairs of four to six, for example. Their neocortexes (the most recently evolved parts of the cortex) contrast with those of bigger-brained chimps who live in communities of fifty to eighty. The relationship is unusually clear-cut, he maintains. And when you plug the human brain size into this social thermometer, you produce a predicted group size of 148—a figure that is the optimal maximum for social assemblies of humans. Amazingly, says Dunbar, this magic number turns up in all sorts of human societies. Many hunter-gatherers alive today have an average core group of around 150, and so did Neolithic villages, such as those uncovered in Mesopotamia.... In short, the number looks like the fundamental unit of human social cohesiveness. Above this level, peer pressure can no longer control individuals and the group breaks apart. By using language to create this largest of all primate or hominid assemblies, Homo sapiens was able to generate a healthier, more effective culture.
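
Dunbar's figure of 148 comes from a log-log regression of mean group size N on neocortex ratio CR (neocortex volume divided by the volume of the rest of the brain) across primate species, with the human value then read off the fitted line. The coefficients below are only roughly those Dunbar reported and should be treated as illustrative rather than definitive:

\[ \log_{10} N \approx 0.09 + 3.39\,\log_{10}(CR), \qquad CR_{\text{human}} \approx 4.1 \;\Rightarrow\; N \approx 10^{2.17} \approx 148. \]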

It is an intriguing idea, though not everyone agrees that speech— either as a primary or secondary mental product—is necessarily the final deciding factor in humanity's success story. The gift of the gab may have taken us far down our present evolutionary track, but it was not necessarily the final means of our current "successful" status. For this reason, some scientists champion the cause of different "brain blades," such as memory. The storage of extra neural information would have been of immense benefit, they say. There would have been no point in having language if we did not have the power to retain the complicated knowledge that we wished to pass on, after all. With good memories we would have been able to sustain complex social relations. We could recall where we saw good hunting grounds the previous year and where we could find food supplies and vegetation. Tied to this notion is the issue of longevity. If humans lived, on average, to an older age, we would have been able to pass on more cumulative wisdom stored in our memory banks. There would have been more elders to transmit the benefit of their knowledge: what had been done in their youths during serious drought, for example. In other words, it was the rise of the human grandparent that gave our species its precious boost.

Alternatively, it may have been a deeper underlying mental structure that went through a final crucial change, Calvin suggests: "To account for the breadth of our abilities, we need to look at improvements in common-core facilities. Environments that give the musically gifted evolutionary advantage over the tone deaf are difficult to imagine, but there are multifunctional brain mechanisms whose improvement for one critical function might incidentally aid other functions. We humans certainly have a passion for stringing things together: words into sentences, notes into melodies, steps into dances, narratives into games with rules of procedure. Might stringing things together be a core facility of the brain, one commonly useful to languages, storytelling, planning, games and ethics? If so, natural selection for any of these talents might augment their shared neural machinery, so that an improved knack of syntactical sentences would automatically expand planning abilities too."

Chimps and humans diverged five million years ago and our genomes only differ by about 2 percent, according to DNA hybridization studies. However, the separation between Neanderthals and modern humans probably occurred only about 200,000 years ago which suggests that we may have differed by less than 0.1 percent of our genomes. And that slender gap may account for our success and their failure. Only a handful of genes must be involved in our jump to stardom, it would seem. In which case, says Steven Mithen, of Reading University, it may not be a question of which blade of our Swiss Army knife was honed to final perfection, but more the way we integrated their use most effectively. There was no big gap in individual mental aptitudes, just a different way of putting them together.
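
The "less than 0.1 percent" figure follows from the same molecular-clock proportionality invoked earlier: if some five million years of separation produced a difference of about 2 percent, then, assuming divergence accumulates at a roughly constant rate, 200,000 years of separation gives

\[ 2\% \times \frac{200{,}000}{5{,}000{,}000} = 0.08\% < 0.1\%. \]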

The roots of art lay with a need to create initiation ceremonies, to hold rituals, to settle territorial squabbles, and to demarcate roles in society, such as hunting. It was all part of the power we developed to free ourselves from the strictures of individual intellectual realms. If there was one thing that identified Homo sapiens at this time, it was mental osmosis in which ideas would creep throughout the mind to produce the fantastic figures of the Trois Frères Sorcerer; elaborate tools such as boomerangs, harpoons, and ropes; mineral quarries; ornate funeral rites; and many other wonders. As Randall White states, in Natural History magazine: "Cro-Magnons used two- and three-dimensional forms of representation systematically—to render concepts tangible, to communicate, and to explore social relations and technological possibilities. This powerfully enhanced their evolutionary fitness."

Around this time we also see the rise of what must have been some form of organized religion, and belief in an afterlife, as can be gleaned from the 28,000-year-old site of Sungir, near the city of Vladimir, 100 miles east of Moscow. There archeologists have found three bodies (a man and two children) wrapped in painstakingly prepared ivory beads, arranged in dozens of strands. Each corpse was sheathed in thousands of such ornaments, and given that each must have taken about an hour to make, funeral preparations would have used up thousands of hours of work—per body. These rites of mortification betray an exquisite vault in imagination and motivation compared with the simple cave burials with a stag's head at Qafzeh 100,000 years ago, or with the deer's jaw that commemorated the death of the Neanderthal child at Amud 60,000 years ago.

This vision, liberated perhaps by language and softened by a breaking down of intellectual boundaries within the human mind, was therefore the final flourish that took us from the cave of Qafzeh to the artwork of Lascaux to the space race, the atom smasher, and the gene probe. At least that is the theory, which sounds convincing, but does not explain why Homo sapiens, forged, most probably, in the ancient ancestral homeland of sub-Saharan Africa, developed the requisite loosening of neurones that took us to the moon. We may never find out the exact nature of these forces though we have much to thank, and to curse, them for.

The decisive point is that we are—to all intents and purposes—the same creatures who only embarked relatively recently on their African Exodus. And that has been a considerable influence on the way we act today. In The Stone Age Present, a study of how our hunter-gatherer past still influences our conduct today, William Allman writes: "The rich tapestry of behaviours that make up our modern everyday lives—our choice of a mate, our ability to live together in large groups, our love of music and concept of beauty, our anger in reacting to infidelity, our occasional hostility toward people who look different from ourselves . . . all have deep-seated evolutionary roots that stretch back to the times when our ancient ancestors were struggling to meet the challenges of the world around them."

The First True Humans, H.J. Deacon, J. Deacon

Opinions differ on the number of species that can be recognised in the genus Homo from the fossil evidence. Some authorities 'lump' all the available fossil finds into as few as three species while others 'split' the finds among as many as nine species. The divergent views on the number of valid species show just how continuous the process is of assessing the significance of new finds and re-evaluating older finds in the light of advances in ideas of how evolution works. However, all would accept that members of the genus Homo are distinct from australopithecines and paranthropines in their anatomy and in their ability to make stone artefacts. Large brain size and small teeth, set in a broad short tooth row, are among the obvious anatomical characters that distinguish early true humans from the australopithecines and paranthropines.

True humans became reliant on stone artefacts in obtaining and processing food to an extent not approached by the australopithecines. As stone artefacts are virtually indestructible, they are the most important trace fossils of true humans. Other examples of trace fossils are dinosaur footprints and the burrows of lower animals from some ancient time preserved in rock; they show where and, if dated, when the animals lived. Finds of early human fossils will always be rare because the chances of preservation are low. But stone artefacts, quickly blunted, had a high throw-away rate, and they are the most abundant 'spoor' indicating where people lived in the past.

It is generally assumed that flaked stone artefacts indicate the presence of true humans. Stone is flaked by percussion: this involves the use of a hammer, usually a pebble or small cobble, to strike a piece, called a flake, from a block of stone, called a core. Although the process is simple, it requires an appreciation of geometry and mechanics that is not inborn. Most would-be artefact-makers, when they are first faced with the task of striking off a flake in a practical class, find difficulty in directing the hammerstone at the correct angle and following through with the blow. When correctly struck, the flake is released with a cluck-like sound. Like many skills, percussion flaking is more easily learnt by imitation and becomes routine through practice. Being able to strike small flakes from a core with some degree of control puts one at the same level of technology as the earliest artefact-makers at about 2.5 million years ago.

Stone artefact technology may have developed out of a capacity for using and making tools from various natural materials, like plant stems, wood, bone and stone, at a level similar to or in advance of the capabilities shown by chimpanzees. Stone is not freely available everywhere. In areas of deep chemical weathering and lateritic soils, such as may be found under a rainforest, there are few stone outcrops. In younger eroded landscapes, such as those created by uplift and rifting of the eastern side of the African continent, rock outcrops abound. Here weathered mantles of rotten rock have been stripped away by erosion to expose fresh raw material and the landscapes, peppered with volcanoes and covered by extensive sheets of plateau lavas, provide plentiful raw materials. It is in these settings that the earliest traces of stone tools have been found; they are the kind of settings which encouraged the initial attempts at flaking stone.

Stone artefacts are relatively permanent markers of a human presence in the landscape. After extended periods, artefacts may have been moved by erosion, water and wind action from where they were originally made and used, to where they have become incorporated in the subsoil. However, the places where they are found are still a good approximation of the areas over which people once ranged. Artefacts occur in localised concentrations - what we refer to as archaeological sites - and they are assumed to mark camps (home bases) or tool-manufacturing sites. Isolated artefacts outside such concentrations - what the archaeologist Glynn Isaac has called the scatters between the patches - then become discards from activities or tasks carried out away from the camp sites.

The home base hypothesis is a simple but not simplistic explanation of the roots of human behaviour. It has been criticised for projecting what we know of the base-camp-living and food-sharing behaviour of modern hunter-gatherers back to the beginnings of emerging humanity. If people in remote times were not living in home bases to share food, how else can the scatters and patches of artefacts and bones in the landscape be explained?

The idea that the earliest humans were scavengers rather than hunters has gained general acceptance over the hypothesis, which was popular until the 1960s, that becoming hunters played the major role in human evolution. In arguing against the hunting hypothesis and the home base hypothesis, Lewis Binford has sought to explain the concentrations of artefacts and bones as marking places where people went to scavenge the minimal leftovers from carnivore meals. In his argument, these places do not indicate food-sharing behaviour, for the leftovers would have been in parcels too small to share. This thesis casts early humans in the role of daytime scavengers and foragers, following their route across the savanna, keeping an eye on the vultures and other indicators for available carcasses, and gathering plant foods on the way. The idea of routed foraging as the initial stage on the way to becoming human has its adherents but few would follow Binford in holding that humans remained locked into the scavenging niche for much of their history.

The downside of scavenging was the potential competition from dangerous carnivores. For those equipped with a basic Oldowan technology but lacking control of fire, predation by sabre-toothed and modern large cats and by hyenas would have posed a significant threat and encouraged living in larger groups. Some of the ways early hominids may have reduced the threat of carnivore predation are indicated by noisy chimpanzee groups crossing open places and, where necessary, wielding branches and throwing handfuls of earth. Pertinent to explaining why the artefacts may be concentrated in localised patches is the stone cache idea of Richard Potts. He argues that some places are revisited time and again because they are close to regular kill sites. In the process these locations would accumulate re-usable artefact raw materials and bone debris but would not be living places because they would attract predators.

If there were a progression from routed foraging in groups to living in and moving between defensible central places, this has still to be documented archaeologically. As the resource-defence model suggests, group size would have been vital as humans left the shelter of the trees and became ground-dwellers. The need to cope with living in large social groups for defence would have propelled selection for increased intelligence. Social intelligence of this kind could have been applied in other domains, and one of these was in the making of stone artefacts.

It was the descendants of one habiline species that colonised the whole of the African continent and all other parts of the world. The initial dispersal, sometimes called 'Out of Africa 1', occurred between 1 and 2 million years ago. New dates for deposits containing fossils and artefacts in Java and China suggest the initial dispersal may have been almost 2 million years ago. At Dmanisi in Georgia, on the way out of Africa through western Asia, a human mandible is said to date to 1.6 million years, a further hint of high antiquity for the dispersal. However, these new datings need confirmation. Even if the older datings for the Asian fossils are substantiated, the pithecanthropines of Java and China, the classic representatives of H. erectus, are younger than the earliest fossils of Homo from Africa. The most economical hypothesis is that the early humans from Asia, the pithecanthropines, represent a human population whose founders moved out of Africa.

Although mental or cognitive abilities would be set at the time of speciation, their potential would only be realised sometime after the event. This means that the emergence of a new species would not be marked by a set of novel behaviours which archaeologists could document. This would explain why the basic Oldowan technology persisted until after humans more advanced than the habilines had emerged. Archaeologists have documented Acheulian technology associated with the descendants of H. ergaster and early archaic H. sapiens (H. heidelbergensis) in Africa. To archaeologists, Acheulian technology indicates a new level of complexity and, by inference, of thinking.

From as early as 1.4 million years ago in Africa, and later in western parts of Asia as far east as India and in southern Europe, artefact assemblages have been found that not only have cores and flakes, but include large, shaped stone tools made to a pattern. These mark the beginnings of style: making things according to rules about what to make and how to make them. Being bound by rules, determined by a collective appreciation of what is right and proper, is what Glynn Isaac considered to mark the difference between protohuman and human behaviour. Style goes beyond the primary level of function, as it involves choice in the way different kinds of artefacts are made. Choice involves the mind: for this reason the making of bifaces is significant as a window on the mind of the ancients....The habit of making bifaces persisted for more than a million years before they went out of fashion about 250 000 years ago in Africa. It was a habit shared by peoples on a continent-wide and even intercontinental scale.

There are a number of interesting questions raised by Acheulian artefacts to which partial answers can be given. Why were bifaces made? Why did the design persist for such a long time? Why did bifaces have such a wide geographic spread? Bifaces were not ornaments and had a function as useful pieces of equipment. They are basically large and weighty pieces of shaped stone. There is no indication from the shaping that they were other than hand-held. Experiments at throwing them, using them as some sort of projectile, are unconvincing. Bifaces were shaped so as to produce continuous sharp edges and a point (a splayed point, in the case of cleavers). This suggests that they represent the best design for a hand-held large sharp tool. Its use in the hand rather than mounted on a handle limited the scope for any innovation and design changes.

The function of bifaces has been widely discussed in the archaeological literature. Inferences about function can be made from the edge damage that tools show and from what is associated with them. From the emphasis on the tip or, in the case of cleaver forms, the transverse cutting edge, and the general absence of heavy damage or utilisation on either the tip or the laterals, one can deduce that bifaces were used on soft, non-abrasive materials. This would rule out digging in the ground for roots or chopping at a hard material like wood. Of the many suggestions put forward, the most probable use was as a butchering tool, a heavy tool to get through the hide and sinew of an animal carcass. While a light-duty flake may do good service in cutting up an antelope, something heavier like a biface would be more appropriate for dismembering an elephant or hippopotamus.

There is an important clue in the setting of Acheulian sites. In southern Africa, where Acheulian sites are numerous, they occur normally in valley bottoms or wetlands. This indicates that Acheulian people had a preferred habitat: they were terrain specialists or, in technical terms, stenotopic as opposed to eurytopic. The fact that the riverine habitat is distributed in long narrow zones explains the continent-wide distribution achieved by Acheulian groups at the low density of population inferred for that time. This habitat is a relatively constant environment, buffered against seasonal changes, and productive of animal and plant foods. It is the favoured habitat of large herbivores, notably hippopotami, which as walking stores of fat would have been of particular importance. For Acheulian groups, occupying a narrow niche, there was no premium on innovation, and biface manufacture persisted unchanged for many millennia. Modern peoples, on the other hand, were able to range more widely in the landscape.

People using Oldowan technology were living in South Africa as much as 2 million years ago. Oldowan artefacts may be associated with a habiline at Sterkfontein and H. ergaster at Swartkrans. It is the widespread occurrence of Acheulian bifaces after 1.5 million years ago that is striking: these are distinctive and easily recognised artefacts. In this chapter it has been argued that the Acheulian artefact-makers were terrain specialists (stenotopic), inhabiting valleys and wetlands. This narrow linear distribution explains why even with low population densities Acheulian groups are found from the Cape to Morocco.

Emergence of Modern People, H.J. Deacon, J. Deacon

With the wide geographical distribution of humankind, local populations have become relatively isolated and regional differences have developed through natural selection. Peoples living in regions of the hot savanna climates tend to have a tall, slender build, as this biotype aids dissipation of heat, while those living in high latitudes tend to have stout bodies and short arms and legs because their biotype conserves body heat. In the same way the melanin of darker skins in the tropics gives better protection against the sun's ultraviolet rays, a cause of melanoma or skin cancer that can be fatal. Selection for lighter, more transparent skins in the cooler regions may aid in the synthesis of vitamin D, but the reasons are likely to be more complex and are not fully understood. While body build, skin colour, hair form and facial features are expressions of small biological differences, in general all living humans are remarkably alike. The reason is that modern people are members of a geologically recent species, H. sapiens, and their dispersal is even more recent. With the spread of agriculture, the development of trade, the growth of cities and the invention of means for long-distance travel within the last 10 000 years only, the world has become crowded with people and their numbers are doubling each generation.

Finds of the oldest fossils and artefacts show that Africa was the evolutionary centre for the emergence of the genus Homo. What is currently being debated is whether Africa continued to be the centre of human evolution and was the continent where modern people like ourselves, H. sapiens, evolved. But many scientists, including Colin Groves and his colleague Marta Lahr of the University of Sao Paulo in Brazil, make the point that it is only in Africa that a series of fossils that link H. ergaster / H. erectus to modern humans has been found. This is a strong reason to associate Africa with the emergence of modern humans.

Among the marks that distinguish the Neanderthals from modern humans are features of the back of the skull, a ridge (occipital bun) and hollow (suprainiac depression), features of the ear and nose regions (they had particularly large noses) as well as characteristic brow ridges, and lack of a chin. They were very strong-muscled, stocky people whose body form indicates a strenuous life in the cold and dry environments of Ice Age Europe. H. neanderthalensis, with a brain capacity as large as or larger than that of modern humans, appears to have suffered a wave-like extinction initiated about 40 000 years ago; the last survivors persisted in southern Spain until 27 000 years ago.

Most living populations have lost the robust architecture of the skull and limbs of early modern humans. Changes in diet and lifeways have reduced the need for heavy chewing, as reflected in the skull, and for strenuous activities, which required stout limb bones. However, early modern humans can be linked to living humans in the high vault to the skull, the retraction of the face under the brain case, and the presence of a chin.

Ofer Bar-Yosef, a Harvard-based archaeologist working in Israel, has argued that early modern people in the Near East were replaced by Neanderthals some 75 000 years ago, with the Neanderthals in turn being replaced by later modern people 45 000 years ago. This picture of one population replacing another is understandable if the geographical range of the cold-adapted Neanderthals, centred on Europe, expanded and contracted with the shifts in climatic zones during the Last Glacial period. Under the warmer climatic conditions of the early Late Pleistocene, African faunas expanded into the Near East and, with them, as the sites of Skhul and Qafzeh show, so did early modern humans.

The debate on the emergence of modern humans has polarised around two competing hypotheses. The hypothesis that enjoys the majority support, and the one preferred here, is that of the single centre of origin. In variant forms it is also known as the 'African Eve', the 'Garden of Eden', the 'Out of Africa 2' and the 'Recent African Origins' hypothesis, each with slightly different predictions. The last hypothesis, for example, which has amongst its prominent advocates Gunter Brauer and Chris Stringer of the Natural History Museum in London, holds that all living people are descended from a population that had its centre of origin in Africa and adjacent areas. This implies that early modern people everywhere share an African skeletal morphology. Dispersal of the early modern ancestors out of the centre resulted in the replacement or assimilation of archaic peoples elsewhere. As a consequence of the movement out of Africa, all parts of the Old World and the New Worlds of Australia and the Americas became populated. The recent African origin of modern peoples carries the further implication that geographical or racial differences cannot be traced back to archaic kinds of humans.

The second hypothesis, the multi-regional hypothesis, appeals to an evolutionary process known as anagenesis, a term that means that evolutionary change took place within a single lineage without branching into separate species. This hypothesis stresses continuity in evolution in the many regions to which humans dispersed after leaving their African homeland a million or more years ago. What has allowed the regional populations to evolve together is gene flow, not through large-scale migrations, but through people meeting and interbreeding. This implies that early humans like H. erectus were ancient forms of H. sapiens and that as a consequence of gene flow, modern humans have had many ancestors but no single centre of origin. In direct contrast, the single centre of origin hypothesis appeals to the evolutionary process known as cladogenesis or branching and holds that H. sapiens is a recently evolved and the only surviving branch of humankind.

All mtDNA in living populations can be traced back to a single ancestor, a woman, because mtDNA is inherited in the female line. This woman, dubbed the lucky mother, was not the only person alive but one in a population whose mtDNA did not become extinct. She would have been the mtDNA ancestor and other genes would have had other ancestors. As diversity or variety develops through time as a product of mutation, the mtDNA ancestor is the starting point, or point of coalescence, from which the diversity in contemporary populations derives.

The mtDNA ancestor lived in Africa. The source of new variation in mtDNA is mutation, and the number of differences in two mtDNA types due to mutation reflects how recently the individuals shared a common ancestor....An African mtDNA ancestor remains the probable explanation because African populations show more mtDNA diversity and more divergent mtDNA lineages than populations elsewhere.

The mtDNA ancestor lived about 200 000 years ago. This figure is calculated from measures of the amount by which human mtDNA sequences have diverged, and from estimates of the rate of divergence through mutation. Statistical errors associated with these calculations give a 95 per cent confidence that the date lies between 500 000 and 50 000 years ago.
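
The kind of calculation described can be sketched in a few lines. This is only an illustration of the arithmetic, not the published method, and the divergence and rate values below are hypothetical placeholders chosen to echo the order of magnitude quoted in the text.

    # Illustrative only: a back-of-the-envelope coalescence estimate.
    # The input values are hypothetical placeholders, not measured figures.

    def coalescence_time(pairwise_divergence, divergence_rate_per_year):
        """Years back to the common ancestor, assuming divergence accumulates
        at a constant rate along the two descendant lineages combined."""
        return pairwise_divergence / divergence_rate_per_year

    # e.g. ~0.3% average pairwise divergence at ~1.5% divergence per million years
    t = coalescence_time(0.003, 0.015 / 1_000_000)
    print(f"estimated age of mtDNA ancestor: {t:,.0f} years")  # ~200,000

The wide confidence interval quoted above reflects uncertainty in both inputs: the measured divergence and, especially, the assumed mutation rate.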

Harpending and colleagues point out that projecting the modern mtDNA diversity back in time to the point of coalescence, at about 200 000 years ago, does not mark the time when a new population or species arose. Rather, it tells us that at that time the effective ancestral population - the number of reproductive females - was between 1000 and 10 000: a very small population. This is a point used to argue against the multi-regional hypothesis on the ground that such a small population distributed between Cape Town and Beijing would have been spread too thinly to maintain the significant gene flow demanded by the multi-regional hypothesis.

In the latitudes of southern Africa, plant rather than animal food would have contributed the bulk of the diet... While plants were the dietary staples in the Middle Stone Age, hunting was also important. Hunting provided an essential high-protein supplement to a plant diet rich in carbohydrate....However, as bone preserves better than plant tissue, there is more direct evidence for the use of animal foods in coastal and inland areas.

In the Middle Stone Age there occurs the first evidence for the systematic use of marine foods. The Cape coastal sites have been well studied; they show seals, penguins and shellfish were eaten. Fishing, however, was not practised - the necessary technology may have been lacking. There is evidence for the active hunting of all sizes of bovids (Bovidae is the family that includes antelope and buffalo). Signs of active hunting, as opposed to scavenging, are to be found in the butchering of whole or near-whole animals on site; this is evident in the body parts represented and in cut marks and other damage on the bones. Hunting does not rule out opportunistic scavenging. It is very probable that seals were scavenged rather than hunted. With control of fire there was the possibility of driving even large carnivores off their kills.

In Klein's view, Middle Stone Age people did not use the animal resources in their environment as effectively as Later Stone Age people because in the Middle Stone Age the people were neurologically (mentally) different. There are, however, other possible explanations - for example, demographic, ecological and technological reasons - why in the Middle Stone Age there was a lower intensity in the use of resources - reasons that do not assume differences in intelligence.

The usual archaeological criteria taken to indicate the appearance of modern cognition are the behaviours that distinguish the anatomically modern Upper Palaeolithic Cro-Magnon people from the Mousterian Neanderthalers. The Upper Palaeolithic is characterised by the presence of art and personal ornaments, blade and burin stone technology, and the making of bone artefacts; these indicators or markers are not found at Mousterian sites. The Upper Palaeolithic, dating from about 40 000 years ago, is a regional phenomenon restricted to Europe and the Near East. It represents a period of intensification in the use of food resources under particularly favourable glacial climates, and the innovative use of specialised equipment for hunting and fishing. The large number of archaeological sites of this period indicates significant population densities. As population densities rise there is more need to identify individuals and groups: this explains the occurrence of bone and shell ornaments, art and elaborate artefacts used as symbols in social signalling. All people in the world use symbols - a flag, a style of dress or objects like a motor car - to communicate their social position. The Upper Palaeolithic Cro-Magnons made use of symbols, but their Neanderthal cousins did not.

Once human populations left the safety of the valleys and began to occupy entire landscapes, regional differences emerged. One should expect that in regions as different in climate and resources as Eurasia and Africa, different cultural and population histories would be recorded. Indeed, the Upper Palaeolithic was not a universal stage in human history: in sub-Saharan Africa there was no phenomenon like the Upper Palaeolithic. There were not the same conditions of predictable, seasonally abundant food resources needed to encourage particular technological innovations and permit population increase. Indeed, at the time when populations were expanding in Eurasia, they appear to have been contracting after an even earlier expansion in Africa. Thus the usual Upper Palaeolithic indicators for the appearance of modern cognition have no relevance for the archaeological record in Africa. Another important difference is that anatomically modern people were present in Africa in the Middle Stone Age, tens of thousands of years before their appearance in the Upper Palaeolithic in Europe.

If the Middle Stone Age is associated with anatomically modern people, then the question arises, Were these people modern in their way of thinking or, as Klein argues on economic grounds, modern in body but not in mind? It is a logical assumption that mental or cognitive abilities would have been set at the time of speciation, and early anatomically modern Middle Stone Age people 100 000 and more years ago should have had the capacity for modern behaviour. If we take into account appropriate criteria, and not those that separate Neanderthals and Cro-Magnons, the indications are that Middle Stone Age people did think like us. On this reckoning the criteria for recognising modern cognition in a southern African context would be those behaviours that link Middle and Later Stone Age and contemporary hunter-gatherers. These criteria would include the following:

• Family foraging groups. The evidence is in the occurrence of small circular hearths that are associated with food waste. These are domestic hearths and they are universally 'owned' by a woman with a family.
• Strong kinship ties allowing foraging groups to split up and disperse at some times, and at other times to come together and form larger aggregations. The evidence is indirect but is indicated in the dispersal of sites in the landscape. The earlier peoples who made the Acheulian artefacts, by contrast, were habitat specialists and lacked this ability to disperse.
• Active hunting, as evidenced in the faunal remains.
• Ability to manage plant food resources with fire, as indicated in the collection of geophytes.
• Capacity to communicate by the use of symbols. The prime evidence is in the use of red ochre. Throughout Africa, a triad of colours, red, white and black, serves as colour symbols. Red symbolises blood and life and is featured in rites of passage like birth, initiation into adulthood, marriage, and death and burial.
• Reciprocal exchange of artefacts. The primary evidence is in the selection of exotic raw materials for making artefacts, like the Howiesons Poort backed tools in the Middle Stone Age. Gift exchanges are very important in maintaining interpersonal relations among contemporary hunter-gatherers, and Lyn Wadley has argued that they were equally important in the Later Stone Age. Craft specialisation in the manufacturing of Stillbay points in the Middle Stone Age also suggests a high antiquity for reciprocal agreements.

Our Distant Past, P. Ehrlich, A. Ehrlich

When most dinosaurs were eradicated in the wake of a collision of an extraterrestrial body with Earth about 65 million years ago, many ecological niches, such as egg-eating predator or large plant-eater, were left for mammals to evolve into. One group of mammals that arose just after the decimation of the dinosaurs was characterized by grasping hands, binocular vision with depth perception, a complex brain to help them deal with complicated social arrangements, and predominantly single births followed by the infant's prolonged dependence on the mother. That group, the primates, began an adaptive radiation some 50 million years ago, and human beings are on one branch of that radiation.

On that branch, the discovery of "missing links" galore has shown that the human family tree, once thought to be a rather simple evolutionary sequence, was actually a complex evolutionary "bush" with much speciation in the past. And in recent years, traditional fossil evidence of these events has been magnificently supported by the use of new molecular techniques that get at the very basis of genetic change and are even allowing us to determine the genetic differences between living people and their close fossil relatives.

If the line from early primates to humans is the most discussed period of our evolution, it is not the earliest: our very first ancestors— the first living organisms—arose more than 3 billion years earlier. At its most basic, the entire process of genetic evolution is relatively straightforward. It involves a combination of a genetic system (a set of interacting elements that replicates itself, with some variation) enclosed by a membrane that separates it from its environment (thus making it an organism) with a mechanism, natural selection, that leads to some variants replicating more than others. Such an enclosed genetic system subject to differing environments has made possible the now-vast diversity of life (biodiversity) that evolved once life had begun, and the eventual appearance of our planet-dominating species.
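
The bare mechanism just described (variants that replicate at slightly different rates, with occasional copying errors supplying new variation) can be shown in a toy sketch. It is purely hypothetical and not drawn from the source; it only illustrates how a small replication advantage tends to carry one variant to fixation.

    # Toy illustration (hypothetical, not from the source): natural selection as
    # differential replication of variants, with rare mutation supplying variation.
    import random

    random.seed(1)
    population = ["A"] * 50 + ["B"] * 50      # two variants, equally common at the start
    fitness = {"A": 1.0, "B": 1.05}           # B replicates slightly more often

    for generation in range(200):
        weights = [fitness[v] for v in population]
        offspring = random.choices(population, weights=weights, k=len(population))
        # rare copying errors keep introducing variation
        population = [random.choice("AB") if random.random() < 0.001 else v
                      for v in offspring]

    print(population.count("A"), population.count("B"))  # B is usually near fixation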

Yet one part of the story is not simple. A fundamental question remains unresolved—that of where and how life itself began. There are few problems in understanding the creation, in a lifeless world, of the chemical building blocks of life; scientists have made many of them in laboratories under conditions that simulated those hypothesized for the early Earth. But how those building blocks became concentrated and assembled into metabolizing, self-replicating entities with capsules separating them from their environments is still unknown. Biochemists, geologists, and other scientists have been working on pieces of the puzzle, but they can't even rule out the possibility that life reached Earth on a meteorite originating on Mars. There are still two basic schools of thought. One is that life would naturally evolve on any Earth-like planet in the universe. The other is that it was a rare event that, about 3.8 billion years ago, simply happened on our planet. Life may have begun as a primitive chemical cycle on the surfaces of rocks, with the development of mechanisms for replication and of cells with membranes that could provide isolation from the physical environment occurring later as a result of selection. Or it might have started with the appearance of a self-replicating molecule such as RNA, a single-stranded close relative of DNA, which still functions in modern cells and which can serve both as a carrier of information and as a catalyst (the latter role being played by proteins in today's organisms). Or it might have been some combination of the two.

What most scientists consider the main evidence for evolution's occurrence, the fossil record, is so rich now that it would take many volumes to describe it (indeed, many have been written). The validity of scientific interpretation of that record is supported in many ways, but perhaps the most interesting is the repeated finding of animals and plants that had been described on the basis of fossil specimens but that later turned up alive and just as described from ancient fossils.... In the early days of evolutionary science, much attention was paid to the question of "missing links." If people were really descended from monkeys, where were the fossils of intermediate forms? Why hadn't people found traces of the links between birds and reptiles if the former really were derived from the latter? In the century and a half since Charles Darwin, scientists have learned some answers. Certain kinds of organisms don't easily become converted to fossils, and birds are among them. Some habitats don't allow good preservation, and lots of birds live in those places. And in some cases scientists just hadn't looked hard enough....In general, the other principal gaps in the fossil record, such as the ones between fishes and amphibians, reptiles and mammals, and chimpanzees and human beings, are now occupied by fossils of organisms with intermediate characteristics.

Chimpanzees, the most humanlike apes, have brain volumes of about 400 cubic centimeters (cc). Modern people average about 1,350 cc. So where are the roughly 900-cc "missing links"? In 1856, just before Darwin published On the Origin of Species, a fossil of a different kind of human being was discovered in the Neander Valley of Germany. "Neanderthal man" was clearly different, but not that different. It had a brain size roughly like that of modern people (actually, when more specimens were discovered, it turned out the Neanderthals' average brain size was slightly larger). Interestingly, noting this large brain size in an ancient human being seems to have been the only written notice Darwin took of Neanderthals. They looked at best like a recent ancestor, and they were too much like us and not chimplike enough to seem like a real missing link. But Neanderthals were the first clear evidence that Homo sapiens wasn't the only "human being" ever to populate Earth.

Half a century ago, the human family tree seemed pretty simple and linear—more a "family pole" than a tree: chimpanzee-like ape → Australopithecus → Homo erectus → Homo neanderthalensis → Homo sapiens. In the view of anti-evolutionists, of course, those discoveries simply created more spaces that needed missing links. Where, for example, they asked, is the intermediate form between Australopithecus and Homo erectus? The answer is that at least two links subsequently have been discovered in that gap: H. habilis and H. ergaster, with brain volumes in the 600- to 900-cc range. The human family pole—the hominins (the technical term for humans and their ancestors after they diverged from the chimpanzee line)—has in the past fifty years become the "family bush." During much of the past, several human (hominin) species were living at the same time.

Just why the chimp-human lines diverged, and especially what role was played by habitat alterations generated by climate changes, a leading environmental contender for the cause, is not settled. Scientists, using a variety of methods, including chemically analyzing the shells of single-celled organisms in ocean sediments (foraminifera), have concluded that about 10 million years ago the climate of Earth cooled, and forests in Africa thinned and shrank in area. Our ancestors may have increasingly taken advantage of enlarging savannas or mosaics of woodlands and open grasslands, and, in connection with that, may have been selected for upright bipedal posture for its putative advantages when spending time in the open—although recently some scientists have suggested that bipedalism evolved in the trees as an adaptation to walking on springy branches. In any case, recent evidence suggests that the upright Ardipithecus line gave rise to the other australopithecines, starting with Australopithecus anamensis and later A. afarensis (Lucy).

Fossil and genetic evidence have demonstrated that, rather than being a recent ancestor of Homo sapiens, H. neanderthalensis split off from our ancestral line more than half a million years ago and evolved alongside it. Many questions remain about the Neanderthals, including how similar their speech was to that of modern H. sapiens, why they died out, and especially the degree to which our ancestors outcompeted or interbred with them. Recent evidence suggests that, at least in terms of quality of tools, the Neanderthals were in a league with our ancestors, though few paleoanthropologists think there was much, if any, interbreeding. Scientists have recently managed to extract ancient DNA fragments from Neanderthal bones, perhaps making it possible to sequence the entire genome of 35,000-year-old Neanderthals, and an attempt is already under way. That could help answer some of the questions.

The ancestral line leading to the apes diverged from the line leading to African and Eurasian monkeys about 30 million years ago; the line leading to gorillas, chimpanzees, and people separated from the line leading to orangutans about 18 million years ago. That started a trend toward spending less time in the trees, a good thing from the human viewpoint, since a dominant animal would not have been likely to evolve in an arboreal habitat—too tough to develop agriculture, among other things. The chimp line split from the human line roughly 6 to 7 million years ago. Some scientists have interpreted genetic evidence to suggest a fairly substantial hybridization between the chimp and human lines at first, with the chimps and bonobos (sometimes called "pygmy chimpanzees") diverging some 2.5 million years ago. Some of those evolutionary changes are thought to be related to the climate shifts that changed the distribution of habitats, with first gorilla ancestors and then chimp ancestors remaining behind in forests while hominin precursors became ever more erect as they moved into drier, more exposed savannas.

Numerous explanations have been proposed for the evolution of upright posture, from allowing more efficient travel, freeing the hands to hold tools or weapons, and exposing less body area to the hot savanna sun to permitting upright displays, which would suppress competition in the new, relatively open and resource-poor habitats. Upright posture would also raise the eyes to a level at which they would be more able to spot approaching predators. Whether we'll ever know the definitive answer to the "why upright" question is doubtful.

The spurt of brain growth from about 400 to 1,350 cc probably was a response to selection pressures originating in the increasing complexity of human social behavior. Such changes in the rate of evolutionary change are common, so hominin brain size remaining more or less constant for a few million years and then tripling in a few million more is not an unusual sort of evolutionary phenomenon. In the absence of powerful selection pressures to change, a species' characteristics may remain constant for long periods of time but then, in the face of an environmental stressor, quite rapidly adjust genetically. This common pattern of variation in evolutionary rates has long been known. In the early 1970s a situation of relative stasis followed by rapid evolutionary change was given a special name, "punctuated equilibrium," which some people still misinterpret as some kind of novel evolutionary process.

It is generally assumed that there were two human "out of Africa" episodes — departures from the cradle of humanity to distant lands. The first exodus was more than a million years ago, almost certainly as much as 1.7 to 1.8 million years ago, and the emigrants were Homo ergaster or their immediate descendants, H. erectus. The second departure occurred more than 5,000 generations after modern Homo sapiens evolved in Africa, about 200,000 years ago. Our own kind left their African birthplace, spreading first to the Near East (the Levant) perhaps 120,000 years ago, and then to much of the Old World some 65,000 to 40,000 years ago. They subsequently occupied the rest of Earth's land areas and eventually replaced all other hominins that had persisted until then. The role of climatic events in their dispersal and the part that invading modern Homo sapiens may have played in the extinctions of our hominin cousins are still debated. For instance, there have been extreme oscillations in Europe's climate over the past 100,000 years, and the severity of climate change may have reduced supplies of game and pushed less adaptable Neanderthals to extinction, with or without competition from modern Homo sapiens who lived in the same areas.

Interestingly, there is a connection between the ability to make stone tools and the ability to speak. Small-brained chimps fully understand the uses of tools and can smash rocks on a concrete floor and select sharp shards with which to cut a cord and gain access to food in a box. But they do not have the fine hand-eye coordination that would let them manufacture sharp tools by striking them from a rock core (a task difficult for even modern people to master). It turns out that the same kind of neuromuscular coordination required for stone tool manufacture is also essential to the ability of our tongues to undergo the incredible gymnastics that are required to produce speech. But when this ability arose in the past is unclear. Some scientists speculate that true language with syntax emerged suddenly with Homo sapiens, while others think that language started early with gestures and grunts, and that H. habilis would have been able to communicate verbally to a significant degree. A puzzle related to these advances, as we'll see, is the reality and cause of what has become known as the "great leap forward," a cultural "revolution" that seems to have occurred about 50,000 years ago (or somewhat earlier) and greatly accelerated our rise to dominance.

Of Genes and Culture, P. Ehrlich, A. Ehrlich

Just as human beings do, chimps learn from others, and teach others, how to deal with their biophysical and social environments. In some wild chimp societies the young find out in this manner how to open hard nuts; in others they learn techniques to "fish" for termites (which can be a very complex task) and even to make "spears" (possibly just probes) that they insert into tree holes to flush or kill the small nocturnal primates called bush babies that spend the day there. Some socially transmitted traditions are also found in other non-human animals. For instance, among birds called oystercatchers, the young of different local groups learn, by imitating their parents, quite different traditional ways of opening oysters.

Chimp culture (and that of oystercatchers) is very different from the complex cultures of modern human beings. The transmission and alteration of humanity's vast stores of cultural information have been the key to Homo sapiens' rise to dominance—the reason why the world isn't run by chimpanzees. First of all, non-genetic information that can be transmitted between generations of chimps takes the form of behaviors—not, so far as we know, ideas. And their behaviors often involve tools such as sticks and rocks, things that are common in the environment, rather than artifacts that they have created for special purposes. Should the cultural transmission (by imitation) be disrupted, one can easily imagine any given behavior, because of its relative simplicity, being reinvented by an individual chimp through trial and error and then imitated by others. But if even the human culture of Plato's day (let alone today's) were entirely lost, one would not expect an individual to be able to reestablish many of the behaviors by experimentation— be they building chariots, leading great armies in battle, or writing sophisticated dramas. It would instead take the contributions of large numbers of people and hundreds or thousands of generations—if it could be accomplished at all under whatever environmental conditions prevailed.

Those big brains that our ancestors evolved over the past couple of million years have allowed human beings to enter an entirely new realm of evolution, one of large-scale cultural evolution: change in that unprecedentedly vast pool of non-genetic information stored in human brains and in the artifacts those brains have devised. Speech, which allows ideas to be transmitted, and writing, which furthered the conceptual and geographic range of that transmission, created an unbridgeable gap that separates Homo sapiens from all our living relatives. They allowed cultural evolution on a scale unimaginable in speechless, illiterate apes whose only tools are sticks and rocks and whose evolution has thus remained largely a matter of modification of the relatively small amount of genetic information that they (and we) possess.

Ralph Linton defined [culture] as "an organized group of learned responses characteristic of a given society," and he defined a society simply as "an organized group of individuals." We'll basically go with Linton and define "a culture" (as opposed to culture itself) as a social group's collective behaviors that persist for generations and that are not, in essence, independent of social context (e.g., sneezing is a behavior not normally considered part of culture).

The first records of human culture go back some 2.5 million years, very likely to Homo habilis and those simple Oldowan stone tools. Interestingly, and probably not coincidentally, that was also roughly the start of the period of great brain expansion in our line. One of the most interesting aspects of the human prehistoric cultural record, preserved primarily in stone tools until some 50,000 years ago, is how slow the rate of early human cultural evolution appears to have been. In the Paleolithic period, or Old Stone Age (2.5 million to 12,000 years ago), the first cultural tool kit—flaked stones and sharp-edged flakes struck from them (Oldowan)—lasted 700,000 to 800,000 years. The second (Acheulean), which added more complex stone tools such as hand axes, cleavers, and picks, emerged with Homo ergaster and persisted for 1.5 million years. Culture as preserved in stone showed real refinement beginning only about 700,000 to 600,000 years ago, when Africans began to produce hand axes that were finely trimmed and often remarkably symmetrical when viewed from the side or the end. They also began to produce a wider range of flake tools, many of which were secondarily modified into scrapers, knives, and the like.

About 50,000 years ago, technology exploded in the African Later Stone Age and European Upper Paleolithic period, and new dimensions of human activity began to appear in the archaeological record. This transformation was one of the most dramatic and abrupt in the prehistoric record and may have been the most important in our history. Called the "cultural revolution" and popularized by Jared Diamond as the "great leap forward," it involved the development of new, sophisticated technologies with more diverse and standardized stone tools, and the first appearance of bone, ivory, and shell objects that were deliberately shaped into projectile points, needles, awls, and so forth. Added to that was the appearance of a diversity of very fine stone tools, a flowering of cave painting, sculpture, and body ornamentation, and signs of ritual (including burials), accompanying a spurt of population growth and, some scientists believe, a great advance in language skills. Perhaps more important, Homo sapiens after the leap rapidly replaced other hominin types in Eurasia, with little or no evidence of interbreeding or cultural exchange with them.

It seems quite reasonable that the great leap can be traced to a reorganization of the brain roughly along present-day lines that opened the door not just to art but to a cultural evolutionary sequence that started with replacing all other hominins and ended up producing books, computers, and (through sanitation and medical advances) population explosions. In short, the evidence doesn't demonstrate that human brain wetware underwent a rapid selectional shift before the great leap, but it certainly makes it seem likely.

Intimately connected with the questions surrounding the great leap forward are those of the development of language with complex syntax (that is, meaning embodied in relationships between the words in a sentence). Possession of such a language is the foremost characteristic distinguishing today's Homo sapiens from all other animals. It is also at the root of that other preeminent feature of humanity we've just been discussing: the possession and manipulation of vast amounts of nongenetic information—that is, a rich culture. The physical infrastructure necessary for language, including a big, complex brain and structures in the mouth and throat (tongue, larynx, vocal cords), had to evolve genetically before modern verbal communication and the cultural evolution of language could occur. Especially important must have been the development of the neuromuscular machinery that enabled the tongue to perform the impressive movements required for speech. A disproportionately large part of the brain's motor control area is devoted to control of lips and tongue.

Linguists must depend on speculation about the evolutionary transitions from grunts, purrs, and barks that indicate mood or the presence of danger to the development of word symbols—purely arbitrary sounds to represent things in the real world or to express relations among things, or even among other words and concepts. The system of vervet monkey alarm calls shows that arbitrary symbolic verbal expression is not limited to people; to indicate the threat of a hawk, the monkey makes one noise, easily distinguished from another noise it makes to indicate a snake. While both noises are distinct, neither has any obvious connection with the predator (e.g., the hawk sound doesn't resemble a hawk's scream). The monkeys have developed a culture (and, by some definitions, elements of a primitive language) that associates these noises with information.

Evolution of large and complex human brains made possible language with syntax and the storage of vast amounts of information, indeed made possible both "humanity" and "culture" as people commonly understand them. Modern human beings' big brains (compared with those of the great apes and our hominin ancestors) are our most important physical characteristic. Our brains evolved their impressive dimensions relatively rapidly over the past 2.5 million years or so, well after development of our other physical hallmark, upright posture, and over about the same period that our ancestors left stone clues to their culture.

We think the best current explanation of why human beings have evolved big brains is related to the growing ability of a highly social primate to discern, imagine, and project the thoughts residing in the brains of other members of their group. In other words, people have come to have a highly developed "theory of mind." Why did humanity's capacity for a theory of mind evolve? What was its selective advantage? The short answer is that it was needed to deal with the complexities of increasingly smart animals living in tight social groups. In the human evolutionary line, brains evolved not only to adjust, plot, plan, and maneuver their owners' lives but also to adjust their behaviors to those of other plotting, planning, maneuvering, and signaling individuals with whom they could empathize and cooperate (or compete) and whose minds they would continually try to read. And in evolving empathy, our ancestors' brains gave us the capacity to develop ethics.

Something is now known of a possible physical basis for a theory of mind. Homo sapiens and other primates have elements in our brains (originally discovered in monkeys) called "mirror neurons" that respond both to our own actions and to those of another individual acting similarly. When we reach for food or when we see another person reaching for food, exactly the same neurons in our brain discharge. Indeed, some scientists have suggested that the mirror neuron system, which is involved in gesture imitation, may have been the evolutionary development tens of millions of years ago that eventually led to language. Hand and mouth gestures are related, and understanding the abstract gestures of their companions (e.g., shrugs) may have been the precursor to understanding abstract noises made by others. Indeed, some parents communicate with their children with symbolic gestures before the children have much grasp of spoken language. But so far no scientist has been able to connect mirror neurons directly with a theory of mind or with empathy—for which the theory clearly is a prerequisite.

One way to view cultural evolution is as the changing pool of stories being narrated in the brains of human populations, just as their changing pool of genes constitutes genetic evolution. The most mystifying thing about our brains is how the many trillions of connections among neurons bathed in hormones generate those narratives that we think of as "consciousness." British psychologist Nicholas Humphrey has developed an intriguing speculation about the evolution of consciousness. He, like many neurobiologists, ties consciousness to bodily sensations—the presence of mental representations rich with "feeling," of "something happening here and now to me." Sensations contrast with perceptions, which are bits of news without emotional content about "what's happening out there." The distinction is a fine one. The chemical exuded by a rose, for example, is perceived as having a sweet scent, but it also gives a person the sensation of being sweetly stimulated and may elicit further sensations of, say, being with a lover. Humphrey's hypothesis is that sensations occur at the boundary between an organism and its environment and in human beings are generally registered as sight, touch, sound, smell, and taste. When direct sensations are absent, mental representations are accompanied by reminders of sensation; for instance, some thoughts are "heard" as quiet voices within the head and would fade without that component of sensation.

In Humphrey's view, there has been a gradual evolution of consciousness through a shortening of the sensory feedback loops. Originally, among our very distant ancestors— say, a simple worm—the input of sensation from the organism-environment interface triggered a nervous response to the stimulated part of the body surface. Then, gradually through evolutionary time, that circuit shortened, and the target of the responses moved inward to a surrogate area on the surface of a slowly evolving nerve center, which eventually became a brain. Nerve impulses arriving in the brain created conscious experiences. It seems a plausible evolutionary story, but it's at most a starting point for thinking about how consciousness evolved.

We suspect that today only human beings have what Paul has described as "intense consciousness": "we analyze our physical and social surroundings, remember past events, and 'talk' to ourselves about those analyses and their meaning for our future. We have a continuous sense of 'self'—of a little individual sitting between our ears—and perhaps equally important, a sense of the threat of death, of the potential for that individual—our self—to cease to exist."

Our minds are a function of our brains, which have been formed by an interaction of inherited genetic "instructions" with environmental information both in the womb and throughout postnatal life. That is an ongoing process, which in a social context means a constant dialogue with others and continual observation of events that lead to a constant updating of our views of the past, probabilities of future events, our opinions of others, and our ethical views. We learn the evolving norms of our social groups and adjust our views of what we will and (perhaps more important) what we will not do in various situations.

The world interacting with our brains is not totally determined (e.g., one cannot predict when any given radioactive atom will decay), and it seems reasonable to assume that our social behavior is not all preordained either. The question of how much of our activity is in some sense automatically programmed (evidence from new imaging techniques indicates that our brains can decide on many actions before we become aware of the decisions), or "constrained," and how much is "free" remains a philosophical one. But in many if not most cases we apparently can override automatic decisions, sometimes to act "responsibly." Neuroscientist Michael Gazzaniga said that "the brain is determined, but the person is free." Intriguing, but, like many statements about free will, not totally helpful. Our brains are a product of genetic evolution, key to the evolution of our complex culture, and the organ responsible for our global dominance. But scientists, and perhaps philosophers, have a long way to go even to approach a full understanding of them.

Do the great leap and the explosion of human culture mean that cultural evolution overrides genetic evolution in human beings? The issue is not dissimilar to that of whether the cause of an individual behavior is "genetic" or "environmental" (with much of the critical environment being cultural). The latter common question is a misleading one. Genes cannot function without an environment. Equally, there cannot be an environment without genes. Environments exist only relative to organisms, being all the external factors that interact with those organisms. Genes and environments work together to shape the phenotype, the actual observed structural and behavioral characteristics of an individual, and that individual in turn affects the surrounding environment.

Inseparable though they are, a dramatic difference between genetic and cultural evolution is the amount of information each has to operate on. Human genetic evolution is a story of changes in a mere 20,000 to 25,000 genes, involving roughly 3 billion nucleotide base pairs (the "rungs" of the DNA "ladder") which are involved in encoding the structure of proteins....The way cultural information dwarfs genetic information is inherent in the contrast between some 25,000 genes, each averaging 120,000 base pairs, and the information storage capacity of the human brain. The brain has hundreds of trillions of connections between nerve cells, so even if we very conservatively equate the information storage capacity of a connection to that of a trio of base pairs, the 1,000,000,000 base pair trios are up against hundreds of thousands of times as many brain connections, and each gene would need to program more than a billion synapses.
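
The comparison can be spelled out directly from the figures quoted above. The sketch below simply restates that arithmetic; the brain-connection count is the text's "hundreds of trillions" taken, for illustration only, as five hundred trillion.

    # Back-of-the-envelope restatement of the figures quoted in the text.
    genes = 25_000
    base_pairs = 3_000_000_000               # ~3 billion nucleotide base pairs
    base_pair_trios = base_pairs // 3        # ~1 billion trios
    connections = 500_000_000_000_000        # "hundreds of trillions" (illustrative value)

    print(connections / base_pair_trios)     # connections per trio: hundreds of thousands
    print(connections / genes)               # synapses per gene: tens of billions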

An Evolutionary Approach to the Origin of Mind, W. Noble, I. Davidson

Our book attempts to show how 'mental capacity' might best be conceived in the light of recent developments in psychology, social science (including philosophy), and animal ethology. Following that conceptual clarification we turn to consider the evidence of the archaeological record to evaluate how, in a context of natural selection, such a capacity did emerge. We begin with a foretaste of the arguments we will expand and defend throughout, to the effect that mindedness in human terms is inseparable from language use, and that gestures and fixed visual images have a key function in the origin of behaviour that today may be described as linguistic.

In our view attention should be directed to what people do in their communicative interactions with each other; we should not be sidetracked by tendencies to abstraction. Here we suggest a parallel; indeed, one which is pertinent to our evolutionary story in several respects. Most people with two legs engage in a range of behaviours using them. Amongst other things, they use their legs to walk. Usually they do this effortlessly. Nonetheless, humans had an ancestor which walked on all fours—a gait we discuss in chapters 2 and 6. Walking bipedally emerged in our ancestors during their evolution. More closely tied to everyone's experience, infants and babies do not walk as adults do. They learn to walk that way. The way people walk varies within and between communities; it varies from time to time, and on different occasions, for individuals. There are many different ways of walking, yet communities, and persons within communities, have a characteristic way of doing it. We recognise people by the way they walk, even from the sounds of their footsteps. How walking is done is quite complex; it involves the coordinated use of many sets of muscles. And by walking, many different things can be accomplished. We can appreciate all these things about walking without having to invoke a mysterious abstract entity called, say, 'ambulation', of which the more everyday activity of walking is somehow a material expression.

The ancestors of modern human beings, for some period close to the branch point with other apes, did not habitually walk on their legs alone. In just the same way, those ancestors, for some time subsequent to the branching with other apes, did not talk. How either behaviour came to be expressed, and why it was selected, are intelligible questions within an evolutionary framework, and we aim to answer the second of them in this book. Of continual concern in the argument is to explain how human speaking (or its equivalent in Sign language), as a behaviour, is distinct from all other forms of communication. If we explain this properly we believe the nature of the human mind is made comprehensible.

For the sake of forecasting the argument about mind and language it must be stipulated that we are, ultimately, telling a story about how communication became such that its intentional character is unquestionable. That state of affairs must stand in contrast to one in which communication does not have an intentional character. But this is where language actually gets in the way, for we are considering a circumstance we cannot possibly imagine, namely, a condition of being aware without the use of language as a feature of that awareness.

Human communication of the form we call linguistic is communication using symbols. In our usage, a symbol is anything that, by custom or convention, stands for something else. A symbol is a representative of another thing; it is like an ambassador for that other thing. There are two important elements in this definition: (1) that the thing stands for something other than itself; (2) that it does so by convention, that is to say, by social custom. Anything can be a symbol and, in human life, almost anything is. Pictures are things that stand for other things.

There are several interesting features in the evolution of the ancestors of modern humans. (1) The brain gradually expanded over a period of about 2.5 million years; (2) stone was being flaked through most of the same period; (3) flaked stone was continually being used in association with scavenging food from dead animals; (4) over a period of at least 1.5 million years, little seemed to change in the behaviour of these creatures whose brain size was increasing. Many hypotheses have been put forward to account for the increase in brain size. Besides an hypothesis that accounts for some portion of the increase as a change in overall body scale, the further one we find plausible is that the hominids were increasing the refinement of their motor control, particularly their forelimb control. The archaeological record establishes the expression of bipedal locomotion by about 3.5 million years ago. The forelimbs were thus increasingly useable for the transport and breaking of stones. In the motor control hypothesis—proposed by Darlington and developed by Calvin—the particular capacity that was continuously improving was stone throwing. The capacity to throw further, faster, and with increasing accuracy, and thus to throw stones at rivals, at predators and at prey, would confer endless advantage over fellows who could not do this. Such an improvement can explain the expansion of the brain and the absence of any other change in the archaeological record.

We have argued that the aiming of missiles for successful throwing can in turn support aiming without throwing: what is called pointing in the modern world. Pointing is an intentional behaviour—one original meaning of the word 'intend' is 'point'—and it seems to be unique to humans. In fact infant bonobos (pygmy chimpanzees) make a pointing gesture, but do so in contexts such that their natural caregivers cannot witness the behaviour, as when an infant is being carried on an adult's back. Of course, even if this gesture were expressed in ways more visible to adults, it might not call forth a response from them. Uniquely human is that human infants witness older others pointing long before they themselves do it, and are keenly attended to and imitated by those adults whenever they start to.

To complete the synopsis of our evolutionary scenario: pointing can reveal the whereabouts of predators or prey to fellow creatures, and in silence, thus without warning of one's own presence. The refinement of control of the forelimbs allows for the possibility of their controlled movement in following the path of a prey or predator animal; it also allows for the possibility of making gestures that distinguish prey from predator. We round off this story with the proposal that the leaving of a trace of such a gesture in a persistent form creates a meaningful object for perception. The trace of the gesture is meaningful because of the salient links among the gesture, the object that provoked it and the communicators. It is in this complex of behaviours and their products that we see the prospects for the sign itself (the trace of the gesture, hence the gesture itself) to become noticed, as against being simply the means for drawing attention to something else. The vervet calls draw the attention of the others to something significant in the landscape. Inscriptions of analogous utterances, i.e., forelimb gestures, following the sort of evolution just outlined, may be the mechanism whereby such signs are discovered as objects that represent things other than themselves. Thus are symbols born.

Many of the features which we regard as unique are more or less direct consequences of the emergence of language, of communication using symbols. In order to make this argument we will cover territory unfamiliar to some parts of the audience which might be interested in our story. There will be archaeologists and prehistorians unfamiliar with the philosophical and psychological aspects of the account, and there will be philosophers and psychologists unfamiliar with the archaeological aspects. Much of the work will address these aspects and show how they are tied together.

Not all the possible definitions of what it means to be human can be reconciled. One approach is to look to the anatomy of modern humans to define human uniqueness, but this runs into problems arising from the variation existing in different populations around the world. Moreover, for humans and our ancestors, there is an unparalleled record of fossil specimens which tends to emphasise the continuity of anatomical variation across geographical space and also through time—bearing in mind that variability in space and time must be considered together. Such definitions may, therefore, be practically unsuitable if we wish to identify the point in time, or the place at which, our ancestors became human.

Our criterion for symbol-based communication is 'all-or-none'; we have difficulty with notions of 'protolanguage', 'rudimentary language', or 'language as we know it'. Such ideas have been voiced to try to gradualise the development of symbol use. We do not think this can work. As with the notion of something having, or not having, 'meaning', symbols are either present or absent, they cannot be halfway there. Such an argument is consistent with the idea expressed earlier of discovery of the property of one thing as able to stand for another. 'Discovery' is an achievement, a binary condition—something is discovered or it is not. Nothing can be 'half-discovered', though something may be discovered that turns out later to be but half of something larger, hence 'half-discovered' can make sense in that case.

We will argue that there is good evolutionary evidence, but it is not in the fossilised remains of human ancestors; rather it is in the products of their behaviour. The interpretation of this evidence of ancestral products is made more difficult because, for much of the older evidence and for the standard story that has built up because of it, there has been little thought about the issues of the mental capacities of the makers. Interpretations reflect unrecognised assumptions that the mental capacities of ancestors were like those of the archaeologists and others who make the interpretations, as well as interpreters' ideological commitments to uncritically accepted views of 'mind'.

The view that crucial human abilities are innate has a long history which, in western thought, may go back to at least the Greek historian and mythologiser, Herodotus. Belief in the innateness of language lives on in the work of Chomsky and his followers, though now the definition is extended beyond vocabulary to grammar and syntax. Extended reification results in the belief that there is such a 'thing' as language, which has a 'life' of its own, somewhere inside the equally metaphorical mind.

We asserted that gradualism cannot work when it comes to matters of meaning or symbol use. Pinker and Bloom would seek to counter that. They point out that barely comprehensible utterances—the analogues of 'fractional' meanings—can nevertheless be made some sense of. They cite such things as the 'agrammatical' telegraphic style of newspaper headlines, malapropisms, and stories in a foreign language newspaper (for people with only a passing knowledge of the language in question). Bickerton uses the analogy of 'pidgins', contact languages among speakers of different native languages. Pidgins may have little in the way of grammatical structure, but serve basic communicative functions by adaptation of elements of the speakers' native languages for mutual contact purposes. Bickerton paints a picture of hominid ancestors struggling to make others understand their protolinguistic yet nonetheless referential mouthings, and both he and Pinker and Bloom invoke notions of selective pressure for hominids to keep applying effort toward extracting meaning from unpromising material. Not noticed in the examples of fractional meaning is that the comprehensibility of not very comprehensible material is enabled by the fact that those exposed to the utterances are already language users, who can thus postulate likely meanings, by bringing the apparatus of language and related pragmatic and contextual knowledge to bear on the task. The scenarios of hominids struggling along similar-looking lines are illicit projections of that apparatus to creatures with no such resources.

Donald also relies on the idea of mental representation, and offers a three-stage story. The first stage saw the evolution in our hominid ancestors of a capacity for mimesis—the recollection and controlled reproduction of action patterns based on motor memories. The second stage takes the form of evolution of a capacity for lexical invention, which in turn provided the selective context for phonation. Finally, with the emergence of externalised representations, in the form of graphic images and other symbolic artefacts, the modern mind evolved, with its supposedly wide-ranging cognitive capacities. As one commentator on a synopsis of Donald's text has noted, the archaeological record relied on for much of the story does not provide evidence of the inferred behaviours as clearly as Donald takes it to. Add to this the observation that 'Donald ... presents no convincing evidence (and in some cases no evidence at all) showing that particular brain structures or particular types of neurological functioning were correlated with his supposed stages.' Chase closed his commentary on Donald's text by remarking that, 'the meaning of archaeological data in psychological terms is either unclear or controversial or both. One reason is a lack of communication between archaeology and psychology.' Our enterprise is proof that such communication has begun to occur.

We identify absence of language as the feature that limits the consciousness of non-human animals to that of sensory sensitivity (conscious sensibility). Human mindedness is essentially marked by the conscious attentiveness and articulateness that language enables. The core issue for this book is how that sort of mentality emerged in the course of human evolution, always given that what is being referred to here are particular forms of behaviour. These forms must have arisen at some time during the course of the evolutionary emergence of our species. But they are aspects of behaviour we share, in part, with other species as is to be expected for characteristics which can only be argued to arise by evolutionary processes. We assume that a properly formulated evolutionary account outclasses any rival version of how humans and other living entities get to have the characteristics they do.

There is agreement that the consciousness, the mental character, of our ancestors was probably transformed by the acquirement of language. The nature of that transformation is differently understood by different commentators; as is the problem of how the transformation occurred. A typical solution to the problem of 'how', takes the form of alterations in central nervous system circuitry, enabling connections previously unmade.... Subtle structural changes may be feasible which somehow bring about awareness of what is going on. But what is it that theorists take to occur? For Jaynes, 'consciousness' originates when a 'mental' voice, hitherto commanding the behaviour of some human being, is discovered to be one's own. For Dennett, a fancied hominid practice of asking questions of and receiving answers from others, one day provokes a self-provided answer, and a new 'virtual' connection arises in the brain.... 'Consciousness' is then 'awakened' when the rather 'matter-of-fact' activity of self-generated speaking gets connected to the rest of the mind or brain.

The process of evolutionary emergence of the phenomena of mind, language, and 'higher consciousness', can be considered in the light of the evidence of fossilised fragments of the skeletons of human ancestors, and in terms of the behavioural products of those ancestors. An increasing number of fossils document changes of physical form which are subject to varying interpretation. At one level it would be convenient to assume that innateness rules, for then there is some hope that the material evidence of fossilised skeletons will indicate a cerebral or other skeletal rubicon which proves the emergence of language. This might be a change in the shape of the skull (inside or out) showing the production of language or thought in the brain; or it might be a change in the anatomy of the throat, showing the production of speech. But the argument we have been considering suggests that these are not likely to be useful lines of approach, since any animal will recruit whatever neuroanatomical stuff there is to the purposes it can achieve as an animal with that development and experience. In the Savage-Rumbaugh model of language emergence the only problem that must be dealt with is how communications first became transformed into language.

We cannot suppose that language-using offspring were born to non-language using parents under a model that language is reinvented from the guidance of the caregivers. There must have been an original invention, which has since gone on being guidedly reinvented. Moreover, the first inventors of language were primates and, as such, may be supposed uncontroversially to have been using vocal, gestural and other communications in their habitual interactions with their conspecifics. Thus, the focus is on what happened when hominids first expressed themselves in ways of which they were aware. When we look at the assembly of elements introduced here to provide an understanding of this evolution, we see that the physical form of the ancestral skeleton may not have been so very important. Cranial capacity increased among our ancestors, hence presumably so did brain size, both absolutely and relative to body size, over the period from the time of the common ancestor between humans and chimpanzees and bonobos.

We also know something about some of the changes in the shape of the inside of the cranium, from the time of the common ancestor, and there is good reason to suppose that this reflects some aspects of the shape of the brain. One of the changes that occurred was the appearance of a shape which distinguishes humans from other primates in the region of Broca's area. Because of the association of that region with speech production, this is taken by some as strong evidence that our ancestors 2 million years ago had language. We will argue that there is no other sign from this period of behaviour that would be expected to be associated with such a capacity. What seems to be needed to argue for the timing of the earliest emergence of language is some sign in the archaeological record that our ancestors behaved in ways that allowed language reinvention among their infants or that could be expected to result from language use. Language reinvention, we suggest, requires a social context of learning; artefacts imbued with 'symbolic meaning', we suggest, result from language.

The Origin of Symbol-Making, W. Noble, I. Davidson

How is it that humans are self-consciously aware of circumstances that confront them? It is this fact of human life that, among other things, explains why, sometimes, we anxiously agonise about events and, sometimes, joyfully reminisce over, appreciate and anticipate them.... The mark of modern human behaviour is its self-awareness, its 'mindedness'. It behoves students of psychology or archaeology to try to account for this, and our explanation is that language provides the key. It is by means of language that people can describe what they perceive, reflect on that, recognise themselves as part of and as the perceivers of their perceived world, agonise, reminisce, and make plans. Language has this power because its constituent signs are symbols. Symbols are entities that stand, by convention, for things other than themselves. The words and phrases of a language are those sorts of things. Language users know that their communicative utterances have meanings, that the terms they use are things that refer to other things, to events, states of affairs, including their own states of feeling and awareness.

The theme which emerges consistently, from both theoretical and empirical inquiry, is that behaviour is at stake in the story of the evolution of the human mind. The practices of creatures in relation to the physical and social resources available to them are the means by which they come to 'make up their minds'. While brain cells are intrinsic to behaviour, as phonemes to language, selection on behaviour, including linguistic behaviour, and hence the brain cells that permit it, is where the action is. In this final chapter we offer proposals about the behavioural steps involved in discovery, by our ancestors, of the symbolic potential of communicative signs.

For a relatively brief time in evolutionary terms, the archaeological record shows evidence of behaviour involving stone and other natural materials (ochre, bone) that is critically different from what we have been reviewing. Actions are being taken upon materials that show evidence of forward planning to achieve a goal. Undertaking such behaviour entails being able to articulate that goal and the steps needed to reach it. Even if the initial achievement of a sequence-based outcome is accidental, the capacity to note the outcome and to recreate the steps leading to its reachievement depends on the same ability. The evidence in the archaeological record of human evolution of the last 60 000 years is of behaviour undertaken according to plan. The arrival in the Australian region may have been an accident, but it is not one that could have happened in the absence of the use of sea-going vessels constructed according to plan. Archaeologically, this is the earliest evidence of modern human behaviour.

There is no knowing exactly how long prior to the first colonisation of Australia such plan-based behaviour had been expressed. On the basis of linguistic analysis it has been argued that the people who first went there originated from a common (modern human) ancestor whose other descendants radiated across Europe (and that a second radiation characterises the divergence of Australian from North American human groups). DNA analysis may not provide unequivocal evidence for an origin of modern human genes in Africa, or it may, but all interpretations allow agreement that colonisation of Australia and Papua New Guinea occurred early in the global radiation of modern humans. We may speculate that a period of 10 000 years, from the time of common origin, is sufficient to allow for the appearance of modern humans on the southernmost coastlines of Asia. Finally, it can be said that there is no archaeological evidence for the expression of modern human behaviour anywhere else before about 100 000 years ago. Hence, we conclude that sometime between about 100 000 and 70 000 years before the present the behaviour emerged which has become identified as linguistic.

Language may seem indispensable to the conduct of modern human life, yet all other living creatures, including the closest primate relatives of modern humans, survive without such capacity. The mere emergence of this behaviour is, thus, no guarantee of its fitness. Selection for its maintenance relied, in our view, on particular consequences of the change in awareness its emergence induced. Since speakers know that their utterances convey meanings, and know they are making them, they can exercise control over what they say. The capacity to control information contrasts with the expression of it involuntarily, behaviour more typical of chimpanzees. Control of information, for example, about resources has advantage for those 'in the know'.

On the basis of advantages in the form of improvements in throwing and knapping, we can plausibly argue for selection for greater manual control in hominid ancestors. What might drive selection for improved vocal control, which is what seems to have happened? Our position is that increased vocal control is not readily explained in the absence of an account of how communication itself becomes elaborated. We can do nothing with mysterious stories, as that by Burling mentioned in Chapter 1, about talking as a form of self-address which just shows up. The contexts for communication to increase in salience can only be speculated about. What follows is a chain of speculation which is not offered as the story. Rather we are satisfied that each element in the speculative chain does not violate interpretations of either the archaeological or modern-day behavioural evidence.

We have said previously that by the very fact of being a common feature of human communication, yet witnessed only rarely among other primates, iconic calls and gestures are just the sort of behaviour on which natural selection could have operated. On this same score, we note that a rare occurrence among monkeys and chimpanzees is to respond to a vocal or gestural utterance with another one; these too are behaviours forming the potential for the natural selection of imitation as a general feature. The essence of imitation in the first place is replication of the behaviour of a co-present model. Imitation can only be recognised, at its inception, by the feature of copresence of the model being imitated.... The evolutionary emergence we have documented, since the time of the common ancestor, of bipedalism, hairlessness, cranial thermoregulation, increased brain size, carnivory, and secondary altriciality in a tool-making primate, created the context for joint attention and imitative learning. Gesture figures strongly in this process among modern humans, and a foundational gesture among infants is that which adult observers gloss as 'pointing'. We gather these various elements together not to retail a recapitulationist story but as preparation for one that takes targeted throwing, pointing, and gesturing, when they occur in the presence of conspecifics, as plausible evolutionary steps enabling communicative signs to become used symbolically.

We have argued throughout that other modern primates, in their own ecologies, may be observed to communicate using vocal and gestural signs, but that there is no evidence to support a proposition of their appreciating that such signs have the functions they achieve. The signs themselves go unnoticed; they are seen or heard 'through' to what they signify.... Our argument overall is that appreciating the reality of signifiers has to arise in order that they can be made detachable from immediate contexts of association, carried off as it were (displaced) to other contexts, yet appreciated as continuing to refer to now absent signifieds. The trace, in 'freezing' the gesture, makes the signifier a new environmental entity so that its existence is created as something in itself, yet as existing in simultaneous relationship to both sign-maker and signified. Some such reorientation of attention away from the signified is needed in order to realize the existence of the sign as signifier. Howsoever that is accomplished, once done, the sign becomes usable symbolically, as a name. Its form, initially iconic, could be reduced or otherwise modified and have increasingly arbitrary formal relation with its referent, such reductions being enabled so long as all common users were party to new encodings—remained in the 'language game'. From the occasion of discovery/invention of signs as signifiers, any vocal utterances associated with manual gestures would be similarly transformed in their use and also become symbols.

Of more particular concern is the context for selection of the behaviour whose form of emergence we have been speculating upon. We may 'rationally' suppose that a capacity for reference, to use and understand utterances at the symbolic level, brings advantages in terms of information exchange, planning and the like. The general expectation is that the emergence of language made communication somehow 'better'. Another force at work, though, is the rapidity with which symbols can be modified in their form, yet go on being meaningful for so long as the modifications are jointly appreciated, i.e., within a given community.

With the emergence of such 'idiosyncratic' meanings ('codes'), as groups become isolated from each other, comes the emergence of unreliable reference, the breakdown of unambiguous communication. Failures and misunderstandings between groups due to this would increase the importance of personally worn or expressed emblems to allow speakers to know that meanings were common among those similarly marked. Such distinctive emblems, arising within human groups, would also constrain the flow of communication between bearers of dissimilar emblems. 'In-group' and 'out-group' membership may be signalled, indeed, by passwords (shibboleths) and other markers, so that the boundaries of reliable meanings may be distinguished.

A consequence of experiencing the occurrence of unreliable communication is that strategies come into play, in the context of language use, which make the withholding of information, or the production of deliberately misleading information, advantageous. These possible forms of exploitation, entirely recognisable in contemporary human dealings, are the rapidly appreciated fruits of the consciousness that is delivered by seeing that symbols are one's own expressive products, and may be used deceptively for purposes of social control. This function of language—for lying—stands among its other functions in the everyday political, religious and aesthetic activities of human life.

We have asserted that 'the mental' equates to language in human life, hence makes for the uniqueness of human mentality. But we have also argued that language must be conceptualised as a form of interactive behaviour.... It has been possible to constrain speculation about human evolution by attending to the archaeological evidence that records it. But an archaeological record does not speak for itself; it must be interpreted—a theoretical framework needs to be articulated so that the record may be understood coherently. We have argued that the record of artefacts, the products of behaviour, provides the real evolutionary answers. We argue this because the mindedness we seek to explain is also argued as manifest in behaviour.

The Naked Ape, J. Relethford

"What are humans?" This question forms a major focus of much human effort, ranging from the sciences to religion to philosophy. This question is put to God in Psalms 8:4-6: "What is man, that thou art mindful of him? and the son of man, that thou visitest him? For thou hast made him a little lower than the angels, and hast crowned him with glory and honor. Thou madest him to have dominion over the works of thy hands; thou hast put all things under his feet."

This passage reflects a common view of humanity as having been created to be special among living creatures but still "lower than the angels." A view of humans as special and superior is found in much of Western philosophy and is perhaps best summarized by Aristotle's "Chain of Being," which ranked humans at the top of a "scale of nature." The question, "What are humans?" concerns the extent to which we are part of, rather than separate from, the rest of the animal kingdom. What is the best way to classify humans? Should we focus on our spiritual natures, placing us lower than the angels, or should we be considered as simply another species? Are we better described as "featherless bipeds" or "naked apes"? Are the differences between us and other organisms qualitative or quantitative? Our culture holds both views. On the one hand, we all realize that humans are animals in a zoological sense, yet we frequently tell our children not to act like animals, showing how we use the term "animal" in different ways. Judeo-Christian tradition suggests that humans have special status in the mind of a creator and hold dominion over all other creatures because we were created in God's image, not the apes. Yet, studies of genetics show us that we are more than 98 percent genetically similar to African apes.

We belong to the animal kingdom and specifically to the subphylum of vertebrates, defined by the presence of a segmented spinal column. There are five classes of vertebrates: fish, amphibians, reptiles, birds, and mammals. We are mammals, defined by the presence of mammary glands for breastfeeding and the ability to maintain a constant body temperature, among other characteristics. There are three different subclasses of mammals in the world today: egg-laying mammals, such as the duck-billed platypus; marsupial mammals, such as the kangaroo and opossum; and the placental mammals, the group to which we belong. Within the subclass of placental mammals, humans belong to the order primates, defined as having forward-facing eyes, grasping ability in the hands, nails rather than claws, and a number of other anatomical characteristics. Within the primates we belong to the suborder anthropoids, a group composed of monkeys, apes, and humans. There are two major groups of anthropoids: the platyrrhines (New World monkeys) and the catarrhines (Old World monkeys, apes, and humans), distinguished primarily by nostril shape and orientation. Among the Old World anthropoids, we are hominoids, a group made up of the living apes and ourselves. Hominoids share a number of traits. They all lack a tail, a trait found in all other primates, and all have similar cusp patterns on their molar teeth. Hominoids also share aspects of shoulder anatomy, allowing them to raise their arms above their heads easily. Although there were many different species of hominoids between 10 and 20 million years ago, there are only a handful of species alive today. These have usually been placed in one of three different zoological families [hylobatids (lesser apes), pongids (great apes), and hominids (humans)], although we shall see that this traditional classification has been increasingly challenged in recent years.

The family known as hylobatids, or lesser apes, consists of the gibbon, a small Asian ape capable of fantastic aerial acrobatics. Pongids, also known as the great apes, are the second zoological family. There are four different species of great ape alive today: one species of Asian great ape and three species of African great apes. The Asian great ape—the orangutan—is a large-bodied ape with reddish-brown fur. Orangutans tend to live in small social groups consisting of a mother and her dependent offspring. In the trees, they are agile climbers, but they walk on all fours when on the ground, balling their front hands into fists for support when moving. As with all apes, orangutan arms are longer than their legs.

There are three species of African apes: gorillas, chimpanzees, and bonobos. All three have black fur and rest on their knuckles when walking on the ground. Gorillas are the largest African apes; they live in small social groups of a single adult male, several adult females, and their offspring. Gorillas rely extensively on a vegetarian diet. Chimpanzees are smaller and live in larger social groups made up of a number of adult males and females and their offspring. The lesser-known bonobo is physically very similar to the chimpanzee but has a more slender body build. Bonobos are different behaviorally from chimps in several ways; females are likely to be as dominant as (or more dominant than) males, and they engage in a great deal of sex play for resolving tension in the group. Finally, there is the third family: the hominids, to which humans belong. This zoological family contains only one living species—ourselves—but the term "hominid" is also used to describe fossil ancestors that share our bipedal stance. Although we share many anatomical features with the great apes, it is obvious that we are also different in many ways.

Using the genetic relationships as a guide, anthropologists suggest that the first branch in the family tree split between an Asian line, leading to modern day orangutans, and an African line, leading to the African apes and humans (which makes sense geographically, since the first hominid fossils are found in Africa). The African line then later split again—first between the line leading to the gorilla and that leading to chimps, bonobos, and humans, and then again between the human line and that leading to chimps and bonobos, and then yet again between the chimp line and the bonobo line. This reconstruction is based on the principle that the more similar two living species are genetically, the more recently they both split from a common ancestor. Since the orangutan is the least similar, its evolutionary line must have split off first.
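
The branching logic described here (most-similar pairs split most recently) is essentially agglomerative clustering on a distance matrix. The sketch below is only an illustration of that principle: the divergence values are invented placeholders chosen to mimic the ranking in the text, not real genetic measurements.

    # Illustrative only: the distances below are invented placeholders chosen to
    # mimic the ordering described in the text; they are not real genetic data.
    from itertools import combinations

    dist = {
        frozenset(("chimp", "bonobo")): 0.7,
        frozenset(("human", "chimp")): 1.2,
        frozenset(("human", "bonobo")): 1.2,
        frozenset(("gorilla", "human")): 1.6,
        frozenset(("gorilla", "chimp")): 1.6,
        frozenset(("gorilla", "bonobo")): 1.6,
        frozenset(("orangutan", "human")): 3.1,
        frozenset(("orangutan", "chimp")): 3.1,
        frozenset(("orangutan", "bonobo")): 3.1,
        frozenset(("orangutan", "gorilla")): 3.1,
    }

    def avg_dist(a, b):
        # average distance between two clusters (tuples of species names)
        return sum(dist[frozenset((x, y))] for x in a for y in b) / (len(a) * len(b))

    clusters = [("human",), ("chimp",), ("bonobo",), ("gorilla",), ("orangutan",)]
    while len(clusters) > 1:
        a, b = min(combinations(clusters, 2), key=lambda pair: avg_dist(*pair))
        print("merge:", a, "+", b)
        clusters = [c for c in clusters if c not in (a, b)] + [a + b]
    # Merges happen in the order chimp+bonobo, then human, then gorilla, then
    # orangutan: the branching sequence described above, read from the tips inward.

Reading the merge order in reverse gives the branch points: the orangutan line splits off first, the gorilla line next, and the chimp/bonobo split is the most recent.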

The family tree doesn't tell us anything about what the various common ancestors looked like or where or when they lived. We can make logical inferences based on comparisons among the living species, but ultimately we need to go to the fossil record to see what species existed and where they might fit in this family tree. This is more complicated than it sounds, because a fossil ape might not lie directly on one of the branches of our tree; it might represent one of the extinct dead ends that we can't identify from genetic analysis.

What actually matters when comparing different traits is the extent to which variation in a trait reflects evolutionary kinship as opposed to the extent to which it reflects unique adaptations. One way of dealing with this is to determine whether a given trait seems to be primitive (of ancient origin) or seems to be derived (of recent origin). Primitive traits don't tell us much about evolutionary relationships because they are shared by descendant species....Derived traits provide more information if they are shared by two or more species, because the simplest explanation for shared derived traits is an evolutionary connection between species. For example, the African apes are all closely related because they all share a knuckle walking anatomy, a derived trait. Knuckle walking is most likely shared by these three species through inheritance from a common ancestor rather than each species having evolved this trait independently.

Genetic and anatomic analyses show African apes and humans to be more closely related, in terms of evolutionary history, than either is to the orangutan. However, it is also obvious that humans are quite different in several adaptations. We are relatively hairless, have large brains, have small canines, and walk upright. These are all examples of derived traits that are unique to the human line. In terms of reconstructing evolutionary history, we would not focus on such traits because they don't provide any information about shared traits. That is, they can't tell us whether we are more similar to chimpanzees than to gorillas; they simply tell us that we are different. Putting these ideas together allows us to reconcile our close relationship with the African apes with the obvious unique features that we have. Although we share close kinship with the African apes, some of our features have changed dramatically since the time of our common ancestor. When we look at humans and great apes in terms of traits such as bipedalism or brain size, we are seeing a demonstration that we have changed more and they have changed less in some features. On the one hand, we are very similar to the African apes, and on the other, we are different. It depends on what features we are looking at.

All of the discussion so far can be summarized with two basic points. First, our closest living relatives are the African apes. Humans and African apes are more closely related to each other than either is to the Asian great ape, the orangutan. Indeed, it now appears that humans and some African apes (chimps and bonobos) are more closely related to each other than either is to the third African ape, the gorilla. Second, examination of unique derived traits in humans shows that in some ways we have changed dramatically from our common ancestry with African apes.

In terms of classification, what should we call ourselves? Should we focus on our similarity with the African apes or on our differences? The traditional classification scheme given in Figure 2.1 emphasizes our differences from the other living hominoids. In terms of our bipedal stance, our large brain, our small canines, and other traits, this is a reasonable classification. The problem here is that it doesn't fit the actual pattern of overall genetic and evolutionary relationship, which indicates that humans should be grouped more closely with the African apes. In other words, the traditional classification does a good job of showing differences in adaptation but fails to reflect existing genetic relationships. So, should we classify ourselves according to our overall genetic and evolutionary relationship to other species, or should we focus on those traits that have changed uniquely in the course of human evolution and make us different? Should classification reflect common evolutionary history or unique adaptations? There continues to be a great deal of debate on this issue.

Personally, I am sympathetic to organizing classification systems around the principle of evolutionary relationships, but I am not convinced that these relationships alone should determine our taxonomic status. Yes, we are very closely related to the chimpanzee and bonobo, and we could be considered "naked apes," as Desmond Morris called us, or "the third chimpanzee," as Jared Diamond called us. Nevertheless, it is also clear that our species has changed both anatomically and behaviorally. Whether we acknowledge this in our formal classification depends on how much we wish to stress evolutionary relationship and how much we want to stress differential adaptations. Either way, the genetic evidence has clearly shown that we are very similar to our closest living relatives.

Do You Know Where Your Ancestors Are?, J. Relethford

Where did our ancestors live 150,000 years ago? According to one view, all of our ancestors alive at that time lived in Africa. It doesn't matter where your more recent ancestors lived in the past few thousand years, be it Europe, Africa, or elsewhere; if this model is correct, and you could trace your ancestry back over thousands of generations, you would find that each and every ancestral line goes back to Africa no more than 150,000 years ago. Although this view of a single recent African origin of modern humans has attracted many supporters over the past few years, it is by no means universally accepted. Others argue a different interpretation, where some of our ancestors 150,000 years ago lived in Africa, but others lived elsewhere in the Old World. What makes this debate particularly interesting to me are the implications for our understanding of the fossil record. We know that by 150,000 years ago, large-brained archaic humans lived throughout the Old World. These archaic humans had brains roughly equal in size to our own today, but with a differently shaped skull. Today, we have modern humans living across the entire planet.

What is the relationship of the archaic humans to modern humans? Again, it boils down to a question of which archaic human populations are ancestral to us—just those in Africa, or those from more than one region? If all living humans had only African ancestors 150,000 years ago, then what happened to the closely related archaic humans that were living outside of Africa, such as the enigmatic Neandertals of Europe and the Middle East? Were they a different species? If they left no descendants, then why did they die out? If, on the other hand, the transition from archaic to modern humans took place in more than one continent, then how did these different populations interact over time, and can we determine how much of an ancestral contribution each group made? The focus here is on the evolutionary history of the human species over the past few hundred thousand years. Although this debate has relied on information from the fossil and archaeological records, genetic data have also come into play.

The major difference between the African replacement model and the multiregional evolution model lies in the question of where our ancestors lived some 150,000 years ago, the period just preceding the earliest fossil evidence of anatomically modern humans. According to the African replacement model, all of our ancestors came from Africa. Humans living outside of Africa at this time (such as the Neandertals) were not our direct ancestors but were cousins of a side branch of our family tree who eventually became extinct. In contrast, the multiregional evolution model holds that while some of our ancestors lived in Africa, others lived outside of Africa, so that our ancestry today includes some genetic contributions from populations in more than one continent.

According to the model of complete replacement, the transition from archaic to modern took place in Africa, and only in Africa, roughly 150,000 to 200,000 years ago. A new species, Homo sapiens, split off from an earlier archaic human species. By 100,000 years ago, populations of this new species began to disperse out of Africa, first into the Middle East and then later into Australia, Asia, and Europe. Meanwhile, there were still populations of "archaics" living outside of Africa. According to the replacement model, the non-African archaic populations were eventually replaced by newly arriving modern populations from Africa, and consequently, all living humans trace all of their ancestry back 150,000 years or so to the initial appearance of Homo sapiens in Africa.

A key point of multiregional evolution, with its emphasis on gene flow, is that our genetic diversity in the world today has resulted from a mixture of genes from different parts of the world over the past several hundred thousand years or more. According to this model, some of our ancestors 150,000 years ago did live in Africa, but others lived elsewhere. Most advocates of multiregional evolution suggest that the same process marks the entire time span of the genus Homo, going back close to 2 million years ago. Following the dispersal of some early humans (Homo erectus) from Africa, human populations in different parts of the Old World have remained connected via gene flow in a single species that evolved over time. Some changes occurring in one part of the world were ultimately shared elsewhere. This does not mean that all populations were identical. Some evolutionary forces act to increase geographic differences, but gene flow acts to counter these sufficiently to prevent a new species from splitting off.

Modern humans appear first in Africa and then later in other parts of the world. Moderns outside of Africa are found first in the Middle East, which is geographically closest to the African continent. More geographically distant places, such as Australia and Europe, are populated by moderns later in time. This is exactly the pattern we would expect if the African replacement model were correct. This geographic pattern of dates is also consistent with the time it would take for early hunting and gathering peoples to move outward from Africa when movement was limited to walking.

Although the dates for the appearance of modern humans are compatible with African replacement, they are also compatible with the primary African origin version of the multiregional model, which postulates an initial change in Africa followed by gene flow outside of Africa. Any movement of genes, either through the steady flow from population to population through interbreeding or by the physical movement of groups of people, is going to take time. Moreover, the amount of time needed for gene flow would be related to the geographic distance from Africa—first to the Middle East and later to other parts of the Old World. Both the African replacement model and some versions of multiregional evolution predict the same pattern, and therefore we cannot distinguish between them based on the dates.

There is evidence for some continuity in some parts of the world outside of Africa. If we combine this observation with the known distribution of archaics and moderns over time and space, the most logical model (to me) would embrace an African origin of modern humans combined with gene flow outside of Africa. Recent analyses have compared a number of archaic and modern fossils and found strong evidence that anatomically modern humans generally show a pattern consistent with the view that all modern humans have some ancestors in Africa and others from outside Africa.

At some point in our past, all living humans shared a common female ancestor. When did this woman live? Cann and her colleagues concluded that this ancestral female lived about 200,000 years ago. All living people can trace their ancestry of mitochondrial DNA (although not necessarily other genes) back to this person. Furthermore, their analysis revealed an interesting pattern of relationship among the mtDNA sequences, which were arrayed in two clusters. The first cluster consisted only of sequences from subjects with African ancestry, while the second cluster consisted of sequences from those in all five ancestral groups, both African and non-African. Because both clusters contained sequences of African origin, they inferred that the most parsimonious explanation was that the common ancestor was also African.

The major impact of the analysis conducted by Cann and her colleagues was the implication for the origin of modern human beings. They reasoned that if our common ancestor lived in Africa some 200,000 years ago, then multiregional evolution could not have occurred, because it predicted a much older common ancestor. The observed data were felt to be compatible with the African replacement model and incompatible with the multiregional evolution model. Initial debate over their research focused on a number of technical issues and questions regarding sampling of humans. However, a number of other studies have since appeared that confirm their basic findings. This does not mean that their interpretations went unchallenged.

Certainly, mitochondrial Eve is compatible with African replacement, but is it really incompatible with multiregional evolution? Let's start by considering the location of this ancestor—Africa. We know that there must have been a common mitochondrial ancestor living somewhere at some point in the past. If this ancestor had lived in Asia, then that would obviously weaken the case for an African origin. The reverse is not necessarily true; although an African location of the common ancestor is compatible with replacement, it is also compatible with multiregional evolution. As noted by geneticist Alan Templeton, "Eve" had to live somewhere. Under multiregional evolution, this ancestor could have lived in Africa, or in Europe, or in Asia. Knowing the location does not resolve the debate.

Does the date and location of "Eve" necessarily rule out a multiregional interpretation? Alan Templeton analyzed the geographic distribution of human mitochondrial DNA sequences and concluded that the same results could also be expected under multiregional evolution and that these data did not support the idea of complete replacement. His analytic method is very complex, but it essentially involves looking at the geographic distribution of different mitochondrial DNA types and comparing it to the different expectations he has found for replacement models and gene flow models. His results suggest that our common mitochondrial DNA could have existed in an African ancestor and then spread throughout the Old World by gene flow, mixing with other populations outside of Africa without replacing them.

Templeton found that taken together with fossil evidence, the picture obtained from gene trees is one of multiple dispersals out of Africa. The first such dispersal took place about 1.7 million years ago with the origin of Homo erectus. Genetic data suggest a second dispersal of genes out of Africa between 400,000 and 800,000 years ago, and a third dispersal about 150,000 years ago. Templeton's analysis is significant because it demonstrates that there has been recurrent gene flow among human populations over the past 2 million years. Although his results do suggest that Africa was often the source of new genetic variations, they also demonstrate that replacement was unlikely. The origin of modern humans appears to be out of Africa, but not exclusively so.

The genetic data combined with observations from the fossil record do give us a picture of a likely model of modern human origins. The fossil record, although not complete, indicates that the first changes to modern human anatomy took place in sub-Saharan Africa, at least 130,000 years ago. These changes spread outward from Africa over time, appearing next in the Middle East, and then later in Asia and Australia, and finally in Europe. The genetic data are consistent with this. What is less clear, however, is what happened outside of Africa, where we know that more archaic humans were living. Were these archaics replaced? I tend to doubt it. I suspect that what actually occurred was a mixture of populations and genes over time and that this mixture was strongly affected by differences in ancient population size.

My own views do agree with the African replacement model to the extent that I suggest the initial changes leading from archaic to modern human did take place first in Africa and then spread outward over time. However, I think these changes took place within a single evolving species and did not involve complete, wholesale replacement. One could argue that this is a semantic difference, because what I have described could easily be taken as a form of "genetic replacement" over time... But the distinction is critical in understanding our own origins. The difference between the birth of a new species (African replacement model) and change within a species (multiregional evolution model) is fundamentally crucial in evolutionary terms. When we look at the fossils of non-African archaic humans, we want to know whether they are an inherent part of our ancestry or a side branch having no direct kinship with us. Even a small genetic contribution would be significant. Coming back to the basic question of this chapter, I suggest that 150,000 years ago most of our ancestors lived in Africa, but not all of them. I think that the evidence points to some ancient non-African ancestry, although it is not clear what was contributed by specific populations from geographic regions outside of Africa.

Introduction and Conclusion, P. Chase

Because culture provides motivations for the behavior of the individual, it gives the group a means of controlling the individual that is absent among other primates. Among all living humans, culture provides a (uniquely human) mental or intellectual context for almost everything the individual thinks or does. If culture as an emergent phenomenon is both unique to humans and of major importance to the human way of life, then its origins should be investigated by paleoanthropologists.

What I am trying to do is to investigate a particular phenomenon, a particular aspect of the way in which humans govern their behavior, that is different from that of other species. In order to do so, I must have a term by which to refer to the concept I am trying to investigate, and "culture" seems appropriate to me. For other scholars, in other contexts and for other purposes, different concepts will be more meaningful, more useful, or more valid, and the word "culture" will refer to something very different. To begin with, what I call culture is something that exists in the mind. Several theorists have conceived of culture in this way, but my concept of culture is probably closest to that of Ward Goodenough, although it differs from his in other respects. For him, culture consists of categories (forms), propositions, beliefs, values, rules, recipes, customs, and meanings. In a similar vein, when I use the word "culture," I mean something in the mind of the culture bearer that informs and guides his or her behavior.

I use the term "coding" to mean motivations, concepts, beliefs, rules, values, etc., that exist in the mind and that govern behavior. "Culture" is then a subset of coding. The first thing that distinguishes culture from other kinds of coding is that cultural codes are emergent. My concept of emergence is essentially that of complexity theory. That is, emergent phenomena are those that arise from the interactions of multiple agents and that cannot be understood without reference to those interactions.
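
Emergence in this complexity-theory sense can be illustrated with a toy simulation (not from the source; every parameter here is arbitrary): a population of agents converges on a shared label for an object purely through pairwise interactions, in the style of the well-known "naming game" model. The resulting convention is a socially created code in the minimal sense above, in that it cannot be located in, or predicted from, any single agent at the outset.

    # Toy "naming game": a shared convention emerges only from repeated pairwise
    # interactions among agents. Parameters are arbitrary; illustrative only.
    import random

    random.seed(1)
    N = 20
    inventories = [set() for _ in range(N)]   # each agent's candidate names for one object
    next_name = 0

    for step in range(5000):
        speaker, hearer = random.sample(range(N), 2)
        if not inventories[speaker]:          # a speaker with no name invents one
            inventories[speaker].add("name" + str(next_name))
            next_name += 1
        word = random.choice(sorted(inventories[speaker]))
        if word in inventories[hearer]:       # success: both keep only the shared name
            inventories[speaker] = {word}
            inventories[hearer] = {word}
        else:                                 # failure: the hearer learns the name
            inventories[hearer].add(word)

    print(set().union(*inventories))   # typically a single surviving name, chosen by no one

The only point of the sketch is that the surviving name is a property of the interaction history rather than of any individual's initial state, which is what "cannot be understood without reference to those interactions" amounts to.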

An important aspect of human culture as it is found among living humans is that its socially created codes provide motivation for behavior. This is not inherent in the nature of socially created coding. Imagine, for example, a population of early humans with simple language (socially created codes for communication) and simple, agreed-upon procedures for cooperative hunts. In this imaginary group, socially created codes would inform and guide the behaviors of the individuals involved, but would not motivate them. Individuals would hunt cooperatively for the same reasons that other species cooperate: because each individual decided independently that doing so was in his or her own best interest. However, among modern humans, it appears that culture, in the form of socially created moral beliefs, religious prescriptions, and so forth, motivates behaviors that would be difficult to understand in the absence of culture.

If it is in fact the case that culture motivates behavior as well as informs and guides it, then the implications are very significant. It means that the society or social group (however defined) has a way of influencing the behavior of the individual that does not exist in other species. This raises the possibility that an individual might be led to behave in ways that are beneficial to the group yet detrimental to him or her. This in turn raises a theoretical question: how can this happen, given that natural selection should eliminate behavior that decreases the evolutionary fitness of the individual?

Nonhuman primates, particularly the great apes (chimpanzees, bonobos, gorillas, and orangutans) appear to have almost all the cognitive abilities needed to construct and make use of such codes. Yet there is no good evidence that they actually do so. It is not a part of their adaptation in the wild, and even in the laboratory, for all their apparent symbolic abilities, they seem to stop just short of doing so. This conclusion is subject to change, of course, as further evidence is collected. But for the time being it appears that, whatever the cognitive abilities of our nearest relatives, all three aspects of human culture evolved after our lineage separated from theirs.

In the end, it seems to me, judging from currently available data, that the ubiquitous and all-encompassing nature of human culture is probably a mechanism by which socially created coding can be used to motivate and influence the behavior of individuals for the benefit of the larger social group. This implies that genetic evolution has not produced fully altruistic humans. However, it also implies that genetic evolution has in one way or another produced humans who are, to an extent, willing to let socially created codes, codes that are external to us as individuals, motivate our behavior. Culture and genetics work together to produce the human way of life.

All these conclusions are to some extent tentative. This is in part because no one has ever explicitly set out to investigate the evolution of human culture as I conceive of it. As a result, the empirical research on which conclusions must rest was designed with other ends in mind. Yet it is also true that science is continuously working at the edges of what is known, and that scientists must base their work on imperfectly understood or imperfectly known foundations. My primary purpose has been to focus the attention of Paleolithic archaeologists on the origins and evolution of human culture. Most North American Paleolithic archaeologists were trained in departments of anthropology, yet we have concentrated our research almost entirely either on intelligence and cognition or on language and symbolism (the latter two being only a subset of culture). Culture, however, is of enormous importance to the human way of life, and any account of hominin evolution that ignores it is woefully incomplete.

My main thesis is that human culture is a manner of governing behavior, one that coexists with ways of doing so that we share with other species but that is unique by virtue of its emergent nature. Our behavior, like that of all mammals, is guided by noncultural coding that is in part genetically determined, in part individually learned, and in part socially learned. Such noncultural coding, even the part of it that is learned socially, can be understood at the level of the individual. Cultural coding, because it is created socially, cannot be understood at the level of the individual. This emergent property defines what I call culture and is its most essential aspect. The elaboration of culture into all-encompassing systems and our willingness to let cultural coding motivate our behavior are secondary, because they depend on the existence of socially created coding, and because it is possible that socially created coding existed without them for a significant period of time. Nevertheless, they are ubiquitous among extant human societies, and they are of central importance to the way all Homo sapiens live today. Our willingness to let coding that is socially created motivate our behavior changes the relationship between the individual and the social group. Because culture has been elaborated into all-encompassing systems that include almost everything we perceive, think, or do, this new relationship colors all of human life.

I propose three very general hypotheses to explain the elaboration of cultural coding into all-encompassing systems:

1. That this was simply a by-product of socially created coding and of the cognitive and social capacities underlying it (the by-product hypothesis)
2. That as soon as the ability to create codes socially became available, it was elaborated into cultural beliefs and practices that could allay the emotional stresses inherent in the lives of all mammals (the anxiety hypothesis)
3. That the elaboration of cultural coding provided social groups with a means of influencing the behavior of individuals for the benefit of the group, even at the expense of the individual's evolutionary fitness (the group benefit hypothesis)

I reached three conclusions based on the available primatological, paleontological, and archaeological data.

1. Living primate species other than humans do not create coding through social interaction. This means that the origin of what I call culture lies somewhere on the hominin line.
2. The data provided by fossil endocrania, by the reconstruction of fossil vocal tracts, and by zooarchaeological evidence for cooperative hunting all suggest (albeit tentatively) that socially constructed coding (including referential language) was a part of hominin adaptation by the end of the Middle Pleistocene, and perhaps long before.
3. The archaeological record does not provide evidence of widespread elaboration of culture for at least 50,000 years after the Middle Pleistocene.

The by-product and anxiety hypotheses must therefore be rejected in favor of the group benefit hypothesis. However, it is worth examining the basis for these conclusions. Doing so paints a rather stark picture of the state of our knowledge today. There are several weaknesses in the chains of argument and data on which the conclusions are based. It is possible to challenge many of the bridging arguments involved. For example, many arguments linking language with endocrania or with vocal tract reconstructions are open to question. Likewise, many of my arguments in chapter 3 are based on research in disciplines outside archaeology, research that is still in progress. Understanding individual versus group benefits of culturally motivated behavior will necessarily depend on developments in evolutionary theory as it applies to individual versus group versus multilevel selection.

All too often researchers have not tested alternative hypotheses. This is especially true of arguments linking endocast morphology or tool making to language. There may be data to support particular hypotheses regarding the origins of language, but until they are tested against alternative hypotheses, we cannot accept them with a high degree of confidence. Testing a single hypothesis tells us only whether that explanation is possible. Testing multiple hypotheses tells us which of them best fits the data. Failing to test one plausible hypothesis against another plausible hypothesis leaves us with no way of choosing between them.

A great deal of research is being done on whether or how humans evolved a tendency to altruism - a predisposition to help others at the expense of one's own individual fitness. I suggested in chapter 3 that humans might instead have evolved a susceptibility to culture, a propensity not to help others but to obey the dictates of socially created coding, even at the expense of their individual fitness. In many cases, this would lead to the same behavior - helping others - but the underlying psychological mechanism would be different.

First, I hope that my discussion of culture will stimulate interest in the way humans create the cultural coding that governs our behavior. I hope it will stimulate examination of the way our willingness to let socially created coding motivate our actions affects the relationship between the individual and the group, in terms of both behavior and natural selection. Second, I hope that this effort will focus the attention of Paleolithic archaeologists on the evolution of culture as a phenomenon. It is not necessary that archaeologists accept my definition of human culture. However, it is necessary that Paleolithic archaeology address the issue of culture. If culture is more important to human adaptation than it is to the adaptation of any other species, then we cannot pretend to have understood human evolution until we have also understood the origins of human culture.

How is Human Culture Different?, P. Chase

The fact of human uniqueness is not in itself remarkable. Yet our species has chosen a rather peculiar way to be unique. In the course of our evolution, we have done more than change our anatomy, physiology, and behavior. We have also changed, in part, the manner in which our behavior is governed. Humans are primates, and for the most part we do essentially what other primates do. In many cases where we differ, the difference is one of degree rather than of kind. For example, it has been suggested that at least some apes have all the abilities needed to use symbolic language, albeit in less developed form than humans. In spite of this, it is easy to find things done by humans that other living species simply never do.

Chimpanzees do not practice celibacy for doctrinal reasons, do not play games like chess, and do not invent and discuss fictional worlds. These differences are not differences of degree. In spite of all the other continuities between us, in ways such as these other primates simply do not behave or think as we do. Why this is so is the crux of the issue. I will argue that it is not because we are more intelligent, although intelligence is important, but because our way of life is shaped by culture.

Recall that I use the term "culture" to refer to the totality of three related phenomena:

1. Codes that we create through social interaction inform and govern our behavior. These codes are emergent in character because they cannot be understood without reference to this interaction. The codes do not replace other, private, forms of coding, but are added to them.
2. Such socially created codes not only inform and govern our behavior but also frequently motivate it. Because this potentially leaves individuals open to exploitation by the social group that creates the coding, our willingness to be motivated by socially created coding can be seen as a susceptibility to cultural manipulation.
3. Cultural codes form all-encompassing webs of meanings, values, and dicta that incorporate into themselves almost everything that humans perceive, think, or do. Thus culture forms an inescapable intellectual framework for human life and human action.

The heart of this chapter is a detailed explanation of what I mean by culture.

By coding, I mean something that exists in the mind (or brain) that governs and informs behavior. We can think of coding in terms of four categories or levels:

1. Coding that is essentially determined genetically. Note that, like all coding, this is something in the brain, not the behavior it produces.
2. Learned coding. Because of the plasticity of their brains, mammals are able to create new codes in response to their interactions with their environments.
3. Socially learned codes. These codes are initially created by one individual through individual learning, but others then learn them from conspecifics, either by observation or through teaching.
4. Codes created through social interaction.

Learning is the modification of neural structures in order to create new codes or to modify existing ones. This involves an interaction between the environment and existing codes. New codes will be created that, in general, fit with existing ones. In other words, an animal will learn to do something that satisfies existing codes (e.g., hunger) and to avoid behaviors that do the opposite (e.g., eating foods that cause nausea). Both genetically determined and learned neural coding are involved. If an interaction with humans causes an animal pain (genetically based coding), that animal will learn to fear humans (both genetic and learned coding) and will therefore be reluctant to eat food that is too near a human, even when the animal is hungry. Extreme hunger may outweigh this fear, so that the animal may feed near humans. If no one bothers it and it can satisfy its hunger often enough, it will eventually unlearn its fear of humans.

Animals, then, learn by interacting with their environment. Other individuals of the same species constitute an integral part of an animal's environment, and members of at least some species are capable of learning by observing the behavior of conspecifics. As a result, something learned independently by one individual may spread through a population when others observe the first individual. To many scholars, this is the essence and the definition of culture. In my opinion, something more is going on among humans. Learning from conspecifics is an important part of human culture, but it is not the whole picture.

Essentially, memetic or socially learned coding resembles individually learned coding in that each individual animal creates its own codes. In the individual case, it creates codes in response to its own direct interactions with its environment. In the memetic case, it does so after observing the behavior of other individuals - and this means that memes are replicated. How often a meme is replicated depends on how many individuals have preexisting neural coding that leads them to adopt that meme. Therefore the successful meme is one that adapts not to the physical environment but to the existing pool of neural coding in a population.... However, I believe there is an element to human culture - and to most if not all of the examples just listed - that goes beyond and sets it apart from the memes found in other species. Social codes are not just transmitted from one individual to another; they are created by interactions among individuals. This makes cultural codes, unlike memes, emergent phenomena.

Among nonhuman species, memes are not emergent phenomena. They do have a certain public character in that they are "shared," but this is analogous to "sharing" the gene for blue eyes with other members of a population. The coding represented by memes is understandable at the level of the individual. An individual interacts with its environment, and on the basis of those interactions either constructs or modifies neural codes that will govern its behavior in the future. It matters little if the relevant part of the environment is the behavior of running water, the behavior of a predator, or the behavior of a conspecific. Each individual constructs coding that it perceives (in terms of its already existing coding) as being beneficial. The codes created in response to this interaction can thus be understood in terms of the individual creating them, and they are therefore not emergent phenomena.

Humans, on the other hand, are governed (in part) by coding that cannot be understood at the individual level alone. It is easiest to grasp this fact by considering codes that are both based on arbitrary convention and serve to coordinate the behaviors of multiple individuals. Take, for example, the red, yellow, and green lights at a highway intersection. These represent an arbitrary convention that facilitates the safe flow of traffic by coordinating the behavior of all the drivers who approach the intersection. While a driver may understand the benefit of traffic lights for himself or herself, this benefit exists only if the convention is "agreed" to by all drivers. In the absence of such agreement, the individual's best strategy at an intersection is not adherence to a convention but a combination of caution and bluff.

Examples of indubitably emergent socially constructed coding abound in human life. A chess game, for example, can exist only if the concept of the game, the definitions of the pieces, and the rules of play are agreed on by at least two individuals. One player alone is insufficient. Exogamous clans can organize a society only if everyone agrees on the definition of a clan, the definition of marriage, and the rule of exogamy. If only one person adheres to the concept of exogamous clans, society will be organized along other lines in spite of him or her.

Among the most important of emergent codes are the semantic and syntactic conventions that make up languages. Unless everyone in a conversation uses the same conventions, linguistic communication will not exist. If one wants to talk to another English speaker, one has no choice but to use English words and English conventions for indicating tense, number, and so forth. It is possible for one individual to make up his or her own language, but no communication will take place unless at least one other person adheres to the same linguistic coding.

One of the things any primate must learn is how other individuals are likely to react under given circumstances. The reason, of course, is that the behavior of other individuals will have an effect on one's own life. In all primates, not just humans, this ability to observe, predict, and adjust one's behavior to social facts is highly developed, with the result that primate social systems tend to be both complex and flexible. A second kind of consequence faced by an individual who fails to accept an emergent code or system of codes is simply that he or she is left out of the social system or social activity that the code produces. This may be of little consequence. For example, I personally do not feel handicapped because I never learned the rules of bridge. However, because I have not done so, I cannot join in a game. In other cases, the consequences may be more severe. For example, in the unlikely event that someone in a hunter-gatherer band refused to learn the conventions controlling communal hunts, he would be unable to participate in those hunts and might be denied a share of the prey.

This kind of exclusion is not the same thing as not learning how to deal with others socially. All social mammals, whatever their individual social skills, are nevertheless involved in social interactions. Being socially inept means failing to accomplish one's goals in a social setting, whether these have to do with rank, access to food, access to mates, or something else. If one does not learn to play bridge, the consequence is not that one fails but that one cannot even play the game. In many cases there may be a third kind of consequence. An emergent coding system may include the requirement that all individuals accept and adhere to that system, and that those who fail to do so be punished. This is typical of some religious systems, many moral codes, and of virtually all legal systems. In such cases, if one fails to accept and adhere to a code, one will be punished by other members of society. The punishment will be prompted by the same set of codes that one has rejected. Sometimes the punishment is harsh, including torture or death. Sometimes it is limited to mild ostracism or simply the withholding of social approval, as when someone wears a necktie that is unfashionably narrow or eats his salad with the wrong fork.

From the laws that govern a nation to the rules of a children's game, emergent coding is usually the result of a more or less complex process of coercion, negotiation, persuasion, and compromise, a process that involves at least a portion of those affected by the outcome. This does not mean that everyone is equally influential in the process, but simply that the process involves more than one person. One person may invent a new game, but the game will not exist as a game unless at least one other person is persuaded to learn its rules.... What seems to be unique to humans is the emergent nature of a significant portion of the codes that exist in our minds or brains and that influence our behavior. Socially constructed, emergent coding makes the human way of life different from that of all other animals. It lies at the very core of human culture. There is, however, much more to human culture than just socially constructed coding per se.

Because emergent codes are created through the interactions of multiple individuals, there is no a priori guarantee that they will produce behavior that will benefit any given individual. If natural selection acts on the individual, it follows that it should quickly destroy any tendency to obey codes that might reduce the individual's evolutionary fitness. When codes are generated externally - by multiple individuals - no one person can be assured that the results will be beneficial to him or herself, or even that they will not be downright deleterious. In addition, it is a characteristic of complex (i.e., emergent) systems that their evolution is unpredictable. As a result, whenever multiple individuals interact to create coding, it is always possible that the system will produce unintended consequences, trapping individuals in a system of coding that benefits no one.

The question therefore arises, how could natural selection have failed to prevent the evolution of a willingness to let socially created (and therefore external) codes motivate one's behavior? The same question arises with regard to the apparent propensity of our species to act altruistically, helping others at one's own expense. This is a complex question that is the subject of a large body of literature. As will be seen in the next chapter, nothing about the question is simple - not even the definition of evolutionary fitness. But until we understand when, how, and why cultural coding came to provide the motivation for individual behavior, we will not understand the evolution of human culture and of the human way of life.

It is not that socially constructed coding displaces or replaces either the natural environment or individual or memetic coding. Rather, it assigns them cultural meanings and values and uses them as cultural symbols. The result is that while the behavior of an enculturated individual is still guided in part by individual and memetic coding, everything he or she does, feels, or thinks is now enmeshed in a cultural system. Although it is possible and even desirable to distinguish analytically between the natural and the cultural environments, or between individual and emergent coding, in practice the enculturated individual can never ignore the cultural meanings of natural phenomena or the cultural meanings and consequences of behavior guided by individual coding.

Human culture, then, is based on a form of coding that at some point in the course of our evolution was added to already existing forms of coding, whether genetically determined, learned, or memetic. Thus, in principle, present-day human culture includes a memetic element. However, the emergent and all-encompassing aspects of human culture affect memetics in three ways: First, in the presence of language, most memes will be codes, not behavior. Second (and this is much more important), when socially created codes guide and motivate individual behavior, and when all things (including memes) come to have cultural meaning and positive or negative cultural value, memetic selection can no longer be assumed to take place at the level of the individual. Third, a new entity, the culture trait, comes into being. As I use the term, a culture trait is in some ways analogous to a meme, but it consists of coding created socially and transmitted from group to group rather than from individual to individual.

Thus, the processes by which culture traits spread (or fail to spread) among groups are equivalent to the processes by which memes spread among individuals. Culture traits are adopted when they are perceived as potentially beneficial and rejected when they clash with existing codes. However, they belong to the emergent level of society and culture, not to the individual level where memes reside. Like all emergent codes, they are created by interactions among multiple individuals, and they can be adopted only by groups of individuals.

Culture is far from being a static, immutable force that rigidly determines the behavior of all members of a society. Rather, it is a dynamic phenomenon, of which individuals are at the same time both the creators and the captives, and which is also only one of the factors determining their behavior. It is true that we as individuals either voluntarily or involuntarily accept the dictates of cultural coding even when they conflict with our individual, internal coding. However, this trait also gives the individual a new weapon in the competition with others. Culture becomes a way of manipulating the behavior of other individuals. Cultural codes (rules, values, etc.) can variously be invoked, manipulated, or altered in order to influence their behavior. The result is that culture is at one and the same time an arena in which the struggle for individual success is, in part, played out, an object of competition, and a means of competing. Marx's famous characterization of religion as the "opium of the people" reflects this dual propensity to submit to culture and to use culture to manipulate the behavior of others.

From Social Interaction to Social Institutions, M. Tomasello

I do not believe altruism is the process primarily responsible for human cooperation in the larger sense of humans' tendency and ability to live and operate together in institution-based cultural groups. In this story, altruism is only a bit player. The star is mutualism, in which we all benefit from our cooperation but only if we work together, what we may call collaboration. Free-riding persists here, but in the most concrete cases—where you and I must work together to move a heavy log, for instance—free-riding is not really possible because each of our efforts is required for success, and shirking is immediately apparent. As a side benefit, in the context of a mutualistic effort, my altruism toward you—for example, pointing out a tool that will help you do your job—actually helps me as well, as you doing your job helps us toward our common goal. So mutualism might also be the birthplace of human altruism: a protected environment, as it were, to get people started in that direction.

To get from ape group activities to human collaboration, we need three basic sets of processes. First and most importantly, early humans had to evolve some serious social-cognitive skills and motivations for coordinating and communicating with others in complex ways involving joint goals and coordinated division of labor among the various roles—what I will call skills and motivations for shared intentionality. Second, to even begin these complex collaborative activities, early humans had first to become more tolerant and trusting of one another than are modern apes, perhaps especially in the context of food. And third, these more tolerant and collaborative humans had to develop some group-level, institutional practices involving public social norms and the assignment of deontic status to institutional roles.

Human beings live not only in the physical and social worlds of other apes, but also in an institutional or cultural world of their own making, a world that is populated with all kinds of deontically empowered entities. The specifics of this world vary greatly among different groups of people, but all groups of people live in some such world.

All social animals are, by definition, cooperative in the sense of living together relatively peacefully in groups. Most social species forage as a group in one way or another, mainly as a defense against predation. In many mammalian species, individuals also form specific relationships with other individuals, leading to coalitions and alliances in their intra-group competition for food and mates. Inter-group defense and defense against predators is also a group activity among many mammalian species. Chimpanzees and other great apes do more or less all of these group things, so our question is how their collective activities are similar to and different from human forms of collaboration.

In collaborative activities, participants not only jointly pay attention to matters relevant to the common goal, but they each have their own perspective as well. Indeed, the whole notion of perspective depends on first having a joint attentional focus that we may then view differently (otherwise we just see completely different things). This dual-level attentional structure—shared focus of attention at a higher level, differentiated into perspectives at a lower level—is directly parallel to the dual-level intentional structure of the collaborative activity itself (shared goal with individual roles) and ultimately derives from it.

By all indications—including several experiments that looked quite carefully for it—great apes do not engage in joint attention. Various data show that a chimpanzee knows that his group-mate sees the monkey, but there is no evidence that the chimpanzee knows that his group-mate sees him seeing the monkey. That is, there is no evidence that great apes can do even one step of recursive mind reading (if you will allow me this term), which is the cognitive underpinning of all forms of common conceptual ground. If, as we hypothesize, the first step on the way to what has been called mutual knowledge, common knowledge, joint attention, mutual cognitive environment, intersubjectivity, and so forth, was taken in collaborative activities with joint goals, the reason that great apes do not establish joint attention with others is that they do not participate in activities with joint goals in the first place.

In our several collaboration studies with great apes, they have never made any attempt at overt communication to establish joint goals and attention, whereas human children engage in all kinds of verbal and nonverbal communication for forming joint goals and attention and for coordinating their various roles in the activity. Human cooperative communication thus evolved first within the bounds of collaborative activities because these activities provided the needed common ground for establishing joint topics, and because they generated the cooperative motives that Grice established as essential if the inferential machinery is to work appropriately.

To sum up, the species-unique structure of human collaborative activities is that of a joint goal with individual roles, coordinated by joint attention and individual perspectives. It was by way of Skyrms's stag hunt that human beings evolved skills and motivations for engaging in these kinds of activities for concrete mutualistic gains. Skills and motivations for cooperative communication coevolved with these collaborative activities because such communication both depended on these activities and contributed to them by facilitating the coordination needed to co-construct a joint goal and differentiated roles. My hypothesis is that concrete collaborative activities of the type we see today in young children are mostly representative of the earliest collaborative activities in human evolution. They have the same basic structure as the collaborative hunting of large game or the collaborative gathering of fruit in which one individual helps the other climb the tree and procure the food they will later share. Indeed, I believe that the ecological context within which these skills and motivations developed was a sort of cooperative foraging. Humans were put under some kind of selective pressure to collaborate in their gathering of food—they became obligate collaborators—in a way that their closest primate relatives were not.
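
Tomasello invokes Skyrms's stag hunt only in passing; for readers unfamiliar with it, the sketch below is an editorial illustration with made-up payoff numbers, not Skyrms's or Tomasello's own formulation. It shows the logic he is relying on: hunting the stag together is the best outcome for both parties, yet it is only worth attempting if each hunter trusts the other to cooperate.

```python
# Illustrative two-player stag hunt with assumed payoffs (not from the text).
# Each hunter chooses "stag" or "hare". A shared stag is worth more than a
# hare, but a stag can only be taken if both cooperate; a hare can be taken alone.
payoff = {
    ("stag", "stag"): (4, 4),   # joint success: both share the stag
    ("stag", "hare"): (0, 3),   # lone stag-hunter fails; the other takes a hare
    ("hare", "stag"): (3, 0),
    ("hare", "hare"): (3, 3),   # both settle for hares
}

def best_response(partner_choice):
    """Return my payoff-maximizing choice given what my partner does."""
    return max(("stag", "hare"),
               key=lambda mine: payoff[(mine, partner_choice)][0])

print(best_response("stag"))  # 'stag': cooperating pays, but only if the partner cooperates
print(best_response("hare"))  # 'hare': without trust, the safe solitary option wins
```

Under these assumed payoffs, "stag" is each hunter's best response only when the partner also chooses "stag"; otherwise the safe solitary "hare" wins. That is the trust and coordination problem that, on Tomasello's account, mutualistic collaboration had to solve.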

I am focusing here on collaborative activities as the key to many qualities uniquely human. But in an evolutionary story, collaborative activities actually constitute a kind of middle step; there is an earlier development that paved the way for the evolution of complex collaborative activities. None of the advancements in cooperation we have been talking about could get moving evolutionarily in animals that were always competing: there had to be some initial emergence of tolerance and trust—in our current story, around food—to put a population of our ancestors in a position where selection for sophisticated collaborative skills was viable.

In the standard evolutionary explanation of sociality, animal species become social in order to protect against predation. Typically, defense is best achieved in groups. When protection is not needed, individuals are better off foraging for food on their own because then they do not have to compete with others for food constantly. When food is dispersed, there are generally no problems: antelope graze peacefully across the fertile plains, staying together for protection. But when food is found in clumps, dominance raises its ugly head. When a primate group finds a tree full of fruit, there is typically both scramble and contest competition, and individuals separate themselves from others by at least a few meters as they eat. The paradigmatic clumped source of food is the prey animal. For solitary hunters, of course, prey animals present no competition-related problems. But for social carnivores such as lions and wolves, a group kill raises the issue of how to share the spoils. The solution is that the carcass is large enough that even while some individuals may get more, each individual still gets plenty. In the case where one individual actually makes the final kill, as the others approach the carcass the killer must allow them to have some because attempting to fend off one competitor would mean losing the carcass to others (this is the so-called tolerated-theft model of food sharing).

Chimpanzees make their living mainly off of fruits and other vegetation. Fruits tend to be a loosely clumped, highly valued resource, so they spur competition. But some chimpanzees also engage in the aforementioned group hunting for red colobus monkeys. As noted, this group hunting appears truly collaborative, with shared goals and a division of labor. When the monkey is captured, the hunters get more of the meat than do bystanders who did not hunt. This supports the idea of a shared goal with a fair division of spoils. But recent research demonstrates otherwise. First of all, the chimpanzee who actually makes the kill immediately attempts to avoid others by stealing away from the kill site, if possible, or by climbing to the end of a branch to restrict the access of other chimpanzees. But in most cases, meat possessors are unsuccessful in attempts to hoard, and are surrounded by beggars, who begin pulling on the meat. The possessor typically allows the beggars to take some meat.

Studies suggest that humans and chimpanzees compete for food with starkly different levels of intensity. For humans to have evolved complex skills and motivations for collaborative activities in which everyone benefits, there had to have been an initial step that broke us out of the great-ape pattern of strong competition for food, low tolerance for food sharing, and no food offering at all. It is relatively easy for chimpanzees to collaborate in the "large carcass" scenario in which each individual has a reasonable probability of capturing the monkey, and even unsuccessful participants can still harass the capturer and get some meat. But how can there be a joint goal—in the human sense—of capturing a monkey when the hunters know that success will invariably provoke a contest for the booty?

There are a number of evolutionary hypotheses about the context in which humans became more socially tolerant and less competitive over food. We could tell a story totally within the context of foraging, such that as collaboration became obligatory, those individuals who already were less competitive with food and more tolerant of others naturally had an adaptive advantage (assuming they could find one another, as Skyrms has shown). We could also speculate that since hunter-gatherer societies tend to be egalitarian, with bullies often ostracized or killed, humans underwent a kind of self-domestication process in which very aggressive and acquisitive individuals were weeded out by the group.

Finally, we could argue for the importance of so-called cooperative breeding (cooperative childcare). It is a startling fact that among all of the great-ape species except humans, the mother provides basically 100 percent of childcare. Among humans, across traditional and modern societies, the average figure is closer to 50 percent. In a cooperative-breeding scenario, helpers—all those who are not the mother—often engage in a variety of pro-social behaviors such as active food provisioning and basic childcare. In Mothers and Others, Sarah Hrdy argues that this changed social context, which may have arisen due to differences in the way humans needed to forage and the monogamous relationships between females and males, created humans' unique pro-social motivations.

It is of course possible that all of the above scenarios played a role. The important point is simply that there was some initial step in human evolution away from great apes, involving the emotional and motivational side of experience, that propelled humans into a new adaptive space in which complex skills and motivations for collaborative activities and shared intentionality could be selected.

Researchers have shown that if one chimpanzee steals food from another, the victim will retaliate by preventing the thief from keeping and eating the food. But so far in ongoing research we have not witnessed any comparable behavior from observers. Individuals do not try to prevent a thief from enjoying his bounty (or to inflict any other kind of negative sanction) if he stole it from someone else. Despite ongoing efforts, we have observed no third-party punishment. While these two great-ape behaviors—excluding and retaliating—serve to discourage antisocial behavior among group-mates, in neither case is any kind of social norm being applied, certainly not in any agent-neutral sense from a third-party stance.

In contrast, humans operate with two basic types of social norms, though many norms are hybrids: norms of cooperation (including moral norms) and norms of conformity (including constitutive rules). Norms of cooperation presumably emanate historically from situations in which individuals going about their daily business, in either individualistic or mutualistic situations, bump into one another in some way. Through processes that we do not understand very well, mutual expectations arise, and perhaps individuals try to induce others to behave differently, or they agree in an egalitarian manner to behave in certain ways, such that some kind of equilibrium results. To the extent that this equilibrium is governed by mutually recognized expectations of behavior that all individuals cooperate in enforcing, we may begin to speak of social norms or rules.

Normative judgments, by definition, require some generalized standard to which an individual's specific activities are compared. Some collaborative activities in a community are performed over and over by various members of a social group, with different individuals in different roles on different occasions, such that the collaborative activities become cultural practices whose structures—in terms of the joint goals and the various roles involved—everyone knows mutually. To gather honey from beehives in trees, for instance, one person stands next to the tree, another climbs on her shoulders and gathers the honey from the hive and hands it down, and a third pours the honey into a vessel. As novices tag along and socially learn what to do in the different roles in this activity, the roles become defined in a general way, such that there are mutual expectations in the group that anyone playing role X must do certain things in order to achieve group success. Any praise or blame for an individual in a particular role is offered in the context of the standard that everyone mutually knows must be met. Thus, social practices in which "we" act together interdependently in interchangeable roles toward a joint goal generate, over time, mutual expectations leading to generalized, agent-neutral normative judgments.

In addition to norms of cooperation, human behavior is guided through norms of conformity or conventionality. At some point in human evolution, it became important for individuals in a group to all behave alike; there arose pressure to conform. The proximate motivation here is to be like others, to be accepted in the group, to be one of the "we" that constitutes the group and that competes with other groups. If we are to function as a group, we must do things in ways that have proven effective in the past, and we must distinguish ourselves from others who do not know our ways. It may be that imitation and conformity were in many ways the central processes that led humans in new directions evolutionarily.

The reason is that imitation and conformity can create high degrees of intra-group homogeneity and inter-group heterogeneity, and on a faster time scale than that of biological evolution. Because of this peculiar fact—presumably characteristic of no other species—a new process of cultural group selection became possible. Human social groups became maximally distinctive from one another in language, dress, and customs, and they competed with one another. Those with the most effective social practices thrived relative to others. This is presumably the source of humans' in-group, out-group mentality, which researchers have shown is operative even in very young infants (who, for example, prefer to interact with people who speak their own language even before they themselves speak).

Norms provide the background of trust in which agent-neutral roles and shared cooperative activities with joint goals and joint attention enable social institutions. But the kind of conventionally created realities characteristic of social institutions depend on one more ingredient: a special kind of imagination and symbolic communication. The origin of symbolic communication is a long story. It depended most fundamentally on cooperative ways of performing tasks and began with the pointing gesture inside joint attentional activities. But there arose a need to communicate about things not in the here and now, which gave birth to iconic gestures (not yet conventionalized) in which I pantomime some scene for you in a kind of pretense display.

In order to have created the ways of life that they have, Homo sapiens must have begun with collaborative activities of a kind that other primates simply are not equipped for either emotionally or cognitively. Specifically, humans came to engage in collaborative activities with a joint goal and distinct and generalized roles, with participants mutually aware that they were dependent on one another for success. These activities hold the seeds of generalized, agent-neutral normative judgments of rights and responsibilities, as well as various kinds of division of labor and status assignments as seen in social institutions. They also are the birthplace of human altruistic acts, and humans' uniquely cooperative forms of communication. Humans putting their heads together in shared cooperative activities are thus the originators of human culture. How and why all of this arose in human evolution is unknown, but one speculation is that in the context of foraging for food (both hunting and gathering), humans were forced to become cooperators in a way that other primates were not.

Of course, humans are not cooperating angels; they also put their heads together to do all kinds of heinous deeds. But such deeds are not usually done to those inside "the group." Indeed, recent evolutionary models have demonstrated what politicians have long known: the best way to motivate people to collaborate and to think like a group is to identify an enemy and charge that "they" threaten "us." The remarkable human capacity for cooperation therefore seems to have evolved mainly for interactions within the local group. Such group-mindedness in cooperation is, perhaps ironically, a major cause of strife and suffering in the world today. The solution—more easily described than attained—is to find new ways to define the group.

Culture Is Essential, P. Richerson, R. Boyd

Culture is crucial for understanding human behavior. People acquire beliefs and values from the people around them, and you can't explain human behavior without taking this reality into account. Murder is more common in the South than in the North. If Nisbett and Cohen are right, this difference can't be explained in terms of contemporary economics, climate, or any other external factor. Their explanation is that people in the South have acquired a complex set of beliefs and attitudes about personal honor that make them more polite, but also more quick to take offense than people in the North. This complex persists because the beliefs of one generation are learned by the next.... Culturally acquired ideas are crucially important for explaining a wide range of human behavior—opinions, beliefs, and attitudes, habits of thought, language, artistic styles, tools and technology, and social rules and political institutions.

Much evidence suggests that we have an evolved psychology that shapes what we learn and how we think, and that this in turn influences the kind of beliefs and attitudes that spread and persist. Theories that ignore these connections cannot adequately account for much of human behavior. At the same time, culture and cultural change cannot be understood solely in terms of innate psychology. Culture affects the success and survival of individuals and groups; as a result, some cultural variants spread and others diminish, leading to evolutionary processes that are every bit as real and important as those that shape genetic variation. These culturally evolved environments then affect which genes are favored by natural selection. Over the evolutionary long haul, culture has shaped our innate psychology as much as the other way around.... However, the most fundamental questions of how humans came to be the kind of animal we are can only be answered by a theory in which culture has its proper role and in which it is intimately intertwined with other aspects of human biology. In this book we outline such a theory.

Eminent biologist Ernst Mayr has argued that "population thinking" was Charles Darwin's key contribution to biology. Before Darwin, people thought of species as essential, unchanging types, like geometric figures and chemical elements. Darwin saw that species were populations of organisms that carried a variable pool of inherited information through time. To explain the properties of a species, biologists had to understand how the day-to-day events in the lives of individuals shape this pool of information, causing some variant members of the species to persist and spread, and others to diminish. Darwin famously argued that when individuals carrying some variants were more likely to survive or have more offspring, these would spread through a process of natural selection.

Population thinking is the core of the theory of culture we defend in this book. First of all, let's be clear about what we mean by culture: Culture is information capable of affecting individuals' behavior that they acquire from other members of their species through teaching, imitation, and other forms of social transmission. By information we mean any kind of mental state, conscious or not, that is acquired or modified by social learning and affects behavior. We will use everyday words like idea, knowledge, belief, value, skill, and attitude to describe this information, but we do not mean that such socially acquired information is always consciously available, or that it necessarily corresponds to folk-psychological categories. Our definition is rooted in the conviction that most cultural variation is caused by information stored in human brains—information that got into those brains by learning from others.

Population thinking is the key to building a causal account of cultural evolution. We are largely what our genes and our culture make us. In the same way that evolutionary theory explains why some genes persist and spread, a sensible theory of cultural evolution will have to explain why some beliefs and attitudes spread and persist while others disappear. The processes that cause such cultural change arise in the everyday lives of individuals as people acquire and use cultural information. Some moral values are more appealing and thus more likely to spread from one individual to another. These will tend to persist, while less attractive alternatives tend to disappear. Some skills are easy to learn accurately, while others are more difficult and are likely to be altered as we learn them. Some beliefs make people more likely to be imitated, because the people who hold those beliefs are more likely to survive or more likely to achieve social prominence. Such beliefs will tend to spread, while beliefs that lead to early death or social stigma will disappear. In the short run, a population-level theory of culture has to explain the net effect of such processes on the distribution of beliefs and values in a population during the previous generation. Over the longer run, the theory explains how these processes, repeated generation after generation, account for observed patterns of cultural variation. The heart of this book is an account of how the population-level consequences of imitation and teaching work.
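
The population-level bookkeeping described here can be illustrated with a minimal simulation. The sketch below is an editorial toy model with arbitrary parameters (population size, bias strength, starting frequencies), not a model taken from Richerson and Boyd: a population holds one of two beliefs, and one belief is slightly more attractive, so that even individuals who observe the other belief occasionally switch to it. Repeating that small individual-level bias generation after generation is enough to transform the population-level distribution.

```python
import random

# Editorial toy model of biased cultural transmission (assumed parameters).
# Each generation, every individual picks someone from the previous
# generation to learn from. Variant A is slightly more "attractive":
# observers of A keep it, and observers of B occasionally switch to A.
N = 1000          # population size
BIAS = 0.05       # chance that an observer of B adopts A instead
GENERATIONS = 60

population = ["A"] * 100 + ["B"] * 900   # A starts rare

for gen in range(GENERATIONS):
    new_population = []
    for _ in range(N):
        model = random.choice(population)       # pick a cultural model to imitate
        if model == "A":
            adopted = "A"
        else:
            adopted = "A" if random.random() < BIAS else "B"
        new_population.append(adopted)
    population = new_population
    if gen % 10 == 0:
        print(gen, population.count("A") / N)   # frequency of A rises toward 1
```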

Culture is as much a part of human biology as walking upright. Culture causes people to do many weird and wonderful things. Nonetheless, the equipment in human brains, the hormone-producing glands, and the nature of our bodies play a fundamental role in how we learn and why we prefer some ideas to others. Culture is taught by motivated human teachers, acquired by motivated learners, and stored and manipulated in human brains. Culture is an evolving product of populations of human brains, brains that have been shaped by natural selection to learn and manage culture. Culture-making brains are the product of more than two million years of more or less gradual increases in brain size and cultural complexity. During this period, culture must have increased the reproductive success of our ancestors; otherwise, the features of our brain that make culture possible would not have evolved. The operational products of this evolution are innate predispositions and organic constraints that influence the ideas that we find attractive, the skills that we can learn, the emotions that we can experience, and the very way we see the world.

Differences in the environment may cause genetically identical individuals to behave differently, and in this sense environmental differences are immediate causes of behavior. However, if we want to know why the organism develops one way in one environment and a different way in a different environment, we have to find out how natural selection has shaped the developmental process of the organism so that it responds to the environment as it does. Or, as biologists put it, the ultimate determinant of behavior is natural selection on genes. Learning and other developmental processes that cause individuals to respond differently to different environments implement structures built into the genes.

The last 800,000 years or so have seen especially large, rapid fluctuations in world climate; the world average temperature sometimes changed more than 10 degrees Celsius in a century, leading to massive shifts in ecosystem structure. A group of hominids living in a habitat something like contemporary Madrid could find themselves in a habitat like Scandinavia one hundred years later. You might think that such rapid and extreme environmental changes would put a premium on individual learning over imitation. Odd as it may seem, in many kinds of variable environments, the best strategy is to rely mostly on imitation, not your own individual learning. Some individuals may discover ways to cope with the new situation, and if the not-so-smart and not-so-lucky can imitate them, then the lucky or clever of the next generation can add other tricks. In this way the ability to imitate can generate the cumulative cultural evolution of new adaptations at blinding speed compared with organic evolution. A population of purely individual learners would be stuck with what little they can learn by themselves; they can't bootstrap a whole new adaptation based on cumulatively improving cultural traditions. This design for human behavior depends on people adopting beliefs and technologies largely because other people in their group share those beliefs or use these technologies. When lots of imitation is mixed with a little bit of individual learning, populations can adapt in ways that outreach the abilities of any individual genius.
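
The claim that heavy reliance on imitation plus a little individual learning can outrun any lone learner can likewise be illustrated with a toy "ratchet" simulation. The sketch below is an editorial illustration with arbitrary parameters (skill units, copying fidelity, learner fraction), not the authors' formal models: most individuals copy the most skilled member of the previous generation, imperfectly, and a few add a small improvement of their own, so the population accumulates skill far beyond what a single individual could learn from scratch in one lifetime.

```python
import random

# Editorial toy model of cumulative cultural evolution (assumed parameters).
# "Skill" is just a number. A lone learner starting from scratch gains only
# LIFETIME_GAIN; a mostly-imitating population ratchets well past that.
LIFETIME_GAIN = 1.0      # the most an isolated individual can learn on their own
INNOVATION = 0.2         # small improvement an individual learner adds to what was copied
COPY_FIDELITY = 0.95     # copying is slightly lossy
N = 50                   # population size
LEARNER_FRACTION = 0.1   # most people only imitate; a few also learn individually

def lone_genius():
    """Best a single individual can do without culture."""
    return LIFETIME_GAIN

def cultural_population(generations):
    skills = [0.0] * N
    for _ in range(generations):
        best = max(skills)                    # everyone imitates the most skilled model
        new_skills = []
        for _ in range(N):
            skill = best * COPY_FIDELITY      # imperfect copying of the best solution
            if random.random() < LEARNER_FRACTION:
                skill += INNOVATION           # a little individual learning on top
            new_skills.append(skill)
        skills = new_skills
    return max(skills)

print("lone genius:", lone_genius())
print("mostly-imitating population after 100 generations:",
      round(cultural_population(100), 2))
```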

If the only processes shaping culture arose from our innate evolved psychology, then culture would be a strictly proximate cause of human behavior. Understanding how natural selection gave rise to our psychology would be more complicated than for other forms of behavioral plasticity, but in the end we could, at least in principle, reduce human culture to the actions of evolution by natural selection to increase genetic fitness. However, not all of the processes shaping culture do arise from our innate psychology—culture itself is subject to natural selection. Much as a child resembles her parents, people resemble those from whom they have acquired ideas, values, and skills. Culturally acquired ideas, values, and skills affect what happens to people during their lives—whether they are successful, how many children they have, and how long they live. These events in turn affect whether their behavior will be culturally transmitted to the next generation. If successful people are more likely to be imitated, then those traits that lead to becoming successful will be favored.

Our ape cousins still live in the same tropical forests in the same small social groups, and eat the same fruits, nuts, and bits of meat as our common ancestors did. By the late Pleistocene (say, 20,000 years ago), human foragers already occupied a much wider geographical and ecological range than any other species, using a remarkable range of subsistence systems and social arrangements. Over the last ten millennia we have exploded to become the earth's dominant organism by dint of deploying ever more-sophisticated technology and ever more-sophisticated social systems. The human species is a spectacular evolutionary anomaly, so we ought to expect that the evolutionary system behind it is pretty anomalous as well. Our quest is for the evolutionary motors that drove our divergence from our ancestors, and we believe that the best place to hunt is among the anomalies of cultural evolution. This does not mean that gene-based evolutionary reasoning is worthless. To the contrary, human sociobiologists and their successors have explained a lot about human behavior even though most work ignores the novelties introduced by cultural adaptation. However, there is still much to explain, and we think that the population properties of culture are an essential ingredient of a satisfactory theory of human behavior.

Darwin thought "inherited habits," by which he meant something very close to human culture, were important in a wide variety of species. In a sense he was correct—simple forms of social learning are widespread in the animal kingdom. However, Darwin imagined that even honeybees had humanlike imitative capacities, whereas the best modern evidence, as we shall see, suggests that all other animals, including our closest ape relatives, have rudimentary capacities for culture compared with ourselves. Darwin's intuitions about "inherited habits" no doubt came from his observation that humans had such things, combined with his desire to minimize the gap between humans and other animals.

We, and a few compatriots, have sought to give cultural evolution its due weight without divorcing culture from biology. We hope to convince you that this approach to cultural evolution delivers new and powerful tools to dissect some of the enduring problems of the human sciences: How do genes and culture interact to influence our behavior? Why are humans so extraordinarily successful a species? How do individual processes and the institutional structures and functions of groups articulate? What are the sources of cultural diversity? Why, despite our success as a species, do our actions often seem mildly (or sometimes wildly) dysfunctional? Why does our behavior sometimes lead to colossal catastrophes? Why are we sometimes downright heroic in our concern for others' welfare while in other circumstances indifferent, callous, exploitative, or vicious? As far as we can see, the benefits of such a theory are large compared with the cost of abandoning certain cherished commitments to disciplines, methods, and hypotheses that it casts into doubt. We hope that by the time you finish this book you will agree.

Nothing About Culture Makes Sense Except in the Light of Evolution, P. Richerson, R. Boyd

Because evolution provides the ultimate explanation for why organisms are the way they are, it is the center of a web of biological explanation that links the work of all the other areas of biology into a single, satisfying, explanatory framework. As Dobzhansky put it, without the light of evolution, biology "becomes a pile of sundry facts, some of them interesting or curious but making no meaningful picture as a whole." We believe that evolution can play the same role in explaining human culture. The ultimate explanation for cultural phenomena lies in understanding the genetic and cultural evolutionary processes that generate them. Genetic evolution is important because culture is deeply intertwined with other parts of human biology. The ways we think, the ways we learn, and the ways we feel shape culture, affecting which cultural variants are learned, remembered, and taught, and thus which variants persist and spread. Parents love their own children more than those of siblings or friends, and this must be part of the explanation for why marriage systems persist. But why do people value their own children more than others? Obviously, an important part of the answer is that such feelings were favored by natural selection in our evolutionary past.

Cultural evolution is also important for understanding the nature of culture. Because culture is transmitted, it is subject to natural selection. Some cultural variants persist and spread because they cause their bearers to be more likely to survive and be imitated. The answer to why mothers and fathers send their sons off to war is probably that social groups having norms that encourage such behavior outcompete groups that do not have such norms.

While many animals have rudimentary capacities for social learning, these are uniquely hypertrophied in humans. In late infancy, a suite of behaviors emerges in humans that make us very efficient imitators compared to any other animal. These capacities might underlie language, though the dominant school of linguists insists that language learning is a special-purpose capacity. Regardless of these controversial details, humans are clearly capable of transmitting vast quantities of information by imitation, instruction, and verbal communication. Humans have the capacity to form a large cultural repertoire, and the evidence surveyed in chapter 2 shows that much of our extraordinary behavioral variation stems from differences in cultural traditions. Human populations are characterized by durable traditions that result in different behaviors even in the same environments.

Two other plausible mechanisms explain variation in human behavior among groups: genetic differences and individual adaptation to environmental differences. Genetic differences cannot be very important, as borne out in the most direct data bearing on this issue, the results of cross-cultural adoptions. The evidence indicates that children raised by parents of another culture behave like the members of their adoptive culture, not their natal culture, in all important respects. Until a few thousand years ago, all humans lived in quite simple societies. Since then, most of us have come to live in much more complex ones, albeit some of us much more recently than others. Human behavior, under the influence of evolving cultural traditions, can change enormously without any appreciable genetic evolution. Whatever average innate differences might exist between human populations, they must be small compared to cultural differences.

The importance of individual behavioral versus cultural adaptation to local environments is a more difficult issue. Humans are adaptable and inventive creatures, no doubt. However, if individual behavioral adaptation to local conditions is the primary force generating behavioral differences between groups, then people living in the same environment should all behave in more or less the same way, but we know they often don't. Farmers with Lutheran German, Anabaptist German, and Yankee roots living side by side in the American Midwest behave quite differently, confirming that cultural tradition often has a powerful impact on behavior.

Most cultural change is relatively gradual, and is apparently the result of modest innovations spreading by diffusion from their point of origin to other places. Such patterns were well documented by anthropologists in the nineteenth century. In the twentieth century, "diffusionism" fell into disrepute for being atheoretical and merely descriptive. A Darwinian theory provides the tools needed to analyze the process of invention and diffusion in a rigorous way. Cultural evolution is a population phenomenon. Individuals invent, and they observe the behavior of others. Imitation by discriminating observers selectively retains and spreads innovations which in turn accumulate and eventually yield complex technology and social organization. Darwin described such patterns of change as "descent with modification."

Humans adapt quickly and efficiently to variable environments using technology, and they evolve variable, often complex, social institutions producing unusual amounts of cooperation, coordination, and division of labor. Much of the diversity of human behavior in time and space results from adaptive microevolutionary processes shaping complexes of technology and social organization that suit us to live in most of the terrestrial and littoral habitats on earth. Other organisms must speciate in order to occupy novel environments, whereas humans rely mostly upon culture. Modern humans apparently have spread out of Africa to the rest of the world in the last one hundred thousand years, relying on their ability to generate complex cultural adaptations suited to virtually every habitat on earth.

The debate over whether culture is adaptive, maladaptive, or just neutral has gone on for a century. The theory outlined here predicts what the empirical evidence tells us—culture is sometimes adaptive, sometimes maladaptive, and sometimes neutral. It adds the nuance that what is maladaptive from the gene's-eye point of view may result from selection acting on cultural variation. Then, genes adapt secondarily to a world with culturally evolved institutions, so that genes come to support cultural adaptations. In a broader sense, human genes have also on average benefited from cultural adaptations even though natural selection directly on genes never favored large-scale cooperation! The soap opera messiness of human life accords well with the idea that multilevel selection has built conflict into our instincts and our institutions.

Much of human psychology is concerned with acquiring and managing culturally acquired information, and the variation in psychology among different groups of people is mainly a cultural phenomenon. The rational-choice disciplines of economics and game theory need theories of constraints and preferences, many of which are cultural in origin. Anthropology, sociology, political science, linguistics, and history have long relied on cultural explanations to account for changes in human behavior and to explain diversity. In this book, we have drawn upon empirical work from all of these disciplines to understand the nature of cultural evolution. We have advanced cultural evolutionary hypotheses to explain interesting phenomena that social scientists have documented, such as the surprising reversal of the correlation between wealth and reproductive success that has gradually spread from society to society over the last two centuries. We don't expect all of these hypotheses to stand the test of time; perhaps none will. Our use of this immense and valuable body of data, we hope, illustrates the relevance of the social sciences to evolutionary questions. We also hope we have demonstrated to your satisfaction how cultural-evolutionary analyses integrate data from disparate disciplines and schools within the human sciences. Several questions that have excited enormous controversy in the social sciences seem to us to have natural resolutions in the evolutionary framework.

Darwinian concepts provide a neat account of the relations between individual and collective phenomena. Darwinian tools were invented to integrate levels. The basic biological theory includes genes, individuals, and populations. In these models, what happens to individuals (for example, natural selection) affects the population's properties (for example, the frequencies of genes), even as individuals are the prisoners of the gene pool they draw upon. Many other links between individuals and the populations they live in are possible, and the addition of culture creates still more. We have considered examples such as conformist transmission, where the frequency of a cultural variant, a population property, affects its probability of being imitated by individuals. Darwinian tools help us build linkages between phenomena at different levels as given problems require. Individuals seem to be hapless prisoners of their institutions because, in the short run, individual decisions don't have much effect on institutions. But, in the long run, accumulated over many decisions, individual decisions have a profound effect on institutions. Evolutionary theory gets right the basic structure of the relationship between individuals and the collective properties of their societies.
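
The population-level link between individual copying and group-level frequencies described above can be made concrete with a small simulation. The sketch below is purely illustrative and not drawn from the authors' own models: it assumes two cultural variants, learners who each sample a handful of role models, and a made-up conformity parameter that gives extra weight to whichever variant the majority of sampled models carries. Even a weak conformist bias of this kind drives the commoner variant toward fixation, which is exactly the sense in which a population property feeds back on individual imitation.

    import random

    def conformist_update(freq_a, n_models=3, conformity=0.2):
        """One generation of conformist transmission for a two-variant trait.

        freq_a     -- current frequency of cultural variant A in the population
        n_models   -- number of role models each naive learner observes
        conformity -- extra weight given to the majority variant
                      (0 would be unbiased copying; value is illustrative only)
        """
        learners = 10_000
        adopted_a = 0
        for _ in range(learners):
            models = [random.random() < freq_a for _ in range(n_models)]
            k = sum(models)                  # how many sampled models carry A
            p = k / n_models                 # unbiased copying probability
            if k > n_models / 2:             # majority carries A: boost adoption
                p += conformity * (1 - p)
            elif k < n_models / 2:           # majority carries B: suppress adoption
                p -= conformity * p
            adopted_a += random.random() < p
        return adopted_a / learners

    freq = 0.55                              # variant A starts only slightly more common
    for generation in range(50):
        freq = conformist_update(freq)
    print(round(freq, 3))                    # approaches 1.0: the majority variant goes to fixation

Running the same loop with the conformity parameter set to zero leaves the frequency hovering near its starting value, which is the contrast that makes conformist transmission a genuinely population-level force rather than a property of any one learner.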

The sources of human happiness and human misery are evolutionary. Take social institutions as an example. Some simple societies lack effective systems of dispute resolution, whereas others have quite effective ones.18 Levels of trust, happiness, and satisfaction with life differ greatly within western European countries, quite independently of per-capita wealth.19 People evidently find some sets of social institutions more congenial than others. Since individual decision-making and collective decision-making institutions act as forces in cultural evolution, we may be said to affect our own evolution. However, we are also the prisoners of the culture and genes we inherit.

Evolutionary processes are thus at the crux of the most interesting questions about our species. How do we find ourselves in the early twenty-first century in the particular state we are in? The cultural evolutionary events of the centuries that came before have everything to do with that. Why do we have the social predispositions that we do? The coevolution of genes and culture over a million or more years has much to do with that. Can we influence the current evolution of human societies in desirable directions? As humans, we are unusually active agents in our own evolution, because we each choose which cultural variants to adopt and which to neglect.30 Moreover, we organize institutions ranging from a simple tribal council to highly complex modern ones, such as the research university and the political party, that are designed to direct the course of cultural evolution.31 Yet, cultural evolution is a very big dog on the end of our leash. Even cultural heroes leading great political movements typically have modest effects. Gandhi could not prevent the Muslims from leaving India, nor could he persuade Hindus to reform the caste system. Only by attending properly to the population-level processes can we arrive at a proper picture of cultural evolution. With a reasonable picture of cultural evolution in hand, we could begin to understand how we might humanize processes that often exact savage costs in the currency of human misery.

Much of the objection to applying Darwinian tools to the human case seems to come from a visceral dislike of picturing us as just "another unique species." From the evolutionist's point of view, human exceptionalism is a major problem. As long as humans stand outside the Darwinian synthesis, as long as human culture is said to be superorganic, the whole Darwinian project has a potentially fatal gap. Darwin feared that attacks on the Descent of Man would be used as a platform for attacks on the whole edifice of his theory. In this he was not disappointed. As the Quarterly Review's commentator, probably the long-hostile and devoutly Catholic St. George Mivart, gloated, the Descent "offers a good opportunity for reviewing his whole position" (and rejecting it).

Darwinians generally feel more bemused than beleaguered by their critics. Scientists very commonly have humanistic interests. They paint, read novels, write history. So many older scientists try their hand at philosophy that it can practically be regarded as a normal sign of aging. Many are politically active. On the religious side, most scientists will admit to a belief in a god if a sufficiently broad definition is used. Far from feeling a conflict between their science, their religion, and their humanistic impulses, most scientists find their science suffused with the beautiful and the sublime.

The Neanderthal Within, G. Cochran, H. Harpending

Moderns showed up in Europe about 40,000 years ago, arriving first in areas to the east and north of Neanderthal territory, the mammoth steppe that Neanderthals had failed to settle permanently. A superior toolkit—in particular, needles for sewing clothes—may have made this possible. Later, modern humans moved south and west, displacing the Neanderthals. This is more or less what one would expect to happen, since the two sister species were competing for the same kinds of resources—ecological theory says that one will win out over the other. It took just 10,000 years for modern humans to completely replace Neanderthals, with the last Neanderthals probably living in what is now southern Spain.

Judging by outcomes, modern humans were competitively superior to Neanderthals, but we don't know what their key advantage was, any more than we know what drove the expansion of modern humans out of Africa. Several explanations have been suggested, and some or all of them may be correct. One idea is that modern humans had projectile weapons, in contrast to the thrusting spears used previously. If lightly built modern humans could hunt just as well as Neanderthals while requiring fewer calories, strongly built Neanderthals would have become obsolete. Even if Neanderthals had managed to copy that technology, they would have expended more energy in hunts because of their heavier bodies.

Another idea is that modern humans were smarter—which might have been the case, but it is hard to prove. Probably the most popular and attractive hypothesis is that modern humans had developed advanced language capabilities and therefore were able to talk the Neanderthals to death. This idea has a lot going for it. It's easy to imagine ways in which superior language abilities could have conferred advantages, particularly at the level of the band or tribe. For example, hunter-gatherers today are well known for having a deep knowledge of the local landscape and of the appearance and properties of many local plants and animals. This includes knowledge of rare but important events that happened more than a human lifetime ago, which may have been particularly important in the unstable climate of the Ice Age. It is hard to see how that kind of information transmission across generations would be possible in the absence of sophisticated language. Without it, there may have been distinct limits on cultural complexity, which, among other things, would have meant limits on the sophistication of tools and weapons.

Beginning in Africa, and continuing in the European archaeological record, we see signs of long-distance trade and exchange among modern and almost-modern humans in the form of stone tools made out of materials that originated far away. The Neanderthals never did this: To the extent that such trade was advantageous, it would have favored moderns over Neanderthals, and it is easy to imagine how enhanced language abilities would have favored trade. Those trade contacts (and the underlying language ability) might have allowed the formation of large-scale alliances (alliances of tribes), and societies with trade and alliances would have prevailed over opponents that couldn't organize in the same way.

The actual advance of modern humans in Europe may have taken the form of occasional skirmishes in which moderns won more often than not. Perhaps modern humans were better hunters and made big game scarce, so that neighboring Neanderthal bands suffered. Perhaps moderns, with their less bulky bodies and more varied diet (including fish), were better at surviving hard times. Quite possibly, the actual advance was made up of a mix of all these patterns.

One realistic and embarrassing possibility is that modern humans expanding out of Africa carried with them some disease or parasite that was fairly harmless to them but deadly to Neanderthals and the hominid populations of East Asia—the "cootie" theory. There is no direct evidence for this, but then it's hard to see how there would be: Germs seldom leave fossils.

The natural question is, "Why?" It doesn't really look as if being a modern human, in the sense of having ancestors who were anatomically modern and who had originated in Africa, was enough, by itself, to trigger this change. We don't see this storm of innovation in Australia. Obviously, something important, some genetic change, occurred in Africa that allowed moderns to expand out of Africa and supplant archaic sapiens. Equally obviously, judging from the patchy transition to full behavioral modernity, there was more to the story than that. So probably being an "anatomically modern" human was a necessary but not sufficient condition for full behavioral modernity.

Genetic changes allowed important human developments in 40,000 BC that hadn't been possible in 100,000 BC. Moreover, other genetic changes may have been necessary precursors to later cultural changes. Here we shall argue that the dramatic cultural changes that took place in the Upper Paleolithic, which have been referred to as the "human revolution," the "cultural explosion," or (our favorite), the "big bang," occurred largely because of underlying biological change. We are not the first to suggest this. Richard Klein has said that some mutation must have been responsible for this dramatic increase in cultural complexity. We wholly agree with the spirit of his suggestion, but we believe that such dramatic change probably involved a number of genes, and thus some mechanism that could cause unusually rapid genetic change. As it turns out, we know of such a mechanism, and the necessary circumstances for that mechanism turn out to have arrived just in time for the human revolution.

So what exactly were the innovations of the Upper Paleolithic that have drawn attention to this period as a time of revolutionary change? For one thing, we see new tools, made from new materials—tools made using careful, multistep preparation. Modern humans still used stone (although their methods of preparation had grown more elaborate and efficient), but they often used bone and ivory as well, in sharp contrast with Neanderthals. They also used particular types of high-quality stone from distant sources, sometimes from as far as hundreds of miles away, a pattern that suggests trade.

The most striking change of the Upper Paleolithic, to modern eyes, is the birth of art. The most spectacular examples are the cave paintings, found primarily in France and Spain. Typical subjects are large animals such as bison, deer, and aurochs, but sometimes predators such as lions, bears, and hyenas are depicted. Made with carbon black or ochre, these paintings usually depict animals naturalistically. Humans, which show up rarely, often look quite strange. The first real sculptures also appeared during this time. The most famous, the Venus figurines, such as the famous Venus of Willendorf, may have been portable pornography. At Dolni Vestonice, researchers found ceramic figures made about 29,000 years ago, long before the invention of pottery in other parts of the world.

The tremendous changes in tools, in weaponry and hunting methods, and in art, along with the social and cultural changes they imply, could not have simply come out of the blue. The Upper Paleolithic advances point to some underlying mechanism that generated rapid genetic changes that conferred new capabilities. That mechanism, we believe, was introgression—that is, the transfer of alleles from another species, in this case Neanderthals. There is no faster way of acquiring new and useful genes.... The issue of whether or not there was mating between modern humans and Neanderthals is central to the debate that has raged for several decades about multiregional evolution versus a single African origin of our species. The strong multiregional position held that Neanderthals were directly ancestral to humans,6 while the strong single-Africa-origin model held that modern humans simply replaced the Neanderthals. It quickly became apparent in the face of genetic data that a dramatic out-of-Africa dispersal of modern humans did occur, but the extent of genetic exchange between the old and new humans was not resolved. Much debate occurred about whether there were anatomical continuities between Neanderthals and contemporary Europeans, the underlying assumption being that some sort of anatomical blending would have occurred. Our perspective on the issue, elaborated below, is quite different.

The first point made by critics is that modern humans and Neanderthals could not have been interfertile. However, we believe that they almost certainly were, since the two species had separated fairly recently, roughly half a million years earlier. No primates are known to have established reproductive isolation in so short a time. Bonobos, for example, branched off from common chimpanzees some 800,000 years ago, but the two species can have fertile offspring. Most mammalian sister species retain the ability to interbreed for far longer periods. Sometimes zookeepers are surprised by this, as when a dolphin and a false killer whale produce viable offspring. There are rumors about successful matings between primate lineages that separated as long as 5 million or 6 million years ago, but those are currently unsubstantiated. Nevertheless, there is no reason to think that during the Upper Paleolithic Neanderthals and anatomically modern humans could not have mated and had children who lived to also reproduce.

Imagine that humans occasionally mated with Neanderthals, and that at least some of their offspring were incorporated into human populations. That process would have introduced new gene variants, new alleles, into the human population.... A tiny bit of Neanderthal ancestry thrown into the mix tens of thousands of years ago could have resulted in many people today, possibly even all modern humans, carrying the advantageous Neanderthal version of some genes.
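
Why "a tiny bit" of admixture can be enough is easiest to see with the standard textbook recursion for an advantageous allele, sketched below. This is a generic population-genetics calculation, not the authors' own model; the 5 percent selective advantage and the starting frequency of one in ten thousand are made-up illustrative numbers.

    def next_freq(p, s):
        """One generation of selection on an advantageous allele (haploid model).

        p -- current frequency of the introgressed allele
        s -- its selective advantage (illustrative value, not from the text)
        Carriers have fitness 1 + s; everyone else has fitness 1,
        so mean fitness is 1 + s * p.
        """
        return p * (1 + s) / (1 + s * p)

    p = 0.0001              # allele enters the population through rare admixture
    s = 0.05                # hypothetical 5 percent advantage
    generations = 0
    while p < 0.99:
        p = next_freq(p, s)
        generations += 1
    print(generations)      # on the order of a few hundred generations

At roughly 25 to 30 years per generation, a few hundred generations is well under 10,000 years, which is why even very limited interbreeding could, in principle, leave an advantageous Neanderthal allele in most or all later humans while contributing almost nothing to overall ancestry.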

Neanderthal alleles that were advantageous outside of Africa may not have been so in Africa and thus might not have spread to anatomically modern humans there. We have reason to think that the modern humans who expanded out of Africa some 50,000 years ago had changed in important ways—had, for example, probably acquired sophisticated language abilities. A Neanderthal allele that had not been particularly useful in the genetic context of near-modern humans 100,000 years ago might have been useful to the more advanced people who were expanding out of Africa. Logically, if admixture occurred at all, it had to happen somewhere in Neanderthal-occupied territory, which means Europe and western Asia. As modern humans expanded their territory, they must have encountered Neanderthal bands again and again. The two kinds of humans coexisted for a few thousand years before the Neanderthals disappeared, at least in some regions. This looks to be the case for the Chatelperronian culture of France and northern Spain, and there are traces of a similar culture in Italy. If there was trade, or if there was enough contact to transmit toolmaking techniques, there was sexual contact as well—depend on it. If in the future we look at very large genetic datasets from huge numbers of individuals, we might find a few traces of neutral Neanderthal genes.

There were deep differences between Homo sapiens and Homo neanderthalensis in way of life, with Neanderthals being high-risk, highly cooperative hunters, rather like wolves, while anatomically modern humans in Africa probably had a mixed diet and were more like modern hunter-gatherers. Those differences could mean that big Neanderthal brains were solving different sorts of problems than big African brains. ... And yet, European Neanderthals probably faced many of the same life problems that African humans did. To some degree, big brains may have been solving the same problems in both populations. Even if that is the case, though, we can be certain that those problems were not solved in exactly the same way.

If the new FOXP2 allele is really that recent in modern humans, it is likely that the migrating humans picked it up from Neanderthals, since that's about the time they encountered them in their expansion out of Africa. The idea that we might have acquired some of our speech capabilities from Neanderthals may be surprising, but it is not impossible. The timing of the acquisition is certainly consistent with the creative explosion. If it is true that we gained the gene by means of introgression, then the version of FOXP2 in the Neanderthals is likely to be older and have more variation than it does in modern humans. Further sequencing efforts on the skeletal remains of Neanderthals should eventually confirm or refute this possibility. If FOXP2 is indeed a "language gene" and responsible, perhaps, for some of the creative explosion of modern humans in Europe and northern Asia, it would explain a major puzzle about modern human origins.

A burst of innovation followed the expansion of modern humans out of Africa. Signs of that change existed in Africa before the expansion, but the pattern became much stronger in Europe some 20,000 years later, after anatomically modern humans had encountered and displaced the Neanderthals. That transition to full behavioral modernity—as seen in the archaeological record—occurred patchily and finished later in other parts of Eurasia. We argue that even limited gene flow from Neanderthals (and perhaps other archaic humans) would have allowed anatomically modern humans to acquire most of their favorable alleles. We believe that this sudden influx of adaptive alleles contributed to the growth of the capabilities that made up the "human revolution," and we believe that this introgression from archaic human populations will prove central to the story of modern human origins. So by 40,000 years ago, humans had become both anatomically and behaviorally modern (which is not to say they were exactly like people today). They had vastly greater powers of innovation than their ancestors, likely owing in part to genes stolen from their Neanderthal cousins. The speed of cultural change increased by tens of times, and when the glaciers retreated and new opportunities arose, it accelerated further.

The Evolution of Certain Novel Human Capacities, P. Bloom

Many would insist that evolutionary biology has little to say about the more interesting aspects of human thought. Derek Bickerton, for instance, says of evolutionary psychology that "it is perhaps not unfair to say that this approach can tell us all we need to know about the least interesting aspects of human behavior. Surely what is most interesting about human behavior ... is precisely the part of it that we do not share with other creatures". And Alan Wolfe insists that "biology per se has little to tell us about those aspects of human behavior, such as how we make our way in the world, which are most interesting to social scientists". There is something odd about these authoritative assertions of what is and is not interesting. Even if it were true that evolutionary psychology could tell us only about what we share with other creatures, this would still leave us with insights about perception, memory, communication, taste in food, love for our offspring, fear, play, rage, and lust. And, anyway, it isn't true that evolutionary theory cannot account for traits that are unique to a given species. Pinker notes that the elephant's trunk is just as singular as, say, human language, but nonetheless poses no special mystery for biologists.

This chapter will address what I see as the most serious attack on evolutionary psychology, one that has been raised by many critics. This attack concerns the proper explanation of certain complex human capacities. I will suggest that, contrary to what is often claimed by these critics, theories that reject the tenets of evolutionary psychology fail to explain how these capacities evolved. Instead, the most likely explanation emerges from a modular adaptationist perspective on the mind, though one that must be elaborated in certain surprising ways.

One of the central issues in evolutionary biology [concerns] the proper application of selectionist explanation. All sorts of traits can be the result of natural selection, from a moth's colour to the number of vertebrae in a monkey's spine to echolocation in bats. But which ones have to be adaptations, as opposed to being the products of mutation, genetic drift, and so on? When must we invoke what Williams has called the 'quite onerous' notion of adaptation? The answer is that natural selection is the only known explanation for adaptive complexity. Adaptively complex entities are ubiquitous in nature; they include such things as eyes and hands and camouflage and cats. Before Darwin, the only explanation for how these came to exist was an intelligent designer, God. Darwin's fundamental accomplishment was to show how a non-intentional process can give rise to good design. The eye, for example, looks like something that has been cleverly constructed for the purpose of seeing. But it hasn't—it arose through the accumulation of random variations, each of which led to some improvement over the preceding form, and, over the course of time, led organisms along the path in the astronomically vast space of possible bodies leading from a body with no eye to a body with a functioning eye.

There are many considerations that suggest that a certain aspect of the mind is a biological adaptation, but perhaps the strongest consideration is the one above—adaptive complexity. Abilities such as object recognition, syntactic processing, and spatial navigation arguably require cognitive mechanisms that are, on computational grounds, sufficiently complex so as to require an explanation in terms of natural selection. From this you get the argument at the core of evolutionary psychology—the complex design of the mind cries out for an adaptive explanation. But here's the problem. The cognitive structures required to develop theories of quantum physics and create moon rockets are intuitively at least as complicated as those underlying tasks such as object recognition. There are thus two sources of good design. The first is natural selection, and we have some understanding of how this works. The second is human creative powers, as exemplified in science and technology. Humans somehow have the ability to understand and manipulate aspects of the world in unique and novel ways. By any normal criteria, quantum physics and space flight reflect good design, but they are neither the direct result of natural selection nor, in any direct sense, the by-products of evolved abilities. How can we explain their origin?

When it comes to finding the source of uniquely human creative powers, a sensible strategy is to look for some other capacity that only humans have. Capacities such as habituation and the principles of association do not meet this criterion, but language certainly does. An average adult knows tens of thousands of words; these refer to categories and to individuals, to entities as diverse as objects, actions, social institutions, distances, emotions, and numbers. And even young children unconsciously command a rich set of syntactic and morphological principles that enable them to combine these words to produce and understand a potential infinity of sentences that nobody has ever produced or understood before. There is nothing comparable in other primates, and attempts to teach non-humans the same sort of lexical and generative systems used by humans have been abysmal failures.

Some relationship between the richness of human thought and the evolution of language clearly exists. This is true even if the most modular view of the origin of language (as a distinct neural system evolved for the purpose of communication) is correct. For one thing, it is likely that our language is richer than the communication systems of other creatures just because our thoughts are richer. We can say more because we have more to say. For another, language and non-linguistic cognition might well have co-evolved in the course of human evolution in a 'cognitive arms race', in which the increased communicative abilities of members of our species gave rise to selective pressure for enhanced cognitive skills (most notably, social cognition), which in turn made communication more important, and so on. Finally, there is the obvious fact that language is an excellent tool for the transmission and storage of information, and hence plays a central role in the development of culture.

But can the evolution of language entirely explain the unique human capacities that we are interested in? Darwin thought so: "If it could be proved that certain high mental powers, such as the formation of general concepts, self-consciousness, [etc.], were absolutely peculiar to man ... it is not improbable that these qualities are merely the incidental results of other highly-advanced intellectual faculties; and these again mainly the result of the continued use of a perfect language" [p. 126]. More recently, Daniel Dennett entertains an even more radical proposal: "Perhaps the kind of mind you get when you add language to it is so different from the kind of mind you can have without language that calling them both minds is a mistake".

It is sometimes the case that being deprived of language does have a terrible effect. Often such a person is severely deprived of personal contact, and, as Sacks discusses, there are many instances in which a languageless adult becomes cognitively lost. But the people described by James and Schaller suggest that this is not inevitable and hence should lead us to question the view that language is essential for human thought. In the end, it might well be that there is a substantial amount of truth to the claim that language facilitates cognition—perhaps it can bring unconscious concepts to conscious light, lead to the shaping of new concepts, cause the unfolding of modular processes, and facilitate long chains of reasoning. This is nothing to sneer at. And there is no doubt that language is a great tool for the accumulation and dissemination of culture. But language does not, in itself, spark creative powers in individuals, and humans without language are, nonetheless, conscious and complete individuals.

So what is the solution to the question of the origin of genuinely new capacities? The proposal that I will endorse here is an old one, perhaps obvious to many, and it starts with the observation that there is a sense in which the claim that humans can do number theory and fly to the moon is false. After all, individual humans cannot do such things. Such creative powers exist only as the result of the accumulation of the efforts of many humans over many generations. Hence a psychological theory need not and cannot entirely explain the genesis of human creative powers, as such powers emerge under certain limited circumstances, over many generations and through the extended interactions of many people.

This leads to the following suggestion. There is no extra-modular capacity in humans, no general intelligence. Humans are nothing more than souped-up primates, chimpanzees with certain enhanced abilities, and a naked human, without history and without society, is no more capable of creating science or technology than is a naked chimp. But our special abilities allow for the accumulation and storage of information, and this makes it possible, over the course of many generations, for science and technology to emerge. This type of good design emerges not through natural selection, but out of the interaction of humans with other humans and with the external world.

Each individual human has certain limited abilities to manipulate the world and learn from the environment. But humans can work in a coordinated fashion and can build on the accomplishments of other humans, across both time and space. We can build tools and then use them to make other tools; we can learn about different domains and then bootstrap from this knowledge to new knowledge. This is admittedly not a solution to the puzzle of where human creative powers come from; at best it is an idea as to what a solution might look like. A solution would have two parts—one, a description of the cognitive prerequisites for accumulation, and, two, an explanation as to how this accumulation leads to the emergence of abilities that show signs of good design, or adaptive complexity. The prerequisites probably include adaptations such as language, which is essential to the dramatic sort of accumulation and transfer of cultural knowledge we find in our species. There is also a 'theory of mind'—an understanding of the thoughts, emotions, desires, and goals of others. While other primates possess some abilities in this domain, much of the innate human capacity appears to be special to our species. Other relevant proposals include the capacity to deal with meta-representation, and some ability to appreciate generativity and recursion.

Even once the prerequisites for this accumulation process were present in our species, the emergence of these creative powers probably still reflects, to a large extent, historical accident. It might be that good design is never inevitable. But we can hopefully gain some insight as to how it is even possible. I have suggested that the answer in the case of human creative powers will not involve rejecting a modular view of the mind, but instead will require a careful analysis of the interaction between evolved modules, human interaction, and the external world.

Primate Social Instincts, Human Morality, and the Rise and Fall of 'Veneer Theory', F. de Waal

Social contract theory, and Western civilization with it, seems saturated with the assumption that we are asocial, even nasty creatures rather than the zoon politikon that Aristotle saw in us. Hobbes explicitly rejected the Aristotelian view by proposing that our ancestors started out autonomous and combative, establishing community life only when the cost of strife became unbearable. According to Hobbes, social life never came naturally to us. He saw it as a step we took reluctantly and "by covenant only, which is artificial". More recently, Rawls proposed a milder version of the same view, adding that humanity's move toward sociality hinged on conditions of fairness, that is, the prospect of mutually advantageous cooperation among equals.

These ideas about the origin of the well-ordered society remain popular even though the underlying assumption of a rational decision by inherently asocial creatures is untenable in light of what we know about the evolution of our species. Hobbes and Rawls create the illusion of human society as a voluntary arrangement with self-imposed rules assented to by free and equal agents. Yet, there never was a point at which we became social: descended from highly social ancestors—a long line of monkeys and apes—we have been group-living forever. Free and equal people never existed. Humans started out—if a starting point is discernible at all—as interdependent, bonded, and unequal. We come from a long lineage of hierarchical animals for which life in groups is not an option but a survival strategy. Any zoologist would classify our species as obligatorily gregarious. Having companions offers immense advantages in locating food and avoiding predators. Inasmuch as group-oriented individuals leave more offspring than those less socially inclined, sociality has become ever more deeply ingrained in primate biology and psychology. If any decision to establish societies was made, therefore, credit should go to Mother Nature rather than to ourselves.

One school views morality as a cultural innovation achieved by our species alone. This school does not see moral tendencies as part and parcel of human nature. Our ancestors, it claims, became moral by choice. The second school, in contrast, views morality as a direct outgrowth of the social instincts that we share with other animals. In the latter view, morality is neither unique to us nor a conscious decision taken at a specific point in time: it is the product of social evolution. The first standpoint assumes that deep down we are not truly moral. It views morality as a cultural overlay, a thin veneer hiding an otherwise selfish and brutish nature. Until recently, this was the dominant approach to morality within evolutionary biology as well as among science writers popularizing this field. I will use the term "Veneer Theory" to denote these ideas, tracing their origin to Thomas Henry Huxley (although they obviously go back much further in Western philosophy and religion, all the way to the concept of original sin). After treating these ideas, I review Charles Darwin's quite different standpoint of an evolved morality, which was inspired by the Scottish Enlightenment. I further discuss the views of Mencius and Westermarck, which agree with those of Darwin. Given these contrasting opinions about continuity versus discontinuity with other animals, I then build upon an earlier treatise in paying special attention to the behavior of nonhuman primates in order to explain why I think the building blocks of morality are evolutionarily ancient.

Since many consider morality the essence of humanity, Huxley was in effect saying that what makes us human could not be handled by evolutionary theory. We can become moral only by opposing our own nature. This was an inexplicable retreat by someone who had gained a reputation as "Darwin's Bulldog" owing to his fierce advocacy of evolution. Moreover, Huxley gave no hint whatsoever where humanity might have unearthed the will and strength to defeat the forces of its own nature. If we are indeed born competitors, who don't care about the feelings of others, how did we decide to transform ourselves into model citizens? Can people for generations maintain behavior that is out of character, like a shoal of piranhas that decides to turn vegetarian? How deep does such a change go? Would not this make us wolves in sheep's clothing: nice on the outside, nasty on the inside?

Evolution favors animals that assist each other if by doing so they achieve long-term benefits of greater value than the benefits derived from going it alone and competing with others. Unlike cooperation resting on simultaneous benefits to all parties involved (known as mutualism), reciprocity involves exchanged acts that, while beneficial to the recipient, are costly to the performer. This cost, which is generated because there is a time lag between giving and receiving, is eliminated as soon as a favor of equal value is returned to the performer. It is in these theories that we find the germ of an evolutionary explanation of morality that escaped Huxley.
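
The condition under which such time-lagged exchange pays can be put in a tiny calculation. The sketch below is a generic illustration rather than anything from the essay: the benefit b, the cost c, and the probability w that the partner is still around to repay the favor are made-up symbols, and helping is worthwhile, in expectation, whenever w * b exceeds c.

    def expected_gain_from_helping(b, c, w):
        """Expected net payoff of one act of help under simple reciprocity.

        b -- benefit the recipient gets (and is assumed to return later in kind)
        c -- immediate cost to the performer
        w -- probability the relationship lasts long enough for repayment
        All values are illustrative; the payoff rule w * b - c is the point.
        """
        return w * b - c

    # Hypothetical numbers: helping costs 1 unit, the returned favor is worth 3.
    print(expected_gain_from_helping(b=3, c=1, w=0.8))   # 1.4 > 0: reciprocity pays
    print(expected_gain_from_helping(b=3, c=1, w=0.2))   # -0.4 < 0: helping does not pay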

It is fine to describe animals (and humans) as the product of evolutionary forces that promote self-interests so long as one realizes that this by no means precludes the evolution of altruistic and sympathetic tendencies. Darwin fully recognized this, explaining the evolution of these tendencies by group selection instead of the individual and kin selection favored by modern theoreticians. Darwin firmly believed his theory capable of accommodating the origins of morality and did not see any conflict between the harshness of the evolutionary process and the gentleness of some of its products. Rather than presenting the human species as falling outside of the laws of biology, Darwin emphasized continuity with animals even in the moral domain: "Any animal whatever, endowed with well-marked social instincts, the parental and filial affections being here included, would inevitably acquire a moral sense or conscience, as soon as its intellectual powers had become as well developed, or nearly as well developed, as in man."

All species that rely on cooperation—from elephants to wolves and people—show group loyalty and helping tendencies. These tendencies evolved in the context of a close-knit social life in which they benefited relatives and companions able to repay the favor. The impulse to help was therefore never totally without survival value to the ones showing the impulse. But, as so often, the impulse became divorced from the consequences that shaped its evolution. This permitted its expression even when payoffs were unlikely, such as when strangers were beneficiaries. This brings animal altruism much closer to that of humans than usually thought, and explains the call for the temporary removal of ethics from the hands of philosophers.

In discussing what constitutes morality, the actual behavior is less important than the underlying capacities. For example, instead of arguing that food-sharing is a building block of morality, it is rather the capacities thought to underlie food-sharing (e.g., high levels of tolerance, sensitivity to others' needs, reciprocal exchange) that seem relevant. Ants, too, share food, but likely based on quite different urges than those that make chimpanzees or people share. This distinction was understood by Darwin, who looked beyond the actual behavior at the underlying emotions, intentions, and capacities. In other words, whether animals are nice to each other is not the issue, nor does it matter much whether their behavior fits our moral preferences or not. The relevant question rather is whether they possess capacities for reciprocity and revenge, for the enforcement of social rules, for the settlement of disputes, and for sympathy and empathy.

Emotions occupy a central role; it is well known that, rather than being the antithesis of rationality, emotions aid human reasoning. People can reason and deliberate as much as they want, but, as neuroscientists have found, if there are no emotions attached to the various options in front of them, they will never reach a decision or conviction. This is critical for moral choice, because if anything morality involves strong convictions. These convictions don't—or rather can't—come about through a cool rationality: they require caring about others and powerful "gut feelings" about right and wrong.

No human moral society could be imagined without reciprocal exchange and an emotional interest in others. This offers a concrete starting point to investigate the continuity that Darwin envisioned. The debate about Veneer Theory is fundamental to this investigation since some evolutionary biologists have sharply deviated from the idea of continuity by presenting morality as a sham so convoluted that only one species—ours—is capable of it. This view has no basis in fact, and as such stands in the way of a full understanding of how we became moral. My intention here is to set the record straight by reviewing actual empirical data.

Both developmentally and evolutionarily, advanced forms of empathy are preceded by and grow out of more elementary ones. Instead of language and culture appearing with a Big Bang in our species and then transforming the way we relate to each other, Greenspan and Shanker propose that it is from early emotional connections and "proto-conversations" between mother and child that language and culture sprang. Instead of empathy being an endpoint, it may have been the starting point.

Bottom-up accounts are the opposite of Big Bang theories. They assume continuity between past and present, child and adult, human and animal, even between humans and the most primitive mammals. We may assume that empathy first evolved in the context of parental care, which is obligatory in mammals. Signaling their state through smiling and crying, human infants urge their caregiver to pay attention and move into action. The same applies to other primates. The survival value of these interactions is obvious. For example, a female chimpanzee lost a succession of infants despite intense positive interest because she was deaf and did not correct positional problems (such as sitting on the infant, or holding it upside-down) in response to its distress calls.

For a human characteristic, such as empathy, that is so pervasive, develops so early in life, and shows such important neural and physiological correlates as well as a genetic substrate, it would be strange indeed if no evolutionary continuity existed with other mammals. The possibility of empathy and sympathy in other animals has been largely ignored, however. This is partly due to an excessive fear of anthropomorphism, which has stifled research into animal emotions, and partly to the one-sided portrayal by biologists of the natural world as a place of combat rather than social connectedness.

There exists ample evidence of one primate coming to another's aid in a fight, putting an arm around a previous victim of attack, or other emotional responses to the distress of others. In fact, almost all communication among nonhuman primates is thought to be emotionally mediated. We are familiar with the prominent role of emotions in human facial expressions, but when it comes to monkeys and apes—which have a homologous array of expressions—emotions seem equally important. When the emotional state of one individual induces a matching or closely related state in another, we speak of "emotional contagion". Even if such contagion is undoubtedly a basic phenomenon, there is more to it than simply one individual being affected by the state of another: the two individuals often engage in direct interaction. Thus, a rejected youngster may throw a screaming tantrum at its mother's feet, or a preferred associate may approach a food possessor to beg by means of sympathy-inducing facial expressions, vocalizations, and hand gestures. In other words, emotional and motivational states often manifest themselves in behavior specifically directed at a partner. The emotional effect on the other is not a by-product, therefore, but actively sought.

With increasing differentiation between self and other, and an increasing appreciation of the precise circumstances underlying the emotional states of others, emotional contagion develops into empathy. Empathy encompasses—and could not possibly have arisen without—emotional contagion, but it goes beyond it in that it places filters between the other's and one's own state. In humans, it is around the age of two that we begin to add these cognitive layers. Two mechanisms related to empathy are sympathy and personal distress, which in their social consequences are each other's opposites. Sympathy is defined as "an affective response that consists of feelings of sorrow or concern for a distressed or needy other (rather than the same emotion as the other person). Sympathy is believed to involve an other-oriented, altruistic motivation". Personal distress, on the other hand, makes the affected party selfishly seek to alleviate its own distress, which is similar to what it has perceived in the object. Personal distress is therefore not concerned with the situation of the empathy-inducing other. A striking primate example is given by de Waal: the screams of a severely punished or rejected infant rhesus monkey will often cause other infants to approach, embrace, mount, or even pile on top of the victim. Thus, the distress of one infant seems to spread to its peers, which then seek contact to soothe their own arousal. Inasmuch as personal distress lacks cognitive evaluation and behavioral complementarity, it does not reach beyond the level of emotional contagion.

The most compelling evidence for the strength of empathy in monkeys came from Wechkin et al. and Masserman et al. They found that rhesus monkeys refuse to pull a chain that delivers food to themselves if doing so shocks a companion. One monkey stopped pulling for five days, and another one for twelve days after witnessing shock delivery to a companion. These monkeys were literally starving themselves to avoid inflicting pain upon another. Such sacrifice relates to the tight social system and emotional linkage among these macaques, as supported by the finding that the inhibition to hurt another was more pronounced between familiar than unfamiliar individuals.

Consolation is defined as reassurance by an uninvolved bystander to one of the combatants in a preceding aggressive incident. For example, a third party goes over to the loser of a fight and gently puts an arm around his or her shoulders. Consolation is not to be confused with reconciliation between former opponents, which seems mostly motivated by self-interest, such as the imperative to restore a disturbed social relationship. The advantage of consolation for the actor remains wholly unclear. The actor could probably walk away from the scene without any negative consequences.

Why would consolation be restricted to apes? Possibly, one cannot achieve cognitive empathy without a high degree of self-awareness. Targeted help in response to specific, sometimes novel, situations may require a distinction between self and other that allows the other's situation to be divorced from one's own while maintaining the emotional link that motivates behavior. In other words, in order to understand that the source of vicarious arousal is not oneself but the other and to understand the causes of the other's state, one needs a clear distinction between self and other.

I have argued before that, apart from consolation behavior, targeted helping reflects cognitive empathy. Targeted helping is defined as altruistic behavior tailored to the specific needs of the other in novel situations. ... These responses require an understanding of the specific predicament of the individual needing help. Given the evidence for targeted helping by dolphins, the recent discovery of MSR (mirror self-recognition) in these mammals supports the proposed connection between increased self-awareness, on the one hand, and cognitive empathy, on the other.

Preston and de Waal propose that at the core of the empathic capacity is a relatively simple mechanism that provides an observer (the "subject") with access to the emotional state of another (the "object") through the subject's own neural and bodily representations. When the subject attends to the object's state, the subject's neural representations of similar states are automatically activated. The closer and more similar subject and object are, the easier it will be for the subject's perception to activate motor and autonomic responses that match the object's (e.g., changes in heart rate, skin conductance, facial expression, body posture). This activation allows the subject to get "under the skin" of the object, sharing its feelings and needs, and this embodiment in turn fosters sympathy, compassion, and helping.

In conclusion, empathy is not an all-or-nothing phenomenon: it covers a wide range of emotional linkage patterns, from the very simple and automatic to the highly sophisticated. It seems logical to first try to understand the basic forms of empathy, which are widespread indeed, before addressing the variations that cognitive evolution has constructed on top of this foundation.

During the evolution of cooperation it may have become critical for actors to compare their own efforts and payoffs with those of others. Negative reactions may ensue in case of violated expectations. A recent theory proposes that aversion to inequity can explain human cooperation within the bounds of the rational choice model. Similarly, cooperative nonhuman species seem guided by a set of expectations about the outcome of cooperation and access to resources. De Waal proposed a sense of social regularity, defined as "A set of expectations about the way in which oneself (or others) should be treated and how resources should be divided. Whenever reality deviates from these expectations to one's (or the other's) disadvantage, a negative reaction ensues, most commonly protest by subordinate individuals and punishment by dominant individuals."
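
The kind of "rational choice with inequity aversion" model alluded to here can be illustrated with a utility function of the Fehr-Schmidt type, in which a player's material payoff is discounted both for earning less than a partner and for earning more. The sketch below is generic; the envy and guilt weights are illustrative numbers, not parameters taken from the essay or from any particular study.

    def inequity_averse_utility(own, other, alpha=0.8, beta=0.3):
        """Fehr-Schmidt-style utility for a two-player division of resources.

        own, other -- material payoffs of the focal player and the partner
        alpha      -- weight on disadvantageous inequality ("envy")
        beta       -- weight on advantageous inequality ("guilt")
        Parameter values are illustrative only.
        """
        envy = alpha * max(other - own, 0)
        guilt = beta * max(own - other, 0)
        return own - envy - guilt

    # A lopsided 2-versus-8 split feels worse to a sufficiently inequity-averse
    # player than getting nothing at all, which is how such models rationalize
    # costly protest against unfair outcomes.
    print(inequity_averse_utility(own=2, other=8))   # 2 - 0.8*6 = -2.8
    print(inequity_averse_utility(own=5, other=5))   # 5.0: equality carries no penalty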

[In response to] the question whether there is any single word that may serve as a prescription for all of one's life.... Confucius proposed "reciprocity" as such a word. Reciprocity is of course also at the heart of the Golden Rule, which remains unsurpassed as a summary of human morality. To know that some of the psychology behind this rule may exist in other species, along with the required empathy, bolsters the idea that morality, rather than a recent invention, is part of human nature.

I don't know if people are, deep down, good or evil, but to believe that each and every move is selfishly calculated, while being hidden from others (and often from ourselves), seems to grossly overestimate human intellectual powers, let alone those of other animals. Apart from the already discussed animal examples of consolation of distressed individuals and protection against aggression, there exists a rich literature on human empathy and sympathy that, generally, agrees with the assessment of Mencius that impulses in this regard come first and rationalizations later.

In this essay, I have drawn a stark contrast between two schools of thought on human goodness. One school sees people as essentially evil and selfish, and hence morality as a mere cultural overlay. This school, personified by T. H. Huxley, is still very much with us even though I have noticed that no one (not even those explicitly endorsing this position) likes to be called a "veneer theorist." This may be due to wording, or because once the assumptions behind Veneer Theory are laid bare, it becomes obvious that—unless one is willing to go the purely rationalist route of modern Hobbesians, such as Gauthier—the theory lacks any sort of explanation of how we moved from being amoral animals to moral beings. The theory is at odds with the evidence for emotional processing as driving force behind moral judgment. If human morality could truly be reduced to calculations and reasoning, we would come close to being psychopaths, who indeed do not mean to be kind when they act kindly. Most of us hope to be slightly better than that, hence the possible aversion to my black-and-white contrast between Veneer Theory and the alternative school, which seeks to ground morality in human nature.

This school sees morality arise naturally in our species and believes that there are sound evolutionary reasons for the capacities involved. Nevertheless, the theoretical framework to explain the transition from social animal to moral human consists only of bits and pieces. Its foundations are the theories of kin selection and reciprocal altruism, but it is obvious that other elements will need to be added. If one reads up on reputation building, fairness principles, empathy, and conflict resolution, there seems to be a promising movement toward a more integrated theory of how morality may have come about. It should further be noted that the evolutionary pressures responsible for our moral tendencies may not all have been nice and positive. After all, morality is very much an ingroup phenomenon. Universally, humans treat outsiders far worse than members of their own community: in fact, moral rules hardly seem to apply to the outside. True, in modern times there is a movement to expand the circle of morality, and to include even enemy combatants—e.g., the Geneva Convention, adopted in 1949—but we all know how fragile an effort this is. Morality likely evolved as a within-group phenomenon in conjunction with other typical within-group capacities, such as conflict resolution, cooperation, and sharing.

The first loyalty of every individual is not to the group, however, but to itself and its kin. With increasing social integration and reliance on cooperation, shared interests must have risen to the surface so that the community as a whole became an issue. The biggest step in the evolution of human morality was the move from interpersonal relations to a focus on the greater good. In apes, we can see the beginnings of this when they smooth relations between others. Females may bring males together after a fight between them, thus brokering a reconciliation, and high-ranking males often stop fights among others in an evenhanded manner, thus promoting peace in the group. I see such behavior as a reflection of community concern, which in turn reflects the stake each group member has in a cooperative atmosphere. Most individuals have much to lose if the community were to fall apart, hence the interest in its integrity and harmony. Discussing similar issues, Boehm added the role of social pressure, at least in humans: the entire community works at rewarding group-promoting behavior and punishing group-undermining behavior. Obviously, the most potent force to bring out a sense of community is enmity toward outsiders. It forces unity among elements that are normally at odds. This may not be visible at the zoo, but it is definitely a factor for chimpanzees in the wild, which show lethal intercommunity violence. In our own species, nothing is more obvious than that we band together against adversaries. In the course of human evolution, out-group hostility enhanced in-group solidarity to the point that morality emerged. Instead of merely ameliorating relations around us, as apes do, we have explicit teachings about the value of the community and the precedence it takes, or ought to take, over individual interests. Humans go much further in all of this than the apes, which is why we have moral systems and apes do not.

And so, the profound irony is that our noblest achievement—morality—has evolutionary ties to our basest behavior—warfare. The sense of community required by the former was provided by the latter. When we passed the tipping point between conflicting individual interests and shared interests, we ratcheted up the social pressure to make sure everyone contributed to the common good. If we accept this view of an evolved morality, of morality as a logical outgrowth of cooperative tendencies, we are not going against our own nature by developing a caring, moral attitude, any more than civil society is an out-of-control garden subdued by a sweating gardener, as Huxley thought. Moral attitudes have been with us from the start, and the gardener rather is, as Dewey aptly put it, an organic grower. The successful gardener creates conditions and introduces plant species that may not be normal for this particular plot of land "but fall within the wont and use of nature as a whole". In other words, we are not hypocritically fooling everyone when we act morally: we are making decisions that flow from social instincts older than our species, even though we add to these the uniquely human complexity of a disinterested concern for others and for society as a whole.

Following Hume, who saw reason as the slave of the passions, Haidt has called for a thorough reevaluation of the role played by rationality in moral judgment, arguing that most human justification seems to occur post hoc, that is, after moral judgments have been reached on the basis of quick, automated intuitions. Whereas Veneer Theory, with its emphasis on human uniqueness, would predict that moral problem solving is assigned to evolutionarily recent additions to our brain, such as the prefrontal cortex, neuroimaging shows that moral judgment in fact involves a wide variety of brain areas, some extremely ancient. In short, neuroscience seems to be lending support to human morality as evolutionarily anchored in mammalian sociality.

Why did evolutionary biology stray from this path during the final quarter of the twentieth century? Why was morality considered unnatural, why were altruists depicted as hypocrites, and why were emotions left out of the debate? Why the calls to go against our own nature and to distrust a "Darwinian world"? The answer lies in what I have called the Beethoven error. In the same way that Ludwig van Beethoven is said to have produced his beautiful, intricate compositions in one of the most disorderly and dirty apartments of Vienna, there is not much of a connection between the process of natural selection and its many products. The Beethoven error is to think that, since natural selection is a cruel, pitiless process of elimination, it can only have produced cruel and pitiless creatures. But nature's pressure cooker does not work that way. It favors organisms that survive and reproduce, pure and simple. How they accomplish this is left open. Any organism that can do better by becoming either more or less aggressive than the rest, more or less cooperative, or more or less caring, will spread its genes.

The process does not specify the road to success. Natural selection has the capacity of producing an incredible range of organisms, from the most asocial and competitive to the kindest and gentlest. The same process may not have specified our moral rules and values, but it has provided us with the psychological makeup, tendencies, and abilities to develop a compass for life's choices that takes the interests of the entire community into account, which is the essence of human morality.

Ethics and Evolution: How To Get Here From There, P. Kitcher

Human morality, [Frans de Waal] suggests, stems from dispositions we share with other primates, particularly with those closest to us on the phylogenetic tree. Yet my formulation of his position, like his own, is vague in crucial respects: what exactly is meant by claiming that morality "stems from" traits present in chimpanzees, or that morality is "a direct outgrowth of the social instincts we share with other animals," or that "deep down" we are truly moral, or that "the building blocks of morality are evolutionarily ancient"? I want to focus the position more precisely by articulating a particular version of what de Waal might have in mind. If this version is not what he intends, I hope it will prompt him to develop his preferred alternative with more specificity than he has done so far. In fact, I think de Waal's own presentation is hampered by his desire to take a sledgehammer to something he conceives of as the rival to his own view. That rival, "Veneer Theory," is to be demolished. The fact that the demolition is so easy should alert us to the possibility that the real issues have not been exposed and addressed.

Veneer Theory, as I understand it, divides the animal kingdom into two. There are nonhuman animals who lack any capacity for sympathy and kindness, and whose actions, to the extent that they can be understood as intentional at all, are the expression of selfish desires. There are also human beings, often driven by selfish impulses to be sure, but capable of rising above egoism to sympathize with others, to curb their baser tendencies, and to sacrifice their own interests for higher ideals. Members of our species have the selfish dispositions that pervade the psychologically more complex parts of the rest of the animal world, but they have something else, an ability to subdue these dispositions. Our psyches are not just full of weeds; we also have a capacity for gardening. De Waal associates this position with T. H. Huxley, whose famous lecture of 1893 introduced the gardening metaphor. He accuses Huxley of deviating from Darwinism on this point, but it is not clear to me that, even if this is an adequate statement of Huxley's view (which I doubt), the accusation is justified. A fully Darwinian Huxley might claim that human evolution involved the emergence of a psychological trait that has a tendency to inhibit another part of our psychological nature; it is not that something mysterious outside us opposes our nature, but that we come to experience internal conflicts of a kind that had not previously figured in our lives. It would be quite reasonable, of course, to ask this Darwinian Huxley to offer an account of how this new mechanism might have evolved, but, even if any answer proved to be speculative, Huxley would be innocent of assuming that morality is some sort of nonnaturalistic addition.

The version of Veneer Theory I have sketched, and the one that occupies de Waal, takes a specific view of the starting point and the end point. Back in our evolutionary past, we had ancestors, as recent as the common ancestors of human beings and chimpanzees, who lacked any capacities for sympathy and altruism. Present human beings have ways of disciplining their selfish urges, and the theory thinks of morality as this collection of disciplinary strategies. The real objection to Veneer Theory in this form is that it has the starting point wrong. It is falsified by all the evidence de Waal has acquired about the other-directed tendencies of chimpanzees, bonobos, and, to a lesser extent, other primates. Appreciating this point ought to be the first stage in an inquiry about the evolutionary history that links the psychological dispositions of our ancestors to the capacities that underlie our contemporary moral behavior. De Waal demolishes his favored version of Veneer Theory by being very clear about the starting point—that, after all, is a project to which he has devoted much of his life—but he is considerably less clear as to the nature of the terminus. The vague talk about "building blocks" and "direct outgrowth" comes in because de Waal hasn't thought as hard about the human phenomenon he takes to be anticipated or foreshadowed in chimpanzee social life.

If we think of a four-dimensional space, we can map "altruism profiles" that capture the distinct intensities and different skills with which individuals respond across a range of contexts and potential beneficiaries. Some possible profiles show low-intensity responses to a lot of others in a lot of situations; other possible profiles show high-intensity responses to a few select individuals across almost all situations; yet others are responses to the neediest individual in any given situation, with the intensity of the response proportioned to the level of need. Which, if any, of these profiles are found in human beings and in nonhuman animals? Which would be found in morally exemplary individuals? Is there a single ideal type to which we'd want everyone to conform, or is a morally ideal world one in which there's diversity? I pose these questions not as a prelude to answering them, but as a way of exposing how complex the notion of psychological altruism is and how untenable is the idea that, once we know that nonhuman animals have capacities for psychological altruism, we can infer that they have the "building blocks" of morality, too. The demise of Veneer Theory, as de Waal understands it, tells us that our evolutionary relatives belong somewhere in altruism space away from the point of complete selfish indifference. Until we have a clearer view of the specific kinds of psychological altruism chimpanzees (and other nonhuman primates) display, and until we know what kinds are relevant to morality, it's premature to claim that human morality is a "direct outgrowth" of tendencies these animals share.

Some ability to adjust our desires and intentions to the perceived wishes or needs of others appears to be a necessary condition for moral behavior. But, as my remarks about the varieties of psychological altruism should have suggested, it's not sufficient. Hume and Smith both believed that the capacity for psychological altruism, for benevolence (Hume) or sympathy (Smith), was quite limited; Smith begins the Theory of Moral Sentiments with a discussion of the ways in which our responses to the emotions of others are pallid copies. Both would probably recognize the full range of de Waal's studies, from Chimpanzee Politics through Peacemaking among Primates to Good Natured, as vindicating their central points, showing (in my terms) that psychological altruism exists, but that it is limited in intensity, range, extent, and skill.

Somewhere in hominid evolution came a step that provided us with a psychological device for overcoming wantonness. I am inclined to think of it as part of what made us fully human. Perhaps it began with an awareness that certain forms of projected behavior might have troublesome results and a consequent ability to inhibit the desires that would otherwise have been dominant. I suspect that it was linked to the evolution of our linguistic capacity, and even that one facet of the selective advantage of linguistic ability lay in helping us to know when to restrain our impulses. As I envisage it, our ancestors became able to formulate patterns for action, to discuss them with one another, and to arrive at ways of regulating the conduct of group members. At this stage, I conjecture, there began a process of cultural evolution. Different small bands of human beings tried out various sets of normative resources—rules, stories, myths, images, and more—to define the way in which "we" live. Some of these were more popular with neighbors and with descendant groups, perhaps because they offered greater reproductive success, more likely because they made for smoother societies, greater harmony, and increased cooperation. The most successful ones were transmitted across the generations, appearing in fragmentary ways in the first documents we have, the addenda to law codes of societies in Mesopotamia.

Most of this process is invisible because of the long period between the full acquisition of linguistic ability (50,000 years ago at the very latest) and the invention of writing (5,000 years ago). There are fascinating hints of important developments: the cave art and the figurines, for example. Most significant are the indications of greater ability to cooperate with individuals who don't belong to the local band. From about 20,000 years ago on, the remains of some sites show an increase in the number of individuals present at a particular time, as if several smaller bands had come together there. Even more intriguing are finds of tools made of particular materials at considerable distances from the nearest natural source; perhaps these should be understood in terms of the development of "trading networks," as some archeologists have proposed; or perhaps they should be viewed as indicators of the ability of strangers to negotiate their way through the territories of many different bands. Whichever alternative one selects, these phenomena reveal an increased capacity for cooperation and social interaction, one that becomes fully manifest in the large Neolithic settlements at Jericho and Çatal Hüyük.

Whether or not we can ever do more than guess at the actual course of events, there is, I think, a possible evolutionary account of how we got here from there, one which sees the development of a capacity for normative guidance—perhaps understood in that enlargement and refinement of sympathy that gives rise to Smith's impartial spectator—as a crucial step. Once that was in place, and once we had languages in which to engage in discussions with one another, the explicit moral practices, the compendia of rules, parables, and stories, could be developed in cultural lineages, some of which extend into the present. To revert to Huxley's famous image, we became gardeners, having, as part of our nature, an impulse to root out the weeds that are parts of our psyche, and to foster other plants by adding a stake here or a trellis there. Moreover, with us, as with any garden, the project is never finished but continues indefinitely, as new circumstances and new varieties arise.

There are important continuities between human moral agents and chimpanzees: we share dispositions to psychological altruism without which any genuinely moral action would be impossible. But I suspect that between us and our most recent common ancestor with the chimps there have been some very important evolutionary steps: the emergence of a capacity for normative guidance and self-control, the ability to speak and to discuss potential moral resources with one another, and about fifty thousand years (at least) of important cultural evolution. As Steve Gould saw so clearly, in any evaluation of our evolutionary history you can emphasize the continuities or the discontinuities. I think little is gained by either emphasis. You do better simply to recognize what has endured and what has altered.

Of course, de Waal might reject my speculations about how we got from there to here. Despite the fact that I think my story integrates insights he has developed at different stages of his career, he might prefer some alternative. The important point is that some account of this kind is needed. For central to my argument is the thesis that mere demonstration of some type of psychological altruism in chimpanzees (or other higher primates) shows very little about the origins or evolution of ethics. I am happy to consign Veneer Theory (though not Huxley's insights!) to the flames. That, however, is only the start of making the many primatological insights de Waal has given us relevant to our understanding of human morality.

On The Origin of the Human Mind, R. Dunbar

Humans seem to lie on a different cognitive plane to the other primates. My aim in this chapter is to try to provide an explanation as to why and when this might have come about. First, however, I want to identify what seem to me to be crucial cognitive differences between humans and other primates: these I characterise in terms of theory-of-mind ability, though I do not want to suggest that this is all there is to the human-ape difference. I shall then go on to offer an explanation as to why humans should need these additional cognitive abilities and conclude by offering a suggestion as to when they might have evolved.

When children acquire ToM (theory of mind), they acquire the ability to see the world from someone else's point of view. Second-order intentionality (or ToM) is important because it paves the way for at least two forms of behaviour that are of particular interest, namely lying and fictional (or pretend) play. Without the ability to see the world from someone else's point of view, it is impossible to feed another individual with information that you know to be false in order to get them to behave in a way that suits your purposes. By extension, the ability to believe that the world can be other than it really is necessarily underpins the ability to engage in pretend play (and ultimately, of course, to write fiction).

It is clear that ToM is crucial for many of those phenomena that are most characteristic of our humanity - conversation, literature, religion. The question we now have to address is: are we alone in aspiring to these levels of cognitive complexity?... The most obvious place to look for human-like cognitive abilities is our nearest relatives, the great apes (chimpanzees, gorillas and orang-utans). If these species do not possess ToM and other related cognitive traits, then it may be considered unlikely that other species will do so. So far, then, all attempts to evaluate the intentional status of non-human animals have concentrated on apes, and specifically chimpanzees (our sister species).

Humans are likely to be able to engage in deeper mind-reading than other great apes precisely because their larger brain allows them to set aside more computing power for these purposes. In contrast, monkeys, with their smaller brain volumes, are constrained into devoting more computing power to visual processing because there are still important gains to be made from doing so.

The growth in neocortex size within the primates as a whole appears to be driven by the need to increase social group size. (There is some evidence to suggest that this relationship may extend beyond the primates to other mammals, including insectivores, carnivores and the cetaceans.) Extension of this argument to humans suggests that group sizes of about 150 would be predicted for a primate with the size of neocortex that modern humans possess. This prediction appears to be surprisingly well confirmed by the available data.
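As a rough sketch of the arithmetic behind that prediction (the regression coefficients below are taken from Dunbar's 1992 analysis of primate neocortex ratios and group sizes, not from the passage above, and should be treated as approximate):

\[
\log_{10} N \;\approx\; 0.093 + 3.389\,\log_{10}(\mathrm{CR})
\]

where N is the predicted cohesive group size and CR is the ratio of neocortex volume to the volume of the rest of the brain. Taking a modern human neocortex ratio of roughly CR = 4.1 gives \(\log_{10} N \approx 2.17\), that is, N of about 148, which is the basis of the figure of roughly 150 cited above.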

Second, there is evidence to suggest that neocortex size correlates with the frequency with which subtle social strategies are used: Byrne has shown that the relative frequency of tactical deception correlates with neocortex size in primates, while Pawlowski et al. have shown that the extent to which low-ranking males can use social skills to undermine the power-based strategies of dominant males so as to gain access to fertile females is likewise correlated with neocortex size.

Whatever the proximate mechanisms involved may be, the selection pressure favouring increases in brain size within primates as a whole seems to be the need to increase group size. This pressure, in turn, presumably reflects the need to solve some kind of ecological problem (in most cases, predation risk). In other words, species invade new habitats or find their existing habitat undergoing rapid change (usually as a consequence of climate change) and, as a result, are faced with, among other problems, increased predation risk. At this point, they have a choice: they can either (i) move to a different area where the habitat remains similar to their original one; (ii) increase group size to compensate for the demands of the new conditions; or (iii) go extinct. Increasing group size, however, comes at a price (brains are costly) and it is likely to impose some additional foraging demands.

On balance, my sense is that the need to evolve alliances to provide access to limited ecological resources (almost certainly permanent water) was most likely to have been the key pressure selecting for increased brain size in ancestral hominids. This would have started to become significant once a more nomadic or migratory lifestyle had been adopted. An alternative interpretation (suggested by Andrew Chamberlain) might be that increasing competition between human groups led to both enlarged groups and greater cognitive skills (e.g. 'advanced' ToM) as mechanisms for out-competing rival groups. An important point here is that fighting a mind-reader is a very different proposition to fighting a predator (which will typically only have first-order intentionality). The selection pressure imposed by mind-reading opponents may itself have enough impact to spur the evolution of higher orders of intentionality. This suggestion sees group size and mind-reading as two separate consequences of rivalry, whereas the preceding explanation assumes, in effect, that mind-reading is a by-product of group size (in the sense that the extra brain power generated by large groups allows the development of mind-reading abilities, which in turn are selected for because they are advantageous in ensuring the temporal cohesion of large dispersed groups). The important distinction between these two hypotheses is that the first sees mind-reading as being related to the internal dynamics of groups, whereas the alternative sees it as being related to the external dynamics of groups.

The final question to ask is when the human mind in its modern form appeared. There have been a number of attempts to consider this problem over the years, the most recent (and perhaps most successful) being those by Merlin Donald and Steven Mithen. Donald argued for a two-stage evolutionary process from the ancestral primates, each associated with a more sophisticated way of storing and manipulating knowledge. The first stage involved mimetic skills that allowed individuals to transmit information or skills by imitation learning in ways that appear to be beyond the scope of living non-human primates. The second stage was associated with the evolution of language which enabled an even more sophisticated and rapid transmission of information. Donald associated the first stage with Homo erectus, the second with the appearance of Homo sapiens. In contrast, Mithen sees the evolution of the human mind in terms of the interlinking of a set of cognitive elements, each dealing with different aspects of the world in which we live (natural history, technical, social): the final integration of all these previously independent units (and, more importantly, the ability to move easily and rapidly from one to another - what he terms 'cognitive fluidity') is again associated with language and the emergence of modern humans.

Approaches to Modelling Early Human Minds, P. Mellars, K. Gibson

We took it as self-evident in organizing this conference that fully modern intelligence, with its capacity for inventing and operating computers, space satellites, electron microscopes and the like, cannot have emerged instantaneously, and that there must inevitably have been a long and essentially gradual progression of mental capabilities over the course of hominid evolution from those of our closest primate ancestors. Even the harshest critic of anthropocentric models would not question that there are indeed very profound contrasts between the mental capacities of apes and those of modern humans. The crucial question is exactly what form these pre-modern patterns of mental development may have taken. Above all perhaps — if we adopt an explicitly scientific stance — where do we look to find plausible working models (or working hypotheses) of exactly what the nature of these earlier patterns of cognition might have been? Our view, as reflected in the papers in the present volume, is that we should adopt the widest possible approach to this question, and look into the potential contributions of the broadest possible range of cognitive disciplines to provide clues to these early stages of human mental development.

Effective models of human evolution must include descriptions of the behavioural and cognitive capacities of the last common ancestor of humans and African apes, describe successive stages in the evolutionary development of human mental capacities, and determine the geological time-frames of these evolutionary events. Such models must also delineate the ecological or other selective agents that resulted in the emergence of human capacities and, ultimately, they must explain the genetic basis of human and ape mental differences. The proliferation of models of the origin of such phenomena as language and consciousness suggests that it is very easy to devise evolutionary scenarios which incorporate many of these elements. What is difficult is to determine which methods of evolutionary modelling are most appropriate and which models are the most scientifically credible.

Nearly all palaeoanthropologists and archaeologists agree that, by at least 35,000–40,000 years ago, anatomically modern humans possessed fully modern mental capacities. However, vast differences in technological, scientific, and literary accomplishments distinguish Upper Palaeolithic from later human cultures. Similarly, the material remains of medieval Europeans are quite different from those of their twentieth-century descendants. Yet all evolutionary biologists would agree that both populations were fully modern in their cognitive capacities. How, then, can we judge when our ancestors first attained modern cognitive, linguistic, and other mental capacities?

How do we set about investigating the evolution of mental capacities for behaviour, as opposed to the changing patterns of behaviour themselves? Renfrew has reminded us of the ultimate intransigence of this paradox in behavioural and archaeological terms. Yet we cannot escape the fact that the genetic capacities for intelligent behaviour must have expanded fairly dramatically over the course of the past two million years, and that this expansion almost certainly occurred at many different stages during the course of human evolution. Mellars has emphasized that even in populations as chronologically close to ourselves as the European Neanderthals, we could be dealing with people who had pursued a genetically separate trajectory of evolutionary development from that which led to behaviourally and biologically modern populations over a time span of at least 300,000 years. The issue of potential, innate biological factors in the development of even relatively recent, archaeologically defined behavioural patterns therefore cannot be avoided.

In this volume we have argued that a productive approach to this question demands the cooperation of the widest possible range of cognitive disciplines — of psychologists, neurologists, ethologists, primatologists, biological and social anthropologists, and linguists. Ultimately, it may also require the co-operation of other fields such as artificial intelligence, molecular biology and philosophy. We have also argued that the eventual testing and evaluation of alternative, hypothetical models of cognitive evolution must rely heavily on the 'hard' behavioural evidence which only the archaeological record can provide.

The Sapient Behaviour Paradox, C. Renfrew

My aim in this paper is to comment on the apparent paradox involving the difficulty, perhaps the impossibility, of observing potential or capacity or ability until it is revealed in performance/actuality/achievement. Until a way can be found around this difficulty there are grave difficulties in understanding adequately the process of the emergence of sapient cognitive abilities and capacities.

The central problem is that we are interested in examining the threshold between pre-sapient and fully sapient existence. There is the underlying assumption that the pre-sapiens hominids, such as Homo erectus, would not have been capable, even after millennia of cultural evolution, of the technical and cultural advances which are familiar to us as part of the human story, such as the Neolithic revolution, the development of pyrotechnology, the urban revolution, the development of writing, the industrial revolution and so forth. It is also widely assumed that the fundamental genetic changes which produced modern humankind were already largely accomplished with the appearance of Homo sapiens sapiens in different parts of the world some 40,000 years ago. From then on, it is held, we are all one species. But it is manifestly evident that many of the elements in the human story of developing technology and social organization have only been documented in restricted areas and periods since 40,000 years ago. In one sense that is obvious enough. But, if we are speaking in causal terms, how satisfactory an explanation would it be (if we had one) that succeeded in explaining the changes which took place over the preceding millennia that led to the emergence and widespread dispersal of our species? If it is the case that genetically we today differ rather little from our sapiens ancestors of forty millennia ago, how does that genetic composition which emerged then explain the cultural differences between then and now? The usual answer is that from that time the human animal had the skill, the intelligence, the potential to achieve its later accomplishments. But what kind of explanation is it that lays such weight upon so apparently teleological a concept as potential?

In genetics the distinction between genotype (as determined by the molecular genetic structures, by the genes) and phenotype (as actually apparent in the organism) is a very familiar one. But the distinction being drawn here is a rather different one: not so much in appearance of form (i.e. the phenotype) as in behaviour, action (which we should perhaps designate 'praktotype'). For it is generally held that the main genetic contributions to our species were present in Homo sapiens sapiens from early times. But many aspects of behaviour emerge only very much later. They are based upon abilities which must be genetically determined, but they are called into effect, or triggered, by culturally determined antecedent circumstances.

It is widely accepted that with the appearance of fully modern Homo sapiens over much of the globe between 60,000 and 20,000 years ago, the genetic make-up, the hardware, of contemporary Homo sapiens sapiens was effectively established, and that there have been few subsequent changes of major significance. The Great Transition from Homo erectus to Homo sapiens had taken place. This hypothesis can certainly be questioned, and among the sociobiologists to have done so are E.O. Wilson and C.D. Darlington, who claim that the differences between the cultures and civilizations documented in different places over the past few millennia are in large measure due to significant differences in genetic composition between them. But such has not yet been shown to be the case, and the time intervals involved seem rather short for very radical genetic changes to have occurred.

The paradox here then is that the big changes in behaviour seem to have taken place many millennia after the alleged genetic changes which are said to have 'caused' them. The genetic changes apparently had little immediate effect. What then is their explanatory power?... The paradox here is that recent hunter-gatherers (genetically modern) have often manifested behaviour not greatly more complex than that of their genetically pre-modern predecessors. So just what is the genetic change conceived of as explaining?

There would seem to be a need to define very much more precisely what, in the Upper Palaeolithic in general, is so impressive that it can be seen to reflect what is generally regarded as the most decisive step in human evolution. (Localized phenomena, such as French cave art, cannot be allowed to obscure the broad picture.) If the emergence of our species was of such decisive significance, in a causal sense, for the subsequent unfolding of human history, why are the consequences of that emergence evident only in such a patchy way in space and time?

If we are considering human evolution and the emergence of Homo sapiens sapiens, it is necessary to call into evidence also the trajectory of cultural evolution after the crossing of the 40 ky threshold. If the hypotheses above are correct (and they are very widely accepted) the achievements of the subsequent evolutionary trajectory were already there in potential soon after the threshold was crossed. But only with the development of the 'software' — that is to say the cultural component as contrasted with the genetic 'hardware' component — was that potential actualized.

All of this amounts to a striking paradox. We are studying events which led up to a very significant threshold some 40,000 years ago. But the material record of that time or of the immediate aftermath does not bring out at all clearly that significance. The momentous nature of the transition can only be comprehended in the light of 40 millennia of subsequent cultural evolution. As implied earlier, it is a feature of teleological or goal-directed explanations that they explain changes at one time in terms of their consequences at a later time. But they are generally felt to be poor explanations, opening the way to circular arguments.

One is left here with the impression that there is something inappropriate with the way we are formulating these problems. Of course, in the future, developments in molecular genetics may well allow us to focus much more closely upon the relatively few genetic changes which were central to the great transition from Homo erectus to Homo sapiens sapiens. But even here, if we could define more closely the genotype, in terms of molecular composition, it is difficult to predict from that the nature of the phenotype. And it is just as difficult, having established the physical form of the phenotype, to proceed to predict its behaviour, the praktotype, and then further again, to