
  We also need to be cautious before jumping to conclusions about the soul when there is such a clear and powerful motive for us to want to believe in it. (The same argument applies to other marginal phenomena, such as ghosts, telepathy, and UFOs, all of which appeal to our need for a "higher" truth.) Potentially, the soul is a lifeline, a way of avoiding the terrifying finality of death. Imagine what a difference it would make to us psychologically if we knew, as certainly as we know we have a brain, that there is part of us that cannot die. We have a vested interest in the soul hypothesis being correct. And this fact alone is sufficient (whatever other elements may be involved) to account for the global, intercultural, long-standing belief in souls and an afterlife – a belief that has flourished in spite of a conspicuous lack of evidence.

  Clearly, there is something very different between a lifeless corpse and a living, breathing, sentient person. But what is different? During life, is there an aspect of us that is above and beyond the mere workings of a biological machine? Or are we, after all, nothing more than a temporary aggregation of chemicals and cells?

  We have a strong tendency to feel as if we are something extra beyond our bodies and brains – that we are, in effect, an intelligent life force dwelling within an organic shell. This makes it easy to go along with the suggestion of dualists such as Descartes, that the mind is not just an upshot of the functioning brain but, on the contrary, is a deeper and further fact. In the dualist's scheme, each of us has – or is – a "Cartesian ego" that inhabits the material brain. And from this position, in which the mind is held to be distinct from the living brain, it is a short (though not inevitable) step to the assertion that the mind is capable of an entirely independent existence, as a disembodied soul.

  Dualism is simple and desirable to believe in. But then, from a child's point of view, so is the Easter Bunny. In time, we come to appreciate (often with regret) that an extremely large, beneficent rabbit is not essential to explaining the origin of a surfeit of concealed eggs at Easter. Similarly, most neurologists have now reached the conclusion that a Cartesian ego is not needed to account for the workings of the mind and the sense of self.

  It is a consensus fast approaching unanimity in scientific circles that "we" (our selves) are no more than the consequences of our brains at work. In the modern view, we are mere epiphenomena or, more charitably perhaps, culminations, of the greatest concentration of orchestrated molecular activity in the known cosmos. And although it is true we don't yet know exactly how the trick is done – these are still frontier days in the brain sciences – it is widely held to be only a matter of time before those who are teasing apart the circuitry of the human cortex lay bare the hidden props of the illusion. The situation is as brutally materialistic as that. There is not the slightest bit of credible evidence to suggest there is more to your self, to the feeling of being you, than a stunningly complex pattern of chemical and electrical activity among your neurons. No soul, no astral spirit, no ghost in the machine, no disembodied intelligence that can conveniently bail out when the brain finally crashes to its doom. If science is right, then you and I are just transitory mental states of our brains.

  * * *

  We think of ourselves as being definite people, unique individuals. But, at birth, within the constraints of our genetic makeup, we are capable of becoming anyone. For the first year or two of life outside the womb, our brains are in the most pliable, impressionable, and receptive state they will ever be in. At the neural level this is apparent in the fact that we are all born with massively overwired brains that contain many more embryonic intercellular links than any one individual ever needs. Such was the surprising finding of the first extensive electron microscope study of human neural synapses (brain cell connections) by pediatric neurologist Peter Huttenlocher of the University of Chicago's Pritzker School of Medicine in 1979. By staining and examining tissues from the frontal cortex, Huttenlocher found that the infant brain has, on average, about 50 percent more synaptic connections than an adult brain, though the immature synapses are different in shape and much less well defined. It is as if a wide selection of the potentialities of the human race, acquired over millions of years, is made available to each of us at birth.

  During the first twelve months of life, a remarkable 60 percent of a baby's energy intake goes toward fueling the development of its brain. In this critical period, huge numbers of embryonic connections between neurons are lost (through lack of use) while others are reinforced and developed (through repeated use). From being an incredibly sensitive, information-absorbent, but otherwise useless lump of flesh, the brain rapidly acquires a highly patterned infrastructure that encodes a particular set of memories and beliefs about the world. Each brain loses the potential to become anyone, but gains, instead, the much more useful ability to conceive of itself as being a certain someone.
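  The "use it or lose it" logic of this pruning can be caricatured in a few lines of code. The following is purely an illustrative sketch, not drawn from the text: the number of connections, the time scale, and the update rule are all invented, and real synaptic development is vastly more complicated. The idea is simply that links which are repeatedly used gain strength, idle links decay, and any link that decays away is removed.

```python
import random

random.seed(0)

# Toy model of activity-dependent pruning (illustrative assumptions only).
# Each "synapse" is given a fixed chance of being used on any given day;
# use strengthens it, disuse lets it decay, and a link that decays to zero is lost.
connections = {i: {"weight": 0.1, "use_prob": random.random()} for i in range(150)}

for day in range(365):                      # a caricature of the first year of life
    for i in list(connections):
        syn = connections[i]
        if random.random() < syn["use_prob"]:
            syn["weight"] += 0.01           # repeated use reinforces the link
        syn["weight"] -= 0.005              # every link decays slightly each day
        if syn["weight"] <= 0:              # links that fall into disuse are pruned
            del connections[i]

print(f"{len(connections)} of 150 connections survive pruning")
```

  Run with these made-up numbers, roughly the better-used half of the starting connections survive while the rest wither away, echoing the observation that the infant brain begins with far more synapses than the adult brain ultimately retains.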

  This transformation might seem almost magical if it weren't for the fact that we know, at least in general terms, how and why it comes about. A brain that was simply passive, naively experiencing its environment, reflecting everything but interpreting nothing, like a grinning Buddha, would quickly end up as a juicy morsel inside someone else's stomach. And so it would die, in blissful ignorance, before it could pass on its genes. And so there would be fewer grinning Buddhas in the future, but plenty more non-Buddha Buddha-eaters.

  A real human brain starts out like a Buddha, all-receptive. But four billion years of ultrapragmatic live-and-let-die evolution have ensured that it immediately, under preprogrammed genetic control, gets down to the business of metamorphosing into a tough, practical survival machine. Its onboard genetic commands swiftly guide it in the process of condensing from a sort of gaseous state of total, nondiscriminating naivety to a sharp, crystalline state of effective self-centeredness with the wits and street savvy needed to stay alive.

  Unfortunately, we are absolutely, pathetically helpless throughout the period that this momentous development takes place, which is why a lengthy, protective, nurturing environment is so essential to humans (and other brainy animals). Simpleminded creatures, like amoebae, ants, and even alligators, come into the world "knowing" as much about their self-boundaries as they will ever know, though this knowledge is based purely on dumb reflexes and instinct. But our self-knowledge is a much more elaborate affair. Survival in the Homo niche demands being able to experience the self as an agent in the world, as an individual with the power to plan and predict and decide among alternative courses of action. Such knowledge can only be garnered through individual experience, by watching and learning from others who are already proficient at being the most ruthlessly effective survival machines in the known universe – men and women.

  A crucial part of the development of our self-image involves the brain latching onto the game rules by which the individuals around it play. During infancy, and continuing into childhood and adolescence, the brain organizes itself around the prevalent attitudes and beliefs to which it is exposed. But it goes beyond building a general sociocultural belief system; otherwise everyone within a given race or clan would turn out pretty much the same. The brain personalizes its belief system by consolidating numerous, often highly subtle impressions it picks up from others about its particular character, intelligence, and status; its bodily appearance, gender role, and capabilities. Whether these impressions, received from parents, siblings, friends, and other people who are most influential during childhood, are, in any absolute sense, "right" or "wrong" is not the issue. The brain will take them onboard whatever their merits, because they have come from the only authorities it recognizes and has access to. As these specific, private details are absorbed and assimilated, they begin to form the personal dimension of the brain's emerging worldview. Consequently, the brain starts to think of itself not just as being in a particular world, but as being a particular someone in that world – a person, an agent with powers of its own, with qualities, both good and bad, desirable and undesirable, by which it is uniquely distinguished from all others.

  With the rudiments of a belief system in place, the brain starts to interpret and evaluate everything that comes to its attention in terms of this resident catechism of received wisdom. Every sensation and perception, every incident and event, every word, gesture, and action of other people, is construed within the context of how the brain understands the world and itself to be. Thus the brain steadily becomes more and more dogmatic, opinionated, and biased in its thinking. It tends to hold on to – that is, to remember – experiences that comply with and support its acquired worldview, while at the same time it tends to reject or deny anything that seems incongruous with its system of beliefs. So, the emerging belief system is further strengthened and validated. And in this way the brain builds for itself an island of stability, a rock of predictability, in the midst of a vast ocean of potentially fatal chaos and inexplicable change.

  We are inventions of our genes, our culture, our society, our particular upbringing, but oddly enough we're not aware of being so utterly contrived. We recognize that other people in other places and times may hold views different from our own. But we tend greatly to underestimate the extent to which we ourselves are caught up, constrained, and molded by the paradigms imposed upon us. Our indoctrination begins at such an early age and is so all-pervasive that the rules and theories we acquire become hard-wired into our brains. In particular, the power of our closest caretakers to shape us is awesome. Our parents or guardians reflect back at us, with approval, those sounds and actions we make as infants which are considered most desirable and appropriate in progressing toward the people they want us to become (just as they, too, were once similarly shaped). Subsequently, we fail to recognize that the beliefs about the world and about ourselves which we carry around with us like sacred relics are tentative, and possibly completely wrong. Instead we go through life fully convinced that they are true. We come to share and accept with unquestioning obedience the concepts of normality held by those around us, because these concepts are literally part of ourselves: we are their embodiment.

  Our early environment and interpersonal relationships determine the precise neural circuitry of our brains, and this circuitry in turn determines who we are. Having encoded a particular model of reality, the brain, without "us" even realizing it, gives a spin to every sight, sound, smell, taste, and touch that enters through the senses. In fact, the conditioning begins even before the conscious brain goes into action. Evolution has furnished us with a range of sensory repression systems that save us from having to be aware of, and thereby hopelessly overloaded and distracted by, every minute detail of our surroundings. So, just as the president has a team of minions to deal with all but the most crucial, relevant paperwork, the brain is able to deploy its attention, its executive power, where most needed by having the bulk of sensory input weeded out at a lower level.

  Human vision, for instance, is an active process in which signals and perceptions are highly filtered, screened, and manipulated before they ever reach the higher centers of the cortex. We may feel as if we are directly and immediately aware of whatever images fall upon our retinas, but we are mistaken. Most of the handling of data from our eyes takes place at a subconscious level through a variety of largely independent specialized subsystems. And, strange though it may seem, some of the visual subsystems in our brains produce an output that "we" cannot see. They contribute to brain function and even to our awareness of the world, but no amount of introspection can make us aware of the subsystems themselves. One of the ways this is made most strikingly clear is by the strange neurological condition known as blindsight. Following some kinds of injury to the visual cortex, people may become blind in one half of their visual field. But although they claim an inability to see anything in their blind half, they sometimes seem capable of absorbing information from that half. For example, if asked to point to a spot of light on a screen on their blind side they will say they cannot see it at all and that they are just guessing its position. Yet they are able to point to it correctly far more often than would be expected by chance alone. Many other investigations, too, over the years have shown that much of what is actually registered by our eyes and brain escapes our conscious attention.
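  To put a number on "far more often than would be expected by chance alone," here is a hypothetical worked example; the figures are invented for illustration and are not taken from any particular blindsight study. If a patient must point to one of four possible target positions, pure guessing succeeds on about a quarter of trials, and the binomial distribution tells us how unlikely a given score is under guessing alone.

```python
from math import comb

# Hypothetical blindsight-style test (all numbers are invented for illustration).
# The patient points to one of 4 possible positions on 30 trials;
# pure guessing would succeed with probability 0.25 on each trial.
n_trials, n_correct, p_chance = 30, 20, 0.25

# Probability of scoring n_correct or more by luck alone (upper binomial tail).
p_value = sum(
    comb(n_trials, k) * p_chance**k * (1 - p_chance)**(n_trials - k)
    for k in range(n_correct, n_trials + 1)
)
print(f"Chance of {n_correct}+ correct out of {n_trials} by guessing: {p_value:.1e}")
```

  A tail probability this small (only a few chances in a million for the invented figures above) is why experimenters conclude that real visual information is guiding the pointing, even though the patient sincerely reports seeing nothing.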

  Survival for our ancestors would have been impossible if every datum of sensory input had been allowed to gain access to the inner sanctum of consciousness. By various means, then, we are shielded from the endless flux, the seething, ceaseless commotion both outside and among our neurons, the fact that neither we nor the world are ever the same from one moment to the next. Only when the integration is complete, and the flux has been smoothed and transformed into stability, does a final, coherent picture appear in our awareness.

  All human beings are subject to similar biological and genetic conditioning. A Pygmy's eye works in the same way as a Parisian's; a neurologist would be at a loss to distinguish between the brain of a Japanese and that of a Scot. But the impact of different societies and cultures upon the developing self is much more diverse. We tend to underestimate this impact and so assume that people have always held their individuality and mortality clearly in mind, as we Westerners do today. However, looking at the history of death, and of how death was dealt with by people in the past, gives some clues to a possible evolution of self-awareness even over the past few hundred years. This is not to say that our relatively recent ancestors had no concept at all of themselves as unique individuals; to believe that humans have not always been self-aware to some degree is radical in the extreme. (Just such a view is expressed by Julian Jaynes in his book The Origin of Consciousness in the Breakdown of the Bicameral Mind. Jaynes, an American psychologist, has suggested that human self-awareness originated within roughly the last three thousand years.) But it does seem as if there was a trend toward a more intensely focused awareness of self, especially during the early modern period.

  In medieval Europe, society was rigidly structured. Everyone knew their place in the scheme of things – a scheme based on lineage, gender, and social class. There was virtually no chance of escaping one's birthright, whether as a peasant or a feudal lord, no scope for social mobility. To appreciate more readily the mentality of this time we have to recognize that our modern emphasis on the fundamental, overriding importance of the individual is not universal. Medieval attitudes lacked this emphasis, in large measure because of the overarching influence of the Church of Rome. The medieval faith in Catholicism was absolute. But what mattered in this faith was not the individual's role but the broad cosmic sweep of holy law and salvation. Personalities, individual differences and opinions, were considered irrelevant and undesirable in the face of such totalitarian religious belief. And this downplaying of the personal is reflected in the fact that medieval times produced virtually no autobiographies and very few biographies – and then only inaccurate, stereotypical lives of saints. In these writings, the psychology of the person makes no appearance; all that comes across is a cardboard cutout of a man or woman, an anodyne approximation to the Christian ideal, unashamedly embellished with archetypal miracle tales.

  By the end of the Middle Ages, however, a change was evident. Instrumental in this was the rise of Protestantism, particularly in its most extreme form – Puritanism. John Calvin preached that some, "the Elect," were predestined to enter heaven, while most were doomed to spend eternity in hell. Absurd and intellectually offensive though this idea may appear now, it had the effect at the time of casting the individual into sharp relief, of differentiating between one person and another. And, in general, Protestantism of every kind argued for the private nature of religion. Catholics did not need, and were not expected, to face God alone. Priests, nuns, saints, the Virgin Mary, and all manner of rituals were on hand to intercede for the masses, so that the masses didn't have to think too hard or deeply for themselves, didn't have to become too involved as individuals or worry too much about the implications to themselves of the great issues of life, death, and redemption. Protestantism, by contrast, sought to diminish the gap between layperson and God, while Puritanism sought to close it completely. The Puritan faced God alone – in the privacy of the individual mind.

  And there were soon to be other factors at work in the West, helping to turn the spotlight even more fully on each man and woman, forcing the self out of hiding. Not the least of these was the Industrial Revolution and, at its heart, that great engine of change, both literally and figuratively. Suddenly, the old agricultural lifestyle in which son did like father, and daughter like mother, generation after generation, and in which it was frowned upon and futile for the individual to act any differently from the rest, was swept away. And in its place came development (often for the worse for those who lived in the new slums) and technological progress, the rise of personal ambition, of the entrepreneur, the winner and loser, and a new emphasis on individuality and concern for one's own welfare. Suddenly, it was good and potentially profitable to be an individual, to go one's own way, to be different from the crowd. And that attitude has not altered to this day.

  In the modern West, we revere the self, we set it up on a pedestal. There has never before been a culture, a time, in which people focused so obsessively on the well-being and elevation of their egos. And what do these egos turn out to be? Nothing, says science, but artifacts of the brain. We – our feelings of being someone in the world – survive as long as the brain lives. And when the brain dies ...

  Our prospects look bleak. The very mode of inquiry that has helped shape the modern world and that we have come to rely upon so much informs us that, in effect, we are the dreams of carbon machines. There is no real substance to us, no deeper, further fact to being a person than just one feeling after another after another. Impressions, sensations, thoughts, emotions, continually well up into awareness and the sequence of these experiences, bound together by that fragile thing called memory, is projected by the brain as you and me.