The cats' timidity blossoms at about one month of age, which is the point when their amygdala matures enough to take control of the brain circuitry to approach or avoid. One month in kitten brain maturation is akin to eight months in a human infant; it is at eight or nine months, Kagan notes, that "stranger" fear appears in babies—if the baby's mother leaves a room and there is a stranger present, the result is tears. Timid children, Kagan postulates, may have inherited chronically high levels of norepinephrine or other brain chemicals that activate the amygdala and so create a low threshold of excitability, making the amygdala more easily triggered.
One sign of this heightened sensitivity is that, for example, when young men and women who were quite shy in childhood are measured in a laboratory while exposed to stresses such as harsh smells, their heart rate stays elevated much longer than for their more outgoing peers—a sign that surging norepinephrine is keeping their amygdala excited and, through connected neural circuits, their sympathetic nervous system aroused.4 Kagan finds that timid children have higher levels of reactivity across the range of sympathetic nervous system indices, from higher resting blood pressure and greater dilation of the pupils, to higher levels of norepinephrine markers in their urine.
Silence is another barometer of timidity. Whenever Kagan's team observed shy and bold children in a natural setting—in their kindergarten classes, with other children they did not know, or talking with an interviewer—the timid children talked less. One timid kindergartener would say nothing when other children spoke to her, and spent most of her day just watching the others play. Kagan speculates that a timid silence in the face of novelty or a perceived threat is a sign of the activity of a neural circuit running between the forebrain, the amygdala, and nearby limbic structures that control the ability to vocalize (these same circuits make us "choke up" under stress).
These sensitive children are at high risk for developing an anxiety disorder such as panic attacks, starting as early as sixth or seventh grade. In one study of 754 boys and girls in those grades, 44 were found to have already suffered at least one episode of panic, or to have had several preliminary symptoms. These anxiety episodes were usually triggered by the ordinary alarms of early adolescence, such as a first date or a big exam—alarms that most children handle without developing more serious problems. But teenagers who were timid by temperament and who had been unusually frightened by new situations developed panic symptoms such as heart palpitations, shortness of breath, or a choking feeling, along with the feeling that something horrible was going to happen to them, such as going crazy or dying. The researchers believe that while the episodes were not significant enough to warrant the psychiatric diagnosis "panic disorder," they signal that these teenagers would be at greater risk for developing the disorder as the years went on; many adults who suffer panic attacks say the attacks began during their teen years.5
The onset of the anxiety attacks was closely tied to puberty. Girls with few signs of puberty reported no such attacks, but of those who had gone through puberty, about 8 percent said they had experienced panic. Once they had had such an attack, they were prone to developing the dread of a recurrence that leads people with panic disorder to shrink from life.
NOTHING BOTHERS ME: THE CHEERFUL TEMPERAMENT
In the 1920s, as a young woman, my aunt June left her home in Kansas City and ventured on her own to Shanghai—a dangerous journey for a solitary woman in those years. There June met and married a British detective in the colonial police force of that international center of commerce and intrigue. When the Japanese captured Shanghai at the outset of World War II, my aunt and her husband were interned in the prison camp depicted in the book and movie Empire of the Sun. After surviving five horrific years in the prison camp, she and her husband had, literally, lost everything. Penniless, they were repatriated to British Columbia.
I remember as a child first meeting June, an ebullient elderly woman whose life had followed a remarkable course. In her later years she suffered a stroke that left her partly paralyzed; after a slow and arduous recovery she was able to walk again, but with a limp. In those years I remember going for an outing with June, then in her seventies. Somehow she wandered off, and after several minutes I heard a feeble yell—June crying for help. She had fallen and could not get up on her own. I rushed to help her up, and as I did so, instead of complaining or lamenting she laughed at her predicament. Her only comment was a lighthearted "Well, at least I can walk again."
By nature, some people's emotions seem, like my aunt's, to gravitate toward the positive pole; these people are naturally upbeat and easygoing, while others are dour and melancholy. This dimension of temperament—ebullience at one end, melancholy at the other—seems linked to the relative activity of the right and left prefrontal areas, the upper poles of the emotional brain. That insight has emerged largely from the work of Richard Davidson, a University of Wisconsin psychologist. He discovered that people who have greater activity in the left frontal lobe, compared to the right, are by temperament cheerful; they typically take delight in people and in what life presents them with, bouncing back from setbacks as my aunt June did. But those with relatively greater activity on the right side are given to negativity and sour moods, and are easily fazed by life's difficulties; in a sense, they seem to suffer because they cannot turn off their worries and depressions.
In one of Davidson's experiments, the volunteers with the most pronounced activity in the left frontal areas were compared with the fifteen who showed the most activity on the right. Those with marked right frontal activity showed a distinctive pattern of negativity on a personality test: they fit the caricature portrayed by Woody Allen's comedy roles, the alarmist who sees catastrophe in the smallest thing—prone to funks and moodiness, and suspicious of a world they saw as fraught with overwhelming difficulties and lurking dangers. In contrast to their melancholy counterparts, those with stronger left frontal activity saw the world very differently. Sociable and cheerful, they typically felt a sense of enjoyment, were frequently in good moods, had a strong sense of self-confidence, and felt rewardingly engaged in life. Their scores on psychological tests suggested a lower lifetime risk for depression and other emotional disorders.6
People who have a history of clinical depression, Davidson found, had lower levels of brain activity in the left frontal lobe, and more on the right, than did people who had never been depressed. He found the same pattern in patients newly diagnosed with depression. Davidson speculates that people who overcome depression have learned to increase the level of activity in their left prefrontal lobe—a speculation awaiting experimental testing.
Though his research is on the 30 percent or so of people at the extremes, just about anyone can be classified by their brain wave patterns as tending toward one or the other type, says Davidson. The contrast in temperament between the morose and the cheerful shows up in many ways, large and small. For example, in one experiment volunteers watched short film clips. Some were amusing—a gorilla taking a bath, a puppy at play. Others, like an instructional film for nurses featuring grisly details of surgery, were quite distressing. The somber, right-hemisphere group found the happy movies only mildly amusing, but felt extreme fear and disgust in reaction to the surgical blood and gore. The cheerful group had minimal reactions to the surgery; their strongest reactions were of delight when they saw the upbeat films.
Thus we seem by temperament primed to respond to life in either a negative or a positive emotional register. The tendency toward a melancholy or upbeat temperament—like that toward timidity or boldness—emerges within the first year of life, a fact that strongly suggests it too is genetically determined. Like most of the brain, the frontal lobes are still maturing in the first few months of life, and so their activity cannot be reliably measured until the age of ten months or so. But in infants that young, Davidson found that the activity level of the frontal lobes predicted whether they would cry when their mothers left the room. The correlation was virtually 100 percent: of dozens of infants tested this way, every infant who cried had more brain activity on the right side, while those who did not had more activity on the left.