  From today’s perspective, it is difficult to imagine the ethical system that could justify De Soto’s subsequent actions. For four years his force wandered through what are now Florida, Georgia, North and South Carolina, Tennessee, Alabama, Mississippi, Arkansas, Texas, and Louisiana, looking for gold and wrecking most everything it touched. The inhabitants often fought back vigorously, but they were baffled by the Spaniards’ motives and astounded by the sight and sound of horses and guns. De Soto died of fever with his expedition in ruins. Along the way, though, he managed to rape, torture, enslave, and kill countless Indians. But the worst thing he did, some researchers say, was entirely without malice—he brought pigs.

  According to Charles Hudson, an anthropologist at the University of Georgia who spent fifteen years reconstructing De Soto’s path, the expedition built barges and crossed the Mississippi a few miles downstream from the present site of Memphis. It was a nervous time: every afternoon, one of his force later recalled, several thousand Indian soldiers approached in canoes to within “a stone’s throw” of the Spanish and mocked them as they labored. The Indians, “painted with ochre,” wore “plumes of many colors, having feathered shields in their hands, with which they sheltered the oarsmen on either side, the warriors standing erect from bow to stern, holding bows and arrows.” Utterly without fear, De Soto ignored the taunts and occasional volleys of arrows and poled over the river into what is now eastern Arkansas, a land “thickly set with great towns,” according to the soldier’s account, “two or three of them to be seen from one.” Each city protected itself with earthen walls, sizable moats, and dead-eye archers. In his brazen fashion, De Soto marched right in, demanded food, and marched out.

  After De Soto left, no Europeans visited this part of the Mississippi Valley for more than a century. Early in 1682 foreigners appeared again, this time Frenchmen in canoes. In one seat was René-Robert Cavelier, Sieur de la Salle. La Salle passed through the area where De Soto had found cities cheek by jowl. It was deserted—the French didn’t see an Indian village for two hundred miles. About fifty settlements existed in this strip of the Mississippi when De Soto showed up, according to Anne Ramenofsky, an archaeologist at the University of New Mexico. By La Salle’s time the number had shrunk to perhaps ten, some probably inhabited by recent immigrants. De Soto “had a privileged glimpse” of an Indian world, Hudson told me. “The window opened and slammed shut. When the French came in and the record opened up again, it was a transformed reality. A civilization crumbled. The question is, how did this happen?”

  Today most historians and anthropologists believe the culprit was disease. In the view of Ramenofsky and Patricia Galloway, an anthropologist at the University of Texas, the source of contagion was very likely not De Soto’s army but its ambulatory meat locker: his three hundred pigs. De Soto’s company was too small to be an effective biological weapon. Sicknesses like measles and smallpox would have burned through his six hundred men long before they reached the Mississippi. But that would not have been true for his pigs.

  Pigs were as essential to the conquistadors as horses. Spanish armies traveled in a porcine cloud; drawn by the supper trough, the lean, hungry animals circled the troops like darting dogs. Neither species regarded the arrangement as novel; they had lived together in Europe for millennia. When humans and domesticated animals share quarters, they are constantly exposed to each other’s microbes. Over time mutation lets animal diseases jump to people: avian influenza becomes human influenza, bovine rinderpest becomes human measles, horsepox becomes human smallpox. Unlike Europeans, Indians did not live in constant contact with many animals. They domesticated only the dog; the turkey (in Mesoamerica); and the llama, the alpaca, the Muscovy duck, and the guinea pig (in the Andes). In some ways this is not surprising: the New World had fewer animal candidates for taming than the Old. Moreover, few Indians carry the gene that permits adults to digest lactose, a form of sugar abundant in milk. Non-milk drinkers, one imagines, would be less likely to work at domesticating milk-giving animals. But this is guesswork. The fact is that what scientists call zoonotic disease was little known in the Americas. By contrast, swine, mainstays of European agriculture, transmit anthrax, brucellosis, leptospirosis, trichinosis, and tuberculosis. Pigs breed exuberantly and can pass diseases to deer and turkeys, which then can infect people. Only a few of De Soto’s pigs would have had to wander off to contaminate the forest.

  The calamity wreaked by the De Soto expedition, Ramenofsky and Galloway argued, extended across the whole Southeast. The societies of the Caddo, on the Texas-Arkansas border, and the Coosa, in western Georgia, both disintegrated soon after. The Caddo had a taste for monumental architecture: public plazas, ceremonial platforms, mausoleums. After De Soto’s army left, the Caddo stopped erecting community centers and began digging community cemeteries. Between the visits of De Soto and La Salle, according to Timothy K. Perttula, an archaeological consultant in Austin, Texas, the Caddoan population fell from about 200,000 to about 8,500—a drop of nearly 96 percent. In the eighteenth century, the tally shrank further, to 1,400. An equivalent loss today would reduce the population of New York City to 56,000, not enough to fill Yankee Stadium. “That’s one reason whites think of Indians as nomadic hunters,” Russell Thornton, an anthropologist at the University of California at Los Angeles, said to me. “Everything else—all the heavily populated urbanized societies—was wiped out.”
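  The arithmetic behind those comparisons is simple proportion. Here is a minimal sketch, assuming a present-day New York City population of about eight million; that last figure is an assumption added purely for illustration, while the Caddoan counts are the ones cited above.

```python
# Proportional arithmetic behind the figures in the paragraph above.
# The Caddoan counts (200,000; 8,500; 1,400) are the ones cited in the text;
# the ~8,000,000 figure for present-day New York City is an assumption
# added here purely for illustration.

caddo_precontact = 200_000
caddo_after_de_soto = 8_500
caddo_1700s = 1_400
nyc_today = 8_000_000          # assumed, for the comparison only

drop = 1 - caddo_after_de_soto / caddo_precontact
print(f"Drop between De Soto and La Salle: {drop:.1%}")   # ~95.8%, "nearly 96 percent"

survivors_fraction = caddo_1700s / caddo_precontact
print(f"NYC at the same survival rate: {nyc_today * survivors_fraction:,.0f}")   # ~56,000
```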

  Could a few pigs truly wreak this much destruction? Such apocalyptic scenarios have invited skepticism since Henry Dobyns first drew them to wide attention. After all, no eyewitness accounts exist of the devastation—none of the peoples in the Southeast had any form of writing known today. Spanish and French narratives cannot be taken at face value, and in any case say nothing substantial about disease. (The belief that epidemics swept through the Southeast comes less from European accounts of the region than from the disparities among those accounts.) Although the archaeological record is suggestive, it is also frustratingly incomplete; soon after the Spaniards visited, mass graves became more common in the Southeast, but there is yet no solid proof that a single Indian in them died of a pig-transmitted disease. Asserting that De Soto’s visit caused the subsequent collapse of the Caddo and Coosa may be only the old logical fallacy of post hoc ergo propter hoc.

  Not only do researchers like Dobyns, Perttula, and Ramenofsky argue that unrecorded pandemics swept through the Americas, they claim that the diseases themselves were of unprecedented deadliness. As a rule, viruses, microbes, and parasites do not kill the majority of their victims—the pest that wipes out its host species has a bleak evolutionary future. The influenza epidemic of 1918, until AIDS the greatest epidemic of modern times, infected tens of millions around the world but killed fewer than 5 percent of its victims. Even the Black Death, a symbol of virulence, was not as deadly as these epidemics are claimed to be. The first European incursion of the Black Death, in 1347–51, was a classic virgin-soil epidemic; mutation had just created the pulmonary version of the bacillus Yersinia pestis. But even then the disease killed perhaps a third of its victims. The Indians in De Soto’s path, if researchers are correct, endured losses that were anomalously greater. How could this be true? the skeptics ask.

  Consider, too, the Dobynsesque procedure for recovering original population numbers: applying an assumed death rate, usually 95 percent, to the observed population nadir. According to Douglas H. Ubelaker, an anthropologist at the Smithsonian’s National Museum of Natural History, the population nadir for Indians north of the Río Grande was around 1900, when their numbers fell to about half a million. Assuming a 95 percent death rate (which Ubelaker, a skeptic, does not), the precontact population of North America would have been 10 million. Go up a single percentage point, to a 96 percent death rate, and the figure jumps to 12.5 million—arithmetically creating more than two million people from a tiny increase in mortality rates. At 98 percent, the number bounds to 25 million. Minute changes in baseline assumptions produce wildly different results.
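  To see just how sensitive this back-calculation is, here is a minimal sketch of it, using the half-million nadir cited above; the death rates in the loop are the assumptions being varied, not established values.

```python
# Back-calculating a precontact population from an observed nadir:
#     precontact = nadir / (1 - death_rate)
# The half-million nadir is the figure cited above; the death rates are the
# assumptions being tested, not established values.

nadir = 500_000

for death_rate in (0.95, 0.96, 0.98):
    precontact = nadir / (1 - death_rate)
    print(f"assumed death rate {death_rate:.0%}  ->  precontact population {precontact:>12,.0f}")

# 95% -> 10,000,000
# 96% -> 12,500,000
# 98% -> 25,000,000
```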


  Worse, the figures have enormous margins of error. Rudolph Zambardino, a statistician at North Staffordshire Polytechnic, in England, has pointed out that the lack of direct data forces researchers into salvos of extrapolation. To approximate the population of sixteenth-century Mexico, for example, historians have only the official counts of casados (householders) in certain areas. To calculate the total population, they must adjust that number by the estimated average number of people in each home, the estimated number of homes not headed by a casado (and thus not counted), the estimated number of casados missed by the census takers, and so on. Each one of these factors has a margin of error. Unfortunately, as Zambardino noted, “the errors multiply each other and can escalate rapidly to an unacceptable magnitude.” If researchers presented their estimates with the proper error bounds, he said, they would see that the spread is far too large to constitute “a meaningful quantitative estimate.”
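  Zambardino’s point about compounding error can be illustrated with a toy interval calculation. Every number below is invented for the sake of the example; only the structure of the estimate, a chain of uncertain multipliers, comes from the argument above.

```python
# Toy illustration of how multiplied estimates compound their uncertainty.
# All of the factors and ranges here are hypothetical; only the structure
# (population = counted casados * persons per household * undercount
# adjustments) reflects the argument in the text.

def interval_product(factors):
    """Multiply a list of (low, high) intervals of positive values."""
    low, high = 1.0, 1.0
    for lo, hi in factors:
        low *= lo
        high *= hi
    return low, high

factors = [
    (800_000, 1_200_000),   # counted casados, +/- 20% (hypothetical)
    (4.0, 6.0),             # persons per household (hypothetical)
    (1.1, 1.4),             # adjustment for households without a casado (hypothetical)
    (1.05, 1.25),           # adjustment for casados missed by census takers (hypothetical)
]

low, high = interval_product(factors)
print(f"Estimated population: {low:,.0f} to {high:,.0f}")
# The high end comes out several times the low end -- too wide a spread,
# as Zambardino argued, to count as a meaningful quantitative estimate.
```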

  Extraordinary claims require extraordinary evidence, scientists say. Other episodes of mass fatality are abundantly documented: the Black Death in Europe, the post-collectivization famine in the Soviet Union, even the traffic in African slaves. Much less data support the notion that Old World bacteria and viruses turned the New World into an abattoir.* Such evidence as can be found lies scribbled in the margins of European accounts—it is, as Crosby admitted, “no better than impressionistic.”

  “Most of the arguments for the very large numbers have been theoretical,” Ubelaker told me. “But when you try to marry the theoretical arguments to the data that are available on individual groups in different regions, it’s hard to find support for those numbers.” Archaeologists, he said, keep searching for the settlements in which those millions of people supposedly lived. “As more and more excavation is done, one would expect to see more evidence for [dense populations] than has thus far emerged.” Dean R. Snow, of Pennsylvania State, repeatedly examined precontact sites in eastern New York and found “no support for the notion that ubiquitous pandemics swept the region.” In the skeptics’ view, Dobyns and other High Counters (as proponents of large pre-Columbian numbers have been called) are like people who discover an empty bank account and claim from its very emptiness that it once contained millions of dollars. Historians who project large Indian populations, Low Counter critics say, are committing the intellectual sin of arguing from silence.

  Given these convincing rebuttals, why have the majority of researchers nonetheless become High Counters? In arguing that Indians died at anomalously high rates from European diseases, are researchers claiming that they were somehow uniquely vulnerable? Why hypothesize the existence of vast, super-deadly pandemics that seem unlike anything else in the historical record? The speed and scale of the projected losses “boggle the mind,” observed Colin G. Calloway, a historian at Dartmouth—one reason, he suggested, that researchers were so long reluctant to accept them. Indeed, how can one understand losses of such unparalleled scope? And if the European entrance into the Americas five centuries ago was responsible for them, what moral reverberations does this have today?

  THE GENETICS OF VULNERABILITY

  In August 1967 a missionary’s two-year-old daughter came down with measles in a village on the Toototobi River in Brazil, near the border with Venezuela. She and her family had just returned from the Amazonian city of Manaus and had been checked and cleared by Brazilian doctors before departure. Nonetheless the distinctive spots of measles emerged a few days after the family’s arrival on the Toototobi. The village, like many others in the region, was populated mainly by Yanomami Indians, a forest society on the Brazil-Venezuela border that is among the least Westernized on earth. They had never before encountered the measles virus. More than 150 Yanomami were in the village at the time. Most or all caught the disease. Seventeen died despite the horrified missionaries’ best efforts. And the virus escaped and spread throughout the Yanomami heartland, carried by people who did not know they had been exposed.

  Partly by happenstance, the U.S. geneticist James Neel and the U.S. anthropologist Napoleon Chagnon flew into Yanomami country in the midst of the epidemic. Neel, who had long been worried about measles, was carrying several thousand doses of vaccine. Alas, the disease had preceded them. They frantically tried to create an epidemiological “firebreak” by vaccinating ahead of the disease. Despite their efforts, the affected villages had a mean death rate of 8.8 percent. Almost one out of ten people died from a sickness that in Western societies was just a childhood annoyance.

  Later Neel concluded that the high death rate was in part due to grief and despair, rather than the virus itself. Still, the huge toll was historically unprecedented. The implication, implausible at first glance, was that Indians in their virgin-soil state were more vulnerable to European diseases than virgin-soil Europeans would have been. Perhaps surprisingly, there is some scientific evidence that Native Americans were for genetic reasons unusually susceptible to foreign microbes and viruses—one reason that researchers believe that pandemics of Dobynsian scale and lethality could have occurred.

  Here I must make a distinction between two types of susceptibility. The first is the lack of acquired immunity—immunity gained from a previous exposure to a pathogen. People who have never had chicken pox are readily infected by the virus. After they come down with the disease, their immune system trains itself, so to speak, to fight off the virus, and they never catch it again, no matter how often they are exposed. Most Europeans of the day had been exposed to smallpox as children, and those who didn’t die were immune. Smallpox and other European diseases didn’t exist in the Americas, and so every Indian was susceptible to them in this way.

  In addition to having no acquired immunity (the first kind of vulnerability), the inhabitants of the Americas had immune systems that some researchers believe were much more restricted than European immune systems. If these scientists are correct, Indians as a group had less innate ability to defend themselves against epidemic disease (the second kind of vulnerability). The combination was devastating.

  The second type of vulnerability stems from a quirk of history. Archaeologists dispute the timing and manner of Indians’ arrival in the Americas, but almost all researchers believe that the initial number of newcomers must have been small. Their gene pool was correspondingly restricted, which meant that Indian biochemistry was and is unusually homogeneous. More than nine out of ten Native Americans—and almost all South American Indians—have type O blood, for example, whereas Europeans are more evenly split between types O and A.

  Evolutionarily speaking, genetic homogeneity by itself is neither good nor bad. It can be beneficial if it means that a population lacks deleterious genes. In 1491, the Americas were apparently free or almost free of cystic fibrosis, Huntington’s chorea, newborn anemia, schizophrenia, asthma, and (possibly) juvenile diabetes, all of which have some genetic component. Here a limited gene pool may have spared Indians great suffering.

  Genetic homogeneity can be problematic, too. In the 1960s and 1970s Francis L. Black, a virologist at Yale, conducted safety and efficacy tests of a new, improved measles vaccine among South American Indians. During the tests he drew blood samples from the people he vaccinated, which he later examined in the laboratory. When I telephoned Black, he told me that the results were “thought-provoking.” Each person’s immune system responded robustly to the vaccine. But the native population as a whole had a “very limited spectrum of responses.” And that, he said, “could be a real problem in the right circumstances.” For Indians, those circumstances arrived with Columbus.

  Black was speaking of human leukocyte antigens (HLAs), molecules inside most human cells that are key to one of the body’s two main means of defense. Cells of all sorts are commonly likened to biochemical factories, busy ferments in which dozens of mechanisms are working away in complex sequences that are half Rube Goldberg, half ballet. Like well-run factories, cells are thrifty; part of the cellular machinery chops up and reuses anything that is floating around inside, including bits of the cell and foreign invaders such as viruses. Not all of the cut-up pieces are recycled. Some are passed on to HLAs, special molecules that transport the snippets to the surface of the cell.

  Outside, prowling, are white blood cells—leukocytes, to researchers. Like minute scouts inspecting potential battle zones, leukocytes constantly scan cell walls for the little bits of stuff that HLAs have carried there, trying to spot anything that doesn’t belong. When a leukocyte spots an anomaly—a bit of virus, say—it destroys the infected or contaminated cell immediately. Which means that unless an HLA lugs an invading virus to where the leukocyte can notice it, that part of the immune system cannot know it exists, let alone attack it.

  HLAs carry their burdens to the surface by fitting them into a kind of slot. If the snippet doesn’t fit into the slot, the HLA can’t transport it, and the rest of the immune system won’t be able to “see” it. All people have multiple types of HLA, which means that they can bring almost every potential problem to the attention of their leukocytes. Not every problem, though. No matter what his or her genetic endowment, no one person’s immune system has enough different HLAs to identify every strain of every virus. Some things will always escape notice. Imagine someone sneezing in a crowded elevator, releasing into the air ten variants of a rhinovirus, the kind of virus that causes the common cold. (Viruses mutate quickly and are commonly present in the body in multiple forms, each slightly different from the others.) For simplicity’s sake, suppose that the other elevator passengers inhale all ten versions of the virus. One man is lucky: he happens to have HLAs that can lock onto and carry pieces of all ten variants to the cell surface. Because his white blood cells can identify and destroy the infected cells, this man doesn’t get sick. Not so lucky is the woman next to him: she has a different set of HLAs, which are able to pick up and transport only eight of the ten varieties. The other two varieties escape the notice of her leukocytes and go on to give her a howling cold (eventually other immune mechanisms kick in and she recovers). These disparate outcomes illustrate the importance to a population of having multiple HLA profiles; one person’s HLAs may miss a particular bug, but another person may be equipped to combat it, and the population as a whole survives.
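  That closing point, that a population with varied HLA profiles is safer than one in which everyone shares a single profile, can be sketched as a toy simulation. All of the parameters below (a pool of forty HLA variants, six per person, and so on) are arbitrary illustrations, not real immunological values.

```python
# Toy model of why HLA diversity protects a population. Each person carries a
# handful of HLA variants; a person is vulnerable to a pathogen if none of his
# or her variants can present that pathogen's fragments. Parameters arbitrary.

import random

random.seed(0)

HLA_POOL = 40          # distinct HLA variants in the species (arbitrary)
HLA_PER_PERSON = 6     # variants carried by one individual (arbitrary)
PRESENTABLE_BY = 6     # variants able to present any given pathogen (arbitrary)
POP_SIZE = 1_000
N_PATHOGENS = 200      # pathogens arriving over time

def make_population(diverse):
    """Each person is represented as the set of HLA variants he or she carries."""
    if diverse:
        return [set(random.sample(range(HLA_POOL), HLA_PER_PERSON))
                for _ in range(POP_SIZE)]
    shared = set(random.sample(range(HLA_POOL), HLA_PER_PERSON))
    return [shared] * POP_SIZE

def worst_outbreak(population):
    """Largest fraction of the population left vulnerable by any one pathogen."""
    worst = 0.0
    for _ in range(N_PATHOGENS):
        presented = set(random.sample(range(HLA_POOL), PRESENTABLE_BY))
        vulnerable = sum(1 for person in population if not (person & presented))
        worst = max(worst, vulnerable / POP_SIZE)
    return worst

for label, diverse in (("diverse", True), ("homogeneous", False)):
    pop = make_population(diverse)
    print(f"{label:12s}: worst single pathogen left {worst_outbreak(pop):.0%} vulnerable")

# Typically the diverse population never loses more than a fraction of its
# members to any one pathogen, while the shared-profile population is sooner
# or later hit by a pathogen that leaves everyone exposed at once.
```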