The Once and Future World

  When McClenachan goes diving in those waters today, she no longer sees with the same eyes that she used to. “It’s like there’s all these ghosts lurking around,” she says. She has become, she tells me, “a big buzzkill for family vacations.”

  Memory conspires against nature. The forgetting can begin in the instant that a change takes place: the human mind did not evolve to see its surroundings—what we now so clinically refer to as “the environment”—as the focus of our attention, but rather as the backdrop against which more interesting things take place. We generally don’t notice small or gradual changes. Our minds would otherwise be crowded with turning leaves and the paths of clouds across the sky—a beguiling madness, but a madness all the same.

  Even a dramatic event can be overlooked in the moment, through a phenomenon known as “change blindness.” In the most famous study of its kind, test subjects who were asked to follow the path of a ball being passed among a group of basketball players consistently failed to notice when a person in a gorilla suit danced through the scene. It’s a question of where you direct your attention: keep your eyes on the ball and you’re likely to miss the dancing gorilla. While being guided through a similar demonstration by Daniel Simons, a University of Illinois psychologist, I failed to notice a slow but steady change to the background of a scene—despite the fact that I was aware this was the purpose of the video. In fact, I missed the change three times straight. Then Simons patiently suggested that I run the video in fast-forward, which made the change abrupt enough that I finally saw that about one-fifth of the wheat field I was looking at was being slowly reduced to stubble.

  Most people do not believe that such experiments would fool them. In reality, change blindness affects even the expert eye. A study of football fans found they were 110 percent more likely than non-fans to spot changes to a football scene—but only if those changes were relevant to the game. Make a change in the background, and both fans and non-fans were likely to miss it. Yet the belief that your own eyes will not fool you is persistent enough that psychologists have given that condition a name too: “change blindness blindness.” If you don’t believe that you are capable of missing significant changes to a scene, then you won’t heighten your awareness in order not to miss them—which means that you probably will. Change blindness blindness is the failure to see that we so often fail to see.

  We are an incredibly adaptable species. Whether or not we notice a change in our circumstances, the change itself is real, and we quickly adapt to the new conditions. Once we’ve done so, there is little point in holding on to memories of how things used to be. The shifting baseline syndrome applies as much to the way we forget what houses cost ten years ago or fail to notice that fast-food portion sizes have tripled since the 1970s as it does to the natural world. Out of sight, out of mind: ordinary amnesia.

  The historian Clive Ponting sees the whole story of civilization as a series of adaptations forced on us as we rendered the world around us less and less livable. What he calls “the first great transition”—the era in which our ancestors moved over several millennia from being hunter-gatherers to being agriculturalists—is an example:

  Human societies did not set out to invent agriculture and produce permanent settlements. Instead a series of marginal changes were made gradually in existing ways of obtaining food as a result of particular local circumstances. The cumulative effect of the various alterations was important because they acted like a ratchet. Changes in subsistence methods often allowed a larger population to be supported but this made it difficult and eventually impossible to return to a gathering and hunting way of life because the extra people could not then be fed.

  Ponting argues that this “ratchet effect” has continued, with societies needing to advance their technologies and degree of organization in order to respond to environmental challenges that are often of their own making. Success has the paradoxical effect that even greater human populations with larger environmental impacts can support themselves from a more degraded natural world. The author Ronald Wright describes this as a “progress trap” that contributed to the collapse of societies as significant as the Roman and Mayan empires.

  The adapt-and-forget pattern is amplified by modern life. If you, like me, are a city dweller, then you’re unlikely to suffer change blindness to shifts in the natural world, because you’re not there to witness those shifts, and you don’t suffer much environmental amnesia, either, because you don’t have many memories of nature in the first place. For you, the baselines that shift will be mainly urban and technological ones; your generation will accept as normal that which your parents struggle to adapt to, and your children will carry forward little memory of the city as you knew it.

  Memory is depressing country, and never more so than when it comes to what is passed down from one generation to the next. Much of the research into such memories has to do with the Holocaust. There are urgent reasons to remember the Nazi genocide against Jews and other selected minorities in Europe during World War II, most obviously in order to prevent anything similar from happening again. Yet Holocaust researchers have had to confront difficult truths about remembrance, including the fact that survivors’ stories will be largely forgotten by the time they’ve been handed down through just three generations, or about ninety years—the great-grandchildren of the men and women who have personal experience of the Holocaust will, like the rest of us, know the event mainly from books and films. The Holocaust is drifting toward “remote history,” or history that was never directly experienced by any living person who can remember it. Much of what happened, how, and why, will be known only to a tiny cadre of specialists, while many more details—each a part of the historical caution that the Holocaust represents—will be irretrievably lost. The term for this is knowledge extinction.

  Oddly enough, the extinction or near-extinction of certain animals has proved to be the standout exception to our forgetfulness when it comes to the natural world. The extinct dodo, for example, has become an unlikely cultural heavyweight. A fat, flightless bird that disappeared ten generations ago, the dodo ranks alongside the penguin, elephant and tiger among animals that even small children are likely to recognize. Popular culture has also held on to the bison—our greatest and most enduring symbol of the natural abundance of the past. The image of that great shaggy head immediately calls to mind the thundering herds that ruled North America’s plains two hundred years ago, and also the destruction of those herds, the buffalo hunters firing until their gun barrels seized from the heat of it, the skulls piled high as houses.

  The dodo and the bison have passed into the ranks of what are known as transgenerational memories: stripped-down versions of the original that can be recalled whenever cultural shorthand is needed to represent some era or moral or way of being in the world—Marilyn Monroe and the Berlin Wall and the Temptation in the Wilderness. A transgenerational memory is better known as a myth, a fable, a testament, an icon.

  We hang on to the dodo, then, but lose sight of the long list of other species that disappeared from Mauritius, the Indian Ocean island the dodo called home, among them the Mauritius scops-owl, the Mauritius giant skink, the Mauritius blue-pigeon, the lesser Mascarene flying-fox, two kinds of giant tortoise and a parrot that might have been the largest ever known. We remember the bison herds too, but not the way the spadefoot toad and western chorus frog sang from the animals’ wallows—an estimated 100 million small ponds across the plains—in spring, or that birds such as the McCown’s longspur and the mountain plover nested on those same wallows as the sun dried them into dust bowls. The bison of memory is forever an animal of the grasslands, and never of the Rocky Mountain passes or Mexican deserts where they also once lived. Who remembers the menacing, coal-black bulls hunted in the hardwood forests of Pennsylvania? What about the bison herds of California?

  Certain kinds of memorial encourage forgetting. At the time of the great buffalo hunts, almost every other animal in the New World was also under assault, from oysters to fur seals, from prairie-chickens to basking sharks. We remembered the destruction of the bison as a way of remembering that entire era of extermination; today, only the buffalo’s story is universally known. The bison hunt has passed through the irony machine of history, in much the same way that the enduring presence of the Holocaust has ended up overshadowing the remembrance of every act of ethnic cleansing before or after.

  To more fully appreciate the lost memory that our focus on the buffalo represents, consider the fact that a similar slaughter had already taken place in North America by the time the bison hunts began: deer were once hunted to the brink of extinction. Try raising this around a kitchen table in the U.S. or Canada, and you will meet with flat disbelief. Deer? Deer eat tulips in suburban gardens. Deer show up in online videos goring people’s dogs within city limits. The drivers of New York state alone run down seventy-five thousand deer a year.

  The slaughter of the deer is vanishingly obscure. Fur-trade scholar Charles Hanson declares the buckskin trade of the American southeast “sadly neglected in literature”—and the buckskin trade of the American southeast is by far the best known of the deer hunts across North America. Early colonial immigrants to the Carolinas and Florida reported plains and forests “crowded with deer.” An observer named Thomas Ashe, in the 1680s, tells of “deer of which there is such infinite herds, that the whole country seems but one continuous park.” To the Muscogee Indians, whose territories covered much of what is now Georgia, Alabama and northern Florida, the herds were the currency of survival. Three quarters of the Muscogee meat supply came from white-tailed deer, not to mention most of their clothing, housewares, tools and such distinctive paraphernalia as flutes hollowed from the leg bones.

  Not surprisingly, skilled Muscogee hunters quickly became the supply side of the deerskin trade. On the demand side was all of Europe, where deer had already been so badly overhunted that gloves in Paris were reportedly being made with rat skins. Before the era of denim, there were deer-leather breeches, and just as with blue jeans, these buckskins were worn first by labourers and then came into fashion among the aristocracy. Imagine the scale of killing today if even a single city the size of Los Angeles, London or Toronto were to replace its jeans with animal hide. When the southeastern deerskin trade peaked in the years ahead of the American Revolution, the total number of hides brought in by Indian traders was at least one million each year. So went the pattern as the hide trade spread across North America, eventually overlapping with the bison kill. By 1886, a pioneer in rural New York recalled for the New York Times the hunts that routinely killed forty or fifty deer at a time in that state. Those days were done. “If a man were offered a million dollars for a deer killed in this county today he could never earn the money,” the woodsman said.

  “They were so scarce,” writes Leonard Lee Rue in The Deer of North America, “that their same numbers today would make them candidates for the endangered species list.”

  It’s easy to assume that the deer hunt is forgotten because the story ends so differently than the elegies to the bison. Unlike the buffalo, deer made a phenomenal comeback. With hunting restrictions in place, their predators largely wiped out, and forests opened by roads, farms and logging, the deer population recovered; the white-tailed deer in particular is the only large animal in North America that now ranges over more territory than ever before. But the deer trade left its own deep scars. At the beginning of the buckskin era in 1685, a Muscogee hunter would undergo ritual purification before the long winter hunt, and might make four hundred kills in a season, feeding entire communities. By the end of the American Revolution, not quite a century later, the natural economy of the Muscogee was in collapse and they were no longer able to find enough “bucks”—the origin of the slang term for money—to pay off their debts to colonial traders offering easy credit, especially for tafia rum. In 1802, federal Indian agent Benjamin Hawkins spelled out the new reality in his advice to a Muscogee leader. “Sell some of your waste lands,” he said, referring to territories that were seen a century earlier as infinitely rich. “I see no other resource that is very abundant.” Most Muscogee were ultimately displaced west to Oklahoma; the only land the tribe still holds in its historical territory is a 230-acre reserve in Alabama. A world emptied of deer marked the beginning of a long period of dispossession for the Muscogee, a lifetime before the scorched-earth buffalo hunts became an official instrument of war against the indigenous nations of the Great Plains.

  But we forget.

  When the bison were all but gone, buffalo hunters still hung on in the buffalo-hunting towns, waiting for the herds to return. The animals had migrated, the men told themselves, and would come back soon. Many of the hunters saw themselves as blameless, left unemployed by the whims of natural forces. They waited a while for the buffalo, and then they became cowboys.*

  Denial is the last line of defence against memory. It helps us to forget what we’d rather not remember, and then to forget that we’ve forgotten it, and then to resist the temptation to remember. “The ability to deny is an amazing human phenomenon, largely unexplained and often inexplicable,” writes the sociologist Stanley Cohen, author of States of Denial. Yet we find denial useful. It fulfills, to quote the definition preferred by Cohen, “our need to be innocent of a troubling recognition.”

  Once upon a time, there were dodo deniers. For more than a century after the last dodo died in the late seventeenth century, the bird’s former existence was doubted and rejected. The general public forgot the bird entirely, and even naturalists dismissed reports and paintings of the bird as fanciful works of imagination. It was only the publication, nearly two centuries later, of Lewis Carroll’s Alice’s Adventures in Wonderland that brought the dodo back to life, albeit as an extinct species. The illustration of the dodo in the book, drawn by John Tenniel, captured people’s hearts. Even today no one can say exactly what a living dodo looked like, but the basic image has not changed since Tenniel’s depiction: the madhouse eyes, the ungainly beak, the wallflower plumage, the pointless wings. The bird is a portrait of the perfect victim.

  Until the early 1800s, many leading thinkers denied the idea of extinction entirely; it was considered contradictory to the notion of godly creation. “That no real species of living creatures is so utterly extinct, as to be lost entirely out of the world, since it was first created, is the opinion of many naturalists; and ’tis grounded on so good a principle of Providence taking care in general of all its animal productions, that it deserves our assent,” wrote Thomas Molyneux, a seventeenth-century scholar who argued that fossilized skeletons of Irish elk—an enormous deer that once roamed much of Eurasia—were in fact only the remains of a misplaced breed of American moose. Similarly, Thomas Jefferson, in his role as third president of the United States, hoped that the overland expedition of Meriwether Lewis and William Clark, which ran from 1804 to 1806, would find living mammoths in the American West as proof that the Christian god would not allow any of his flock to disappear from the earth.

  Name a vanished animal, and in its story you are bound to find denial. Sometimes, the species is said never to have lived. More often, it’s said never to have died. The great auk was among the first animals to be driven extinct after the European discovery of the Americas, alongside even more deeply forgotten species, such as the sea mink. Like the dodo, the auk was a flightless bird, though one that resembled a small penguin and spent most of its time at sea. It took a thousand years for European hunters to eradicate the auk from their home continent and, with better technology, three hundred years to erase it from North America. Auks were hunted for their meat, eggs and feathers, as well as for “trane-oil”—the animal oil that lit and lubricated the world ahead of the petroleum age. Auks were so fatty that there are reports of them being thrown, sometimes alive, onto fires as fuel to boil the oil out of other auks.

  As the great auks vanished, the nature of the birds changed in the eyes of human observers. In the beginning they were seen as too thick-headed to flee from hunters, and so common that an auk was a ship captain’s first sign that he was nearing the North American coast. Once the birds had become rare, their absence was blamed on natural timidity or, in one example, an alleged migration to the Arctic “by choice and instinct.” At last, only a few years after the last auks were killed in 1844,* one commentator wrote that “in all probability, the so-called great auk of history was a mythical creature invented by unlettered sailors and fisherfolk.” Even in the 1960s, when the prior existence of the great auk was no longer disputed, a Canadian fisheries bureaucrat told news reporters that the bird had been a relict species with no place in the modern world. They “had to go,” the scientist said.

  But to witness the lengths to which the hand-washing rituals of denial can go, consider Thylacinus cynocephalus. To begin with, it was denied the uniqueness of its being. The animal is sometimes remembered as the Tasmanian tiger, because it had stripes, and sometimes as the Tasmanian wolf, because it had the pointed snout and long-trotting look of a canine (charming detail: its ears remained erect even when the animal was asleep), and sometimes even as the Tasmanian hyena, because it didn’t really look like either a cat or a dog. What it was, was itself: a marsupial that carried its young in a pouch, like a kangaroo,* but was otherwise a formidable hunter and meat-eater with no close relatives among the living creatures of the earth. Most biologists now refer to the animal as the thylacine.