Below you can read excerpts from a magazine essay I was asked to write reflecting on my “anthropocene” journey — the decades I’ve spent examining what’s now called humanity’s
great acceleration, the explosive growth in our numbers, resource appetites and environmental footprint since around 1950.
I wrote the piece for the inaugural issue of, yes, Anthropocene Magazine — the new incarnation of Conservation Magazine,
published for 15 years by the University of Washington and now reimagined with help from the MacArthur Foundation, Wilburforce Foundation and Future Earth, an international sustainability
science consortium. Here’s a video in which other contributors, partners, and I explain the magazine’s focal point.
I put it this way:
“What an amazing juncture to be alive. Humanity was a dribble for most of its existence, and then there’s been zoom, and within the lives of almost everyone who’s alive right now something
different is coming.”
In the essay, I explore the host of meanings and debates that have emerged around the word, saying, “After 16 years of percolation and debate, anthropocene has become the closest thing there is to common shorthand
for this turbulent, momentous, unpredictable, hopeless, hopeful time—duration and scope still unknown.”
What has your Anthropocene journey been like, and where is it going? Read on for mine, and weigh in. For a soundtrack while you read, I recommend “Anthrocene,”
the new song by the Australian musician Nick Cave (and bandmate Warren Ellis), using a spelling I proposed way back in 1992.
Here are some excerpts and a link to the full piece:
An Anthropocene Journey
By Andrew Revkin
My reporting career has taken me from smoldering, fresh-cut roadsides in the Amazon rain forest to the thinning sea ice around the North Pole, from the White House and Vatican to Nairobi’s vast, still-unlit slums.
Throughout most of it, I thought I was writing about environmental and social problems and solutions.
Lately I’ve come to realize that my lifelong beat, in essence, has been one species’ growing pains. After tens of thousands of years of scrabbling by, spreading around the planet, and developing tools
of increasing sophistication, humans are in surge mode and have only just started to become aware that something profound is going on. The upside has
been astounding. Child and maternal mortality rates have plunged. Access to education has soared. Deep poverty is in sharp retreat. Despite the 24/7 distilled drama online and on TV, violence on scales from war
to homicide has been in a long decline.
It’s been only a few decades since science began building a picture of the back story to this spectacular ascent. It’s
a story about how humans became such a potent environmental influence that a signature of our doings, for good or ill, will be measurable in layered rock for millions of years to come. By altering climate, landscapes,
and seascapes as well as flows of species, genes, energy, and materials, we are sealing the fates of myriad other species. And, without a big shift from business-as-usual, we will undermine our own long-term welfare.
In 2000, after a century of earlier efforts by scholars, scientists, and at least one journalist (me) to give a name to humanity’s emerging role as a planet-scale force, one word emerged in a heated moment at
a global change conference in Cuernavaca, Mexico—anthropocene.
It appears to be here for the long haul. After 16 years of percolation and debate, anthropocene has become the closest thing there is to common shorthand for this turbulent, momentous, unpredictable, hopeless, hopeful
time—duration and scope still unknown.
The word is still so novel that no one has even settled on how to pronounce it; the British stress the second syllable and Americans the first. That seems appropriate, given that reactions to the emergence of the term—let
alone the actual environmental changes it aims to describe—have come in all colors and flavors. There’s even been a spirited push for alternatives, some rather biting.
I imagine you’ve heard some of the competing words that have bubbled up. We’re actually in the greed-driven Capitalocene,
the trash-choked Plasticene, the combustible Pyrocene,
the self-loathing Misanthropocene, the testosterone-dominated Manthropocene—even
Obscene. There’s some merit as well as weakness in every label, including the word that sparked it all. [I
left out a few, including the Homogenocene favored by Charles Mann,
Kierán Suckling and others (we’re homogenizing global biology), and the Necrocene of Justin McBrien (we’re killing off a lot of things).]
The anthropocene (both the word and the unfolding age) has so much Rorschach-like plasticity that all I can offer as guidance are my informed but subjective reflections based on what I’ve learned and unlearned
in my long, quirky journey. I’d argue that what matters most is not resolving some common meaning so much as engaging in deeply felt discussions, fresh lines of inquiry, and new proposals for sustaining the
human journey—all of which have been sparked by the emergence of this concept.
To navigate this terrain, it’s best to start with the foundational anthropocene idea, as blurted out in February 2000 during a scientific meeting on human-caused global change. A prominent participant was
Paul J. Crutzen, who’d won a Nobel Prize for helping identify the threat certain synthetic chemicals posed to the planet’s protective ozone layer. At the meeting, his frustration grew as peers described
momentous shifts in Earth’s operating systems, but always anchored them in time by mentioning the Holocene. Holocene is the formal name for the “wholly recent” epoch of planetary history that
began at the end of the last ice age 11,700 years ago.
At one point, Crutzen couldn’t hold back. He interrupted a colleague, as the scientist Will Steffen later described: “Stop using the word Holocene. We’re not in the Holocene any more. We’re
in the … the … the … (searching for the right word) … the Anthropocene!”
In his 2014 book “The Anthropocene,” Christian Schwägerl describes how the room
fell silent at first, and then the word became the center of conversation. “The scientists in that conference room in Mexico were profoundly shaken,” Schwägerl wrote. “[O]ne of the most
frequently cited natural scientists in the world … was not only describing the past with this new term (something to which geologists are accustomed), but he was also redefining and connecting to the future
… a new Earth sculpted by humans.”
Shortly after that meeting, Crutzen learned that Eugene F. Stoermer, an admired analyst of tiny lakebed diatom fossils, had used the word
in the 1980s. The two scientists collaborated on an essay for a newsletter for Earth systems scientists. They laid
out a scientific rationale for the term and explained why, even though there was no tradition of naming geological spans for their causative elements, in this case it was justified:
“Considering these … major and still growing impacts of human activities on Earth and atmosphere, and at all, including global, scales, it seems to us more than appropriate to emphasize the central role
of mankind in geology and ecology by proposing to use the term ‘anthropocene’ for the current geological epoch.”
Crutzen and several collaborators refined the concept in subsequent papers. The term quickly spread, propelled in a dizzying array of directions as if filling a linguistic vacuum. It began popping up in peer-reviewed
literature in a variety of disciplines and eventually spawned at least three scientific journals (and one magazine) using “Anthropocene” in their titles.
It’s not hard to see why reverberations, pro and con, built so quickly. It was an audacious notion to recommend that a human age deserved to join the Paleocene, Eocene, Oligocene, Miocene, Pliocene, Pleistocene,
and Holocene as the epochs of geological history comprising the Age of Mammals. This stretch of time, more formally called the Cenozoic Era, began
65 million years ago, after the mass extinction that ended the dinosaurs’ age and enabled ours. And it could continue for a very long time—if the most powerful mammal, Homo sapiens, demonstrates
it can turn the sapience in its name into a sustainable journey.
The proposal of an Anthropocene epoch was particularly audacious because it came from a chemist and an ecologist, not a stratigrapher. Stratigraphy is the discipline within geology that develops and maintains the official
Geologic Time Scale and International Chronostratigraphic Chart.
In 2008, a group of stratigraphers and other earth scientists, led by Jan Zalasiewicz of the University of Leicester, published the first careful assessments of the intriguing Crutzen-Stoermer hypothesis. Indeed, they
found a concrete and durable human signature—literally. Tens of billions of tons of concrete are part of that signature, along with vast amounts of smelted aluminum and more exotic alloys, distinctive spherical
particles of fly ash from power plants, bomb radioisotopes, 6 billion tons (and counting) of plastic, and so much more. In a 2008 paper, Zalasiewicz
and others concluded that there appeared to be “sufficient evidence” for an Anthropocene epoch to be considered for formalization by the international geological community.
But a long road lay ahead. The following year, Zalasiewicz and some colleagues began assembling a working group on the “Anthropocene” at the invitation of one of the 16 subcommissions of the International
Commission on Stratigraphy. Those quotation marks around “Anthropocene” in the group’s name won’t disappear until some final judgment on the validity of a new epoch is reached.
In 2010 I was invited to join the working group, largely because of a quirky role I had played in the evolution of this anthropocene idea in 1992, when I essentially predicted Crutzen’s Mexico moment and what
has unfolded since. Since 1985, I’d been writing articles about human impacts on the climate system. In 1991, I finally got a chance to synthesize what I’d been learning, in a short book that
would accompany the first major museum exhibition on global warming, at the American Museum of Natural History. Closing out a chapter on the growing human impact on Earth, I typed an almost offhand proposal that
we’d jolted the planet out of the Holocene:
“Perhaps earth scientists of the future will name this new post-Holocene era for its causative element—for us. We are entering an age that might someday be referred to as, say, the Anthrocene. After all,
it is a geological age of our own making. The challenge now is to find a way to act that will make geologists of the future look upon this age as a remarkable time, a time in which a species began to take into account
the long-term impact of its actions. The alternative will be to leave a legacy of irresponsibility and neglect that will manifest itself in the fossil record as just one more mass extinction—like the record
of bones and empty footprints left behind by the dinosaurs.”
I vaguely recall musing on how to spell my passing reference to a name for this age. (I can’t probe the floppy disks on which any trace of that process sits.) “Anthrocene” seemed more streamlined
than other choices, and I was pretty naïve when it came to word roots in scientific terminology. It didn’t really matter. The book was published shortly after the end of the Persian Gulf War and the
planet-cooling eruption of Mount Pinatubo. Public attention was focused elsewhere. I’m sure no more than a few thousand people read it, certainly not Crutzen or Stoermer. It now floats on Amazon.com’s used listings for
as little as one US cent (plus shipping, of course)—another kind of anthropocene shard, in a way.
Reflecting on this now, I’m quite certain that when I wrote “earth scientists of the future,” I was thinking generations, if not centuries, into the future. But it took just eight years for scientific
rigor to be applied to the idea of an anthropogenic geological age. We do live in fast-forward times.
Language constantly evolves. In 2014, the word passed a significant milestone. The Oxford English Dictionary (OED) adds batches of words four times a year. The 171 words added
in June that year included all manner of obscurities (“cholestasis”), words reflecting trends of the moment (“selfie,” “flexitarian”), and “Anthropocene.”
According to the dictionary’s definition, the Anthropocene is “the era of geological time during which human activity is considered to be the dominant influence on the environment, climate, and ecology
of the earth.”
Before including it, the OED editors had wisely let the word percolate for 14 years after it first entered widespread discourse. But I’d argue that they jumped the gun in one important technical way and missed
the main, grander meaning of the word. That second point is not a criticism; it just reflects the plasticity and richness of this still-emerging neologism.
The technical problem with the definition? The word, despite having roots springing so directly from stratigraphic nomenclature, could still end up rejected as a formal “era of geological time.”…
Many influential stratigraphers have expressed deep skepticism that the Anthropocene deserves formal standing. For one thing, any new addition to the time scale must be useful to science. Calling an abrupt end to the
Holocene could achieve the opposite, creating confusion in the literature. There are significant debates over when to mark the starting point or lower boundary of the Anthropocene in the time scale.
Other scientists are concerned about all those flavors and colors of meaning that surround the word outside of geology—potentially tainting the time scale with environmental messaging. One of the starkest challenges
came last spring in a critique written by two influential geologists, Stanley C. Finney and Lucy E. Edwards. Its title laid out what they saw as a murky and open question: The “Anthropocene” epoch: Scientific decision or political statement?
There was some basis for such concerns. Many scientists and others pressing for a more sustainable human relationship with the environment had latched onto the word and idea as a rallying point. In a 2011 interview
with Elizabeth Kolbert for National Geographic, Crutzen had put it plainly: “What I hope … is that the term ‘Anthropocene’ will be a warning to the world.”
Now in its seventh year, the working group has been under pressure to complete its formal recommendation to the stratigraphic commission. Almost daily, emails fly back and forth among its 35 members, refining drafts
of papers (including a response to Finney and Edwards) and planning next steps. There have been three face-to-face meetings of the group’s members, most recently in Oslo in April 2016.
Coincidentally, that meeting kicked off on the 46th Earth Day. We gathered around a long table in an ornate room at the Fridtjof Nansen Institute in a mansion built a century ago by the famed Arctic explorer for whom
the institute was named. For two long days, discussions led by Zalasiewicz and Colin Waters of the British Geological Survey centered on a review of the “arguments against formalization.” The 17 bullet
points ranged from the technical and straightforward—“stratigraphic record is minimal … based on predictions … ”—to the testy and provocative—“[T]he Anthropocene
is political, not scientific.” As if to remind participants of the gravity of the task, there was a plastic-laminated copy of the scale itself at each seat, along with the usual array of writing pads and pens.
My lack of familiarity with norms of stratigraphy prevented me from engaging too deeply, although I’ve been a minor coauthor on several of the group’s papers. What I think I’ve brought to the table
is context. In a presentation, I urged the geologists to take comfort in knowing they’re hardly the first discipline to be thrust into policy relevance or to have their norms shaken by disruptive change.
I clicked to a slide showing how the “tree of life” envisioned by Darwin had been utterly disrupted now that DNA sequencing allows a more complete view, particularly of microbes. Just days before the
Oslo meeting, a new “tree” had been published in which, as Carl Zimmer noted in the New York Times,
“All the eukaryotes, from humans to flowers to amoebae, fit on a slender twig” compared to a dizzying spray of lines of bacteria.
And now the revolutionary genetic editing tool CRISPR is poised to imprint humans’ ambitions on that tree at least as profoundly as fossil fuels have changed the physical world. I also noted that the sparring
in the stratigraphy community strongly echoed fights that had first erupted in meteorology and climate science 25 years ago, as new lines of evidence and new tools, such as global climate models, pointed to a growing
and disruptive human warming influence. “You’re not alone,” I said. But I stressed, using climate change as an example, that it is possible to separate the “is” of science from
the “ought” of society’s choices. With some bumps and bruises, the Intergovernmental Panel on Climate Change had found a way forward. Now it was geology’s turn.
There was some irony in the stroll each day between our hotel and the Nansen Institute. It took us along the shore in front of a giant Jenga-block scramble of horizontal white towers that belong to Statoil. Norway’s
mostly state-owned oil company has contributed substantially not only to Norway’s economy but also to global climate change. Even as Norway was adding incentives for drivers to buy electric vehicles to take
advantage of ample domestic hydro-electric power, the company announced plans to expand drilling in the Barents Sea to boost fossil-fuel exports. One got the impression that decisions made in that building would
have a bigger impact on world affairs than any conclusions we produced.
But there was a second layer of irony there on the windswept shores of the fjord. The grassy stretch along the sinuous path was also a sculpture park. A vertical slab rose from the grass directly in front of the Statoil
building, imprinted with an image of one of Easter Island’s moai—the haunting stone figures carved at the potent pinnacle of the great, but vanished, Rapa Nui civilization.
While many geologists worry that a human-etched epoch grants us too much power on the basis of too little evidence, a few think the proponents of the geological Anthropocene are thinking way too small. One such expert
is Jay Quade of the University of Arizona. After decades of fieldwork and lab analysis on six continents, Quade—whose father and grandfather were geologists—seems to live, breathe, and eat insights
from ancient rock. I met him in June at a Santa Fe, New Mexico, gathering of scientists focused on the Quaternary Period. He credited the efforts of Crutzen and scientists such as those in the “Anthropocene”
working group for all that they were doing but said his reading of the evidence pointed to an even more massive unfolding geological transition. It could, he believed, be akin to—if not bigger than—the
Permian-Triassic mass extinction 250 million years ago and the Cretaceous-Tertiary extinction that cleared out the dinosaurs and led to the Age of Mammals—and us.
In his keynote talk, he described the human-driven changes under way on Earth as “creating the mother of all stratigraphic marker horizons.” One slide took the audience 50 million years into the future, projecting what the human imprint would look like after such a span—kind of like what geologists see now in probing previous
great events. Our anthropocene moment appears as a brief pulse of trash, rare earths, and the like—along with a profound constriction of mammal species—followed in future ages by a flourishing of surviving
and newly evolved mammals. Are humans among them to assess that record?
Time will tell.