Sometimes realization comes in a blinding flash. Blurred outlines snap into shape and suddenly it all makes sense. Underneath such revelations is typically a much slower-dawning process. Doubts at the back of the mind grow. The sense of confusion that things cannot be made to fit together increases until something clicks. Or perhaps snaps.
Collectively, we three authors of this article must have spent more than 80 years thinking about climate change. Why has it taken us so long to speak out about the obvious dangers of the concept of net-zero? In our defence, the premise of net-zero is deceptively simple—and we admit that it deceived us.
The threats of climate change are the direct result of there being too much carbon dioxide in the atmosphere. So it follows that we must stop emitting more and even remove some of it. This idea is central to the world’s current plan to avoid catastrophe. In fact, there are many suggestions as to how to actually do this, from mass tree planting to high-tech direct air capture devices that suck carbon dioxide out of the air.
The current consensus is that if we deploy these and other so-called “carbon dioxide removal” techniques at the same time as reducing our burning of fossil fuels, we can more rapidly halt global warming. Hopefully around the middle of this century we will achieve “net-zero”. This is the point at which any residual emissions of greenhouse gases are balanced by technologies removing them from the atmosphere.
This is a great idea, in principle. Unfortunately, in practice it helps perpetuate a belief in technological salvation and diminishes the sense of urgency surrounding the need to curb emissions now.
We have arrived at the painful realization that the idea of net-zero has licensed a recklessly cavalier “burn now, pay later” approach which has seen carbon emissions continue to soar. It has also hastened the destruction of the natural world by increasing deforestation today, and greatly increases the risk of further devastation in the future.
“Over the years, doubt has developed into dread,” writes Dyke, senior lecturer in global systems at the University of Exeter. “This gnawing sense that we have made a terrible mistake. There are now times when I freely admit to a sense of panic. How did we get this so wrong? What are our children supposed to think about how we have acted?”
To understand how this has happened, how humanity has gambled its civilization on no more than promises of future solutions, we must return to the late 1980s, when climate change broke out onto the international stage.
Steps Towards Net-Zero
On June 22, 1988, James Hansen was the director of NASA’s Goddard Institute for Space Studies, a prestigious post, but he was largely unknown outside of academia.
By the afternoon of the 23rd, he was well on the way to becoming the world’s most famous climate scientist. This was a direct result of his testimony to the United States Congress, in which he forensically presented the evidence that the Earth’s climate was warming and that humans were the primary cause: “The greenhouse effect has been detected, and it is changing our climate now.”
If we had acted on Hansen’s testimony at the time, we would have been able to decarbonize our societies at a rate of around 2% a year in order to give us about a two-in-three chance of limiting warming to no more than 1.5°C. It would have been a huge challenge, but the main task at that time would have been to simply stop the accelerating use of fossil fuels while fairly sharing out future emissions.
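To get a feel for the arithmetic behind that 2% figure: emissions that fall by a fixed fraction each year form a geometric series, so the total carbon ever released is capped at roughly the starting emissions divided by the annual cut. The sketch below is purely illustrative and was written for this edition of the article; the starting emissions figure is an approximate assumption, not a number from the original text.

```python
# Illustrative sketch (not from the original article): total CO2 released when
# fossil emissions shrink by a fixed fraction each year.
#
# If emissions start at E0 and fall by a fraction r per year, the lifetime total
# is the geometric series E0 * (1 + (1-r) + (1-r)^2 + ...) = E0 / r.
# Halving r roughly doubles the carbon released, which is why a later start
# demands far steeper cuts for the same temperature outcome.

def cumulative_emissions(e0_gt_per_year: float, annual_cut: float, years: int = 300) -> float:
    """Sum of annual emissions (GtCO2) under a constant fractional cut."""
    total, emissions = 0.0, e0_gt_per_year
    for _ in range(years):
        total += emissions
        emissions *= 1.0 - annual_cut
    return total

# Assumed, approximate figure: global fossil CO2 emissions were around 22 GtCO2 per year in 1988.
print(cumulative_emissions(22.0, 0.02))  # ~1,100 GtCO2 in total (close to E0 / r)
print(cumulative_emissions(22.0, 0.04))  # ~550 GtCO2: doubling the cut halves the total
```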
Four years later, there were glimmers of hope that this would be possible. During the 1992 Earth Summit in Rio, all nations agreed to stabilize concentrations of greenhouse gases to ensure that they did not produce dangerous interference with the climate. The 1997 Kyoto Summit attempted to start to put that goal into practice. But as the years passed, the initial task of keeping us safe became ever harder given the continual increase in fossil fuel use.
It was around that time that the first computer models linking greenhouse gas emissions to impacts on different sectors of the economy were developed. These hybrid climate-economic models are known as Integrated Assessment Models. They allowed modellers to link economic activity to the climate by, for example, exploring how changes in investments and technology could lead to changes in greenhouse gas emissions.
They seemed like a miracle: you could try out policies on a computer screen before implementing them, saving humanity costly experimentation. They rapidly emerged as key guidance for climate policy, a primacy they maintain to this day.
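To give a flavour of what such a model does, here is a deliberately toy sketch written for this piece: economic output drives emissions, emissions accumulate, and accumulated carbon dioxide drives warming. It is not code from any real integrated assessment model, and every parameter in it is an assumption chosen only for illustration.

```python
# A toy "integrated assessment" loop, for illustration only. Real models are far
# richer; every number below is an assumption, not a calibrated value.

GROWTH = 0.02                  # assumed annual growth in economic output
DECARB = 0.01                  # assumed annual fall in emissions per unit of output
WARMING_PER_GTCO2 = 0.00045    # assumed degrees C of warming per GtCO2 emitted

gdp = 100.0                    # arbitrary index of economic output
intensity = 0.4                # assumed GtCO2 emitted per unit of the GDP index
cumulative_co2 = 0.0           # GtCO2 added to the atmosphere since the model start

for year in range(2025, 2101):
    emissions = gdp * intensity                    # economic activity -> emissions
    cumulative_co2 += emissions                    # emissions accumulate in the air
    warming = WARMING_PER_GTCO2 * cumulative_co2   # cumulative CO2 -> warming
    gdp *= 1 + GROWTH                              # the economy grows
    intensity *= 1 - DECARB                        # technology slowly cuts carbon intensity
    if year % 25 == 0:
        print(year, round(emissions, 1), round(warming, 2))
```

Change the assumed decarbonization rate, or add a term for future carbon removal, and the projected warming moves accordingly. That is exactly the kind of on-screen experiment that made these models so seductive to policy-makers.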
Computer Models Replace Critical Thinking
Unfortunately, they also removed the need for deep critical thinking. Such models represent society as a web of idealized, emotionless buyers and sellers and thus ignore complex social and political realities, or even the impacts of climate change itself. Their implicit promise is that market-based approaches will always work. This meant discussions about policies were limited to those most convenient to politicians: incremental changes to legislation and taxes.
Around the time those models were first developed, efforts were being made to secure U.S. action on the climate by allowing it to count the carbon sinks of the country’s forests. The U.S. argued that if it managed its forests well, it would be able to store a large amount of carbon in trees and soil, which should be subtracted from its obligations to limit the burning of coal, oil, and gas. In the end, the U.S. largely got its way. Ironically, the concessions were all in vain, since the U.S. Senate never ratified the agreement.
The logic was that postulating a future with more trees could, in effect, offset the burning of coal, oil, and gas now. As models could easily churn out numbers that saw atmospheric carbon dioxide go as low as one wanted, ever more sophisticated scenarios could be explored which eased the perceived urgency to reduce fossil fuel use. Including carbon sinks in climate-economic models had opened a Pandora’s box.
It’s here we find the genesis of today’s net-zero policies.
“It came to me as a real shock that I must have contributed personally to the net-zero trap,” says Knorr, a senior research scientist in physical geography and ecosystem science at Lund University. “In 2008, the G8 countries declared a voluntary target of reducing carbon dioxide emissions by 50% by 2050. Back then, I responded by publishing computations I had performed specifically to show the need for net-zero in the long run, stating that any remaining carbon dioxide emissions by human activities would have to be ‘balanced by an artificial sink’.”
But “since none of our study’s co-authors was an expert,” he adds, “we did not consider how much of that artificial sink would be needed to sustain our economic system, or if it was even technically possible to create.”
In the mid-1990s, most attention was focused on increasing energy efficiency and energy switching (such as the UK’s move from coal to gas) and the potential of nuclear energy to deliver large amounts of carbon-free electricity. The hope was that such innovations would quickly reverse increases in fossil fuel emissions.
Climate Models Embrace Carbon Capture…In Theory
But by around the turn of the new millennium it was clear that such hopes were unfounded. Given their core assumption of incremental change, it was becoming more and more difficult for climate-economic models to find viable pathways to avoid dangerous climate change. In response, the models began to include more and more examples of carbon capture and storage, a technology that could remove the carbon dioxide from coal-fired power stations and then store the captured carbon deep underground indefinitely.
This had been shown to be possible in principle: compressed carbon dioxide had been separated from fossil gas and then injected underground in a number of projects since the 1970s. These Enhanced Oil Recovery schemes were designed to force gases into oil wells in order to push oil towards drilling rigs and so allow more to be recovered—oil that would later be burnt, releasing even more carbon dioxide into the atmosphere.
Carbon capture and storage offered the twist that, instead of being used to extract more oil, the carbon dioxide would be left underground and so removed from the atmosphere. This promised breakthrough technology would allow for climate-friendly coal power, and so the continued use of this fossil fuel. But long before the world would witness any such schemes, the hypothetical process had been included in climate-economic models. In the end, the mere prospect of carbon capture and storage gave policy-makers a way out of making the much-needed cuts to greenhouse gas emissions.
The Rise of Net-Zero
By the time the international climate change community convened in Copenhagen in 2009, it was clear that carbon capture and storage was not going to be sufficient for two reasons.
First, it still did not exist. There were no carbon capture and storage facilities in operation on any coal-fired power station, and no prospect that the technology would have any impact on rising emissions from increased coal use in the foreseeable future.
The biggest barrier to implementation was essentially cost. The motivation to burn vast amounts of coal is to generate relatively cheap electricity. Retrofitting carbon scrubbers on existing power stations, building the infrastructure to pipe captured carbon, and developing suitable geological storage sites required huge sums of money. Consequently the only application of carbon capture in actual operation then—and now—is to use the trapped gas in enhanced oil recovery schemes. Beyond a single demonstrator, there has never been any capture of carbon dioxide from a coal-fired power station chimney with that captured carbon then being stored underground. [That single demonstrator is the Boundary Dam project in Saskatchewan, and regular Energy Mix readers know how badly that project has gone—Ed.]
Just as important, by 2009 it was becoming increasingly clear that it would not be possible to make even the gradual reductions that policy-makers demanded. That was the case even if carbon capture and storage was up and running. The amount of carbon dioxide that was being pumped into the air each year meant humanity was rapidly running out of time.
With hopes for a solution to the climate crisis fading again, another magic bullet was required. A technology was needed not only to slow down the increasing concentrations of carbon dioxide in the atmosphere, but actually reverse it. In response, the climate-economic modelling community—already able to include plant-based carbon sinks and geological carbon storage in their models—increasingly adopted the “solution” of combining the two.
So it was that Bioenergy Carbon Capture and Storage, or BECCS, rapidly emerged as the new saviour technology. By burning “replaceable” biomass such as wood, crops, and agricultural waste instead of coal in power stations, and then capturing the carbon dioxide from the power station chimney and storing it underground, BECCS could produce electricity at the same time as removing carbon dioxide from the atmosphere. That’s because biomass such as trees sucks in carbon dioxide from the atmosphere as it grows. By planting trees and other bioenergy crops and then storing the carbon dioxide released when they are burnt, more carbon could be removed from the atmosphere.
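For a rough sense of why modellers could book BECCS as carbon removal, the back-of-the-envelope tally below follows one tonne of dry biomass from growth to combustion to storage. All of the figures are illustrative assumptions made for this article, not data from any real facility.

```python
# Back-of-the-envelope BECCS accounting for one tonne of dry biomass.
# Every figure is an illustrative assumption, not a measurement from a real plant.

CO2_ABSORBED = 1.8      # assumed tCO2 drawn from the air as the biomass grew
CO2_RELEASED = 1.8      # the same carbon comes back out when the biomass is burnt
CAPTURE_RATE = 0.90     # assumed fraction of flue-gas CO2 captured and stored
SUPPLY_CHAIN = 0.3      # assumed tCO2 from harvesting, transport, and processing

stored = CO2_RELEASED * CAPTURE_RATE   # pumped underground instead of vented
vented = CO2_RELEASED - stored         # capture is never perfect

# Net change in atmospheric CO2: what escapes minus what the plants took in.
net = vented + SUPPLY_CHAIN - CO2_ABSORBED
print(f"Net atmospheric change: {net:+.2f} tCO2 per tonne of biomass")
# A negative result is what lets the models count BECCS as negative emissions;
# a weaker capture rate or a longer supply chain can erase the benefit entirely.
```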
With this new solution in hand the international community regrouped from repeated failures to mount another attempt at reining in our dangerous interference with the climate. The scene was set for the crucial 2015 climate conference in Paris.
Next: A Parisian False Dawn
James Dyke is Senior Lecturer in Global Systems, University of Exeter. Robert Watson is Emeritus Professor in Environmental Sciences, University of East Anglia. Wolfgang Knorr is Senior Research Scientist, Physical Geography and Ecosystem Science, Lund University. This article is republished from The Conversation under a Creative Commons licence. Read the original article in full.

