Wednesday, April 28, 2021

Water Level Raised in Lower Snake River, Fish Advocates Cry Foul (Spokesman-Review, Spokane, WA)


(LEWISTON, ID) – The Army Corps of Engineers is operating the lower Snake River at a water elevation that decreases survival of protected salmon and steelhead but provides safer navigation conditions for tug and barge operators.

Fish managers have protested the move intended to increase the depth of the navigation channel near Lewiston and argued the needs of the fish should take precedence over transportation of grain and other products. For example, the Nez Perce Tribe suggested that river transportation be temporarily halted or the location of the shipping channel shifted to deeper areas of the reservoir as alternatives to raising the elevation.

“The point of our concern really kind of boils down to, yes, (navigation) is one of the many purposes the Corps of Engineers has for the system, but we are also dealing with fish listed under the Endangered Species Act here,” said David Johnson, director of the Nez Perce Tribe’s Department of Fisheries Resources Management. “In our mind, especially with as low as these (salmon and steelhead) returns have been the last several years, the Corps and (Bonneville Power Administration) could be giving a lot more consideration to how they are operating these dams.

“In times of healthy returns, (a trade-off like raising the river) is something that should be considered, but a trade-off in times of horrible returns shouldn’t be balanced on the backs of the fish.”

Dredging of the navigation channel last occurred in 2015. Since then, sediment at and near the confluence of the Snake and Clearwater rivers has accumulated to the point that, in some places, the channel no longer maintains its federally prescribed minimum depth of 14 feet, raising the risk of barges running aground.

Port of Lewiston Manager David Doeringsfeld said the turning basin in front of the port is also becoming shallow, but the berthing areas are sufficiently deep.

In response to the sedimentation, the Corps opted to raise the river above the elevation prescribed by the latest biological opinion – a federal document that spells out measures designed to ensure salmon and steelhead protected under the Endangered Species Act aren’t pushed further toward extinction.

The agency intends to keep the river 3 feet above that level, known as minimum operating pool, when flows are less than 50,000 cubic feet per second, 2 feet higher when flows are between 50,000 and 79,000 cfs, 1 foot higher between 80,000 and 119,000 cfs, and at the prescribed minimum operating pool when flows are 120,000 cfs or higher. At each of the levels, the agency is permitted 1½ feet of flexibility, meaning the elevation could be even higher at times.
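In effect, the rule is a step function from river flow to allowed pool rise. The sketch below (Python) is a minimal encoding of the reported bands; treating the small gaps between the stated bands, such as 79,000 to 80,000 cfs, as contiguous is our reading, not the Corps' wording.

    def allowed_rise_feet(flow_cfs: float) -> float:
        """Feet above minimum operating pool, per the Corps' stated flow bands."""
        if flow_cfs < 50_000:
            return 3.0
        elif flow_cfs < 80_000:    # stated as 50,000-79,000 cfs
            return 2.0
        elif flow_cfs < 120_000:   # stated as 80,000-119,000 cfs
            return 1.0
        return 0.0                 # 120,000 cfs or higher: minimum operating pool

    FLEXIBILITY_FT = 1.5  # the Corps may run up to 1.5 feet higher within each band

    # Example: at 65,000 cfs the target is 2 feet above pool,
    # and as much as 3.5 feet with the operational flexibility.
    assert allowed_rise_feet(65_000) == 2.0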

“(The Corps) will operate Lower Granite Dam to temporarily hold water to a higher level when flows are low to maintain the federal navigation channel, until sediment can be removed,” agency spokesman Matt Rabe at Portland said via a prepared statement. “The District continues to develop plans to perform work to remove sediments which are impacting the federally authorized navigation channel.”

Juvenile salmon and steelhead depend on river current to flush them downstream. The biological opinion calls for a lower elevation from April 1 to Aug. 14 because it helps the reservoir to behave more like a free-flowing river.

“When the pool is lower, at minimum operating pool, that allows the fastest water velocity through the reservoir, which then results in faster fish travel time, which then results in higher survival,” said Jay Hesse, director of biological services for the Nez Perce Tribe’s Department of Fisheries Resources Management. “As the pool elevation rises, the water velocity slows down and the fish travel time slows down and the survival decreases.”

Hesse said the Corps has agreed to try to shrink its 1½-foot operational flexibility down to 1 foot.

When the river hits the reservoir and its current slows, the sediment carried by the water drops out and accumulates on the bed. The agency typically dredges every seven to 10 years to clear the channel.

“Based on existing conditions and anticipated sedimentation, a dredging action to address immediate navigation needs is expected to be required to maintain safe navigation conditions while the other efforts are underway,” Corps spokesman Joseph Saxon at Walla Walla said.

Dredging is controversial. It adds to the cost of operating the federal hydropower system and some argue the in-water disposal of spoils in deeper areas of the reservoir downstream of Lewiston can harm protected fish as well as unlisted juvenile Pacific lamprey that live in the sediment. Fish advocates, including the Nez Perce Tribe, have gone to court in the past to stop dredging.

Column: Water Created California and the West. Will Drought Finish Them Off? (Los Angeles Times, CA)


In what may become an iconic image for drought-stricken California, Gov. Gavin Newsom stood on the parched bed of Lake Mendocino on April 21 to announce an emergency declaration for Sonoma and Mendocino counties.

“I’m standing currently 40 feet underwater,” he said, “or should be standing 40 feet underwater, save for this rather historic moment.”

Newsom’s point was that the reservoir was at a historically low 43% of capacity, the harbinger of what could be a devastating drought cycle not only for the Northern California counties that fell within his drought declaration, but for most of the state — indeed, the American West.

The last extended drought struck California in 2012-16. Still fresh in the memory, it was a period of stringent mandated cutbacks in water usage.

Lawns were forced to go brown, homeowners prompted to replace their thirsty yards with drought-resistant landscaping and to upgrade their vintage dishwashers and laundry machines with new water-efficient models. Profligate users were ferreted out from public records and, if they could be identified, shamed.

Although there have been wet years since then, notably 2017, the big picture suggests that the drought never really ended and the dry periods of this year and 2020 are representative of the new normal — a permanent drought.

Experts warn that climate change will only make things worse. The years 2014 and 2015 were the two hottest on record, “which made coping with water shortages even more difficult,” the Public Policy Institute of California observed in 2018.

Research suggests that extreme dry years will become more common, but so will extreme wet years. The latter isn’t a panacea for the drought, because the state’s water storage capacity can be overwhelmed by excessive rainfall, especially if a warmer climate reduces the snowpack, nature’s own seasonal reservoir.

Newsom’s step-wise approach of declaring emergencies in the hardest-hit regions of the state and holding back elsewhere until conditions spread shouldn’t leave any doubt that the crisis is just beginning.

“We’re definitely in a drought,” Jeffrey Kightlinger, general manager of the giant Metropolitan Water District of Southern California, told me. “This may go down as one of the five worst years on record.”

Water supply in the State Water Project, which distributes water to agencies and districts serving 27 million Californians and 750,000 acres of farmland, is so low that the project is delivering only 5% of requested supplies this year. The allocation has fallen that far only twice before since 1996, according to the state Department of Water Resources, which runs the project.

It may already be too late to avoid some of the conflicts and consequences of the drought age in California. Every segment of society will have to come to terms with deepening scarcity, and with each other’s competing demands.

Residential users, growers, the fishing industry and stewards of the environment will be increasingly at odds, unless the state can craft a drought response that spreads sacrifices in a way that each group considers fair. To ask the question whether that is likely is to answer it.

[Chart: Dry years, defined as those with below-average precipitation, have outnumbered wet years in California since before 1900, and they are becoming more common and more severe. (Public Policy Institute of California)]

California’s water policies and infrastructure were products of an era of abundance. During a century of growth there always seemed to be enough water to satisfy demand — and when there wasn’t, engineering know-how and public funding provided the means to move water from where it was to where demand was growing.

That was the case with the construction of the Los Angeles and Colorado River aqueducts early in the last century and the State Water Project and federal Central Valley Project in succeeding decades.

Complacency marked some of this work. The expectations of the water supply that would be provided by Hoover Dam, for example, were based on surveys of the Colorado River’s flow taken during a historically wet period.

The river has never provided as much water as was estimated in 1922, when the prospective supply was apportioned among the seven states of the Colorado basin. Dealing with the shortfall has been a challenge ever since, at one point even bringing Arizona and California to the brink of interstate war.

Water policy in California has historically been reactive rather than proactive. The first years of the 2012-16 drought yielded the Sustainable Groundwater Management Act. Known as SGMA and signed by Gov. Jerry Brown in 2014, the law was the first to regulate the exploitation of groundwater, which feeds one-third of the state’s demand in normal years and half in dry years.

The SGMA places overdrawn groundwater regions, such as the southern part of the agriculturally rich Central Valley, under stringent rules starting in 2040.

The drought has already left its mark on California.

Rate increases by the San Diego County Water Authority averaging 8% a year over the last decade have driven many of that county’s avocado growers out of business, local farmers say, but the pain is more widespread: Agricultural acreage in the county fell to 234,477 acres in 2019 from 302,000 in 2010.

Residents in the San Jose region are facing annual rate increases for drinking water of up to 9.6% each year for the next eight years; that would mean increases of as much as $5.10 a month on their bills, according to the Santa Clara Valley Water District.

Let’s take a look at the implications for different segments of California society.

To begin with, the structure of California agriculture will have to change, though no one is yet sure how.

“The unfortunate reality is that some amount of farmland will probably have to go out of production to manage the reduced supply of water,” says Ann Hayden, a water expert at the Environmental Defense Fund in Sacramento. She calls for planning now “to support farmers as they’re making decisions about what lands to take out of production.” Fallowed land, she observes, creates air quality and water quality problems that will have to be addressed proactively.

Among the crops vulnerable to changing conditions are almonds, which at $6 billion in value are the state’s second-largest farm commodity (after milk and cream). Driven by the profits to be made, almond acreage has roughly doubled over the last decade to 1.6 million acres.

Almonds are known as thirsty crops, but the real significance of the expansion of acreage is that they’re permanent crops — they must be watered every year. As a result, almond orchards have been heavy users of groundwater. That’s a practice certain to come under pressure as the SGMA mandates come into effect in 2040.

Almond growers are only now starting to come to terms with the looming restrictions. “I’ve been in the industry for 25 years,” Holly A. King, a grower and chair of the California Almond Board, told the California Tree Nut Report last year. “What’s it going to look like in 25 more years? It’s not going to look like what it is today.”

As agricultural and residential demands take center stage, the environment suffers. In commercial terms, the fishing industry bears the brunt. The state’s salmon fishery was on the verge of being wiped out during the last drought. This one could finish the job.

As pressure intensifies on federal officials to increase releases from the Central Valley Project’s Shasta Lake — the reservoir behind Shasta Dam on the Sacramento River — to serve farmers, the threat to the fishing industry intensifies.

That’s because drawing down the reservoir depletes its pool of cold water, raising the temperature of the reservoir and then the river. “We’re looking at the loss of 90%-100% of juvenile salmon in the Sacramento River this fall,” says Barry Nelson, a consultant to the Golden Gate Salmon Assn. That would wipe out fall-run salmon, the industry’s lifeblood.

The last time that happened, in 2014-15, as few as 2% of naturally spawning fall-run salmon survived. “They were killed in their nests,” Nelson says. “They were cooked by high water temperatures.”

If there’s a bright spot in drought planning, it’s in the Southern California residential sector, which has become a world leader in water conservation and recycling. Within the MWD, total water demand has fallen over the last decade even as the population has edged up to 19 million from 18 million.

Much of the gain has come from installation of stingier household appliances, but much more can be done on outdoor use by planting drought-resistant vegetation to replace lawns. “We think we’ve gotten all the low-hanging fruit indoors,” Kightlinger says, “but there’s a lot more we could do outdoors. We could probably squeeze out another 10% to 20% relatively painlessly.”

By pushing down demand, the MWD has been able to store more water. Its current storage of about 3.4 million acre-feet (one acre-foot or 326,000 gallons is enough to supply one or two families for a year) would cushion the district for about six or seven years, Kightlinger says, given expected supplies coming from the Colorado and in-state sources.
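As a rough consistency check on that cushion (our back-of-the-envelope sketch; the roughly 500,000 acre-feet per year of net draw is an assumed figure, not one the district has stated):

    3.4 \times 10^{6}\ \text{AF} \div 5 \times 10^{5}\ \text{AF/yr} \approx 6.8\ \text{years}

which lands within Kightlinger’s six-to-seven-year range.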

But more planning and management will be needed in coming decades. Some solutions that seemed drastic in the past are getting closer looks. Those include draining Lake Powell, north of the Grand Canyon on the Arizona-Utah border, and making Lake Mead, behind Hoover Dam, the primary reservoir on the Colorado River for California, Arizona and Nevada.

The “Fill Mead First” campaign says that would reduce losses from evaporation and preserve Mead’s capacity to generate hydroelectricity. But deliberately lowering Lake Powell would foster a political backlash in the upper-basin states of Wyoming, Utah and Colorado, which view Lake Powell’s supply as a sort of guarantee that they can exploit the headwaters of the Colorado for their own purposes.

Both reservoirs are approaching critically low levels. The surface of Mead is currently about 150 feet below its maximum and could fall an additional 50 feet by late 2022; Powell is about 134 feet below its maximum elevation and could fall an additional 25 feet by early next year, according to projections by the U.S. Bureau of Reclamation.

If all this seems dizzyingly complicated, that’s the product of more than a century of fragmented water law and policy in California. The riddle can’t be solved by a patchwork of emergency declarations, no matter how urgent, but only by the crafting now of a comprehensive plan to address the inevitable consequences of climate change in the already arid West.

It’s well past time to come to terms with the words of John Wesley Powell, who led the first government expedition through the Grand Canyon, and who warned of a fraught future at a Los Angeles irrigation conference in 1893.

“Gentlemen,” he said, “you are piling up a heritage of conflict and litigation over water rights, for there is not sufficient water to supply the land.”


Chelan PUD's Steve Wright Leaving After Eight Years (Wenatchee World, WA – Paywall Advisory)

(WENATCHEE, WA) — General Manager Steve Wright is leaving the Chelan County PUD after eight years in the utility’s top job.

“The primary reason is simply that it’s been long enough. I’ve spent 40 years in the industry, 20 years in leadership. I want the opportunity to try something new,” he announced in a Tuesday afternoon statement.

“My contract with the PUD to serve as the General Manager runs through the end of this year. I have informed the Board that I do not intend to serve as the General Manager beyond that,” he said in the statement.

Wright was hired by the PUD in 2013 after a short-lived retirement from his previous position as head of the Bonneville Power Administration in Oregon. He started at Bonneville in 1981 in an entry-level position before becoming its interim leader in 2000 and then being permanently appointed in 2002.

Wright said he’s made his announcement early to allow the PUD’s board of commissioners time to search for his replacement and with enough time for an overlap in the transition.

In a statement of its own, the commission said it will conduct a national search for a new general manager and noted that the public will have an opportunity for input. They also thanked Wright for his work.

“We are particularly grateful that Steve will stay on as the General Manager through the end of this year and fulfill the full term of his contract,” the commissioners said. “We’d like for him to stay longer, but we appreciate the time he has given us.”

As for the future, Wright’s not sure what’s next for him.

“I have some thoughts but no firm plans. First and foremost, having run a few marathons, I believe in running through the tape,” Wright said in the statement. “I am feeling very optimistic about Chelan PUD and Chelan County. I want to make sure I am providing full effort until it’s the next person’s turn. We will keep doing what we have been doing, guided by our strategic plan.”

Monday, April 26, 2021

Avista Unveils Emissions Strategy (Spokesman-Review, Spokane, WA)


Spokane-based Avista Corp. this week announced its plan to reduce natural gas emissions as part of its strategy to provide a carbon-neutral energy supply by 2045.

“We are proud to continue our commitment to environmental stewardship and sustainability,” Dennis Vermillion, president and CEO of Avista, said in a statement Thursday. “We set an ambitious renewable electric energy goal in 2019 – to serve our customers with 100% clean electricity by 2045 and to have a carbon-neutral supply of electricity by 2027.”

Avista’s strategy to achieve lower emissions includes investing in new technology, such as renewable natural gas, hydrogen and other biofuels.

Op/Ed: It's Time to Remove Restrictions on Electric Vehicle Sales in Washington State (Puget Sound Business Journal, Seattle, WA)


Washington is home to many of the world’s largest and most innovative tech companies, whose services have transformed how the entire world does business and unleashed unprecedented growth in the process.

Washington sits at the nexus of innovation and environmentalism and has the cleanest state energy grid in the U.S., making electric vehicles (EVs) our best solution for decarbonizing the transportation sector — the largest source of state greenhouse gas emissions. EVs fit perfectly into Washington’s culture of putting economic development and conservation on equal footing; in 2020, we surpassed Gov. Jay Inslee’s initial goal of 50,000 EVs on the road. As battery technology continues to advance, EVs will achieve total cost of ownership parity with internal combustion engine vehicles, and EV ownership will likely accelerate.

The trend in greenhouse gas emissions last measured by the state in 2018 is moving in the wrong direction, however, coming in at 9% higher than the Legislature’s emissions target for 2020. Transportation accounts for 45% of Washington’s climate-warming emissions, so we must do all we can to reduce transportation emissions.

That includes accelerating the adoption of zero-emission vehicles (ZEVs).

Because Washington imports all the oil it uses from other states and foreign countries, moving from oil to Washington-generated electricity can enable the state to keep as much as $15 billion a year here in the state’s economy.

Allowing direct EV sales is a crucial component for implementing Washington’s recently adopted ZEV standards.

Current regulations inhibit electric vehicle sales, blocking Washington’s residents from enjoying full freedom to buy EVs in-state. Vehicle manufacturers, except Tesla, are barred from engaging in sales activity unless they operate through a franchise dealership.

While residents can still order these vehicles online, they must travel to a neighboring state (both Oregon and Idaho allow direct EV sales) to test drive or interact with the car before purchasing. New EV companies introducing long-range ZEVs seek to open EV stores in the state, supporting economic investment and job creation for the benefit of all Washingtonians.

Washington must also keep pace with the important dialogue on environmental justice and equity. Hazardous air pollution from vehicle emissions disproportionately impacts families living near high-traffic zones, including low-income communities. Accelerating EV adoption will lower emissions in these zones, providing immediate benefit to these families, even if they don’t own an electric vehicle.

Welcoming EV manufacturers will allow these companies to build retail locations, hire workers, develop charging infrastructure, and invest in Washington state.

Multimodal charging hubs around the region – similar to multimodal transportation hubs – will serve a variety of public and private fleet operators and citizens. This allows economies of scale, reduced costs of electric power and a better managed electrical grid.

The American EV industry is driving the transition to a 21st-century clean energy economy, attracting new investment and creating new domestic manufacturing jobs all around the country. By allowing direct sales, Washington can send a strong signal that it wants to be part of this new mobility shift.

Bruce Agnew is director of Cascadia/ACES NW Network.

Steve Marshall is executive director of CATES.

Joel Levin is executive director of Plug In America.

Wednesday, April 21, 2021

Montana: NorthWestern Energy Building $250 Million Natural Gas Power Plant in Laurel (Billings Gazette, MT)


NorthWestern Energy intends to build a $250 million, 175-megawatt power plant in Laurel as part of a plan to add 325 megawatts of dispatchable power to its Montana portfolio.

The utility laid out plans late Tuesday for the gas-fired power plant, along with a 50-megawatt battery storage project and a five-year agreement to buy 100 megawatts of hydropower. Details were announced in a press release NorthWestern posted to Globe Newswire.

The new power plant in Laurel will be a reciprocating internal combustion engine, or RICE, plant. Easily ramped up or down, RICE plants have become a popular tool for balancing intermittent generation resources like wind and solar farms. For perspective, the RICE plant’s output would be about 78% of the nameplate capacity of NorthWestern’s 30% share of Colstrip Power Plant Unit 4.
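That comparison is simple arithmetic, assuming the roughly 740-megawatt nameplate commonly cited for Colstrip Unit 4 (a figure not given in the article):

    0.30 \times 740\ \text{MW} \approx 222\ \text{MW}, \qquad 175\ \text{MW} \div 222\ \text{MW} \approx 0.79

or roughly the 78% the utility cites.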

“This resource portfolio addresses a key portion of our immediate need for generation capacity while also allowing us to make progress toward our goal of an energy supply portfolio in Montana that reduces the carbon intensity of our electric generation by 90% by 2045,” Bob Rowe, NorthWestern's CEO, was quoted saying in the press release.

The Laurel Generating Station is expected to be available to serve customers by Jan. 3, 2024.

The battery storage project is expected to come online in late 2023.

The projects announced Tuesday stem from a competitive solicitation for resources issued at the start of 2020. The utility is expected to report results of its request for proposal to the Montana Public Service Commission at month’s end. The company reports receiving proposals from 21 bidders.

NorthWestern indicates it will apply in May for PSC approval to add the RICE plant and the battery storage project to its portfolio.

The battery storage will be used to store wind energy when generation exceeds demand, so the power can be delivered when needed.

At the start of 2023, British Columbia-owned BC Hydro will supply NorthWestern with 100 megawatts of capacity through 2028.

NorthWestern has signaled a move toward gas-fired power plants for several years. In early 2019, it was telling would-be investors that it had a need for about 800 megawatts of dispatchable generation capacity. It indicated that gas-fired power plants were the preferred source for the generation and would likely be built 200 megawatts at a time.

Gas-fired power plants were also the focus of the utility's 2019 integrated resource plan. However, the day after a public hearing in which NorthWestern explained its reasons for pivoting to gas, the utility announced it intended to buy more of Colstrip Unit 4.

Puget Sound Energy and NorthWestern had entered a purchase sale agreement in which Montana's largest monopoly utility was to pay the aggregate price of $1 for Puget's 185-megawatt share of Unit 4. The two Colstrip owners chose to withdraw from the agreement after utility analysts in Washington determined that dollar deal was no bargain for consumers because of related power prices and unpaid debt.

Word of Laurel’s gas-fired power plant leaked weeks ago in the Montana Legislature as Sen. Steve Fitzpatrick, sponsor of a Colstrip bill benefiting NorthWestern, testified that there would be a gas-fired power plant in Billings. Fitzpatrick, the son of NorthWestern’s former director of government affairs, was 8 miles off the mark but in the ballpark.

Fitzpatrick’s Senate Bill 379, supported by NorthWestern, shields the utility from financial losses related to Colstrip by guaranteeing full cost recovery for new shares in the plant and by providing that only NorthWestern could decide when the plant was considered shut down. It also set up regulator-proof terms for passing unanticipated costs on to customers.

Relevant to the 175-megawatt Laurel RICE plant, SB 379 also set up a formula for determining what customers would pay for these new-to-NorthWestern Colstrip acquisitions. A 185-megawatt share of Unit 4 was likely to cost customers $486 million, according to PSC analysts, who also offered an alternative $238 million price that excluded basic costs like coal for making coal power, power plant maintenance and capital expenses.

Lawmakers were told there was no cheaper option.

“The only thing that's cheaper would be if you went down to the store and got some candles down at Safeway and burned those. That's probably cheaper. But nobody has produced anything that says that anything is cheaper than this. Replacing Colstrip is not cheaper,” Fitzpatrick told lawmakers last week.

The $250 million plant NorthWestern announced Tuesday would cost less than what PSC analysts had estimated customers would pay under the Colstrip terms offered by SB 379.

NorthWestern is also a gas company with 43.1 billion cubic feet of owned reserves and retail demand of 560 million cubic feet annually in Montana, as disclosed in corporate filings. The company owns three natural gas storage fields in Montana, as well as two transmission pipelines through subsidiaries, plus connections to four other transmission systems.

The Laurel power plant is the second gas-fired power plant announced by NorthWestern in the last 12 months. The utility announced the construction of a 58-megawatt RICE plant near Huron, South Dakota in May 2020. The cost of that plant was $84 million.

There will be opposition to a gas-fired power plant. Carbon dioxide emissions from gas-fired power plants are roughly 54% of those from a coal plant producing the same amount of energy, according to the U.S. Energy Information Administration. Peaker plants, like the one NorthWestern intends to build in Laurel, have been promoted as transition generators on the way to an all-renewable energy future.

With the advent of battery storage, there are renewable proponents who say natural gas as a transitional energy source isn’t necessary.

U.S. House Passes Cannabis Banking Bill – Will the Senate? (Portland Business Journal, OR)


The U.S. House of Representatives voted overwhelmingly on Monday to safeguard banks that serve state-legal cannabis businesses — just as it had in the previous session of Congress.

The difference this time is that Democrats control the Senate. That gives the Secure and Fair Enforcement (SAFE) Banking Act at least a chance at full adoption, something that wasn’t the case under Mitch McConnell’s majority leadership.

"As we continue to push forward with full legalization, addressing this irrational, unfair, and unsafe denial of banking services to state-legal cannabis businesses is a top priority," Portland Democrat Rep. Earl Blumenauer said. "This is a critical element of reform that can’t wait, and I urge our cannabis champions in the Senate to take up this legislation as soon as possible."

Oregon Democrat Jeff Merkley and Montana Republican Steve Daines have introduced a version of the bill in the Senate. It has 32 cosponsors, including seven Republicans.

But it's not clear how that bill might fit in with an effort to pass a comprehensive legalization bill, spearheaded by Majority Leader Chuck Schumer, Oregon's Ron Wyden and New Jersey's Cory Booker.

In an interview with Marijuana Moment late Monday, Schumer suggested banking reform could be folded in with comprehensive legislation, a tactic, he said, that "brings in some people who might not normally support legalization."

The banking bill certainly enjoys broad support; the vote in the House on Monday was 321-101, nearly identical to the 321-103 margin a similar version of the bill got when it first passed as a standalone measure in September 2019.

"This legislation is an important step toward resolving the conflict between state and federal law so banks can serve legal cannabis and cannabis-related businesses," the American Bankers Association said in a statement after the vote.

Because cannabis is a Schedule I drug under the Controlled Substances Act, most financial institutions now steer clear of businesses that deal with it, even in states where the drug is legal. The banking problems can also extend to businesses — legal, financial and other services — that serve the cannabis industry, even if they "don't touch the plant."

The bill passed Monday in the House would protect banks that serve state-legal cannabis businesses from federal regulatory penalties.

Blumenauer and Merkley have cited a recent rash of robberies at Portland-area cannabis stores in arguing the bill is more urgently needed than ever.


Op/Ed: Clean Energy Infrastructure Vital to Washington’s Future (Puget Sound Business Journal, WA)


Matthew Hepner is executive director, Certified Electrical Workers, International Brotherhood of Electrical Workers.

Weather-related blackouts in Texas earlier this year and the rolling blackouts in California last year are further reminders of how fragile our power grids can be.

As utilities go through the process of planning to comply with Washington’s Clean Energy Transformation Act (CETA), we must acknowledge that we have the same vulnerabilities here. Washington utilities therefore must include robust carbon-free energy infrastructure in their energy portfolios.

Washington state led the nation when Gov. Jay Inslee signed CETA in 2019. This groundbreaking legislation requires that our state’s utilities supply Washingtonians with 100% clean carbon-free power by 2045.

Now our region’s utilities, like Puget Sound Energy, have just gone through the process of developing integrated resource plans (IRPs) that involve planning how to remove greenhouse gas-emitting energy sources from their portfolios and replace them with carbon-free sources of power for utility customers. This is a daunting task, and we all must support the utilities as they work through this very complex and unprecedented process. In turn, the utilities must properly consider all available technology to supply Washingtonians with the affordable clean power we need.

One important mature technology in the Puget Sound Energy IRP process is closed-loop pumped storage. The proposed Goldendale Energy Storage Project in Klickitat County relies on this established carbon-free technology and would provide our region with the needed energy storage resources that will be essential in complying with CETA.

The Goldendale Project will generate 1,200 megawatts of clean electricity while also storing the region’s abundant wind and solar electricity to use when it is needed. The Goldendale Energy Storage Project is a “closed-loop” pumped hydro storage facility with an upper and lower reservoir where water is recirculated between the two reservoirs. During times of peak sun and/or high winds the plant uses surplus energy to pump water from the lower reservoir to the upper reservoir. Then, during peak demand hours, the water is returned by gravity to the lower reservoir passing through turbine generators that generate electricity.
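The energy such a plant can bank follows from the physics of lifted water: mass, gravity and head, discounted by round-trip efficiency. The figures below are illustrative assumptions chosen for scale, not project specifications:

    E = \rho V g h \eta \approx (1000\ \text{kg/m}^{3})(10^{7}\ \text{m}^{3})(9.81\ \text{m/s}^{2})(600\ \text{m})(0.8) \approx 4.7 \times 10^{13}\ \text{J} \approx 13\ \text{GWh}

enough, under these assumptions, to run a 1,200-megawatt plant for roughly 11 hours.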

Closed-loop projects like the Goldendale project are carbon-free with minimal environmental impact. Also, the recent blackouts in California and Texas underscore the importance of large-scale energy storage projects like the Goldendale Energy Storage Project in maintaining a reliable electric grid.

This project also makes good economic sense.

It will create more than 3,000 family-wage jobs during its four-year construction period, and another 50 to 70 permanent jobs. Also, because of the size and duration of the project’s construction, it is an important opportunity for the building trades to add to our nation’s critically important skilled and technically trained workforce by training union apprentices.

As our skilled workforce ages out, the building trades look for large projects that cover the 4- to 5-year duration of apprenticeship training programs. The Goldendale Project will also give those in the Gorge and surrounding rural areas an opportunity to enter these union training programs to be certified for a living-wage career.

Finally, the Goldendale project aligns perfectly with the Biden administration’s focus on climate change and clean energy. The Goldendale Project will help the U.S. achieve broader goals like the emissions reductions included in the Paris Climate Agreement.

Washington state is leading the nation through meaningful steps to address climate change. We all must work together to support the utilities as they work through finding solutions that will provide carbon-free affordable power. And the utilities must properly consider reliable technology like the Goldendale Storage Project to supply their customers with the power we need.


Coal Is Set to Roar Back – So Are Its Climate Risks (NY Times)


The pandemic abruptly slowed the global march of coal. But demand for the world’s dirtiest fuel is forecast to soar this year, gravely undermining the chances of staving off the worst effects of global warming.

Burning coal is the largest source of carbon dioxide emissions, and, after a pandemic-year retreat, demand for coal is set to rise by 4.5 percent this year, mainly to meet soaring electricity demand, according to data published Tuesday by the International Energy Agency, just two days before a White House-hosted virtual summit aimed at rallying global climate action.

“This is a dire warning that the economic recovery from the Covid crisis is currently anything but sustainable for our climate,” Fatih Birol, the head of the agency, said in a statement.

Coal is at the crux of critical political decisions that government leaders need to make this year if they are to transition to a green economy. Scientists say greenhouse gas emissions need to be halved by 2030 in order for the world to have a fighting chance at limiting dangerous levels of warming.

In short, this is a historic juncture for coal.

For 150 years, more and more of its sooty deposits have been extracted from under the ground, first to power the economies of Europe and North America, then Asia and Africa. Today, coal is still the largest source of electricity, though its share is steadily shrinking as other sources of power come online, from nuclear to wind.

Global spending on coal projects dropped to its lowest level in a decade in 2019. And, over the last 20 years, more coal-fired power plants have been retired or shelved than commissioned. The big holdouts are China, India and parts of Southeast Asia, but, even there, coal’s once-swift growth is nowhere as swift as it was just a few years ago, according to a recent analysis.

In some countries where new coal-fired power plants were only recently being built by the gigawatt, plans for new ones have been shelved, as in South Africa; reconsidered, as in Bangladesh; or run into funding troubles, as in Vietnam. In some countries, like India, existing coal plants are running way below capacity and losing money. In others, like the United States, they are being decommissioned faster than ever.

Nonetheless, demand is still strong. “Coal is not dead,” said Melissa C. Lott, research director for the Center for Global Energy Research at Columbia University. “We have made a lot of progress, but we have not made that curve.”

Coal is the lightning rod of climate diplomacy this year, as countries scramble to rebuild their economies after the coronavirus pandemic while at the same time staving off the risks of a warming planet. The Biden administration has leaned on its allies Japan and Korea to stop financing coal use abroad. And it has repeatedly called out China for its soaring coal use. China is by far the largest consumer of coal, and is still building coal-fired power plants at home and abroad.

“The principle of common but differentiated responsibilities must be upheld,” President Xi Jinping of China said at his own global summit in the city of Boao.

‘Growing opposition against coal’

Since the start of the industrial era, coal has been the main fuel to light up homes, power factories and, in some places, to cook and heat rooms, too. For over a century, Europe and the United States consumed most of the world’s coal. Today, China and India account for two-thirds of coal consumption.

Other energy sources have joined the mix as electricity demand has soared: nuclear, wind, and, most recently, hydrogen. Coal made room for new entrants but refused to retreat.

Today, several forces are rising against coal. People are clamoring against the deadly levels of air pollution caused by its combustion. Wind and solar energy, once far costlier than coal, are becoming competitive, while some countries are facing a glut of coal-fired plants already built.

So, even in countries where coal use is growing, the pace of growth is slowing.

In South Africa, after years of lawsuits, plans to build a coal-fired power station in Limpopo Province were canceled last November.

In at least three countries, Chinese-funded projects are in trouble or dead. In Kenya, a proposed coal plant has languished for years because of litigation. In Egypt, a planned coal plant is indefinitely postponed. In Bangladesh, Chinese-backed projects are among 15 planned coal plants that the government in Dhaka is reviewing, with an eye to canceling them altogether.

Pakistan, saddled by debts, announced a vague moratorium on new coal projects. Vietnam, which is still expanding its coal fleet, scaled back plans for new plants. The Philippines, under pressure from citizens’ groups, hit the pause button on new projects.

“Broadly speaking, there’s growing opposition against coal and a lot more scrutiny right now,” said Daine Loh, a Southeast Asia power sector energy specialist at Fitch Solutions, an industry analysis firm. “It’s a trend — moving away from coal. It’s very gradual.”

Money is part of the problem. Development banks are shying away from coal. Japan and Korea, two major financiers of coal, have tightened restrictions on new coal projects. Japan is still building coal plants at home, rare among industrialized countries, though Prime Minister Yoshihide Suga said in October that his country would aspire to draw down its emissions to net-zero by 2050.

There are some big exceptions. Indonesia and Australia continue to mine their abundant coal deposits. Perhaps most oddly, Britain, which is hosting the next international climate talks, is opening a new coal mine.

And then there are the world’s biggest coal consumers, China and India.

China’s economy rebounded in 2020. Government stimulus measures encouraged the production of steel, cement and other industrial products that eat up energy. Coal demand rose. The capacity of China’s fleet of coal-fired power plants grew by a whopping 38 gigawatts in 2020, making up the vast majority of new coal projects worldwide and offsetting nearly the same amount of coal capacity that was retired worldwide. (One gigawatt is enough to power a medium-sized city.)

Coal’s future in China is at the center of a robust debate in the country, with prominent policy advisers pressing for a near-moratorium on new coal plants and state-owned companies insisting that China needs to burn more coal for years to come.

India’s coal fleet is growing as well, bankrolled by state-owned lenders. There is not much of a signal from the government that it wants to reduce its reliance on coal, even as it seeks to expand solar energy. The government in New Delhi is allowing some of its oldest, most polluting coal plants to remain open, and it is seeking private investors to mine coal. If India’s economy recovers this year, its coal demand is set to rise by 9 percent, according to the I.E.A.

But even India’s coal fleet isn’t growing as fast as it was just a few years ago. On paper, India plans to add some 60 gigawatts of coal power capacity by 2026, but given how many existing plants are operating at barely half capacity, it’s unclear how many new ones will ultimately be built. A handful of state politicians have publicly opposed new coal-fired power plants in their states.

How much more coal India needs to burn, said Ritu Mathur, an economist at The Energy & Resources Institute in New Delhi, depends on how fast its electricity demand grows — and it could grow very fast if India pushes electric vehicles. “To say we can do away with coal, or that renewables can meet all our demand,” Dr. Mathur said, “is not the story.”

‘The big question is around gas’

What has most quickly come to replace coal in many countries is that other fossil fuel: gas.

From Bangladesh to Ghana to El Salvador, billions of dollars, some from public coffers, are being poured into the development of pipelines, terminals and storage tanks, as the number of countries importing liquefied natural gas has doubled in less than four years. Gas now supplies nearly one-fourth of all energy worldwide.

Its proponents argue that gas, which is less polluting than coal, should be promoted in energy-hungry countries that cannot afford a rapid scale-up of renewable energy. Its critics say multibillion-dollar investments in gas projects risk becoming stranded assets, as coal-fired power plants already are in some countries; they add that the emissions associated with gas, including methane that leaks before combustion, are incompatible with the Paris Agreement goal of slowing down climate change.

The United States, buoyed by the fracking boom, is among the world’s top gas exporters, alongside Qatar, Australia and Russia.

American companies are building a gas import terminal and power station in Vietnam. Gas demand is growing sharply in Bangladesh, as the government looks to shift away from coal to meet its galloping energy needs. Ghana this year became the first country in sub-Saharan Africa to import liquefied natural gas. And the U.S. Agency for International Development has been promoting gas as a way to electrify homes and businesses across Africa.

And there’s the rub for the Biden administration: While it has set out to be a global climate leader, it has not yet explained its policy on advancing gas exports — particularly on the use of public funds to build gas infrastructure abroad.

“There’s fairly strong consensus around coal. The big question is around gas,” said Manish Bapna, acting president of the World Resources Institute. “The broader climate community is starting to think about what a gas transition looks like.”


Developers Flock to Cold Storage as Americans Stock Their Freezers (NY Times)



Americans have treated their freezers a bit like security blankets over the past year, stuffing them full of staples and indulgences, a consumer behavior pattern that has had ripple effects beyond the walls of their kitchens.

Developers that focus on cold storage facilities say they are seeing growing interest from companies seeking to build, buy or invest in the sector, despite construction costs that are roughly triple that of an ordinary warehouse.

Americold, a logistics company focused on the cold storage supply chain, reported that its revenue grew 11.4 percent in 2020 from the previous year.

“We gave guidance pre-Covid for our 2020 year, and we’re one of the few companies that didn’t lift or change that guidance,” said Fred Boehler, chief executive of Americold, which added 46 facilities to its portfolio through a $1.74 billion acquisition of Agro Merchants Group last year. “What we eat and where we eat will change, but we’re going to eat.”

Where we eat has shifted overwhelmingly to our own kitchens and living rooms, and what we eat increasingly comes from the freezer.

“People were very nervous not just about getting to a store, but what was going to happen with the supply chain,” said Jill Standish, global head of the retail practice at consulting firm Accenture. She added that a survey in March 2020, the month the World Health Organization declared the pandemic, found that about one-third of American shoppers were buying more frozen food than normal.

Even though the food supply chain issues that characterized the early days of the pandemic have largely abated, Americans are still stocking up.

“Consumption of frozen or prepared meals was already on the rise leading into Covid,” said Beth Bloom, associate director of food and drink reports at the market research firm Mintel. The pandemic supercharged that trend, as restaurants shuttered and Americans stopped commuting to work and school.

Some people are motivated by the desire to avoid crowds: In a recent Mintel survey, 57 percent of respondents said they tried to limit the amount of time spent in stores — and 36 percent said they were still stockpiling groceries or household supplies.

“They want to stock up more so they can go less frequently,” Ms. Bloom said.

The widespread migration of the white-collar work force from downtown office towers and suburban corporate campuses into their homes is another key part of the dynamic.

“You’re talking about a lot of people that are going to need to fend for themselves at home in situations where they haven’t before, mainly lunchtime,” Ms. Bloom said.

The industry had to make large, rapid adjustments to accommodate these changes taking place in millions of homes across the country.

“This whole idea of food handling and cold storage in an e-commerce world is really different than it was in the past,” Ms. Standish said. “Instead of central locations of huge warehouses that have a long way to go to deliver, we’re seeing a lot of micro-fulfillment centers.”

Demand for cold storage at the last-mile stage of distribution — that is, near where people live — was rising before the pandemic. It accelerated when lockdown orders shut restaurants and food service operations, and Americans stuck at home turned to online grocery shopping en masse.

“The immediate change in consumer behavior due to Covid has caused companies to change how they service those demands,” said Art Rasmussen, a senior vice president at CBRE, a real estate investment and services firm. “Covid has accelerated the online growth by several years, and the infrastructure wasn’t quite ready to take on that capacity.”

Building near population centers is logical, but not necessarily easy or cheap. “Traditionally, people shied away from it because it was capital intensive,” said Tim O’Rourke, managing director at real estate research and services firm JLL.

“Supply and demand are very tight in this industry and always have been,” said Mr. Boehler, the Americold chief. Demand in particular for cold storage catering to the retail sector boomed during the pandemic. “Overnight, it went up 40 percent in terms of demand,” he said.

Building cold storage space can cost $150 per square foot, about three times as much as conventional warehouse space, so the “if you build it, they will come” development model used for other types of industrial real estate — typically referred to in the industry as building “on spec” — has not been financially feasible. Shovels go into the ground only after tenants have committed and leases have been signed. The upshot is tighter supply just when companies need a lot more of this space quickly.

Converting existing warehouses generally is not an option. Paradoxically, given that they are constructed to store goods in subzero conditions, cold-storage warehouses need heated floors. Mr. O’Rourke said the intensity of the cold generated by industrial-strength refrigeration equipment can seep into the ground, creating an artificial permafrost. When that frozen ground expands, it is likely to warp a building’s foundation.

Cold-storage facilities require numerous other specialized construction elements to meet safety regulations and manufacturers’ quality standards.

“If you have your ice cream you just bought and bring it home, what happens at the end of the week to that ice cream?” Mr. Boehler said. “It’s got crystals, it’s got a weird coating. The product itself starts to break down.

“Your freezer at home is meant to protect those goods for a couple of weeks,” he added. “Our freezers are designed and insulated to hold those same products for months and months.”

This means keeping goods much, much colder than in an ordinary household freezer. Roofs and walls are all heavily insulated. Doors are all tightly sealed and fitted with high-speed motors to keep cold air from escaping. As all of us who have been scolded for leaving the freezer door open know, those moments of contact with the outside world drive up utility costs quickly.

“It’s all about maintaining a very, very tight tolerance of temperature around whatever we’re storing,” Mr. Boehler said.

In spite of the high capital requirements, Mr. O’Rourke of JLL said more commercial real estate investors were embracing cold storage. “There’s lots of capital flowing into the space,” he said.

Cold storage sales volume rose 22 percent in 2020 on a year-over-year basis, while the broader industrial sector dropped 11 percent and all commercial real estate plummeted 29 percent, according to data from Real Capital Analytics.

The category is becoming more popular because operators can charge a premium, and sector performance has weathered the coronavirus storm better than other types of properties like hotels, offices and malls.

That, in turn, is leading to more developers bucking convention and building cold storage facilities on spec, Mr. O’Rourke said. “There are actually more speculative cold storage projects in the U.S.,” he said.

“A lot of the speculative cold storage projects are being built in population growth centers because they’re highly divisible,” he said. Subsections can have varying temperature ranges so fish sticks, fennel and fresh flowers can all be stored under the same roof.

“We’re all getting used to convenience now, and the way we think about e-commerce has now entered the food world,” he added.


The Science of Climate Change Explained: Facts, Evidence and Proof – Definitive Answers to the Big Questions. (NY Times)

By Julia Rosen

Ms. Rosen is a journalist with a Ph.D. in geology. Her research involved studying ice cores from Greenland and Antarctica to understand past climate changes.

The science of climate change is more solid and widely agreed upon than you might think. But the scope of the topic, as well as rampant disinformation, can make it hard to separate fact from fiction. Here, we’ve done our best to present you with not only the most accurate scientific information, but also an explanation of how we know it.

How do we know climate change is really happening?

Climate change is often cast as a prediction made by complicated computer models. But the scientific basis for climate change is much broader, and models are actually only one part of it (and, for what it’s worth, they’re surprisingly accurate).

For more than a century, scientists have understood the basic physics behind why greenhouse gases like carbon dioxide cause warming. These gases make up just a small fraction of the atmosphere but exert outsized control on Earth’s climate by trapping some of the planet’s heat before it escapes into space. This greenhouse effect is important: It’s why a planet so far from the sun has liquid water and life!
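The size of that effect falls out of a standard energy-balance calculation: a planet at Earth’s distance from the sun, absorbing sunlight and radiating as a blackbody with no greenhouse gases, would settle near 255 K, roughly 33 degrees Celsius colder than Earth’s observed average of about 288 K. Here S is the solar constant (about 1,361 watts per square meter), α ≈ 0.3 is Earth’s albedo and σ is the Stefan-Boltzmann constant:

    \frac{S(1-\alpha)}{4} = \sigma T^{4} \quad\Rightarrow\quad T = \left[\frac{1361 \times (1 - 0.3)}{4 \times 5.67 \times 10^{-8}}\right]^{1/4} \approx 255\ \text{K}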

However, during the Industrial Revolution, people started burning coal and other fossil fuels to power factories, smelters and steam engines, which added more greenhouse gases to the atmosphere. Ever since, human activities have been heating the planet.

We know this is true thanks to an overwhelming body of evidence that begins with temperature measurements taken at weather stations and on ships starting in the mid-1800s. Later, scientists began tracking surface temperatures with satellites and looking for clues about climate change in geologic records. Together, these data all tell the same story: Earth is getting hotter.

Average global temperatures have increased by 2.2 degrees Fahrenheit, or 1.2 degrees Celsius, since 1880, with the greatest changes happening in the late 20th century. Land areas have warmed more than the sea surface and the Arctic has warmed the most — by more than 4 degrees Fahrenheit just since the 1960s. Temperature extremes have also shifted. In the United States, daily record highs now outnumber record lows two-to-one.
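(A quick check on those two figures: temperature changes convert between the scales by a factor of 9/5, so

    \Delta T_{F} = \tfrac{9}{5}\,\Delta T_{C} = 1.8 \times 1.2 \approx 2.2

degrees Fahrenheit, matching the numbers above.)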

This warming is unprecedented in recent geologic history. A famous illustration, first published in 1998 and often called the hockey-stick graph, shows how temperatures remained fairly flat for centuries (the shaft of the stick) before turning sharply upward (the blade). It’s based on data from tree rings, ice cores and other natural indicators. And the basic picture, which has withstood decades of scrutiny from climate scientists and contrarians alike, shows that Earth is hotter today than it’s been in at least 1,000 years, and probably much longer.

In fact, surface temperatures actually mask the true scale of climate change, because the ocean has absorbed 90 percent of the heat trapped by greenhouse gases. Measurements collected over the last six decades by oceanographic expeditions and networks of floating instruments show that every layer of the ocean is warming up. According to one study, the ocean has absorbed as much heat between 1997 and 2015 as it did in the previous 130 years.

We also know that climate change is happening because we see the effects everywhere. Ice sheets and glaciers are shrinking while sea levels are rising. Arctic sea ice is disappearing. In the spring, snow melts sooner and plants flower earlier. Animals are moving to higher elevations and latitudes to find cooler conditions. And droughts, floods and wildfires have all gotten more extreme. Models predicted many of these changes, but observations show they are now coming to pass.

How much agreement is there among scientists about climate change?

There’s no denying that scientists love a good, old-fashioned argument. But when it comes to climate change, there is virtually no debate: Numerous studies have found that more than 90 percent of scientists who study Earth’s climate agree that the planet is warming and that humans are the primary cause. Most major scientific bodies, from NASA to the World Meteorological Organization, endorse this view. That’s an astounding level of consensus given the contrarian, competitive nature of the scientific enterprise, where questions like what killed the dinosaurs remain bitterly contested.

Scientific agreement about climate change started to emerge in the late 1980s, when the influence of human-caused warming began to rise above natural climate variability. By 1991, two-thirds of earth and atmospheric scientists surveyed for an early consensus study said that they accepted the idea of anthropogenic global warming. And by 1995, the Intergovernmental Panel on Climate Change, a famously conservative body that periodically takes stock of the state of scientific knowledge, concluded that “the balance of evidence suggests that there is a discernible human influence on global climate.” Currently, more than 97 percent of publishing climate scientists agree on the existence and cause of climate change (as does nearly 60 percent of the general population of the United States).

So where did we get the idea that there’s still debate about climate change? A lot of it came from coordinated messaging campaigns by companies and politicians that opposed climate action. Many pushed the narrative that scientists still hadn’t made up their minds about climate change, even though that was misleading. Frank Luntz, a Republican consultant, explained the rationale in an infamous 2002 memo to conservative lawmakers: “Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly,” he wrote. Questioning consensus remains a common talking point today, and the 97 percent figure has become something of a lightning rod.

To bolster the false impression of lingering scientific doubt, some people have pointed to things like the Global Warming Petition Project, which urged the United States government to reject the Kyoto Protocol of 1997, an early international climate agreement. The petition proclaimed that climate change wasn't happening, and even if it were, it wouldn't be bad for humanity. Since 1998, more than 30,000 people with science degrees have signed it. However, nearly 90 percent of them studied something other than Earth, atmospheric or environmental science, and the signatories included just 39 climatologists. Most were engineers, doctors, and others whose training had little to do with the physics of the climate system.

A few well-known researchers remain opposed to the scientific consensus. Some, like Willie Soon, a researcher affiliated with the Harvard-Smithsonian Center for Astrophysics, have ties to the fossil fuel industry. Others do not, but their assertions have not held up under the weight of evidence. At least one prominent skeptic, the physicist Richard Muller, changed his mind after reassessing historical temperature data as part of the Berkeley Earth project. His team's findings essentially confirmed the temperature records he had set out to scrutinize, and he came away firmly convinced that human activities were warming the planet. "Call me a converted skeptic," he wrote in an Op-Ed for the Times in 2012.

Mr. Luntz, the Republican pollster, has also reversed his position on climate change and now advises politicians on how to motivate climate action.

A final note on uncertainty: Denialists often use it as evidence that climate science isn’t settled. However, in science, uncertainty doesn’t imply a lack of knowledge. Rather, it’s a measure of how well something is known. In the case of climate change, scientists have found a range of possible future changes in temperature, precipitation and other important variables — which will depend largely on how quickly we reduce emissions. But uncertainty does not undermine their confidence that climate change is real and that people are causing it.

Do we really only have 150 years of climate data? How is that enough to tell us about centuries of change?

Earth’s climate is inherently variable. Some years are hot and others are cold, some decades bring more hurricanes than others, some ancient droughts spanned the better part of centuries. Glacial cycles operate over many millenniums. So how can scientists look at data collected over a relatively short period of time and conclude that humans are warming the planet? The answer is that the instrumental temperature data that we have tells us a lot, but it’s not all we have to go on.

Historical records stretch back to the 1880s (and often before), when people began to regularly measure temperatures at weather stations and on ships as they traversed the world’s oceans. These data show a clear warming trend during the 20th century.

Some have questioned whether these records could be skewed, for instance, by the fact that a disproportionate number of weather stations are near cities, which tend to be hotter than surrounding areas as a result of the so-called urban heat island effect. However, researchers regularly correct for these potential biases when reconstructing global temperatures. In addition, warming is corroborated by independent data like satellite observations, which cover the whole planet, and other ways of measuring temperature changes.

Much has also been made of the small dips and pauses that punctuate the rising temperature trend of the last 150 years. But these are just the result of natural climate variability or other human activities that temporarily counteract greenhouse warming. For instance, in the mid-1900s, internal climate dynamics and light-blocking pollution from coal-fired power plants halted global warming for a few decades. (Eventually, rising greenhouse gases and pollution-control laws caused the planet to start heating up again.) Likewise, the so-called warming hiatus of the 2000s was partly a result of natural climate variability that allowed more heat to enter the ocean rather than warm the atmosphere. The years since have been the hottest on record.

Still, could the entire 20th century just be one big natural climate wiggle? To address that question, we can look at other kinds of data that give a longer perspective. Researchers have used geologic records like tree rings, ice cores, corals and sediments that preserve information about prehistoric climates to extend the climate record. The resulting picture of global temperature change is basically flat for centuries, then turns sharply upward over the last 150 years. That picture, the hockey stick again, has been a target of climate denialists for decades. However, study after study has confirmed the results, which show that the planet hasn't been this hot in at least 1,000 years, and probably longer.

How do we know climate change is caused by humans?

Scientists have studied past climate changes to understand the factors that can cause the planet to warm or cool. The big ones are changes in solar energy, ocean circulation, volcanic activity and the amount of greenhouse gases in the atmosphere. And they have each played a role at times.

For example, 300 years ago, a combination of reduced solar output and increased volcanic activity cooled parts of the planet enough that Londoners regularly ice skated on the Thames. About 12,000 years ago, major changes in Atlantic circulation plunged the Northern Hemisphere into a frigid state. And 56 million years ago, a giant burst of greenhouse gases, from volcanic activity or vast deposits of methane (or both), abruptly warmed the planet by at least 9 degrees Fahrenheit, scrambling the climate, choking the oceans and triggering mass extinctions.

In trying to determine the cause of current climate changes, scientists have looked at all of these factors. The first three have varied a bit over the last few centuries and they have quite likely had modest effects on climate, particularly before 1950. But they cannot account for the planet’s rapidly rising temperature, especially in the second half of the 20th century, when solar output actually declined and volcanic eruptions exerted a cooling effect.

That warming is best explained by rising greenhouse gas concentrations. Greenhouse gases have a powerful effect on climate (see the next question for why). And since the Industrial Revolution, humans have been adding more of them to the atmosphere, primarily by extracting and burning fossil fuels like coal, oil and gas, which releases carbon dioxide.

Bubbles of ancient air trapped in ice show that, before about 1750, the concentration of carbon dioxide in the atmosphere was roughly 280 parts per million. It began to rise slowly and crossed the 300 p.p.m. threshold around 1900. The rise then accelerated as cars and electricity became big parts of modern life, with concentrations recently topping 420 p.p.m. The concentration of methane, the second most important greenhouse gas, has more than doubled. We're now emitting carbon much faster than it was released 56 million years ago.

These rapid increases in greenhouse gases have caused the climate to warm abruptly. In fact, climate models suggest that greenhouse warming can explain virtually all of the temperature change since 1950. According to the most recent report by the Intergovernmental Panel on Climate Change, which assesses published scientific literature, natural drivers and internal climate variability can only explain a small fraction of late-20th century warming.

Another study put it this way: The odds of current warming occurring without anthropogenic greenhouse gas emissions are less than 1 in 100,000.

But greenhouse gases aren’t the only climate-altering compounds people put into the air. Burning fossil fuels also produces particulate pollution that reflects sunlight and cools the planet. Scientists estimate that this pollution has masked up to half of the greenhouse warming we would have otherwise experienced.

Since greenhouse gases occur naturally, how do we know they’re causing Earth’s temperature to rise?

Greenhouse gases like water vapor and carbon dioxide serve an important role in the climate. Without them, Earth would be far too cold to maintain liquid water and humans would not exist!

Here’s how it works: the planet’s temperature is basically a function of the energy the Earth absorbs from the sun (which heats it up) and the energy Earth emits to space as infrared radiation (which cools it down). Because of their molecular structure, greenhouse gases temporarily absorb some of that outgoing infrared radiation and then re-emit it in all directions, sending some of that energy back toward the surface and heating the planet. Scientists have understood this process since the 1850s.

Greenhouse gas concentrations have varied naturally in the past. Over millions of years, atmospheric CO2 levels have changed depending on how much of the gas volcanoes belched into the air and how much got removed through geologic processes. On time scales of hundreds to thousands of years, concentrations have changed as carbon has cycled between the ocean, soil and air.

Today, however, we are the ones causing CO2 levels to increase at an unprecedented pace by taking ancient carbon from geologic deposits of fossil fuels and putting it into the atmosphere when we burn them. Since 1750, carbon dioxide concentrations have increased by almost 50 percent. Methane and nitrous oxide, other important anthropogenic greenhouse gases that are released mainly by agricultural activities, have also spiked over the last 250 years.
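
The "almost 50 percent" figure is simple arithmetic on the round numbers cited a few paragraphs up, roughly 280 p.p.m. before 1750 and about 420 p.p.m. today:

    \[
    \frac{420 - 280}{280} = 0.50
    \]

(Concentrations only recently topped 420 p.p.m., hence "almost.")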

We know based on the physics described above that this should cause the climate to warm. We also see certain telltale “fingerprints” of greenhouse warming. For example, nights are warming even faster than days because greenhouse gases don’t go away when the sun sets. And upper layers of the atmosphere have actually cooled, because more energy is being trapped by greenhouse gases in the lower atmosphere.

We also know that we are the cause of rising greenhouse gas concentrations — and not just because we can measure the CO2 coming out of tailpipes and smokestacks. We can see it in the chemical signature of the carbon in CO2.

Carbon comes in three different masses: 12, 13 and 14. Things made of organic matter (including fossil fuels) tend to have relatively less carbon-13. Volcanoes tend to produce CO2 with relatively more carbon-13. And over the last century, the carbon in atmospheric CO2 has gotten lighter, pointing to an organic source.

We can tell it’s old organic matter by looking for carbon-14, which is radioactive and decays over time. Fossil fuels are too ancient to have any carbon-14 left in them, so if they were behind rising CO2 levels, you would expect the amount of carbon-14 in the atmosphere to drop, which is exactly what the data show.
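
A toy mixing calculation shows why the carbon-14 ratio has to fall. This is our sketch, not an analysis from the article: it assumes all of the rise from 280 to 420 p.p.m. came from carbon-14-free fossil carbon, and it ignores real-world complications like ocean exchange and the carbon-14 spike from 1950s nuclear tests:

    # Toy model of carbon-14 "dilution" by fossil-fuel CO2.
    preindustrial_ppm = 280.0  # CO2 before about 1750
    fossil_ppm_added = 140.0   # enough to reach today's ~420 p.p.m.
    preindustrial_c14 = 1.0    # carbon-14 ratio, normalized to 1
    fossil_c14 = 0.0           # fossil carbon is too old to retain carbon-14

    total_ppm = preindustrial_ppm + fossil_ppm_added
    mixed_c14 = (preindustrial_ppm * preindustrial_c14
                 + fossil_ppm_added * fossil_c14) / total_ppm

    print(f"Normalized carbon-14 ratio after mixing: {mixed_c14:.2f}")
    # ~0.67: adding carbon-14-free CO2 can only drag the ratio down, which is
    # the direction the real measurements move.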

It’s important to note that water vapor is the most abundant greenhouse gas in the atmosphere. However, it does not cause warming; instead it responds to it. That’s because warmer air holds more moisture, which creates a snowball effect in which human-caused warming allows the atmosphere to hold more water vapor and further amplifies climate change. This so-called feedback cycle has doubled the warming caused by anthropogenic greenhouse gas emissions.

Why should we be worried that the planet has warmed 2 degrees Fahrenheit since the 1800s?

A common source of confusion when it comes to climate change is the difference between weather and climate. Weather is the constantly changing set of meteorological conditions that we experience when we step outside, whereas climate is the long-term average of those conditions, usually calculated over a 30-year period. Or, as some say: Weather is your mood and climate is your personality.
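
One way to see the distinction is with a toy calculation. This sketch uses made-up numbers (ours, not the article's): yearly "weather" that swings by several tenths of a degree can bury a slow warming trend that a 30-year average makes obvious:

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2021)
    trend = 0.01 * (years - 1900)                      # +0.01 degrees per year
    weather = trend + rng.normal(0, 0.3, len(years))   # noisy yearly readings

    # "Climate" at each point: the average over a 30-year window.
    climate = np.convolve(weather, np.ones(30) / 30, mode="valid")

    print(f"Largest year-to-year swing: {np.max(np.abs(np.diff(weather))):.2f} degrees")
    print(f"30-year averages, first vs. last: {climate[0]:.2f} -> {climate[-1]:.2f}")
    # Single years jump around far more than the trend does, but the smoothed
    # "climate" curve climbs steadily.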

So while 2 degrees Fahrenheit doesn’t represent a big change in the weather, it’s a huge change in climate. As we’ve already seen, it’s enough to melt ice and raise sea levels, to shift rainfall patterns around the world and to reorganize ecosystems, sending animals scurrying toward cooler habitats and killing trees by the millions.

It’s also important to remember that two degrees represents the global average, and many parts of the world have already warmed by more than that. For example, land areas have warmed about twice as much as the sea surface. And the Arctic has warmed by about 5 degrees. That’s because the loss of snow and ice at high latitudes allows the ground to absorb more energy, causing additional heating on top of greenhouse warming.

Relatively small long-term changes in climate averages also shift extremes in significant ways. For instance, heat waves have always happened, but they have shattered records in recent years. In June of 2020, a town in Siberia registered temperatures of 100 degrees. And in Australia, meteorologists have added a new color to their weather maps to show areas where temperatures exceed 125 degrees. Rising sea levels have also increased the risk of flooding because of storm surges and high tides. These are the foreshocks of climate change.

And we are in for more changes in the future — up to 9 degrees Fahrenheit of average global warming by the end of the century, in the worst-case scenario. For reference, the difference in global average temperatures between now and the peak of the last ice age, when ice sheets covered large parts of North America and Europe, is about 11 degrees Fahrenheit.

Under the Paris Climate Agreement, which President Biden recently rejoined, countries have agreed to try to limit total warming to between 1.5 and 2 degrees Celsius, or 2.7 and 3.6 degrees Fahrenheit, since preindustrial times. And even this narrow range has huge implications. According to scientific studies, the difference between 2.7 and 3.6 degrees Fahrenheit will very likely mean the difference between coral reefs hanging on or going extinct, and between summer sea ice persisting in the Arctic or disappearing completely. It will also determine how many millions of people suffer from water scarcity and crop failures, and how many are driven from their homes by rising seas. In other words, one degree Fahrenheit makes a world of difference.
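
The paired Celsius and Fahrenheit figures are just the standard conversion for temperature differences (intervals scale by 9/5; the familiar +32 offset applies only to absolute readings):

    \[
    \Delta T_{F} = \tfrac{9}{5}\,\Delta T_{C}, \qquad 1.5 \times \tfrac{9}{5} = 2.7, \qquad 2 \times \tfrac{9}{5} = 3.6
    \]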

Is climate change a part of the planet’s natural warming and cooling cycles?

Earth’s climate has always changed. Hundreds of millions of years ago, the entire planet froze. Fifty million years ago, alligators lived in what we now call the Arctic. And for the last 2.6 million years, the planet has cycled between ice ages when the planet was up to 11 degrees cooler and ice sheets covered much of North America and Europe, and milder interglacial periods like the one we’re in now.

Climate denialists often point to these natural climate changes as a way to cast doubt on the idea that humans are causing the climate to change today. However, that argument rests on a logical fallacy. It's like "seeing a murdered body and concluding that people have died of natural causes in the past, so the murder victim must also have died of natural causes," a team of social scientists wrote in The Debunking Handbook, which explains the misinformation strategies behind many climate myths.

Indeed, we know that different mechanisms caused the climate to change in the past. Glacial cycles, for example, were triggered by periodic variations in Earth’s orbit, which take place over tens of thousands of years and change how solar energy gets distributed around the globe and across the seasons.

These orbital variations don’t affect the planet’s temperature much on their own. But they set off a cascade of other changes in the climate system; for instance, growing or melting vast Northern Hemisphere ice sheets and altering ocean circulation. These changes, in turn, affect climate by altering the amount of snow and ice, which reflect sunlight, and by changing greenhouse gas concentrations. This is actually part of how we know that greenhouse gases have the ability to significantly affect Earth’s temperature.

For at least the last 800,000 years, atmospheric CO2 concentrations oscillated between about 180 parts per million during ice ages and about 280 p.p.m. during warmer periods, as carbon moved between oceans, forests, soils and the atmosphere. These changes occurred in lock step with global temperatures, and are a major reason the entire planet warmed and cooled during glacial cycles, not just the frozen poles.

Today, however, CO2 levels have soared to 420 p.p.m. — the highest they’ve been in at least three million years. The concentration of CO2 is also increasing about 100 times faster than it did at the end of the last ice age. This suggests something else is going on, and we know what it is: Since the Industrial Revolution, humans have been burning fossil fuels and releasing greenhouse gases that are heating the planet now (see Question 5 for more details on how we know this, and Questions 4 and 8 for how we know that other natural forces aren’t to blame).

Over the next century or two, societies and ecosystems will experience the consequences of this climate change. But our emissions will have even more lasting geologic impacts: According to some studies, greenhouse gas levels may have already warmed the planet enough to delay the onset of the next glacial cycle for at least an additional 50,000 years.

How do we know global warming is not because of the sun or volcanoes?

The sun is the ultimate source of energy in Earth’s climate system, so it’s a natural candidate for causing climate change. And solar activity has certainly changed over time. We know from satellite measurements and other astronomical observations that the sun’s output changes on 11-year cycles. Geologic records and sunspot numbers, which astronomers have tracked for centuries, also show long-term variations in the sun’s activity, including some exceptionally quiet periods in the late 1600s and early 1800s.

We know that, from 1900 until the 1950s, solar irradiance increased. And studies suggest that this had a modest effect on early 20th century climate, explaining up to 10 percent of the warming that’s occurred since the late 1800s. However, in the second half of the century, when the most warming occurred, solar activity actually declined. This disparity is one of the main reasons we know that the sun is not the driving force behind climate change.

Another reason we know that solar activity hasn’t caused recent warming is that, if it had, all the layers of the atmosphere should be heating up. Instead, data show that the upper atmosphere has actually cooled in recent decades — a hallmark of greenhouse warming.

So how about volcanoes? Eruptions cool the planet by injecting ash and aerosol particles into the atmosphere that reflect sunlight. We’ve observed this effect in the years following large eruptions. There are also some notable historical examples, like when Iceland’s Laki volcano erupted in 1783, causing widespread crop failures in Europe and beyond, and the “year without a summer,” which followed the 1815 eruption of Mount Tambora in Indonesia.

Since volcanoes mainly act as climate coolers, they can’t really explain recent warming. However, scientists say that they may also have contributed slightly to rising temperatures in the early 20th century. That’s because there were several large eruptions in the late 1800s that cooled the planet, followed by a few decades with no major volcanic events when warming caught up. During the second half of the 20th century, though, several big eruptions occurred as the planet was heating up fast. If anything, they temporarily masked some amount of human-caused warming.

The second way volcanoes can impact climate is by emitting carbon dioxide. This is important on time scales of millions of years — it’s what keeps the planet habitable (see Question 5 for more on the greenhouse effect). But by comparison to modern anthropogenic emissions, even big eruptions like Krakatoa and Mount St. Helens are just a drop in the bucket. After all, they last only a few hours or days, while we burn fossil fuels 24-7. Studies suggest that, today, volcanoes account for 1 to 2 percent of total CO2 emissions.

How can winters and certain places be getting colder if the planet is warming?

When a big snowstorm hits the United States, climate denialists often cite it as proof that climate change isn't happening. In 2015, Senator James Inhofe, an Oklahoma Republican, famously lobbed a snowball in the Senate as he denounced climate science. But these events don't actually disprove climate change.

While there have been some memorable storms in recent years, winters are actually warming across the world. In the United States, average temperatures in December, January and February have increased by about 2.5 degrees this century.

On the flip side, record cold days are becoming less common than record warm days. In the United States, record highs now outnumber record lows two-to-one. And ever-smaller areas of the country experience extremely cold winter temperatures. (The same trends are happening globally.)

So what’s with the blizzards? Weather always varies, so it’s no surprise that we still have severe winter storms even as average temperatures rise. However, some studies suggest that climate change may be to blame. One possibility is that rapid Arctic warming has affected atmospheric circulation, including the fast-flowing, high-altitude air that usually swirls over the North Pole (a.k.a. the Polar Vortex). Some studies suggest that these changes are bringing more frigid temperatures to lower latitudes and causing weather systems to stall, allowing storms to produce more snowfall. This may explain what we’ve experienced in the U.S. over the past few decades, as well as a wintertime cooling trend in Siberia, although exactly how the Arctic affects global weather remains a topic of ongoing scientific debate.

Climate change may also explain the apparent paradox behind some of the other places on Earth that haven't warmed much. For instance, a splotch of water in the North Atlantic has cooled in recent years, and scientists suspect that may be because ocean circulation is slowing as a result of freshwater streaming off a melting Greenland. If this circulation grinds almost to a halt, as it's done in the geologic past, it would alter weather patterns around the world.

Not all cold weather stems from some counterintuitive consequence of climate change. But it’s a good reminder that Earth’s climate system is complex and chaotic, so the effects of human-caused changes will play out differently in different places. That’s why “global warming” is a bit of an oversimplification. Instead, some scientists have suggested that the phenomenon of human-caused climate change would more aptly be called “global weirding.”

Wildfires and bad weather have always happened. How do we know there’s a connection to climate change?

Extreme weather and natural disasters are part of life on Earth — just ask the dinosaurs. But there is good evidence that climate change has increased the frequency and severity of certain phenomena like heat waves, droughts and floods. Recent research has also allowed scientists to identify the influence of climate change on specific events.

Let’s start with heat waves. Studies show that stretches of abnormally high temperatures now happen about five times more often than they would without climate change, and they last longer, too. Climate models project that, by the 2040s, heat waves will be about 12 times more frequent. And that’s concerning since extreme heat often causes increased hospitalizations and deaths, particularly among older people and those with underlying health conditions. In the summer of 2003, for example, a heat wave caused an estimated 70,000 excess deaths across Europe. (Human-caused warming amplified the death toll.)

Climate change has also exacerbated droughts, primarily by increasing evaporation. Droughts occur naturally because of random climate variability and factors like whether El Niño or La Niña conditions prevail in the tropical Pacific. But some researchers have found evidence that greenhouse warming has been affecting droughts since even before the Dust Bowl. And it continues to do so today. According to one analysis, the drought that afflicted the American Southwest from 2000 to 2018 was almost 50 percent more severe because of climate change. It was the worst drought the region had experienced in more than 1,000 years.

Rising temperatures have also increased the intensity of heavy precipitation events and the flooding that often follows. For example, studies have found that, because warmer air holds more moisture, Hurricane Harvey, which struck Houston in 2017, dropped between 15 and 40 percent more rainfall than it would have without climate change.
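
The claim that warmer air holds more moisture has a textbook basis, the Clausius-Clapeyron relation. Evaluated at a typical surface temperature (our rough numbers, not figures from the Harvey studies), it gives roughly 6 to 7 percent more water vapor per degree Celsius of warming:

    \[
    \frac{1}{e_s}\frac{de_s}{dT} = \frac{L_v}{R_v T^2} \approx \frac{2.5 \times 10^6\ \mathrm{J/kg}}{(461\ \mathrm{J\,kg^{-1}\,K^{-1}})\,(288\ \mathrm{K})^2} \approx 0.065\ \mathrm{per\ K}
    \]

That per-degree scaling is part of why a storm drawing on warmer air and water can wring out substantially more rain.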

It’s still unclear whether climate change is changing the overall frequency of hurricanes, but it is making them stronger. And warming appears to favor certain kinds of weather patterns, like the “Midwest Water Hose” events that caused devastating flooding across the Midwest in 2019.

It’s important to remember that in most natural disasters, there are multiple factors at play. For instance, the 2019 Midwest floods occurred after a recent cold snap had frozen the ground solid, preventing the soil from absorbing rainwater and increasing runoff into the Missouri and Mississippi Rivers. These waterways have also been reshaped by levees and other forms of river engineering, some of which failed in the floods.

Wildfires are another phenomenon with multiple causes. In many places, fire risk has increased because humans have aggressively fought natural fires and prevented Indigenous peoples from carrying out traditional burning practices. This has allowed fuel to accumulate that makes current fires worse.

However, climate change still plays a major role by heating and drying forests, turning them into tinderboxes. Studies show that warming is the driving factor behind the recent increases in wildfires; one analysis found that climate change is responsible for doubling the area burned across the American West between 1984 and 2015. And researchers say that warming will only make fires bigger and more dangerous in the future.

How bad are the effects of climate change going to be?

It depends on how aggressively we act to address climate change. If we continue with business as usual, by the end of the century, it will be too hot to go outside during heat waves in the Middle East and South Asia. Droughts will grip Central America, the Mediterranean and southern Africa. And many island nations and low-lying areas, from Texas to Bangladesh, will be overtaken by rising seas. Conversely, climate change could bring welcome warming and extended growing seasons to the upper Midwest, Canada, the Nordic countries and Russia. Farther north, however, the loss of snow, ice and permafrost will upend the traditions of Indigenous peoples and threaten infrastructure.

It’s complicated, but the underlying message is simple: unchecked climate change will likely exacerbate existing inequalities. At a national level, poorer countries will be hit hardest, even though they have historically emitted only a fraction of the greenhouse gases that cause warming. That’s because many less developed countries tend to be in tropical regions where additional warming will make the climate increasingly intolerable for humans and crops. These nations also often have greater vulnerabilities, like large coastal populations and people living in improvised housing that is easily damaged in storms. And they have fewer resources to adapt, which will require expensive measures like redesigning cities, engineering coastlines and changing how people grow food.

Already, between 1961 and 2000, climate change appears to have harmed the economies of the poorest countries while boosting the fortunes of the wealthiest nations that have done the most to cause the problem, making the global wealth gap 25 percent bigger than it would otherwise have been. Similarly, the Global Climate Risk Index found that lower income countries — like Myanmar, Haiti and Nepal — ranked high on the list of nations most affected by extreme weather between 1999 and 2018. Climate change has also contributed to human migration, which is expected to increase significantly.

Even within wealthy countries, the poor and marginalized will suffer the most. People with more resources have greater buffers, like air-conditioners to keep their houses cool during dangerous heat waves, and the means to pay the resulting energy bills. They also have an easier time evacuating their homes before disasters, and recovering afterward. Lower income people have fewer of these advantages, and they are also more likely to live in hotter neighborhoods and work outdoors, where they face the brunt of climate change.

These inequalities will play out on an individual, community, and regional level. A 2017 analysis of the U.S. found that, under business as usual, the poorest one-third of counties, which are concentrated in the South, will experience damages totaling as much as 20 percent of gross domestic product, while others, mostly in the northern part of the country, will see modest economic gains. Solomon Hsiang, an economist at the University of California, Berkeley, and the lead author of the study, has said that climate change "may result in the largest transfer of wealth from the poor to the rich in the country's history."

Even the climate “winners” will not be immune from all climate impacts, though. Desirable locations will face an influx of migrants. And as the coronavirus pandemic has demonstrated, disasters in one place quickly ripple across our globalized economy. For instance, scientists expect climate change to increase the odds of multiple crop failures occurring at the same time in different places, throwing the world into a food crisis.

On top of that, warmer weather is aiding the spread of infectious diseases and the vectors that transmit them, like ticks and mosquitoes. Research has also identified troubling correlations between rising temperatures and increased interpersonal violence, and climate change is widely recognized as a “threat multiplier” that increases the odds of larger conflicts within and between countries. In other words, climate change will bring many changes that no amount of money can stop. What could help is taking action to limit warming.

What will it cost to do something about climate change, versus doing nothing?

One of the most common arguments against taking aggressive action to combat climate change is that doing so will kill jobs and cripple the economy. But this implies that there’s an alternative in which we pay nothing for climate change. And unfortunately, there isn’t. In reality, not tackling climate change will cost a lot, and cause enormous human suffering and ecological damage, while transitioning to a greener economy would benefit many people and ecosystems around the world.

Let’s start with how much it will cost to address climate change. To keep warming well below 2 degrees Celsius, the goal of the Paris Climate Agreement, society will have to reach net zero greenhouse gas emissions by the middle of this century. That will require significant investments in things like renewable energy, electric cars and charging infrastructure, not to mention efforts to adapt to hotter temperatures, rising sea-levels and other unavoidable effects of current climate changes. And we’ll have to make changes fast.

Estimates of the cost vary widely. One recent study found that keeping warming to 2 degrees Celsius would require a total investment of between $4 trillion and $60 trillion, with a median estimate of $16 trillion, while keeping warming to 1.5 degrees Celsius could cost between $10 trillion and $100 trillion, with a median estimate of $30 trillion. (For reference, the entire world economy was about $88 trillion in 2019.) Other studies have found that reaching net zero will require annual investments ranging from less than 1.5 percent of global gross domestic product to as much as 4 percent. That’s a lot, but within the range of historical energy investments in countries like the U.S.
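
To put the median estimate in rough perspective, here is our arithmetic using the article's own figures, assuming for simplicity that spending is spread evenly over the three decades to mid-century and measured against a flat $88 trillion world economy:

    \[
    \frac{\$16\ \mathrm{trillion}}{30\ \mathrm{years}} \approx \$0.5\ \mathrm{trillion\ per\ year}, \qquad \frac{0.5}{88} \approx 0.6\%\ \mathrm{of\ G.D.P.\ annually}
    \]

That is consistent with the low end of the annual-investment range the other studies report.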

Now, let’s consider the costs of unchecked climate change, which will fall hardest on the most vulnerable. These include damage to property and infrastructure from sea-level rise and extreme weather, death and sickness linked to natural disasters, pollution and infectious disease, reduced agricultural yields and lost labor productivity because of rising temperatures, decreased water availability and increased energy costs, and species extinction and habitat destruction. Dr. Hsiang, the U.C. Berkeley economist, describes it as “death by a thousand cuts.”

As a result, climate damages are hard to quantify. Moody’s Analytics estimates that even 2 degrees Celsius of warming will cost the world $69 trillion by 2100, and economists expect the toll to keep rising with the temperature. In a recent survey, economists estimated the cost would equal 5 percent of global G.D.P. at 3 degrees Celsius of warming (our trajectory under current policies) and 10 percent for 5 degrees Celsius. Other research indicates that, if current warming trends continue, global G.D.P. per capita will decrease between 7 percent and 23 percent by the end of the century — an economic blow equivalent to multiple coronavirus pandemics every year. And some fear these are vast underestimates.

Already, studies suggest that climate change has slashed incomes in the poorest countries by as much as 30 percent and reduced global agricultural productivity by 21 percent since 1961. Extreme weather events have also racked up a large bill. In 2020, in the United States alone, climate-related disasters like hurricanes, droughts, and wildfires caused nearly $100 billion in damages to businesses, property and infrastructure, compared to an average of $18 billion per year in the 1980s.

Given the steep price of inaction, many economists say that addressing climate change is a better deal. It’s like that old saying: an ounce of prevention is worth a pound of cure. In this case, limiting warming will greatly reduce future damage and inequality caused by climate change. It will also produce so-called co-benefits, like saving one million lives every year by reducing air pollution, and millions more from eating healthier, climate-friendly diets. Some studies even find that meeting the Paris Agreement goals could create jobs and increase global G.D.P. And, of course, reining in climate change will spare many species and ecosystems upon which humans depend — and which many people believe to have their own innate value.

The challenge is that we need to reduce emissions now to avoid damages later, which requires big investments over the next few decades. And the longer we delay, the more we will pay to meet the Paris goals. One recent analysis found that reaching net zero by 2050 would cost the U.S. almost twice as much if we waited until 2030 instead of acting now. But even if we miss the Paris target, the economics still make a strong case for climate action, because every additional degree of warming will cost us more — in dollars, and in lives.