Friday, July 1, 2016

Climate Central - News

Climate Central is a nonprofit science and media organization created to provide clear and objective information about climate change and its potential solutions.
  1. Here’s What Your July 4 Road Trip Means for the Climate

    When an expected record-breaking 36 million Americans take their holiday road trips this Fourth of July weekend, they’ll be part of what is quickly becoming our nation’s biggest source of carbon dioxide emissions — transportation.

    This winter, for the first time since 1979, carbon dioxide emissions from cars, trucks and SUVs surpassed the carbon pollution from electric power plants, according to U.S. Energy Information Administration data released this week. Power plants had been America’s chief climate polluter for more than 35 years.

    Traffic in Los Angeles. Credit: Prayitno/flickr

    “The major significance is it shows that decarbonization is about more than just the power sector,” said Sam Ori, executive director of the Energy Policy Institute at the University of Chicago.

    In both February and March, the transportation sector’s annual carbon dioxide emissions added up to more than the emissions from electric power plants (when calculated using a rolling 12-month total).


    In raw numbers, Americans’ vehicle tailpipes emitted 1.88 billion metric tons of carbon dioxide for the 12 months ending in March. During the same period, U.S. power plants polluted the air with 1.84 billion tons of carbon dioxide. The numbers were similar in February.

    By contrast, for the 12 months ending in March 2015, vehicles polluted the air with 1.84 billion tons of carbon dioxide, quite a bit less than the 1.99 billion tons of carbon dioxide that power plants emitted into the atmosphere.
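    The crossover can be checked directly from the EIA figures quoted above. A minimal sketch, using only the rounded rolling 12-month totals reported in this story (billions of metric tons of CO2):

```python
# Rolling 12-month CO2 emissions, billions of metric tons (figures quoted above).
transport = {"2015-03": 1.84, "2016-03": 1.88}  # cars, trucks, SUVs, planes, etc.
power = {"2015-03": 1.99, "2016-03": 1.84}      # electric power plants

# Compare the two sectors for each 12-month period ending in March.
for period in transport:
    leader = "transportation" if transport[period] > power[period] else "power"
    print(period, "biggest emitter:", leader)
# 2015-03 biggest emitter: power
# 2016-03 biggest emitter: transportation
```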

    Roughly two-thirds of transportation sector carbon emissions come from light-duty cars and trucks, with airplanes, trains, heavy trucks and ships making up the rest.

    Cheaper gasoline is the primary reason for the increase in emissions: Americans are driving far more than they used to, and for the moment that extra driving is outpacing gains in vehicle fuel efficiency.

    Credit: Sam Ori/Energy Policy Institute at the University of Chicago

    At the same time, power plants are getting greener and cleaner. Their emissions have been falling for more than six years as the fracking boom, cheap natural gas prices, federal mercury pollution regulations, and the prospect of having to comply with climate change rules such as the Clean Power Plan have encouraged utilities to shutter coal-fired power plants.

    Cleaner-burning natural gas power plants and wind and solar farms are replacing those that run on coal, and today natural gas, which emits about half as much carbon as coal, is becoming America’s chief source of electricity.

    Consumer transportation choices — the kinds of cars people choose to drive and the road trips they choose to take when gasoline prices fall — are among the biggest reasons transportation is becoming America’s biggest climate change problem, Ori said.

    “Miles driven on U.S. highways grew by the second-fastest rate in 40 years last year and will probably match that rate this year,” Ori said.

    With gasoline averaging $2.31 per gallon of regular unleaded — 47 cents less than a year ago, according to AAA — Americans are driving more than ever before. Ori’s analysis of EIA data shows that Americans drive an average of about 8.7 billion miles per day, up from about 8.1 billion in 2011 and 6.8 billion in 1997.

    The 2008 recession stopped growth of vehicle miles traveled in its tracks, but those miles started growing quickly in 2013 and 2014 as the economy improved and crude oil prices collapsed. This weekend, AAA expects the highest-ever Fourth of July travel volume — 36 million people — on the highways, mainly because of cheap gasoline.

    Cheap gasoline has made it easier for people to ditch electric vehicles and fuel-efficient cars and buy larger, less efficient vehicles instead.

    Credit: Sam Ori/Energy Policy Institute at the University of Chicago

    “As oil and gasoline prices fell, U.S. consumers piled into SUVs and pickup trucks at record levels,” Ori said. “Those kinds of light trucks now account for about 60 percent of the U.S. market, up from around 50 percent in 2013.”

    All vehicles are becoming more efficient, but larger vehicles will always be less efficient than smaller ones, so the more people drive larger vehicles, the more pollution they’ll emit, said Daniel Sperling, a civil engineering professor at the Institute of Transportation Studies at the University of California-Davis.

    The trend toward higher transportation emissions shows that to address climate change, the government needs to both boost fuel efficiency standards and think beyond them by developing new fuels and technology that would reduce carbon emissions from vehicle tailpipes, said John DeCicco, a research professor at the University of Michigan Energy Institute.

    “In terms of how people think about climate change, there is a need for much greater awareness of the significance of transportation as well as better education about consumer choices,” he said.

    It has always been easier to slash emissions from power plants than from cars and trucks because regulating power plants doesn’t require consumers to change their lifestyle choices, said Timothy Johnson, an energy and environment professor at Duke University.

    For vehicle emissions to drop, it will have to become more expensive for people to drive, Johnson said.

    “Transportation costs are a reasonably small share of average household spending (less than 20 percent) for people in the middle class and above, and people seem to tolerate a certain amount of time spent commuting,” he said. “Transportation choices and behavior are not likely to change unless either of these factors increases appreciably.”


  2. Mexico, Canada, U.S. to Make Clean Power Pledge

    The U.S., Mexico and Canada are expected to pledge Wednesday to collectively generate 50 percent of their electricity from zero-carbon sources by 2025, according to White House officials.

    The agreement is expected to be struck at the North American Leaders’ Summit in Ottawa. It means that when all the electricity generated in the three countries is added up, the amount coming from zero-carbon sources will jump from 37 percent today to half within 10 years. Zero-carbon sources include solar, wind, hydropower and nuclear, along with energy efficiency and other measures, White House officials said Monday.

    Wind turbines near Palm Springs, Calif.
    Credit: nate2b/flickr

    “At a time when other parts of the world are splintering, it’s encouraging to see more of a unified effort in North America,” said Michael Gerrard, director of the Sabin Center for Climate Change Law at Columbia University. “This pledge won't be legally binding, but it signals political commitments by the current leadership of these three countries.”

    About 31 percent of U.S. electricity comes from zero-carbon sources today, including 20 percent from nuclear power, and about 11 percent from hydropower, wind and solar. Hydropower is Canada’s primary source of electricity, representing nearly 60 percent of its power supply. Clean energy generates 22 percent of Mexico’s electricity.


    The U.S., Canada and Mexico have each pledged to cut their greenhouse gas emissions as part of their commitments to the Paris climate agreement struck in December. Reducing greenhouse gas emissions from electric power plants forms the core of U.S. climate policy, including the Obama administration’s Clean Power Plan, which aims to drastically cut carbon pollution from coal-fired power plants.

    Wednesday’s agreement will put all three countries on a path to meeting their climate goals, though it won’t be enough by itself, said Michael Mann, a Penn State University climatologist.

    “Similarly strong commitments to reduce emissions in the electric power sector and other sectors of the economy will be critical if they are to meet their total greenhouse gas emissions pledges,” he said.

    Gerrard said Wednesday’s agreement will provide many opportunities for cross-border cooperation, including emissions trading and the export of clean energy.

    A hydroelectric dam in southern Quebec, Canada.
    Credit: Axel Drainville/flickr

    “The U.S. can't meet its Paris climate goals based solely on the Clean Power Plan and other policies now in place, and joint efforts with Canada and Mexico can make important contributions,” he said. “The magnitude of the needed energy transition away from fossil fuels is such that, in addition to efficiency, wind and solar, we need large doses of hydropower and probably nuclear to fill the gap.”

    Robert Stavins, a professor of business and government at Harvard University’s John F. Kennedy School of Government, said while the pledges are not binding, they can lead to a greater reliance on renewables if Canada, the U.S. and Mexico follow through with meaningful policies and new energy efficiency standards.

    “Such policies are the instruments through which the U.S. will meet its Paris contribution,” he said.

    Mann cautioned that some “zero-carbon” energy sources, including hydro and nuclear power, come with high environmental costs, and the risks need to be weighed carefully.

    For example, hydropower reservoirs often emit methane — a powerful greenhouse gas helping to drive climate change. California does not consider large hydropower projects to be a renewable power source.

    “Wind and solar are arguably preferable choices from a full environmental cost-accounting standpoint, but such matters are worthy of a robust policy debate,” Mann said.


  3. ‘Water Windfall’ Found in Drought-Stricken California

    California’s Central Valley has three times more freshwater in underground aquifers than previously thought, drinking water that could help the state weather future drought and fortify itself against a changing climate, according to a new Stanford University study.

    But tapping that water, locked thousands of feet beneath the ground, will be expensive and comes with an enormous risk  — it could cause the valley floor to sink, according to the study, published Monday in the Proceedings of the National Academy of Sciences. Sinking land in the Central Valley is threatening roads, homes and other infrastructure, and reduces the amount of water some aquifers can hold.

    California's parched Central Valley in 2014.
    Credit: Stuart Rankin/NASA/flickr

    “It’s not often that you find a ‘water windfall,’ but we just did,” said study co-author Rob Jackson, an earth system science professor at Stanford University. “California’s already using an increasing amount of groundwater from deeper than 1,000 feet. Our goal was to estimate how much water is potentially available.”

    Climate change is exposing the state to a greater threat of drought, reducing the amount of water available for farming and drinking as higher temperatures evaporate reservoirs. More precipitation is expected to fall as rain instead of snow in California as the world warms, forcing the state to find new ways to store rain water for municipal and agricultural use.


    To stave off losses during its four-year drought, California has relied on groundwater to irrigate its farm fields. So much groundwater is being used that the water table has fallen by 50 feet in some places in the Central Valley, and the valley floor is sinking, or subsiding, as aquifers are depleted.

    Land subsidence, which has been occurring in the valley for decades because of groundwater pumping, has accelerated to two inches per month in some places, according to NASA. Sinking land threatens roads, bridges, aqueducts, buildings and other infrastructure as the land collapses beneath them.

    Most of the groundwater comes from aquifers less than 1,000 feet deep. Deeper aquifers are usually considered too salty to be used for drinking or irrigation, requiring costly desalination and drilling operations to access them.

    Analyzing water data gathered from oil and gas wells across eight Central Valley counties, the Stanford researchers show that there are about 2,700 cubic kilometers of accessible fresh or brackish water locked in the Central Valley’s deep underground aquifers. That’s almost triple the 1,020 cubic kilometers of freshwater that had been previously estimated.

    Farming in California consumes between 25 million and 33 million acre-feet of water annually, or between 31 and 40 cubic kilometers of water, according to a 2015 Congressional Research Service report. A cubic kilometer of water is roughly equivalent to 1.3 times Los Angeles’ annual water use.
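    Those unit conversions are easy to verify. A quick sketch, assuming the standard conversion of 1 acre-foot ≈ 1,233.48 cubic meters (the conversion factor is an outside assumption, not from the report):

```python
ACRE_FOOT_M3 = 1233.48   # cubic meters in one acre-foot (standard conversion)
M3_PER_KM3 = 1e9         # cubic meters in one cubic kilometer

def acre_feet_to_km3(acre_feet):
    """Convert a volume in acre-feet to cubic kilometers."""
    return acre_feet * ACRE_FOOT_M3 / M3_PER_KM3

# California farm water use, 25 to 33 million acre-feet per year:
low = acre_feet_to_km3(25e6)
high = acre_feet_to_km3(33e6)
print(f"farm water use: {low:.1f} to {high:.1f} km3/yr")  # ~30.8 to ~40.7

# The new 2,700 km3 estimate versus the prior 1,020 km3 figure:
print(f"increase: {2700 / 1020:.2f}x")  # ~2.65, i.e. "almost triple"
```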

    Some of the water that Jackson’s team found is considered brackish — containing low levels of salt — but it could be affordably desalinated, the study says.

    “States such as Texas and Florida, and countries, including China and Australia, are already desalinating brackish water to meet their growing water demands,” the study says. “Accounting for deep but relatively fresh groundwater can substantially expand California’s groundwater resources, which is critical given the state’s current water shortages.”

    Additional research is needed to determine how much tapping the water would cause the valley floor to sink and how oil and gas development, which is common in those deep aquifers, could contaminate the water, especially from fracking, according to the study.

    The California drought has forced cities to cut back on water use.
    Credit: Kevin Cortopassi/flickr

    “We're not advocating running out and drilling lots more groundwater wells,” Jackson said. “The Central Valley's been in denial about groundwater overdrafts for years. We need to consider ground subsidence. We also need to think about oil and gas activities directly in and around freshwater aquifers. Is that the best use of the resource long term?”

    The California Department of Water Resources, the state’s water agency, is concerned about the long-term implications of possibly using — and using up — a newly found reserve of freshwater.

    “Understanding the total aquifer capacity is valuable from a technical standpoint, but a more useful estimate would be how much of the aquifer can we truly utilize before we experience significant impacts to surrounding agricultural, urban and domestic water users, to public infrastructures, to the environment and to the aquifers’ ability to recharge in a reasonable time frame,” said Lauren Hersh, spokeswoman for the California Department of Water Resources’ Sustainable Groundwater Management Program.

    Leonard Konikow, an emeritus U.S. Geological Survey groundwater scientist and author of a 2013 federal government report on groundwater depletion in the U.S., said deep underground freshwater may be too expensive for many in California to access.

    “In a severe drought, such deep drilling for water might be justified for municipal or industrial supplies, but I can’t imagine that the cost would ever be justified for agricultural purposes,” he said.

    But Jackson said deep freshwater is a largely untapped and little-understood resource.

    “It’s a huge pool of water,” Jackson said. “Some companies and towns are already pumping deep groundwater. It’s a little more expensive to use because of the pumping costs, but people are already doing it. Remember, too, that private landholders often have few restrictions on what they can pump.”


  4. A Computer Just Changed the Coral Research Game

    Coral reefs are increasingly imperiled by global warming. Rising temperatures and ocean acidification are destroying some of the world’s most stunning ecosystems.

    That’s why scientists’ efforts to save them have been going into overdrive recently, working in labs and in the field to find out what makes some corals more resilient than others. The latest tool in the fight to save reefs doesn’t require test tubes or flippers, though. Instead, it requires microchips and computer coding that can analyze data orders of magnitude faster than the human eye.

    Computer analysis of XL Catlin Global Reef Record data now matches the accuracy of marine biologists but is roughly 900 times faster.
    Credit: XL Catlin Global Reef Record

    Researchers with the XL Catlin Seaview Survey have been gathering a massive store of coral reef data over the past four years using Google Street View technology. But it was almost too much of a good thing. Their store of data was so large, it would take an expert up to three decades to sort through and catalog it. Unfortunately, reefs don’t really have the luxury of waiting that long.

    A global coral bleaching event is going into its third year. It’s already affected 40 percent of the world’s coral reefs, including a widespread die-off in the Great Barrier Reef. Rising ocean temperatures due to climate change are, in large part, to blame. Ocean acidification is already eating away at reefs as well. Unless carbon pollution is reined in, many reefs could go extinct by mid-century.


    The XL Catlin Seaview Survey has documented this bleaching as well as the reefs and species that have survived it. Finding those survivors is incredibly important to scientists trying to preserve reefs, but they’re needles in a haystack of gigabytes of data.

    “We started the project knowing we were going to gather a lot of data,” Richard Vevers, executive director of the XL Catlin Seaview Survey, said. “We also knew early on we would have a problem analyzing images, so we had to find a shortcut.”

    That shortcut has been to turn to computers and the concept of “deep learning” — basically training computer eyes to spot patterns and changes. It’s a little like Facebook’s ability to recognize your friends in pictures from your last night out, but instead of beers with your buds, it's for corals and reef changes and saving species from going extinct.

    Vevers and his cadre of reef scientists connected with computer scientists at the University of California, Berkeley’s Artificial Intelligence Research Center for the data analysis. After training the computers to essentially be android coral reef scientists, they put them to the task of analyzing the images. At a clip of 90 images per minute, three graphics processing units were able to knock out the analysis in a matter of weeks with an identification success rate greater than 90 percent, or about on par with a trained expert.
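    The time savings implied by those figures can be sanity-checked with back-of-the-envelope arithmetic; the roughly 900x speedup comes from the image caption above, and the three-decade expert estimate from earlier in this story:

```python
expert_years = 30   # time a human expert would need to catalog the data
speedup = 900       # machine vs. expert, per the XL Catlin figures above

# At ~900x an expert's pace, a 30-year job shrinks to roughly:
machine_days = expert_years * 365 / speedup
print(f"machine time: ~{machine_days:.0f} days")  # ~12 days
```

That lines up with the “matter of weeks” the three graphics processing units actually took.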

    On Wednesday, XL Catlin Seaview Survey released the new dataset to the public to aid in what Vevers said was a “shift in science from understanding the problem to looking for solutions.”

    Those solutions could include targeted conservation efforts for vulnerable reefs and identifying the corals most resilient to climate change. And while computers will aid scientists in identifying where the best places to preserve healthy reefs are, ultimately, the fate of corals rests in the world’s hands and is dependent on carbon pollution being reduced.


  5. Climate Change is Tipping Scales Toward More Wildfires

    By Climate Central

    The 2016 wildfire season has barely begun and dozens of large wildfires have already raged through Western states, with hundreds of thousands of acres burned. This comes on the heels of a 2015 wildfire season that was the worst on record in the U.S., with more than 10 million acres burned.

    These are not just random events. Climate change is producing conditions ripe for wildfires, tipping the scales in favor of the dramatic increases in large wildfires we have seen across the West since the 1970s. Snowpack is melting earlier as winter and spring temperatures rise, and in most states an increasing percentage of winter precipitation is falling as rain, meaning there is often less snowpack to begin with. Summer temperatures are rising, particularly in Southwestern states, where the number of extremely hot days is steadily increasing, creating more days where forests and grasslands are dried out and ready to burn.

    In 2015, far below-average snowpack in California and the Pacific Northwest created exceptionally dry conditions across the West, and the region experienced fires of a size rarely seen. Washington’s Okanogan Complex fire was the largest group of fires on record for the state. And multiple years of searing drought in California contributed to several fires that were among the state’s top 10 most destructive fires on record.

    A Climate Central analysis of 45 years of U.S. Forest Service records from the western U.S. shows that the number of large fires on Forest Service land is increasing dramatically. The area burned by these fires is also growing at an alarming rate.

    • Across the Western U.S., the average annual number of large fires (larger than 1,000 acres) burning each year has more than tripled between the 1970s and the 2010s.
    • The area burned by these fires has shown an even larger increase: in an average year, more than six times as many acres burned across the West in the 2010s as in the 1970s.
    • The fire season is 105 days longer than it was in the 1970s, and is approaching the point where the notion of a fire season will be made obsolete by the reality of year-round wildfires across the West.


    The situation in some individual states is more extreme:

    • The average number of large fires burning each year on Forest Service land has increased at least 10-fold in the Northern Rocky Mountain states of Wyoming, Idaho, and Montana.
    • In the Pacific Northwest, there are now five times as many large fires burning in a typical year in Washington as there were in the 1970s; in Oregon there are nearly seven times as many.

    And the conditions are likely to get worse in the next several decades. Climate Central’s States at Risk project analyzed historical climate data and downscaled climate projections from 29 different global climate models. We found that in most western states, the climate conditions that can stoke summer wildfires are projected to increase substantially in the relatively short period between now and 2050. Arizona is expected to see more than a month of additional high-risk fire days by 2050.

    This analysis relies on the Keetch-Byram Drought Index (KBDI), a measure of the dryness of the top 8 inches of the forest floor, which serves as a proxy for the dryness of forest fuels. KBDI is one of a number of indicators of wildfire potential, but is commonly used by the U.S. Forest Service (along with other tools) to predict fire danger. The scale runs from 0 to 800.

    Our analysis found that the number of days with KBDI above 600 (a level at which the potential for wildfire is high) would increase significantly between now and 2050 in 10 of the western states if greenhouse gas emissions continue unabated. Across these 10 states, the total number of days with KBDI above 600 is projected to increase 46 percent between 2000 and 2050.

    Although many factors influence the ignition and spread of wildfires, our KBDI projections suggest that the conditions suitable for more and larger fires will increase in the near- to midterm. This finding is consistent with other research on wildfire potential and burning. 

  6. Wind at China’s Back to Amp Up Its Renewables

    China can tap just 10 percent of its wind resources to supply more than a quarter of its electricity by 2030, significantly boosting the global transition to renewable energy, according to an MIT study.

    Only 3 percent of China’s power comes from wind today. But if China, which is heavily dependent on coal for its electrical power, continues its feverish wind farm construction boom by building turbines closer to the power grid, the country could theoretically supply 26 percent of its power with wind within 15 years, according to a study published this week in the journal Nature Energy.

    The Tangshanpeng wind farm in China. Credit: Land Rover Our Planet/flickr

    The study used computer models that simulate China’s power grid operations under different scenarios to project how wind power could be distributed throughout the country through 2030.

    “This would go a long way toward meeting China’s climate goal of delivering 20 percent of the nation’s primary energy from non-fossil fuels by 2030 — part of China’s Paris climate pledge,” said study co-author Valerie Karplus, an assistant professor of global economics at MIT.


    China is the world’s leading carbon dioxide emitter and largest consumer of coal for electric power generation, but it’s also leading the world in wind farm construction, in part to help reduce greenhouse gas emissions.

    Meeting the world’s goal to keep global warming from exceeding 2°C (3.6°F) is impossible without China cutting its greenhouse gas emissions and weaning itself away from coal. As China transitions from an industrial economy to a more service-oriented one, electricity demand and coal use there are falling — so much so that China’s carbon emissions could peak by 2030.

    “Wind energy for the most part displaces coal-fired power in China, hence overall emissions will go down significantly,” said study lead author Michael Roy Davidson, a researcher at MIT’s China Energy and Climate Project.

    Though China built more wind turbines than any other country last year, it remains the world’s second-largest wind power producer, generating 185.1 million megawatt hours of wind power — just under the 190 million megawatt hours of wind power the U.S. generated in 2015.

    China’s total annual wind resources — the electricity that could be produced from all the breezes blowing across its land and waters if it could be fully harnessed — are about 26 million gigawatt hours — or roughly 26 times the electricity India consumes in a year. The study shows that China’s wind farms may be able to harness 10 percent of that wind power potential by the end of the next decade, when total electricity demand is expected to be 10 million gigawatt hours.
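    The 26 percent figure follows directly from the quantities in this paragraph:

```python
total_wind_potential_gwh = 26e6  # China's total annual wind resource, GWh
harnessed_fraction = 0.10        # share the study says could be tapped by 2030
projected_demand_gwh = 10e6      # expected total electricity demand in 2030, GWh

# 10 percent of the resource, expressed as a share of projected demand:
wind_generation = total_wind_potential_gwh * harnessed_fraction  # 2.6 million GWh
share = wind_generation / projected_demand_gwh
print(f"wind share of 2030 demand: {share:.0%}")  # 26%
```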

    But China has a problem with its wind power boom — its wind farms have been generally concentrated in remote areas that are among the windiest but not necessarily closest to the electric power grid, which was designed mainly for coal-fired power plants near cities.

    A wind farm near Xinjiang, China. Credit: Asian Development Bank/flickr

    The study shows that if wind farms were built closer to the existing grid, rather than only in the windiest areas, wind power could be produced more efficiently and could supply a growing share of China’s electricity.

    Some big policy changes in China may be needed for that to happen, however. The study shows that one of the biggest reforms needed is to reverse a policy requiring a minimum amount of electricity production from coal-fired power plants as a way to ensure the plants are profitable. Reducing the amount of power that coal plants are required to produce would make room for more wind.

    “To integrate 10 percent of this (wind) potential in China requires significant electricity system reform,” said study co-author Da Zhang, a researcher at MIT’s China Energy and Climate Project.

    Greater electric power policy changes are needed for China to harness more than 10 percent of its wind power resources, including expanding the existing power grid and managing the grid more effectively so coal plants can more easily accommodate the intermittency of wind, Karplus said.

    Wind’s intermittency — its tendency to suddenly become calm or gusty — is difficult for grids built around coal-fired plants to handle, because coal plants take a long time to ramp up when winds drop or to ramp down when winds gust.

    Mark Z. Jacobson, a civil and environmental engineering professor focusing on renewables and climate change at Stanford University who is unaffiliated with the study, said the study is the first to analyze both China’s wind power resources and its ability to integrate that power into the existing electric grid.

    China is likely to obtain more of its electricity from wind power as the country’s transportation system uses more electricity and less fossil fuel; the resulting growth in electricity demand will ease wind’s integration onto the grid, he said.

    “I believe the goal of 26 percent of power into the electric grid coming from wind by 2030 is a very modest and doable goal,” Jacobson said. “I anticipate higher penetration by then.”


  7. Dead Trees Adding to California Firefighters’ Battle

    With drought and climate change conspiring to push California’s summer wildfire season into premature overdrive, the state’s lead wildfire agency has acquired a multimillion dollar arsenal to help it cope with unprecedented numbers of dying trees.

    California recently bought $6 million worth of chippers, mobile sawmills, portable incinerators and other equipment to help its firefighters remove some of the nearly 30 million trees that now stand dead across the state, killed by drought and insects.

    A firefighter from Chino Hills keeps watch on a wildfire as it burns near Potrero, Calif.
    Credit: REUTERS/Mike Blake

    The equipment is being used as parched southern California landscapes explode in the types of summertime flames that wouldn’t normally be expected until August. Grasses that fattened up following winter storms in central and northern California are expected to fuel major blazes in the weeks ahead.

    “The more time that goes by, the drier the fuels are going to become,” said Tom Rolinski, a U.S. Forest Service meteorologist who forecasts fire conditions in southern California. “As this summer unfolds and we get into the August and September timeframes, the fuels are going to be that much drier, and we’re probably going to see more intense fires.”

    The California Department of Forestry and Fire Protection, commonly known as CAL FIRE, which is charged with protecting tens of millions of acres of mostly private land, responded to about 250 fires last week — an unusually large number for mid-June.


    On Tuesday, CAL FIRE was working with other agencies to try to contain two major blazes in southern California as firefighters in other southwestern states also battled big fires amid record-breaking heat.

    The fires are being fueled by droughts exacerbated by warming temperatures, which scientists have linked to climate change and to the natural whims of the weather.

    “Warming causes fuels to be drier than they would otherwise be,” said Park Williams, who researches ecology and climate change at Columbia’s Lamont-Doherty Earth Observatory. “Whether that corresponds to a large area burned for California this year will depend on human activities and individual weather events.”

    Even as firefighters in California toil to battle the extraordinary blazes, they’re being forced to deal with another extraordinary phenomenon: the widespread dying of trees.

    About 30 million trees across the state are estimated to have died, succumbing to attacks by beetles because of the weakening effects of drought.

    “It’s the drought that sort of sets it off, and then it lets the beetles get out of hand,” said Roger Bales, a professor at the University of California, Merced.

    Gov. Jerry Brown declared an emergency in the fall because of the unprecedented die-offs, helping to free up funds needed to remove and dispose of some of the dead trees. CAL FIRE hired hundreds of seasonal workers early this year to help remove dead trees and clear out other potential fuel for fires.

    While ecologists value dead trees as natural assets that provide holes and logs needed by wildlife, firefighters view them as safety hazards that can crash down on roads, power lines and homes and that could potentially fuel bigger blazes.

    The “scale of this tree die-off is unprecedented in modern history,” Brown’s emergency declaration stated, worsening wildfire risks and erosion threats and creating “life safety risks from falling trees.”

    A group of ecologists formally objected to the emergency declaration, arguing in a letter to Brown that dead trees are natural and necessary parts of Californian landscapes. They pointed to a growing body of research downplaying the wildfire hazards posed by trees killed by beetles.

    Dead pines photographed during an aerial survey last year in Los Padres National Forest.
    Source: U.S. Forest Service

    One of those ecologists, Chad Hanson, director of a small California nonprofit, says he agrees that dead trees that pose falling hazards should be removed. But he said trunks should be left on the ground to provide habitat instead of being incinerated or removed. “Once you fell the trees, they’re no longer a hazard,” he said.

    Summertime fires in California cause less property damage than the fires that are fanned by dry Santa Ana winds in the fall and winter, but they sap more firefighting resources, research published last year showed.

    “We were really trying to figure out how fires will change in southern California in the future,” said James Randerson, a University of California, Irvine earth scientist who contributed to the study. “What we realized early on is that there are two distinct fire types.”

    While the effects of climate change on Santa Ana wind fires remain riddled with uncertainty, scientists are generally convinced that the parching effects of global warming will lead to bigger, longer and more damaging summertime blazes in California — if they aren’t already doing so.

    That suggests the intense and early summer fire seasons in this and other recent drought-stricken years may have been less an aberration and more a bellwether of something that CAL FIRE officials frequently describe as a “new normal” for firefighters.

    With more greenhouse gas pollution piling into the atmosphere daily, continuing to warm the planet toward a 2°F increase from preindustrial times, and with warmer weather exacerbating droughts, mass tree die-offs could become routine features of Western landscapes.

    Not only would that eliminate or shrink some forests, driving them northward or uphill toward cooler climates, it could also force increasingly overworked firefighting agencies to juggle the additional routine task of managing dead trees.

    CAL FIRE is focusing its tree removal efforts in areas where most trees have died and where the dead trees pose the most immediate dangers.

    “We’re focused on high hazard areas with the greatest threat to life safety and critical infrastructure,” CAL FIRE spokeswoman Janet Upton said. “There are literally hundreds of thousands of acres, and growing, affected by the unprecedented scope and magnitude of tree mortality.”


  8. Extreme Oil Prices May Be Costly to the Climate

    When oil and gas prices go to extremes, such as when they crashed two years ago, scientists begin to look for answers about what those prices mean for climate change — especially when cheap oil encourages people to guzzle more gasoline in less fuel-efficient vehicles.

    A new study shows that if oil and gas prices remain at either extreme — very high or very low — for long periods of time, they are likely to prevent countries from keeping global warming from exceeding 2°C (3.6°F). That’s especially the case if countries do not have climate policies, such as carbon pricing, that try to aggressively cap carbon emissions.

    Oil wells in North Dakota.
    Credit: Geof Wilson/flickr

    If oil and gas prices remain low — less than $55 per barrel —  for decades, it will be a disincentive to develop renewable energy and decarbonize the global economy, according to the research by the World Bank and the International Institute for Applied Systems Analysis (IIASA).

    If oil and gas prices stay high for decades — $110 per barrel or more — oil demand still won’t be knocked down sufficiently to drastically reduce carbon emissions. High oil and gas prices encourage some renewables development, but their climate benefits are partially cancelled out by coal production. Coal gets a boost from high oil prices because it’s a cheaper alternative to natural gas, the research shows.


    “High oil prices will not be a climate savior any more than low prices will lead to climate catastrophe,” said study lead author David McCollum, an IIASA researcher.  “If the world is really serious about meeting the kind of tight carbon budgets that are required for 2°C or lower, then strong climate policy signals that put a sizeable price on carbon will be needed.”

    Carbon pricing is needed to cut greenhouse gas emissions during long periods of both low and high oil prices, he said. Cheap oil calls for a higher carbon price because there will be higher demand for fossil fuels and less incentive to develop renewables. Expensive oil calls for a lower carbon price because expanded renewables development will offset some of the emissions from increased coal production, the study says.

    The study, published last week in the journal Nature Energy, also underscores the “surprising” importance of natural gas prices in finding a solution to climate change, said Andrew Logan, director of the oil and gas program at Ceres, a business and climate nonprofit unaffiliated with the research.

    Natural gas prices and oil prices historically rise and fall together — something known as “coupling,” which has significant climate implications. When oil prices are low as they are today, the corresponding low natural gas prices encourage electric utilities to build natural gas power plants and close their coal-fired power plants, which are major contributors to climate change. Natural gas emits about half as much carbon dioxide as coal when used to generate electricity.

    The study says that the coupling of oil and gas prices in the future is uncertain. But if coupling continues through a decades-long period of very high oil prices, carbon emissions will rise because expensive natural gas will encourage more coal to be used for electricity.

    Low oil prices have led to a dive in gasoline prices across the U.S. since June 2014.
    Credit: Andrew Mager/flickr

    Conversely, if coupling continues through decades of low oil prices, the climate will benefit because coal won’t be able to compete with cheap natural gas. But cheap oil is likely to slow the expansion of renewables, too, undermining the climate benefit of expanded natural gas production and countries’ ability to help decarbonize the global economy, the study says.

    “From a climate perspective, a reduction in coal is a good thing, whereas a reduction in renewables is not so good,” McCollum said.

    Logan said the study adds important nuance to what has been a long-standing debate over how much oil prices affect the transition to a low-carbon economy.

    “While this is just a single paper, it does suggest that we need to approach climate policy with an eye toward addressing the impacts of oil prices, particularly low oil prices,” he said.

    Rob Jackson, a professor of Earth system science at Stanford University who is unaffiliated with the study, said he questions the study’s assumption that oil and natural gas prices will rise and fall together for the foreseeable future.

    “I'm unconvinced that natural gas and oil prices will stay so closely coupled,” he said, adding that more hydraulic fracturing and liquefied natural gas terminals should reduce oil and gas prices’ dependence on one another.

    “We can't tell how much natural gas might compete with coal and renewables in the future because the paper doesn't include such a scenario,” Jackson said.

    Logan said that as oil has become a less important fuel for heating and electric power generation, there is no logical reason for oil and natural gas prices to be coupled.

    “With the shale gas revolution of the past decade and significant overcapacity in the global LNG (liquefied natural gas) market, the prices of the fuels have begun to decouple, and I suspect this trend will accelerate going forward,” Logan said. “What this paper underscores is that this change, in the near-term, has potentially significant and positive implications for the climate. Of course, in the longer term such a decoupling could prove damaging as sustained low natural gas prices are likely to hinder investment in clean energy.”


  9. Global Coral Bleaching Continues For a Record Third Year

    Bad coral reef news seems to be never-ending these days. Case in point: on Monday, scientists announced that the world is in for an unprecedented third year of coral bleaching across the globe.

    The announcement comes courtesy of NOAA Coral Reef Watch, which keeps an eye on a number of climate factors that can stress reefs out. That includes rising ocean temperatures, which have absolutely pummeled reefs in recent years and will only ratchet up the pressure as the globe continues to warm.

    A dying giant clam in the Great Barrier Reef following severe bleaching in winter 2016.
    Credit: XL Catlin Seaview Survey

    “This is the most widespread, longest coral bleaching event ever to occur globally,” Mark Eakin, the director of NOAA Coral Reef Watch, said.

    Over the past two years, reefs have been essentially boiled to death in parts of every ocean basin on earth. Abnormally hot waters have turned vibrant coral communities into pale white ghost towns as heat has sapped coral of the algae they need to survive. That includes a tragedy unfolding in the Great Barrier Reef, which could be permanently reshaped by rising ocean temperatures.


    Only two other global coral bleaching events precede this one, in 1998 and 2010. Both came during El Niño years. This event is a different creature, though.

    It kicked off in 2014, when El Niño was still bubbling, and it’s still going strong in the middle of 2016 despite El Niño’s demise. Bleaching alerts are in place through fall despite increasing odds of La Niña, a Pacific Ocean phenomenon that tends to cool the planet slightly overall.

    Not all parts of the ocean are cooler than normal during La Niña, however. In particular, NOAA Coral Reef Watch sounded the alarm for Palau and the Federated States of Micronesia, which sit on the edge of the horseshoe of warm water that typically forms during La Niña in the western tropical Pacific.

    Both are small island nations where reefs play a vital role in tourism and storm surge protection. Other areas such as the Caribbean are still dealing with the added heat of El Niño propagating through the ocean and can expect bleaching risks to remain this summer and fall.

    Coral bleaching forecast through September 2016.
    Credit: NOAA Coral Reef Watch

    That risk extends to many reefs in the U.S., including the Florida Keys, U.S. Virgin Islands and Puerto Rico. Other U.S. reefs are also at risk through the fall, including those near Hawaii, Guam and the Commonwealth of the Northern Mariana Islands. Overall, U.S. reefs have been disproportionately affected by bleaching by dint of their wide geographical reach.

    “More than 70 percent of U.S. reefs have already been hit,” Eakin said, noting that in comparison, 40 percent of reefs have been affected globally.

    In some areas such as Florida, the bleaching event has lasted so long that reefs have been beset by bleaching twice and could be in for their third go-round this summer and fall.

    The U.S. is far from the only place to suffer, though. The Great Barrier Reef has also been hit hard with up to 93 percent of the reef showing signs of bleaching. The damage was so extensive, it brought scientists to tears. Research released in April showed that global warming made the bleaching there up to 175 times more likely.

    Scientists surveying coral bleaching in the Maldives in May 2016.
    Credit: XL Catlin Seaview Survey

    Other parts of the ocean have been equally devastated, including some of the most pristine reefs on the planet that sit in the middle of the Pacific. The death of El Niño has helped cool waters in that region a bit, but warming in other parts of the ocean means a new set of reefs is in the crosshairs.

    Scientists have pointed to global warming as a major driver. Roughly 93 percent of the heat the planet has been absorbing due to excess carbon pollution is ending up in the oceans. That’s causing changes to ecosystems across the high seas, but none are quite as dramatic as what’s happening at coral reefs.

    “If you think of them as a (climate change) bellwether, they’re ringing the bells like crazy right now,” Jennifer Koss, NOAA's Coral Reef Conservation Program director, said. “We can’t afford to not listen to them.”

    As to whether this round of bleaching goes from being a singular event to a new normal, Eakin said, “ask me next year if this doesn’t end.”


  10. This Mammal Has Been Wiped Out Due to Climate Change

    By Michael Slezak, The Guardian

    Human-caused climate change appears to have driven the Great Barrier Reef’s only endemic mammal species into the history books: the Bramble Cay melomys, a small rodent that lived on a tiny island in the eastern Torres Strait, has been completely wiped out from its only known location.

    It is also the first recorded extinction of a mammal anywhere in the world thought to be primarily due to human-caused climate change.

    An expert says this extinction is likely just the tip of the iceberg, with climate change exerting increasing pressures on species everywhere.

    The Bramble Cay melomys has become extinct.
    Credit: Wikimedia Commons

    The rodent, also called the mosaic-tailed rat, was only known to live on Bramble Cay, a small coral cay just 1,100 feet long and 500 feet wide off the north coast of Queensland, Australia, which sits at most 10 feet above sea level.

    It had the most isolated and restricted range of any Australian mammal, and was considered the only mammal species endemic to the Great Barrier Reef.

    When its existence was first recorded by Europeans in 1845, it was seen in high density on the island, with sailors reporting they shot the “large rats” with bows and arrows. In 1978, it was estimated there were several hundred on the small island.

    But the melomys were last seen in 2009, and after an extensive search for the animal in 2014, a report has recommended its status be changed from “endangered” to “extinct.”

    Led by Ian Gynther from Queensland’s Department of Environment and Heritage Protection, and in partnership with the University of Queensland, the survey laid 150 traps on the island for six nights, and involved extensive measurements of the island and its vegetation.

    In their report, co-authored by Natalie Waller and Luke Leung from the University of Queensland, the researchers concluded the “root cause” of the extinction was sea-level rise. As a result of rising seas, the island was inundated on multiple occasions, they said, killing the animals and also destroying their habitat.

    The area of the cay above high tide was estimated to have decreased from 4 hectares in 1998 to 2.5 hectares in 2014. Worse for the melomys, the animals lost 97 percent of their habitat in just 10 years, with vegetation cover declining from 2.2 hectares in 2004 to just 0.065 hectares in 2014. (One hectare equals about 2.47 acres.)
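    As a quick sanity check, a few lines of Python reproduce the 97 percent vegetation decline from the hectare figures cited in the report (the acre conversion factor is the standard one; the hectare values are the report’s):

    ```python
    # Verify the habitat-loss figures cited in the Bramble Cay report.
    HA_TO_ACRES = 2.47105  # standard conversion: 1 hectare ≈ 2.47 acres

    veg_2004, veg_2014 = 2.2, 0.065  # vegetation cover, hectares

    # Percent decline in vegetation cover, 2004-2014
    veg_loss_pct = (veg_2004 - veg_2014) / veg_2004 * 100
    print(f"Vegetation loss 2004-2014: {veg_loss_pct:.0f}%")       # 97%
    print(f"Remaining cover: {veg_2014 * HA_TO_ACRES:.2f} acres")  # 0.16 acres
    ```

    The arithmetic confirms the reported figure: (2.2 − 0.065) / 2.2 rounds to a 97 percent loss, leaving roughly a sixth of an acre of vegetation.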

    “For low-lying islands like Bramble Cay, the destructive effects of extreme water levels resulting from severe meteorological events are compounded by the impacts from anthropogenic climate change-driven sea-level rise,” the authors said in their report.

    Globally averaged sea level rose by almost 8 inches between 1901 and 2010, a rate unparalleled in any period during the last 6,000 years. But around the Torres Strait, sea level appears to have risen at almost twice the global average rate between 1993 and 2014.

    “Significantly, this probably represents the first recorded mammalian extinction due to anthropogenic climate change,” the researchers said in their report, quietly published on the Queensland government’s website last week.

    As a result, the Queensland government website now recommends suggested recovery actions not be taken. “Because the Bramble Cay melomys is now confirmed to have been lost from Bramble Cay, no recovery actions for this population can be implemented,” it says.

    Although small, the island is significant for many people and animals. It is the most important breeding ground for green turtles and a number of seabirds in the Torres Strait, the researchers said, and has long been visited by its traditional custodians, the Darnley Islanders, who caught fish, turtles and birds.

    The authors said the IUCN lists one other mammal that was driven to extinction, partly by extreme weather — the Little Swan Island hutia — but introduced cats on the island were considered the main driver of extinction.

    In contrast, the researchers say the melomys was driven to extinction due “solely (or primarily) to anthropogenic climate change.”

    The one hope for the species, the authors say, is that there might be an undiscovered population in Papua New Guinea. They say the melomys might have arrived on Bramble Cay on rafting debris from the Fly River region of Papua New Guinea. If that is true, then either the Bramble Cay melomys, or a close relative, may still live undiscovered there.

    The authors recommend targeted surveys in Papua New Guinea to determine whether the species survives there.

    John White, an ecologist from Deakin University in Australia, who was not involved in the study, said this extinction is the tip of the iceberg. “I am of absolutely no doubt we will lose species due to the increasing pressures being exerted by climate change,” he said. “Species restricted to small, low lying islands, or those with very tight environmental requirements are likely to be the first to go.”

    A report in 2015 found one-sixth of the world’s species face extinction due to climate change, and scientists have warned that the world is on the edge of the sixth mass extinction.

    “Certainly, extinction and climatic change has gone hand in hand throughout the history of the world,” he said. “So, if this is one of the first, it is more than likely not going to be the last.”

    Reprinted with permission from The Guardian.