Friday, August 26, 2016

Climate Central - News

Climate Central is a nonprofit science and media organization created to provide clear and objective information about climate change and its potential solutions.
  1. California’s New Climate Rules Explained

    Following more than a year of legislative toing and froing, California’s leaders agreed this week on how ambitious the state will be in the fight against climate change after 2020.

    Short answer: very.

    The San Francisco Bay Area.
    Credit: Thomas Hawk/Flickr 

    A progressive culture and Silicon Valley-style innovation a decade ago thrust California toward the head of the worldwide pack when it comes to shifting away from polluting fossil fuels in favor of cleaner alternatives.

    This week, lawmakers in the state Assembly and Senate strengthened that leadership by approving two key bills.

    The legislation will require California agencies to take the steps needed to reduce greenhouse gas pollution to 40 percent below 1990 levels by 2030. Gov. Jerry Brown plans to sign it.

    How do California’s Goals Stack Up?

    The state’s new climate goals are far more ambitious than those of the U.S. overall, and they’re in line with ambitions in Europe, which is a world leader on climate action.

    Both the European Union and California are shooting for 40 percent pollution reductions in 2030 compared with 1990 levels. The Europeans got off to an earlier start, setting a more ambitious target for 2020 than California. That means California will have to work harder to reach its goals for 2030.

    “This is exactly the right goal for California,” said Chris Busch, director of research at Energy Innovation, a policy think tank in California. “It’s strong but undeniably achievable and beneficial.”

    Are any countries or states more ambitious than California?

    Just as California is the star of climate action in the U.S., the European Union has its own big shot — Germany. Germany aims to reduce its climate impacts by 40 percent by 2020 compared with 1990 levels, which is something California and the EU aim to achieve a decade later.

    Still, per person, Californians and Germans continue to be heavier polluters than most Europeans, releasing the equivalent of about 12 tons each of heat-trapping carbon dioxide in 2013. That’s a third more than the European Union average.

    For comparison, the average Indian releases less than 2 tons a year, a figure that is rising as the country’s economy grows.

    How will California hit its goals?

    California runs a number of programs that help it reduce its climate pollution, including financial support for electric vehicle purchases, mandates on biofuels and requirements that utilities source a substantial amount of their power from renewable sources.

    The new legislation requires the California Air Resources Board to strengthen these programs, or roll out new ones, to ensure that the new targets are met.

    California also was among the first in the world to set up a cap-and-trade program, which imposes a limit on the amount of pollution that industry can release each year. By selling tradeable allowances needed to pollute, the state raises money that reduces electricity bills and that funds programs that can reduce climate pollution.

    How is the cap-and-trade program faring?

    The cap-and-trade program is withering like a Californian tomato left out in the sun.

    California State Capitol in Sacramento.
    Credit: Tony Webster/Flickr

    One problem is that California’s notoriously dysfunctional legislature can’t agree on how to spend more than $1 billion in cap-and-trade funds, meaning the money is just sitting around instead of being put to work.

    “It’s not like a savings account that we put in the bank and wait for retirement,” a frustrated Sen. Fran Pavley recently opined to the L.A. Times.

    The final legislative sessions for the year are planned for next week, ahead of elections scheduled for November. Many hope the spending will be approved before the break.

    “We still have a few more days,” the Environmental Defense Fund’s Erica Morehouse optimistically pointed out.

    Is that the biggest problem for cap-and-trade?

    Not even close.

    A bigger problem is that the cap-and-trade program expires at the end of 2020. Legal experts have concluded that a two-thirds vote from the Assembly and Senate may be needed to extend the program after 2020.

    That could see the program destroyed just when it’s needed the most — to help achieve increasingly ambitious climate goals.

    “The important thing to recognize about the post-2020 market in California is that the level of effort fundamentally changes,” Stanford expert Michael Wara said.

    The unlikelihood of supermajority lawmaker support for cap-and-trade may help to explain why just a small portion of pollution allowances were sold during California’s last two quarterly auctions, reducing anticipated funds flowing into California by nearly $1 billion.

    Can California’s cap-and-trade program be saved?

    California officials have been reluctant to even admit that the legal conundrum exists, perhaps wary of weakening their position in a future court battle. This week, though, senior Brown advisor Nancy McFadden suggested a ballot measure might help circumvent the need for supermajority lawmaker support.

    When a reporter asked Brown about the future of cap-and-trade on Wednesday, he said he expects California’s business community will eventually demand that lawmakers support it, because it would provide an efficient way of helping to comply with the new law.

    “I think a lot of it’s going to be driven by the business interests themselves,” Brown said. “They’re going to plead for a market system called cap-and-trade.”

    Independent experts also point out the program could probably continue operating after 2020 under current law, but only if it hands out the pollution allowances for free. Such an idea is likely to be opposed as a handout by California lawmakers.

  2. Study Finds Biofuels Worse for Climate than Gasoline

    Years of number crunching that had seemed to corroborate the climate benefits of American biofuels were starkly challenged in a science journal on Thursday, with a team of scientists using a new approach to conclude that the climate would be better off without them.

    Based largely on comparisons of tailpipe pollution and crop growth linked to biofuels, University of Michigan Energy Institute scientists estimated that powering an American vehicle with ethanol made from corn would have caused more carbon pollution than using gasoline during the eight years studied.

    Corn is the main crop used in the U.S. to produce biofuel.
    Credit: Jim Deane/Flickr

    Most gasoline sold in the U.S. contains some ethanol, and the findings, published in Climatic Change, were controversial. They rejected years of work by other scientists who have relied on a more traditional approach to judging climate impacts from bioenergy — an approach called life-cycle analysis.

    Following the hottest month on record globally, and with temperatures nearly 2°F warmer and tides more than half a foot higher than they were in the 1800s, the implications of biofuels causing more harm to the climate than good would be sweeping.

    The research was financially supported by the American Petroleum Institute, which represents fossil fuel industry companies and has sued the federal government over its biofuel rules.

    “I’m bluntly telling the life-cycle analysis community, ‘Your method is inappropriate,’” said professor John DeCicco, who led the work. “I evaluated to what extent have we increased the rate at which the carbon dioxide is being removed from the atmosphere?”

    Lifecycle analyses assume that all carbon pollution from biofuels is eventually absorbed by growing crops. DeCicco’s analysis found that energy crops were responsible for additional plant growth that absorbed just 37 percent of biofuel pollution from 2005 to 2013, leaving most of it in the atmosphere, where it traps heat.

    “The question, ‘How does the overall greenhouse gas emission impact of corn ethanol compare to that of gasoline?’ does not have a scientific answer,” DeCicco said. “What I can say definitively is that, whatever the magnitude of the emissions impact is, it is unambiguously worse than petroleum gasoline.”

    The findings were criticized by scientists whose work is directly challenged by them.

    Argonne National Laboratory scientist Michael Wang, who has led lifecycle analyses that found climate benefits from different biofuels, called the research “highly questionable” for a range of technical reasons, including its focus on growth by American crops instead of the global network of farms.

    Driven by federal and Californian policies that promote biofuels to slow global warming, the use of ethanol, biodiesel and similar products more than trebled nationwide during the years studied, providing 6 percent of Americans’ fuel by 2013. Federal data shows gasoline sold in the U.S. last year contained about 10 percent corn ethanol.

    Thursday’s paper provided fresh fuel for a heated debate among opposing groups of scientists over bioenergy’s climate impacts. Some are certain it’s a helper in the fight against climate change. Others are convinced it’s a threat.

    “In the long run, there’s no question that biofuels displacing petroleum is a benefit,” said Daniel Schrag, a geology professor at Harvard who advises the EPA on bioenergy climate impacts. His views sharply oppose those of DeCicco. “It’s just a question of how long you have to wait.”

    Eight years of pollution from biofuels compared with extra carbon absorption by energy crops. Michigan scientists found that crop growth offset just 37 percent of the pollution, leaving 83 teragrams in the atmosphere.
    Credit: DeCicco et al., Carbon balance effects of U.S. biofuel production and use, Climatic Change, 2016
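    A back-of-envelope check shows how the two figures quoted above fit together. The sketch below simply back-calculates the implied totals from the 37 percent offset and the 83 teragrams left over; the intermediate numbers are inferred for illustration, not taken from the paper.

    ```python
    # Back-of-envelope check of the figures quoted above: if extra crop growth
    # offset 37 percent of the biofuel pollution and 83 teragrams were left in
    # the atmosphere, the implied totals follow directly. Illustrative only;
    # the intermediate values are inferred, not reported by the study.

    offset_fraction = 0.37   # share of biofuel pollution offset by extra crop growth
    leftover_tg = 83.0       # teragrams left in the atmosphere (figure caption above)

    implied_total_tg = leftover_tg / (1.0 - offset_fraction)
    implied_offset_tg = implied_total_tg * offset_fraction

    print(f"Implied total biofuel pollution, 2005-2013: ~{implied_total_tg:.0f} Tg")
    print(f"Implied amount offset by crop growth:       ~{implied_offset_tg:.0f} Tg")
    ```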

    Schrag dismissed Thursday’s findings, saying there’s no reason to develop a new approach to measuring biofuels’ impacts. He said the proposed new approach fails because it doesn’t account for the years it can take for bioenergy to benefit the climate.

    Analyses by scientists who have studied the life-cycle impacts of growing corn and other crops to produce ethanol have generally concluded biofuels can create between 10 and 50 percent less carbon dioxide pollution than gasoline.

    Those estimates have been based on the notion that although bioenergy releases an initial blast of carbon dioxide pollution, the benefits of it accrue over time, as crops, trees and grass grow and suck that carbon dioxide back into their roots, flowers and leaves.

    Such benefits are more conceptual than scientific, turning scientific debates at the EPA and elsewhere over how to calculate them into seemingly intractable policy quagmires.

    “What timescale should we look at?” Schrag said. “Some of the fundamental questions about timescale are not scientific questions. They are societal questions.”

    The University of Michigan scientists dispensed with the timescale-based approach altogether, eliminating the need for policy decisions about which timeframes should be used. Instead, their research provided an overview of eight years of overall climate impacts of America’s multibillion-dollar biofuel sector.

    The findings from the new approach were welcomed by Timothy Searchinger, a Princeton researcher who has been a vocal critic of bioenergy. He has been speaking out for years about the shortcomings of traditional approaches used to measure its climate impacts.

    Searchinger said the approach developed in Michigan provides an “additional calculation” to help overcome the flawed assumption that climate pollution released when bioenergy burns does not matter.

    Although European officials have warned of the limitations of the use of lifecycle analyses in assessing the climate impacts of bioenergy, the EPA has been steadfast for more than five years in its attempts to create a new regulatory framework that would continue to embrace the approach.

    “The U.S. is not coming close to offsetting the carbon released by burning biofuels through additional crop growth,” Searchinger said.

  3. Study Suggests Earlier Onset of Human-Driven Warming

    To fully understand the warming of the planet that is being driven by human emissions of greenhouse gases, scientists need to examine the history of climate changes on Earth. Hampering this effort is the fact that direct measurements of temperature and other climate data only go back to about the late 19th century.

    But by using records kept by the Earth itself, that history can be extended back hundreds or even thousands of years.

    In a study published Wednesday in the journal Nature, a group of researchers has knitted together such natural records  — found, for example, in coral reefs, ice sheets and caves. They used those records to trace the thread of human-driven warming back to what they say is its beginning, nearly 200 years ago, when the coal-burning that took off with the Industrial Revolution was still revving up.

    Coring at a coral reef off the northwest coast of Australia.
    Credit: Eric Matson/AIMS

    Though the impact then on temperatures was small, it is measurable in certain regions, the researchers say.

    Some climate scientists not involved in the research quibble with just how much of that early signal can actually be attributed to greenhouse gases. However, there is broad agreement that the study reinforces the importance of the starting point that is used when evaluating how much the Earth has already warmed and how close we are to breaching international climate goals.

    “This early warming does mean that our instrumental records (which typically only begin in the 1880s) don’t allow us to see the picture of how humans have changed the climate,” study co-author Nerilie Abram, a paleoclimatologist at Australian National University, said in an email. “So when we are talking about targets of trying to limit climate warming to less than 1.5˚C, we are actually closer to that limit than what we would calculate from instrumental records alone.”

    What Is Pre-Industrial?

    When international negotiators forged an agreement last year on limiting temperature rise this century, they settled on a threshold of 2˚C from pre-industrial times (with some talk of tightening that limit to 1.5˚C). But exactly what period is picked to represent the pre-industrial era is key. Comparing temperatures today to the beginning of the instrumental record is problematic because at best that record goes back only to the 1880s, when some warming had likely already occurred.

    But the Earth itself keeps records of how climate has changed over millennia, in the growth of coral reefs, the layers of ice laid down in glaciers, and the rings added each year to trees. The study authors worked with a consortium that has brought together records from different sources from spots all over the world and worked to put them together into a coherent picture of past climate change.

    The record includes new reconstructions of sea surface temperatures, something often left out of such projects because of the difficulty of obtaining ocean records, Abram said.

    The reconstruction allowed the group to examine global and regional temperature records going back 500 years. With that extended record, they picked the period 1622 to 1799 as their preindustrial era.

    An ice core still sits inside a drill.
    Credit: Nerilie Abram

    That period is “definitely before we really started burning any significant amount of fossil fuels,” NASA climate scientist Kate Marvel said.

    Using statistical analyses, the team picked out a small, but measurable, increase in temperatures as early as the 1830s for some regions, including the tropical oceans, as well as the Northern Hemisphere more broadly.

    The findings are “further evidence that the climate has already changed significantly since the pre-industrial period,” Ed Hawkins, a climate scientist at the University of Reading in England, said in an email.

    Regional Differences

    These changes wouldn’t have been noticeable to people at the time. It was only in the 20th century that warming pushed the climate outside of what would be seen from natural variations. That natural variability also explains why the warming signal emerged first in the tropics — year-to-year variability in that region is very low, which means the signal is easier to tease out.

    The study also found that the rate of warming of the tropical oceans was about the same as that of the continents in the Northern Hemisphere. Unsurprisingly, the Arctic showed the highest rate of warming.

    Warming in the Southern Hemisphere, however, was delayed compared to the Northern Hemisphere in the reconstruction, though the researchers aren’t sure why, particularly as climate models don’t show that delay.

    Some possible explanations include a higher variability of the Southern Hemisphere climate, as well as some unappreciated aspects of how sea ice might regulate the climate. There is also a relative dearth of data compared to the Northern Hemisphere.

    More specifically, a clear warming signal has yet to emerge for Antarctica, which could be because the continent is somewhat isolated from broader climate changes by both atmospheric and oceanic currents that encircle it.

    “Antarctica sort of does its own thing,” study co-author Nicholas McKay, a climatologist at Northern Arizona University, said.

    Baseline Matters

    The researchers were surprised that they found such an early onset of warming, both Abram and McKay said. At first they suspected that the initial warming was actually the climate rebounding from the cooling impact of two major volcanic eruptions in the early 1800s, and that greenhouse warming took over later.

    “But by testing our methods, and by looking at when warming develops in climate model simulations where only greenhouse gases are changed, we were able to show that the early warming is a small but detectable signal that can be explained by the small increases in greenhouse gases that were already happening in the mid 19th century,” Abram said.
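    The detection logic Abram describes can be illustrated with a toy calculation. The sketch below is not the study’s change-point method; it just shows, with synthetic numbers, the general idea of asking when a small, steady warming trend first climbs out of the noise of natural variability.

    ```python
    import numpy as np

    # Toy illustration (not the study's actual method) of signal emergence:
    # given an annual temperature series made of natural variability plus a
    # slow forced warming ramp starting in 1830, find when a 30-year running
    # mean first exceeds two standard deviations of the pre-1800 baseline.

    np.random.seed(42)
    years = np.arange(1600, 2000)
    noise = np.random.normal(0.0, 0.15, years.size)                # natural variability, deg C
    forced = np.where(years < 1830, 0.0, 0.005 * (years - 1830))   # synthetic warming ramp
    temps = noise + forced

    baseline = temps[years < 1800]                                 # "pre-industrial" reference
    threshold = baseline.mean() + 2.0 * baseline.std()

    window = 30
    running = np.convolve(temps, np.ones(window) / window, mode="valid")
    running_end_years = years[window - 1:]

    emerged = running_end_years[running > threshold]
    print("Warming signal emerges around:", emerged[0] if emerged.size else "not detected")
    ```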

    The so-called "hockey stick" graph, which shows temperatures both from the instrumental record (in red) and paleoclimate data.
    Credit: IPCC

    Michael Mann, a Penn State climatologist who put together the famous “Hockey Stick” climate reconstruction, still thinks that more of that early warming is due to the rebound from volcanic cooling and that a more rigorous analysis is needed to tease out just how much warming can be attributed to greenhouse gas-driven warming.

    In particular, Mann takes issue with a statement in the study that their findings indicate the Earth’s temperature may respond faster to changes in greenhouse gas levels than previously thought, which he says is “a really basic error in interpretation.”

    Mann does agree, however, that warming goes back further than instrumental records can show and that today’s temperature rise should be compared to an earlier baseline than it currently is, or we risk underestimating warming. This was the point that other climate scientists said was the main contribution of the study.

    As Marvel put it, climate change is a question of “change from what, and that what matters.”

  4. The Future of National Parks is Going to be a Lot Hotter

    Summertime is primetime for national parks. As snow melts, wildflowers bloom and waterfalls roar, generations of visitors have flocked to the natural wonders that dot the American landscape (to say nothing of all the amazing cultural sites the National Park Service protects).

    The National Park Service was created a century ago — August 25, 1916 to be exact — to keep an eye on the growing treasure trove of national parks. It’s been a good century as more and more land has been set aside and annual visitors now number more than 300 million, but it also hasn’t been without challenges. Chief among them is climate change, which will drastically alter national park landscapes in the coming decades, including cranking up the heat.

    As part of Climate Central’s ongoing States at Risk project, we analyzed just how much hotter parks are projected to get later this century. We looked at future summer temperatures in all the parks in the Lower 48 states except Dry Tortugas National Park (sorry, Fort Jefferson lovers!), assuming greenhouse gas emissions continue on their current trend. To put it in clearer context, we mapped out what places today are most comparable to parks’ climates of tomorrow.

    The results could make you sweat. Parks are projected to have summers that are 8-12°F hotter by 2100. That means currently cool mountainous parks could be as hot as the plains. Parks in the Southeast, already a pretty hot place, will face even more extreme temperatures with a climate more like southern Texas. And otherworldly Joshua Tree National Park in southern California will face the greatest geographical climate shift, with temperatures more like Abu Dhabi by 2100.

    We also analyzed how many more days with extreme heat the parks could face. Extreme heat is a hallmark of global warming, and its impact will be most arresting in the national parks where people go, by design, to be outside in the summer. Like the rest of the country, parks are going to be seeing more dangerously hot days above 90°F, 95°F, and 100°F.

    By 2100, the glaciers of Montana’s Glacier National Park will be long gone and rising temperatures will be one of the big reasons why. Visitors will not only have to contend with an ice-free landscape, but also hotter temperatures. Today the park sees an average of only one 90°F day each year. It could see 27 days with temperatures above 90°F by the end of the century.

    Yosemite National Park, high in the Sierra Nevada mountains of California, currently sees about two weeks of 90°F weather every year. By 2050, it could see nearly a month of those temperatures, and by 2100 it could get nearly 50 such days each year.

    And the Great Smoky Mountains, currently the most-visited National Park, could go from fewer than 10 days above 90°F each year, on average now, to three months with those scorching temperatures.

    In numerous other parks, the number of days above 100°F is projected to skyrocket. Big Bend National Park in Texas could see more than 110 days above 100°F each year, on average. And Great Basin National Park in Nevada, which currently doesn’t have any days above 100°F in a typical year, could see a month of those temperatures each year by 2100.

    It’s likely that parks on the more extreme end of the temperature scale will see a drop in summer visitation, but more visitors are likely to show up in fall and spring when it won’t be fry-an-egg-on-the-sidewalk hot. That may stretch park resources thin as most parks are set up to handle summer crowds and quieter shoulder seasons. How parks deal with the change in visitation season is an open question.

    And all this is to say nothing about the impacts extreme heat will have on the natural resources around which we created national parks in the first place. Joshua Tree could become too hot for its namesake trees, and there’s evidence that extreme summer days could create more rockfalls in Yosemite, which could change the face of the stunning valley at the center of the park. Wildfire risk will also skyrocket across the West and could make summer park vacations not only hotter but smokier.

    Those are just the most visible changes. Whole ecosystems are likely to be disrupted and there are consequences scientists probably haven’t even uncovered yet (those are the ones that could be the worst since we’ll be least prepared).

    Despite the daunting situation facing the National Park Service in its second century, there are signs it’s up for the challenge. It’s already addressing climate change from the coast to the high mountains and has an A-Team of experts to help parks answer the gnarly questions they face.

    There’s no denying that national parks will look a lot different by the end of the century, but that won’t make them any less a part of the fabric of American identity.

    Analysis by James Bronzan and Alyson Kenward, PhD.

    Methodology: Future temperatures for 47 National Parks were calculated based on the median of 29 spatially downscaled climate models (CMIP5) at 1/8 degree scale, then averaged within park boundaries. National parks in Alaska and Hawaii, along with Dry Tortugas National Park, were excluded because projections at this resolution were unavailable. Temperatures for 2050 are based on the 20-year average of 2041-2060 and for 2100 are based on the period 2080-2099. Projected temperatures assume that greenhouse gas emissions continue at their current rate (RCP8.5). The interactive map features the average summer daily high temperature (June-August), while days over 90°F, 95°F, and 100°F were counted annually. The current period values for parks and climate divisions are based on the 1991-2010 average calculated using a gridded observational dataset by Ed Maurer of Santa Clara University.
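    For readers who want to see the mechanics, the sketch below mirrors the steps described in the methodology (average daily highs within a park boundary, then count threshold days) using synthetic data. The array shapes and numbers are placeholders for the downscaled CMIP5 output, which is not reproduced here.

    ```python
    import numpy as np

    # Illustrative sketch of the methodology above: given daily-maximum
    # temperatures (deg F) for the grid cells inside one park boundary,
    # compute the average summer (Jun-Aug) daily high and count days per
    # year above fixed thresholds. Synthetic data stands in for the
    # downscaled CMIP5 projections used in the real analysis.

    np.random.seed(0)
    years = np.arange(2080, 2100)                 # the 2080-2099 window used for "2100"
    days_per_year = 365
    n_cells = 16                                  # grid cells inside the park boundary

    # Fake daily highs: a seasonal cycle plus noise, one series per grid cell.
    doy = np.tile(np.arange(days_per_year), len(years))
    seasonal = 65 + 30 * np.sin(2 * np.pi * (doy - 100) / 365.0)
    tmax = seasonal[None, :] + np.random.normal(0, 8, (n_cells, len(doy)))

    # 1. Average within the park boundary (mean over grid cells).
    park_tmax = tmax.mean(axis=0)

    # 2. Average summer (Jun-Aug, roughly days 152-243) daily high over the window.
    summer_mask = (doy >= 152) & (doy < 244)
    avg_summer_high = park_tmax[summer_mask].mean()

    # 3. Days per year above 90/95/100 F, averaged over the 20-year window.
    days_by_year = park_tmax.reshape(len(years), days_per_year)
    for threshold in (90, 95, 100):
        count = (days_by_year > threshold).sum(axis=1).mean()
        print(f"Days above {threshold}F per year: {count:.1f}")

    print(f"Average summer daily high: {avg_summer_high:.1f} F")
    ```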

  5. Climate Central’s ‘Pulp Fiction’ a Finalist for 2 Awards

    By staff report

    A groundbreaking series by Climate Central on the climate-threatening rise of wood energy was named as a finalist for one national journalism award, and as a runner-up in another.

    Wood energy — pellets made from trees, mostly from the U.S. South, and burned as fuel in power plants — is touted as being good for the planet, and heavily subsidized by governments in Europe for the climate benefits. But as senior science writer John Upton’s three-part series “Pulp Fiction” showed, the electricity produced using wood is heating the atmosphere more quickly than coal.

    The Online News Association announced Pulp Fiction is a finalist for explanatory reporting. The ONA will announce its award winners on Sept. 17.

    Columbia University announced Wednesday that Pulp Fiction was a runner-up for the John B. Oakes Award, a prestigious environmental writing prize. First prize went to InsideClimate News for its impactful series “Exxon: The Road Not Taken.”

    "We're proud of the work John and the team did on the series," said Paul Hanle, President and CEO of Climate Central. "This is the first major recognition of our journalism expertise, the first of many we hope, as we chart a new direction for science-based reporting."

    Credit: Ted Blanco

    Last year, about 10 million tons of American trees were dehydrated at dozens of new pellet mills operating in southeastern states and turned into pellets, which were exported to Europe to be burned in converted coal power plants. The figure has grown by about a third each year for three years, alarming scientists and environmentalists.

    Several months after Pulp Fiction was published, the European Commission, which advises European lawmakers, launched an investigation into the environmental and wood market impacts of Britain's use of American wood for energy.

    And this month the commission published a report confirming many of Climate Central’s findings about the risks of the wood pellet trade on American forests and the global climate. The commission is preparing to recommend changes to help reduce those risks.

    Upton reported exhaustively from the field, from Virginia, North Carolina, Louisiana, Mississippi and Oregon, as well as from the U.K. and Brussels. He interviewed dozens of forestry scientists, ecologists, climate scientists and other academic and government researchers, as well as workers in the affected forestry and energy industries.

    In addition to more than 12,000 words, the series featured a one-minute animation explaining the climate impacts of wood energy, six interactive 360-degree panoramas, interactive charts, graphics, and short videos of interviews with key sources in a stand-alone website.

  6. Key Program in Doubt as California Climate Bill Approved

    California lawmakers cast key votes Tuesday for ambitious climate pollution reduction goals after 2020 — shortly before learning that an auction held under a program designed to help achieve those goals had flopped, hinting at major challenges ahead.

    The state Assembly voted 47-29 to approve SB32. The bill would require the state government to enact rules and programs to “ensure” that greenhouse gas emissions fall to 40 percent below 1990 levels by 2030. That’s more ambitious than national goals and it’s in line with those adopted in Europe.

    Big oil companies that operate refineries in California have opposed climate legislation and the state's cap-and-trade program.
    Credit: Flickred!/flickr

    The vote was held shortly before California announced tepid results from the latest auction held under its cap-and-trade program, which imposes a limit on annual greenhouse gas pollution and charges companies for allowances needed to release pollution.

    The auction results suggested heavy industry in California has become skeptical about the need to purchase more allowances, creating doubts about the future of one of the world’s first and biggest cap-and-trade programs.

    “Today’s auction results show that the markets need certainty,” Nancy McFadden, a senior adviser to Gov. Jerry Brown, said in a statement after the results were published. “Shoring up the cap-and-trade program  — either through the Legislature or by the voters — will provide that certainty.”

    The auction results showed two-thirds of pollution allowances offered for sale last week went unsold. That was nonetheless an improvement over the result of an auction held in May, when the vast majority went unsold.

    “It shows us that we’re in a period of serious concern about the future of California’s climate policy,” said Danny Cullenward, an energy economist and lawyer who researches climate policies at the Carnegie Institution for Science. “It’s bad news.”

    Official estimates of revenue from the auction are due to be announced next month. Cullenward’s calculations indicate the auction raised less than $10 million of what could have been hundreds of millions of dollars for environmental programs. Due to state rules, the revenue from most of the allowances that did sell last week will help reduce electricity bills.

    Some experts think bidders stayed away from California’s recent auctions because there are more allowances on the market than are needed, suggesting pollution reductions have been unexpectedly easy so far.

    “The 2020 target is turning out to be a pretty easy lift,” said Chris Busch, director of research at Energy Innovation, a policy think tank in California. “As a result, there’s not as much to do for the carbon market.”

    Others were more downbeat.

    “Another theory is that people are really starting to have doubts about whether this is a sustainable program, and whether the allowances they’re buying now are going to be useful after 2020,” said Michael Wara, an energy and environmental law expert at Stanford.

    California State Capitol in Sacramento.
    Credit: Tiocfaidh ár lá 1916/flickr

    Wara said he interpreted the latest auction results as indicating that “there’s not a lot of confidence” in the market after 2020.

    Unless a proposed extension of the cap-and-trade program beyond 2020 is supported by two-thirds of lawmakers, polluters are expected to argue in court that it violates a California law passed by voters in 2010 dealing with new taxes and fees.

    Such supermajority support currently seems unlikely, stoking talk in Sacramento of a potential ballot initiative in 2018 that would bypass lawmakers and appeal directly to voters.

    Global temperatures have risen nearly 2°F since the 1800s, with heat-trapping pollution causing seas to rise and amplifying storms and droughts. California has been a global leader in switching away from polluting fuels in favor of cleaner alternatives, and its work is supported by environmental groups across the country.

    The Assembly had been a sticking point for backers of SB32. Tuesday’s vote was welcomed by the bill’s supporters.

    “It’s definitely significant,” said Erica Morehouse, an attorney with the nonprofit Environmental Defense Fund who works on Californian legislative issues. “Now all members of the legislature have taken votes.”

    The Senate previously approved the legislation, but Morehouse said it will need to vote again because the bill has been amended. Companion legislation will also require legislative approval. That would all need to happen before the legislature goes into recess, scheduled for next week.

    Finally, the signature of Gov. Jerry Brown would be needed. He has championed the legislation, but he had been aiming for supermajority support among lawmakers. “I look forward to signing this bill,” he said in a statement after the legislature voted Tuesday.

    The amended bill contained no direct references to the cap-and-trade program, and it did not secure supermajority support from the Assembly, again raising the possibility that voters will be asked in 2018 to approve an extension of the cap-and-trade program after 2020.

    “Big environmental policies tend to face legal challenges — that’s just how the system works,” Morehouse said. “There would have been fewer outstanding legal questions if they’d gotten the supermajority.”

  7. This Is What the Ice-Free Northwest Passage Looks Like

    An ice-free Northwest Passage was once the stuff of legend. But it’s now becoming the norm thanks to global warming, and vessels from commercial freighters to luxury cruise ships are racing to turn a profit on the newest frontier on Earth.

    A satellite image taken on Aug. 9 and published by NASA Earth Observatory shows a nearly ice-free path from the North Atlantic to the Pacific Ocean. In comparison to 2013, which was a high ice year for the Arctic — at least by modern standards (it was still the sixth-lowest extent on record!) — it’s clear just how dramatic this year is.


    Satellite imagery showing the Northwest Passage in 2013, a relatively "icy" year by modern standards, and 2016.
    Credit: NASA

    It’s not the first time the Northwest Passage has opened up in recorded history — that would be 2007 — but it coincides with a particularly high-profile voyage. The Crystal Serenity, a hulking 820-foot, 13-deck cruise ship, set out just last week from Anchorage on a 32-day voyage that will end in New York. It’s the largest ship ever to pass through the Northwest Passage. But with a driving range on board, the cheapest berths going for $22,000 and a $50,000 emergency evacuation insurance policy required for each passenger, it has made, shall we say, waves.

    The Arctic has warmed twice as fast as the rest of the world, dramatically reshaping the region. Since satellite records began in the late 1970s, Arctic sea ice has disappeared at a rate of 13.4 percent per decade. Older sea ice that’s more resilient to breakup is also on the decline. In 30 years, it’s gone from 20 percent of all Arctic ice pack to just 3 percent.

    After a string of record-low months, Arctic sea ice decline has slowed a bit. As of August, Arctic sea ice extent is the third-lowest on record for this time of year. Researchers have said it’s unlikely to reach 2012’s record-setting low, but it’s right in line with the trend of less and less ice.

    The world will continue to warm and ice will continue to melt in the coming decades. That will make the Northwest Passage, as well as other parts of the Arctic, more accessible more often. With that, tourism and resource extraction are also likely to become more common. The Crystal Serenity’s operators have already said they’ll be back in 2017 and other ships are likely to follow.

    Disappearing ice isn’t just reshaping the physical landscape. The cultural landscape could also see a huge shakeup. The Crystal Serenity will allow passengers to disembark in the remote Inuit community of Ulukhaktok in the Northwest Territories. The passengers will outnumber the villagers by 4-to-1, according to Slate (which has a particularly biting writeup of the cruise).

    The voyage has the potential to bring much-needed money and cultural exchange to some of the most remote places in North America. But how that ripples through the cultural fabric is something that’s yet to be determined.

    And that’s to say nothing of the geopolitical struggle between countries looking to cash in on mineral, oil and other riches that have been hidden under ice for all of modern human history. It truly is a brave, new top of the world.

  8. In Streak of Extreme Storms, What’s the Role of Warming?

    The staggering rains that swamped some 60,000 houses in southern Louisiana and shattered the previous state rain record are the latest — and perhaps most remarkable — in a string of jaw-dropping rain events across the U.S. over the past year.

    From South Carolina to Houston to West Virginia and Ellicott City, Md., each instance of extreme rainfall and subsequent flooding raises questions about the potential role of climate change in making such events more likely.

    Flooded homes are seen in St. Amant, La., on Aug. 15, 2016.
    Credit: REUTERS/Jonathan Bachman

    Storms are expected to unleash more rain in the future as the world continues to heat up, and an uptick in heavy rainfall events in the U.S. over the past few decades is already evident.

    So-called attribution studies have pointed to increased odds of certain events thanks to rising global temperatures, but what about the recent spate of intense rain events? What can we say about the potential role of global warming in juicing such a seemingly remarkable streak?

    ‘It’s Just Insanity’

    Even amongst the array of other mind-boggling rain and flood events around the U.S., the recent Louisiana disaster stands out.

    The slow-moving storm continuously pulled moisture from the Gulf of Mexico, dumping rain over the same area for hours and hours. The Baton Rouge airport recorded 32 straight hours of rainfall, from the night of Aug. 11 through the morning of Aug. 13.

    “It literally rained, for me, every waking minute of the day on Friday,” Barry Keim, the Louisiana state climatologist, said. “There was never a moment when it was not raining at my house.”

    For Baton Rouge, a 1-in-100-year rain event (or one that has a 1 percent chance of happening in any given year) would be 14 inches falling over two days. That would already be a “pretty rare event,” said Keim, who also studies climate extremes at Louisiana State University.

    A 1-in-1,000-year event — “we’re talking about something that’s not likely to ever happen” — would be 21 inches falling over the same time period, he said.
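    For readers unfamiliar with return-period language, the conversion Keim is using is straightforward arithmetic: a 1-in-N-year event has a 1/N chance of occurring in any given year, and the odds of seeing at least one over a longer stretch follow directly. A minimal sketch:

    ```python
    # Return-period arithmetic: a 1-in-N-year event has a 1/N chance in any
    # given year; over a span of years the chance of at least one occurrence
    # is 1 - (1 - 1/N)**years, assuming each year is independent.

    def annual_probability(return_period_years):
        return 1.0 / return_period_years

    def chance_of_at_least_one(return_period_years, years):
        p = annual_probability(return_period_years)
        return 1.0 - (1.0 - p) ** years

    for rp in (100, 1000):
        print(f"1-in-{rp}-year event: {annual_probability(rp):.1%} per year, "
              f"{chance_of_at_least_one(rp, 30):.1%} chance of at least one in 30 years")
    ```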

    There were nine stations in the area that topped that 1-in-1,000 level, two of which saw more than 25 inches in just two days. The highest rainfall was recorded in Watson, La., which saw 31.39 inches. That obliterated the previous two-day rainfall record by more than 7 inches.

    “It’s just insanity,” Keim said.

    Half of southern Louisiana received 10 inches or more of rain, and it’s possible that more homes were flooded in this event than by Hurricane Katrina, Keim said. Many of those homes hadn’t flooded during the previous flood of record, in 1983, or at any time since.

    “The whole region just got absolutely hammered,” Keim said.

    More Warming Means More Rain

    As the world warms, more water evaporates from the Earth’s surface into the atmosphere, which means that more moisture is available to storms like the one that hit Louisiana.

    “So naturally, any event that would occur anyway is going to produce more rain and is more likely to produce flooding rains,” Ken Kunkel, a climate scientist with the National Centers for Environmental Information, said.

    Total rainfall across Louisiana for the week ending Aug. 18, 2016. A widespread area saw 10 inches or more of rain.
    Credit: NOAA

    A trend in more heavy rainfall over recent decades has been noted in the Intergovernmental Panel on Climate Change report, the National Climate Assessment and “anywhere there’s data pretty much,” Adam Sobel, a climate scientist who leads Columbia University’s Initiative on Extreme Weather and Climate, said.

    The National Climate Assessment concluded that heavy precipitation events have become both more common and more intense in the U.S. in the last few decades. Work of Kunkel’s included in the report showed a clear increase in the number of two-day rain events that were above a 1-in-5-year threshold.

    In the Southeast specifically, there has been a 27 percent increase in the heaviest rain events, or those in the top 1 percent of all rains. Not every year has more heavy rainfall events than the last — 2012, for example, saw an intense, widespread drought — but the overall trend is clear.

    Kunkel has extended his analysis used in the NCA through this event, and to date, 2016 already ranks in the top 1 percent of all years for heavy rainfall. (Last year had the third most events.) And with several months still to go, “we have plenty of time to pick up more,” he said.

    Though he has studied these trends for years, Kunkel can still be surprised at the sheer magnitude of some downpours, including those in Louisiana.

    “Things just keep going up,” he said.

    Attribution Sheds Light on Climate Signal

    It’s not just abundant moisture that drives deluges. Particular types of storms or the paths they follow can spur heavy rains, and warming may have differing effects on each. For example, a study published in February found that a certain jet stream pattern that led to one of the wettest winters in U.K. history was more likely to occur with warming.

    Attribution studies can account for all of these factors by using climate models to compare the likelihood of an event occurring in a world with greenhouse gas-driven warming and one without it. (They can also compare how often they occur now to how often they did so in the past if quality observations go back far enough in time.) Sometimes a clear climate change signal can’t be found, but that doesn’t mean one isn’t there.

    The same process could be done with a series of events, experts said, comparing the relative likelihood of a certain number of extreme rain events happening in, say, the U.S., over a given period of time with and without global warming — and evaluating whether that number is more than would be expected by chance.
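    In outline, that comparison is a counting exercise. The sketch below is not any group’s published attribution method; it only illustrates the idea with made-up rates, treating extreme-rain events as a Poisson process and asking how surprising a given year’s tally would be with and without warming.

    ```python
    import math

    # Sketch of the counting argument described above (illustrative rates, not
    # published values): if extreme rain events occur as a Poisson process, how
    # surprising is an observed count under a "world without warming" rate, and
    # how much more likely is it in a warmed world?

    def poisson_pmf(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    def prob_at_least(k, lam):
        return 1.0 - sum(poisson_pmf(i, lam) for i in range(k))

    rate_no_warming = 2.0    # hypothetical expected events per year, counterfactual climate
    rate_warming = 3.5       # hypothetical expected events per year, current climate
    observed = 6             # hypothetical count of extreme-rain events in one year

    p_counterfactual = prob_at_least(observed, rate_no_warming)
    p_actual = prob_at_least(observed, rate_warming)

    print(f"P(>= {observed} events | no warming)   = {p_counterfactual:.3f}")
    print(f"P(>= {observed} events | with warming) = {p_actual:.3f}")
    print(f"Risk ratio: {p_actual / p_counterfactual:.1f}x more likely in the warmed world")
    ```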

    The U.S. is a large area, so it’s not surprising to have a few such events each year, Geert Jan van Oldenborgh of the Royal Netherlands Meteorological Institute (KNMI) said in an email. Van Oldenborgh works with Climate Central’s World Weather Attribution project to conduct rapid-response attribution after an extreme weather event.

    It would take more runs of climate models to produce robust statistics for years with multiple extreme events, but “it’s definitely technically possible,” Sobel said. While no one has attempted it so far, Sobel said it was likely just a matter of time.

    Sobel doesn’t think the onslaught of storms this year can be entirely chalked up to the influence of global warming, given other influences. These include impermeable surfaces like pavement, which can exacerbate flooding, as well as sprawl, which means heavy rains are more likely to fall on a populated area. Not to mention more media attention being paid to such events.

    “They seem to be coming so fast and hard that I just can’t quite believe it’s all global warming,” Sobel said. “We expect to see more, but not way more by tomorrow.”

  9. Most Cities Too Hot to Host Summer Olympics by 2085

    By Lily Jamali, Reuters

    In 70 years, most cities in the Northern Hemisphere will be unfit to host the summer Olympics due to rising temperatures associated with climate change, according to a medical journal's findings.

    “Our study using climate change projection shows that there will be very few cities at the end of the century that will be able to hold the summer Olympics as we know them today,” said John Balmes, a professor of public health at the University of California, Berkeley, who co-authored a paper published last week in The Lancet.

    The Olympic rings are seen as the sun rises over Fort Copacabana ahead of the men's road race at the Rio Olympics.
    Credit: REUTERS/Paul Hanna

    The findings indicate that by 2085, only eight Northern Hemisphere cities outside of Western Europe are likely to be cool enough to host the summer Games. San Francisco would be one of just three North American cities that could serve as hosts.

    The researchers used projections for the wet-bulb globe temperature (WBGT), a measurement that combines temperature, humidity, heat radiation and wind, to determine the viability of potential host cities.

    They focused their search in the Northern Hemisphere, home to 90 percent of the world’s population, and only considered cities with at least 600,000 residents, the size considered necessary for hosting the Games.

    The findings assumed that any city with more than a 10 percent chance of having to cancel a marathon due to temperatures exceeding 26 degrees Celsius, or 78 degrees Fahrenheit, would not be a viable venue.
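    Both the WBGT combination and the 10 percent cancellation test can be written down compactly. In the sketch below, the weighting is the standard outdoor WBGT formula; the projected race-day values are random placeholders, not the study’s climate-model output.

    ```python
    import random

    # Standard outdoor wet-bulb globe temperature (WBGT), in deg C:
    #   WBGT = 0.7*T_wet_bulb + 0.2*T_globe + 0.1*T_dry_bulb
    # The globe temperature folds in radiant heat; wind enters through the
    # (natural) wet-bulb reading.
    def wbgt(t_wet_bulb, t_globe, t_dry_bulb):
        return 0.7 * t_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

    # Illustrative screening test in the spirit of the study: a city fails if
    # more than 10 percent of simulated race days exceed 26 C WBGT.
    # Placeholder projections stand in for the 2085 climate-model output.
    random.seed(1)
    projected_days = [wbgt(random.gauss(22, 3), random.gauss(30, 4), random.gauss(27, 3))
                      for _ in range(1000)]

    share_too_hot = sum(d > 26 for d in projected_days) / len(projected_days)
    print(f"Share of race days above 26 C WBGT: {share_too_hot:.1%}")
    print("Viable host" if share_too_hot <= 0.10 else "Too risky to host")
    ```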

    Christ the Redeemer during sunrise in Rio de Janeiro, Brazil.
    Credit: REUTERS/Kai Pfaffenbach

    “If you’re going to be spending billions of dollars to host an event, you’re going to want to have a level of certainty that you’re not going to have to cancel it at the last minute,” said UC Berkeley Professor Kirk Smith, a co-author on the study.

    According to the projection models, by 2085 all of the cities that are or were in contention for either the 2020 or 2024 summer Olympics (Istanbul, Madrid, Rome, Paris and Budapest) would be unfit to host the games.

    Tokyo, the host of the 2020 games, would also be too hot to ensure athlete safety.

    In North America, the only suitable sites would be Calgary, Vancouver and San Francisco.

    “If we project out to the 22nd century, then there are only 4 cities in the world that can host the summer Olympics and that would be Edinburgh, Glasgow, Dublin and Belfast,” said Balmes.

    Writing by Ben Gruber in Miami; Editing by Meredith Mazzilli

  10. After Scorching Heat, Earth Likely to Get Respite in 2017

    By Alister Doyle, Reuters

    The Earth is likely to get relief in 2017 from record scorching temperatures that bolstered governments' resolve last year in reaching a deal to combat climate change, scientists said.

    July was the hottest single month since records began in the 19th century, driven by greenhouse gases and an El Niño event warming the Pacific. And NASA this week cited a 99 percent chance that 2016 will be the warmest year, ahead of 2015 and 2014.

    Dams containing small amounts of water can be seen in a drought-affected farming area located west of Melbourne, Australia, in this picture taken on January 12, 2016.
    Credit: REUTERS/David Gray

    In a welcome break, a new annual record is unlikely in 2017 since the effect of El Niño — a phenomenon that warms the eastern Pacific and can disrupt weather patterns worldwide every two to seven years — is fading.

    "Next year is probably going to be cooler than 2016," said Phil Jones of the Climatic Research Unit at Britain's University of East Anglia. He added there was no sign of a strong La Nina, El Niño's opposite that can cool the planet.

    In 1998, a powerful El Niño led to a record year of heat and it took until 2005 to surpass the warmth. That hiatus led some people who doubt mainstream findings that climate change has a human cause to conclude that global warming had stopped.

    "If 2017 is cooler, there will probably be some climate skeptics surfing on this information," said Jean-Noel Thepaut, head of the Copernicus Climate Change Service at the European Center for Medium-Range Weather Forecasts.

    "The long-term trend is towards warming but there is natural variability so there are ups and downs. The scientific community will have again to explain what is happening," he told Reuters.

    The spike in temperatures in 1998 may also have contributed to several years of reduced government attention to climate change, which has been linked to more heat waves, floods, downpours and rising sea levels.

    "One thing that the scientific community needs to be careful about is that they are not gearing up for a new 'hiatus' event," said Glen Peters of the Center for International Climate and Energy Research in Oslo.

    At a Paris summit last December, governments agreed the most comprehensive plan yet to shift away from fossil fuels, setting a goal of limiting the rise in temperatures to "well below" two degrees Celsius (3.6 Fahrenheit) above pre-industrial times, ideally 1.5 Celsius.

    Scientists are meeting in Geneva this week to sketch out themes for a report about the 1.5°C goal that was requested by world leaders at the summit for delivery in 2018.

    Reporting by Alister Doyle; editing by Mark Heinrich