Tuesday, April 28, 2015

Climate Central - News

Climate Central is a nonprofit science and media organization created to provide clear and objective information about climate change and its potential solutions.
  1. These Are The Amazon Trees That Keep The Planet Cool

    Within the botanical menagerie that makes up the Amazon rainforest, which is so important it’s frequently dubbed the “lungs of the planet,” scientists have pinpointed a small number of tree species that are doing the heaviest breathing as they help to slow global warming.

    Their discovery — that 182 species store half the rainforest’s woodbound carbon — suggests that the future of the world’s climate, and the contours of its coastal areas, are intertwined with the fate of this small portion of an estimated 16,000 Amazonian tree species.
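    To put that “small portion” in perspective, 182 species out of an estimated 16,000 is just over 1 percent of the species pool holding half the carbon. A quick check of the ratio, using only the figures quoted above:

```python
# share of Amazonian tree species said to store half the forest's woody carbon
hyperdominant_species = 182
estimated_total_species = 16_000

share_percent = 100 * hyperdominant_species / estimated_total_species
print(round(share_percent, 1))  # prints 1.1
```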

    Brazil nut trees store more than 1 percent of the woody carbon in the Amazon rainforest. Credit: CIFOR/Flickr

    Despite ongoing logging and recent drought, the Amazon is home to perhaps a sixth of the carbon stored in living vegetation the world over, helping to keep levels of climate-changing carbon dioxide down in the atmosphere.

    “The Amazon is a massively important carbon stock, and it’s currently acting as a carbon sink,” Leeds University’s Sophie Fauset, who led the research, said. “What we’re trying to do is increase our understanding of where this carbon is going; which trees are storing it.”

    The findings were published Tuesday in Nature Communications following analysis of data covering 530 areas. The most common tree identified in the study, a variety of palm known to scientists as Iriartea, was also found to hold the most carbon. But the other 181 species identified as the most important for carbon storage weren’t necessarily the most common species in the rainforest. They were species that shared combinations of important features, being relatively abundant, long-living and large-growing.


    “There are a few species that seem to grow big — and those are the ones you’d want to emphasize in conservation,” said University of California at Berkeley forest ecology professor John Battles, who was not involved with the research. “If you were managing these forests, you would leave these trees.”

    One of the most carbon-hungry types of trees identified in the study was the Brazil nut tree, which grows trunks that can easily exceed a height of 100 feet. In a list ranking species by the total number of individual trees growing within the 530 studied plots, Brazil nut trees ranked 243rd. In terms of total growth and productivity, by contrast, they ranked fourth overall, and they were found to contain 1.3 percent of the forest’s carbon.

    Brazil nut harvesting in Peru. Credit: CIFOR/Flickr

    “The default assumption would be, I think, that given the enormous biological diversity of tropical forests, carbon cycling would be more equitably distributed among plant species,” National Center for Atmospheric Research scientist Rosie Fisher, who wasn’t involved with the study, said. “This discovery overturns that paradigm.”

    Fisher said she would be “hesitant to suggest the most obvious idea — that we could store lots of carbon by planting these very large species,” because so little is known about how sensitive they are to the types of droughts and fires that are projected to become more common in the future, or whether they would thrive in managed forests.

    More work will be needed to determine how the findings could be applied to conservation and climate protection strategies — particularly as the climate changes in the Amazon and across the rest of the planet. Rainfall rates have dwindled by a quarter across the southeastern Amazon since the turn of the century, with deforestation and changes in atmospheric circulation regarded as culprits.

    “We must remember that these species established under 20th century climate conditions,” Fisher said. “The hyper-dominant species of the coming decades may need to possess different characteristics.”


  2. Pacific Northwest’s ‘Wet Drought’ Possible Sign of Future

    The desiccated soils and barren slopes of California have grabbed news headlines for months on end as the state endures its fourth year of a crippling drought, one that has forced unprecedented statewide water restrictions and caused billions of dollars in agricultural losses.

    Unusually low snow levels seen at Oregon's Crater Lake on April 21, 2015.
    Credit: NPS

    But while most eyes have been trained on the plight of the Golden State, its neighbors to the north are also facing a dearth of water, victims of some of the same atmospheric forces that have left California parched.

    Oregon and Washington aren’t currently in the same dire straits as California, having at least received a fair bit of rain this winter, but the warm, snowless conditions could be a harbinger of the future in an overall warming world. Some experts and officials are hoping the region can learn from today’s situation to better prepare for an altered climate later in the century.

    “We have an opportunity here to start thinking about our future,” Kathie Dello, deputy director of the Oregon Climate Service at Oregon State University, said.

    Wet Drought

    The drought in California is one of both heat and dryness, as a persistent ridge of high pressure that parked itself over the western U.S. over the past two winters blocked much-needed storms and drove up temperatures to spring and summer levels.

    Oregon and Washington, on the other hand, are stuck in a seemingly oxymoronic wet drought. The storms that were prevented from hitting California did provide rains to the Pacific Northwest, with winter precipitation in Oregon only about 30 percent below average, not even in the bottom 10 years historically, said Philip Mote, director of the Oregon Climate Service.


    But the sky-high temperatures that marked the warmest winter on record for Washington and the second warmest for Oregon meant that much of the precipitation fell as rain, and not snow. Like California, parts of both these states depend on melting snowfall to fill their reservoirs, leaving them with potential shortages this year. Elevated temperatures also meant that what snow there was melted much earlier than normal.

    Three-fourths of snow survey sites in Oregon had record-low snow measurements as of April 1, and fewer than half of them had any snow on the ground, according to a report by the Natural Resources Conservation Service. The snowpack across much of the Cascade Range in Washington was less than 25 percent of normal, while the Olympic Mountains checked in at only 3 percent on April 1, an “unbelievably low” amount, Karin Bumbaco, assistant state climatologist in Washington, said.

    Water and Wildfires

    Those numbers, along with expectations that the drought conditions will persist if not intensify, have officials bracing for impacts this spring and summer.

    “The two themes that keep coming up are summertime water supply and wildfires,” Dello said.

    The water shortage concerns aren’t as widespread as in California because the western parts of Oregon and Washington tend to depend solely on rain, and so their supplies are fairly healthy. But in eastern areas that do depend on the snowpack to keep reservoirs topped up, residents and officials “are really concerned about what’s going to happen,” Dello said.

    Expected streamflow (as a percentage of normal) for May 1, 2015, across the Pacific Northwest. The drought-stricken areas of eastern Oregon and Washington have the lowest expected streamflows.
    Credit: NIDIS

    In eastern Oregon, there is concern that a lack of water to irrigate pastures for cattle grazing could further drive up the price of beef, and many farmers are already planning to let fields lie fallow, The Oregonian reported. In Washington, junior water users will get only 60 percent of their water allocations, Bumbaco said. The numbers could be worse, she added, but reservoir managers stored more rain than they typically would, anticipating the poor snow runoff.

    The poor spring and summer runoff could also impact local wildlife. The Department of Fish and Wildlife is concerned about the ability of fish, like salmon, to make it down streams to the ocean and is requesting money from the state to truck them to the sea, Bumbaco said.

    Come June and July, the National Interagency Fire Center expects “increasing to above normal” potential for wildfires in a broad swath of the drought-stricken West, including all of Oregon and most of Washington, which could put homes, businesses and ecosystems at risk.

    Warm Western Future

    While the scarcity of snow poses immediate challenges for Pacific Northwest communities, it also presents an opportunity to better prepare the region for a warmer world.

    The Pacific Northwest has already warmed by 1.3°F since 1895, and is expected to have warmed by 3° to 10°F by the end of the century (compared to the 1970-1999 average), according to the National Climate Assessment. And while heavy downpours there are expected to rise because of the greater water-holding capacity of the warmer atmosphere, less of that precipitation will fall as snow at all but the highest elevations. The warmer temperatures also mean a likely earlier spring snowmelt, changing the equations for calculating water supplies during the dry season.

    Given those expectations, this winter stands as an example of what the average winter in Oregon or Washington could be like by the end of the century.

    The National Interagency Fire Center's outlook for wildfire conditions across the country during June and July 2015. Conditions from California through Washington are elevated during that time period due to the drought in the region.
    Credit: NIFC

    “There’s been a lot of talk about that in the community,” Bumbaco said. “I don’t want to say to anyone that this is climate change right now,” she cautioned, but said that it’s a fair statement that it could be a glimpse of the future.

    Just as the drought is forcing some hard reckoning in California in terms of thinking about how water is stored, transported and used — including for watering lush suburban lawns and water-thirsty crops in an arid landscape — it could spur changes to be made in the Pacific Northwest.

    Previous droughts contributed to changes in Seattle’s water system, as well as land-use rules that have contained urban development and prevented the kind of sprawl that has strained water resources in California, Dello said.

    Exactly what form new changes might take is still very much up in the air, but officials have already floated ideas to increase water storage, use recycled water for activities like watering lawns and flushing toilets, modernize irrigation, encourage efficient water fixtures in houses to reduce water use, and perhaps even change the century-old system of parceling out water rights in the West.

    “I don’t see that changing easily; it’s such an institution,” Dello said. But, she added, “people are certainly studying this.”


  3. Blooming Algae Could Accelerate Arctic Warming

    Blooms of algae in the Arctic Ocean could add a previously unsuspected warming feedback to the mix of factors driving temperatures in the north polar regions up faster than any other place on the planet, according to the authors of a new study in Proceedings of the National Academy of Sciences.

    “By the end of the century, this could lead to 20 percent more warming in the Arctic than we would see otherwise,” said lead author Jong-Yeon Park, of the Max Planck Institute for Meteorology, in Hamburg, Germany.

    Average temperatures in the region are already 2.7°F higher than the 1971-2000 average — twice as much as the warming seen in other parts of the world. Even without this newly identified algae feedback, summer temperatures in the region could be as much as 23.4°F warmer than they were before human emissions began in the 1800s. Add 20 percent to that and you’re up to about 28°F — a level that could drastically thaw permafrost and release even more heat-trapping CO2 into the air.
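    The 28-degree figure is simple percentage arithmetic on the numbers quoted above; a back-of-envelope check:

```python
# figures as quoted in the article, not taken from the paper directly
baseline_warming_f = 23.4  # projected summer warming (deg F) without the algae feedback
feedback_boost = 0.20      # the feedback could add roughly 20 percent more warming

with_feedback = baseline_warming_f * (1 + feedback_boost)
print(round(with_feedback, 1))  # prints 28.1, i.e. roughly the 28 degrees cited
```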

    This true-color image captures a bloom in the Ross Sea on January 22, 2011, as viewed by NASA’s Aqua satellite. Bright greens of plant life have replaced the deep blues of open ocean water.
    Credit: NASA

    There’s no question that algae blooms are on the increase as Arctic ice thins. Scientists have generally believed that more algae — more specifically, the type known as phytoplankton — would be good for the climate, since they thrive on CO2 while alive, then carry the carbon they’ve absorbed down to the sea bottom when they die. Some experts have even suggested that fertilizing the oceans to encourage algal growth would be one way to counteract global warming.

    But Park and his co-authors point out that thicker layers of algae on the sea surface would prevent sunlight from penetrating deeper into the water.

    “More heat is trapped in the upper layers of the ocean, where it can be easily released back into the atmosphere,” Park said. He and his team reached this conclusion by marrying computer models of how ocean ecosystems behave to models that simulate the climate. Then they ramped up levels of CO2 to see how the algae would respond to the resulting warming, the extra carbon dioxide itself, and changes in sea ice. 


    The analysis makes sense, according to independent scientists — up to a point, anyway.

    “The authors show that phytoplankton plays a role in the vertical distribution of solar energy reaching the Arctic Ocean,” Mar Fernández-Méndez, a sea-ice biologist at the Max Planck Institute for Marine Microbiology, in Bremen, Germany, said. “But while the study is credible, it’s based on model results, not observations.”

    That being the case, Fernández-Méndez said, any incorrect assumptions in the model would lead to an incorrect conclusion. The particular model Park and his colleagues used, she said, is not specifically designed for the Arctic, where a number of factors could skew the results.

    Scientists have generally believed that more phytoplankton, which thrive on CO2 while alive, would be good for the climate.
    Credit: NOAA

    One, which the authors themselves note, is that the warming of the Arctic Ocean that is already happening could trap nutrients in deeper, cooler layers that would make them less available to feed algae blooms. Another is that an increase in Arctic cloud cover — a plausible outcome of global warming, which promotes evaporation from the oceans — could deprive algae of the sunlight they need to thrive. Nevertheless, said Fernández-Méndez, “the results stress the importance of taking biological processes into account in climate models.”

    This study, like virtually all research that breaks new ground, is hardly likely to be the final word on the matter.

    “This aspect of climate change has not been adequately modelled in the past,” said Victor Smetacek, of the Alfred Wegener Institute for Polar Research, in Bremerhaven, Germany. The new study, he said, is an important step in the right direction.


  4. Extreme Heat and Heavy Rain Events Expected to Double

    Extreme weather events such as droughts, heat waves and torrential rainfalls are the most powerful and obvious reminders that the climate is changing. These disasters were happening long before humans started pumping heat-trapping greenhouse gases into the atmosphere, but global warming has tipped the odds in their favor. A devastating heat wave like the one that killed 35,000 people in Europe in 2003, for example, is now more than 10 times more likely than it used to be.

    But that’s just a single event in a single place, which doesn’t say much about the world as a whole. A new analysis in Nature Climate Change, however, takes a much broader view. About 18 percent of heavy precipitation events worldwide and 75 percent of hot temperature extremes — defined as events that come only once in every thousand days, on average — can already be attributed to human activity, says the study. And as the world continues to warm, the frequency of those events is expected to double by 2100.

    A heavy downpour in Bournemouth on the south coast of England.
    Credit: Chris Downer

    The new research differs from other so-called extreme event attribution studies, not just in its broad-brush approach, but also in how the term “extreme” is defined. 

    The 2003 heatwave that killed so many people in Europe, for example, was so off the charts that it would ordinarily be expected to come along only once in a thousand years without global warming. By contrast, the hot temperatures and heavy precipitation events described in the new paper would normally come along once in every three years, with both "hot" and "heavy" varying depending on what's normal for a given location. “We think of these as ‘moderate extremes,’ ” lead author Erich Fischer, of the Swiss Federal Institute of Technology in Zürich (ETH Zürich), said.

    Because these moderate extremes are by definition more common, and because the authors looked at global statistics rather than those for highly localized, rare events, the conclusions are extremely robust, said Peter Stott, leader of the Climate Monitoring and Attribution Team at the Met Office Hadley Centre, in the U.K. “I think this paper is very convincing,” said Stott, who was not involved in the research.


    It’s also valuable, he said, because even moderate extremes of hot temperatures and precipitation can have significant local impacts. “Policymakers need to know what’s happening overall in terms of exposure.”

    To calculate how the frequency of local extremes has changed as a result of global warming, Fischer and his co-author, Reto Knutti, also at ETH Zürich, fed data from tens of thousands of locations around the globe into 25 state-of-the-art global climate models, then combined the results to see how the current frequency of moderate extremes compares with what would have happened in the absence of climate change.

    They also projected forward in time to see how the frequency of extremes is likely to change. Not surprisingly, they’ll increase as the planet continues to warm. But as Fischer said, “the increase is nonlinear. An amount of warming that doesn’t seem so dramatic — say, from 1.5°C above pre-industrial temperatures to 2°C — will double the number of extreme events.” It won’t be the same everywhere, Fischer said. “But on average, that’s what we see.”
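    One way to see why the increase is nonlinear: the probability of exceeding a fixed “extreme” threshold grows faster than the mean shift that drives it. The toy sketch below uses a shifted standard normal distribution purely for illustration; it is not the authors’ model, and the numbers are arbitrary.

```python
from math import erfc, sqrt

def p_exceed(mean_shift, threshold=2.0):
    # chance that a normally distributed quantity (std. dev. 1), shifted
    # upward by mean_shift, exceeds a fixed "extreme" threshold
    return 0.5 * erfc((threshold - mean_shift) / sqrt(2))

# shifting the mean from 1.5 to 2.0 units raises the exceedance
# probability disproportionately: the tail fills in nonlinearly
for shift in (0.0, 1.5, 2.0):
    print(shift, round(p_exceed(shift), 3))
```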

    Parisians beat the heat by playing in the fountain.
    Credit: Chris Walts/flickr

    The planet is about 1.4°C warmer than it was before human greenhouse emissions began to climb in 1800 or so, and while policymakers have identified the 2°C level as a red line the planet shouldn’t cross, it’s now considered unlikely that this can be avoided.

    By itself, the fact that these moderately extreme events will continue to come more often doesn’t tell people precisely what impacts they can expect on the local level. “We can give you probabilities for how the frequency of events will change,” Fischer said. “But we can’t tell you how much more damage they’ll do.”

    For that, he said, you need to know how vulnerable each location is, and that can vary widely depending on how resilient the local population and infrastructure are to climate extremes. A city in the U.S., for example, or in Western Europe, might be able to withstand extreme events more easily than one in, say, Bangladesh. “More work is obviously needed,” Stott said. “But this is a crucial first step.”


  5. At $24 Trillion, Oceans are World’s 7th-Largest Economy

    By Oliver Milman, The Guardian

    The monetary value of the world’s oceans has been estimated at $24 trillion in a new report that warns that overfishing, pollution and climate change are putting an unprecedented strain upon marine ecosystems.

    The report, commissioned by WWF, states that the asset value of the oceans is $24 trillion and values the annual “goods and services” they provide, such as food, at $2.5 trillion.

    This economic clout would make the oceans the seventh-largest economy in the world, although the report’s authors, who include the Boston Consulting Group, say this is an underestimate because it does not factor in oil, wind power and intangibles such as the ocean’s role in climate regulation.

    Credit: icelight/flickr

    The economic value is largely comprised of fisheries, tourism, shipping lanes and the coastal protection provided by corals and mangroves.

    However, the oceans are facing mounting pressures. They soak up around half of the carbon dioxide pumped into the atmosphere by human activity, a process that is warming the water and increasing the acidification of the ocean.

    The report warns that nearly two-thirds of the world’s fisheries are “fully exploited” with most of the rest overexploited. The biological diversity of the oceans slumped by 39 percent between 1970 and 2010, while half of the world’s corals and nearly a third of its seagrasses have disappeared in this time.

    Professor Ove Hoegh-Guldberg, lead author of the report and director of the Australia-based Global Change Institute, said it was important that the business community understood the value of the oceans so that a strategy could be devised to reverse its decline.

    “If you don’t look after an asset like the ocean it starts to degrade so it’s important we start to solve these problems now on an international basis,” he said. “The oceans are in a bad state that is rapidly getting worse. Fisheries are starting to collapse, there are record levels of pollution, such as plastic pollution, and there is climate change.”

    Hoegh-Guldberg said the “shocking” rate of change in the world’s oceans was illustrated in the latest report by the U.N.’s climate science panel, which stated that changes in the ocean’s chemistry due to an increase in CO2 emissions were faster than at any point in the past 65 million years.

    Warming temperatures can make life challenging for some marine species, while the acidification of the ocean hampers the ability of creatures such as corals and mollusks to form their shells and skeletons.

    “The changes we are making will take at least 10,000 years to turn around, so we don’t want to go down this pathway,” Hoegh-Guldberg said. “This generation of humans is defining the future of 300 generations of humans. We are conducting these experiments with our world despite the consequences for people.”


    Hoegh-Guldberg said that nations should do more to manage localized issues such as pollution and overfishing to help oceans deal with climate change.

    “If you protect marine areas and regulate fishing, you can help corals survive the impact of climate change,” he said. “If we solve these local problems we can buy some time while we deal with the global climate issue. But let’s not pretend here — if we don’t get off the current CO2-rich pathway we’re on now, all the attempts to control local factors won’t work. Coral reefs will become a distant memory and the ability to feed people will be severely degraded.”

    The report calls for eight key steps to revive the health of the oceans, including a stronger focus in U.N. agreements on oceans, deep cuts to emissions, at least 30 percent of marine areas to be protected by 2030 and greater action to tackle illegal fishing.

    Reprinted from The Guardian with permission.

  6. Look What’s Cooking in the World of Renewable Energy

    By Phil McKenna, Ensia

    Inside a sprawling single-story office building in Bedford, Mass., in a secret room known as the Growth Hall, the future of solar power is cooking at more than 2,500°F. Behind closed doors and downturned blinds, custom-built ovens with ambitious names like “Fearless” and “Intrepid” are helping to perfect a new technique of making silicon wafers, the workhorse of today’s solar panels. If all goes well, the new method could cut the cost of solar power by more than 20 percent in the next few years.

    The hope is that silicon wafers will cut the cost of solar power by more than 20 percent in the next few years.
    Credit: 1366 Technologies

    “This humble wafer will allow solar to be as cheap as coal and will drastically change the way we consume energy,” says Frank van Mierlo, CEO of 1366 Technologies, the company behind the new method of wafer fabrication.

    Secret rooms or not, these are exciting times in the world of renewable energy. Thanks to technological advances and a ramp-up in production over the past decade, grid parity — the point at which sources of renewable energy such as solar and wind cost the same as electricity derived from burning fossil fuels — is quickly approaching. In some cases it has already been achieved, and additional innovations waiting in the wings hold huge promise for driving costs even lower, ushering in an entirely new era for renewables.

    Solar Surprise

    In January 2015, Saudi Arabian company ACWA Power surprised industry analysts when it won a bid to build a 200-megawatt solar power plant in Dubai that will be able to produce electricity for 6 cents per kilowatt-hour. The price was less than the cost of electricity from natural gas or coal power plants, a first for a solar installation. Electricity from new natural gas and coal plants would cost an estimated 6.4 cents and 9.6 cents per kilowatt-hour, respectively, according to the U.S. Energy Information Administration.

    Technological advances, including photovoltaics that can convert higher percentages of sunlight into energy, have made solar panels more efficient. At the same time economies of scale have driven down their costs.

    For much of the early 2000s, the price of a solar panel or module hovered around $4 per watt. At the time Martin Green, one of the world’s leading photovoltaic researchers, calculated the cost of every component, including the polycrystalline silicon ingots used in making silicon wafers, the protective glass on the outside of the module, and the silver used in the module’s wiring. Green famously declared that so long as we rely on crystalline silicon for solar power, the price would likely never drop below $1/watt.


    The future, Green and nearly everyone else in the field believed, was with thin films, solar modules that relied on materials other than silicon that required a fraction of the raw materials.

    Then, from 2007 to 2014, the price of crystalline silicon modules dropped from $4 per watt to $0.50 per watt, all but ending the development of thin films.
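    That drop, from $4 to $0.50 per watt over seven years, works out to an average compounded price decline of roughly 26 percent per year. A quick check:

```python
# implied average annual price decline for crystalline silicon modules,
# 2007-2014: $4 per watt down to $0.50 per watt over 7 years
start_price, end_price, years = 4.0, 0.50, 7

annual_factor = (end_price / start_price) ** (1 / years)
print(round(100 * (1 - annual_factor), 1))  # percent decline per year
```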

    The dramatic reduction in cost came from a wide number of incremental gains, says Mark Barineau, a solar analyst with Lux Research. Factors include a new, low-cost process for making polycrystalline silicon; thinner silicon wafers; thinner wires on the front of the module that block less sunlight and use less silver; less-expensive plastics instead of glass; and greater automation in manufacturing.

    “There is a tenth of a percent of an efficiency gain here and cost reductions there that have added up to make solar very competitive,” Barineau says.

    25 Cents Per Watt

    “Getting below $1 [per watt] has exceeded my expectations,” Green says. “But now, I think it can get even lower.”

    One likely candidate to get it there is 1366’s new method of wafer fabrication. The silicon wafers behind today’s solar panels are cut from large ingots of polycrystalline silicon. The process is extremely inefficient, turning as much as half of the initial ingot into sawdust. 1366 takes a different approach, melting the silicon in specially built ovens and recasting it into thin wafers for less than half the cost per wafer, which works out to a 20 percent drop in the overall cost of a crystalline silicon module. 1366 hopes to begin mass production in 2016, according to van Mierlo.
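    The 20 percent figure implies that the wafer accounts for roughly 40 percent of a module’s total cost (that share is an inference from the numbers above, not something the article states). A rough check under that assumption:

```python
# assumed: the wafer's share of total module cost (inferred, not stated)
wafer_share = 0.40
# the new process cuts wafer cost to half (or less) of the old cost
new_wafer_cost_factor = 0.50

module_savings = wafer_share * (1 - new_wafer_cost_factor)
print(module_savings)  # prints 0.2, i.e. a 20 percent drop in module cost
```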

    Cost-reducing thin film solar photovoltaic technology could be experiencing a renaissance, thanks to recent efficiency innovations by U.S. manufacturer First Solar.
    Credit:  First Solar, Inc.

    Meanwhile, thin films, once thought to be the future of solar power, then crushed by low-cost crystalline silicon, could experience a renaissance. The recent record-setting low-cost bid for solar power in Dubai harnesses thin-film cadmium telluride solar modules made by U.S. manufacturer First Solar. The company not only hung on as the vast majority of thin film companies folded, but has consistently produced some of the least expensive modules by increasing the efficiency of their solar cells while scaling up production. The company now says it can manufacture solar modules for less than 40 cents per watt and anticipates further price reductions in coming years.

    Ten years from now we could easily see the cost of solar modules dropping to 25 cents per watt, or roughly half their current cost, Green says. To reduce costs beyond that, the conversion efficiency of sunlight into electricity will have to increase substantially. To get there, other semiconducting materials will have to be stacked on top of existing solar cells to convert a wider spectrum of sunlight into electricity.

    “If you can stack something on top of a silicon wafer it will be pretty much unbeatable,” Green says.

    Green and colleagues set a record for crystalline silicon solar module efficiency at 22.9 percent in 1996 that still holds today. Green doubts the efficiency of crystalline silicon alone will ever get much higher. With cell stacking, however, he says “the sky is the limit.”

    A Matter of Size

    While solar power is just starting to reach grid parity, wind energy is already there. In 2014, the average worldwide price of onshore wind energy was the same as electricity from natural gas, according to Bloomberg New Energy Finance.

    As with solar, the credit goes to technological advances and volume increases. For wind, however, innovation has mainly been a matter of size. From 1981 to 2015 the average length of a wind turbine rotor blade has increased more than sixfold, from 9 meters to 60 meters, as the cost of wind energy has dropped by a factor of 10.

    Bigger wind turbine blades mean more energy captured — to a point. 
    Credit: floato/flickr

    “Increasing the rotor size means you are capturing more energy, and that is the single most important driver in reducing the cost of wind energy,” says D. Todd Griffith of Sandia National Laboratories in Albuquerque, New Mexico.
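    Griffith’s point is largely geometric: the energy a turbine can capture scales with the area swept by the rotor, and that area grows with the square of blade length. A minimal sketch, ignoring efficiency limits and other real-world factors:

```python
from math import pi

def swept_area(blade_length_m):
    # area of the disc swept by the rotor grows with the square of blade length
    return pi * blade_length_m ** 2

# the article's sixfold growth in blade length (9 m to 60 m)
# multiplies the capture area by roughly 44 times
print(round(swept_area(60) / swept_area(9), 1))  # prints 44.4
```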

    Griffith recently oversaw the design and testing of several 100-meter-long blade models at Sandia. His group didn’t actually build the blades, but created detailed designs that they subsequently tested in computer models. When the project started in 2009, the biggest blades in commercial operation were 60 meters long. Griffith and his colleagues wanted to see how far they could push the trend of ever-increasing blades before they ran into material limitations.


    Their first design was an all-fiberglass blade that used a shape and materials similar to those found in the relatively smaller commercial blades of the time. The result was a prohibitively heavy 126-ton blade, so thin and long that it would be susceptible to vibration in strong winds and to gravitational strain.

    The group made two subsequent designs employing stronger, lighter carbon fiber and a blade shape that was flat-backed instead of sharp-edged. The resulting 100-meter blade design was 60 percent lighter than the initial model.

    Since the project began in 2009 the largest blades used in commercial offshore wind turbines have grown from 60 meters to roughly 80 meters with larger commercial prototypes now under development. “I fully expect to see 100 meter blades and beyond,” Griffith says.

    As blades grow longer, the towers that elevate them are getting taller to catch more consistent, higher speed wind. And as towers grow taller, transporting them is becoming increasingly expensive. To counter the increased costs GE recently debuted a “space frame” tower, a steel lattice tower wrapped in fabric. The new towers use roughly 30 percent less steel than conventional tube towers of the same height and can be delivered entirely in standard-size shipping containers for on-site assembly. The company recently received a $3.7 million grant from the U.S. Department of Energy to develop similar space frame blades.

    Offshore Innovation

    Like crystalline silicon solar panels, however, existing wind technology will eventually run up against material limits. Another innovation on the horizon for wind is related instead to location. Wind farms are moving offshore in pursuit of greater wind resources and less land use conflict. The farther offshore they go, the deeper the water, making the current method of fixing turbines to the seafloor prohibitively expensive. If the industry moves instead to floating support structures, today’s top-heavy wind turbine design will likely prove too unwieldy.

    One potential solution is a vertical axis turbine, one where the main rotor shaft is set vertically, like a merry-go-round, rather than horizontally like a conventional wind turbine. The generator for such a turbine could be placed at sea level, giving the device a much lower center of gravity.

    “There is a very good chance that some other type of turbine technology, very well vertical axis, will be the most cost effective in deep water,” Griffith says.

    The past decade has yielded remarkable innovations in solar and wind technology, bringing improvements in efficiency and cost that in some cases have exceeded the most optimistic expectations. What the coming decade will bring remains unclear, but if history is any guide, the future of renewables looks extremely positive. 

    UPDATED 4.15.15: The section on wind energy innovations was updated to clarify that Sandia designed but did not construct 100-meter turbine blades.

    Reprinted from Ensia with permission.

  7. Carbon Pricing Helping Farmers Ease Methane Pollution

    SACRAMENTO COUNTY, CALIF. — Leo Van Warmerdam pointed to a red shed housing a large generator on his family’s dairy farm as he loped over two acres of manure. The thick black plastic stretching across the manure ballooned as he walked on it, inflated by methane building from beneath. The cover seemed to be doing its job: It didn’t smell much different above the lagoon than it did elsewhere in this livestock-dominated swath of the Central Valley, just south of Sacramento.

    Beneath the sun-parched plastic, microbes were breaking down the manure from about 1,000 cows into methane, which was being piped to the generator and burned to produce electricity.

    These systems are called biogas digesters, and they’re helping to protect the climate from methane, which is a far more potent greenhouse gas than carbon dioxide. As the world’s appetite for meat and dairy has grown, agriculture has become a bigger cause of global warming than deforestation, and that’s mostly because of the methane released by livestock farming.

    On Thursday, the Obama Administration laid out a 10-point plan for reducing climate pollution that focused on rural areas, including farms and forests. Key among those 10 points was a plan to encourage more farmers to install biogas digesters, and to support 500 more biogas installations at farms during the next decade.

    Leo Van Warmerdam surveys his dairy farm near Galt, Calif., from atop its covered manure lagoon.
    Credit: John Upton

    The government will partner with farmers to “significantly reduce” their climate impacts through “incentive-based initiatives,” U.S. Agriculture Secretary Tom Vilsack said during a speech in Michigan on Thursday. That would be done, he said, “while improving yields, increasing farm operation's energy efficiency, and helping farmers and ranchers earn revenue from clean energy production."

    The initiatives should help reduce the more than 2 million tons of methane released by American livestock farming every year. But 500 biogas digesters would be a drop in the proverbial bucket.

    The EPA concluded in 2010 that such systems are “technically feasible” at more than 8,200 American dairy and swine farms, and that they could also be used at some poultry operations. Yet fewer than 200 had been installed nationwide, with high costs found to be a primary reason for the slow uptake.

    Biogas digesters produce a useable and tradeable commodity — energy — but installations on smaller farms rarely produce enough of it to pay for themselves. In California, a cap-and-trade system that charges polluters a fee for releasing carbon dioxide is helping to close the financial viability gap.

    {related}

    Construction of biodigesters in California starts at around $1.5 million for the smallest farms. Thursday’s announcement by the White House, combined with the continued growth of cap-and-trade systems, should help farmers overcome some of these high costs.

    California’s cap-and-trade system requires large climate polluters, including power plants and gasoline suppliers, to purchase permits that allow the release of climate-changing carbon dioxide. The allowances sell for a little more than $12 per ton of carbon dioxide, with about $1 billion a year in revenue being spent on green projects. Revenues are projected to grow in the coming years, even as the cap on the number of available allowances shrinks to help the state meet its ambitious climate-protection laws.

    But the agricultural and forestry sectors, along with methane pollution and ozone-depleting greenhouse gases, are not directly regulated by the scheme. So California gives polluters that are regulated by it an option. If they can find a cheaper way of keeping the equivalent of a ton of carbon dioxide out of the atmosphere, by spending on certain types of forestry, agricultural or industrial projects, then they can use carbon offset credits from those projects instead of allowances to permit the release of up to 8 percent of their pollution.
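The 8 percent ceiling acts as a simple cap on how much of a polluter’s compliance obligation can be met with offsets rather than allowances. A quick illustrative sketch (the obligation figure below is hypothetical, not from the article):

```python
def max_offset_tons(obligation_tons, offset_limit=0.08):
    """Maximum emissions (tons of CO2-equivalent) a regulated polluter
    may cover with offset credits under an 8 percent usage limit."""
    return obligation_tons * offset_limit

# A hypothetical polluter with a 1-million-ton compliance obligation
# could cover at most 80,000 tons with offsets; the rest needs allowances.
print(max_offset_tons(1_000_000))  # → 80000.0
```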

    Farmers work through agents to sell their offset credits. The farmers can build their own biogas systems, or they can farm out the job to specialists. Van Warmerdam does the latter. He leases land to an operator for an undisclosed amount. “We’re diversifying our income,” he said. “It’s worthwhile. There’s no risk to me.”

    The operator, Daryl Maas, sells electricity from the burning methane to the local utility. He’s also working with a broker who will sell offset credits produced by the project to Californian polluters. When he operates in Washington state, Maas instead sells such offsets to voluntary programs, such as those that offer to offset the climate impacts of a long flight or a corporation’s operations, though those bring in considerably less cash.

    Maas says such offsets can provide as much as a quarter of a biogas system’s revenue. “You can’t build the whole project around it,” he said. “But it’s a nice, welcome addition.”

    Carbon offsets like those allowed in California are common in cap-and-trade systems, which are rapidly growing in popularity worldwide. But offsets are controversial. Critics question why a farmer like Van Warmerdam should receive public funds for a project that might have been built anyway. Indeed, Van Warmerdam says he had been trying to install such a system for nearly a decade before cap-and-trade began a couple years ago. He was stymied twice — not directly by financial issues, he says, but by problems with contractors. “It never made it to the building stage,” he said.

    Manure from the Van Warmerdam family's cows flows into a covered lagoon, seen on the other side of the shed, where it turns into methane.
    Credit: John Upton

    Two environmental groups, the Citizens’ Climate Lobby and Our Children’s Earth, sued California to try to block the use of offsets, which they argued lacked integrity and could be abused. Another environmental group, the Environmental Defense Fund, which has expressed confidence in the integrity of the cap-and-trade system’s verification system, has been helping California defend itself in court.

    A San Francisco Superior Court judge ruled against the lawsuit two years ago, pointing to research indicating that fewer than 10 percent of the 8,200 biogas systems that could be installed in the U.S. would be installed without offset credits.

    “It is not standard practice to install anaerobic digesters,” Judge Ernest Goldsmith wrote in his ruling. “Cost is the primary barrier to installing digesters and offset credits directly address this problem.” The California State Supreme Court is due to decide whether it will hear an appeal filed by Our Children's Earth.

    California’s cap-and-trade began operating in early 2013 — just as Maas was building the biogas system on the Van Warmerdams’ farm. He says the young age of the cap-and-trade program, combined with uncertainty over the future of its rules, can make it difficult to rely on stable revenues from carbon offsets allowed under it. “It will just take a while to mature,” he said. “But, already, there’s a market there.”

    California’s cap-and-trade program is set to expire in 2020. Lawmakers are expected to renew it before then, but it’s impossible to say what changes might be introduced along the way.

    “You can’t execute a project beyond 2020,” Maas said. “The price is really the biggest question mark. What’s the price of carbon going to be? That’s determined by regulation.”

    Many of the questions on Van Warmerdam's mind, meanwhile, related to whether biogas systems would eventually become mandatory.

    “I see that some day they’re probably going to force you to put these things in,” he said.

    That helped spur him to join the hundreds of pioneering American farmers who are already using grants and carbon offset revenues to turn waste manure into a commodity — instead of being knee-deep in an environmental menace.

    Editor's note: This story originally misspelled Leo Van Warmerdam's last name.

    {like}

  8. New NASA Satellite Gets the Dirt on Soil Moisture

    Tracking soil moisture is a dirty job, but someone’s gotta do it.

    Soil moisture is a critical indicator of drought. For decades, ground observations have done the heavy lifting but they’re few and far between. That’s why NASA spent $1 billion to launch a soil moisture monitoring satellite earlier this year. After months of calibration, the satellite, dubbed the Soil Moisture Active Passive mission, or SMAP (go ahead and try not to say “oh, SMAP” in your head), has sent back the first global view of soil moisture.

    Click image to enlarge. Credit: NASA Jet Propulsion Laboratory-Caltech/GSFC

    SMAP uses two instruments — a radar and radiometer — to measure soil moisture at a 5.6-mile resolution, all with the goal of providing a better view of how water moves across the planet, particularly on land (a helpful piece of knowledge for humans). The above map was created using the radar, which sends microwave pulses from the satellite down to the Earth’s surface 426 miles below, and then measures the backscatter that pops back up. For the map, good soil moisture in places such as the Amazon basin and forests of northern Canada is indicated in red. In comparison, drier spots from the Arctic tundra to the Sahara Desert are in blue.

    Shifts in soil moisture from week to week or year to year are a key piece of information for people working in fields and forests, providing a measure of drought and an indicator for forecasting river flows, reservoir levels, the severity of wildfire season and crop irrigation availability.

    {related}

    SMAP’s equipment also allows it to monitor whether soil is frozen, muddy or dried out. The frozen-or-not part could prove particularly insightful for Arctic permafrost, which stores large quantities of methane, a potent greenhouse gas. That permafrost is expected to thaw as the planet warms, but observation stations in the region are sparse and estimates of that thaw could get a boost with the new data.

    More soil moisture data is also key for climate scientists looking to understand what areas are likely to get drier due to climate change. SMAP’s high-resolution data means they’ll be able to refine their models and help everyone from farmers to firefighters understand what the future could have in store.

    In the near term, the data could improve drought outlooks and weather forecasts. But it’ll be another month until data becomes available on a regular basis, so you’ll just have to enjoy the sneak peek in the meantime.

    {like}

  9. Scientists Turn to Drones For Closer Look at Sea Ice

    The sun has finally risen above the horizon in the Arctic after months of darkness. That means the floating ice that clogs the world’s northernmost seas every winter is beginning to loosen and it’s time for Christopher Zappa to head for the town of Ny-Ålesund, in the Svalbard Archipelago, a group of islands located about halfway between the northern tip of Norway and the North Pole.

    Zappa, an oceanographer at Columbia University’s Lamont-Doherty Earth Observatory, wants to understand the details of exactly how sea ice breaks up and melts, and he is going to call on a quintessentially 21st century technology to help him do it. Zappa is among a small group of scientists globally who are pioneering the use of “unmanned airborne systems” — or drones, to you and me — in a campaign to better understand Earth’s changing climate.

    Svalbard is an ideal place for Zappa’s studies. The islands lie astride Fram Strait, where sea ice blowing out of the Arctic Ocean streams southward every summer: breakup and melting are going on constantly there from April through September. By September, the ice will dwindle to its annual minimum extent — a minimum that has trended dramatically downward since the late 1970s, largely as a result of global warming. The open water exposed as the ice melts absorbs solar energy that would otherwise bounce back into space, further heating the planet.

    {related}

    For these last two weeks of April and the first week of May, Zappa and several colleagues will be launching their drones, which fly autonomously, on alternating four-hour sorties westward over the ice to measure water and ice temperatures; ocean salinity; albedo (that is, the reflectivity of the ice) and more.

    “Satellite observations are important, but they only give you a big-picture sense of how much ice is there,” Zappa said. Research ships come much closer to the action, but they only let scientists study limited areas of ice.

    “With drones, we can study melting and other processes as they’re happening, on a very fine scale,” Zappa said. And they can cover hundreds of square miles of ice and ocean with every flight. “They’ll go about halfway to Greenland and back on every flight,” he said. It takes just two people to launch and recover the drones, which take off and land like conventional winged aircraft.

    Unlike the high-altitude Global Hawk drones NASA uses to study hurricanes, the unmanned vehicles that Zappa uses, known as Manta UAVs, are modest in size and cost. They run between $100,000 and $250,000, compared with a Global Hawk’s price tag of more than $200 million; they have an 8-foot wingspan compared with the Hawk’s 130 feet; and they carry up to 10 lbs. of scientific instruments vs. the bigger aircraft’s ton and a half.

    Scientists watched from the deck of the Healy as it cut a path through thick multiyear ice on July 6, 2011. 
    Credit: NASA/Kathryn Hansen/flickr

    The drones not only skim just feet above the surface for close-up observations, they’re also designed so the scientists can swap instruments in and out quickly between flights, then send the aircraft back out, like the pit crew at a NASCAR race. One instrument package, for example, uses heat-sensitive, near-infrared cameras to measure variations in temperature in both ice and the water. Another has cameras that detect both infrared and partly visible light, allowing the scientists actually to see the structure of the disintegrating ice. Another carries a radar altimeter, which makes high-precision measurements of the ice’s surface texture. Yet another drops “microbuoys,” which plop into the frigid water to gauge salinity, then beam the data back to base.

    While the instruments on these flights are focused on studying changes in sea ice, Zappa said, “the technology is applicable all over the world.” You could go to the equator to look at algal blooms or the day-night cycle of carbon dioxide going into and out of the ocean or dozens of other phenomena, he said.

    But useful as drones are, Zappa wants to make them even more useful. Launch a drone from land and you can cover hundreds of square miles. Launch it from a ship, and you can cover a different, equally large swath of ocean every time. Next summer, he’ll be doing just that, from the Schmidt Ocean Institute’s research vessel Falkor.

    “We’re going to be studying the sea-surface microlayer,” he said — the top five one-hundredths of an inch of the ocean’s surface. “It’s not well understood, but lots of biology happens there, and it turns out to be important to the exchange of gases between the air and the water.”

    {like}

  10. Four Hopeful Clean Energy Trends for Earth Day

    By Brian Kahn and Bobby Magill 

    Earth Day can serve as a reminder not only of the wonders of the natural world, but also of the perils it faces in a changing climate, especially as bad news about global warming seems to come on a daily basis. There are encouraging trends, though, that show progress against the primary cause of climate change — greenhouse gas emissions from humans extracting and burning fossil energy.

    So as Earth Day is celebrated the world over today, here are some big and small trends in energy that are playing a role in helping to reduce those emissions:

    The Number: 0.7 percent

    The Trend: CO2 emissions are decoupling from the economy

    Rising greenhouse gas emissions from burning ever-more fossil fuels have almost always been linked to a growing economy. But that’s not the case today, as more and more people use energy more efficiently and embrace low-carbon energy such as solar and wind power. This phenomenon is called “decoupling” — the unlinking of a growing economy from rising greenhouse gas emissions.

    2014 was the year of decoupling. While U.S. GDP grew 2.4 percent between 2013 and 2014, greenhouse gas emissions increased only 0.7 percent. The previous year, emissions grew at about the same rate as the GDP — just over 2 percent.
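Decoupling can be read as a drop in carbon intensity, the emissions produced per unit of GDP. A minimal sketch of that arithmetic, using the U.S. figures cited above:

```python
def intensity_change(gdp_growth, emissions_growth):
    """Fractional change in emissions per unit of GDP, given
    fractional growth rates for GDP and emissions over the same period."""
    return (1 + emissions_growth) / (1 + gdp_growth) - 1

# 2013-2014 U.S. figures from the article: GDP +2.4%, emissions +0.7%
change = intensity_change(0.024, 0.007)
print(f"{change:.1%}")  # carbon intensity fell about 1.7 percent
```

Any year in which emissions grow more slowly than GDP shows up here as a negative number, which is the signature of decoupling.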

    In March, the International Energy Agency announced that decoupling is happening globally, too. The global economy grew in both 2013 and 2014, but energy-related CO2 emissions stayed the same.

    Nobody knows for sure if decoupling will continue, but the fact that it happened at all last year shows that economies can grow as they embrace clean energy to halt a changing climate.

    The Number: 174,000

    The Trend: U.S. solar industry jobs growing fast

    All the solar industry needs right now is a few good installers. Well, maybe more than a few.

    The Solar Foundation’s National Solar Jobs Census released in March shows that employment in the industry leaped nearly 22 percent in 2014, adding 31,000 new jobs for a total of 174,000 workers nationwide. Likewise, the wind industry, after having a bad year in 2013, added 23,000 jobs last year, boosting wind industry employment to 73,000 jobs, according to the American Wind Energy Association.

    Renewables industries, especially solar, are growing as prices for materials fall and solar installations become less expensive. Affordable solar being installed on rooftops across the country is expected to lead to even more jobs this year — 36,000 new employment opportunities in the solar industry.

    The Number: 457 gigawatts

    The Trend: Solar and wind generating capacity is up globally

    The main way to address climate change? That would be cutting CO2 emissions, and wind and solar offer two of the clearest paths forward on that front.

    The world had a combined 457 gigawatts (GW) of solar photovoltaic and wind generating capacity in 2013, up from a mere 17 GW in 2000, according to the most recent data from REN21, a renewable energy monitoring group. That’s only a fraction of the globe’s energy mix, but it’s a major increase.

    What’s more, less money is buying more renewable energy. In the case of solar photovoltaics, investments fell 22 percent in 2013, but generating capacity rose 32 percent.

    The Number: $270 billion

    The Trend: Renewable investments have grown 600 percent since 2004

    If less money is buying more renewables, then it’s doubly notable that investments in renewable energy have risen 600 percent since 2004.

    Last year, $270 billion built 95 gigawatts of solar and wind power generation worldwide — more than ever had been built before. In comparison, in 2011, $279 billion in global renewables investments built wind and solar farms that were able to generate 70 gigawatts of renewable energy.
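The figures above imply a falling price per installed watt, under the simplifying assumption that all of the quoted investment went to the new capacity built that year:

```python
def cost_per_watt(investment_usd, capacity_gw):
    """Average investment per watt of newly built capacity (USD/W)."""
    return investment_usd / (capacity_gw * 1e9)

cpw_2011 = cost_per_watt(279e9, 70)  # about $3.99 per watt
cpw_2014 = cost_per_watt(270e9, 95)  # about $2.84 per watt
print(round(cpw_2011, 2), round(cpw_2014, 2))
```

By this rough measure, a dollar invested in 2014 bought roughly 40 percent more wind and solar capacity than it did in 2011.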

    The rise in renewable investments is being hailed by some analysts as a “paradigm shift” with renewables starting to help stabilize CO2 emissions globally.

    {like}