These two words might make you sleepy: stormwater management. But they should make you scared.
Or, if a new study can help shake things up, hopeful.
Runoff carrying oil, salt, fertilizer, sediment and other pollutants is a top threat to lakes across the U.S., and several factors — including rising population, growing cities and changing climate — are loading the future’s dice in favor of even more trouble. With this challenge in mind, a new study in the Journal of Geophysical Research paves the way to fight flooding and enhance water quality with plans geared toward individual watersheds.
In the first multi-factor analysis of land cover and climate for watersheds across the entire continental U.S., researchers from the University of Massachusetts Amherst concluded that professionals — think land use experts, city planners and water quality managers — need to remember that myriad factors affect stormwater runoff. Not only that, the factors vary based on space and place: Different watersheds have different temperatures, rainfall, land use, evaporation, plant characteristics and more. The new paper estimates statistical relationships among these variables for watersheds across all 48 contiguous states and quantifies them in a way that could help decision-makers on the ground make smart choices to minimize the adverse impacts of runoff.
Managers can use the study’s statistical conclusions to target variables that most greatly affect runoff in their watershed. For example, the researchers write, “[i]n watersheds where runoff is strongly influenced by evaporation, one could develop policy toward green infrastructure by incorporating infiltration enhancing components into urban structures like roads, buildings, and parking lots.” — November 27, 2015
With the Paris climate negotiations just two weeks away, we’ll soon see if the world can agree on a plan for slashing greenhouse gas emissions to slow climate change. To get insight into the upcoming conference, a research team from Chalmers University of Technology in Gothenburg, Sweden, built a simplified (read: not completely real-world) mathematical model that reached two conclusions about climate talks: First, the more countries that are involved, the harder it is to settle on a deal. Second — counterintuitively — when negotiators think more strategically, they impede their chance of reaching consensus.
To improve on past studies exploring the theory behind negotiation outcomes, the researchers, whose work was reported recently in the journal Nature Climate Change, modeled negotiators with talent for thinking strategically. Under this scenario — which aligns with surveys showing that high-level policy-makers use a degree of strategic reasoning that’s higher than that of the average person — negotiators each want to get the best deal possible for their country: They don’t want to offer any more emission cuts than they have to. Negotiators with high strategic reasoning are marked by a skill for predicting other negotiators’ behavior.
Compared with the real thing, this model of climate talks was super simple. That’s because, instead of actual nations and real back-and-forth haggling, the model abstracted to equal-size countries that started with an initial offer of how much they’d cut their greenhouse gas emissions, then gave them a chance to negotiate over a number of turns, either raising or lowering their offers depending on what other countries did. A series of equations shaped how negotiators would react.
The team set up several scenarios, some with more strategic negotiators than others. They also varied the number of negotiating countries in each scenario.
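The mechanics described above can be sketched as a toy simulation. Everything in the snippet (the concession rates, the 99 percent deal threshold, the turn limit, the function names) is our own illustrative assumption rather than the Chalmers team's actual equations; in particular, it captures only the strategic-reasoning effect, with strategic agents conceding slowly in the expectation that others will close the remaining gap.

```python
import random

def simulate(n_countries, n_strategic, turns=10, rng=None):
    """One toy negotiation. Each equal-size country holds an offer: its
    pledged share of a global emission-cut target, normalized to 1.0.
    Talks succeed if total offers reach 99 percent of the target.
    This is an illustrative stand-in, NOT the published model's equations."""
    rng = rng or random
    fair_share = 1.0 / n_countries
    offers = [rng.uniform(0, fair_share) for _ in range(n_countries)]
    for _ in range(turns):
        if 1.0 - sum(offers) <= 0.01:          # within 99% of target: a deal
            return True
        for i in range(n_countries):
            headroom = fair_share - offers[i]   # room left up to fair share
            # Strategic negotiators concede a much smaller slice of their
            # remaining gap each turn, betting that others will cover it.
            rate = 0.3 if i < n_strategic else 0.8
            offers[i] += rate * headroom
    return 1.0 - sum(offers) <= 0.01

def success_rate(n_countries, n_strategic, trials=500, seed=42):
    """Fraction of simulated negotiations that reach a deal."""
    rng = random.Random(seed)
    return sum(simulate(n_countries, n_strategic, rng=rng)
               for _ in range(trials)) / trials
```

Running the sketch with all-naive versus all-strategic negotiators, the strategic group reaches a deal far less often within the allotted turns, echoing the paper's counterintuitive finding.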
After running each scenario 100,000 times, the researchers found that the more countries involved, the lower the chance of agreeing to emissions terms that could avoid catastrophic climate change. Likewise, scenarios with more highly strategic negotiators succeeded less often than those in which fewer participants had high levels of strategic reasoning. “[S]trategic reasoning,” the authors write, “typically increases the risk of climate catastrophe in our model.”
With more than 190 nations and some of the world’s best negotiators on hand for the Paris talks later this month, the paper’s conclusions are stark. But there’s room for hope: Because the model is much simpler than the real world — and because humans don’t always act with model rationality — actual outcomes could still bode well for our planet. — November 20, 2015
Groundwater has long been an important resource for irrigation, drinking water, energy production and more, but getting a clear picture of how much is available and being recharged over time has been a challenge.
Now an international group of hydrologists led by Tom Gleeson, engineering professor at the University of Victoria, with fellow researchers from the University of Texas at Austin, the University of Calgary and the University of Göttingen, say they’ve produced the first data-driven estimate of Earth’s total supply of groundwater — with a focus on “modern” groundwater.
Writing in the journal Nature Geoscience, the researchers estimate the total volume of groundwater in the upper 2 kilometers (1.2 miles) of Earth’s crust to be 22.6 million cubic kilometers (5.4 million cubic miles) — an amount that, if extracted and spread evenly across Earth’s land surface, would be 180 meters (590 feet) deep.
Perhaps most interesting was the study’s focus on “modern” groundwater, or groundwater that is less than 50 years old.
Using geochemical, geologic, hydrologic and geospatial data sets, the researchers estimated this newer groundwater accounts for at most 6 percent of all groundwater in Earth’s uppermost layers. This number is important because it represents roughly the amount of groundwater that can be recharged in a little less than a human lifetime.
Furthermore, the study provides an important differentiation between older and more modern groundwater. Older, deeper groundwater is what we typically use for agriculture and industry and may be brackish, saline or low quality, whereas newer, higher quality groundwater closer to the surface is critical to overall aquifer health and recharge, yet may be more susceptible to contamination and climate change.
According to Gleeson, this new research differs from GRACE and other technologies previously used to assess global groundwater resources.
“GRACE measures the change in the mass of the earth, which is often interpreted to be due to the depletion of groundwater, but it cannot estimate the total groundwater or the fraction of young groundwater as we have,” says Gleeson. “Additionally, our results are more based on measured groundwater data such as tritium concentrations and water levels, and recent global maps of permeability and porosity.”
This new research builds on Gleeson’s earlier work pinpointing global hot spots where groundwater pumping — primarily for agriculture — is exceeding recharge rates. — November 16, 2015
Whether people have the water they need depends on how much water there is and how much demand there is for it, right? Wrong.
According to a recent study published in Environmental Research Letters, governance, not pure supply and demand, is the most prevalent factor determining “water vulnerability” — the degree to which access to water is susceptible to disruption — in low-income nations.
Focusing on 119 countries and territories with per capita gross domestic product less than $10,725, researchers from Washington State University and Stanford University identified four basic categories of characteristics important to water supply vulnerability: demand, endowment (a mashup of quantity, quality and variability), infrastructure and institutions (the presence of government regulations and the likelihood they’ll be effective). They then used publicly available data to assess 19 traits representative of those categories, such as access to improved sanitation, dependence on groundwater, virtual water imported and exported, and government corruption.
All of the countries were found to be vulnerable in at least one of the four categories, and 23 of them were vulnerable in all four. The category characterizing the highest number of countries — 44 — was institutional vulnerability, suggesting that social and political factors are important areas of focus for initiatives aiming to boost water security.
Noting that access to water is a fast-growing challenge, with extractive use tripling over the past half century while population doubled, lead author Julie Padowski, a member of the water resources faculty at Washington State University, suggested in a news release summarizing the paper’s findings that the results could be valuable in helping identify and target strategies for boosting water security.
“The common occurrence of institutional vulnerability,” she said, “adds weight to the argument that it’s not just how much water we have, it’s how we manage it that’s really important.” — November 6, 2015
According to a recent report from the U.S. Energy Information Administration, the number of new and reactivated coal mines in the U.S. hit a 10-year low in 2013 — down nearly 67 percent from 2008.
Total coal production in the U.S. was also down during the same time period. The drop was driven by a number of factors, including regulatory pressure, reduced investment, shifting demand toward natural gas and renewables, and weak exports.
Data and chart by Energy Information Administration
Overall, the total number of coal mines in the U.S. also declined, dropping by nearly 400 sites from 2008 to 2013. The report doesn’t distinguish between high- and low-output mines, an important caveat given how much output varies by region. According to the EIA, “in 2013, 877 Appalachian mines produced 270 million short tons [245 million metric tons] of coal compared with 52 Western region mines that produced 530 million short tons [480 million metric tons].”
Data and chart by Energy Information Administration
Why does this matter? At the end of 2014, the U.S. had 262 billion tons (237 billion metric tons) of coal reserves, or about 27 percent of the global total. Following the U.S., the next five countries for coal reserves are Russia, China, Australia, India and Germany. Together these countries account for 77 percent of known global reserves.
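The short-ton figures quoted throughout map onto their metric equivalents through a single conversion factor; a minimal helper (the function name is ours) makes the arithmetic explicit:

```python
SHORT_TON_TO_METRIC = 0.90718474  # one short ton (2,000 lb) in metric tons

def to_metric(short_tons):
    """Convert short tons to metric tons."""
    return short_tons * SHORT_TON_TO_METRIC
```

For example, the 270 million short tons of Appalachian coal converts to roughly 245 million metric tons, matching the bracketed figure in the EIA quote.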
The decline in domestic coal mining and use may help reduce greenhouse gas emissions as outlined in the EPA’s Clean Power Plan.
Despite decreased U.S. production and consumption, global demand for coal is projected to continue increasing. The report “Global Coal Mining to 2020” prepared by Timetric outlines India’s plans to ramp up domestic coal production and consumption while Russia has set a target of increasing in-country use for power generation from 25 percent in 2014 to 27 percent by 2020. Overall, global demand is projected to continue increasing by 2.1 percent annually through 2019, according to IEA. — October 23, 2015
Human population is expected to soar to between 9 and 10 billion by 2050, with a growing number of people moving into the middle class and requiring more energy, natural resources and animal products. This combination of population growth and resource demand will put increased pressure on the planet’s resources, a fact that has interested parties from academia, nonprofits, government, industry and more scrambling to find ways to improve living conditions for billions of people while staying within the planet’s ecological boundaries.
But that’s just the starting point of a recent study led by The Nature Conservancy in partnership with the Department of Geography at McGill University and the University of Minnesota’s Institute on the Environment (which, full disclosure, provides support for Ensia). The researchers analyzed global development drivers, including urbanization, agriculture, energy and mining, to see how future development is likely to affect what’s left of the planet’s remaining natural lands. Their conclusion: 20 percent of all natural lands are at risk from future development.
Although the three regions currently with the most developed land — Central America, Europe and South Asia — are projected to remain the most developed, the researchers note that Africa and South America, currently relatively undeveloped, have the greatest amount of land at risk. The researchers estimate that the amount of land developed on those continents in coming decades could be about twice as much as it is today in South America and three times as much in Africa. Meanwhile, only about 5 percent of the lands that face development risk are currently under “strict protection” (many “protected areas” allow some forms of development).
The goal of the research wasn’t just to point out where nature is going to be converted to farms, cities and other signs of modern life; it’s meant to offer insights that can be used to balance conservation strategies alongside ongoing development.
As Joseph Kiesecker, a lead scientist for The Nature Conservancy’s Conservation Lands Team, says in this video, “The study’s not intended to be a doom-and-gloom story. It’s really intended to be hopefully inspiring. Armed with the knowledge of where future development risk may occur, we can get ahead of the curve. We can get into those landscapes, we can proactively plan for development and find solutions that will strike that balance [between development and conservation].” — October 14, 2015
Earlier today Chilean president Michelle Bachelet announced the formation of Nazca-Desventuradas Marine Park — the largest marine protected area in the Americas. Located more than 850 kilometers (530 miles) northwest of Santiago in the Pacific Ocean, the reserve encompasses a surface area of 297,518 square kilometers (114,872 square miles). All told, Chile has now protected 12 percent of its entire marine surface area.
The news comes on the heels of an announcement last week that New Zealand was establishing an even larger 620,000-square-kilometer (385,000-square-mile) preserve called the Kermadec Ocean Sanctuary.
The timing of both new reserves couldn’t be better. In early September, WWF’s biennial Living Blue Planet Report outlined the dire state of the world’s oceans, reporting that marine vertebrate populations have declined 52 percent since 1970. The report notes that “the fish that constitute up to 60 per cent of protein intake in coastal countries, supporting millions of small-scale fishers as well as a global multibillion-dollar industry,” are declining precipitously.
The new Nazca-Desventuradas Marine Park may be a step toward reversing this trend.
According to a joint press release from Oceana and National Geographic announcing the reserve, the area is home to “undulating kelp forests; abundant fish populations, including enormous amberjacks, yellowtail jacks and deep sea sharks; and fragile deep corals.”
Enric Sala, National Geographic explorer in residence and head of the Society’s Pristine Seas project, went on to say, “The new Nazca-Desventuradas Marine Park is a gift from Chile to the world. It contains pristine underwater environments like nothing else in the ocean, including deep underwater mountains with species new to science, abundant giant lobster and a relict population of the once-thought-extinct Juan Fernández fur seal.” — October 5, 2015
Note to anyone who thinks planting trees is the bees’ knees: Grasslands are important, too.
With forests disappearing at record rates and the carbon sequestration and other benefits of vegetation getting increasing visibility, tree-planting has become almost an iconic “environmentally friendly” activity. But in some cases it could do more harm than good, according to Iowa State University ecologist Joseph Veldman and colleagues.
In a paper published in the October 2015 issue of the journal BioScience, the researchers point out that the current emphasis on trees as tools for sequestering atmospheric carbon has led to a number of initiatives that encourage planting trees on grasslands or allowing forests to expand into grasslands. The U.N. Framework Convention on Climate Change’s Clean Development Mechanism, for example, offers carbon credits for foresting grasslands, the Food and Agriculture Organization of the United Nations does not distinguish for policy purposes between closed-canopy forests and grassy biomes with trees, and carbon valuation systems such as CDM and REDD+ favor forests over other types of land cover.
“Despite overwhelming evidence of their antiquity and richness, the misperception persists that grassy biomes are degraded ecosystems formed as a result of human-caused deforestation,” the researchers write. The result, they note, is the loss of grasslands at “alarming rates.” That in turn leads to loss of unique flora and fauna that depend on grassland biomes and compromises grasslands’ capacity for sequestering carbon below ground in root systems and providing other ecosystem services.
To illustrate what they call “the tyranny of trees,” Veldman and colleagues took locations mapped as opportunities for forestation in the Atlas of Forest Landscape Restoration Opportunities, published by the World Resources Institute and the International Union for Conservation of Nature, and cross-mapped them with maps of grassy biomes around the world. They found that the atlas identified 9 million square kilometers (3 million square miles) of “ancient grassy biomes” as having potential for foresting.
Veldman and colleagues concluded by encouraging policy makers to acknowledge the value of these ecosystems in their own right and to integrate efforts to conserve forests and grasslands around the world in a way that takes into account the many services grasslands provide. Specifically, they recommended creating vegetation maps that include grasslands and savannas, assigning appropriate value to diverse vegetation types, and revising carbon valuation systems as well as widely accepted definitions of forest to formally recognize the importance of other types of land cover. — October 2, 2015
Recently journalist Elizabeth Grossman wrote about a troubling trend of pharmaceuticals being found pretty much everywhere scientists look in our environment, especially in our water. Potential impacts are concerning, Grossman writes, with among other things, “fish and birds responding with altered behavior and reproductive systems to antidepressants, diabetes medication, and other psychoactive or hormonally active drugs at concentrations found in the environment.”
While about 90 percent of these pharmaceuticals turn up in the environment after being excreted, at least some show up when medication is improperly discarded. In an effort “to provide a safe, convenient, and responsible means of disposing of prescription drugs,” the U.S. Department of Justice and Drug Enforcement Administration are holding their annual National Prescription Drug Take-Back Day Saturday, September 26, 2015.
In addition to the environmental benefits of properly disposing of unused medications, Environmental News Bits points out that misuse of prescription drugs is a big public health crisis, noting, “In 2010, 22,134 people died from overdoses involving prescription drugs. Additionally, in 2012, 69 percent of people who abused prescription pain relievers obtained the prescription drugs through friends or relatives, or from raiding the family medicine cabinet.”
Photo by Ajay Suresh (Flickr | Creative Commons) — September 18, 2015
The combination of legal and illegal logging in southwest Ghana’s tropical forests is having a devastating impact on bird populations in the region, according to new research published recently in the journal Biological Conservation.
Between 1995 and 2010, logging in the Upper Guinea rain forest — one of the world’s biodiversity hot spots — increased by over 600 percent. Researchers sampling bird communities in the same forest region discovered that the number of understory birds counted in 2008–2010 was more than 50 percent lower than in 1993–1995.
Globally, 50-90 percent of the timber harvested in tropical countries can be attributed to illegal logging. In Ghana, where the study took place, illegal logging accounts for an estimated 80 percent of timber extraction. Overall, the researchers noted, the rate of deforestation in Ghana is six times the maximum sustainable rate.
The Guinean Forest (green) stretches across Africa from Guinea to Cameroon. Map courtesy of Global Forest Watch.
“Our most disturbing finding was that more than half of all understory birds had vanished in only 15 years,” said lead author Nicole Arcilla, a researcher with Drexel University and the Zoological Society of London, in a press release accompanying the study. “If things continue as they are, in a few decades, these incredibly beautiful forests and their unique wildlife will be largely depleted, which would be a huge loss to Ghana, Africa and the world.”
Arcilla spent two years in Ghana studying understory birds as part of her research. At first her focus was on legal logging activities, but she soon found it impossible to ignore illegal activities.
“No one has looked at this issue in the past, to my knowledge, because illegal logging is an underworld issue, so it’s hard to quantify,” Arcilla said. In Ghana, she said, “it’s like the Wild West.”
The solution lies in beefed-up forest management by the Ghanaian government — a daunting challenge in the face of such widespread illegal activity. The researchers point to a suite of urgently needed measures such as “increasing forest ranger patrols, increasing forest law enforcement and increasing implementation of measures to prevent illegal logging.” — September 10, 2015
With cities poised to add 2.5 billion more people by 2050, now is a perfect time to get strategic about urban design. One of the big questions planners face is the best way to accommodate both built environments and natural settings, which provide important services such as making oxygen, cleansing water, absorbing noise, moderating summer heat, adding beauty and providing habitat for animals and plants. Can we get the biggest overall benefit by interweaving buildings and roadways with small, dispersed patches of nature — “land sharing” — or by setting aside large expanses of green space surrounded by intensive development — “land sparing”?
To find an answer, researchers from the United Kingdom and Japan looked at how well cities with a range of development strategies were able to provide each of nine urban ecosystem services — carbon storage, water infiltration, human well-being, agricultural production, pollination, pest control, noise reduction, air purification and temperature regulation. They then categorized these ecosystem services as “winners” or “losers” under various points along the compact-development-to-suburban-sprawl continuum.
Their results, reported in the latest issue of Frontiers in Ecology and the Environment, indicate that large, contiguous stretches of natural land are critical for maintaining most of the ecosystem services. However, they also showed that interspersing nature with homes, businesses and roadways has some value, particularly for maintaining the sense of personal well-being associated with exposure to nature. The researchers observed in addition that the built environment can be modified in ways that make it function more like open space in its ability to provide ecosystem services — by incorporating green roofs and permeable pavement, for instance.
“Land sparing is clearly necessary if ecosystem services are to be adequately represented in urban landscapes,” they wrote. “That said, when considering the flow of ecosystem services, arguably some land sharing is required if people are to see the health and well-being benefits provided by urban green space.”
The researchers noted that more work is needed to identify optimal size for green spaces. They also recommended top-down, policy-led planning as the best way to maximize the benefits of green spaces as cities grow. — September 8, 2015
Too often we humans consider things individually, as though anything in life happens in a vacuum, all alone, with no influence from its surroundings or something else happening at the same time. Take chemicals, for example: Some chemical or another is deemed to be safe because it was studied in isolation, ignoring the potential effects of that chemical when combined with the many others on the market and in various products we encounter each day. Only recently are we beginning to fully explore and understand these combinations. In many such cases, the horse has already left the stable.
A recent study by researchers at the University of California, Irvine, reported in the Proceedings of the National Academy of Sciences, attempts to leave the silos behind when it comes to the consequences of a changing climate, starting with the premise that, “Climatic extremes cause significant damage to the environment and society, and can cause even more damage when multiple extremes occur simultaneously.” Focusing on data from 1960 to 2010, the authors looked at droughts and heat waves and found that concurrence of the two “shows statistically significant increases across the United States.”
“Heat waves can kill people and crops while worsening air quality, and droughts exacerbate those serious impacts,” said senior author Amir AghaKouchak, assistant professor of civil and environmental engineering, in a press release related to the study. “With these two extremes happening at the same time, the threat is far more significant.” Such systems thinking is critical, as the authors state, in order to prepare for and, as much as possible, mitigate the negative consequences of increasing climate extremes.
If you have a houseplant in your kitchen, a rock on your desk or a fountain in your workplace, you know the basic principle of biophilic design: inviting nature into our everyday lives as a way to boost health, productivity and overall sense of well-being.
The hour-long course, offered by the American Society of Interior Designers (ASID), introduces the concept of biophilia and the 14 principles of biophilic design, then offers concrete suggestions on how designers can apply them to enrich interior environments, along with several case studies as examples. Participants may earn continuing education credits.
“ASID has a commitment to keeping designers up-to-date on new research that supports the impact of design on human health and wellness,” said Karol Kaiser, ASID vice president for education and engagement. “We are enthusiastic for ASID members and industry professionals to take advantage of this extraordinary course.” — August 31, 2015
Food waste carries massive environmental, social and economic costs — so much that you’d think one of the first lessons we’d want to pass along to our kids would be the old saw, “Take what you want, but eat what you take.”
A U.S. law intended to boost consumption of healthful foods by requiring school lunches eligible for federal funding to include a fruit or vegetable serving appears to be doing the exact opposite.
Observing elementary students’ lunchroom behavior in two elementary schools before and after the Healthy, Hunger-Free Kids Act of 2010 took effect, researchers from the University of Vermont and the University of California, Davis, reported that not only was fruit and vegetable waste more than 50 percent higher after the requirement kicked in, but fruit and vegetable consumption was down.
The researchers noted that the law, which governs a program that feeds 31 million children each school day, is coming up for reauthorization this fall — potentially opening opportunities to modify it in a way that reduces these adverse unintended consequences. Meanwhile, they encourage lunch providers to boost the appeal of the fruits and vegetables they’re required to serve by cutting large items such as apples and oranges into pieces, enhancing lunch offerings with fresh produce from school gardens or other local sources, and, of course, using that age-old trick of disguising good-for-you food as something else. — August 28, 2015
Excess phosphorus runoff and emissions from urban areas and croplands, animal feedlots, sewage treatment plants, and combustion of fossil fuels have been blamed for the dead zone in the Gulf of Mexico, toxic algal blooms in Lake Erie and problems in numerous other lakes and rivers around the world.
For years unwanted nutrients were also choking the Florida Everglades, but in a surprising reversal, phosphorus levels have been reduced 79 percent this year — more than three times the state requirement — compared to the annual average runoff in the 1980s, according to an announcement from University of Florida’s Institute for Food and Agricultural Sciences. The average decrease in recent years has been around 50 percent annually.
The reductions have been achieved through a combination of best management practices targeting agriculture, including soil testing before fertilizer is applied, regulation of when and how much water can be pumped off of farms and into local waterways, and clearing sediment from canals that lead to the Everglades. All told, farmers have spent nearly US$200 million on improvements, according to the South Florida Water Management District.
Some think the reductions haven’t gone far enough, though. As the South Florida newspaper the Sun Sentinel reported earlier this month, representatives of Audubon Florida and the Sierra Club say phosphorus runoff needs to be reduced even further to comply with federal water quality standards.
Still, phosphorus levels are heading in the right direction. This year they are down to 94 parts per billion, compared with 500 parts per billion in 1986. — August 27, 2015
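As a quick sanity check on those concentration figures (our own back-of-the-envelope arithmetic, not a calculation from the announcement):

```python
def percent_reduction(old, new):
    """Percentage drop from an old value to a new one."""
    return 100.0 * (old - new) / old

# Everglades phosphorus concentration, per the figures above (parts per billion)
concentration_drop = percent_reduction(500, 94)  # 1986 vs. this year
```

A fall from 500 to 94 parts per billion works out to a drop of just over 81 percent, in the same range as the 79 percent reduction in runoff reported against the 1980s baseline.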
Plants are far better than humans at turning sunlight into food. But they’re not nearly as good as they could be: Thanks to quirks in the systems that have evolved to capture solar energy and use it to build sugars from carbon dioxide and water, the conversion efficiency of photosynthesis is but a few percent at best.
With the need to produce more crops growing even faster than human population, it’s no surprise that scientists have been brainstorming ways to help plants do a better job of using sunlight.
In a recent issue of the scientific journal Proceedings of the National Academy of Sciences, more than two dozen researchers from the U.S., France, Germany, the Netherlands, the U.K., Australia and China shared the results of a workshop in which they put their heads together and came up with a spectrum of suggestions for how genetic engineers might modify crop plants to boost their photosynthetic prowess. Among them: alter the apparatus that captures sunlight so it doesn’t take in more than it can use, borrow photosynthetic machinery from a purple microbe to expand the range of wavelengths plants can use, improve the ability of leaves to suck CO2 from the air, and create “smart canopies” in which plants fine-tune their photosynthetic capabilities to different lighting conditions at different distances from the ground.
Emphasizing that the convergence of advances in computer modeling, computer power and our ability to reproduce genetic material is opening the door to new abilities to refine processes within living organisms, the researchers encouraged colleagues to take a closer look at the opportunities.
“If we can double or triple the efficiency of photosynthesis — and I think that’s feasible,” study coauthor and Washington University biologist Robert Blankenship noted in a related news release, “the impact on agricultural productivity could be huge.”
States hoping to increase their share of renewable energy to achieve the emissions reduction goals set forth in the President’s recently announced Clean Power Plan may have just received an unexpected boost from wind energy.
According to the U.S. Department of Energy’s “2014 Wind Technologies Market Report” released earlier this week, the prices offered by wind projects to utility purchasers in the U.S. dropped below 2.5 cents per kilowatt-hour for the first time in history.
Costs are being driven down by technological advancements, including the production and installation of turbines with greater generating capacity, an increase in turbine hub height up to 272 feet (83 meters) and an increase in average rotor diameter. Taken together, these changes are not only driving down the cost of wind energy in blustery locations, but also making it more economical in low-wind areas.
Overall, US$8.3 billion was invested in 4.9 gigawatts of new wind power capacity in 2014. At present, wind power meets nearly 5 percent of the total U.S. electricity demand. Employment in the wind energy sector also saw a bump up in 2014 to 73,000 workers, compared to 50,500 in 2013.
Further bolstering wind energy’s prospects, an announcement from researchers at the Lawrence Berkeley National Laboratory — the research center that prepared the report — noted that 2014 wind energy contracts “compare very favorably to a range of projections of the fuel costs of gas-fired generation extending out through 2040.”
Photo by Steve p2008 (Flickr | Creative Commons) — August 12, 2015
In July 2014 we published a photo essay that highlighted the ecological and economic importance of the world’s deltas, pointing out that they are home to nearly half a billion people and provide critical habitat for innumerable plant and animal species. That photo essay also pointed out the danger deltas face: Sea-level rise and human alterations to river systems are threatening to turn the world’s deltas over to the sea.
One threat we didn’t explicitly address, though, was land subsidence. In a new study in the journal Science, researchers explore how sinking land — caused by less sediment reaching deltas along with human activities such as development and resource extraction — combined with sea-level rise will affect coastal delta communities. This combination of sinking land and rising sea, known as relative sea-level rise or RSLR, makes the challenges facing these communities even more daunting because increased RSLR means an increased risk of flooding.
The researchers found that some deltas, such as the Krishna and Brahmani in India and the Ganges-Brahmaputra in Bangladesh, are particularly vulnerable due to a combination of high RSLR, high chance of a hazardous event taking place and low capacity to respond to such an event. But even deltas in places that are less vulnerable, such as the Mississippi in the U.S. and the Rhine in the Netherlands, won’t have an easy path forward. The study points out that strategies to lower risk in these countries often come by way of expensive infrastructure, meaning that the people who call them home will have to make the ongoing decision of whether to spend more money keeping the deltas relatively safe from disaster or freeze spending, making the deltas less secure.
Writing in The Conversation, Zachary Tessler, one of the study’s authors, says that while stemming sea-level rise will require global action to combat climate change, land subsidence can be addressed in part through local and regional efforts. Tessler points to technologies that reduce buildup of sediment at dams, dike breaches, and coastal wetlands restoration as ways to deliver more sediment to deltas and increase retention. The authors also warn in the study, “The current emphasis on short-term solutions for the world’s deltas will greatly constrain options for designing sustainable solutions in the long term.”
The ability to anticipate floods and mobilize a timely response — increasingly a life-and-death matter as extreme weather events become more common — depends to a sobering extent on 10 satellites that measure precipitation and beam data to Earth, where governments and emergency response teams use them to guide efforts to protect people and property.
Even more sobering: Four of the 10 satellites have already exceeded their anticipated functional lifespan, with no plans in place to deploy replacements, according to a research article published earlier this year in the journal Environmental Research Letters.
The article reports on research by Cornell University engineering professor Patrick Reed and colleagues from Princeton University and the Aerospace Corporation. The researchers used information on the distribution and movement of water around the planet along with modeling to calculate the extent to which the current satellites are able to provide data needed for flood forecasting. They also calculated the coverage that would remain should all four aging satellites stop functioning, and explored what arrangement of satellites would be needed to optimally do the job. They found that even the current set of 10 satellites fails to provide sufficient coverage — with the biggest gaps, ironically, in South America, central and eastern Africa, and Asia, where populations are particularly vulnerable to floods. Not surprisingly, loss of the four antiquated satellites would make things even worse, with far less of the globe covered adequately over space and time.
The good news? With sufficient international collaboration, the addition of just two new satellites would boost flood forecasting capability even beyond that of the current fleet.
Noting that their results demonstrate that improvements in this rain-monitoring system could have huge benefits for humanity, the authors call for an international conversation aimed at resolving the current and anticipated satellite rainfall data deficit “to ensure that the global portfolio of space-based Earth rainfall observations are sufficient to manage the potential increased flood risks posed by climate change.”
More and more people are starting to get the message that the massive amount of food we waste in the U.S. is a social issue, not just an environmental one. Even HBO comedian John Oliver took nearly 20 minutes recently on his satire news show, “Last Week Tonight,” to tackle the issue.
It seems an easy fix: Instead of throwing food out, set up a new supply chain to move it from businesses that aren’t going to use it to people who need it. But that process involves additional costs that businesses have little financial incentive to take on, as well as fear of lawsuits should someone get sick from spoiled food (a liability the 1996 Bill Emerson Good Samaritan Food Donation Act actually protects against).
This week, ClimateWire has a story about how Massachusetts, Vermont and Connecticut are trying to deal with these problems: by regulating the amount of food businesses can toss in the trash and encouraging them to donate it instead. The article examines progress to date (the regulations have been in place for some time) as well as the marketplace emerging as a result, including companies working to make sure more of the food goes to people in need and less to the compost heap. Composting is an easier compliance option because it requires less infrastructure, such as cold storage. But, as the article mentions, organizations such as Food for Free collect unwanted food and deliver it to more than 100 programs in and around Boston working to get food to hungry people.
There’s an interesting and important lesson here, as U.S. Environmental Protection Agency administrator Gina McCarthy points out in the piece. Reducing food waste by using it to feed people who really need it is “a great reminder that solutions to environmental challenges can double as solutions to social challenges.”
It’s well known that organic agriculture offers climate benefits, both through management practices that help sequester carbon in soils and through reduced greenhouse gas emissions associated with fertilizer production. What’s less clear is the extent to which other variables, such as lower per-acre productivity and more intensive use of machinery in fields, counteract these savings.
Analyzing preexisting data on agricultural greenhouse gas emissions and the amount of certified organic farmland across 49 states over eight years, sociology doctoral student Julius McGee from the University of Oregon discovered that the amount of farmland in organic production and greenhouse gas emissions from agriculture were positively correlated, even when adjusted for population, GDP and total amount of agricultural land.
“What these findings ultimately suggest is that organic farming is not working as a counterforce to greenhouse gas emissions stimulated by agricultural production, and is currently positively correlated with the problem,” he concluded.
Noting recent trends toward “conventionalization” of organic agriculture, McGee called for the organic food system to be more deliberate in its efforts to adopt sustainable practices and for additional oversight to make sure that production of certified organic foods is as environmentally friendly as consumers hope — and often assume — it is.
In case you somehow missed it, Shark Week has once again descended upon us. The annual summer tradition created by the Discovery Channel in 1988 to raise awareness of sharks through its programming has in recent years seen its hashtag, #SharkWeek, explode across social media. Although some of the fictional content on Discovery has led to backlash from the scientific community, it’s hard to point to another publicity scheme that has garnered so much attention for one group of animals and for conservation, with organizations like Oceana, WildAid and Conservation International deploying the hashtag on Twitter to shine a light on their causes and other media outlets co-opting it to promote their own content.
Amid all the noise it’s important to remember why everyone’s looking at sharks anyway. Besides being magnificent physical specimens that have been around for hundreds of millions of years, sharks are often the apex predators in oceans, and top predators maintain balance in their ecosystems. Despite their importance, sharks have moved from predator to prey as humans catch them for their fins, and their numbers have also declined due to other factors, such as habitat degradation and bycatch. They have also been vilified in the media and, as a result, their attacks on humans send ripples of fear through us, especially those living in coastal areas, despite the relatively few attacks that actually take place.
That’s the point of a recent study from researchers at Stanford University and the Monterey Bay Aquarium, which shows that although the overall number of shark bites has increased in the last 60 years, an individual’s risk of being bitten or attacked by a shark along the California coast has actually decreased significantly, because the population of coastal cities in California and the number of people partaking in ocean activities in the state have exploded over the same period. The authors of the study, which will be published later this month in the Ecological Society of America’s journal Frontiers in Ecology and the Environment, conclude that conservation efforts for sharks and safety for humans don’t have to be at odds. By gaining more information about shark activity and behavior, the two can coexist. “Doing this kind of analyses can inform us on hot spots and cold spots for shark activity in time and space that we can use to make informed decisions and give people a way to stay safe while they are enjoying the ocean,” said Francesco Ferretti of Stanford, first author of the study, in a press release. Knowing that sharks migrate to Hawaii in the spring, for example, means that California waters will have fewer sharks and therefore may be more appealing to people for recreational activities. In fact, surfers who choose to surf the waters off Mendocino County in March are 24 times safer than those surfing there in October and November.
Although culling sharks is not a practice in California, it has taken place in other parts of the world, and the study’s authors say that richer knowledge and data on both shark and human activity offer a better way to minimize shark attacks on humans. “Our results indicate that the seemingly conflicting goals of protecting large predators and people can be reconciled,” study co-author Fiorenza Micheli told Stanford News.
Photo by Elias Levy (Flickr | Creative Commons) — July 10, 2015
You’ve probably noticed how being around trees, grass and flowers can make you feel better. Now, evidence is mounting that the presence of green and growing things is associated with being healthier, too.
Reporting at Journalist’s Resource, Justin Feldman summarized the findings of a number of recent scientific studies exploring the relationship between urban green space and human health. One study of schoolchildren in Barcelona, Spain, showed a positive association between cognitive development and “greenness index” (amount of vegetation the students encountered at home and school and over the course of their commute). A survey of adult twins in a U.S. database, adjusted for income, activity and neighborhood traits, revealed that individuals with more access to green space tended to exhibit less stress, depression and anxiety than other study participants. And an assessment of birth outcomes in Vancouver, Canada, found that the more green cover pregnant women had in the vicinity of their homes, the higher the birth weight of full-term babies and the lower the likelihood of preterm births, even after taking other environmental factors such as air pollution, noise and proximity to parks into account.
Read more results and check out links to the original reports here.
Think environmental protection hurts business? You just might want to think again. In a study published last week in PLOS ONE, University of North Carolina land use and environmental planning faculty member Todd BenDor and colleagues calculated the dollars-and-cents impact of environmental restoration projects initiated to comply with laws such as the U.S. Clean Water Act or to meet other public, corporate or nonprofit goals. Their take: What’s good for Mother Nature can also be good for the economy.
The researchers surveyed 250 businesses that provide services aimed at improving ecosystem health and functioning such as project planning, engineering, construction, landscaping and legal counsel. Using a computer model to extrapolate from responses related to employment and sales, they concluded that the U.S. environmental restoration industry can take credit for an estimated 126,111 jobs representing US$6.27 billion in wages and benefits and US$9.47 billion in sales annually. If you count indirect effects such as business-to-business economic activity and increased household spending, you can add another 95,287 jobs and US$15.37 billion in economic impact to that tally.
Employment due to ecological restoration is comparable to that of other major U.S. industries. Credit: PLOS ONE doi:10.1371/journal.pone.0128339.g002
Although they acknowledged that the study does not provide a full cost-benefit analysis of restoration legislation, the researchers noted that ecological restoration has a far more positive impact on the nation’s economy than most people realize.
“A growing number of studies have identified ‘green’ growth and job creation in renewable energy production, energy efficient construction, and green goods and services industries,” they wrote. “This study represents a first step towards quantifying the restoration industry as a piece of the broader green economy.”
This morning the U.S. Fish and Wildlife Service held an “ivory crush” event in Times Square in New York City, highlighting the huge problem that is elephant poaching and wildlife crime more generally. In addition to raising awareness of the problem, the point of pulverizing 1 ton of seized elephant tusks is to diminish demand and disrupt the marketplace for the product. Researchers from the University of Washington believe that slowing demand doesn’t move the needle on this crisis fast enough, though, so they want to use DNA from dead elephants to predict where other elephants may be hunted in the future — and beat the poachers to the spot.
Writing in The Conversation, Samuel Wasser, a research professor of biology at the Center for Conservation Biology at the University of Washington, describes a method his lab has developed to map where past hotspots of elephant poaching have occurred in order to guess where future crime might take place. In a somewhat encouraging conclusion, the researchers found that most of the poaching has come from just two areas in Africa. What this means, Wasser concludes, is that law enforcement efforts focused on just those two hotspots could go a long way in stopping the killing. And if poachers and traffickers just decide to switch their hunting area, Wasser believes his methods, which involve comparing DNA samples from elephant dung in various areas to that of seized ivory to create a reference map, can be applied quickly enough to track the new activity. “Unlike transit countries, source hotspots can’t change very quickly,” he writes. “They require large numbers of elephants and considerable infrastructure to move the ivory out of the country without detection. This infrastructure must also be developed in the next source country before it can become a major hotspot on the scale we have identified. Thus, our methods should be able to detect this.”
Protecting elephants from slaughter would obviously be good news for the species, but, as Wasser explains in the video above, the benefits extend further still, with impacts on everything from the elephants’ ecosystems to international terrorism.