Universities seem like they should be great seedbeds for advancing sustainability — after all, they’re all about discovering new things and sharing them with the rest of the world. In reality, however, their ability to do so is constrained by their culture, which often rewards working within disciplinary lines and focusing on research per se rather than on applying the results of that research to solving real-world problems.
In a paper published recently in the journal Elementa, leaders from a half-dozen U.S. academic institutions with dedicated sustainability programs shared five recommendations, distilled from their own experiences, for universities seeking to improve their ability to both create tools to advance sustainability and empower individuals and institutions to use them.
1. Focus on solutions. Universities tend to focus on problems rather than solutions. To move the needle on sustainability, the authors call for a new, solutions-focused “social contract” for academia. That may seem like adding more work to already stressed workloads, but in practice, the authors say, the emphasis on solutions has yielded bountiful rewards. Faculty and students alike, they write, “speak of the satisfaction of being part of something larger than themselves, and of using their knowledge to make a difference in the wider world.” And the shift also found support from policy-makers, business leaders and citizen groups.
2. Embrace interdisciplinary collaborations. More than perhaps any other endeavor, advancing sustainability demands collaboration among a wide range of disciplines. That makes universities an ideal setting for this pursuit, since they are by definition multidisciplinary. The challenge, though, is to get experts from different disciplines talking to each other. Strategies for overcoming the institutional roadblocks to interdisciplinary collaboration include engaging faculty who want to make a difference, earmarking funding for work focused on solving problems, being patient with the process and recognizing the critical importance of participation from disciplines outside natural science and engineering.
3. Build partnerships. It’s not easy to build, maintain and use bridges between academia and government, industry and citizens — but to advance sustainability, it’s exceedingly important. External stakeholders have important knowledge and perspectives to share. Partnerships also provide an opportunity to build relationships, mutual understanding and bonhomie that are key to getting things done. The authors point to the Cooperative Extension Service as a valuable model for engaging with stakeholders outside universities. An important part of the equation: Make sure the relationships are two-way.
4. Innovate and persevere. “[R]isk-taking, creative thinking and tenacity are key ingredients in reshaping universities to meet sustainability demands,” the authors write. Particularly important are ownership at the faculty, not just administrative level; strategic use of space and location to cultivate cohesion among disparate parties; integral involvement of students; creation of novel roles (e.g., “professors of practice”) for faculty; and administrative support for boundary crossing.
5. Gather and share lessons learned. Universities are premier research institutions, so what better place to study the dynamics of institutional change? As higher education embraces innovative approaches to addressing sustainability challenges, it would do well to conduct research on what works and doesn’t and share the knowledge gained with others interested in engaging in collaborative problem-solving.
The authors note that the common themes are applicable not only to sustainability initiatives, but to other efforts that are part of the growing trend in higher education to not just study and teach, but also become involved in solving societal challenges.
“The timing is right for solutions-oriented sustainability programs that are responsive to environmental and societal needs, to student and faculty interests, and to opportunities in emerging career paths,” they conclude. “We are confident that universities will become an increasingly vital and valued partner in the quest to create a sustainable world.”
Editor’s note: One of the authors on the Elementa paper is affiliated with the University of Minnesota Institute on the Environment, which provides support to Ensia.
Worsening wildfires endanger communities. Invasive insects imperil forests. In the American West, many worry about these threats — but fewer fret about climate change, a major force behind both the burning and the bugs.
Why? Apparently, because lots of people don’t see the local connection. Polling residents of eastern Oregon, a new study published by University of New Hampshire sociologist Lawrence C. Hamilton and colleagues in the journal Regional Environmental Change found that although regional temperatures there have climbed twice as fast as the global average, only 40 percent of respondents recognized that fact. Echoing previous studies on global warming, local Republicans were more likely to say that temperatures haven’t increased, while Democrats were more likely to acknowledge that they have.
In the seven northeastern Oregon counties polled, average summer temperatures have risen over the past century, with heightened warming since the 1970s linked to more frequent wildfires. Compared with an average person, Republicans surveyed were 30 percent less likely to say that summers in their county were growing hotter. Among supporters of the conservative Tea Party movement, this number was even higher. For Democrats, the opposite relationship held.
Groups the researchers thought might be more attuned to the rise in temperature — long-term residents, year-round residents and forest landowners — were no more or less likely than others to know that summers have become warmer.
The researchers found that education matters, too, not because it makes people uniformly more informed, but because it intensifies preexisting partisan convictions. Among Democrats and independents in the study, college graduates were more likely than non-graduates to acknowledge local warming.
But among Republicans, especially Tea Party supporters, this effect flipped: higher levels of education went hand in hand with a higher probability of saying that Oregon summers haven’t become warmer.
Previous work has found this same educational gradient on global warming at a larger scale, and sure enough, when the researchers asked participants about human-induced climate change, the responses fell into the same pattern: Democrats and independents with a college education were more likely than college-educated Republicans to acknowledge that humans are changing the climate.
The study was based on phone interviews with approximately 1,700 randomly selected residents of northeastern Oregon in 2014. The authors note that although eastern Oregon’s warming trend is statistically significant, the change is small relative to, say, the difference between a warm and cool summer day. That said, survey participants did have the option to say they weren’t sure whether summers were warming or not. Only 10 percent did so, leaving a clear partisan divide in perceptions of local warming.
This study presents a new twist on an old tale. Global climate change is, by definition, a worldwide phenomenon bigger than any one place. In contrast, local climate threads through people’s everyday experiences. If we can expect an informed, honest appraisal of climate anywhere, it’s in our own backyards. But if this study rings true on a larger scale, we can’t.
That underscores a core challenge of communicating climate change: Facts don’t seem to matter. And for local and global perspectives alike, the culprit appears to be the powerful pull of politics and social identity.
— February 26, 2016
Back in 2011, Jennifer Doudna, a biochemist and molecular biologist at the University of California, Berkeley, and Emmanuelle Charpentier, now at the Max Planck Institute for Infection Biology in Germany, grew intrigued by the way bacteria use a molecular system known as CRISPR-Cas9 to respond to viral attacks. For years, bacteria were assumed to be primitive creatures with rudimentary immune systems. But CRISPR-Cas9 revealed a startlingly sophisticated memory-response scheme. The bacteria store DNA samples from invading viruses by tucking them into a DNA library called CRISPR that is part of the bacteria’s natural genome. If the same virus should attack again, the Cas9 enzyme is primed by the CRISPR library to cut (and thus disable) viral DNA with the same sequence.
In the native bacterial system (a), a structure that’s formed by crRNA and tracrRNA and includes a “guide” segment (gold) guides the Cas9 protein (light blue blob) to a spot in the viral DNA that corresponds to the guide segment. The crRNA is critical for targeting, while the tracrRNA stabilizes the structure and activates Cas9 to cleave the DNA. To turn this natural system into a useful tool for genetic manipulation (b), researchers created an artificial single guide RNA molecule (sgRNA, in green) by fusing the crRNA and tracrRNA. Image courtesy of Elsevier
After months of trying to tease apart how the system works, Doudna’s team determined that two RNA molecules play central roles: CRISPR RNA (crRNA), which leads Cas9 to a particular location on the viral gene, and a trans-activating RNA (tracrRNA), which helps activate Cas9. Together, these two RNA molecules empower Cas9 to make its cuts.
Still, it was not clear that CRISPR would be all that exciting or useful outside of bacteria. Microbes have very different cell structures than animals and plants, and it was quite possible that the system would only work in bacteria. The real breakthrough occurred in 2012 when Doudna, Charpentier and then-postdoctoral fellow Martin Jinek realized it would be possible to combine the crRNA and the tracrRNA into a single, artificial guide RNA (sgRNA). By adding to the sgRNA a customized “guide segment” matching a particular DNA sequence in an organism of interest, they could aim Cas9 to cut any organism’s genome in any spot they wished.
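The targeting logic can be sketched in a few lines of code. This is a toy illustration, not the researchers’ software: the sequences below are invented, and real guide design must also account for mismatches, off-target sites and strand orientation. One grounding fact it does use is that the Cas9 enzyme from this system requires the matched site to be immediately followed by an “NGG” motif (the PAM) before it will cut.

```python
def find_target_sites(genome: str, guide: str) -> list[int]:
    """Return 0-based positions where the guide matches and an NGG PAM follows."""
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        if genome[i:i + len(guide)] == guide:
            pam = genome[i + len(guide):i + len(guide) + 3]
            if len(pam) == 3 and pam[1:] == "GG":  # "NGG": any base, then GG
                sites.append(i)
    return sites

# Made-up sequences for illustration; a real guide segment is ~20 nucleotides
genome = "TTACGTACGGATCCGGAAGCTTGGCGGTAA"
guide = "ACGGATCCGGAAGCTTGG"
print(find_target_sites(genome, guide))  # → [6]
```

Swapping in a different guide sequence re-aims the cut, which is the essence of what made the sgRNA design so powerful.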
Often likened to a word processor, CRISPR can be used to target whole gene “words” or a few nucleotide “letters” with precision and speed that far outpaces conventional genetic engineering. It’s a superb tool for deleting chunks of DNA and for facilitating precise substitutions when researchers want to swap a few key nucleotide sequences.
Less often emphasized is that CRISPR can also be used to add new genes or parts thereof. The key here is understanding what happens after Cas9 makes its cuts.
A Cas9-caused break in DNA can be repaired in four different ways, two of which open the door to inserting a new gene of choice. Image courtesy of Elsevier
The cell’s DNA repair machinery typically takes over in one of two different modes. In the first mode (called “non-homologous end joining,” or NHEJ), it usually glues the two pieces back together, but imperfectly, deactivating the gene (see “a” above). Such “gene knockouts” don’t involve any foreign DNA but can eliminate traits that affect food quality, confer susceptibility to diseases or divert energy away from valuable end products such as grain or fruit. Occasionally, say researchers, this pathway may leave a DNA cut with “sticky ends,” enabling foreign genes of interest to be directly spliced in (b) — a double-stranded DNA insertion somewhat akin to “old-fashioned” genetic engineering.
A second kind of repair (called “homology-directed repair” or “homologous recombination” — HR) is much less common but far more accurate. In HR, the cut ends aren’t just jammed back together; the cell machinery copies a nearby piece of DNA to fix the damaged sequence. By providing a DNA snippet of their choice, scientists can induce the cell to fill in any desired sequence, from a small mutation (c) to a whole new gene (d). This HR pathway, says Fuguo Jiang, a postdoctoral fellow in Doudna’s lab, is not yet fully understood. But, as this illustration shows, it involves a meticulous process of one strand of donor DNA being stitched into the host gene, providing the template for cellular repair.
Needed to fertilize crops, the bulk of phosphorus comes from nonrenewable phosphate rock. While China mines the most — producing almost half the world’s phosphorus — Morocco alone controls three-fourths of global phosphate rock reserves. Year by year those global reserves dwindle, leaving phosphate rock that’s ever lower in quality and ever harder to extract. At the same time, as agriculture expands over the coming decades, experts project increasing demand for mineral phosphorus.
Pair declining supply and rising demand with the lopsided geographic distribution of phosphate, and it’s not hard to see why experts think prices are likely to go up, which could threaten global food security.
At the same time, excess phosphorus, including that running off feedlots and released from wastewater treatment plants, threatens water quality and ecosystem health as it fertilizes lakes, rivers and ocean waters around the globe.
A recent study in Science of the Total Environment examined one solution to these twin problems: recycling. Looking at what recycled phosphorus could do for corn in the United States, the country’s number one crop, the study’s authors found that we’d need just 37 percent of available recyclable domestic phosphorus to fertilize all of the corn in the country.
The researchers did the math for animal manure, human waste and wasted food — all of which contain phosphorus that traces back to farmers’ fertilizer — and found that these three sources could furnish more than 1.9 billion kilograms (4.2 billion pounds) of phosphorus each year, with almost 90 percent from manure alone. That’s more than enough to sate American corn’s 724 million kilogram (1.6 billion pound) annual appetite for the element.
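The headline figures can be checked with back-of-envelope arithmetic. Note that the numbers below are the article’s rounded values, so the ratio lands near, rather than exactly at, the study’s 37 percent:

```python
# Rounded figures from the article; the study's exact values differ slightly
recyclable_p_kg = 1.9e9  # phosphorus in manure, human waste and wasted food per year
corn_demand_kg = 724e6   # annual phosphorus demand of U.S. corn
manure_share = 0.90      # "almost 90 percent from manure alone"

fraction_needed = corn_demand_kg / recyclable_p_kg
print(f"Share of recyclable phosphorus needed for corn: {fraction_needed:.0%}")  # → 38%
print(f"From manure alone: {manure_share * recyclable_p_kg:.2e} kg per year")
```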
Spatial analysis showed that we could meet three-fourths of demand with phosphorus recycled in the county of origin. Additional phosphorus would have to travel an average of 302 kilometers (188 miles) to reach farms that need it, meaning that most corn-producing states could fill demand within their borders. Currently only 5 percent of U.S. cropland sees any manure fertilizer at all.
While the study’s conclusions are surprising, the authors caution that they don’t intend to propose public policy. Instead, their paper is meant as a “proof of concept” looking at the potential of recycling phosphorus for U.S. corn. Regulatory, logistic and economic concerns remain topics for future work to tackle.
For example, the fact that 25 percent of phosphorus for corn would have to come from counties other than the one where the corn is grown underlines the spatial disconnect between phosphorus sources and phosphorus demand in the U.S. According to the researchers, that reality — stemming from the separation of farm animals and crops — has intensified since the 1970s. Potential fixes, the researchers suggest, might involve innovative technologies for better phosphorus recycling, which would let farmers get more phosphorus from nearby sources, making it less necessary to look beyond county borders. A more direct, though perhaps more ambitious, solution might be sociopolitical efforts to bring livestock and cropland closer together again.
Editor’s note: One of the authors of this paper is employed by Ensia’s publisher, the Institute on the Environment at the University of Minnesota. — January 15, 2016
Artificial intelligence, testosterone and ship tracking technology probably aren’t on many conservation organizations’ “top things to think about” lists right now. But they should be, suggests a new report in the scientific journal Trends in Ecology & Evolution.
“A Horizon Scan of Global Conservation Issues for 2016,” authored by University of Cambridge conservation biologist William Sutherland and 23 other researchers, practitioners, professional horizon scanners and journalists, offers a list of 15 emerging trends and developments that are not well known but could have big implications — positive, negative or both — for biodiversity on a global scale.
To come up with the list, team members racked their own brains, monitored social media and consulted more than 400 other individuals, assembling 89 candidate topics with implications for global conservation. From those, they selected the issues they rated highest in terms of probability of occurring, potential impact and/or likelihood of rapidly emerging.
The top issues identified were:
Increasingly powerful computers offer both new opportunities and new threats for biodiversity conservation. Photo by Randy Montoya/Sandia Labs
Artificial intelligence can contribute to conservation goals by making it possible to do things like better identify butterflies and boost energy efficiency in buildings. But if machines become smarter than humans, all bets are off as to whether our desired future for ecosystems — or anything else, for that matter — prevails.
Energy Storage and Consumption
Declining battery prices and changes in regulation open the door to rising renewable energy production and consumption. One positive conservation implication is a reduced threat from climate change. On the downside, though, increased deployment of wind and solar power installations could bring potentially worrisome habitat disruption.
China is better known for its environmental challenges — air pollution, desertification, high carbon emissions and more — than for its environmental solutions. In recent years, however, its government has adopted a formal policy of “ecological civilization” — development that respects and protects nature. Specific actions — which could boost the health of China’s environment and provide a model for other countries — include establishing model environmentally friendly villages, conserving areas that provide ecosystem services, factoring ecosystem services into economic reports, and investing in reforestation.
Electric Pulse Trawling
A jolt of electricity delivered to the ocean floor can make harvesting seafood species such as shrimp and flatfish easier and more efficient. Use of the practice is on the rise, suggesting it’s high time to take a look at the unintended consequences it carries for nontarget species and the marine ecosystem as a whole.
This now-shuttered power plant in Norway made electricity using the difference in salt concentration between river water and seawater. Photo by Damian Heinisch/Statkraft
Several years ago Norway experimented with producing electricity using the energy released when salt water mixes with freshwater at the intersection of river and sea. The facility turned out to be too expensive to run, however, so it shut down. With growing interest in renewables and advancing technology, eyes are turning to this electricity source once again. Implications of implementation for biodiversity include disruption of shoreland habitat for power plant construction and operation as well as potential harm from facility wastewater to species unable to tolerate salt.
One strategy for keeping farm crops healthy involves treating them with biological control agents such as bacteria, viruses or fungi that incapacitate pests or diseases. Some producers have begun using bees to help deliver control agents to crops by having them walk through powder containing the agents before heading out on their flower-foraging flights. The extent to which the approach could sicken or kill couriers or nontarget insects and plants is currently largely unknown.
Climate change is opening the door to the central Arctic Ocean for commercial fish species such as Atlantic cod and yellowfin sole. Currently the five countries bordering the Arctic have agreed to prohibit commercial fishing until policies are in place to protect the species from overfishing. But failure to achieve international agreement for the long haul through the U.N. Convention on the Law of the Sea (which is far from out of the question given existing disagreements among the convention’s parties) could result in a Wild West of fishing in this changing ocean.
Before and after satellite images taken in August 2014 and January 2015 show the emergence of a large island at Fiery Cross Reef in the South China Sea’s Spratly Islands. Images courtesy of CNES 2014/Distribution Airbus DS/IHS
Recent years have seen growing interest in construction of artificial islands for purposes such as expanding residential areas or providing sites for military facilities by building up coral reefs with sand and concrete. The process creates major ecosystem disruption in the immediate vicinity and could cause more distant problems by disrupting the ability of corals to thrive, reproduce and spread. Artificial islands also could make currently hard-to-reach natural resources more accessible and therefore more susceptible to depletion.
As guys man up, will fish pay the price? Use of testosterone supplements to boost physical appearance and sexual function is growing rapidly in some affluent parts of the world: In the U.S. alone, the testosterone market went from $18 million to $1.6 billion between 1988 and 2011. We know that many pharmaceuticals are excreted in urine, survive the wastewater treatment process and in some cases disrupt the ability of fish and other living things to function normally. Will testosterone do the same?
Engineered Nanoparticles on Land
Scientists have started assessing possible adverse impacts on aquatic life of nanoparticles such as titanium dioxide, which can end up in wastewater after being used in personal care products or clothing. But what impact might they have elsewhere? Evidence is growing that nanoparticles found in sewage sludge could harm microbes in the soil when the sludge is spread on land for disposal and soil enrichment purposes — with adverse implications that could reverberate throughout the ecosystem.
Ocean Ship Tracking
Advanced technologies and corresponding regulations make it easier than ever to identify, characterize and track the movement of ocean-going vessels. While originally established for safety purposes, the technology holds promise for helping reduce illegal fishing, boost ship compliance with emissions regulations and more accurately calculate environmental footprints of the products we purchase, 90 percent of which spend time on a ship at some point.
Passive Acoustic Monitoring
Innovations in digital recording and transmission now make it possible to monitor sounds from afar. This emerging capability could be applied to the benefit of conservation by making it easier to monitor the presence of species of concern in remote habitats as well as track environmental conditions such as noise pollution. If we can resolve challenges related to managing and analyzing the massive amounts of data produced, it also could be applied to scanning vast areas for audible signs of illegal logging, hunting or fishing.
Synthetic Body Parts
A major threat to many endangered animals is illegal trade in body parts such as horns and bones. Recent advances in 3-D printing and chemical synthesis have combined to make it possible to make synthetic versions of the desired material. Synthetic rhino horn is already a reality, and other permutations such as synthetic tiger bone and synthetic elephant ivory could be just a Kickstarter campaign away. But would flooding the market with synthetic alternatives reduce demand for the real McCoy, or make genuine parts more sought-after than ever? No one knows — but the difference between extinction and survival could hang on the answer.
Artificial glaciers divert water to be used later for agriculture, which may alter nearby ecosystems. Photo courtesy of Ice Stupa Project
In some parts of the Himalayas, residents are diverting water into low spots in the mountains to create artificial glaciers that can later produce meltwater needed for irrigating crops. The strategy could make agricultural lands more productive, reducing the need to disrupt habitat for additional cultivation. But it also alters the movement of water across land and into aquifers, raising the potential for altering nearby desert and other ecosystems and the unique species they support.
Tammar wallabies introduced to New Zealand harbor genetic diversity that has been lost in the animal’s native Australia. Photo by Josh More (Flickr/Creative Commons)
Invasive Species as Gene Reservoirs
In an ironic twist, some species are now more genetically diverse (and therefore more resilient to environmental change) in regions where they’re not native than in places where they are. Should the introduced species be used as a reservoir of genes to diversify decimated populations back home? And what are the implications of seeing them as such for efforts to eradicate or control invasive species that could potentially serve such a function in the future?
“We hope that heightened awareness of these threats and opportunities will encourage researchers, policy makers, and practitioners to consider them, potentially improving the alignment of environmental research and science with policy and practice,” the authors of the report concluded.
— December 28, 2015
As global population grows, urban population is growing even faster — with 2.5 billion more city dwellers expected by 2050, according to the United Nations. And while urbanization can bring benefits in terms of economies of scale, social cohesion, technological innovation, transportation efficiency and more, cities can also be breeding grounds for poverty, pollution and malaise.
To help inspire cities to make the most of the opportunities and minimize downsides of growth, the World Economic Forum recently published “Top Ten Urban Innovations,” showcasing new ideas for boosting urban sustainability. The examples fall into four broad categories: making better use of underused capacity, evening out demand over time, encouraging small-scale infrastructure, and designing around people. Each includes a “why” and “what” as well as an assessment of the potential for improving global well-being. Most also include a list of links for more information. Here’s a quick summary:
In growing cities, the need for infrastructure can quickly outpace our ability to build it. Vancouver, Glasgow, New York and others are tackling this head-on by repurposing and densifying use of existing urban land rather than building out, and by designing buildings in a way that allows them to switch functions — for example, from a theater to a night club — as needs change.
An Internet of Pipes
Clean, readily available water supplies are a growing concern for growing cities. Efforts to meet future needs include a variety of Internet-based innovations aimed at managing water challenges such as flood control, rainwater management, supply distribution, pipe leakage reduction and sanitation management.
Twitter for Trees
Urban trees help reduce temperature extremes, moderate stormwater surges, sequester carbon and capture nutrients from runoff. Melbourne is boosting interest in and appreciation for urban forests by inviting its residents to adopt and name individual trees and share updates, including carbon offset and other information, via social media.
People-powered transit not only helps make cities cleaner and less congested, it also can boost human health and well-being. But bicycling can sometimes seem too demanding for a workday commute. To make it more appealing and accessible, innovators are developing products such as the Copenhagen Wheel, a bike that runs partly on a battery recharged by braking and downhill travel.
Co-heating, Co-cooling, CO2 Capture
Co-generation facilities boost energy efficiency by taking waste heat from electricity generation and using it to heat or cool buildings. For even more benefit, the carbon dioxide generated in the process can be captured and used for horticulture, manufacturing or other applications.
Sharing Spare Capacity
City dwellers around the world are reducing the environmental footprint of consumption through sharing networks. Starting with increasingly common practices such as carpooling, lodging rental and shared ownership, the practice is expanding to include things like co-locating enterprises to allow them to share facilities such as gyms or classrooms.
Mobility on Demand
Computer- and smartphone-assisted traffic management and vehicle routing can reduce time and fuel wasted trying to travel through congested areas. Similarly, self-driving vehicles and car sharing can boost efficiency by maximizing use of vehicles and reducing need for space to park idle ones.
Infrastructure for Social Integration
The Colombian city of Medellín, once considered one of the world’s most dangerous cities, has been transformed by a focus on architecture and design. Shared spaces and improved public transit blur economic boundaries and boost a sense of connection and culture.
Smart Street Poles
As cities switch from polluting conventional streetlights to LED-based updates, they have the opportunity to connect light poles to form a web of information sensors that can do everything from gathering air quality data to monitoring traffic and reducing the risk of crime.
Cities can help cut food waste by growing perishable produce right in town, boosting individuals’ connections to food and reducing the lengthy transit distances and times that promote spoilage. With water-based gardening and LED lighting, walls, roofs and other structures that serve one function can multitask as food-producing gardens, too.
— December 16, 2015
These two words might make you sleepy: stormwater management. But they should make you scared.
Or, if a new study can help shake things up, hopeful.
Runoff carrying oil, salt, fertilizer, sediment and other pollutants is a top threat to lakes across the U.S., and several factors — including rising population, growing cities and changing climate — are loading the future’s dice in favor of even more trouble. With this challenge in mind, a new study in the Journal of Geophysical Research paves the way to fight flooding and enhance water quality with plans geared toward individual watersheds.
In the first multi-factor analysis of land cover and climate for watersheds across the entire continental U.S., researchers from the University of Massachusetts Amherst concluded that professionals — think land use experts, city planners and water quality managers — need to remember that myriad factors affect stormwater runoff. Not only that, the factors vary based on space and place: Different watersheds have different temperatures, rainfall, land use, evaporation, plant characteristics, and more. The new paper estimates statistical relationships among these variables for watersheds in all lower 48 states and quantifies relationships in a way that could help decision-makers on the ground make smart choices that allow them to minimize adverse impacts of runoff.
Managers can use the study’s statistical conclusions to target variables that most greatly affect runoff in their watershed. For example, the researchers write, “[i]n watersheds where runoff is strongly influenced by evaporation, one could develop policy toward green infrastructure by incorporating infiltration enhancing components into urban structures like roads, buildings, and parking lots.”
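As a loose sketch of what “quantifying relationships” can look like in practice, here is a toy ordinary-least-squares fit of runoff against a single land-cover variable. The data, variable names and coefficients are all synthetic; the paper’s actual statistical model, variables and units differ:

```python
import random

random.seed(0)
n = 200
# Synthetic watershed data (made-up values and units)
impervious = [random.uniform(0.0, 0.6) for _ in range(n)]           # impervious land-cover fraction
runoff = [100 + 400 * x + random.gauss(0, 15) for x in impervious]  # annual runoff, mm

# Ordinary least-squares slope and intercept for runoff ~ impervious
mean_x = sum(impervious) / n
mean_y = sum(runoff) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(impervious, runoff)) \
        / sum((x - mean_x) ** 2 for x in impervious)
intercept = mean_y - slope * mean_x
print(f"runoff ~ {intercept:.0f} + {slope:.0f} * impervious_fraction")
```

The study does this kind of estimation across many interacting variables at once, which is what lets managers see which levers matter most in their particular watershed.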
— November 27, 2015
With the Paris climate negotiations just two weeks away, we’ll soon see if the world can agree on a plan for slashing greenhouse gas emissions to slow climate change. To get insight into the upcoming conference, a research team from Chalmers University of Technology in Gothenburg, Sweden, built a simplified (read: not completely real-world) mathematical model that reached two conclusions about climate talks: First, the more countries that are involved, the harder it is to settle on a deal. Second — counterintuitively — when negotiators think more strategically, they impede their chance of reaching consensus.
To improve on past studies exploring the theory behind negotiation outcomes, the researchers, whose work was reported recently in the journal Nature Climate Change, modeled negotiators with talent for thinking strategically. Under this scenario — which aligns with surveys showing that high-level policy-makers use a degree of strategic reasoning that’s higher than that of the average person — negotiators each want to get the best deal possible for their country: They don’t want to offer any more emission cuts than they have to. Negotiators with high strategic reasoning are marked by a skill for predicting other negotiators’ behavior.
Compared with the real thing, this model of climate talks was super simple. That’s because, instead of actual nations and real back-and-forth haggling, the model abstracted to equal-size countries that started with an initial offer of how much they’d cut their greenhouse gas emissions, then gave them a chance to negotiate over a number of turns, either raising or lowering their offers depending on what other countries did. A series of equations shaped how negotiators would react.
The team set up several scenarios, some with more strategic negotiators than others. They also varied the number of negotiating countries in each scenario.
After running each scenario 100,000 times, the researchers found that the more countries involved, the lower the chance of agreeing to emissions terms that could avoid catastrophic climate change. Likewise, situations with more negotiators who strategically thought through things succeeded less often than scenarios in which fewer participants had high levels of strategic reasoning. “[S]trategic reasoning,” the authors write, “typically increases the risk of climate catastrophe in our model.”
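A heavily simplified sketch of such a model — every parameter and behavioral rule below is invented for illustration and is not the study's actual formulation — might look like this:

```python
import random

TARGET = 1.0  # total emission cuts (arbitrary units) needed to avert catastrophe
TURNS = 10

def negotiate(n_countries, n_strategic, hold_back=0.1):
    """Simulate one negotiation; return True if the target is reached."""
    step = 0.25 / n_countries  # how much one country concedes per turn
    offers = [random.uniform(0, 2 * step) for _ in range(n_countries)]
    for _ in range(TURNS):
        if sum(offers) >= TARGET:
            return True
        for i in range(n_countries):
            if i < n_strategic:
                # A "strategic" negotiator expects others to close the gap,
                # so it raises its own offer only occasionally.
                if random.random() < hold_back:
                    offers[i] += step
            else:
                offers[i] += step  # a naive negotiator always concedes a bit
    return sum(offers) >= TARGET

def success_rate(n_countries, n_strategic, trials=2000):
    wins = sum(negotiate(n_countries, n_strategic) for _ in range(trials))
    return wins / trials
```

Comparing `success_rate(5, 0)` with `success_rate(5, 4)` reproduces the qualitative pattern: the more negotiators who hold back strategically, the less often the group reaches the target.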
With more than 190 nations and some of the world’s best negotiators on hand for the Paris talks later this month, the paper’s conclusions are stark. But there’s room for hope: Because the model is much simpler than the real world — and because humans don’t always act with model rationality — actual outcomes could still bode well for our planet.
— November 20, 2015
Groundwater has long been an important resource for irrigation, drinking water, energy production and more, but getting a clear picture of how much is available and being recharged over time has been a challenge.
Now an international group of hydrologists led by Tom Gleeson, engineering professor at the University of Victoria, with fellow researchers from the University of Texas at Austin, the University of Calgary and the University of Göttingen, say they’ve produced the first data-driven estimate of Earth’s total supply of groundwater — with a focus on “modern” groundwater.
Writing in the journal Nature Geoscience, the researchers estimate the total volume of groundwater in the upper 2 kilometers (1.2 miles) of Earth’s crust to be 22.6 million cubic kilometers (5.4 million cubic miles) — an amount that if extracted and spread evenly across Earth’s land surface would be 180 meters (590 feet) deep.
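The 180-meter figure is easy to sanity-check with back-of-envelope arithmetic (the land area used here is an assumption — roughly Earth's ice-free land surface — and may differ from the study's exact land mask):

```python
groundwater_km3 = 22.6e6   # estimated total groundwater volume, km^3
land_area_km2 = 126e6      # assumed ice-free land area of Earth, km^2

depth_m = groundwater_km3 / land_area_km2 * 1000  # km -> m
print(round(depth_m))  # 179, in line with the study's ~180 meters
```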
Perhaps most interesting was the study’s focus on “modern” groundwater, or groundwater that is less than 50 years old.
Using geochemical, geologic, hydrologic and geospatial data sets, the researchers estimated this newer groundwater accounts for at most 6 percent of all groundwater in Earth’s uppermost layers. This number is important because it represents roughly the amount of groundwater that can be recharged in a little less than a human lifetime.
Furthermore, the study provides an important differentiation between older and more modern groundwater. Older, deeper groundwater is what we typically use for agriculture and industry and may be brackish, saline or low quality, whereas newer, higher quality groundwater closer to the surface is critical to overall aquifer health and recharge, yet may be more susceptible to contamination and climate change.
According to Gleeson, this new research differs from GRACE and other technologies previously used to assess global groundwater resources.
“GRACE measures the change in the mass of the earth, which is often interpreted to be due to the depletion of groundwater, but it cannot estimate the total groundwater or the fraction of young groundwater as we have,” says Gleeson. “Additionally, our results are more based on measured groundwater data such as tritium concentrations and water levels, and recent global maps of permeability and porosity.”
This new research builds on Gleeson’s earlier work pinpointing global hot spots where groundwater pumping — primarily for agriculture — is exceeding recharge rates.
— November 16, 2015
Whether people have the water they need depends on how much water there is and how much demand there is for it, right? Wrong.
According to a recent study published in Environmental Research Letters, governance, not pure supply and demand, is the most prevalent factor determining “water vulnerability” — the degree to which access to water is susceptible to disruption — in low-income nations.
Focusing on 119 countries and territories with per capita gross domestic product less than $10,725, researchers from Washington State University and Stanford University identified four basic categories of characteristics important to water supply vulnerability: demand, endowment (a mashup of quantity, quality and variability), infrastructure and institutions (the presence of government regulations and the likelihood they’ll be effective). They then used publicly available data to assess 19 traits representative of those categories, such as access to improved sanitation, dependence on groundwater, virtual water imported and exported, and government corruption.
All of the countries were found to be vulnerable in at least one of the four categories, and 23 of them were vulnerable in all four. The category characterizing the highest number of countries — 44 — was that of institutional vulnerability — suggesting that social and political factors are important areas of focus for initiatives aiming to boost water security.
Noting that access to water is a fast-growing challenge, with extractive use tripling over the past half century while population doubled, lead author Julie Padowski, a member of the water resources faculty at Washington State University, suggested in a news release summarizing the paper’s findings that the results could be valuable in helping identify and target strategies for boosting water security.
“The common occurrence of institutional vulnerability,” she said, “adds weight to the argument that it’s not just how much water we have, it’s how we manage it that’s really important.”
— November 6, 2015
According to a recent report from the U.S. Energy Information Administration, the number of new and reactivated coal mines in the U.S. hit a 10-year low in 2013 — down nearly 67 percent from 2008.
Total coal production in the U.S. was also down during the same time period. The drop was driven by a number of factors, including regulatory pressure, reduced investment, shifting demand toward natural gas and renewables and weak exports.
Data and chart by Energy Information Administration
Overall, the total number of coal mines in the U.S. also declined, dropping by nearly 400 sites from 2008 to 2013. The report doesn’t distinguish between high- and low-output mines, which is an important distinction. According to the EIA, “in 2013, 877 Appalachian mines produced 270 million short tons [245 million metric tons] of coal compared with 52 Western region mines that produced 530 million short tons [480 million metric tons].”
Data and chart by Energy Information Administration
Why does this matter? At the end of 2014, the U.S. had 262 billion tons (237 billion metric tons) of coal reserves, or about 27 percent of the global total. Following the U.S., the next five countries for coal reserves are Russia, China, Australia, India and Germany. Together these countries account for 77 percent of known global reserves.
The decline in domestic coal mining and use may help reduce greenhouse gas emissions as outlined in the EPA’s Clean Power Plan.
Despite decreased U.S. production and consumption, global demand for coal is projected to continue increasing. The report “Global Coal Mining to 2020,” prepared by Timetric, outlines India’s plans to ramp up domestic coal production and consumption, while Russia has set a target of increasing in-country use for power generation from 25 percent in 2014 to 27 percent by 2020. Overall, global demand is projected to grow by 2.1 percent annually through 2019, according to the IEA.
— October 23, 2015
Human population is expected to soar to between 9 and 10 billion by 2050, with a growing number of people moving into the middle class and requiring more energy, natural resources and animal products. This combination of population growth and resource demand will put increased pressure on the planet’s resources, a fact that has interested parties from academia, nonprofits, government, industry and more scrambling to find ways to improve living conditions for billions of people while staying within the planet’s ecological boundaries.
But that’s just the starting point of a recent study led by The Nature Conservancy in partnership with the Department of Geography at McGill University and the University of Minnesota’s Institute on the Environment (which, full disclosure, provides support for Ensia). The researchers analyzed global development drivers, including urbanization, agriculture, energy and mining, to see how future development is likely to affect what’s left of the planet’s remaining natural lands. Their conclusion: 20 percent of all natural lands are at risk from future development.
Although the three regions currently with the most developed land — Central America, Europe and South Asia — are projected to remain the most developed, the researchers note that Africa and South America, currently relatively undeveloped, have the greatest amount of land at risk. The researchers estimate that the amount of land developed on those continents in coming decades could be about twice as much as it is today in South America and three times as much in Africa. Meanwhile, only about 5 percent of the lands that face development risk are currently under “strict protection” (many “protected areas” allow some forms of development).
The goal of the research wasn’t just to point out where nature is going to be converted to farms, cities and other signs of modern life; it’s meant to offer insights that can be used to balance conservation strategies alongside ongoing development.
As Joseph Kiesecker, a lead scientist for The Nature Conservancy’s Conservation Lands Team, says in this video, “The study’s not intended to be a doom-and-gloom story. It’s really intended to be hopefully inspiring. Armed with the knowledge of where future development risk may occur, we can get ahead of the curve. We can get into those landscapes, we can proactively plan for development and find solutions that will strike that balance [between development and conservation].”
— October 14, 2015
Earlier today Chilean president Michelle Bachelet announced the formation of Nazca-Desventuradas Marine Park — the largest marine protected area in the Americas. Located more than 850 kilometers (530 miles) northwest of Santiago in the Pacific Ocean, the reserve encompasses a surface area of 297,518 square kilometers (114,872 square miles). All told, Chile has now protected 12 percent of its entire marine surface area.
The news comes on the heels of an announcement last week that New Zealand was establishing an even larger 620,000-square-kilometer (385,000-square-mile) preserve called the Kermadec Ocean Sanctuary.
The timing of both new reserves couldn’t be better. In early September, WWF’s biennial Living Blue Planet Report outlined the dire state of the world’s oceans, reporting that marine vertebrate populations have declined 52 percent since 1970. The report notes that “the fish that constitute up to 60 per cent of protein intake in coastal countries, supporting millions of small-scale fishers as well as a global multibillion-dollar industry,” are declining precipitously.
The new Nazca-Desventuradas Marine Park may be a step toward reversing this trend.
According to a joint press release from Oceana and National Geographic announcing the reserve, the area is home to “undulating kelp forests; abundant fish populations, including enormous amberjacks, yellowtail jacks and deep sea sharks; and fragile deep corals.”
Enric Sala, National Geographic explorer in residence and head of the Society’s Pristine Seas project, went on to say, “The new Nazca-Desventuradas Marine Park is a gift from Chile to the world. It contains pristine underwater environments like nothing else in the ocean, including deep underwater mountains with species new to science, abundant giant lobster and a relict population of the once-thought-extinct Juan Fernández fur seal.”
— October 5, 2015
Note to anyone who thinks planting trees is the bees’ knees: Grasslands are important, too.
With forests disappearing at record rates and the carbon sequestration and other benefits of vegetation getting increasing visibility, tree-planting has become almost an iconic “environmentally friendly” activity. But in some cases it could do more harm than good, according to Iowa State University ecologist Joseph Veldman and colleagues.
In a paper published in the October 2015 issue of the journal BioScience, the researchers point out that the current emphasis on trees as tools for sequestering atmospheric carbon has led to a number of initiatives that encourage planting trees on grasslands or allowing forests to expand into grasslands. The U.N. Framework Convention on Climate Change’s Clean Development Mechanism, for example, offers carbon credits for foresting grasslands, the Food and Agriculture Organization of the United Nations does not distinguish for policy purposes between closed-canopy forests and grassy biomes with trees, and carbon valuation systems such as CDM and REDD+ favor forests over other types of land cover.
“Despite overwhelming evidence of their antiquity and richness, the misperception persists that grassy biomes are degraded ecosystems formed as a result of human-caused deforestation,” the researchers write. The result, they note, is grassland loss at “alarming rates.” That in turn leads to loss of unique flora and fauna that depend on grassland biomes and compromises grasslands’ capacity for sequestering carbon below ground in root systems and providing other ecosystem services.
To illustrate what they call “the tyranny of trees,” Veldman and colleagues took locations mapped as opportunities for forestation in the Atlas of Forest Landscape Restoration Opportunities, published by the World Resources Institute and the International Union for Conservation of Nature, and cross-mapped them with maps of grassy biomes around the world. They found that the atlas identified 9 million square kilometers (3 million square miles) of “ancient grassy biomes” as having potential for foresting.
Veldman and colleagues concluded by encouraging policy makers to acknowledge the value of these ecosystems in their own right and to integrate efforts to conserve forests and grasslands around the world in a way that takes into account the many services grasslands provide. Specifically, they recommended creating vegetation maps that include grasslands and savannas, assigning appropriate value to diverse vegetation types, and revising carbon valuation systems as well as widely accepted definitions of forest to formally recognize the importance of other types of land cover.
— October 2, 2015
Recently journalist Elizabeth Grossman wrote about a troubling trend of pharmaceuticals being found pretty much everywhere scientists look in our environment, especially in our water. Potential impacts are concerning, Grossman writes, with among other things, “fish and birds responding with altered behavior and reproductive systems to antidepressants, diabetes medication, and other psychoactive or hormonally active drugs at concentrations found in the environment.”
While about 90 percent of these pharmaceuticals turn up in the environment after being excreted, at least some show up when medication is improperly discarded. In an effort “to provide a safe, convenient, and responsible means of disposing of prescription drugs,” the U.S. Department of Justice and Drug Enforcement Administration are holding their annual National Prescription Drug Take-Back Day Saturday, September 26, 2015.
In addition to the environmental benefits of properly disposing of unused medications, Environmental News Bits points out that misuse of prescription drugs is a big public health crisis, noting, “In 2010, 22,134 people died from overdoses involving prescription drugs. Additionally, in 2012, 69 percent of people who abused prescription pain relievers obtained the prescription drugs through friends or relatives, or from raiding the family medicine cabinet.”
Photo by Ajay Suresh (Flickr | Creative Commons) — September 18, 2015
The combination of legal and illegal logging in southwest Ghana’s tropical forests is having a devastating impact on bird populations in the region, according to new research published recently in the journal Biological Conservation.
Between 1995 and 2010, logging in the Upper Guinea rain forest — one of the world’s biodiversity hot spots — increased by over 600 percent. Researchers sampling bird communities in the same forest region discovered that the number of understory birds counted in 2008–2010 was more than 50 percent less than in 1993–1995.
Globally, 50–90 percent of the timber harvested in tropical countries can be attributed to illegal logging. In Ghana, where the study took place, illegal logging accounts for an estimated 80 percent of timber extraction. Overall, the researchers noted, the rate of deforestation in Ghana is six times the maximum sustainable rate.
The Guinean Forest (green) stretches across Africa from Guinea to Cameroon. Map courtesy of Global Forest Watch.
“Our most disturbing finding was that more than half of all understory birds had vanished in only 15 years,” said lead author Nicole Arcilla, a researcher with Drexel University and the Zoological Society of London, in a press release accompanying the study. “If things continue as they are, in a few decades, these incredibly beautiful forests and their unique wildlife will be largely depleted, which would be a huge loss to Ghana, Africa and the world.”
Arcilla spent two years in Ghana studying understory birds as part of her research. At first her focus was on legal logging activities, but she soon found it impossible to ignore illegal activities.
“No one has looked at this issue in the past, to my knowledge, because illegal logging is an underworld issue, so it’s hard to quantify,” Arcilla said. In Ghana, she said, “it’s like the Wild West.”
The solution lies in beefed up forest management by the Ghanaian government — a daunting challenge in the face of such widespread illegal activity. The researchers point to a suite of urgently needed measures such as “increasing forest ranger patrols, increasing forest law enforcement and increasing implementation of measures to prevent illegal logging.”
— September 10, 2015
With cities poised to add 2.5 billion more people by 2050, now is a perfect time to get strategic about urban design. One of the big questions planners face is the best way to accommodate both built environments and natural settings, which provide important services such as making oxygen, cleansing water, absorbing noise, moderating summer heat, adding beauty and providing habitat for animals and plants. Can we get the biggest overall benefit by interweaving buildings and roadways with small, dispersed patches of nature — “land sharing” — or by setting aside large expanses of green space surrounded by intensive development — “land sparing”?
To find an answer, researchers from the United Kingdom and Japan looked at how well cities with a range of development strategies were able to provide each of nine urban ecosystem services — carbon storage, water infiltration, human well-being, agricultural production, pollination, pest control, noise reduction, air purification and temperature regulation. They then categorized these ecosystem services as “winners” or “losers” under various points along the compact-development-to-suburban-sprawl continuum.
Their results, reported in the latest issue of Frontiers in Ecology and the Environment, indicate that large, contiguous stretches of natural land are critical for maintaining most of the ecosystem services. However, they also showed that interspersing nature with homes, businesses and roadways has some value, particularly for maintaining the sense of personal well-being associated with exposure to nature. The researchers observed in addition that the built environment can be modified in ways that make it function more like open space in its ability to provide ecosystem services — by incorporating green roofs and permeable pavement, for instance.
“Land sparing is clearly necessary if ecosystem services are to be adequately represented in urban landscapes,” they wrote. “That said, when considering the flow of ecosystem services, arguably some land sharing is required if people are to see the health and well-being benefits provided by urban green space.”
The researchers noted that more work is needed to identify optimal size for green spaces. They also recommended top-down, policy-led planning as the best way to maximize the benefits of green spaces as cities grow.
— September 8, 2015
Too often we humans consider things individually, as though anything in life happens in a vacuum, all alone, with no influence from its surroundings or something else happening at the same time. Take chemicals, for example: Some chemical or another is deemed to be safe because it was studied in isolation, ignoring the potential effects of that chemical when combined with the many others on the market and in various products we encounter each day. Only recently are we beginning to fully explore and understand these combinations. In many such cases, the horse has already left the stable.
A recent study by researchers at the University of California, Irvine, reported in the Proceedings of the National Academy of Sciences, attempts to leave the silos behind when it comes to the consequences of a changing climate, starting with the premise that, “Climatic extremes cause significant damage to the environment and society, and can cause even more damage when multiple extremes occur simultaneously.” Focusing on data from 1960 to 2010, the authors looked at droughts and heat waves and found that concurrence of the two “shows statistically significant increases across the United States.”
“Heat waves can kill people and crops while worsening air quality, and droughts exacerbate those serious impacts,” said senior author Amir AghaKouchak, assistant professor of civil and environmental engineering, in a press release related to the study. “With these two extremes happening at the same time, the threat is far more significant.” Such systems thinking is critical, as the authors state, in order to prepare for and assuage, as much as possible, the negative consequences of increasing climate extremes.
If you have a houseplant in your kitchen, a rock on your desk or a fountain in your workplace, you know the basic principle of biophilic design: inviting nature into our everyday lives as a way to boost health, productivity and overall sense of well-being.
A new hour-long course from the American Society of Interior Designers (ASID) introduces the concept of biophilia and the 14 principles of biophilic design, then offers concrete suggestions on how designers can apply them to enrich interior environments, along with several case studies as examples. Participants may earn continuing education credits.
“ASID has a commitment to keeping designers up-to-date on new research that supports the impact of design on human health and wellness,” said Karol Kaiser, ASID vice president for education and engagement. “We are enthusiastic for ASID members and industry professionals to take advantage of this extraordinary course.”
— August 31, 2015
Food waste carries massive environmental, social and economic costs — so much that you’d think one of the first lessons we’d want to pass along to our kids would be the old saw, “Take what you want, but eat what you take.”
A U.S. law intended to boost consumption of healthful foods by requiring school lunches eligible for federal funding to include a fruit or vegetable serving appears to be doing the exact opposite.
Observing elementary students’ lunchroom behavior in two elementary schools before and after the Healthy, Hunger-Free Kids Act of 2010 took effect, researchers from the University of Vermont and the University of California, Davis, reported that not only was fruit and vegetable waste more than 50 percent higher after the requirement kicked in, but fruit and vegetable consumption was down.
The researchers noted that the law, which governs a program that feeds 31 million children each school day, is coming up for reauthorization this fall — potentially opening opportunities to modify it in a way that reduces these adverse unintended consequences. Meanwhile, they encourage lunch providers to boost the appeal of the fruits and vegetables they’re required to serve by cutting large items such as apples and oranges into pieces, enhancing lunch offerings with fresh produce from school gardens or other local sources, and, of course, using that age-old trick of disguising good-for-you food as something else.
— August 28, 2015
Excess phosphorus runoff and emissions from urban areas and croplands, animal feedlots, sewage treatment plants, and combustion of fossil fuels has been blamed for the dead zone in the Gulf of Mexico, toxic algal blooms in Lake Erie and problems in numerous other lakes and rivers around the world.
For years unwanted nutrients were also choking the Florida Everglades, but in a surprising reversal, phosphorus levels have been reduced 79 percent this year — more than three times the state requirement — compared to the annual average runoff in the 1980s, according to an announcement from University of Florida’s Institute for Food and Agricultural Sciences. The average decrease in recent years has been around 50 percent annually.
The reductions have been achieved through a combination of best management practices targeting agriculture, including soil testing before fertilizer is applied, regulation of when and how much water can be pumped off of farms and into local waterways, and clearing sediment from canals that lead to the Everglades. All told, farmers have spent nearly US$200 million on improvements, according to the South Florida Water Management District.
Some think the reductions haven’t gone far enough, though. As the South Florida newspaper the Sun Sentinel reported earlier this month, representatives of Audubon Florida and the Sierra Club say phosphorus runoff needs to be reduced even further to comply with federal water quality standards.
Still, phosphorus levels are heading in the right direction. This year they are down to 94 parts per billion, compared with 500 parts per billion in 1986.
— August 27, 2015
Plants are far better than humans at turning sunlight into food. But they’re not nearly as good as they could be: Thanks to quirks in the systems that have evolved to capture solar energy and use it to build sugars from carbon dioxide and water, the conversion efficiency of photosynthesis is but a few percent at best.
With the need to produce more crops growing even faster than human population, it’s no surprise that scientists have been brainstorming ways to help plants do a better job of using sunlight.
In a recent issue of the scientific journal Proceedings of the National Academy of Sciences, more than two dozen researchers from the U.S., France, Germany, the Netherlands, the U.K., Australia and China shared the results of a workshop in which they put their heads together and came up with a spectrum of suggestions for how genetic engineers might modify crop plants to boost their photosynthetic prowess. Among them: alter the apparatus that captures sunlight so it doesn’t take in more than it can use, borrow photosynthetic machinery from a purple microbe to expand the range of wavelengths plants can use, improve the ability of leaves to suck CO2 from the air, and create “smart canopies” in which plants fine-tune their photosynthetic capabilities to different lighting conditions at different distances from the ground.
Emphasizing that the convergence of advances in computer modeling, computer power and our ability to reproduce genetic material are opening the door to new abilities to refine processes within living organisms, the researchers encouraged colleagues to take a closer look at the opportunities.
“If we can double or triple the efficiency of photosynthesis — and I think that’s feasible,” study coauthor and Washington University biologist Robert Blankenship noted in a related news release, “the impact on agricultural productivity could be huge.”
States hoping to increase their share of renewable energy to achieve the emissions reduction goals set forth in the president’s recently announced Clean Power Plan may have just received an unexpected boost from wind energy.
According to the U.S. Department of Energy’s “2014 Wind Technologies Market Report” released earlier this week, the prices offered by wind projects to utility purchasers in the U.S. dropped below 2.5 cents per kilowatt-hour for the first time in history.
Costs are being driven down by technological advancements, including the production and installation of turbines with greater generating capacity, an increase in turbine hub height up to 272 feet (83 meters) and an increase in average rotor diameter. Taken together, these changes are not only driving down the cost of wind energy in blustery locations, but also making it more economical in low-wind areas.
Overall, US$8.3 billion was invested in 4.9 gigawatts of new wind power capacity in 2014. At present, wind power meets nearly 5 percent of the total U.S. electricity demand. Employment in the wind energy sector also saw a bump up in 2014 to 73,000 workers, compared to 50,500 in 2013.
Further bolstering wind energy’s prospects, an announcement from researchers at the Lawrence Berkeley National Laboratory — the research center that prepared the report — noted that 2014 wind energy contracts “compare very favorably to a range of projections of the fuel costs of gas-fired generation extending out through 2040.”
Photo by Steve p2008 (Flickr | Creative Commons) — August 12, 2015
In July 2014 we published a photo essay that highlighted the ecological and economic importance of the world’s deltas, pointing out that they are home to nearly half a billion people and provide critical habitat for innumerable plant and animal species. That photo essay also pointed out the danger deltas face: Sea-level rise and human alterations to river systems are threatening to turn the world’s deltas over to the sea.
One threat we didn’t explicitly address, though, was land subsidence. In a new study in the journal Science, researchers explore how sinking land — caused by less sediment reaching deltas along with human activities such as development and resource extraction — combined with sea-level rise will affect coastal delta communities. This combination of sinking land and rising sea, known as relative sea-level rise or RSLR, makes the challenges facing these communities even more daunting because higher RSLR means an increased risk of flooding.
The researchers found that some deltas, such as the Krishna and Brahmani in India and the Ganges-Brahmaputra in Bangladesh, are particularly vulnerable due to a combination of high RSLR, high chance of a hazardous event taking place and low capacity to respond to such an event. But even deltas in places that are less vulnerable, such as the Mississippi in the U.S. and the Rhine in the Netherlands, won’t have an easy path forward. The study points out that strategies to lower risk in these countries often come by way of expensive infrastructure, meaning that the people who call them home will have to make the ongoing decision of whether to spend more money keeping the deltas relatively safe from disaster or freeze spending, making the deltas less secure.
Writing in The Conversation, Zachary Tessler, one of the study’s authors, says that while stemming sea-level rise will require global action to combat climate change, land subsidence can be addressed in part through local and regional efforts. Tessler points to technologies that reduce sediment buildup at dams, along with dike breaches and coastal wetland restoration, as ways to deliver more sediment to deltas and increase retention. The authors also warn in the study, “The current emphasis on short-term solutions for the world’s deltas will greatly constrain options for designing sustainable solutions in the long term.”
The ability to anticipate floods and mobilize a timely response — increasingly a life-and-death matter as extreme weather events become more common — depends to a sobering extent on 10 satellites that measure precipitation and beam data to Earth, where governments and emergency response teams use them to guide efforts to protect people and property.
Even more sobering: Four of the 10 satellites have already exceeded their anticipated functional lifespan, with no plans in place to deploy replacements, according to a research article published earlier this year in the journal Environmental Research Letters.
The article reports on research by Cornell University engineering professor Patrick Reed and colleagues from Princeton University and the Aerospace Corporation. The researchers used information on the distribution and movement of water around the planet along with modeling to calculate the extent to which the current satellites are able to provide data needed for flood forecasting. They also calculated the coverage that would remain should all four aging satellites stop functioning, and explored what arrangement of satellites would be needed to optimally do the job. They found that even the current set of 10 satellites fails to provide sufficient coverage — with the biggest gaps, ironically, in South America, central and eastern Africa, and Asia, where populations are particularly vulnerable to floods. Not surprisingly, loss of the four antiquated satellites would make things even worse, with far less of the globe covered adequately over space and time.
The good news? With sufficient international collaboration, the addition of just two new satellites would boost flood forecasting capability even beyond that of the current fleet.
Noting that their results demonstrate that improvements in this rain-monitoring system could have huge benefits for humanity, the authors call for an international conversation aimed at resolving the current and anticipated satellite rainfall data deficit “to ensure that the global portfolio of space-based Earth rainfall observations are sufficient to manage the potential increased flood risks posed by climate change.”