New tool predicts geological movement and the flow of groundwater in old coalfields


A remote monitoring tool to help authorities manage public safety and environmental issues in recently abandoned coal mines has been developed by the University of Nottingham.

The tool uses satellite radar imagery to capture millimetre-scale measurements of changes in terrain height. Such measurements can be used to monitor and forecast groundwater levels and changes in geological conditions deep below the earth’s surface in former mining areas.

The tool was tested at a regional scale in the UK, a country with a long history of coal mining, but it has global implications given the worldwide decline in demand for coal in favour of more sustainable energy sources.

The method was implemented over the Nottinghamshire coalfields, which were abandoned as recently as 2015, when the last deep mine, Thoresby Colliery, shut its doors for good.

When deep mines are closed, the groundwater that was previously pumped to the surface to make mining safe is allowed to rise again until it is restored to its natural level, in a process called rebound.

The rebound of groundwater through former mine workings needs careful monitoring. Because it often contains contaminants, rising mine water can pollute waterways and drinking water supplies; if it rises too fast, it can also lead to localised flooding, renew mining subsidence, cause land uplift and reactivate geological faults. Such issues can cause costly and hazardous problems that need to be addressed before the land is repurposed.

The Coal Authority therefore needs detailed information on the rebound rate across the vast mine systems it manages so it knows exactly where to relax or increase pumping to control groundwater levels.

Measuring the rate and location of mine water rebound is therefore vital to effectively manage the environmental and safety risks in former coalfields, but difficult to achieve. Groundwater can flow in unanticipated directions via cavities within and between neighbouring collieries and discharge at the surface in areas not thought to be at risk.

In the past, predicting where mine water would flow relied heavily on mine plans (often inaccurate or incomplete documents, some more than a century old) and on borehole data. At roughly £20,000 to £350,000 each, boreholes are expensive to drill and are often sparsely situated across vast coalfields, leaving measurement gaps.

More recently, uplift, subsidence and other geological motion have been monitored by applying Interferometric Synthetic Aperture Radar (InSAR) to images acquired from radar satellites. However, this interferometry technique has historically worked only in urban areas, where the radar can pick up stable objects on the ground, such as buildings or rail tracks, that reflect the signal back to the satellite consistently.

This study uses an advanced InSAR technique, called Intermittent Small Baseline Subset (ISBAS), developed by the University of Nottingham and its spin-out company Terra Motion Ltd. InSAR uses stacks of satellite images of the same location taken every few days or weeks, making it possible to pick up even the slightest topographical changes over time. Uniquely, ISBAS InSAR can compute land deformation measurements over both urban and rural terrain. This is beneficial when mapping former mining areas, which are often rural. Over the Nottinghamshire coalfields, for example, the land cover is predominantly rural, with nearly 80 per cent comprising agricultural land, pastures and semi-natural areas.

Such a density of measurements meant study lead, University of Nottingham PhD student David Gee could develop a cost-effective and simple method to model groundwater rebound from the surface movement changes.

The study found a definitive link between ground motion measurements and rising mine water levels. Often land subsidence or uplift occurs as a result of changes in groundwater, where the strata act a little like a sponge, expanding when filling with fluid and contracting when drained.
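This sponge-like behaviour can be sketched as a simple linear poroelastic relationship, in which vertical ground motion is proportional to the change in groundwater head. The function and coefficient below are illustrative assumptions for intuition only, not the study's calibrated model:

```python
# Minimal sketch of a linear poroelastic response: surface displacement is
# proportional to the change in groundwater head. The skeletal storage
# coefficient here is an invented, illustrative value, not a calibrated one.

def surface_displacement_mm(head_change_m, skeletal_storage=5e-4):
    """Approximate vertical ground motion in millimetres for a change in
    groundwater head in metres. A positive head change gives uplift."""
    return head_change_m * skeletal_storage * 1000.0  # metres -> millimetres

# A 40 m rise in mine water level under these assumptions:
print(surface_displacement_mm(40.0))  # 20.0 (mm of uplift)
```

Under such a model, millimetre-scale InSAR measurements of uplift become a proxy for multi-metre changes in mine water level.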

With near-complete spatial coverage of the InSAR data, he could fill in the measurement gaps between boreholes to map the change in mine water levels across the whole coalfield. The model takes into account both geology and depth of groundwater to determine the true rate of rebound and help identify where problems associated with rebound may occur.

The findings have been published in a paper ‘Modelling groundwater rebound in recently abandoned coalfields using DInSAR’ in the journal Remote Sensing of Environment.

David Gee, who is based in the Nottingham Geospatial Institute at the University, said, “There are several coalfields currently undergoing mine water rebound in the UK, where surface uplift has been measured using InSAR. In the Nottinghamshire coalfields, the quantitative comparison between the deformation measured by the model and InSAR confirms that the heave is caused by the recovery of mine water.”

First, a forward model was generated to estimate surface uplift in response to measured changes in groundwater levels from monitoring boreholes. David calibrated and validated the model using ISBAS InSAR on ENVISAT and Sentinel-1 radar data. He then inverted the InSAR measurements to provide an estimate of the change in groundwater levels. Finally, the inverted rates were used to estimate the time it will take for groundwater to rebound and to identify areas of the coalfield most at risk of surface discharges.
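The inversion step can be illustrated with a toy linear model. Assuming the forward model is a simple proportionality between head change and uplift (the coefficient below is invented, not the published calibration), inverting it and projecting a rebound time looks roughly like this:

```python
# Toy illustration of the inversion workflow, not the published model.
# Assumes a linear forward model: uplift_mm = head_change_m * K * 1000,
# where K is an invented skeletal storage coefficient.

K = 5e-4  # illustrative coefficient linking head change (m) to uplift

def head_change_from_uplift(uplift_mm):
    """Invert the toy forward model: InSAR uplift (mm) -> head change (m)."""
    return (uplift_mm / 1000.0) / K

def years_until_rebound(remaining_head_m, annual_uplift_mm):
    """Estimate time for groundwater to recover, given the remaining head
    deficit and the annual uplift rate measured by InSAR."""
    annual_rise_m = head_change_from_uplift(annual_uplift_mm)
    return remaining_head_m / annual_rise_m

# 5 mm/yr of uplift implies a 10 m/yr head rise under these assumptions,
# so an 80 m head deficit would take about 8 years to recover.
print(years_until_rebound(80.0, 5.0))  # 8.0
```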

“InSAR measurements, when combined with modelling, can assist with the characterisation of the hydrogeological processes occurring at former mining sites. The technique has the potential to make a significant contribution to the progressive abandonment strategy of recently closed coalfields,” David said.

The InSAR findings offer a supplementary source of data on groundwater changes that augment the borehole measurements. It means monitoring can be done remotely so is less labour-intensive for national bodies such as the Environment Agency (which manages hazards such as flooding, pollution and contaminated land) and the Coal Authority (which has a mandate to manage the legacy of underground coal mining in terms of public safety and subsidence).

The model has already flagged that some parts of the coalfield are not behaving as previously predicted, which could influence existing remediation plans.

David explains, “The deepest part of the North Nottinghamshire coalfield, for example, is not rebounding as expected which suggests that the mine plans here might not be completely accurate. The stability is confirmed by the InSAR and the model — future monitoring of this area will help to identify if or when rebound does eventually occur.

“Next steps for the project are to integrate our results into an existing screening tool developed by the Environment Agency and Coal Authority to help local planning authorities, developers and consultants design sustainable drainage systems in coalfield areas. The initial results, generated at a regional scale, have the potential to be scaled to all coalfields in the UK, with the aid of national InSAR maps,” adds David.

Luke Bateson, Senior Remote Sensing Geologist from the British Geological Survey, said, “InSAR data offers a fantastic opportunity to reveal how the ground is moving, however we need studies such as David’s in order to understand what these ground motions relate to and what they mean. David’s study not only provides this understanding but also provides a tool which can convert InSAR ground motions into information on mine water levels that can be used to make informed decisions.”

Dr Andrew Sowter, Chief Technical Officer at Terra Motion Ltd, explains, “Studies like this demonstrate the value to us, as a small commercial company, in investing in collaborative work with the University. We now have a remarkable, validated, result that is based upon our ISBAS InSAR method and demonstrably supported by a range of important stakeholders. This will enable us to further penetrate the market in a huge range of critical applications hitherto labelled as difficult for more conventional InSAR techniques, particularly those markets relating to underground fluid extraction and injection in more temperate, vegetated zones.”


Story Source:

Materials provided by University of Nottingham. Note: Content may be edited for style and length.


Journal Reference:

  1. David Gee, Luke Bateson, Stephen Grebby, Alessandro Novellino, Andrew Sowter, Lee Wyatt, Stuart Marsh, Roy Morgenstern, Ahmed Athab. Modelling groundwater rebound in recently abandoned coalfields using DInSAR. Remote Sensing of Environment, 2020; 249: 112021. DOI: 10.1016/j.rse.2020.112021

FOR MORE INFORMATION: University of Nottingham. “New tool predicts geological movement and the flow of groundwater in old coalfields.” ScienceDaily. ScienceDaily, 16 November 2020. <www.sciencedaily.com/releases/2020/11/201116112934.htm>.

In a warming world, Cape Town’s ‘Day Zero’ drought won’t be an anomaly


Today, the lakes around Cape Town are brimming with water, but it was only a few years ago that South Africa’s second-most populous city made global headlines as a multi-year drought depleted its reservoirs, impacting millions of people. That kind of extreme event may become the norm, researchers now warn.

Using new high-resolution simulations, researchers from Stanford University and the National Oceanic and Atmospheric Administration (NOAA) concluded that human-caused climate change made the “Day Zero” drought in southwestern South Africa — named after the day, barely averted, when Cape Town’s municipal water supply would need to be shut off — five to six times more likely. Furthermore, such extreme events could go from being rare to common events by the end of the century, according to the study, published November 9 in the journal Proceedings of the National Academy of Sciences.
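The "five to six times more likely" figure is a risk ratio: the probability of a Day Zero-scale drought in simulations of today's climate, divided by its probability in a counterfactual climate without human influence. A minimal sketch of that calculation, with invented event counts:

```python
def risk_ratio(events_factual, n_factual, events_counterfactual, n_counterfactual):
    """Ratio of event probabilities between a factual (human-influenced)
    ensemble and a counterfactual (no-human-influence) ensemble."""
    p_factual = events_factual / n_factual
    p_counterfactual = events_counterfactual / n_counterfactual
    return p_factual / p_counterfactual

# Invented counts for illustration: 30 Day Zero-scale droughts in 1,000
# simulated present-climate years vs 6 in 1,000 counterfactual years.
print(risk_ratio(30, 1000, 6, 1000))  # 5.0
```

In attribution studies, the ensembles would come from many simulated years of a climate model such as SPEAR run under the two forcing scenarios.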

“In a way, the ‘Day Zero’ drought might have been a sort of taste of what the future may be,” said lead author Salvatore Pascale, a research scientist at Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth). “In the worst-case scenario, events like the ‘Day Zero’ drought may become about 100 times more likely than what they were in the early 20th-century world.”

Factoring multiple climate scenarios

Using a climate modeling system known as the Seamless System for Prediction and EArth System Research (SPEAR), the researchers simulated the response of atmospheric circulation patterns to increasing levels of carbon dioxide.

The model found that in a high greenhouse gas-emissions scenario, a devastating drought like the one that crippled Cape Town could impact the region two or three times in a decade. Even in an intermediate-emissions scenario, the risk of multi-year droughts that are more extreme and last longer than the “Day Zero” drought will increase by the end of the century.

The new research uses higher resolution models than were previously available and supports the conclusions of past studies that projected an increase in drought risk. The findings underscore the area’s sensitivity to further emissions and the need for aggressive water management.

“The information we can provide now with these new tools is much more precise,” Pascale said. “We can say with a higher degree of confidence that the role anthropogenic climate change has had so far has been quite large.”

Preparing for the future

Other parts of the world with similar climates to South Africa — including California, southern Australia, southern Europe and parts of South America — could experience their own ‘Day Zero’ droughts in the future, according to the researchers. “Analysis like this should be conducted for thorough water risk management,” said co-author Sarah Kapnick, a research physical scientist and deputy division leader at NOAA’s Geophysical Fluid Dynamics Laboratory.

“Given the dramatic shift in multi-year drought risk, this work also serves as an example for other regions to explore their changing drought risks,” Kapnick said. “Emerging drought risks may not be on the radar of managers in other regions in the world who have not experienced a recent rare drought event.”

Meteorological droughts, or rainfall deficits, like the one that affected Cape Town have high societal and economic impacts. According to estimates, lower crop yields from the “Day Zero” drought caused an economic loss of about $400 million, in addition to tens of thousands of jobs.

“This study shows these events will be more likely in the future depending on how energetic we are in addressing the climate problem,” Pascale said. “It can be either catastrophic or just a little bit better, but still worse than what it is now — this is trying to give some indication about what the future might look like.”

Three consecutive years of dry winters from 2015-17 in southwestern South Africa led to the severe water shortage from 2017-18. Cape Town never actually reached “Day Zero,” in part because authorities implemented water restrictions throughout the period, banning outdoor and non-essential water use, encouraging toilet flushing with grey water and eventually limiting consumption to about 13 gallons per person per day in February 2018. That level of conservation was foreign to many residents of the coastal tourist destination and would likely be jarring to many in the U.S., where the average person goes through 80 to 100 gallons per day, according to the United States Geological Survey (USGS).

“I’m sure that many Cape Town residents have forgotten what happened now that lakes and water reservoirs are back to normal,” Pascale said. “But this is the moment to rethink the old way of managing water for a future when there will be less water available.”

Thomas Delworth and William Cooke from NOAA are co-authors on the study.

The research was supported by NOAA and Stanford University.


Story Source:

Materials provided by Stanford University. Original written by Danielle Torrent Tucker. Note: Content may be edited for style and length.


Journal Reference:

  1. Salvatore Pascale, Sarah B. Kapnick, Thomas L. Delworth, William F. Cooke. Increasing risk of another Cape Town “Day Zero” drought in the 21st century. Proceedings of the National Academy of Sciences, 2020; 202009144. DOI: 10.1073/pnas.2009144117

FOR MORE INFORMATION: Stanford University. “In a warming world, Cape Town’s ‘Day Zero’ drought won’t be an anomaly.” ScienceDaily. ScienceDaily, 9 November 2020. <www.sciencedaily.com/releases/2020/11/201109152241.htm>.

Scientists have discovered an ancient lake bed deep beneath the Greenland ice


Scientists have detected what they say are the sediments of a huge ancient lake bed sealed more than a mile under the ice of northwest Greenland — the first-ever discovery of such a sub-glacial feature anywhere in the world. Apparently formed at a time when the area was ice-free but now completely frozen in, the lake bed may be hundreds of thousands or millions of years old, and contain unique fossil and chemical traces of past climates and life. Scientists consider such data vital to understanding what the Greenland ice sheet may do in coming years as climate warms, and thus the site makes a tantalizing target for drilling. A paper describing the discovery is in press at the journal Earth and Planetary Science Letters.

“This could be an important repository of information, in a landscape that right now is totally concealed and inaccessible,” said Guy Paxman, a postdoctoral researcher at Columbia University’s Lamont-Doherty Earth Observatory and lead author of the report. “We’re working to try and understand how the Greenland ice sheet has behaved in the past. It’s important if we want to understand how it will behave in future decades.” The ice sheet, which has been melting at an accelerating pace in recent years, contains enough water to raise global sea levels by about 24 feet.

The researchers mapped out the lake bed by analyzing data from airborne geophysical instruments that can read signals that penetrate the ice and provide images of the geologic structures below. Most of the data came from aircraft flying at low altitude over the ice sheet as part of NASA’s Operation IceBridge.

The team says the basin once hosted a lake covering about 7,100 square kilometers (2,700 square miles), about the size of the U.S. states of Delaware and Rhode Island combined. Sediments in the basin, which is shaped vaguely like a meat cleaver, appear to be as much as 1.2 kilometers (three quarters of a mile) thick. The geophysical images show a network of at least 18 apparent onetime stream beds carved into the adjoining bedrock in a sloping escarpment to the north that must have fed the lake. The images also show at least one apparent outlet stream to the south. The researchers calculate that the water depth in the onetime lake ranged from about 50 meters to 250 meters (a maximum of about 800 feet).

In recent years, scientists have found existing subglacial lakes in both Greenland and Antarctica, containing liquid water sandwiched in the ice, or between bedrock and ice. This is the first time anyone has spotted a fossil lake bed, apparently formed when there was no ice, and then later covered over and frozen in place. There is no evidence that the Greenland basin contains liquid water today.

Paxman says there is no way to tell how old the lake bed is. Researchers say it is likely that ice has periodically advanced and retreated over much of Greenland for the last 10 million years, and maybe going back as far as 30 million years. A 2016 study led by Lamont-Doherty geochemist Joerg Schaefer has suggested that most of the Greenland ice may have melted for one or more extended periods some time in the last million years or so, but the details of that are sketchy. This particular area could have been repeatedly covered and uncovered, Paxman said, leaving a wide range of possibilities for the lake’s history. In any case, Paxman says, the substantial depth of the sediments in the basin suggests that they must have built up during ice-free times over hundreds of thousands or millions of years.

“If we could get at those sediments, they could tell us when the ice was present or absent,” he said.

The researchers assembled a detailed picture of the lake basin and its surroundings by analyzing radar, gravity and magnetic data gathered by NASA. Ice-penetrating radar provided a basic topographic map of the earth’s surface underlying the ice. This revealed the outlines of the smooth, low-lying basin, nestled among higher-elevation rocks. Gravity measurements showed that the material in the basin is less dense than the surrounding hard, metamorphic rocks — evidence that it is composed of sediments washed in from the sides. Measurements of magnetism (sediments are less magnetic than solid rock) helped the team map the depths of the sediments.

The researchers say the basin may have formed along a now long-dormant fault line, when the bedrock stretched out and formed a low spot. Alternatively, but less likely, previous glaciations may have carved out the depression, leaving it to fill with water when the ice receded.

What the sediments might contain is a mystery. Material washed out from the edges of the ice sheet has been found to contain the remains of pollen and other materials, suggesting that Greenland may have undergone warm periods during the last million years, allowing plants and maybe even forests to take hold. But the evidence is not conclusive, in part because it is hard to date such loose materials. The newly discovered lake bed, in contrast, could provide an intact archive of fossils and chemical signals dating to a so-far unknown distant past.

The basin “may therefore be an important site for future sub-ice drilling and the recovery of sediment records that may yield valuable insights into the glacial, climatological and environmental history” of the region, the researchers write. With the top of the sediments lying 1.8 kilometers below the current ice surface (1.1 miles), such drilling would be daunting, but not impossible. In the 1990s, researchers penetrated almost 2 miles into the summit of the Greenland ice sheet and recovered several feet of bedrock — at the time, the deepest ice core ever drilled. The feat, which took five years, has not since been repeated in Greenland, but a new project aimed at reaching shallower bedrock in another part of northwest Greenland is being planned for the next few years.

The study was coauthored by Jacqueline Austermann and Kirsty Tinto, both also based at Lamont-Doherty Earth Observatory. The research was supported by the U.S. National Science Foundation.


Story Source:

Materials provided by Earth Institute at Columbia University. Original written by Kevin Krajick. Note: Content may be edited for style and length.


Journal Reference:

  1. Guy J.G. Paxman, Jacqueline Austermann, Kirsty J. Tinto. A fault-bounded palaeo-lake basin preserved beneath the Greenland Ice Sheet. Earth and Planetary Science Letters, 2020; 116647. DOI: 10.1016/j.epsl.2020.116647

FOR MORE INFORMATION: Earth Institute at Columbia University. “Scientists have discovered an ancient lake bed deep beneath the Greenland ice: Inaccessible for now, unique site may hold secrets of past.” ScienceDaily. ScienceDaily, 10 November 2020. <www.sciencedaily.com/releases/2020/11/201110133145.htm>.

The dangers of collecting drinking water


Collecting drinking water in low- and middle-income countries can cause serious injury, particularly for women, according to new research from the University of East Anglia.

A new international study published in BMJ Global Health reveals dangers including falls, traffic accidents, animal attacks, and fights, which can result in broken bones, spinal injuries, lacerations, and other physical injuries.

And women are most likely to sustain such injuries, highlighting the social and gender inequities of a hidden global health challenge.

Dr Jo-Anne Geere, from UEA’s School of Health Sciences, said: “Millions of people don’t have the luxury of clean drinking water at their home, and they face many dangers before the water even touches their lips.

“Global research on water has largely focused on scarcity and health issues related to what is in the water, but the burden and risks of how water is retrieved and carried has been overlooked until now.

“We wanted to better understand the true burden of water insecurity.”

The new study was led by Northwestern University in the US, in collaboration with UEA, the University of Miami and the Household Water Insecurity Experiences Research Coordination Network (HWISE RCN).

The research team used a large global dataset to understand what factors might predict water-fetching injuries. The work draws on a survey of 6,291 randomly selected households across 24 sites in 21 low- and middle-income countries in Asia, Africa, Latin America, and the Caribbean.

They found that 13 per cent of the respondents reported some sort of injury while collecting water, and that women were twice as likely to be hurt as men.

Dr Sera Young, from Northwestern University, said: “Thirteen percent is a big number, but it is probably an underestimate. It’s highly likely that more people would have reported injuries if the survey had more detailed questions.”

Prof Paul Hunter, from UEA’s Norwich Medical School, said: “This reinforces how the burden of water scarcity disproportionately falls on women, on rural populations, and on those who do not have water sources close to home.

“It highlights the importance of safe interventions that prioritise personal physical safety alongside traditional global indicators of water, sanitation, and hygiene.”

The researchers say that keeping track of such safety measures — in addition to the usual measures of water quality and access — could help better assess progress towards the United Nations’ Sustainable Development Goal 6.1, which sets out “to achieve universal and equitable access to safe and affordable drinking water for all” by 2030.

Dr Vidya Venkataramanan, also from Northwestern University, said: “It seems likely that water-fetching can contribute considerably to the global Water, Sanitation and Hygiene (WaSH) burden, but it usually goes unmeasured because we typically think about access and water quality. It is, therefore, a greatly underappreciated, nearly invisible public health challenge.

“It’s really important that data on water-fetching injuries are systematically collected so that we can know the true burden of water insecurity. Currently, all of the broken bones, spinal injuries, lacerations and other physical injuries are not accounted for in calculations about the burden of water insecurity.”


Story Source:

Materials provided by University of East Anglia. Note: Content may be edited for style and length.


Journal Reference:

  1. Vidya Venkataramanan, Jo-Anne L Geere, Benjamin Thomae, Justin Stoler, Paul R Hunter, Sera L Young. In pursuit of ‘safe’ water: the burden of personal injury from water fetching in 21 low-income and middle-income countries. BMJ Global Health, 2020; 5 (10): e003328. DOI: 10.1136/bmjgh-2020-003328

FOR MORE INFORMATION: University of East Anglia. “The dangers of collecting drinking water.” ScienceDaily. ScienceDaily, 4 November 2020. <www.sciencedaily.com/releases/2020/11/201104102213.htm>.

Desalination: Industrial-strength brine, meet your kryptonite


A thin coating of the 2D nanomaterial hexagonal boron nitride is the key ingredient in a cost-effective technology developed by Rice University engineers for desalinating industrial-strength brine.

More than 1.8 billion people live in countries where fresh water is scarce. In many arid regions, seawater or salty groundwater is plentiful but costly to desalinate. In addition, many industries pay high disposal costs for wastewater with high salt concentrations that cannot be treated using conventional technologies. Reverse osmosis, the most common desalination technology, requires greater and greater pressure as the salt content of water increases and cannot be used to treat water that is extremely salty, or hypersaline.

Hypersaline water, which can contain 10 times more salt than seawater, is an increasingly important challenge for many industries. Some oil and gas wells produce it in large volumes, for example, and it is a byproduct of many desalination technologies that produce both freshwater and concentrated brine. Increasing water consciousness across all industries is also a driver, said Rice’s Qilin Li, co-corresponding author of a study about Rice’s desalination technology published in Nature Nanotechnology.

“It’s not just the oil industry,” said Li, co-director of the Rice-based Nanotechnology Enabled Water Treatment Center (NEWT). “Industrial processes, in general, produce salty wastewater because the trend is to reuse water. Many industries are trying to have ‘closed loop’ water systems. Each time you recover freshwater, the salt in it becomes more concentrated. Eventually the wastewater becomes hypersaline and you either have to desalinate it or pay to dispose of it.”

Conventional technology to desalinate hypersaline water has high capital costs and requires extensive infrastructure. NEWT, a National Science Foundation (NSF) Engineering Research Center (ERC) headquartered at Rice’s Brown School of Engineering, is using the latest advances in nanotechnology and materials science to create decentralized, fit-for-purpose technologies for treating drinking water and industrial wastewater more efficiently.

One of NEWT’s technologies is an off-grid desalination system that uses solar energy and a process called membrane distillation. When the brine is flowed across one side of a porous membrane, it is heated up at the membrane surface by a photothermal coating that absorbs sunlight and generates heat. When cold freshwater is flowed across the other side of the membrane, the difference in temperature creates a pressure gradient that drives water vapor through the membrane from the hot to the cold side, leaving salts and other nonvolatile contaminants behind.
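The driving force in membrane distillation is the difference in water vapour pressure between the hot and cold sides of the membrane. As a rough illustration of why the temperature difference matters, using the standard Antoine equation for water (the temperatures below are arbitrary examples, not the NEWT system's operating conditions):

```python
def water_vapor_pressure_kpa(t_celsius):
    """Saturation vapor pressure of water via the Antoine equation
    (constants in the mmHg/Celsius form, valid roughly 1-100 C)."""
    a, b, c = 8.07131, 1730.63, 233.426
    p_mmhg = 10 ** (a - b / (c + t_celsius))
    return p_mmhg * 0.133322  # mmHg -> kPa

# Arbitrary example temperatures for the hot (brine) and cold (distillate) sides
hot_side, cold_side = 70.0, 25.0
driving_force = water_vapor_pressure_kpa(hot_side) - water_vapor_pressure_kpa(cold_side)
# ~31 kPa vs ~3 kPa: the larger the temperature difference, the larger the
# vapor-pressure gradient, and hence the more vapor crosses the membrane.
```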

A large difference in temperature on each side of the membrane is the key to membrane desalination efficiency. In NEWT’s solar-powered version of the technology, light-activated nanoparticles attached to the membrane capture all the necessary energy from the sun, resulting in high energy efficiency. Li is working with a NEWT industrial partner to develop a version of the technology that can be deployed for humanitarian purposes. But unconcentrated solar power alone isn’t sufficient for high-rate desalination of hypersaline brine, she said.

“The energy intensity is limited with ambient solar energy,” said Li, a professor of civil and environmental engineering. “The energy input is only one kilowatt per square meter, and the production rate of water is slow for large-scale systems.”

Adding heat to the membrane surface can produce exponential improvements in the volume of freshwater that each square foot of membrane can produce each minute, a measure known as flux. But saltwater is highly corrosive, and it becomes more corrosive when heated. Traditional metallic heating elements get destroyed quickly, and many nonmetallic alternatives fare little better or have insufficient conductivity.

“We were really looking for a material that would be highly electrically conductive and also support large current density without being corroded in this highly salty water,” Li said.

The solution came from study co-authors Jun Lou and Pulickel Ajayan in Rice’s Department of Materials Science and NanoEngineering (MSNE). Lou, Ajayan and NEWT postdoctoral researchers and study co-lead authors Kuichang Zuo and Weipeng Wang, and study co-author and graduate student Shuai Jia developed a process for coating a fine stainless steel mesh with a thin film of hexagonal boron nitride (hBN).

Boron nitride’s combination of chemical resistance and thermal conductivity has made its ceramic form a prized asset in high-temperature equipment, but hBN, the atom-thick 2D form of the material, is typically grown on flat surfaces.

“This is the first time this beautiful hBN coating has been grown on an irregular, porous surface,” Li said. “It’s a challenge, because anywhere you have a defect in the hBN coating, you will start to have corrosion.”

Jia and Wang used a modified chemical vapor deposition (CVD) technique to grow dozens of layers of hBN on a nontreated, commercially available stainless steel mesh. The technique extended previous Rice research into the growth of 2D materials on curved surfaces, which was supported by the Center for Atomically Thin Multifunctional Coatings, or ATOMIC. The ATOMIC Center is also hosted by Rice and supported by the NSF’s Industry/University Cooperative Research Program.

The researchers showed that the wire mesh coating, which was only about one 10-millionth of a meter thick, was sufficient to encase the interwoven wires and protect them from the corrosive forces of hypersaline water. The coated wire mesh heating element was attached to a commercially available polyvinylidene difluoride membrane that was rolled into a spiral-wound module, a space-saving form used in many commercial filters.

In tests, researchers powered the heating element with voltage at a household frequency of 50 hertz and power densities as high as 50 kilowatts per square meter. At maximum power, the system produced a flux of more than 42 kilograms of water per square meter of membrane per hour — more than 10 times greater than ambient solar membrane distillation technologies — at an energy efficiency much higher than existing membrane distillation technologies.
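As a rough sanity check, the reported power density and flux together imply a specific energy input per kilogram of freshwater. This is a back-of-the-envelope sketch that assumes both figures refer to the same maximum-power operating point, which the article does not state explicitly:

```python
# Back-of-envelope: specific energy implied by the reported figures.
# Assumes the 50 kW/m^2 power density and the 42 kg/m^2/h flux were
# measured at the same operating point (an assumption, not stated).
power_density_kw_m2 = 50.0   # electrical power per square meter of membrane
flux_kg_m2_h = 42.0          # freshwater produced per square meter per hour

energy_kwh_per_kg = power_density_kw_m2 / flux_kg_m2_h   # ~1.19 kWh/kg
energy_mj_per_kg = energy_kwh_per_kg * 3.6               # ~4.29 MJ/kg

# For scale, simply vaporizing water takes about 2.26 MJ/kg (latent heat),
# so membrane distillation systems lean on internal heat recovery to push
# effective efficiency beyond what this single-pass figure suggests.
print(round(energy_mj_per_kg, 2))
```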

Li said the team is looking for an industry partner to scale up the CVD coating process and produce a larger prototype for small-scale field tests.

“We’re ready to pursue some commercial applications,” she said. “Scaling up from the lab-scale process to a large 2D CVD sheet will require external support.”

NEWT is a multidisciplinary engineering research center launched in 2015 by Rice, Yale University, Arizona State University and the University of Texas at El Paso that was recently awarded a five-year renewal grant for $16.5 million by the National Science Foundation. NEWT works with industry and government partners to produce transformational technology and train engineers who are ready to lead the global economy.

Ajayan is Rice’s Benjamin M. and Mary Greenwood Anderson Professor in Engineering, MSNE department chair and a professor of chemistry. Lou is a professor and associate department chair in MSNE and a professor of chemistry.


Story Source:

Materials provided by Rice University. Note: Content may be edited for style and length.


Journal Reference:

  1. Kuichang Zuo, Weipeng Wang, Akshay Deshmukh, Shuai Jia, Hua Guo, Ruikun Xin, Menachem Elimelech, Pulickel M. Ajayan, Jun Lou, Qilin Li. Multifunctional nanocoated membranes for high-rate electrothermal desalination of hypersaline waters. Nature Nanotechnology, 2020; DOI: 10.1038/s41565-020-00777-0

FOR MORE INFORMATION: Rice University. “Desalination: Industrial-strength brine, meet your kryptonite: Boron nitride coating is key ingredient in hypersaline desalination technology.” ScienceDaily. ScienceDaily, 3 November 2020. <www.sciencedaily.com/releases/2020/11/201103172609.htm>.

From nitrate crisis to phosphate crisis?


The aim of the EU Nitrates Directive is to reduce nitrates leaking into the environment in order to prevent pollution of water supplies. The widely accepted view is that this will also help protect threatened plant species, which can be damaged by high levels of nutrients such as nitrates in the soil and water. However, an international team of researchers, including scientists from the Universities of Göttingen, Utrecht and Zurich, has discovered that many threatened plant species will actually suffer because of this policy. The results were published in Nature Ecology & Evolution.

Nitrogen, in the form of nitrates, is an important nutrient for plant species. However, an overabundance can harm plant biodiversity: plant species that thrive on high levels of nitrates can displace other species adapted to low levels. “Despite this, it is not enough simply to reduce the level of nitrates,” says co-author Julian Schrader, researcher in the Biodiversity, Macroecology and Biogeography Group at the University of Göttingen. “Such a policy can even backfire and work against the protection of threatened plant species if other nutrients are not taken into account.”

In addition to nitrogen, plants also need phosphorus and potassium to grow. The researchers discovered that the ratio of these nutrients in the soil is important. They showed that when the concentration of nitrogen in the soil is reduced without simultaneously reducing the concentration of phosphates, plant species that are already threatened could disappear.

“Many threatened plant species in Europe are found in places where phosphate concentrations are low,” Schrader explained. If nitrogen concentrations decrease as a result of effective environmental policies, the relative concentration of phosphorus increases. This means that threatened species come under even more pressure. Threatened species are particularly sensitive to changes in nutrient concentrations and should, according to the researchers, be better protected.

The results of this research have significant consequences for the current EU Nitrates Directive. The authors advocate the introduction of an EU Phosphate Directive in addition to the existing Nitrates Directive.


Story Source:

Materials provided by University of Göttingen. Note: Content may be edited for style and length.


Journal Reference:

  1. Martin Joseph Wassen, Julian Schrader, Jerry van Dijk, Maarten Boudewijn Eppinga. Phosphorus fertilization is eradicating the niche of northern Eurasia’s threatened plant species. Nature Ecology & Evolution, 2020; DOI: 10.1038/s41559-020-01323-w

FOR MORE INFORMATION: University of Göttingen. “From nitrate crisis to phosphate crisis?.” ScienceDaily. ScienceDaily, 3 November 2020. <www.sciencedaily.com/releases/2020/11/201103104727.htm>.

Waste not, want not: Recycled water proves fruitful for greenhouse tomatoes


In the driest state in the driest continent in the world, South Australian farmers are acutely aware of the impact of water shortages and drought. So, when it comes to irrigation, knowing which method works best is vital for sustainable crop development.

Now, new research from the University of South Australia shows that water quality and deficit irrigation schemes each have significant effects on crop development, yield and water productivity — with recycled wastewater achieving the best overall results.

In tests of different water sources on greenhouse-grown tomatoes, recycled wastewater outperformed both groundwater and a 50:50 mix of groundwater and recycled wastewater.

Researchers also confirmed that deficit irrigation (irrigation that limits watering in a controlled way) performs best at 80 per cent of field capacity, ensuring maximum water efficiency while maintaining excellent crop growth and yield.

Lead researcher and UniSA PhD candidate, Jeet Chand, says that the findings will provide farmers with valuable insights for productive, profitable and sustainable agricultural management.

“Water is an extremely valuable commodity in dry and arid farming regions, making efficient irrigation strategies and alternative water sources essential for agriculture production,” Chand says.

“Deficit irrigation is a strategy commonly used by farmers to minimise water use while maximising crop productivity, but finding the most effective balance for greenhouse-grown produce can be tricky.

“In our research we tested optimum water deficit levels for greenhouse-grown tomatoes, showing that water at 80 per cent of field capacity is the superior choice for optimal tomato growth in the Northern Adelaide Plains.

“These results were enhanced by the use of recycled wastewater, which not only fares well for plants (by delivering additional nutrients) and for farmers (by reducing the need for fertilizer) but is also great for the environment.”

The Northern Adelaide Plains represents 90 per cent of tomato production in South Australia and contains the largest area of greenhouse coverage in the whole of Australia.

This study simulated tomato growing conditions in this region across the most popular growing season and over two years. It tested groundwater, recycled wastewater and a 50:50 mix of both, across four irrigation scenarios with soil moisture levels at 60, 70, 80 and 100 per cent of field capacity.

The highest growth levels were unsurprisingly achieved through 100 per cent field capacity, but mild water stress (80 per cent water capacity) delivered positive water efficiency without significant yield reduction.
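To make the "80 per cent of field capacity" idea concrete, here is a minimal sketch of the kind of calculation a deficit-irrigation schedule rests on. The function name, moisture values and root-zone depth are hypothetical illustrations, not figures from the study:

```python
def deficit_irrigation_depth_mm(field_capacity_vwc, current_vwc,
                                target_fraction, root_zone_mm):
    """Depth of water (mm) needed to refill the root zone to a target
    fraction of field capacity. Moisture values are volumetric
    (m^3 of water per m^3 of soil). Hypothetical helper for illustration."""
    target_vwc = field_capacity_vwc * target_fraction
    deficit = max(0.0, target_vwc - current_vwc)
    return deficit * root_zone_mm

# Example: field capacity 0.30 v/v, soil currently at 0.18 v/v,
# refilling to 80% of field capacity over a 600 mm root zone.
print(deficit_irrigation_depth_mm(0.30, 0.18, 0.80, 600))
```

Watering to 80 per cent of field capacity rather than 100 per cent simply shrinks the target moisture level, which is where the water savings come from.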

While the results are positive for the tomato industry, Chand says there’s also good news for the home-gardening tomato aficionado.

“If you’re in one of the lucky areas that has access to a verified source of recycled water, then your garden can also benefit from its additional nutrients,” Chand says.

“Remember, there is a significant difference between grey water — that is, water from the bath or dishes — and recycled water, so be sure to check your water source with your supplier.

“But if you have access to recycled water, great! Your tomatoes will grow like crazy, and you’ll be the envy of all your neighbours.”


Story Source:

Materials provided by University of South Australia. Note: Content may be edited for style and length.


Journal Reference:

  1. Jeet Chand, Guna Hewa, Ali Hassanli, Baden Myers. Evaluation of Deficit Irrigation and Water Quality on Production and Water Productivity of Tomato in Greenhouse. Agriculture, 2020; 10 (7): 297. DOI: 10.3390/agriculture10070297

FOR MORE INFORMATION: University of South Australia. “Waste not, want not: Recycled water proves fruitful for greenhouse tomatoes.” ScienceDaily. ScienceDaily, 30 October 2020. <www.sciencedaily.com/releases/2020/10/201030111835.htm>.

Researchers devise new method to get the lead out


Commercially sold water filters do a good job of making sure any lead from residential water pipes does not make its way into water used for drinking or cooking.

Filters do not do a good job, however, of letting the user know how much lead was captured.

Until now, when a researcher, public works department or an individual wanted to know how much lead was in tap water, there wasn’t a great way to find out. Usually, a scientist would look at a one-liter sample taken from a faucet.

Researchers in the McKelvey School of Engineering at Washington University in St. Louis have devised a new method that allows them to extract the lead from these “point-of-use” filters, providing a clearer picture of what’s coming out of the faucet.

And they can do it in less than an hour.

Their research was published this past summer in the journal Environmental Science: Water Research & Technology.

The problem with just collecting a one-liter sample is that “We don’t know how long it was in contact with that lead pipe or if it just flowed through quickly. Everyone’s water use patterns are different,” said Daniel Giammar, the Walter E. Browne Professor of Environmental Engineering in the Department of Energy, Environmental & Chemical Engineering.

“Collecting a single liter is not a good way of assessing how much lead a resident would be exposed to if not using a filter,” he said. “To do that, you’d need to see all that the person was drinking or using for cooking.”

A better method would be to collect the lead from a filter that had been in use long enough to provide an accurate picture of household water use. Most of the commercial filters for sale at any major retail store will last for about 100 gallons, nearly 400 times the volume of a typical one-liter sample.
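The volume comparison is a simple unit conversion (assuming US gallons):

```python
# Convert the filter's rated lifetime volume to liters and compare it
# with a single one-liter grab sample. Assumes US gallons.
LITERS_PER_US_GALLON = 3.785411784

filter_life_gal = 100
filter_life_l = filter_life_gal * LITERS_PER_US_GALLON  # ~378.5 L

ratio = filter_life_l / 1.0  # relative to a one-liter sample
print(round(filter_life_l, 1), round(ratio))
```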

The idea to use filters in this way isn’t new, but it hasn’t been done very efficiently, precisely because the filters do such a good job at holding onto the lead.

Giammar said he had probably heard about this method previously, but a light went off after a conversation about indoor air quality. He had been talking to a professor at another institution who was monitoring indoor air quality using a box that sucked in air, collecting contaminants in a tube. The user can then remove the collection tube and send it to a lab to be analyzed.

“I said, ‘Let’s do what you’re doing with air’: pull water through a filter, collect contaminants and then analyze them,” Giammar said. “Then we realized they already make and sell these filters.”

Liberating the lead

Point-of-use filters are typically made of a block of activated carbon that appears solid, almost like a lump of coal. The water filters through tiny pores in the carbon; the carbon binds to the lead, trapping it before the water flows out of the faucet.

“If you want to take the lead out of the water, you need something that is really good at strongly holding on to it,” which carbon is, Giammar said.

“So we had to hit it with something even stronger to pull that lead off.”

The solution? Acid.

Working with senior Elizabeth Johnson, graduate student Weiyi Pan tried different methods, but ultimately discovered that slowly passing an acidic solution through the filter would liberate 100% of the lead.

The entire process took about two liters of acid and about a half hour.

In the near future, Giammar sees the filters being put to use for research, as opposed to being put in the trash.

“The customer has a filter because they want to remove lead from the water. The water utility or researcher wants to know how much lead is in the home’s water over some average period of time,” Giammar said. “Even if the customer doesn’t care, they’ve got this piece of data that usually they’d just throw away.”

“We’d rather them send it to their utility service, or to us, and we can use it to get information.”


Story Source:

Materials provided by Washington University in St. Louis. Original written by Brandie Jefferson. Note: Content may be edited for style and length.


Journal Reference:

  1. Weiyi Pan, Elizabeth R. Johnson, Daniel E. Giammar. Accumulation on and extraction of lead from point-of-use filters for evaluating lead exposure from drinking water. Environmental Science: Water Research & Technology, 2020; 6 (10): 2734. DOI: 10.1039/d0ew00496k

FOR MORE INFORMATION: Washington University in St. Louis. “Researchers devise new method to get the lead out: Now they can better measure lead in your tap water.” ScienceDaily. ScienceDaily, 30 October 2020. <www.sciencedaily.com/releases/2020/10/201030111744.htm>.

Expect more mega-droughts


Mega-droughts — droughts that last two decades or longer — are tipped to increase thanks to climate change, according to University of Queensland-led research.

UQ’s Professor Hamish McGowan said the findings suggested climate change would lead to increased water scarcity, reduced winter snow cover, more frequent bushfires and wind erosion.

The revelation came after an analysis of geological records from the Eemian Period — 129,000 to 116,000 years ago — which offered a proxy of what we could expect in a hotter, drier world.

“We found that, in the past, a similar amount of warming has been associated with mega-drought conditions all over south eastern Australia,” Professor McGowan said.

“These drier conditions prevailed for centuries, sometimes for more than 1000 years, with El Niño events most likely increasing their severity.”

The team engaged in paleoclimatology — the study of past climates — to see what the world will look like as a result of global warming over the next 20 to 50 years.

“The Eemian Period is the most recent in Earth’s history when global temperatures were similar, or possibly slightly warmer than present,” Professor McGowan said.

“The ‘warmth’ of that period was in response to orbital forcing, the effect on climate of slow changes in the tilt of the Earth’s axis and shape of the Earth’s orbit around the sun.

[Image: Professor Hamish McGowan, wearing a headlamp, crawls through a small cave to reach stalagmites around 120 metres below the surface in the Grotto Cave, NSW.]

“In modern times, heating is being caused by high concentrations of greenhouse gases, though this period is still a good analogue for our current-to-near-future climate predictions.”

Researchers worked with the New South Wales Parks and Wildlife service to identify stalagmites in the Yarrangobilly Caves in the northern section of Kosciuszko National Park.

Small samples of the calcium carbonate powder contained within the stalagmites were collected, then analysed and dated at UQ.

That analysis allowed the team to identify periods of significantly reduced precipitation during the Eemian Period.

“They’re alarming findings, in a long list of alarming findings that climate scientists have released over the last few decades,” Professor McGowan said.

“We hope that this new research allows for new insights into our future climate and the risks it may bring, such as drought and associated bushfires.

“But, importantly, if humans continue to warm the planet, this is the future we may all be looking at.”


Story Source:

Materials provided by University of Queensland. Note: Content may be edited for style and length.


Journal Reference:

  1. Hamish McGowan, Micheline Campbell, John Nikolaus Callow, Andrew Lowry, Henri Wong. Evidence of wet-dry cycles and mega-droughts in the Eemian climate of southeast Australia. Scientific Reports, 2020; 10 (1). DOI: 10.1038/s41598-020-75071-z

FOR MORE INFORMATION: University of Queensland. “Expect more mega-droughts.” ScienceDaily. ScienceDaily, 30 October 2020. <www.sciencedaily.com/releases/2020/10/201030111839.htm>.

PFAS: These ‘forever chemicals’ are highly toxic, under-studied, and largely unregulated


Per- and polyfluoroalkyl substances, or PFAS, are everywhere. They are used in firefighting foam, car wax, and even fast-food wrappers. They are among the most toxic substances ever identified, harmful at concentrations in the parts per trillion, yet very little is known about them. PFAS, a class of over 3,000 compounds, are regulated only at the state level, so while some states are working aggressively to tackle the problem, others have chosen to ignore PFAS completely, leaving concentrations unknown and health risks unexplored.

This topic will be discussed at the Geological Society of America’s 2020 Annual Meeting, in a technical session which will help bring PFAS to national attention. Presentations will discuss how PFAS are released into the environment, transported through groundwater, rivers, and soils, and partially remediated. PFAS have been produced in the U.S. for decades, primarily for industrial use. Matt Reeves, a professor at Western Michigan University and lead author of one of the presentations, says PFAS have been labelled “forever chemicals” because they have bonds that are “among the strongest in all of chemistry.”

“It’s almost like armor…we don’t have any evidence of degradation of these compounds,” he says.

The health risks from PFAS bioaccumulation are heightened by their toxicity at extremely low concentrations. The EPA’s federal health advisory level, set in 2016 and non-enforceable, is just 70 parts per trillion (ppt): roughly a few grains of sand in an Olympic-size swimming pool. Compare that with arsenic, a toxic element whose limit of 10 parts per billion is far higher than the PFAS advisory. Due to bioaccumulation, fish in southeastern Michigan were found with PFAS concentrations in the parts per billion, far exceeding safe limits and prompting “do not eat the fish” signs along rivers and lakes. Health effects of PFAS are still being studied, but they potentially include increased rates of some types of cancer, hormonal disruption, and immune-system effects.
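The gap between the two limits is easy to quantify once both are expressed in the same units (simple arithmetic, not from the article):

```python
# Express both regulatory limits in parts per trillion and compare.
pfas_advisory_ppt = 70           # EPA health advisory for PFAS, 2016
arsenic_limit_ppb = 10           # EPA limit for arsenic in drinking water
arsenic_limit_ppt = arsenic_limit_ppb * 1_000  # 10 ppb = 10,000 ppt

# The arsenic limit is roughly 140x higher than the PFAS advisory,
# underscoring how strict a 70 ppt threshold is.
ratio = arsenic_limit_ppt / pfas_advisory_ppt
print(round(ratio, 1))
```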

Michigan is receiving special attention because in July of this year, the state government enacted strict regulations for seven compounds in the PFAS family. For one compound, the highest safe limit is just 6 ppt, far lower than the EPA’s guidelines. “Michigan is the most proactive state in the nation in characterizing and studying PFAS, and with their legislation,” Reeves says. His talk highlights the PFAS cycle on land and complications with site remediation.

“Notice we don’t call it a ‘life cycle,'” he says. “It’s a perpetual cycle. We cannot break down these compounds, so there’s no ‘death.'”

Even once a PFAS source is identified, remediation is difficult. North Carolina, like Michigan, has legacy PFAS contamination from industries past. Marie-Amélie Pétré, a postdoc at NCSU, is studying how quickly PFAS are flushed from groundwater to streams. This flushing is a critical part of the water cycle that determines when residents can expect their drinking water to be safe. “Quantifying the timescale for PFAS flushing from groundwater can help predict downriver concentrations in the future,” Pétré says. “We’re the first to quantify PFAS transport… between groundwater and streams using field data. It’s such a rapidly evolving field. This ongoing discharge isn’t included in remediation plans.”
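The flushing timescale Pétré describes can be caricatured with a first-order mixing model. This is an illustrative sketch with hypothetical numbers, not the researchers' field-data method:

```python
import math

# Illustrative first-order flushing model (not the researchers' model):
# if clean recharge gradually displaces contaminated groundwater, the
# concentration reaching a stream decays roughly exponentially with a
# characteristic residence time tau (years). All numbers hypothetical.
def concentration_ng_l(c0_ng_l, years, tau_years):
    return c0_ng_l * math.exp(-years / tau_years)

# With a 20-year residence time, concentrations halve about every
# tau * ln(2) years, regardless of the starting level.
half_life_years = 20 * math.log(2)
print(round(half_life_years, 1))
```

Decade-scale half-lives like this are one reason ongoing groundwater discharge matters for remediation plans even after a source is cut off.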

At the University of Arizona, Mark Brusseau and Bo Guo are studying PFAS in soils, which serve as a PFAS repository between groundwater and surface waters. “Concentrations of PFAS in the soil can be orders of magnitude higher than they are in the groundwater at the same location,” Brusseau says. Despite differences in state regulation, one thread is clear: PFAS are everywhere. His talk examines over 30,000 soil samples from around the world. “PFAS were found to be present at almost every site that was sampled, whether it was a metropolitan area, near an industrial source, or out in a rural area,” he says. “[They are] even in some very remote mountain areas.”

“PFAS don’t discriminate,” Steve Sliver, a co-author on Reeves’ talk and lead of Michigan’s PFAS response team, says. “The sources are pretty much everywhere.”


Story Source:

Materials provided by Geological Society of America. Note: Content may be edited for style and length.

FOR MORE INFORMATION: Geological Society of America. “PFAS: These ‘forever chemicals’ are highly toxic, under-studied, and largely unregulated.” ScienceDaily. ScienceDaily, 29 October 2020. <www.sciencedaily.com/releases/2020/10/201029122943.htm>.