Raincoat no longer waterproof? A textile scientist explains why – and how to fix it

You pull on your rain jacket, step out into the storm, and within half an hour your undershirt is soaked. The jacket you purchased as “waterproof” seems to have stopped working, and all the marketing claims feel a bit suspect.

In reality, the jacket probably hasn’t failed overnight: a mix of how it’s built, the exact level of water protection it offers, and years of sweat, skin oil and dirt have all played a part.

But there are a few simple ways you can care for your rain jacket to ensure you stay dry, even when it’s pouring.

The science behind rain jackets

Most proper rain jackets are built around a waterproof “membrane” sandwiched inside the fabric. Gore-Tex is the best-known example: it uses a very thin layer of a polymer called PTFE (polytetrafluoroethylene), or expanded PTFE (ePTFE), riddled with microscopic pores.

Those pores are much smaller than liquid water droplets but big enough for individual water vapour molecules: rain on the outside can’t push through, while sweat vapour from your body can escape outwards.
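To get a feel for the scale gap the membrane exploits, here is a back-of-envelope comparison using commonly cited ballpark sizes; the figures are rough assumptions, not manufacturer specifications, and vary between fabrics:

```python
# Back-of-envelope scale comparison for a microporous membrane like ePTFE.
# All sizes are rough, commonly cited ballpark figures, not specs.
PORE_DIAMETER_UM = 0.2          # typical ePTFE pore, ~0.2 micrometres
RAINDROP_DIAMETER_UM = 500.0    # a small drizzle droplet
VAPOUR_MOLECULE_UM = 0.0003     # a single water molecule, ~0.3 nanometres

droplet_vs_pore = RAINDROP_DIAMETER_UM / PORE_DIAMETER_UM
pore_vs_molecule = PORE_DIAMETER_UM / VAPOUR_MOLECULE_UM

print(f"A droplet is ~{droplet_vs_pore:,.0f}x wider than a pore")    # rain stays out
print(f"A pore is ~{pore_vs_molecule:,.0f}x wider than a molecule")  # vapour escapes
```

Even with generous error bars on every number, the droplet sits orders of magnitude above the pore size while a vapour molecule sits orders of magnitude below it, which is the whole trick.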

Other fabrics use solid, non-porous membranes made from polyurethane or polyester that move water vapour by absorbing it and passing it through the material molecule by molecule rather than via tiny holes. This can make them a bit more tolerant of dirt.

The outer fabric is sometimes treated with a very thin chemical finish that makes water roll off the surface instead of soaking into the fibres – a bit like wax on a car. This finish is known as “Durable Water Repellent” (DWR) and helps stop the outer layer of the jacket from becoming saturated.

In the past, many of these chemical finishes used “forever chemicals” (PFAS) that repelled both water and oil, but persist in the environment and build up in wildlife and people.

Because of this, brands and regulators have started using alternatives based on silicones or hydrocarbons. These still repel water but are generally less hazardous.

It’s also useful to understand the words you see on labels.

A waterproof jacket is built to stop rain coming through, even in heavy or prolonged downpours, and usually combines a membrane, a chemical finish and fully taped seams.

“Water resistant” means the fabric slows water down and copes with light showers but will eventually let water through. It often relies on a tight weave and a chemical finish but no true membrane.

“Water repellent” just describes that beading effect from the chemical finish. It can apply to both waterproof and non-waterproof fabrics.

Some brands also say rainproof or weatherproof as a friendlier way of saying “pretty much waterproof”, but there’s rarely a separate test behind that word.

 

Why do rain jackets degrade over time?

When you realise your jacket isn’t waterproof anymore, the first thing that has usually gone wrong isn’t the membrane. It’s the chemical finish on the outside.

That ultra-thin surface layer gets scuffed by backpack straps and seat belts, baked by the sun, and contaminated by mud, smoke and city grime.

These coatings gradually lose their water-repellent properties through abrasion, and through washing with harsh detergents or aggressive cycles, and fragments of them are shed into the environment over time.

Body oils, sunscreen and insect repellent also play a role, as they build up in the fabric over time. Outdoor gear care guides and lab work on waterproof fabrics both point out that these oily contaminants can damage the chemical finish and clog the pores of the membrane, making it harder both for rain to be repelled and for sweat vapour to escape.

Over many years, slow physical ageing also takes a toll. Constant flexing can cause a membrane to thin or develop tiny cracks and the finish to deteriorate. Seam tapes can also start to peel away, especially on shoulders where backpack straps press.

How to keep a jacket waterproof

The single best thing you can do for both your comfort and the planet is to keep a good jacket working for as long as possible, because making new technical fabrics has a significant environmental footprint.

Gentle washing will help extend the life of your rain jacket, as it removes the build up of contamination such as dirt and body oils. Brands and care guides recommend closing zips and Velcro, then washing on a gentle cycle with a cleaner designed for waterproof fabrics or a very mild soap, avoiding normal detergents and softeners that leave residues.

Depending on the type of chemical finish, this coating can be re-applied with commercial spray-on or wash-in products. Some finishes can also be re-activated by gentle heat (a low dryer setting or a cool iron): heat makes the water-repelling molecules stand back up after they have been “flattened” by use and contamination.

Although the above will help keep your jacket waterproof, it is best to follow the manufacturer’s care instructions, as they vary with the composition of the fabric.

In any case, it is important to avoid leaving the jacket wet and scrunched up for weeks, and to be mindful of heavy sunscreens and repellents.

Carolina Quintero Rodriguez, Senior Lecturer and Program Manager, Bachelor of Fashion (Enterprise) program, RMIT University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Wildlife Poachers to Be Targeted Using State of the Art AI Listening Technology

A photo of a male forest elephant captured near the site where some of the gunshot recordings were taken – credit, Anahita Verahrami / SWNS

Wildlife poachers can now be located and arrested across the central African forests thanks to state-of-the-art AI listening technology.

A network of microphones has been deployed across the rainforests to detect gunshots from illegal poaching of elephants and other animals, and American scientists are using AI to ensure the network can distinguish gunshots over the din of the jungle environment.

The web of acoustic sensors was deployed in Gabon, Congo, and Cameroon, creating the possibility of real-time alerts to the sounds of gun-based poaching.

But the belly of the rainforest is loud, and scientists say sorting through a constant influx of sound data is computationally demanding. Detectors can distinguish a loud bang from the whistles, chirps, and rasps of birds and bugs, but they often confuse the sounds of branches cracking or trees falling with gunshot noises, resulting in a high percentage of false positives.

Project leader Naveen Dhar at the Center for Conservation Bioacoustics at Cornell University aimed to develop a lightweight gunshot-detection neural network that can accompany the sensors and process signals in real time to minimize false positives.

He worked alongside colleagues at the Elephant Listening Project to create a model that will work through autonomous recording units (ARUs), which are power-efficient microphones that capture continuous, long-term soundscapes.

“The proposed system utilizes a web of ARUs deployed across the forest, each performing real-time detection, with a central hub that handles more complex processing.”

An initial scan filters all audio for “gunshot likely” signals and sends them to the ARU’s microprocessor, where the lightweight gunshot detection model lives.

If confirmed as a gunshot by the microprocessor, the ARU passes the information to the central hub, initiating data collection from other devices in the web.


By determining if other sensors also hear a “gunshot likely” noise, the central hub then decides whether the event was a true gunshot or a potential false positive.

If it determines a true positive, the central hub collates audio files from each sensor, allowing it to pinpoint the location of the gunshot and alert rangers on the ground with coordinates for immediate poaching intervention.
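The multi-stage logic described above (on-device confirmation, then cross-sensor corroboration at the hub) can be sketched roughly as follows. Every name, threshold and score here is hypothetical, invented for illustration, not taken from the Cornell system:

```python
# Hypothetical sketch of the filter -> confirm -> corroborate pipeline.
# Thresholds, class names and scores are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor_id: str
    timestamp: float   # seconds since deployment
    confidence: float  # on-device neural network score, 0..1

def confirm_on_device(confidence, threshold=0.8):
    """Stage 2: the ARU's lightweight model confirms a 'gunshot likely' clip."""
    return confidence >= threshold

def hub_decision(detections, window_s=2.0, min_sensors=2):
    """Stage 3: the central hub calls a true gunshot only when several
    sensors report a confirmed detection within a short time window."""
    confirmed = sorted((d for d in detections if confirm_on_device(d.confidence)),
                       key=lambda d: d.timestamp)
    for i, first in enumerate(confirmed):
        sensors = {d.sensor_id for d in confirmed[i:]
                   if d.timestamp - first.timestamp <= window_s}
        if len(sensors) >= min_sensors:
            return True  # corroborated: collate audio, locate, alert rangers
    return False

event = [Detection("ARU-1", 100.0, 0.95),
         Detection("ARU-2", 100.4, 0.88),   # same bang heard farther away
         Detection("ARU-3", 250.0, 0.91)]   # lone, uncorroborated bang
print(hub_decision(event))
```

The corroboration step is what suppresses false positives: a falling branch tends to be loud only near one sensor, while a real gunshot reaches several microphones within the couple of seconds sound takes to travel between them.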

“Down the road, the device can be used as a tool for rangers and conservation managers, providing accurate and verifiable alerts for on-the-ground intervention along with low-latency data on the spatiotemporal trends of poachers,” Dhar said.

He plans to expand the model to detect the type of gun that fires each gunshot and other human activities, such as chainsaws or trucks, before field-testing the system, which is currently under development.

“I hope the device can coalesce with Internet of Things infrastructure innovations and cost reduction of materials to produce a low-cost, open-source framework for real-time detection usable in any part of the globe.”

He is due to present his findings at a joint meeting of the Acoustical Society of America and the Acoustical Society of Japan in Honolulu, Hawaii.

Planting Billions of Trees Turned Barren Desert into a Carbon Sink That Lowers CO2

A mixed-species section of the Green Great Wall – Credit: China News Service (中国新闻网), CC BY 3.0

China’s successful, decades-long effort to plant a ring of trees around one of the world’s most hostile deserts has sprouted an unexpected benefit to humanity.

Along with protecting the nation’s grasslands and agriculture from the spreading sands of the dismal Taklamakan Desert, the giant ring of trees has turned previously unproductive land into a carbon sink that draws CO2 out of the atmosphere.

It’s thought, and some isolated research has indeed demonstrated, that humans can prevent the worst effects of a rise in average global temperatures by planting trees to absorb more CO2 from the atmosphere.

This strategy has limits, however, when viewed on a global scale. Atmospheric CO2 levels continue to rise, while there is a limit to the amount of land that can be turned over to forests.

One-third of our planet’s land surface is covered in deserts, where vegetation is sparse or absent and rainfall is scarce. Yet despite their vast acreage, deserts collectively hold less than one-tenth of the world’s carbon stock – the amount of carbon held underground.

A study conducted by NASA and the California Institute of Technology (Caltech) used satellite data to demonstrate that the “sea of death”, as the Taklamakan Desert was called in antiquity, could be utilized to store carbon and reduce the greenhouse effect.

The Taklamakan Desert. Credit: NASA World Wind 1.4.

Starting in 1978, China’s Three-North Shelter Belt program aimed to plant trees along the borders of the great Taklamakan to stop sandstorms from ruining adjacent pasture and agricultural land. Home to the point on Earth farthest from any ocean, the Taklamakan is one of the driest and most hostile landscapes on our planet.

The massive Himalayas rise to the south and east, the Pamirs to the southwest, and the Tian Shan and Altai ranges to the west, leaving the landscape almost completely isolated from moisture.

By some estimates, 66 billion trees have been planted since the start of the Shelter Belt program, which finished in 2024. Nicknamed the “Green Great Wall,” this incredible increase in greenery has raised average rainfall by several millimetres, resulting in natural growth of foliage during the wet season that boosts photosynthesis along the tree line, leading to greater carbon sequestration.

“We found, for the first time, that human-led intervention can effectively enhance carbon sequestration in even the most extreme arid landscapes, demonstrating the potential to transform a desert into a carbon sink and halt desertification,” study co-author Yuk Yung, a professor of planetary science at Caltech and a senior research scientist in NASA’s Jet Propulsion Laboratory, told Live Science in an email.

By precise numbers, the planting has reduced the average CO2 concentration in the desert air from 416 parts per million (ppm) to 413 ppm. Parts per million is the standard unit for tracking the atmospheric CO2 that drives the greenhouse effect. Worldwide, the figure is 429.3 ppm, up from roughly 280 ppm before the advent of industrialization.

If more shelter belt-style tree planting efforts could be used to reclaim desert landscapes, it could open vast areas to absorbing carbon. With little to no vegetation, deserts in their natural state have precious little ability to do so.

New Spray-on Powder Instantly Seals Life-Threatening Wounds in Battle or During Disasters

South Korean scientists win award for wound powder – SWNS

A spray-on powder that instantly seals life-threatening wounds could save thousands of lives, say scientists.

The new substance can help prevent excessive bleeding, the leading cause of death from injuries in war, according to a study.

The fast-acting powder that stops bleeding in just one second was developed by South Korean scientists who say it can also be applied in emergency hospital procedures.

The research team at the Korea Advanced Institute of Science and Technology (KAIST) in Daejeon, which included an Army Major, created the powder that rapidly forms a strong hydrogel barrier when sprayed directly onto a bullet wound.

The team designed the technology with real combat conditions in mind, and the direct involvement of an Army Major helped ensure its practical readiness.

Major Kyusoon Park, who is also a PhD candidate and served as a study co-author, said the substance not only allows “instant hardening” under extreme conditions like combat or disasters but also delivers high usability and storage stability.

“Until now, patch-type hemostatic agents widely used in medical fields have had limitations due to their flat structure and sensitivity to temperature and humidity.”

They cannot withstand pressure applied to the wound. And current powders that stop blood flow have limited functionality, merely physically absorbing blood to form a barrier, according to the study published in the journal Advanced Functional Materials.

Medical first aid equipment for combat care at Marine Corps Base Camp Lejeune – Credit: Navy Medicine via Unsplash

“The new AGCL powder reacts with cations, such as calcium in the blood, to turn into a gel state in one second, instantly sealing the wound,” said study co-leader Professor Steve Park.

“Furthermore, by forming a three-dimensional structure inside the powder, it can absorb blood amounting to more than seven times its own weight.”

“It shows superior sealing performance compared to commercial hemostatic agents—with a high adhesive strength and a level of pressure that can withstand being pressed strongly by hand.”

AGCL powder is composed entirely of naturally derived materials with an antibacterial effect of 99.9%.

It has a structure that combines biocompatible natural materials such as alginate and gellan gum—that react with calcium for fast gelation and physical sealing—and chitosan, which bonds with blood components to enhance chemical and biological hemostasis.

“In animal experiments, excellent tissue-regeneration effects, such as rapid wound recovery and promotion of blood vessel and collagen regeneration, were confirmed,” explained Prof. Park.

“In surgical liver injury experiments, the amount of bleeding and hemostasis time were significantly reduced compared to commercial methods.”

“It also maintains its performance for two years, even in room temperature and high humidity environments, possessing the advantage of being ready for immediate use in harsh environments.

“Although this is an advanced new material technology developed with national defense purposes in mind,” said Major Park, “it has great potential for emergency medicine, disaster sites, developing countries, and medically underserved areas.”

“I started the research with a sense of mission to save even one more soldier – but I also hope this technology will be used as a life-saving technology in private medical fields.”

EV Charging Answer: Quantum Technology Will Cut Time it Takes to Charge Electric Cars to Just 9 Seconds

Institute for Basic Science

Scientists in South Korea have shown in theory that a new technology could cut the time it takes to charge electric cars to just nine seconds, allowing EV owners to ‘fill up’ faster than their gasoline counterparts.

And even those plugging-in at home will have the time slashed from 10 hours to three minutes.

The new device uses the laws of quantum physics to power all of a battery’s cells at once—instead of one at a time—so recharging takes no longer than filling up at the pump.

Electric cars were rarely seen on the roads 10 years ago, but millions are now sold every year, making EVs one of the fastest-growing industries. Even so, the fastest superchargers need around 20 to 40 minutes to charge a car.

Scientists at the Institute for Basic Science (IBS) in South Korea have come up with a solution. Co-author Dr. Dario Rosa said the consequences could be far-reaching.

“Quantum charging could go well beyond electric cars and consumer electronics. For example, it may find key uses in future fusion power plants, which require large amounts of energy to be charged and discharged in an instant.”

The concept of a “quantum battery” was first proposed in a seminal paper published by Alicki and Fannes in 2012. It was theorized that quantum resources, such as entanglement, can be used to vastly speed up battery charging.

The researchers used quantum mechanics to model a super-fast charging station. Their calculations of the charging speed show that a typical electric vehicle battery containing around 200 cells would recharge 200 times faster.

Such collective charging is not possible in classical batteries, where the cells are charged in parallel, independently of one another.

“This is particularly exciting as modern large-capacity batteries can contain numerous cells.”

The group went further to provide an explicit way of designing such batteries.

This means charging times could be cut from 10 hours to three minutes at home and from around 30 minutes to just a few seconds at stations.
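The headline figures follow from simple scaling arithmetic, assuming (as the calculations claim) that collective charging speeds up in proportion to the number of cells N:

```python
# Arithmetic behind the article's figures: collective quantum charging is
# claimed to scale with the number of cells N, so a 200-cell pack charges
# 200 times faster than charging each cell independently.
N_CELLS = 200

def quantum_charge_time(classical_seconds, n_cells=N_CELLS):
    return classical_seconds / n_cells

home_s = quantum_charge_time(10 * 3600)   # 10-hour home charge
station_s = quantum_charge_time(30 * 60)  # 30-minute supercharger session
print(home_s / 60, "minutes at home,", station_s, "seconds at a station")
```

Dividing the 10-hour home charge by 200 gives three minutes, and the 30-minute supercharger session gives the nine seconds of the headline.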

Co-author Dr Dominik Šafránek said, “Of course, quantum technologies are still in their infancy and there is a long way to go before these methods can be implemented in practice.”

“Research findings such as these, however, create a promising direction and can incentivize the funding agencies and businesses to further invest in these technologies.

“If employed, it is believed that quantum batteries would completely revolutionize the way we use energy and take us a step closer to our sustainable future.”

The findings were published in the February 8 edition of the journal Physical Review Letters.

Scientists Develop Biodegradable Smart Textile–A Big Leap Forward for Eco-Friendly Wearable Technology

Flexible inkjet printed E-textile – Credit: Marzia Dulal

Wearable electronic textiles can be both sustainable and biodegradable, shows a new study.

A research team led by the University of Southampton and UWE Bristol in the UK tested a new sustainable approach for fully inkjet-printed, eco-friendly e-textiles.

Named SWEET—for Smart, Wearable, and Eco-friendly Electronic Textiles—the new ‘fabric’ was described in findings published in the journal Energy and Environmental Materials.


E-textiles are those with embedded electrical components, such as sensors, batteries or lights. They might be used in fashion, for performance sportswear, or for medical purposes as garments that monitor people’s vital signs.

Such textiles need to be durable, safe to wear and comfortable, but also, in an industry which is increasingly concerned with clothing waste, they need to be kind to the environment when no longer required.

“Integrating electrical components into conventional textiles complicates the recycling of the material because it often contains metals, such as silver, that don’t easily biodegrade,” explained Professor Nazmul Karim at the University of Southampton.


“Our eco-friendly approach for selecting sustainable materials and manufacturing overcomes this, enabling the fabric to decompose when it is disposed of.”

The team’s design has three layers: a sensing layer, a layer to interface with the sensors, and a base fabric. It uses a textile called Tencel for the base, which is made from renewable wood pulp and is biodegradable.

The active electronics in the design are made from graphene, along with a polymer called PEDOT:PSS. These conductive materials are precision inkjet-printed onto the fabric.

The research team, which included members from the universities of Exeter, Cambridge, Leeds, and Bath, tested samples of the material for continuous heart-rate monitoring. Five volunteers wore gloves fitted with the sensors, connected to monitoring equipment. Results confirmed the material can effectively and reliably measure both heart rate and temperature at the industry-standard level.

Gloves with e-textile sensors monitoring heart rate – Credit: Marzia Dulal

“Achieving reliable, industry-standard monitoring with eco-friendly materials is a significant milestone,” said Dr. Shaila Afroj, an Associate Professor of Sustainable Materials from the University of Exeter and a co-author of the study. “It demonstrates that sustainability doesn’t have to come at the cost of functionality, especially in critical applications like healthcare.”

The project team then buried the e-textiles in soil to measure their biodegradability.

After four months, the fabric had lost 48 percent of its weight and 98 percent of its strength, suggesting relatively rapid and also effective decomposition.

Furthermore, a life cycle assessment revealed the graphene-based electrodes had up to 40 times less impact on the environment than standard electrodes.

Four strips in a variety of decomposed states, during four months of decomposition – Credit: Marzia Dulal

Marzia Dulal from UWE Bristol, the first author of the study, highlighted the environmental impact: “Our life cycle analysis shows that graphene-based e-textiles have a fraction of the environmental footprint compared to traditional electronics. This makes them a more responsible choice for industries looking to reduce their ecological impact.”

The inkjet printing process is also a more sustainable approach to e-textile fabrication, depositing precise amounts of functional material onto textiles as needed, with almost no material waste and less use of water and energy than conventional screen printing.

“These materials will become increasingly more important in our lives,” concluded Prof. Karim, who hopes to move forward with the team to design wearable garments made from SWEET, particularly for the early detection and prevention of heart disease.

Heat with no end: climate model sets out an unbearable future for parts of Africa


Oluwafemi E. Adeyeri, Australian National University

People often think of a heatwave as a temporary event, a brutal week of sun that eventually breaks with a cool breeze. But as the climate changes globally, in parts of Africa, that level of heat is becoming a permanent part of the weather.

Research shows Africa’s exposure to dangerous heat is rising rapidly. Until now, estimating how severe this heat would become was challenging. This was because many widely used global climate models struggled to capture the local factors that shape heat in Africa’s diverse climate zones and habitats (humid tropics, dry savannas and rapidly changing agricultural areas).

It is important to analyse these different local factors because each plays a role in driving dangerous heat. For example, rapid changes to the way land is used, such as deforestation, alter soil moisture and humidity. Turning forests into cropland therefore becomes a driver of extreme heat.

We are a team of hydroclimate and land-atmosphere scientists who study heat extremes, water resources, the way land use changes, and hydroclimate risk. We set out to produce reliable, locally relevant projections of future heatwaves. Our team realised that to understand the true heatwave risk in Africa, we had to look down as well as up. It is not only the warming atmosphere from above, it is also the way people are transforming the land below.

To better understand how heat is likely to affect African countries, and to avoid relying on any single climate model, we developed a framework built on four pillars:

  • To get the most accurate data, we studied 10 global climate models rather than betting on one model.

  • The global climate model outputs were adjusted so they matched observed heatwave patterns (the frequency, duration, magnitude, amplitude, number and timing of heatwaves) and showed the links between temperature, wind, radiation and humidity.

  • Artificial intelligence (AI) was used to quantify how much the different drivers of heat (such as temperature, humidity, soil moisture, wind, radiation, land use) contributed to heatwave changes. We also used AI to highlight how these drivers made heat worse when they interacted.

  • We compared what would happen in a high-pollution future as opposed to one where governments and industry managed to reduce carbon emissions.
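As a rough illustration of the first two pillars, here is a toy sketch of bias-correcting a multi-model ensemble against observations. All numbers are invented, and the study’s actual adjustment matched full heatwave statistics (frequency, duration, magnitude and so on), not just the mean as this simplification does:

```python
# Illustrative sketch: correct each model's bias against observations,
# then average the multi-model ensemble day by day. The temperature
# series below are invented, and only the mean is corrected here.
import statistics

def bias_correct(model_series, obs_mean):
    """Shift a model's daily series so its mean matches the observed mean."""
    shift = obs_mean - statistics.mean(model_series)
    return [x + shift for x in model_series]

def ensemble_mean(corrected_models):
    """Average the bias-corrected models day by day."""
    return [statistics.mean(day) for day in zip(*corrected_models)]

# Three hypothetical models' daily maximum temperatures (degrees C):
models = [[30, 33, 36, 34], [28, 31, 33, 32], [31, 35, 36, 34]]
observed_mean = 32.0
corrected = [bias_correct(m, observed_mean) for m in models]
print(ensemble_mean(corrected))
```

Averaging several independently corrected models, rather than trusting one, is what keeps a single model’s quirks from dominating the projection.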

Our research found that by the late 21st century, most regions in Africa will stop having occasional heatwaves and will suffer from extreme heat lasting most of the year. The study shows that by 2065-2100, many parts of Africa (apart from Madagascar) could experience heatwaves on 250-300 days per year.

Some areas, such as the western side of southern Africa, will experience heatwaves that are 12 times as long and frequent as they are now, even if global emissions are reduced. Many heatwaves will last longer than 40 days at a time.

This is not just a slight warming; it is a fundamental change in how people will have to survive on the continent. Once regions in Africa enter a state of almost continuous heatwaves, the human body will have no window of time to recover.

Africa’s heat risk comes from global emissions and local land choices. This means that cutting greenhouse gases matters, and so does protecting and restoring the land’s natural ways of cooling the planet down.

How heat will build dramatically across Africa

In places with intact forests that cool the air, heat and humidity usually remain below a deadly limit. Forests act like natural air-conditioners, preventing fatal heat.

But when forests are cut down and replaced with cropland, the local climate changes. Crops release large amounts of moisture into the air, raising humidity. Heat and moisture build, and the surface heats up faster during the day and stays warmer at night. The land becomes a heat trap. A hot spell that would have been tolerable under forest cover becomes a prolonged, hazardous heatwave.

Rising background heat can affect entire regions. Rural communities, including smallholder farmers, are also highly exposed because they work outdoors and often have limited access to cooling, healthcare or heat-resilient infrastructure.

Heatwaves will affect shack or informal settlement areas more because they generally lack trees and vegetation, and homes built from metal are harder to cool. Without shade, heat will build and linger.

A ‘deadly threshold’ will be reached

Our modelling shows that there is a specific combination of heat and humidity where conditions can intensify heatwaves very quickly, especially in landscapes dominated by cropland.

This is a different kind of heat risk. It is not the familiar “dry heat” driven by parched soils. It is a crop‑driven humidity effect that pushes the atmosphere into a danger zone. For example, in west Africa, extreme heat will peak at about 26.5°C-26.8°C with 74%-75% humidity, producing heatwaves that last 30-35 days.

In southern east Africa, heatwaves will happen even at lower temperatures (23.6°C-23.8°C) and humidity (70%-72%). The danger there is that even small increases in heat or moisture, including those caused by cutting down forests, will make heatwaves more common and longer.
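The regional danger zones quoted above can be expressed as simple temperature-and-humidity bands. This toy checker only illustrates those published thresholds; it is not the study’s method, and the region keys are invented labels:

```python
# Toy checker for the regional danger zones quoted in the text.
# Region names are invented labels; thresholds come from the article.
DANGER_ZONES = {
    # region: (min temperature in C, min relative humidity in %)
    "west_africa": (26.5, 74.0),
    "southern_east_africa": (23.6, 70.0),
}

def in_danger_zone(region, temp_c, rh_percent):
    t_min, rh_min = DANGER_ZONES[region]
    return temp_c >= t_min and rh_percent >= rh_min

print(in_danger_zone("west_africa", 26.7, 74.5))           # inside the zone
print(in_danger_zone("southern_east_africa", 23.0, 71.0))  # just below it
```

The lower thresholds for southern east Africa make the point in the text concrete: there, even a small rise in heat or humidity is enough to tip conditions into the danger zone.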

Across all nine African climate regions, our research found that heatwaves will stop being rare events and start becoming a regular part of the year.

The good news is that local land choices will offer immediate protection. Keeping forests, restoring vegetation and using climate-smart farming (where animals and crops are farmed with trees) are not just environmental actions. They are public health defences that weaken the intensity and duration of heatwaves.

What needs to happen next

This research highlights something simple but powerful: a forest is a shield.

This study also shows how planning in cities and in rural areas can keep “nature’s air‑conditioner” working.

Protecting the continent means acting on two fronts. Globally, we need to keep reducing fossil fuel emissions, because even moderate cuts lower the chance of long, near-permanent heatwaves.

Locally, every land-clearing decision matters. Removing natural vegetation adds heat to communities, but keeping forests and cover on the land helps hold temperatures down.

The message is straightforward. Countries cannot control global warming on their own, but they can control how the land responds to it.

Oluwafemi E. Adeyeri, Research Fellow in Climate Science, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Southern right whales are having babies less often, but why?

For decades, southern right whales have been celebrated as one of conservation’s success stories.

Once driven to the brink of extinction by commercial whaling, southern right whales slowly returned to Australian coastlines through the late 20th century. Their recovery reflected the power of international protection, marine sanctuaries and long-term science working together.

But our new research shows this success story is changing. We drew on more than 30 years of continuous shore-based monitoring of southern right whales in the Great Australian Bight, from within the Yalata Indigenous Protected Area in South Australia. We found clear evidence whales are having calves less often, with the average calving interval increasing from three to four years. This means the rate at which calves are born has slowed over the past decade.

This decline appears closely linked to climate-driven changes in the Southern Ocean — similar patterns are now being observed across the southern hemisphere.

More than 3 decades of photos

Our study analysed photo-identification data collected by researchers between 1991 and 2024 from a major calving area in the Great Australian Bight. Each whale is identified using its unique pattern of callosities — the hard patches of skin on its head that remain throughout its life.

This allows individual whales to be tracked across decades, providing rare insight into long-term population dynamics and how these change over time. Photo-identification is a globally accepted method used for whale population assessments. By tracking known individuals over time, researchers can directly measure their reproductive histories.
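To see how calving intervals fall out of photo-identification records, suppose each female’s record is the list of years she was photographed with a new calf. The whale IDs and years below are invented for illustration:

```python
# Hypothetical sketch: deriving calving intervals from photo-ID records.
# Whale IDs and sighting years are invented for illustration only.
from statistics import mean

calving_years = {
    "whale_A": [1998, 2001, 2004, 2008],
    "whale_B": [2010, 2013, 2017],
}

def calving_intervals(years):
    """Gaps, in years, between successive calving seasons for one female."""
    years = sorted(years)
    return [b - a for a, b in zip(years, years[1:])]

all_intervals = [iv for yrs in calving_years.values()
                 for iv in calving_intervals(yrs)]
print(mean(all_intervals))  # population-average calving interval in years
```

Tracking the same known individuals across decades is what lets this average be measured directly, rather than inferred from counts of calves alone.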

Long-term datasets like this are rare — and that is precisely what makes them so powerful. The Australian Right Whale Research Program at Flinders University is one of the longest continuous photo-identification studies of any whale species in the world. It has used the same methods over decades. In the context of climate change, where impacts often emerge slowly and unevenly, this long-term evidence is essential.

What we found

Since around 2015, female southern right whales have not given birth as often. These extended calving intervals mean fewer calves are being born overall, and this reduces population growth over time.

For a long-lived species that reproduces slowly, this matters. Small changes in reproductive rates impact population growth. The slowdown in reproduction signals a shift away from the recovery seen in previous decades.

A signal from the south

The cause of this change is not immediately visible from Australia’s coastline. Southern right whales spend much of their lives feeding thousands of kilometres away in the Southern Ocean, where they rely on the cold, nutrient-rich waters created by Antarctic sea ice. These waters support krill and prey that are crucial for whales to build up the energy reserves they need for pregnancy and lactation.

Over the past decade, the ocean has warmed, the ice is melting and there have been dramatic shifts in food availability and weather patterns. Our analysis shows longer calving intervals coincide with these environmental changes, suggesting the impacts of climate change on conditions in the Southern Ocean are linked to whales having fewer calves.

A global pattern emerges

Importantly, this is not just an Australian story.

Similar trends are being reported in southern right whale populations off South America and South Africa, where researchers have documented reduced calving rates, whales in poor condition and environmental changes.

Southern right whales are a sentinel species: animals whose health reflects broader changes in their environment. Our findings signal deeper disruption in the ocean systems that also support fisheries, help regulate the climate and sustain marine plants and animals.

Southern right whales are long-lived, reproduce slowly, and rely on energy-rich feeding grounds. This makes them particularly vulnerable to climate-driven changes in prey.

What needs to change?

Protecting the Southern Ocean and its increasingly vulnerable natural ecosystems demands urgent collective climate action. This must bridge disciplines, industries, governments and interconnected regions.

This action should include the expansion of sanctuaries across the migratory ranges of threatened species. It should also limit threats, such as whales being struck by ships, getting entangled in ropes and being exposed to noise pollution.

The future of southern right whales is likely to be closely tied to the management of krill harvesting and addressing climate change.

We need to listen — and act — while there is still time.

The author would like to acknowledge the contribution of research collaborators and all of the people involved in the long-term research program who make this work possible. The Conversation

Claire Charlton, Leader of Australian Right Whale Research Program, College of Science and Engineering, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Why your brain has to work harder in an open-plan office than in a private office: study

Since the pandemic, offices around the world have quietly shrunk. Many organisations don’t need as much floor space or as many desks, given many staff now do a mix of hybrid work from home and the office.

But on days when more staff are required to be in, office spaces can feel noticeably busier and noisier. Despite so much focus on getting workers back into offices, there has been far less focus on the impacts of returning to open-plan workspaces.

Now, more research confirms what many suspected: our brains have to work harder in open-plan spaces than in private offices.

What the latest study tested

In a recently published study, researchers at a Spanish university fitted 26 people, aged in their mid-20s to mid-60s, with wireless electroencephalogram (EEG) headsets. EEG testing can measure how hard the brain is working by tracking electrical activity through sensors on the scalp.

Participants completed simulated office tasks, such as monitoring notifications, reading and responding to emails, and memorising and recalling lists of words.

Each participant was monitored while completing the tasks in two different settings: an open-plan workspace with colleagues nearby, and a small enclosed work “pod” with clear glazed panels on one side.

The researchers focused on the frontal regions of the brain, responsible for attention, concentration, and filtering out distractions. They measured different types of brain waves.

As neuroscientist Susan Hillier explains in more detail, different brain waves reveal distinct mental states:

  • “gamma” is linked with states or tasks that require more focused concentration
  • “beta” is linked with higher anxiety and more active states, with attention often directed externally
  • “alpha” is linked with being very relaxed, and passive attention (such as listening quietly but not engaging)
  • “theta” is linked with deep relaxation and inward focus
  • and “delta” is linked with deep sleep.
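The study's exact EEG processing isn't described in the article, but "band power" is conventionally estimated by splitting a signal's frequency spectrum into these ranges and averaging the power in each. Here is a minimal, purely illustrative sketch using a naive discrete Fourier transform; the band edges follow common convention and are an assumption, not taken from the study.

```python
# Hedged sketch: estimating EEG band power from a raw trace with a plain DFT.
# Band edges (in Hz) follow common convention; this is not the study's pipeline.
import cmath
import math

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 80)}

def band_power(signal, fs):
    """Mean spectral power per frequency band, from a naive periodogram."""
    n = len(signal)
    # Periodogram: squared magnitude of each DFT bin up to the Nyquist frequency.
    power = [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                     for t in range(n))) ** 2 / n
             for k in range(n // 2 + 1)]
    freqs = [k * fs / n for k in range(n // 2 + 1)]
    return {name: sum(p for f, p in zip(freqs, power) if lo <= f < hi)
                  / sum(1 for f in freqs if lo <= f < hi)
            for name, (lo, hi) in BANDS.items()}

# A 10 Hz sine wave sits inside the alpha band (8-13 Hz), so alpha power
# should dominate in this synthetic two-second recording sampled at 256 Hz.
fs = 256
signal = [math.sin(2 * math.pi * 10 * t / fs) for t in range(2 * fs)]
powers = band_power(signal, fs)
```

Real EEG analysis uses windowed, averaged spectra (and artifact rejection) rather than a single raw periodogram, but the principle of comparing power across these bands is the same.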

The Spanish study found that the same tasks produced completely opposite patterns of brain activity in the enclosed pod versus the open-plan workspace.

It takes effort to filter out distractions

In the work pod, the study found beta waves – associated with active mental processing – dropped significantly over the experiment, as did alpha waves linked to passive attention, along with overall activity in the frontal brain regions.

This meant people’s brains needed progressively less effort to sustain the same work.

The open-plan office testing showed the reverse.

Gamma waves, linked to complex mental processing, climbed steadily. Theta waves, which track both working memory and mental fatigue, increased. Two key measures also rose significantly: arousal (how alert and activated the brain is) and engagement (how much mental effort is being applied).

In other words, in the open-plan office participants’ brains had to work harder to maintain performance.

Even when we try to ignore distractions, our brain has to expend mental effort to filter them out.

In contrast, the pod eliminated most background noise and visual disruptions, allowing participants’ brains to work more efficiently.

Researchers also found much wider variability in the open office. Some people’s brain activity increased dramatically, while others showed modest changes. This suggests individual differences in how distracting we find open-plan spaces.

With only 26 participants, this was a relatively small study. But its findings echo a significant body of research from the past decade.

What past research has shown

In our 2021 study, my colleagues and I found a significant causal relationship between open-plan office noise and physiological stress. Studying 43 participants in controlled conditions – using heart rate, skin conductivity and AI facial emotion recognition – we found negative mood in open plan offices increased by 25% and physiological stress by 34%.

Another study showed background conversations and noisy environments can degrade cognitive task performance and increase distraction for workers.

And a 2013 analysis of more than 42,000 office workers in the United States, Finland, Canada and Australia found those in open-plan offices were less satisfied with their work environment than those in private offices. This was largely due to increased, uncontrollable noise and lack of privacy.

Just as we now recognise poorly designed chairs cause physical strain, years of research has shown how workspace design can result in cognitive strain.

What to do about it

The ability to focus and concentrate without interruption and distraction is a fundamental requirement for modern knowledge work.

Yet the value of uninterrupted work continues to be undervalued in workplace design.

Creating zones where workers can match their workplace environment to the task is essential.

Responding to more staff doing hybrid work post-pandemic, LinkedIn redesigned its flagship San Francisco office, halving the number of workstations in open-plan areas and experimenting with 75 types of work settings, including areas for quiet focus.

For organisations looking to look after their workers’ brains, there are practical measures to consider. These include setting up different work zones, acoustic treatments and sound-masking technologies, and thoughtfully placed partitions to reduce visual and auditory distractions.

While adding those extra features may cost more upfront than an open-plan office, they can be worth it. Research has shown the significant hidden toll of poor office design on productivity, health and employee retention.

Providing workers with more choice in how much they’re exposed to noise and other interruptions is not a luxury. To get more done, with less strain on our brains, better design at work should be seen as a necessity. The Conversation

Libby (Elizabeth) Sander, MBA Director & Associate Professor of Organisational Behaviour, Bond Business School, Bond University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Samsung's 600-Mile-Range Batteries That Charge in 9 Minutes Ready for Production/Sale Next Year

A mock-up design of Samsung SDI’s solid-state battery – credit, Samsung, released

In late October, Samsung announced that it was preparing to take its long-anticipated solid-state batteries to market with a trilateral agreement between itself, BMW, and American battery expert Solid Power.

It was January of last year that industry outlets began reporting some of the promises that all-solid-state batteries (ASSBs) developed by Samsung SDI would bring. With an energy density of 500 watt-hours per kilogram, they’re twice as dense as conventional lithium-ion batteries.

Samsung claimed they were smaller, lighter, and safer, capable of driving 600 miles and charging within 9 minutes. Typically, a lithium-ion battery pack in a modern EV charges from 10% to 80% in around 45 minutes, and has a limit of around 300 miles of range.

“Samsung SDI’s preparations for mass-producing next-generation products of various form factors such as an all-solid-state battery are well underway as we are set to lead the global battery market with our unrivaled ‘super-gap’ technology,” said Samsung SDI CEO Yoon-ho Choi.

ASSB cells use a solid electrolyte instead of the liquid electrolyte found in a lithium-ion battery. They offer superior safety, as they aren’t flammable, and last for 20 years, or 2,000 charge-discharge cycles, equating to 1.2 million miles.

Under the trilateral agreement, Samsung will supply ASSB cells featuring the solid electrolyte developed by Solid Power to the German automotive group BMW, which will then develop modules and packs for ASSB cells to fit into their next-generation evaluation vehicles, expected in late 2026.

Metal Tech News reported in January that ASSBs will also debut in some smaller Samsung devices during 2026, including the Galaxy Ring fitness tracker, as a way of testing the new power supplies in the real world before incorporating them into smartphones, laptops, and other devices.

Samsung’s ASSBs use a silver-carbon layer as the anode and a nickel-manganese-cobalt material for the cathode. Silver is the most electrically conductive metal available.

AI-powered digital stethoscopes show promise in bridging screening gaps

(Photo: Eko Health, US) IANS

New Delhi: As tuberculosis (TB) remains the world’s deadliest infectious disease, a new study has shown that artificial intelligence (AI)-enabled digital stethoscopes can help fill critical screening gaps, especially in hard-to-reach areas.

In a commentary published in the journal Med (Cell Press), global experts contended that stethoscopes combined with digital technology and AI can help overcome the challenges faced by screening programmes, such as under-detection, high cost, and inequitable access.

“AI-enabled digital stethoscopes have demonstrated promising accuracy and feasibility for detecting lung and cardiovascular abnormalities, with promising results in early TB studies. Training and validation in diverse, high-burden settings are essential to explore the potential of this tool further,” said corresponding author Madhukar Pai from McGill University, Canada, along with researchers from the UAE, Germany, and Switzerland.

Despite advancements in screening and diagnostic tools, an estimated 2.7 million people with TB were missed by current screening programmes, as per data from the World Health Organization (WHO). Routine symptom screening is also likely to miss people with asymptomatic or subclinical TB.

While the WHO recently recommended several AI-powered computer-aided detection (CAD) software packages, as well as ultra-portable radiography hardware, high operating and upfront hardware costs act as a deterrent.

These tools are particularly difficult to deploy in primary care settings, and among pregnant women due to radiation concerns.

At the same time, AI showed significant potential for screening, including applications beyond CAD of TB from radiographs, said the researchers.

“One application of AI for disease screening is to interpret acoustic (sound) biomarkers of disease, with potential to identify sounds that appear nonspecific or are inaudible to the human ear,” they added, while highlighting the potential of AI in detecting and interpreting cough biomarkers and lung auscultation to analyse breath sounds.

Studies from high-TB burden countries, including India, Peru, South Africa, Uganda, and Vietnam, highlighted that AI-enabled auscultation could hold promise as a TB screening and triage tool.

"AI digital stethoscopes may become useful alternatives to imaging-based approaches for TB screening, with the potential to democratise access to care for populations underserved by radiography," the researchers said. "Importantly, AI digital stethoscopes offer a scalable, low-cost, and person-centered tool that could bring us closer to reaching TB case finding goals," they added.

Deep-sea fish larvae rewrite the rules of how eyes can be built

Fabio Cortesi, The University of Queensland and Lily Fogg, University of Helsinki

The deep sea is cold, dark and under immense pressure. Yet life has found a way to prevail there, in the form of some of Earth’s strangest creatures.

Since deep-sea critters have adapted to near darkness, their eyes are especially distinctive – pitch-black and fearsome in dragonfish, enormous in giant squid, barrel-shaped in telescope fish. This helps them catch the remaining rays of sunlight penetrating to depth and see the faint glow of bioluminescence.

Deep-sea fishes, however, typically start life in shallower waters in the twilight zone of the ocean (roughly 50–200 metres deep). This is a safe refuge to feed on plankton and grow while avoiding becoming a snack for larger predators.

Our new study, published in Science Advances, shows deep-sea fish larvae have evolved a unique way to maximise their vision in this dusky environment – a finding that challenges scientific understanding of vertebrate vision.

The nightmare of seeing in the twilight zone

The vertebrate retina, located at the back of the eye, has two main types of light-sensitive photoreceptor cells: rod-shaped for dim light and cone-shaped for bright light.

The rods and cones slowly change position inside the retina when moving between dim and bright conditions, which is why you temporarily go blind when you flick on the light switch on your way to the bathroom at night.

While vertebrates that are active during the daytime and predominantly inhabit bright light environments favour cone-dominated vision, animals that live in dim conditions, such as the deep sea or caves, have lost or reduced their cone cells in favour of more rods.

However, vision in twilight is a bit of a nightmare – neither rods nor cones are working at their best. This raises the question of how some animals, such as larval deep-sea fishes, can overcome the limitations of the cone-and-rod retina not only to survive but even to thrive in twilight conditions.

Starting where the fish start

To understand how newly born deep-sea fishes see, we had to start where they do: in the twilight zone of the ocean.

We caught larval fish from the Red Sea using fine-meshed nets towed from near the surface to a depth of around 200m. This way we got hold of three different species – the lightfish (Vinciguerria mabahiss) and the hatchetfish (Maurolicus mucronatus), both members of the dragonfishes, and a member of the lanternfishes, the skinnycheek lanternfish (Benthosema pterotum). Next, we studied what their photoreceptor cells looked like on the outside and how they were wired on the inside.

First, we used high-resolution microscopy to examine the cells’ shape in great detail. Then we investigated retinal gene expression to identify which vision genes were activated as the fish grew. Finally, we got some experts in computational modelling of visual proteins on board to simulate which wavelengths of light these tiny fishes may perceive.

By combining all the approaches, we were able to piece together a picture of how these animals see their world. This sounds relatively simple, but working with deep-sea fishes is anything but easy.

While these animals are generally thought of as monsters of the deep, in reality, most reach only about the size of a thumb – even when fully grown. They are also very fragile and difficult to get.

Working with larval specimens that are only a few millimetres long is even more difficult. However, by leveraging support from the deep-sea research community, we were fortunate enough to combine specimens from multiple research expeditions to piece together an unusually complete picture of visual development in these elusive animals.

So, what did we discover?

For decades, scientists have thought that, as vertebrates grow, the development of their retina follows a predictable pattern: cones form first, then rods. But the deep-sea fish we studied do not follow this rule.

We found that, as larvae, they mostly use a mix-and-match type of hybrid photoreceptor. The cells they are using early on look like rods but use the molecular machinery of cones, making them rod-like cones.

In some of the species we studied, these hybrid cells were a temporary solution, replaced by “normal” rods as the fish grew and migrated into deeper, darker waters.

However, in the hatchetfish, which spends its whole life in twilight, the adults keep their rod-like cone cells throughout life, essentially building their entire visual system around this extra type of cell.

Our research shows this is not a minor tweak to the system. Instead, it represents a fundamentally different developmental pathway for vertebrate vision.

Biology doesn’t fit into neat boxes

So why bother with these hybrid cells?

It seems that, to overcome the visual limitations of the twilight zone, rod-like cones offer the best of both worlds: the light-capturing ability of rods combined with the faster, less easily saturated properties of cones. For a tiny fish trying to survive in the murky midwater, this could mean the difference between spotting dinner or becoming it.

For more than a century, biology textbooks have taught that vertebrate vision is built from two clearly defined cell types. Our findings show these tidy categories are much more blurred.

Deep-sea fish larvae combine features of both rods and cones into a single, highly specialised cell optimised for life in between light and darkness. In the murky depths of the ocean, deep-sea fish larvae have quietly rewritten the rules of how eyes can be built, and in doing so, remind us that biology rarely fits into neat boxes. The Conversation

Fabio Cortesi, ARC Future Fellow, Faculty of Science, The University of Queensland and Lily Fogg, Postdoctoral Researcher, Helsinki Institute of Life Science, University of Helsinki

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Highly Fatal Virus May Finally Be Treatable with First Vaccine–Clinical Trials Starting

The Nipah virus pictured in red – credit, US NIH

In January, India recorded a mini-outbreak of the Nipah virus, an often lethal disease spread by contact between humans and animals.

There was little that could be done for the victims, as no specialized treatment for Nipah virus exists beyond normal supportive care: managing symptoms, rest, and hydration.

Some well-studied antiviral medications, such as ribavirin, remdesivir, acyclovir, and favipiravir, have seen speculative use during certain outbreaks, but their real efficacy is unclear.

Now though, the University of Tokyo’s Research Center for Advanced Science and Technology has developed a potential Nipah virus vaccine by inserting some of the virus’s genetic material into a modified measles vaccine. Early trials in hamsters have shown it to be safe and effective.

Nipah virus fatality rates range from 40% to 75%. It typically spreads through contact between humans and bats, often when people consume tree fruit contaminated with bat saliva. Once contracted, it can spread quickly between humans via any form of fluid exchange.

The virus is present in the tropics and often in rural areas where access to medical care may be limited.

Tokyo University’s vaccine candidate is now on its way to Belgium for Phase 1 testing in humans, where, with the help of a nonprofit called the European Vaccine Initiative, it will be examined for safety across 60 test participants. The trials are set to begin in April.

AI tool can simulate complex fusion plasma in seconds

(Image: UKAEA)

A team of scientists from the UK Atomic Energy Authority, the Johannes Kepler University Linz, and Emmi AI has developed an artificial intelligence tool - named GyroSwin - which can create simulations up to 1,000 times faster than traditional computational methods.

Magnetic nuclear fusion is considered a promising technology for sustainable and emission-free energy supply. However, to achieve fusion, machines need to confine plasma at extreme temperatures using powerful magnets. Managing turbulence within the plasma is a key fusion challenge, so it needs to be accurately modelled.

Plasma scientists rely on state-of-the-art numerical simulations, using five-dimensional (5D) gyrokinetics, which includes three spatial dimensions plus two additional dimensions which account for parallel and perpendicular velocity of particles within the plasma. This 5D approach requires immense supercomputing power. Traditional simulations are extremely slow and computationally expensive, significantly lengthening design and development cycles. Until now, computational methods simulated a plasma by directly calculating its complex dynamics.

GyroSwin uses the latest AI methods to learn the 5D simulation dynamics and the resulting surrogate models can run in seconds, in contrast to the hours or even days for conventional simulations. It was trained on six terabytes of data. This speed allows for much faster, more agile prediction of plasma turbulence, crucial for optimising fusion machine designs.

"Designing, developing, and operating a fusion power plant will involve millions of plasma simulations," said Rob Akers, Director of Computing Programmes at UKAEA. "Reducing runtimes from hours or days to minutes or seconds - whilst preserving sufficient accuracy - will be essential for making this challenge manageable. Pioneering AI-based tools like GyroSwin therefore show great promise for being genuinely transformative around time-to-solution and cost."

Processing 5D data has never previously been tackled by an AI surrogate model, and GyroSwin outperforms other AI methods it has been compared against, UKAEA noted. This increased performance is made possible because GyroSwin preserves key physical information from a fusion plasma, including the length scale of fluctuations, and the sheared flows that can reduce turbulence - all crucial to the physical interpretability of plasma simulations.

"We love scientific challenges, and building AI models that accelerate 5D gyrokinetic simulations is definitely one of the toughest challenges out there," said Johannes Brandstetter, Professor at JKU, co-founder and Chief Scientist at Emmi. "We are very proud of how far we got in this great collaboration, but we know that we have just scratched the surface."

UKAEA will now research how GyroSwin's advanced capability can be applied to next generation power plants such as the UK's Spherical Tokamak for Energy Production (STEP), where millions of simulations will potentially be required to optimise plasma scenario designs with uncertainty quantification. As more complex physics is included for power plant conditions, simulations become even more lengthy, making faster plasma modelling essential. The GyroSwin project was part-funded by the International Computing element of the UK Government's Fusion Futures Programme.

Apes Show Ability to Imagine in ‘Tea Party’ Experiments, and Scientists are Very Excited

43-year-old bonobo named Kanzi – Courtesy of Ape Initiative / Johns Hopkins / SWNS

Apes share the human ability to imagine and pretend, suggests new research that included a series of tea party experiments.

Scientists at Johns Hopkins University in Baltimore, Maryland, called it the first study to show the capacity for pretending is not unique to mankind.

They learned that apes can use their imagination and play pretend. One bonobo engaged with cups of imaginary juice and bowls of pretend grapes “consistently and robustly” across three experiments, challenging long-held assumptions about the abilities of animals.

The findings, published this week in the journal Science, suggest that the capacity to understand pretend objects is within the cognitive potential of, at least, an “enculturated ape”, and likely dates back six to nine million years, to our common evolutionary ancestors.

“It really is game-changing that their mental lives go beyond the here and now,” said study co-author Dr. Christopher Krupenye.

“Imagination has long been seen as a critical element of what it is to be human, but the idea that it may not be exclusive to our species is really transformative.

“Jane Goodall discovered that chimps make tools and that led to a change in the definition of what it means to be human—and this, too, really invites us to reconsider what makes us special and what mental life is out there among other creatures.”

He said that, by the age of two, human children can engage in pretend scenarios, like tea parties. Even at 15 months old, infants show surprise when they see a person “drinking” from a cup after pretending to empty it.


There had been no previous studies of pretend behavior in non-human animals, despite several reports of animals seemingly engaging in pretend behavior, both in the wild and in zoos or captivity.

For instance, in the wild, young female chimps have been observed carrying and playing with sticks, holding them like mothers would hold their infants. And a chimp in captivity seemed to drag imaginary blocks along the floor after playing with real wooden blocks.

Dr. Krupenye and co-author Amalia Bastos, a former Johns Hopkins postdoctoral fellow who is now a lecturer at the University of St. Andrews in Scotland, wondered if they could test the capacity to pretend in a controlled environment.

They created experiments similar to a child’s tea party to test Kanzi, a 43-year-old bonobo living at Ape Initiative in Iowa, the world’s only research center and sanctuary dedicated exclusively to the study and conservation of bonobos, our closest primate relative.

Kanzi had been anecdotally reported to engage in pretense, and could respond to verbal prompts by pointing.

In each test, a researcher and Kanzi faced one another, tea party-style, across a table. In the first task there were two transparent cups on the table, both empty, alongside an empty transparent pitcher.

Kanzi – Courtesy of Ape Initiative / Johns Hopkins / SWNS

The researcher tipped the pitcher to “pour” a little pretend juice into each cup, then pretended to dump the juice out of one cup, shaking it a bit to really get it out.

The researcher then asked Kanzi: “Where’s the juice?”

The bonobo pointed to the correct cup that still contained pretend juice, even when the researcher changed the position of the cup filled with pretend juice.


In case Kanzi thought there was real juice in the cup, even if he couldn’t see it, the team ran a second experiment, during which a cup of real juice was placed alongside the cup of pretend juice.

When Kanzi was asked what he wanted, he pointed toward the real juice almost every time.

A third experiment repeated the same concept, except with grapes. A researcher pretended to sample a grape from an empty container, then placed it inside one of the two jars.

After pretending to empty one of the containers, he asked Kanzi: “Where’s the grape?”

Kanzi again indicated the location of the pretend object. The researchers said Kanzi wasn’t perfect, but he was consistently correct.

“It’s extremely striking and very exciting that the data seem to suggest that apes, in their minds, can conceive of things that are not there,” said Dr. Bastos.

“Kanzi is able to generate an idea of this pretend object and, at the same time, know it’s not real.”

The researchers now want to test whether other apes and animals can engage in pretend play or track pretend objects. They also hope to explore other facets of imagination in apes, perhaps their ability to think about the future or to think about what’s going on in the minds of others.

“Imagination is one of those things that in humans gives us a rich mental life,” said Dr. Krupenye. “And if some roots of imagination are shared with apes, that should make people question their assumption that other animals are just living robotic lifestyles constrained to the present. We should be compelled by these findings to care for these creatures with rich and beautiful minds and ensure they continue to exist.”

New Ultrasonic Imaging System Can Detect Deadly Defects in All Types of Concrete

– credit Fujikawa et al. with background / SWNS

If a physician needs to see what’s gone wrong inside a human body, it’s easy enough to order an ultrasound scan. But if a structural engineer wants to do the same for a block of concrete, the options are of limited effectiveness.

The range of materials that concrete contains, such as stone, clay, chalk, slate, iron ore, and sand, scatters normal sound waves, making clear images difficult to obtain.

Now, Japanese and American scientists have teamed up to develop a system that can identify interior defects in concrete buildings and bridges without destroying their structure.

Team members explain in a news release that their method sends sound waves into the material and captures the waves that echo back to create images of what’s inside, just like an ultrasound.

“In our approach, the ultrasonic wave is broadband, using a wide range of ultrasonic frequencies rather than operating around a single, fixed frequency,” said Professor Yoshikazu Ohara from Tohoku University in Japan.

“The receiver is capable of accepting an even broader range of frequencies. By automatically adapting the frequency to the material, our system improves the contrast between defects and background material in concrete.”

Ohara and his colleagues joined with Los Alamos National Laboratory in New Mexico and Texas A&M University to create the system.

A chief challenge is that it’s hard to know which frequencies of sound waves will survive traveling through concrete, as different material therein may interfere with different wavelengths.

To accommodate the uncertainty, the team used two devices: one to generate a wide range of frequencies to send into the material and another, a laser Doppler vibrometer, to capture the waves that emerge.

The system, described in the journal Applied Physics Letters, can handle a wide range of frequencies, which means that even if ultrasonic waves are scattered by materials in the concrete, those that do make it through are still detected, regardless of what frequency they are.

“As the concrete filters out certain frequencies, the laser Doppler vibrometer simply captures whatever frequencies remain,” said Professor Ohara. “Unlike conventional systems, we don’t have to swap transducers or adjust the frequency beforehand. The system adapts automatically.”

The result is a high-resolution 3D image of the defect and its location in the concrete. For a repair planner or field technician, this provides ‘concrete’ information: how deep the defect is from the surface, how large it is, and how it extends in three dimensions, making it possible to plan repairs more efficiently.

Winter Olympians often compete in freezing temperatures – physiology and advances in materials science help keep them warm

Cara Ocobock, University of Notre Dame and Gabriel R. Burks, University of Notre Dame

The Winter Olympics and Paralympics are upon us once again. This year the games come to Milan and Cortina d’Ampezzo, Italy, where weather forecasts are predicting temperatures in the upper 30s to mid-40s Fahrenheit (1 to 10 degrees Celsius).

These temperatures are a good deal warmer than one might expect for winter, particularly in a mountainous area. They’re warm enough that athletes will need to adjust how they are preparing their equipment for competition, yet still cold enough to affect the physiology of athletes and spectators alike.

As a biological anthropologist and a materials scientist, we’re interested in how the human body responds to different conditions and how materials can help people improve performance and address health challenges. Both of these components will play a key role for Olympic athletes hoping to perform at their peak in Italy.

Athletes in the cold

The athletes taking part in outdoor events are no strangers to cold and unpredictable weather conditions. It is an inherent part of their sports. Though it is highly unlikely the athletes this year will be exposed to extreme cold, the outdoor conditions will still affect their performance.

One concern is dehydration, which can be less noticeable, as sweating is typically less frequent and intense in cold conditions. However, cold temperatures also mean lower relative humidity. This dry air means the body needs to use more of its own water to moisten the air before it reaches the delicate lungs. Athletes breathing heavily during competition are losing more body water that way than they would in more temperate conditions.

When cold, the body also tends to narrow its blood vessels to better maintain core body temperature. Narrower blood vessels lose less heat to the cooler air, but this results in the body pushing more fluid out of the circulatory system and toward the kidneys, which then increases urine output.

Though the athletes may not be sweating to the same degree as they would in warmer temperatures, they are still sweating. Athletes dress to improve their performance and protect themselves from cold. The layers of clothing and material used in conjunction with the heat produced from physical activity can lead to sweating and create a hot, wet space between the athlete’s body and what they are wearing.

This space is not only another site of water loss, but also a potential problem for athletes who need to take part in different rounds or runs for their competition – for example, the initial heats for skiing or snowboarding.

These athletes are physically active and working up a sweat, and then they wait around for their next heat. During this waiting period, that damp layer of sweat will make them more vulnerable to body heat loss and cold injury such as frostbite or hypothermia. Athletes must stay warm between rounds of competition.

Science of winter apparel

Staying warm is all about materials selection and construction.

Many apparel companies adopt a three-layer system approach to keep wearers warm, dry and comfortable. Specifically, there is a bottom layer – in direct contact with the skin – that is typically composed of a moisture-wicking synthetic fabric such as nylon or a natural fabric such as wool.

The second layer in winter apparel is an insulating one that is generally porous to trap warm air generated by the body and to slow heat loss. Great options for this are down and fleece.

The final layer is the external protection layer, which keeps you dry and protected from the elements. This layer needs to be waterproof and breathable to keep the inner insulating layers dry but to simultaneously let out sweat. Polyester and acrylic are good options here, as they are lightweight, durable and resist moisture.
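One way to see why layering works is to treat each layer as a thermal resistance in series, the way resistors add in a circuit: heat loss per square metre is the temperature difference divided by the total resistance. A rough sketch, with illustrative R-values rather than measured data for any real garment:

```python
# Treat each clothing layer as a thermal resistance in series.
# The R-values below are illustrative guesses, not measured data.

skin_c, air_c = 33.0, 2.0            # skin and outside air temperature, °C

layers = {                            # thermal resistance, m²·K/W (assumed)
    "wicking base layer": 0.04,
    "fleece/down insulation": 0.35,
    "waterproof shell": 0.06,
}

total_r = sum(layers.values())
heat_flux = (skin_c - air_c) / total_r   # W lost per m² of body surface
print(f"total R = {total_r:.2f} m²·K/W, heat loss ≈ {heat_flux:.0f} W/m²")
```

In this simple model, most of the resistance comes from the porous insulating layer, which is why trapping warm air matters far more than the shell’s thickness.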

The gear athletes wear can be customized to their needs. For example, the synthetic fabrics used on the innermost layer are versatile, and engineers can introduce new properties and functionalities for users. Adding a specific coating to a fabric like nylon can give it new properties – such as wind and water resistance.

Frequently, both the synthetic fibers and the coatings materials scientists add to them are made up of polymers, which are long chains of molecules. They can be human-made and petroleum-based, like polyethylene trash bags, polyester and Teflon. But polymers can also occur naturally: your DNA and the proteins in your body are examples.

In addition to polymer technology, conventional battery-powered heating jackets are also an option.

Smart materials

As an added bonus, there is also a class of smart materials called phase change materials that are made of polymers and composite materials. They automatically absorb excess body heat when too much is created and release it again to the body when needed to passively regulate your body temperature. These materials release or take in heat as they transition between solid and liquid states and respond to the body’s natural cues.

Phase change materials are less about warming you up. Instead, they work by keeping your temperature balanced.

While these aren’t commonly used in the gear athletes wear, NASA has been experimenting with them for a long time, and many commercially available products leverage this technology. Cooling fabrics, such as bedding and towels, are often made of phase change textiles because they do not overheat you.
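The buffering behaviour of phase change materials can be illustrated with a toy energy-balance model: added heat first raises the material’s temperature, then is absorbed as latent heat at the melting point, where the temperature plateaus. All numbers below are illustrative, not properties of any real material.

```python
# Toy phase change model: sensible heating below the melt point,
# then latent-heat absorption holds the temperature steady.
# All values are made up for illustration.

mass = 0.05                # kg of PCM in a garment panel (assumed)
c_solid = 2000.0           # specific heat, J/(kg·K)
latent = 180_000.0         # latent heat of fusion, J/kg
melt_c = 32.0              # melting point near skin temperature, °C

temp_c, stored_latent = 28.0, 0.0
history = []
for _ in range(60):                      # body adds 100 J each step
    q = 100.0
    if temp_c < melt_c:                  # sensible heating below melt point
        temp_c = min(melt_c, temp_c + q / (mass * c_solid))
    elif stored_latent < mass * latent:  # melting: temperature plateaus
        stored_latent += q
    history.append(round(temp_c, 2))

print(history[:5], "... then a plateau at", melt_c)
```

The temperature climbs for a few steps, then sits at the melting point while incoming heat is stored as latent heat; on cooling, a real material would release that heat back, which is the passive regulation the paragraph describes.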

Risks to the rest of us

Athletes are not the only ones at risk for cold injury.

While most of us will be watching the Games in the comfort of indoor heating, thousands of spectators and support staff will be attending or working at those outdoor events in person. Unlike the athletes, these individuals will not have the added benefit of extra body heat produced by exercise, so the nonathletes in attendance will be at greater risk in the cold.

If you’re planning to spectate or work at an event this winter, drink more water than usual and time your bathroom breaks accordingly. Plan to wear several layers of clothing you can add and remove as needed, and pay special attention to the more vulnerable parts of the body, such as the hands, feet and nose.

Colder temperatures elicit a variety of metabolic responses in the body. One example is shivering, caused by tiny muscle contractions that produce heat. Your body’s brown adipose tissue – a type of fat – also becomes active, burning stored energy to produce heat.

Both of these processes burn extra calories, so expect to be more hungry if you’re out in the cold for a while. Trips to the bathroom or to get food are a welcome opportunity to warm up – especially those hands and feet.

It is easy to think of Olympians as exceptional athletes at the mercy of Mother Nature’s cold wrath. However, both the human body’s natural physiology and the impressive advances scientists have made in winter apparel technology will keep these athletes warm and performing at their best.

Cara Ocobock, Assistant Professor of Anthropology, University of Notre Dame and Gabriel R. Burks, Assistant Professor of Chemical and Biomolecular Engineering, University of Notre Dame

This article is republished from The Conversation under a Creative Commons license. Read the original article.
