Science
Separating out signals recorded at the seafloor
Research by Roger Bryant and David Fike shows that pyrite sulfur isotopes mainly record local depositional conditions rather than global ocean chemistry, a finding that reshapes how scientists use the mineral to study past marine environments.
![Signals](https://i0.wp.com/q5i.09c.myftpupload.com/wp-content/uploads/2023/11/image-47.jpeg?resize=740%2C555&ssl=1)
Roger Bryant studied ocean floor core samples at the Secondary Ion Mass Spectrometry (SIMS) facility at Washington University in St. Louis during his PhD studies. Bryant and David Fike used these data to support a discovery that will fundamentally change how scientists use pyrite sulfur isotopes to study ocean conditions.
Newswise — Blame it on plate tectonics. The deep ocean is never preserved, but instead is lost to time as the seafloor is subducted. Geologists are mostly left with shallower rocks from closer to the shoreline to inform their studies of Earth history.
Signals from the Sea
“We have only a good record of the deep ocean for the last ~180 million years,” said David Fike, the Glassberg/Greensfelder Distinguished University Professor of Earth, Environmental, and Planetary Sciences in Arts & Sciences at Washington University in St. Louis. “Everything else is just shallow-water deposits. So it’s really important to understand the bias that might be present when we look at shallow-water deposits.”
One of the ways that scientists like Fike use deposits from the seafloor is to reconstruct timelines of past ecological and environmental change. Researchers are keenly interested in how and when oxygen began to build up in the oceans and atmosphere, making Earth more hospitable to life as we know it.
For decades they have relied on pyrite, the iron-sulfide mineral known as “fool’s gold,” as a sensitive recorder of conditions in the marine environment where it is formed. By measuring the bulk isotopic composition of sulfur in pyrite samples — the relative abundance of sulfur atoms with slightly different mass — scientists have tried to better understand ancient microbial activity and interpret global chemical cycles.
But the outlook for pyrite is not so shiny anymore. In a pair of companion papers published Nov. 24 in the journal Science, Fike and his collaborators show that variations in pyrite sulfur isotopes may not represent the global processes that have made them such popular targets of analysis.
Instead, Fike’s research demonstrates that pyrite responds predominantly to local processes that should not be taken as representative of the whole ocean. A new microanalysis approach developed at Washington University helped the researchers separate signals in pyrite that reveal the relative influences of microbial activity and local climate.
For the first study, Fike worked with Roger Bryant, who completed his graduate studies at Washington University, to examine the grain-level distribution of pyrite sulfur isotope compositions in a sample of recent glacial-interglacial sediments. They developed and used a cutting-edge analytical technique with the secondary-ion mass spectrometer (SIMS) in Fike’s laboratory.
“We analyzed every individual pyrite crystal that we could find and got isotopic values for each one,” Fike said. By considering the distribution of results from individual grains, rather than the average (or bulk) results, the scientists showed that it is possible to tease apart the role of the physical properties of the depositional environment, like the sedimentation rate and the porosity of the sediments, from the microbial activity in the seabed.
“We found that even when bulk pyrite sulfur isotopes changed a lot between glacials and interglacials, the minima of our single grain pyrite distributions remained broadly constant,” Bryant said. “This told us that microbial activity did not drive the changes in bulk pyrite sulfur isotopes and refuted one of our major hypotheses.”
“Using this framework, we’re able to go in and look at the separate roles of microbes and sediments in driving the signals,” Fike said. “That to me represents a huge step forward in being able to interpret what is recorded in these signals.”
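To make the distinction between a bulk average and a grain-level distribution concrete, here is a minimal sketch in Python. The per-grain δ34S values are hypothetical, chosen only to illustrate the pattern described above; this is not the authors’ actual analysis pipeline.

```python
# A minimal sketch (not the study's pipeline) of how per-grain sulfur-isotope
# values can be summarized differently than a bulk average.
# The delta-34S values below are hypothetical, for illustration only.
import numpy as np

glacial_grains = np.array([-38.0, -35.5, -20.1, -12.4, -5.0, 3.2])        # per-grain d34S (permil), hypothetical
interglacial_grains = np.array([-37.5, -30.0, -25.8, -22.3, -18.9, -15.0])

for name, grains in [("glacial", glacial_grains), ("interglacial", interglacial_grains)]:
    bulk = grains.mean()       # analogous to a bulk (averaged) measurement
    minimum = grains.min()     # the lower tail of the single-grain distribution
    print(f"{name}: bulk mean = {bulk:.1f} permil, distribution minimum = {minimum:.1f} permil")

# In this toy example the bulk means differ between the two intervals while
# the minima stay broadly similar -- the kind of pattern the researchers used
# to argue that microbial activity was not driving the bulk isotope shifts.
```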
In the second paper, led by Itay Halevy of the Weizmann Institute of Science and co-authored by Fike and Bryant, the scientists developed and explored a computer model of marine sediments, complete with mathematical representations of the microorganisms that degrade organic matter and turn sulfate into sulfide and the processes that trap that sulfide in pyrite.
“We found that variations in the isotopic composition of pyrite are mostly a function of the depositional environment in which the pyrite formed,” Halevy said. The new model shows that a range of parameters of the sedimentary environment affect the balance between sulfate and sulfide consumption and resupply, and that this balance is the major determinant of the sulfur isotope composition of pyrite.
“The rate of sediment deposition on the seafloor, the proportion of organic matter in that sediment, the proportion of reactive iron particles, the density of packing of the sediment as it settles to the seafloor — all of these properties affect the isotopic composition of pyrite in ways that we can now understand,” he said.
Importantly, none of these properties of the sedimentary environment are strongly linked to the global sulfur cycle, to the oxidation state of the global ocean, or essentially any other property that researchers have traditionally used pyrite sulfur isotopes to reconstruct, the scientists said.
“The really exciting aspect of this new work is that it gives us a predictive model for how we think other pyrite records should behave,” Fike said. “For example, if we can interpret other records — and better understand that they are driven by things like local changes in sedimentation, rather than global parameters about ocean oxygen state or microbial activity — then we can try to use this data to refine our understanding of sea level change in the past.”
Source: Washington University in St. Louis
The science section of our news blog STM Daily News provides readers with captivating and up-to-date information on the latest scientific discoveries, breakthroughs, and innovations across various fields. We offer engaging and accessible content, ensuring that readers with different levels of scientific knowledge can stay informed. Whether it’s exploring advancements in medicine, astronomy, technology, or environmental sciences, our science section strives to shed light on the intriguing world of scientific exploration and its profound impact on our daily lives. From thought-provoking articles to informative interviews with experts in the field, STM Daily News Science offers a harmonious blend of factual reporting, analysis, and exploration, making it a go-to source for science enthusiasts and curious minds alike. https://stmdailynews.com/category/science/
Lifestyle
Biden helped bring science out of the lab and into the community − emphasizing research focused on solutions
![Biden](https://i0.wp.com/stmdailynews.com/wp-content/uploads/2025/01/file-20250116-15-23uxzk-jpg.webp?resize=740%2C492&ssl=1)
Arthur Daemmrich, Arizona State University
President Joe Biden was inaugurated in January 2021 amid a devastating pandemic, with over 24 million COVID-19 cases and more than 400,000 deaths in the U.S. recorded at that point.
Operation Warp Speed, initiated by the Trump administration in May 2020, meant an effective vaccine was becoming available. Biden quickly announced a plan to immunize 100 million Americans over the next three months. By the end of April 2021, 145 million Americans – nearly half the population – had received one vaccine dose, and 103 million were considered fully vaccinated. Science and technology policymakers celebrated this coordination across science, industry and government to address a real-world crisis as a 21st-century Manhattan Project.
From my perspective as a scholar of science and technology policy, Biden’s legacy includes structural, institutional and practical changes to how science is conducted. Building on approaches developed over the course of many years, the administration elevated the status of science in the government and fostered community participation in research.
Raising science’s profile in government
The U.S. has no single ministry of science and technology. Instead, agencies and offices across the executive branch carry out scientific research at several national labs and fund research by other institutions. By elevating the White House Office of Science and Technology Policy to a Cabinet-level organization for the first time in its history, Biden gave the agency greater influence in federal decision-making and coordination.
Formally established in 1976, the agency provides the president and senior staff with scientific and technical advice, bringing science to bear on executive policies. Biden’s inclusion of the agency’s director in his Cabinet was a strong signal about the elevated role science and technology would play in the administration’s solutions to major societal challenges.
Under Biden, the Office of Science and Technology Policy established guidelines that agencies across the government would follow as they implemented major legislation. This included developing technologies that remove carbon dioxide from the atmosphere to address climate change, rebuilding America’s chip industry, and managing the rollout of AI technologies.
Instead of treating the ethical and societal dimensions of scientific and technological change as separate from research and development, the agency advocated for a more integrated approach. This was reflected in the appointment of social scientist Alondra Nelson as the agency’s first deputy director for science and society, and science policy expert Kei Koizumi as principal deputy director for policy. Ethical and societal considerations were added as evaluation criteria for grants. And initiatives such as the AI bill of rights and frameworks for research integrity and open science further encouraged all federal agencies to consider the social effects of their research.
The Office of Science and Technology Policy also introduced new ways for agencies to consult with communities, including Native Nations, rural Americans and people of color, in order to avoid known biases in science and technology research. For example, the agency issued government-wide guidance to recognize and include Indigenous knowledge in federal programs. Agencies such as the Department of Energy have incorporated public perspectives while rolling out atmospheric carbon dioxide removal technologies and building new hydrogen hubs.
Use-inspired research
A long-standing criticism of U.S. science funding is that it often fails to answer questions of societal importance. Members of Congress and policy analysts have argued that funded projects instead overly emphasize basic research in areas that advance the careers of researchers.
In response, the Biden administration established the technology, innovation and partnerships directorate at the National Science Foundation in March 2022.
The directorate uses social science approaches to help focus scientific research and technology on their potential uses and effects on society. For example, engineers developing future energy technologies could start by consulting with the community about local needs and opportunities, rather than pitching their preferred solution after years of laboratory work. Genetic researchers could share both knowledge and financial benefits with the communities that provided the researchers with data.
Fundamentally, “use-inspired” research aims to reconnect scientists and engineers with the people and communities their work ultimately affects, going beyond publication in a journal accessible only to academics.
The technology, innovation and partnerships directorate established initiatives to support regional projects and multidisciplinary partnerships bringing together researchers, entrepreneurs and community organizations. These programs, such as the regional innovation engines and convergence accelerator, seek to balance the traditional process of grant proposals written and evaluated by academics with broader societal demand for affordable health and environmental solutions. This work is particularly key to parts of the country that have not yet seen visible gains from decades of federally sponsored research, such as regions encompassing western North Carolina, northern South Carolina, eastern Tennessee and southwest Virginia.
Community-based scientific research
The Biden administration also worked to involve communities in science not just as research consultants but also as active participants.
Scientific research and technology-based innovation are often considered the exclusive domain of experts from elite universities or national labs. Yet, many communities are eager to conduct research, and they have insights to contribute. There is a decades-long history of citizen science initiatives, such as birdwatchers contributing data to national environmental surveys and community groups collecting industrial emissions data that officials can use to make regulations more cost effective.
Going further, the Biden administration carried out experiments to create research projects in a way that involved community members, local colleges and federal agencies as more equal partners.
For example, the Justice40 initiative asked people from across the country, including rural and small-town Americans, to identify local environmental justice issues and potential solutions.
The National Institutes of Health’s ComPASS program funded community organizations to test and scale successful health interventions, such as identifying pregnant women with complex medical needs and connecting them to specialized care.
And the National Science Foundation’s Civic Innovation Challenge required academic researchers to work with local organizations to address local concerns, improving the community’s technical skills and knowledge.
Frontiers of science and technology policy
Researchers often cite the 1945 report Science: The Endless Frontier, written by former Office of Scientific Research and Development head Vannevar Bush, to describe the core rationales for using American taxpayer money to fund basic science. Under this model, funding science would lead to three key outcomes: a secure national defense, improved health, and economic prosperity. The report, however, says little about how to go from basic science to desired societal outcomes. It also makes no mention of scientists sharing responsibility for the direction and impact of their work.
The 80th anniversary of Bush’s report in 2025 offers an opportunity to move science out into society. At present, major government initiatives are following a technology push model that focuses efforts on only one or a few products and involves little consideration of consumer and market demand. Research has repeatedly demonstrated that consumer or societal pull, which attracts development of products that enhance quality of life, is key to successful uptake of new technologies and their longevity.
Future administrations can further advance science and address major societal challenges by considering how ready society is to take up new technologies and increasing collaboration between government and civil society.
Arthur Daemmrich, Professor of Practice in the School for the Future of Innovation in Society, Arizona State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Earth
The US natural gas industry is leaking way more methane than previously thought. Here’s why that matters
Research reveals that methane emissions from U.S. natural gas operations are significantly underestimated, with a leak rate of 2.3 percent, raising serious climate concerns and underscoring how difficult these emissions are to measure accurately.
![natural gas](https://i0.wp.com/images.theconversation.com/files/225549/original/file-20180629-117374-gyqk1p.jpg?w=740&ssl=1)
Anthony J. Marchese, Colorado State University and Dan Zimmerle, Colorado State University
Natural gas is displacing coal, which could help fight climate change because burning it produces fewer carbon emissions. But producing and transporting natural gas releases methane, a greenhouse gas that also contributes to climate change. How big is the methane problem?
For the past five years, our research teams at Colorado State University have made thousands of methane emissions measurements at more than 700 separate facilities in the production, gathering, processing, transmission and storage segments of the natural gas supply chain.
This experience has given us a unique perspective regarding the major sources of methane emissions from natural gas and the challenges the industry faces in terms of detecting and reducing, if not eliminating, them.
Our work, along with numerous other research projects, was recently folded into a new study published in the journal Science. This comprehensive snapshot suggests that methane emissions from oil and gas operations are much higher than current EPA estimates.
What’s wrong with methane
One way to quantify the magnitude of the methane leakage is to divide the amount of methane emitted each year by the total amount of methane pumped out of the ground each year from natural gas and oil wells. The EPA currently estimates this methane leak rate to be 1.4 percent. That is, for every cubic foot of natural gas drawn from underground reservoirs, 1.4 percent of it is lost into the atmosphere.
This study synthesized the results from a five-year series of 16 studies coordinated by the environmental advocacy group Environmental Defense Fund (EDF), which involved more than 140 researchers from over 40 institutions and 50 natural gas companies.
The effort brought together scholars based at universities, think tanks and the industry itself to make the most accurate estimate possible of the total amount of methane emitted from all U.S. oil and gas operations. It integrated data from a multitude of recent studies with measurements made on the ground and from the air.
All told, based on the results of the new study, the U.S. oil and gas industry is leaking 13 million metric tons of methane each year, which means the methane leak rate is 2.3 percent. This 60 percent difference between our new estimate and the EPA’s current one can have profound climate consequences.
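As a rough check on the figures quoted above, the short sketch below recomputes the comparison. The total amount of methane withdrawn is not stated in the article, so it is inferred from the 13 million metric tons emitted and the 2.3 percent leak rate and should be read as approximate.

```python
# Back-of-the-envelope check of the numbers in the text (a sketch, not the
# study's actual accounting). Total methane withdrawn is inferred, not reported.
emitted_mt = 13.0            # million metric tons of methane per year (from the study)
leak_rate_study = 0.023      # 2.3 percent
leak_rate_epa = 0.014        # EPA's 1.4 percent estimate

total_withdrawn_mt = emitted_mt / leak_rate_study                  # implied total, ~565 Mt/yr
relative_gap = (leak_rate_study - leak_rate_epa) / leak_rate_epa   # how much higher the new estimate is

print(f"Implied total methane withdrawn: {total_withdrawn_mt:.0f} Mt/yr")
print(f"Study leak rate exceeds EPA estimate by {relative_gap:.0%}")  # ~64%, the 'roughly 60 percent' gap
```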
Methane is a highly potent greenhouse gas, with more than 80 times the climate warming impact of carbon dioxide over the first 20 years after it is released.
An earlier EDF study showed that a methane leak rate of greater than 3 percent would result in no immediate climate benefits from retiring coal-fired power plants in favor of natural gas power plants.
That means even with a 2.3 percent leakage rate, the growing share of U.S. electricity powered by natural gas is doing something to slow the pace of climate change. However, these climate benefits could be far greater.
Also, at a methane leakage rate of 2.3 percent, many other uses of natural gas besides generating electricity are conclusively detrimental for the climate. For example, EDF found that replacing the diesel used in most trucks or the gasoline consumed by most cars with natural gas would require a leakage rate of less than 1.4 percent before there would be any immediate climate benefit.
What’s more, some scientists believe that the leakage rate could be even higher than this new estimate.
What causes these leaks
Perhaps you’ve never contemplated the long journey that natural gas travels before you can ignite the burners on the gas stove in your kitchen.
But on top of the 500,000 natural gas wells operating in the U.S. today, there are 2 million miles of pipes and millions of valves, fittings, tanks, compressors and other components operating 24 hours per day, seven days a week to deliver natural gas to your home.
That natural gas that you burn when you whip up a batch of pancakes may have traveled 1,000 miles or more as it wended through this complicated network. Along the way, there were ample opportunities for some of it to leak out into the atmosphere.
Natural gas leaks can be accidental, caused by malfunctioning equipment, but a lot of natural gas is also released intentionally to perform process operations such as opening and closing valves. In addition, the tens of thousands of compressors that increase the pressure and pump the gas along through the network are powered by engines that burn natural gas and their exhaust contains some unburned natural gas.
Since the natural gas delivered to your home is 85 to 95 percent methane, natural gas leaks are predominantly methane. While methane poses the greatest threat to the climate because of its greenhouse gas potency, natural gas contains other hydrocarbons that can degrade regional air quality and are bad for human health.
Inventory tallies vs. aircraft surveillance
The EPA Greenhouse Gas Inventory is done in a way experts like us call a “bottom-up” approach. It entails tallying up all of the nation’s natural gas equipment – from household gas meters to wellpads – estimating an annualized average emission rate for every category, and adding it all up.
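The sketch below illustrates that bottom-up logic in code. The equipment counts and per-unit emission factors are invented placeholders, not EPA values.

```python
# A toy illustration of the 'bottom-up' inventory logic described above.
# Counts and per-unit emission factors are hypothetical, for illustration only.
equipment = {
    # category: (count, average kg CH4 emitted per unit per year)
    "wellpads":         (500_000, 120.0),
    "compressors":      (30_000, 4_500.0),
    "household_meters": (70_000_000, 0.5),
}

total_kg = sum(count * factor for count, factor in equipment.values())
print(f"Bottom-up inventory total: {total_kg / 1e9:.3f} million metric tons CH4/yr")

# The approach stands or falls on how accurate the counts and the per-unit
# emission factors are -- the two weaknesses discussed next.
```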
There are two challenges to this approach. First, there are no accurate equipment records for many of these categories. Second, when components operate improperly or fail, emissions balloon, making it hard to develop an accurate and meaningful annualized emission rate for each source.
“Top-down” approaches, typically requiring aircraft, are the alternative. They measure methane concentrations upwind and downwind of large geographic areas. But this approach has its own shortcomings.
First, it captures all methane emissions, rather than just the emissions tied to natural gas operations – including the methane from landfills, cows and even the leaves rotting in your backyard. Second, these one-time snapshots may get distorted depending on what’s going on while planes fly around capturing methane data.
Historically, top-down approaches have produced estimates roughly twice as high as bottom-up estimates. Some regional top-down methane leak rate estimates have been as high as 8 percent, while some bottom-up estimates have been as low as 1 percent.
More recent work, including the Science study, has carried out coordinated campaigns in which on-the-ground and aircraft measurements are made concurrently, while emission events are carefully modeled.
Helpful gadgets and sound policy
On a sunny morning in October 2013, our research team pulled up to a natural gas gathering compressor station in Texas. Using a US$80,000 infrared camera, we immediately located an extraordinarily large leak of colorless, odorless methane that was invisible to the operator, who quickly isolated and fixed the problem.
We then witnessed the methane emissions decline tenfold – the facility leak rate fell from 9.8 percent to 0.7 percent before our eyes.
It is not economically feasible, of course, to equip all natural gas workers with $80,000 cameras, or to hire the drivers required to monitor every wellpad on a daily basis when there are 40,000 oil and gas wells in Weld County, Colorado, alone.
But new technologies can make a difference. Our team at Colorado State University is working with the Department of Energy to evaluate gadgetry that will rapidly detect methane emissions. Some of these devices can be deployed today, including inexpensive sensors that can be monitored remotely.
Technology alone won’t solve the problem, however. We believe that slashing the nation’s methane leak rate will require a collaborative effort between industry and government. And based on our experience in Colorado, which has developed some of the nation’s strictest methane emissions regulations, we find that best practices become standard practices with strong regulations.
We believe that the Trump administration’s efforts to roll back regulations, without regard to whether they are working or not, will not only have profound climate impacts. They will also jeopardize the health and safety of all Americans while undercutting efforts by the natural gas industry to cut back on the pollution it produces.
Anthony J. Marchese, Associate Dean for Academic and Student Affairs, Walter Scott, Jr. College of Engineering; Director, Engines and Energy Conversion Laboratory; Professor, Department of Mechanical Engineering, Colorado State University and Dan Zimmerle, Senior Research Associate and Director of METEC, Colorado State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Science
That Arctic blast can feel brutally cold, but how much colder than ‘normal’ is it really?
![Arctic blast](https://i0.wp.com/stmdailynews.com/wp-content/uploads/2025/01/file-20250106-15-5zzsh1-jpg.webp?resize=740%2C438&ssl=1)
Richard B. (Ricky) Rood, University of Michigan
An Arctic blast hitting the central and eastern U.S. in early January 2025 has been creating fiercely cold conditions in many places. Parts of North Dakota dipped to more than 20 degrees below zero, and people as far south as Texas woke up to temperatures in the teens. A snow and ice storm across the middle of the country added to the winter chill.
Forecasters warned that temperatures could be “10 to more than 30 degrees below normal” across much of the eastern two-thirds of the country during the first full week of the year.
But what does “normal” actually mean?
While temperature forecasts are important to help people stay safe, the comparison to “normal” can be quite misleading. That’s because what qualifies as normal in forecasts has been changing rapidly over the years as the planet warms.
Defining normal
One of the most widely used standards for defining a science-based “normal” is a 30-year average of temperature and precipitation. Every 10 years, the National Centers for Environmental Information updates these “normals,” most recently in 2021. The current span considered “normal” is 1991-2020. Five years ago, it was 1981-2010.
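The mechanics are simple enough to show in a few lines of Python: a “normal” is just the mean over a fixed 30-year window, and the window slides forward every decade. The temperature series below is synthetic, with a built-in warming trend, purely to show why the newer window yields a warmer baseline.

```python
# A minimal sketch of how a 30-year climate 'normal' is defined: the mean over
# a fixed window, recomputed each decade. Temperatures are synthetic annual
# means with an artificial warming trend, not real observations.
import numpy as np

years = np.arange(1951, 2021)
temps = 10.0 + 0.02 * (years - 1951) + np.random.default_rng(0).normal(0, 0.3, years.size)

def climate_normal(start, end):
    """Mean temperature over the inclusive window [start, end]."""
    mask = (years >= start) & (years <= end)
    return temps[mask].mean()

print(f"1981-2010 normal: {climate_normal(1981, 2010):.2f} C")
print(f"1991-2020 normal: {climate_normal(1991, 2020):.2f} C")  # warmer baseline after the decadal update
```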
But temperatures have been rising over the past century, and the trend has accelerated since about 1980. This warming is fueled by the mining and burning of fossil fuels, which increase carbon dioxide and methane in the atmosphere. These greenhouse gases trap heat close to the planet’s surface, leading to rising temperatures.
Because global temperatures are warming, what’s considered normal is warming, too.
So, when a 2025 cold snap is reported as the difference between the actual temperature and “normal,” it will appear to be colder and more extreme than if it were compared to an earlier 30-year average.
Thirty years is a significant portion of a human life. For people under age 40 or so, the use of the most recent averaging span might fit with what they have experienced.
But it doesn’t speak to how much the Earth has warmed.
How cold snaps today compare to the past
To see how today’s cold snaps – or today’s warming – compare to a time before global warming began to accelerate, NASA scientists use 1951-1980 as a baseline.
The reason becomes evident when you compare maps.
For example, January 1994 was brutally cold east of the Rocky Mountains. If we compare those 1994 temperatures to today’s “normal” – the 1991-2020 period – the U.S. looks a lot like maps of early January 2025’s temperatures: Large parts of the Midwest and eastern U.S. were more than 7 degrees Fahrenheit (4 degrees Celsius) below “normal,” and some areas were much colder.
But if we compare January 1994 to the 1951-1980 baseline instead, that cold spot in the eastern U.S. isn’t quite as large or extreme.
Where the temperatures in some parts of the country in January 1994 approached 14.2 F (7.9 C) colder than normal when compared to the 1991-2020 average, they only approached 12.4 F (6.9 C) colder than the 1951-1980 average.
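The sketch below reproduces that comparison using the anomalies quoted above; the absolute January temperature is hypothetical, included only to make the arithmetic concrete.

```python
# Sketch of why the choice of baseline changes how extreme a cold snap looks.
# The anomalies are the ones quoted above; the observed January temperature
# is hypothetical, chosen only to make the arithmetic concrete.
jan_1994_temp_f = 20.0          # hypothetical observed January mean, deg F
anomaly_vs_1991_2020 = -14.2    # from the text
anomaly_vs_1951_1980 = -12.4    # from the text

normal_1991_2020 = jan_1994_temp_f - anomaly_vs_1991_2020
normal_1951_1980 = jan_1994_temp_f - anomaly_vs_1951_1980

print(f"1991-2020 baseline: {normal_1991_2020:.1f} F")
print(f"1951-1980 baseline: {normal_1951_1980:.1f} F")
print(f"Newer baseline is {normal_1991_2020 - normal_1951_1980:.1f} F warmer, "
      "so the same cold snap looks more extreme against it.")
```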
As a measure of a changing climate, updating the average 30-year baseline every decade makes warming appear smaller than it is, and it makes cold snaps seem more extreme.
Conditions for heavy lake-effect snow
The U.S. will continue to see cold air outbreaks in winter, but as the Arctic and the rest of the planet warm, the most frigid temperatures of the past will become less common.
That warming trend helps set up a remarkable situation in the Great Lakes that we’re seeing in January 2025: heavy lake-effect snow across a large area.
As cold Arctic air encroached from the north in January, it encountered a Great Lakes basin where the water temperature was still above 40 F (4.4 C) in many places. Ice covered less than 2% of the lakes’ surface on Jan. 4.
That cold dry air over warmer open water causes evaporation, providing moisture for lake-effect snow. Parts of New York and Ohio along the lakes saw well over a foot of snow in the span of a few days.
The accumulation of heat in the Great Lakes, observed year after year, is leading to fundamental changes in winter weather and the winter economy in the states bordering the lakes.
It’s also a reminder of the persistent and growing presence of global warming, even in the midst of a cold air outbreak.
Richard B. (Ricky) Rood, Professor Emeritus of Climate and Space Sciences and Engineering, University of Michigan
This article is republished from The Conversation under a Creative Commons license. Read the original article.