Science

How a Record-Breaking Copper Catalyst Converts CO2 Into Liquid Fuels


Researchers at Berkeley Lab have made real-time movies of copper nanoparticles as they evolve to convert carbon dioxide and water into renewable fuels and chemicals. Their new insights could help advance the next generation of solar fuels.
Credit: Yao Yang/Berkeley Lab. Courtesy of Nature.
Video of a 4D-STEM experiment: Berkeley Lab researchers used a new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. The new electrochemical liquid cell allows researchers to resolve images of objects smaller than 10 nanometers.

Newswise — Since the 1970s, scientists have known that copper has a special ability to transform carbon dioxide into valuable chemicals and fuels. But for many years, scientists have struggled to understand how this common metal works as an electrocatalyst, a material that uses energy from electrons to chemically transform molecules into different products.

Now, a research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) has gained new insight by capturing real-time movies of copper nanoparticles (copper particles engineered at the scale of a billionth of a meter) as they convert CO2 and water into renewable fuels and chemicals: ethylene, ethanol, and propanol, among others. The work was reported in the journal Nature last week. 

“This is very exciting. After decades of work, we’re finally able to show – with undeniable proof – how copper electrocatalysts excel in CO2 reduction,” said Peidong Yang, a senior faculty scientist in Berkeley Lab’s Materials Sciences and Chemical Sciences Divisions who led the study. Yang is also a professor of chemistry and materials science and engineering at UC Berkeley. “Knowing how copper is such an excellent electrocatalyst brings us steps closer to turning CO2 into new, renewable solar fuels through artificial photosynthesis.”

The work was made possible by combining a new imaging technique called operando 4D electrochemical liquid-cell STEM (scanning transmission electron microscopy) with a soft X-ray probe to investigate the same sample environment: copper nanoparticles in liquid. First author Yao Yang, a UC Berkeley Miller postdoctoral fellow, conceived the groundbreaking approach under the guidance of Peidong Yang while working toward his Ph.D. in chemistry at Cornell University.

 

Scientists who study artificial photosynthesis materials and reactions have wanted to combine the power of an electron probe with X-rays, but the two techniques typically can’t be performed by the same instrument. 

Electron microscopes (such as STEM or TEM) use beams of electrons and excel at characterizing the atomic structure in parts of a material. In recent years, 4D STEM (or “2D raster of 2D diffraction patterns using scanning transmission electron microscopy”) instruments, such as those at Berkeley Lab’s Molecular Foundry, have pushed the boundaries of electron microscopy even further, enabling scientists to map out atomic or molecular regions in a variety of materials, from hard metallic glass to soft, flexible films. 

On the other hand, soft (or lower-energy) X-rays are useful for identifying and tracking chemical reactions in real time in an operando, or real-world, environment. 


But now, scientists can have the best of both worlds. At the heart of the new technique is an electrochemical “liquid cell” sample holder with remarkable versatility. A thousand times thinner than a human hair, the device is compatible with both STEM and X-ray instruments. 

The electrochemical liquid cell’s ultrathin design allows reliable imaging of delicate samples while protecting them from electron beam damage. A special electrode custom-designed by co-author Cheng Wang, a staff scientist at Berkeley Lab’s Advanced Light Source, enabled the team to conduct X-ray experiments with the electrochemical liquid cell. Combining the two allows researchers to comprehensively characterize electrochemical reactions in real time and at the nanoscale. 

Getting granular

During 4D-STEM experiments, Yao Yang and team used the new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. 

The experiments revealed a surprise: copper nanoparticles combined into larger metallic copper “nanograins” within seconds of the electrochemical reaction. 

To learn more, the team turned to Wang, who pioneered a technique known as “resonant soft X-ray scattering (RSoXS) for soft materials,” at the Advanced Light Source more than 10 years ago. 

With help from Wang, the research team used the same electrochemical liquid cell, but this time during RSoXS experiments, to determine whether copper nanograins facilitate CO2 reduction. Soft X-rays are ideal for studying how copper electrocatalysts evolve during CO2 reduction, Wang explained. By using RSoXS, researchers can monitor multiple reactions between thousands of nanoparticles in real time, and accurately identify chemical reactants and products.


The RSoXS experiments at the Advanced Light Source – along with additional evidence gathered at Cornell High Energy Synchrotron Source (CHESS) – proved that metallic copper nanograins serve as active sites for CO2 reduction. (Metallic copper, also known as copper(0), is a form of the element copper.) 

During CO2 electrolysis, the copper nanoparticles change their structure during a process called “electrochemical scrambling.” The copper nanoparticles’ surface layer of oxide degrades, creating open sites on the copper surface for CO2 molecules to attach, explained Peidong Yang. And as CO2 “docks” or binds to the copper nanograin surface, electrons are then transferred to CO2, causing a reaction that simultaneously produces ethylene, ethanol, and propanol along with other multicarbon products. 

“The copper nanograins essentially turn into little chemical manufacturing factories,” Yao Yang said.

Further experiments at the Molecular Foundry, the Advanced Light Source, and CHESS revealed that size matters. All of the 7-nanometer copper nanoparticles participated in CO2 reduction, whereas the larger nanoparticles did not. In addition, the team learned that only metallic copper can efficiently reduce CO2 into multicarbon products. The findings have implications for “rationally designing efficient CO2 electrocatalysts,” Peidong Yang said.

The new study also validated Peidong Yang’s findings from 2017: that 7-nanometer copper nanoparticles require low inputs of energy to start CO2 reduction. As an electrocatalyst, the 7-nanometer copper nanoparticles required a record-low driving force, about 300 millivolts less than that of typical bulk copper electrocatalysts. The best-performing catalysts that produce multicarbon products from CO2 typically operate at a high driving force of 1 volt.
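To put that 300-millivolt saving in perspective, here is a rough, generic electrochemistry estimate (a minimal sketch using Faraday's-law arithmetic, not a figure reported in the study) of the electrical energy it represents per mole of electrons transferred:

```python
# Back-of-the-envelope energy saving implied by a 300 mV lower driving force.
# Generic electrochemistry arithmetic; not a number taken from the Nature study.

FARADAY = 96485.0   # coulombs per mole of electrons
delta_v = 0.300     # volts saved relative to typical bulk copper electrocatalysts

energy_saved_j = FARADAY * delta_v          # joules per mole of electrons
print(f"Energy saved: {energy_saved_j / 1000:.0f} kJ per mole of electrons")

# Making one ethylene molecule (C2H4) from CO2 takes 12 electrons,
# so the saving per mole of ethylene is roughly:
print(f"Per mole of ethylene: {12 * energy_saved_j / 1000:.0f} kJ")
```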

The copper nanograins could potentially boost the energy efficiency and productivity of some catalysts designed for artificial photosynthesis, a field of research that aims to produce solar fuels from sunlight, water, and CO2. Currently, researchers within the Department of Energy-funded Liquid Sunlight Alliance (LiSA) plan to use the copper nanograin catalysts in the design of future solar fuel devices. 

“The technique’s ability to record real-time movies of a chemical process opens up exciting opportunities to study many other electrochemical energy conversion processes. It’s a huge breakthrough, and it would not have been possible without Yao and his pioneering work,” Peidong Yang said. 


Researchers from Berkeley Lab, UC Berkeley, and Cornell University contributed to the work. Other authors on the paper include co-first authors Sheena Louisa and Sunmoon Yu, former UC Berkeley Ph.D. students in Peidong Yang’s group, along with Jianbo Jin, Inwhan Roh, Chubai Chen, Maria V. Fonseca Guzman, Julian Feijóo, Peng-Cheng Chen, Hongsen Wang, Christopher Pollock, Xin Huang, Yu-Tsuan Shao, Cheng Wang, David A. Muller, and Héctor D. Abruña.

Parts of the experiments were performed by Yao Yang at Cornell under the supervision of Héctor Abruña, professor of chemistry and chemical biology, and David A. Muller, professor of engineering. 

This work was supported by the DOE Office of Science. 

The Molecular Foundry and Advanced Light Source are user facilities at Berkeley Lab. 

Source:  Lawrence Berkeley National Laboratory

Lifestyle

Biden helped bring science out of the lab and into the community − emphasizing research focused on solutions


Biden began his presidency in the throes of the COVID-19 pandemic. Evan Vucci/AP Photo

Arthur Daemmrich, Arizona State University

President Joe Biden was inaugurated in January 2021 amid a devastating pandemic, with over 24 million COVID-19 cases and more than 400,000 deaths in the U.S. recorded at that point.

Operation Warp Speed, initiated by the Trump administration in May 2020, meant an effective vaccine was becoming available. Biden quickly announced a plan to immunize 100 million Americans over the next three months. By the end of April 2021, 145 million Americans – nearly half the population – had received one vaccine dose, and 103 million were considered fully vaccinated. Science and technology policymakers celebrated this coordination across science, industry and government to address a real-world crisis as a 21st-century Manhattan Project.

From my perspective as a scholar of science and technology policy, Biden’s legacy includes structural, institutional and practical changes to how science is conducted. Building on approaches developed over the course of many years, the administration elevated the status of science in the government and fostered community participation in research.

Raising science’s profile in government

The U.S. has no single ministry of science and technology. Instead, agencies and offices across the executive branch carry out scientific research at several national labs and fund research by other institutions. By elevating the White House Office of Science and Technology Policy to a Cabinet-level organization for the first time in its history, Biden gave the agency greater influence in federal decision-making and coordination.

Formally established in 1976, the agency provides the president and senior staff with scientific and technical advice, bringing science to bear on executive policies. Biden’s inclusion of the agency’s director in his Cabinet was a strong signal about the elevated role science and technology would play in the administration’s solutions to major societal challenges.

Under Biden, the Office of Science and Technology Policy established guidelines that agencies across the government would follow as they implemented major legislation. This included developing technologies that remove carbon dioxide from the atmosphere to address climate change, rebuilding America’s chip industry, and managing the rollout of AI technologies.

The CHIPS and Science Act of 2022 boosted research and manufacture of semiconductor chips in the U.S. Narumon Bowonkitwanchai/Moment via Getty Images

Instead of treating the ethical and societal dimensions of scientific and technological change as separate from research and development, the agency advocated for a more integrated approach. This was reflected in the appointment of social scientist Alondra Nelson as the agency’s first deputy director for science and society, and science policy expert Kei Koizumi as principal deputy director for policy. Ethical and societal considerations were added as evaluation criteria for grants. And initiatives such as the AI bill of rights and frameworks for research integrity and open science further encouraged all federal agencies to consider the social effects of their research.

The Office of Science and Technology Policy also introduced new ways for agencies to consult with communities, including Native Nations, rural Americans and people of color, in order to avoid known biases in science and technology research. For example, the agency issued government-wide guidance to recognize and include Indigenous knowledge in federal programs. Agencies such as the Department of Energy have incorporated public perspectives while rolling out atmospheric carbon dioxide removal technologies and building new hydrogen hubs.


Use-inspired research

A long-standing criticism of U.S. science funding is that it often fails to answer questions of societal importance. Members of Congress and policy analysts have argued that funded projects instead overly emphasize basic research in areas that advance the careers of researchers.

In response, the Biden administration established the technology, innovation and partnerships directorate at the National Science Foundation in March 2022.

The directorate uses social science approaches to help focus scientific research and technology on their potential uses and effects on society. For example, engineers developing future energy technologies could start by consulting with the community about local needs and opportunities, rather than pitching their preferred solution after years of laboratory work. Genetic researchers could share both knowledge and financial benefits with the communities that provided the researchers with data.

Fundamentally, “use-inspired” research aims to reconnect scientists and engineers with the people and communities their work ultimately affects, going beyond publication in a journal accessible only to academics.

The technology, innovation and partnerships directorate established initiatives to support regional projects and multidisciplinary partnerships bringing together researchers, entrepreneurs and community organizations. These programs, such as the regional innovation engines and convergence accelerator, seek to balance the traditional process of grant proposals written and evaluated by academics with broader societal demand for affordable health and environmental solutions. This work is particularly key to parts of the country that have not yet seen visible gains from decades of federally sponsored research, such as regions encompassing western North Carolina, northern South Carolina, eastern Tennessee and southwest Virginia.

Community-based scientific research

The Biden administration also worked to involve communities in science not just as research consultants but also as active participants.

Scientific research and technology-based innovation are often considered the exclusive domain of experts from elite universities or national labs. Yet, many communities are eager to conduct research, and they have insights to contribute. There is a decades-long history of citizen science initiatives, such as birdwatchers contributing data to national environmental surveys and community groups collecting industrial emissions data that officials can use to make regulations more cost effective.


Going further, the Biden administration carried out experiments to create research projects in a way that involved community members, local colleges and federal agencies as more equal partners.

Collaboration between the community, academia, industry and government can lead to more effective solutions. Deb Cohn-Orbach/UCG/Universal Images Group via Getty Images

For example, the Justice40 initiative asked people from across the country, including rural and small-town Americans, to identify local environmental justice issues and potential solutions.

The National Institutes of Health’s ComPASS program funded community organizations to test and scale successful health interventions, such as identifying pregnant women with complex medical needs and connecting them to specialized care.

And the National Science Foundation’s Civic Innovation Challenge required academic researchers to work with local organizations to address local concerns, improving the community’s technical skills and knowledge.

Frontiers of science and technology policy

Researchers often cite the 1945 report Science: The Endless Frontier, written by former Office of Scientific Research and Development head Vannevar Bush, to describe the core rationales for using American taxpayer money to fund basic science. Under this model, funding science would lead to three key outcomes: a secure national defense, improved health, and economic prosperity. The report, however, says little about how to go from basic science to desired societal outcomes. It also makes no mention of scientists sharing responsibility for the direction and impact of their work.

The 80th anniversary of Bush’s report in 2025 offers an opportunity to move science out into society. At present, major government initiatives are following a technology push model that focuses efforts on only one or a few products and involves little consideration of consumer and market demand. Research has repeatedly demonstrated that consumer or societal pull, which attracts development of products that enhance quality of life, is key to successful uptake of new technologies and their longevity.

Future administrations can further advance science and address major societal challenges by considering how ready society is to take up new technologies and increasing collaboration between government and civil society.

Arthur Daemmrich, Professor of Practice in the School for the Future of Innovation in Society, Arizona State University


This article is republished from The Conversation under a Creative Commons license. Read the original article.




The Earth

The US natural gas industry is leaking way more methane than previously thought. Here’s why that matters

Research reveals that methane emissions from U.S. natural gas operations are significantly underestimated, with a leak rate of 2.3 percent, which poses serious climate concerns and challenges in accurate measurement.


The authors conferring at a natural gas facility in Colorado. Colorado State University, CC BY-SA

Anthony J. Marchese, Colorado State University and Dan Zimmerle, Colorado State University

Natural gas is displacing coal, which could help fight climate change because burning it produces fewer carbon emissions. But producing and transporting natural gas releases methane, a greenhouse gas that also contributes to climate change. How big is the methane problem?

For the past five years, our research teams at Colorado State University have made thousands of methane emissions measurements at more than 700 separate facilities in the production, gathering, processing, transmission and storage segments of the natural gas supply chain.

This experience has given us a unique perspective regarding the major sources of methane emissions from natural gas and the challenges the industry faces in terms of detecting and reducing, if not eliminating, them.

Our work, along with numerous other research projects, was recently folded into a new study published in the journal Science. This comprehensive snapshot suggests that methane emissions from oil and gas operations are much higher than current EPA estimates.

What’s wrong with methane

One way to quantify the magnitude of the methane leakage is to divide the amount of methane emitted each year by the total amount of methane pumped out of the ground each year from natural gas and oil wells. The EPA currently estimates this methane leak rate to be 1.4 percent. That is, for every cubic foot of natural gas drawn from underground reservoirs, 1.4 percent of it is lost into the atmosphere.

This study synthesized the results from a five-year series of 16 studies coordinated by environmental advocacy group Environmental Defense Fund (EDF), which involved more than 140 researchers from over 40 institutions and 50 natural gas companies.

The effort brought together scholars based at universities, think tanks and the industry itself to make the most accurate estimate possible of the total amount of methane emitted from all U.S. oil and gas operations. It integrated data from a multitude of recent studies with measurements made on the ground and from the air.


All told, based on the results of the new study, the U.S. oil and gas industry is leaking 13 million metric tons of methane each year, which means the methane leak rate is 2.3 percent. This 60 percent difference between our new estimate and the EPA’s current one can have profound climate consequences.
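A quick back-of-the-envelope check of those figures (a minimal sketch; the production volume is back-calculated from the article's numbers rather than independently reported):

```python
# Rough arithmetic behind the leak-rate comparison described above.
# The implied production volume is derived from the article's own figures.

emitted_tonnes = 13e6      # metric tons of methane leaked per year (new study)
study_leak_rate = 0.023    # 2.3 percent of methane produced
epa_leak_rate = 0.014      # EPA's current estimate, 1.4 percent

implied_production = emitted_tonnes / study_leak_rate        # ~565 million metric tons/yr
epa_implied_emissions = implied_production * epa_leak_rate   # ~7.9 million metric tons/yr
relative_gap = (study_leak_rate - epa_leak_rate) / epa_leak_rate

print(f"Implied production: {implied_production / 1e6:.0f} million metric tons/yr")
print(f"EPA-implied emissions: {epa_implied_emissions / 1e6:.1f} million metric tons/yr")
print(f"Study rate exceeds EPA rate by {relative_gap:.0%}")  # roughly the 60 percent gap quoted above
```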

Methane is a highly potent greenhouse gas, with more than 80 times the climate warming impact of carbon dioxide over the first 20 years after it is released.

An earlier EDF study showed that a methane leak rate of greater than 3 percent would result in no immediate climate benefits from retiring coal-fired power plants in favor of natural gas power plants.

That means even with a 2.3 percent leakage rate, the growing share of U.S. electricity powered by natural gas is doing something to slow the pace of climate change. However, these climate benefits could be far greater.

Also, at a methane leakage rate of 2.3 percent, many other uses of natural gas besides generating electricity are conclusively detrimental for the climate. For example, EDF found that replacing the diesel used in most trucks or the gasoline consumed by most cars with natural gas would require a leakage rate of less than 1.4 percent before there would be any immediate climate benefit.
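To make that break-even logic concrete, here is a small sketch comparing the estimated leak rate against the thresholds quoted above (the thresholds are the EDF figures cited in this article; the comparison itself is simple arithmetic):

```python
# Compare the estimated methane leak rate with the break-even thresholds
# quoted above for different uses of natural gas.

current_leak_rate = 0.023  # 2.3 percent, from the Science study

# Leak rates below which switching to natural gas gives an immediate
# climate benefit, per the EDF analyses cited in the article.
break_even = {
    "coal-fired power -> gas power": 0.030,
    "diesel/gasoline vehicles -> gas vehicles": 0.014,
}

for switch, threshold in break_even.items():
    verdict = ("immediate climate benefit"
               if current_leak_rate < threshold
               else "no immediate climate benefit")
    print(f"{switch}: threshold {threshold:.1%}, actual {current_leak_rate:.1%} -> {verdict}")
```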

What’s more, some scientists believe that the leakage rate could be even higher than this new estimate.

What causes these leaks

Perhaps you’ve never contemplated the long journey that natural gas travels before you can ignite the burners on the gas stove in your kitchen.


But on top of the 500,000 natural gas wells operating in the U.S. today, there are 2 million miles of pipes and millions of valves, fittings, tanks, compressors and other components operating 24 hours per day, seven days a week to deliver natural gas to your home.

That natural gas that you burn when you whip up a batch of pancakes may have traveled 1,000 miles or more as it wended through this complicated network. Along the way, there were ample opportunities for some of it to leak out into the atmosphere.

Natural gas leaks can be accidental, caused by malfunctioning equipment, but a lot of natural gas is also released intentionally to perform process operations such as opening and closing valves. In addition, the tens of thousands of compressors that increase the pressure and pump the gas along through the network are powered by engines that burn natural gas and their exhaust contains some unburned natural gas.

Since the natural gas delivered to your home is 85 to 95 percent methane, natural gas leaks are predominantly methane. While methane poses the greatest threat to the climate because of its greenhouse gas potency, natural gas contains other hydrocarbons that can degrade regional air quality and are bad for human health.

Inventory tallies vs. aircraft surveillance

The EPA Greenhouse Gas Inventory is done in a way experts like us call a “bottom-up” approach. It entails tallying up all of the nation’s natural gas equipment – from household gas meters to wellpads – estimating an annualized average emission rate for every category, and adding it all up.

There are two challenges to this approach. First, there are no accurate equipment records for many of these categories. Second, when components operate improperly or fail, emissions balloon, making it hard to develop an accurate and meaningful annualized emission rate for each source.
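In code, a bottom-up inventory is essentially a sum over equipment categories of counts multiplied by annualized emission factors. The sketch below is illustrative only; the categories and numbers are hypothetical placeholders, not EPA values:

```python
# Skeleton of a "bottom-up" inventory: count each equipment category,
# multiply by an annualized average emission rate, and add it all up.
# All counts and emission factors below are hypothetical placeholders.

equipment_counts = {
    "wellpads": 500_000,
    "gathering compressors": 5_000,
    "household gas meters": 70_000_000,
}

emission_factor_kg_per_year = {   # assumed average methane emissions per component
    "wellpads": 1_200.0,
    "gathering compressors": 20_000.0,
    "household gas meters": 0.5,
}

total_kg = sum(count * emission_factor_kg_per_year[category]
               for category, count in equipment_counts.items())
print(f"Bottom-up total: {total_kg / 1e9:.2f} million metric tons CH4 per year")
```

Both challenges show up directly in this structure: uncertain equipment counts, and a single average emission factor that hides the rare but very large emission events.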

“Top-down” approaches, typically requiring aircraft, are the alternative. They measure methane concentrations upwind and downwind of large geographic areas. But this approach has its own shortcomings.


First, it captures all methane emissions, rather than just the emissions tied to natural gas operations – including the methane from landfills, cows and even the leaves rotting in your backyard. Second, these one-time snapshots may get distorted depending on what’s going on while planes fly around capturing methane data.

Historically, top-down approaches estimate emissions that are about twice bottom-up estimates. Some regional top-down methane leak rate estimates have been as high as 8 percent while some bottom-up estimates have been as low as 1 percent.

More recent work, including the Science study, has performed coordinated campaigns in which the on-the-ground and aircraft measurements are made concurrently, while carefully modeling emission events.
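The top-down alternative infers a regional emission rate from the methane enhancement measured downwind of an area. Here is a heavily simplified mass-balance sketch (every input is hypothetical, and real campaigns correct for boundary-layer structure, background variability, and non-gas methane sources):

```python
# Heavily simplified "top-down" mass-balance estimate: emissions are inferred
# from the methane enhancement between upwind and downwind measurements.
# All inputs are hypothetical and many real-world corrections are omitted.

upwind_ppb = 1900.0        # background methane mole fraction
downwind_ppb = 1960.0      # enhanced mole fraction downwind of the source region
wind_speed = 5.0           # mean wind speed through the boundary layer, m/s
mixing_height = 1000.0     # boundary-layer depth, m
plume_width = 50_000.0     # cross-wind width of the source region, m

air_density_mol = 41.6     # approximate moles of air per cubic meter near the surface
ch4_molar_mass = 0.016     # kg per mole of CH4

enhancement = (downwind_ppb - upwind_ppb) * 1e-9   # added CH4 mole fraction
flux_kg_per_s = (enhancement * air_density_mol * ch4_molar_mass
                 * wind_speed * mixing_height * plume_width)

seconds_per_year = 3.15e7
print(f"Inferred emission rate: {flux_kg_per_s:.1f} kg CH4/s "
      f"(~{flux_kg_per_s * seconds_per_year / 1e9:.2f} million metric tons/yr)")
```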

Helpful gadgets and sound policy

On a sunny morning in October 2013, our research team pulled up to a natural gas gathering compressor station in Texas. Using a US$80,000 infrared camera, we immediately located an extraordinarily large leak of colorless, odorless methane that was invisible to the operator, who then quickly isolated and fixed the problem.

We then witnessed the methane emissions decline tenfold – the facility leak rate fell from 9.8 percent to 0.7 percent before our eyes.

It is not economically feasible, of course, to equip all natural gas workers with $80,000 cameras, or to hire the drivers required to monitor every wellpad on a daily basis when there are 40,000 oil and gas wells in Weld County, Colorado, alone.

But new technologies can make a difference. Our team at Colorado State University is working with the Department of Energy to evaluate gadgetry that will rapidly detect methane emissions. Some of these devices can be deployed today, including inexpensive sensors that can be monitored remotely.


Technology alone won’t solve the problem, however. We believe that slashing the nation’s methane leak rate will require a collaborative effort between industry and government. And based on our experience in Colorado, which has developed some of the nation’s strictest methane emissions regulations, we find that best practices become standard practices with strong regulations.

We believe that the Trump administration’s efforts to roll back regulations, without regard to whether they are working or not, will not only have profound climate impacts. They will also jeopardize the health and safety of all Americans while undercutting efforts by the natural gas industry to cut back on the pollution it produces.

Anthony J. Marchese, Associate Dean for Academic and Student Affairs, Walter Scott, Jr. College of Engineering; Director, Engines and Energy Conversion Laboratory; Professor, Department of Mechanical Engineering, Colorado State University and Dan Zimmerle, Senior Research Associate and Director of METEC, Colorado State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.



Science

That Arctic blast can feel brutally cold, but how much colder than ‘normal’ is it really?


Philadelphia Eagles fans braved temperatures in the 20s to watch their team play the New York Giants on Jan. 5, 2025. AP Photo/Chris Szagola

Richard B. (Ricky) Rood, University of Michigan

An Arctic blast hitting the central and eastern U.S. in early January 2025 has been creating fiercely cold conditions in many places. Parts of North Dakota dipped to more than 20 degrees below zero, and people as far south as Texas woke up to temperatures in the teens. A snow and ice storm across the middle of the country added to the winter chill.

Forecasters warned that temperatures could be “10 to more than 30 degrees below normal” across much of the eastern two-thirds of the country during the first full week of the year.

But what does “normal” actually mean?

While temperature forecasts are important to help people stay safe, the comparison to “normal” can be quite misleading. That’s because what qualifies as normal in forecasts has been changing rapidly over the years as the planet warms.

Defining normal

One of the most widely used standards for defining a science-based “normal” is a 30-year average of temperature and precipitation. Every 10 years, the National Centers for Environmental Information updates these “normals,” most recently in 2021. The current span considered “normal” is 1991-2020. Five years ago, it was 1981-2010.
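Computationally, a “normal” is nothing more than a 30-year mean that gets recomputed each decade. A minimal sketch, with synthetic January temperatures standing in for real station records:

```python
# A climate "normal" is a 30-year average, recomputed every decade.
# Synthetic January temperatures stand in for real station data here.

import random

random.seed(0)
# Fake mean January temperatures (deg F) for 1951-2020 with a gentle warming trend.
january_means = {year: 20.0 + 0.03 * (year - 1951) + random.uniform(-3.0, 3.0)
                 for year in range(1951, 2021)}

def normal(temps, start, end):
    """30-year average over the inclusive span [start, end]."""
    years = range(start, end + 1)
    return sum(temps[y] for y in years) / len(years)

print(f"1951-1980 normal: {normal(january_means, 1951, 1980):.1f} F")
print(f"1981-2010 normal: {normal(january_means, 1981, 2010):.1f} F")
print(f"1991-2020 normal: {normal(january_means, 1991, 2020):.1f} F")
```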

But temperatures have been rising over the past century, and the trend has accelerated since about 1980. This warming is fueled by the mining and burning of fossil fuels that increase carbon dioxide and methane in the atmosphere. These greenhouse gases trap heat close to the planet’s surface, leading to increasing temperature.

How U.S. temperatures considered ‘normal’ have changed over the decades. Each 30-year period is compared to the 20th-century average. NOAA Climate.gov

Because global temperatures are warming, what’s considered normal is warming, too.

So, when a 2025 cold snap is reported as the difference between the actual temperature and “normal,” it will appear to be colder and more extreme than if it were compared to an earlier 30-year average.


Thirty years is a significant portion of a human life. For people under age 40 or so, the use of the most recent averaging span might fit with what they have experienced.

But it doesn’t speak to how much the Earth has warmed.

How cold snaps today compare to the past

To see how today’s cold snaps – or today’s warming – compare to a time before global warming began to accelerate, NASA scientists use 1951-1980 as a baseline.

The reason becomes evident when you compare maps.

For example, January 1994 was brutally cold east of the Rocky Mountains. If we compare those 1994 temperatures to today’s “normal” – the 1991-2020 period – the U.S. looks a lot like maps of early January 2025’s temperatures: Large parts of the Midwest and eastern U.S. were more than 7 degrees Fahrenheit (4 degrees Celsius) below “normal,” and some areas were much colder.

How temperatures in January 1994 compared to the 1991-2020 average, the current 30-year period used to define ‘normal.’ NASA Goddard Institute for Space Studies

But if we compare January 1994 to the 1951-1980 baseline instead, that cold spot in the eastern U.S. isn’t quite as large or extreme.

Where the temperatures in some parts of the country in January 1994 approached 14.2 F (7.9 C) colder than normal when compared to the 1991-2020 average, they only approached 12.4 F (6.9 C) colder than the 1951-1980 average.

How temperatures in January 1994 compared to the 1951-1980 average. NASA Goddard Institute for Space Studies
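The arithmetic behind that comparison is simple: the same observed temperature produces a bigger “colder than normal” anomaly when it is measured against the warmer, more recent baseline. In the sketch below, the absolute baseline values are hypothetical; only their roughly 1.8 F gap reflects the 14.2 F versus 12.4 F figures quoted above:

```python
# Same observation, two baselines: the anomaly looks larger against the
# warmer 1991-2020 "normal" than against the 1951-1980 baseline.
# Baseline values are hypothetical; only their ~1.8 F gap mirrors the
# 14.2 F vs 12.4 F comparison quoted above.

observed_jan_1994 = 10.0   # hypothetical observed mean temperature, deg F
normal_1991_2020 = 24.2    # hypothetical newer (warmer) baseline
normal_1951_1980 = 22.4    # hypothetical older baseline, about 1.8 F cooler

anomaly_recent = observed_jan_1994 - normal_1991_2020   # -14.2 F
anomaly_older = observed_jan_1994 - normal_1951_1980    # -12.4 F

print(f"Anomaly vs 1991-2020 normal: {anomaly_recent:+.1f} F")
print(f"Anomaly vs 1951-1980 normal: {anomaly_older:+.1f} F")
```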

As a measure of a changing climate, updating the average 30-year baseline every decade makes warming appear smaller than it is, and it makes cold snaps seem more extreme.

Charts show how temperatures have shifted in southwest Minnesota. Each histogram on the left shows 30 years of average January temperatures. Blue is the most recent 30-year period, 1991-2020; yellow is the earlier 1951-1980 period. The bell curves of the frequency of those temperatures show about a 4 F (2.2 C) shift. Omar Gates/GLISA, University of Michigan

Conditions for heavy lake-effect snow

The U.S. will continue to see cold air outbreaks in winter, but as the Arctic and the rest of the planet warm, the most frigid temperatures of the past will become less common.

That warming trend helps set up a remarkable situation in the Great Lakes that we’re seeing in January 2025: heavy lake-effect snow across a large area.


As cold Arctic air encroached from the north in January, it encountered a Great Lakes basin where the water temperature was still above 40 F (4.4 C) in many places. Ice covered less than 2% of the lakes’ surface on Jan. 4.

That cold dry air over warmer open water causes evaporation, providing moisture for lake-effect snow. Parts of New York and Ohio along the lakes saw well over a foot of snow in the span of a few days.

Surface temperatures in much of the Great Lakes were still warm as the cold Arctic air arrived in early January. Great Lakes Environmental Research Laboratory

The accumulation of heat in the Great Lakes, observed year after year, is leading to fundamental changes in winter weather and the winter economy in the states bordering the lakes.

It’s also a reminder of the persistent and growing presence of global warming, even in the midst of a cold air outbreak.

Richard B. (Ricky) Rood, Professor Emeritus of Climate and Space Sciences and Engineering, University of Michigan

This article is republished from The Conversation under a Creative Commons license. Read the original article.
