
Science

How a Record-Breaking Copper Catalyst Converts CO2 Into Liquid Fuels


Researchers at Berkeley Lab have made real-time movies of copper nanoparticles as they evolve to convert carbon dioxide and water into renewable fuels and chemicals. Their new insights could help advance the next generation of solar fuels.
Credit: Yao Yang/Berkeley Lab. Courtesy of Nature.
Video of a 4D-STEM experiment: Berkeley Lab researchers used a new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. The new electrochemical liquid cell allows researchers to resolve images of objects smaller than 10 nanometers.

Since the 1970s, scientists have known that copper has a special ability to transform carbon dioxide into valuable chemicals and fuels. But for many years, scientists have struggled to understand how this common metal works as an electrocatalyst, a material that uses energy from electrons to chemically transform molecules into different products.

Now, a research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) has gained new insight by capturing real-time movies of copper nanoparticles (copper particles engineered at the scale of a billionth of a meter) as they convert CO2 and water into renewable fuels and chemicals: ethylene, ethanol, and propanol, among others. The work was reported in the journal Nature last week. 

“This is very exciting. After decades of work, we’re finally able to show – with undeniable proof – how copper electrocatalysts excel in CO2 reduction,” said Peidong Yang, a senior faculty scientist in Berkeley Lab’s Materials Sciences and Chemical Sciences Divisions who led the study. Yang is also a professor of chemistry and materials science and engineering at UC Berkeley. “Knowing how copper is such an excellent electrocatalyst brings us steps closer to turning CO2 into new, renewable solar fuels through artificial photosynthesis.”

The work was made possible by combining a new imaging technique called operando 4D electrochemical liquid-cell STEM (scanning transmission electron microscopy) with a soft X-ray probe to investigate the same sample environment: copper nanoparticles in liquid. First author Yao Yang, a UC Berkeley Miller postdoctoral fellow, conceived the groundbreaking approach under the guidance of Peidong Yang while working toward his Ph.D. in chemistry at Cornell University.

 

Scientists who study artificial photosynthesis materials and reactions have wanted to combine the power of an electron probe with X-rays, but the two techniques typically can’t be performed by the same instrument. 

Electron microscopes (such as STEM or TEM) use beams of electrons and excel at characterizing the atomic structure in parts of a material. In recent years, 4D STEM (or “2D raster of 2D diffraction patterns using scanning transmission electron microscopy”) instruments, such as those at Berkeley Lab’s Molecular Foundry, have pushed the boundaries of electron microscopy even further, enabling scientists to map out atomic or molecular regions in a variety of materials, from hard metallic glass to soft, flexible films. 

On the other hand, soft (or lower-energy) X-rays are useful for identifying and tracking chemical reactions in real time in an operando, or real-world, environment. 


But now, scientists can have the best of both worlds. At the heart of the new technique is an electrochemical “liquid cell” sample holder with remarkable versatility. A thousand times thinner than a human hair, the device is compatible with both STEM and X-ray instruments. 

The electrochemical liquid cell’s ultrathin design allows reliable imaging of delicate samples while protecting them from electron beam damage. A special electrode custom-designed by co-author Cheng Wang, a staff scientist at Berkeley Lab’s Advanced Light Source, enabled the team to conduct X-ray experiments with the electrochemical liquid cell. Combining the two allows researchers to comprehensively characterize electrochemical reactions in real time and at the nanoscale. 

Getting granular

During 4D-STEM experiments, Yao Yang and team used the new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. 

The experiments revealed a surprise: copper nanoparticles combined into larger metallic copper “nanograins” within seconds of the electrochemical reaction. 

To learn more, the team turned to Wang, who pioneered a technique known as “resonant soft X-ray scattering (RSoXS) for soft materials” at the Advanced Light Source more than 10 years ago.

With help from Wang, the research team used the same electrochemical liquid cell, but this time during RSoXS experiments, to determine whether copper nanograins facilitate CO2 reduction. Soft X-rays are ideal for studying how copper electrocatalysts evolve during CO2 reduction, Wang explained. By using RSoXS, researchers can monitor multiple reactions between thousands of nanoparticles in real time, and accurately identify chemical reactants and products.


The RSoXS experiments at the Advanced Light Source – along with additional evidence gathered at Cornell High Energy Synchrotron Source (CHESS) – proved that metallic copper nanograins serve as active sites for CO2 reduction. (Metallic copper, also known as copper(0), is a form of the element copper.) 

During CO2 electrolysis, the copper nanoparticles change their structure through a process called “electrochemical scrambling.” The copper nanoparticles’ surface layer of oxide degrades, creating open sites on the copper surface for CO2 molecules to attach, explained Peidong Yang. And as CO2 “docks” or binds to the copper nanograin surface, electrons are transferred to CO2, causing a reaction that simultaneously produces ethylene, ethanol, and propanol along with other multicarbon products.

“The copper nanograins essentially turn into little chemical manufacturing factories,” Yao Yang said.

Further experiments at the Molecular Foundry, the Advanced Light Source, and CHESS revealed that size matters. All of the 7-nanometer copper nanoparticles participated in CO2 reduction, whereas the larger nanoparticles did not. In addition, the team learned that only metallic copper can efficiently reduce CO2 into multicarbon products. The findings have implications for “rationally designing efficient CO2 electrocatalysts,” Peidong Yang said.

The new study also validated Peidong Yang’s findings from 2017: that 7-nanometer copper nanoparticles require low inputs of energy to start CO2 reduction. As an electrocatalyst, the 7-nanometer copper nanoparticles required a record-low driving force that is about 300 millivolts less than typical bulk copper electrocatalysts. The best-performing catalysts that produce multicarbon products from CO2 typically operate at a high driving force of about 1 volt.

The copper nanograins could potentially boost the energy efficiency and productivity of some catalysts designed for artificial photosynthesis, a field of research that aims to produce solar fuels from sunlight, water, and CO2. Currently, researchers within the Department of Energy-funded Liquid Sunlight Alliance (LiSA) plan to use the copper nanograin catalysts in the design of future solar fuel devices. 

“The technique’s ability to record real-time movies of a chemical process opens up exciting opportunities to study many other electrochemical energy conversion processes. It’s a huge breakthrough, and it would not have been possible without Yao and his pioneering work,” Peidong Yang said. 


Researchers from Berkeley Lab, UC Berkeley, and Cornell University contributed to the work. Other authors on the paper include co-first authors Sheena Louisa and Sunmoon Yu, former UC Berkeley Ph.D. students in Peidong Yang’s group, along with Jianbo Jin, Inwhan Roh, Chubai Chen, Maria V. Fonseca Guzman, Julian Feijóo, Peng-Cheng Chen, Hongsen Wang, Christopher Pollock, Xin Huang, Yu-Tsuan Shao, Cheng Wang, David A. Muller, and Héctor D. Abruña.

Parts of the experiments were performed by Yao Yang at Cornell under the supervision of Héctor Abruña, professor of chemistry and chemical biology, and David A. Muller, professor of engineering. 

This work was supported by the DOE Office of Science. 

The Molecular Foundry and Advanced Light Source are user facilities at Berkeley Lab. 

Source:  Lawrence Berkeley National Laboratory


Science

There’s growing evidence of possible life on other planets – here’s why you should still be sceptical


Artist’s impression of K2-18 b. NASA, ESA, CSA, Joseph Olmsted (STScI)
Manoj Joshi, University of East Anglia; Andrew Rushby, Birkbeck, University of London, and Maria Di Paolo, University of East Anglia

A team of researchers has recently claimed they have discovered a gas called dimethyl sulphide (DMS) in the atmosphere of K2-18b, a planet orbiting a distant star. The University of Cambridge team’s claims are potentially very exciting because, on Earth at least, the compound is produced by marine bacteria. The presence of this gas may be a sign of life on K2-18b too – but we can’t rush to conclusions just yet.

K2-18b has a radius 2.6 times that of Earth, a mass nearly nine times greater and orbits a star that is 124 light years away. We can’t directly tell what kinds of large-scale characteristics it has, although one possibility is a world with a global liquid water ocean under a hydrogen-rich atmosphere. Such a world might well be hospitable to life, but different ideas exist about the properties of this planet – and what that might mean for a DMS signature.
Claims for the detection of life on other planets go back decades. In the 1970s, one of the scientists working on the Viking mission to Mars claimed that his experiment had indicated there could be microorganisms in the Martian soil. However, these conclusions were widely refuted by other researchers. In 1996, a team said that microscopic features resembling bacteria had been found in the Martian meteorite ALH84001. However, subsequent studies cast significant doubt on the discovery.

Since the early 2000s there have also been repeated claims for the detection of methane gas in the atmosphere of Mars, both by remote sensing by satellites and by in-situ observations by rovers. Methane can be produced by several mechanisms. One of these potential sources involves production by microorganisms. Such sources are described by scientists as being “biotic”. Other sources of methane, such as volcanoes and hydrothermal vents, don’t require life and are said to be “abiotic”.
The claimed detection of phosphine gas in Venus’ atmosphere has been proposed as a biosignature. Nasa
Not all of the previous claims for evidence of extraterrestrial life involve the red planet. In 2020, Earth-based observations of Venus’s atmosphere implied the presence of low levels of phosphine gas. Because phosphine gas can be produced by microbes, there was speculation that life might exist in Venus’s clouds. However, the detection of phosphine was later disputed by other scientists.

Proposed signs of life on other worlds are known as “biosignatures”, defined as “an object, substance, and/or pattern whose origin specifically requires a biological agent”. In other words, any detection requires all possible abiotic production pathways to be considered. In addition to this, scientists face many challenges in the collection, interpretation, and planetary environmental context of possible biosignature gases. Understanding the composition of a planetary atmosphere from limited data, collected from light years away, is very difficult. We also have to understand that these are often exotic environments, with conditions we do not experience on Earth. As such, exotic chemical processes may occur there too.

In order to characterise the atmospheres of exoplanets, we obtain what are called spectra. These are the fingerprints of molecules in the atmosphere that absorb light at specific wavelengths. Once the data has been collected, it needs to be interpreted. Astronomers assess which chemicals, or combinations thereof, best fit the observations. It is an involved process and one that requires lots of computer-based work. The process is especially challenging when dealing with exoplanets, where available data is at a premium. Once these stages have been carried out, astronomers can then assign a confidence to the likelihood of a particular chemical signature being “real”.

In the case of the recent discovery from K2-18b, the authors claim the detection of a feature that can only be explained by DMS with a likelihood of greater than 99.9%. In other words, there’s about a 1 in 1,500 chance that this feature is not actually there. While the team behind the recent result favours a model of K2-18b as an ocean world, another team suggests it could actually have a magma (molten rock) ocean instead. It could also be a Neptune-like “gas dwarf” planet, with a small core shrouded in a thick layer of gas and ices. Both of these options would be much less favourable to the development of life – raising questions as to whether there are abiotic ways that DMS can form.

A higher bar?

But is the bar higher for claims of extraterrestrial life than for other areas of science? In a study claiming the detection of a biosignature, the usual level of scientific rigour expected for all research should apply to the collection and processing of the data, along with the interpretation of the results. However, even when these standards have been met, claims that indicate the presence of life have in the past still been met with high levels of scepticism. The reasons for this are probably best summed up by the phrase “extraordinary claims require extraordinary evidence”. This is attributed to the American planetary scientist, author and science communicator Carl Sagan.

While on Earth there are no known means of producing DMS without life, the chemical has been detected on a comet called 67P, which was studied up close by the European Space Agency’s Rosetta spacecraft. DMS has even been detected in the interstellar medium, the space between stars, suggesting that it can be produced by non-biological, or abiotic, mechanisms. Given the uncertainties about the nature of K2-18b, we cannot be sure if the presence of this gas might simply be a sign of non-biological processes we don’t yet understand.

The claimed discovery of DMS on K2-18b is interesting, exciting, and reflects huge advances in astronomy, planetary science and astrobiology. However, its possible implications mean that we have to consider the results very cautiously. We must also entertain alternative explanations before supporting such a profound conclusion as the presence of extraterrestrial life.

Manoj Joshi, Professor of Climate Dynamics, University of East Anglia; Andrew Rushby, Lecturer, School of Natural Sciences, Birkbeck, University of London, and Maria Di Paolo, PhD Candidate, School of Engineering, Mathematics and Physics, University of East Anglia

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Blog

America’s clean air rules boost health and economy − charts show what EPA’s deregulation plans ignore


Regulations have cleaned up cars, power plants and factories, leaving cleaner air while economies have grown. Cavan Images/Josh Campbell via Getty Images
Richard E. Peltier, UMass Amherst

The Trump administration is “reconsidering” more than 30 air pollution regulations, and it offered industries a brief window to apply for exemptions that would allow them to stop following many air quality regulations immediately if approved. All of the exemptions involve rules finalized in 2024 and include regulations for hazardous air pollutants that cause asthma, heart disease and cancer. The results – if regulations are ultimately rolled back and if those rollbacks and any exemptions stand up to court challenges – could impact air quality across the United States.

“Reconsideration” is a term used to review or modify a government regulation. While Environmental Protection Agency Administrator Lee Zeldin provided few details, the breadth of the regulations being reconsidered affects all Americans. They include rules that set limits for pollutants that can harm human health, such as ozone, particulate matter and volatile organic compounds.

Zeldin wrote on March 12, 2025, that his deregulation moves would “roll back trillions in regulatory costs and hidden ‘taxes’ on U.S. families.” What Zeldin didn’t say is that the economic and health benefits from decades of federal clean air regulations have far outweighed their costs. Some estimates suggest every $1 spent meeting clean air rules has returned $10 in health and economic benefits.

How far America has come, because of regulations

In the early 1970s, thick smog blanketed American cities and acid rain stripped forests bare from the Northeast to the Midwest. Air pollution wasn’t just a nuisance – it was a public health emergency. But in the decades since, the United States has engineered one of the most successful environmental turnarounds in history.

Thanks to stronger air quality regulations, pollution levels have plummeted, preventing hundreds of thousands of deaths annually. And despite early predictions that these regulations would cripple the economy, the opposite has proven true: The U.S. economy more than doubled in size while pollution fell, showing that clean air and economic growth can – and do – go hand in hand.

The numbers are eye-popping. An Environmental Protection Agency analysis of the first 20 years of the Clean Air Act, from 1970 to 1990, found the economic benefits of the regulations were about 42 times greater than the costs. The EPA later estimated that the cost of air quality regulations in the U.S. would be about US$65 billion in 2020, and the benefits, primarily in improved health and increased worker productivity, would be around $2 trillion. Other studies have found similar benefits. That’s a return of more than 30 to 1, making clean air one of the best investments the country has ever made.

Science-based regulations even the playing field

The turning point came with the passage of the Clean Air Act of 1970, which put in place strict rules on pollutants from industry, vehicles and power plants. These rules targeted key culprits: lead, ozone, sulfur dioxide, nitrogen oxides and particulate matter – substances that contribute to asthma, heart disease and premature deaths. An example was the removal of lead, which can harm the brain and other organs, from gasoline. That single change resulted in far lower levels of lead in people’s blood, including a 70% drop in U.S. children’s blood-lead levels.
A line graph shows lead used in gasoline and average blood lead levels declining in parallel from 1976 to 1980. Air quality regulations lowered the amount of lead used in gasoline, which also resulted in rapidly declining lead concentrations in the average American between 1976 and 1980, showing how effective regulations can be at reducing public health risks. USEPA/Environmental Criteria and Assessment Office (1986)
The results have been extraordinary. Since 1980, emissions of six major air pollutants have dropped by 78%, even as the U.S. economy has more than doubled in size. Cities that were once notorious for their thick, choking smog – such as Los Angeles, Houston and Pittsburgh – now see far cleaner air, while lakes and forests devastated by acid rain in the Northeast have rebounded.
Chart shows economy growing 321% while emissions of common pollutants fell.
Comparison of growth areas and declining emissions, 1970-2023. EPA
And most importantly, lives have been saved. The Clean Air Act requires the EPA to periodically estimate the costs and benefits of air quality regulations. In the most recent estimate, released in 2011, the EPA projected that air quality improvements would prevent over 230,000 premature deaths in 2020. That means fewer heart attacks, fewer emergency room visits for asthma, and more years of healthy life for millions of Americans.

The economic payoff

Critics of air quality regulations have long argued that the regulations are too expensive for businesses and consumers. But the data tells a very different story. EPA studies have confirmed that clean air regulations improve air quality over time. Other studies have shown that the health benefits greatly outweigh the costs.

That pays off for the economy. Fewer illnesses mean lower health care costs, and healthier workers mean higher productivity and fewer missed workdays. The EPA estimated that for every $1 spent on meeting air quality regulations, the United States received $9 in benefits. A separate study by the non-partisan National Bureau of Economic Research in 2024 estimated that each $1 spent on air pollution regulation brought the U.S. economy at least $10 in benefits. And when considering the long-term impact on human health and climate stability, the return is even greater.
On a smoggy day, downtown is barely visible.
Hollywood and downtown Los Angeles in 1984: Smog was a common problem in the 1970s and 1980s. Ian Dryden/Los Angeles Times/UCLA Archive/Wikimedia Commons, CC BY

The next chapter in clean air

The air Americans breathe today is cleaner, much healthier and safer than it was just a few decades ago. Yet, despite this remarkable progress, air pollution remains a challenge in some parts of the country. Some urban neighborhoods remain stubbornly polluted because of vehicle emissions and industrial pollution. While urban pollution has declined, wildfire smoke has become a larger influence on poor air quality across the nation.

That means the EPA still has work to do. If the agency works with environmental scientists, public health experts and industry, and fosters honest scientific consensus, it can continue to protect public health while supporting economic growth. At the same time, it can ensure that future generations enjoy the same clean air and prosperity that regulations have made possible.

By instead considering retracting clean air rules, the EPA is calling into question the expertise of countless scientists who have provided their objective advice over decades to set standards designed to protect human lives. In many cases, industries won’t want to go back to past polluting ways, but lifting clean air rules means future investment might not be as protective. And it increases future regulatory uncertainty for industries.

The past offers a clear lesson: Investing in clean air is not just good for public health – it’s good for the economy. With a track record of saving lives and delivering trillion-dollar benefits, air quality regulations remain one of the greatest policy success stories in American history.

This article, originally published March 12, 2025, has been updated with the administration’s offer of exemptions for industries.

Richard E. Peltier, Professor of Environmental Health Sciences, UMass Amherst

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Blog

Algebra is more than alphabet soup – it’s the language of algorithms and relationships


Algebra often involves manipulating numbers or other objects using operations like addition and multiplication. Flavio Coelho/Moment via Getty Images
Courtney Gibbons, Hamilton College

You scrambled up a Rubik’s cube, and now you want to put it back in order. What sequence of moves should you make? Surprise: You can answer this question with modern algebra.

Most folks who have been through high school mathematics courses will have taken a class called algebra – maybe even a sequence of classes called algebra I and algebra II that asked you to solve for x. The word “algebra” may evoke memories of complicated-looking polynomial equations like ax² + bx + c = 0 or plots of polynomial functions like y = ax² + bx + c. You might remember learning about the quadratic formula to figure out the solutions to these equations and find where the plot crosses the x-axis, too.
Graph of a quadratic equation and its roots via the quadratic formula. Jacob Rus, CC BY-SA
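Here is a minimal sketch in Python of that idea (an illustration added alongside this article rather than something from it; the coefficients are arbitrary example values), showing how the quadratic formula locates where the plot crosses the x-axis:

```python
# A minimal sketch (added as an illustration, not from the article): the
# quadratic formula finds where y = ax² + bx + c crosses the x-axis. The
# coefficients below are arbitrary example values.

import math

def quadratic_roots(a, b, c):
    """Return the real roots of ax^2 + bx + c = 0, if any exist."""
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return []                       # the parabola never crosses the x-axis
    root = math.sqrt(discriminant)
    return [(-b - root) / (2 * a), (-b + root) / (2 * a)]

print(quadratic_roots(1, -3, 2))        # x² - 3x + 2 = 0 has roots 1.0 and 2.0
```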
Equations and plots like these are part of algebra, but they’re not the whole story. What unifies algebra is the practice of studying things – like the moves you can make on a Rubik’s cube or the numbers on a clock face you use to tell time – and the way they behave when you put them together in different ways. What happens when you string together the Rubik’s cube moves or add up numbers on a clock? In my work as a mathematician, I’ve learned that many algebra questions come down to classifying objects by their similarities.

Sets and groups

How did equations like ax² + bx + c = 0 and their solutions lead to abstract algebra? The short version of the story is that mathematicians found formulas that looked a lot like the quadratic formula for polynomial equations where the highest power of x was three or four. But they couldn’t do it for five. It took mathematician Évariste Galois and techniques he developed – now called group theory – to make a convincing argument that no such formula could exist for polynomials with a highest power of five or more.

So what is a group, anyway? It starts with a set, which is a collection of things. The fruit bowl in my kitchen is a set, and the collection of things in it are pieces of fruit. The numbers 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 and 12 also form a set. Sets on their own don’t have too many properties – that is, characteristics – but if we start doing things to the numbers 1 through 12, or the fruit in the fruit bowl, it gets more interesting.
Diagram of clock with the hands set to 3:15, with an arrow indicating that you'll arrive at the same place 12 hours later
In clock addition, 3 + 12 = 3. OpenStax, CC BY-SA
Let’s call this set of numbers 1 through 12 “clock numbers.” Then, we can define an addition function for the clock numbers using the way we tell time. That is, to say “3 + 11 = 2” is the way we would add 3 and 11. It feels weird, but if you think about it, 11 hours past 3 o’clock is 2 o’clock. Clock addition has some nice properties. It satisfies:
  • closure, where adding things in the set gives you something else in the set,
  • identity, where there’s an element that doesn’t change the value of other elements in the set when added – adding 12 to any number will equal that same number,
  • associativity, where it doesn’t matter how you group the additions: (a + b) + c = a + (b + c),
  • inverses, where you can undo whatever an element does, and
  • commutativity, where you can change the order of which clock numbers you add up without changing the outcome: a + b = b + a.
By satisfying all these properties, mathematicians can consider clock numbers with clock addition a group. In short, a group is a set with some way of combining the elements layered on top. The set of fruit in my fruit bowl probably can’t be made into a group easily – what’s a banana plus an apple? But we can make a set of clock numbers into a group by showing that clock addition is a way of taking two clock numbers and getting to a new one that satisfies the rules outlined above.
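Here is a minimal, illustrative sketch in Python (the article itself contains no code; this is added as an example) that checks all five properties of clock addition by brute force:

```python
# A minimal, illustrative sketch (not from the article): brute-force checks of
# the five properties listed above for clock addition on the set {1, ..., 12},
# where 12 plays the role of the identity.

from itertools import product

CLOCK = range(1, 13)

def clock_add(a, b):
    """Add two clock numbers, wrapping around so that 3 + 11 = 2 and 3 + 12 = 3."""
    return (a + b - 1) % 12 + 1

# closure: every sum lands back in {1, ..., 12}
assert all(clock_add(a, b) in CLOCK for a, b in product(CLOCK, repeat=2))

# identity: adding 12 leaves every clock number unchanged
assert all(clock_add(a, 12) == a for a in CLOCK)

# associativity: the grouping of additions does not matter
assert all(clock_add(clock_add(a, b), c) == clock_add(a, clock_add(b, c))
           for a, b, c in product(CLOCK, repeat=3))

# inverses: every clock number can be "undone" back to the identity, 12
assert all(any(clock_add(a, b) == 12 for b in CLOCK) for a in CLOCK)

# commutativity: a + b = b + a
assert all(clock_add(a, b) == clock_add(b, a) for a, b in product(CLOCK, repeat=2))

print("Clock addition satisfies all five properties, so the clock numbers form a group.")
```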

Rings and fields

Along with groups, the two other fundamental types of algebraic objects you would study in an introduction to modern algebra are rings and fields. We could introduce a second operation for the clock numbers: clock multiplication, where 2 times 7 is 2, because 14 o’clock is the same as 2 o’clock. With clock addition and clock multiplication, the clock numbers meet the criteria for what mathematicians call a ring. This is primarily because clock multiplication and clock addition together satisfy a key component that defines a ring: the distributive property, where a(b + c) = ab + ac.

Lastly, fields are rings that satisfy even more conditions. At the turn of the 20th century, mathematicians David Hilbert and Emmy Noether – who were interested in understanding how the principles in Einstein’s relativity worked mathematically – unified algebra and showed the utility of studying groups, rings and fields.
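A similar brute-force sketch, again illustrative Python rather than anything from the original article, checks that clock multiplication distributes over clock addition:

```python
# Another illustrative sketch (not from the article): the clock numbers with
# both clock addition and clock multiplication, checked for the distributive
# property a(b + c) = ab + ac that the text identifies as the key requirement
# for a ring.

from itertools import product

CLOCK = range(1, 13)

def clock_add(a, b):
    return (a + b - 1) % 12 + 1

def clock_mul(a, b):
    """Multiply two clock numbers, so 2 times 7 is 2 (14 o'clock is 2 o'clock)."""
    return (a * b - 1) % 12 + 1

assert clock_mul(2, 7) == 2   # the example from the text

# distributivity holds for every triple of clock numbers
assert all(clock_mul(a, clock_add(b, c)) == clock_add(clock_mul(a, b), clock_mul(a, c))
           for a, b, c in product(CLOCK, repeat=3))

print("Clock multiplication distributes over clock addition.")
```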

It’s all fun and games until you do the math

Groups, rings and fields are abstract, but they have many useful applications. For example, the symmetries of molecular structures are categorized by different point groups. A point group describes ways to move a molecule in space so that even if you move the individual atoms, the end result is indistinguishable from the molecule you started with.
Two water molecules with labeled hydrogen atoms H_1 and H_2 exchanging places
The water molecule H₂O can be flipped horizontally and the end result is indistinguishable from the original position. Courtney Gibbons, CC BY-SA
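One way to make this concrete, as a small illustrative sketch rather than anything from the article, is to write the horizontal flip as a permutation that swaps the two labeled hydrogen atoms; applying it twice returns every atom to where it started, which is why the flipped molecule is indistinguishable from the original:

```python
# A small illustrative sketch (not from the article): the horizontal flip of a
# water molecule written as a permutation that swaps the two hydrogen atoms.
# Doing the flip twice returns every atom to its starting position.

def compose(p, q):
    """Apply permutation q first, then permutation p."""
    return {atom: p[q[atom]] for atom in q}

identity = {"O": "O", "H1": "H1", "H2": "H2"}   # leave every atom in place
flip     = {"O": "O", "H1": "H2", "H2": "H1"}   # the mirror swaps H1 and H2

assert compose(flip, flip) == identity
print("Flipping twice is the same as doing nothing: the flip is its own inverse.")
```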
But let’s take a different example that uses rings instead of groups. You can set up a pretty complicated set of equations to describe a Sudoku puzzle: You need 81 variables to represent each place you can put a number in the grid, polynomial expressions to encode the rules of the game, and polynomial expressions that take into account the clues already on the board.

To get the spaces on the game board and the 81 variables to correspond nicely, you can use two subscripts to associate the variable with a specific place on the board, like using x₃₅ to represent the cell in the third row and fifth column. The first entry must be one of the numbers 1 through 9, and we represent that relationship with (x₁₁ – 1)(x₁₁ – 2)(x₁₁ – 3) ⋅⋅⋅ (x₁₁ – 9). This expression is equal to zero if and only if you followed the rules of the game. Since every space on the board follows this rule, that’s already 81 equations just to say, “Don’t plug in anything other than 1 through 9.”

The rule “1 through 9 each appear exactly once in the top row” can be captured with some sneaky pieces of algebraic thinking. The sum of the top row is going to add up to 45, which is to say x₁₁ + x₁₂ + ⋅⋅⋅ + x₁₉ – 45 will be zero, and the product of the top row is going to be the product of 1 through 9, which is to say x₁₁ x₁₂ ⋅⋅⋅ x₁₉ – 9⋅8⋅7⋅6⋅5⋅4⋅3⋅2⋅1 will be zero. If you’re thinking that it takes more time to set up all these rules than it does to solve the puzzle, you’re not wrong.
sudoku grid with variables x_11 through x_99 (x_ij is in the i-th row, j-th column)
Turning Sudoku into algebra takes a fair bit of work. Courtney Gibbons
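Here is a minimal sketch of that encoding for the top row in Python using the sympy library (the library choice and the sample row below are illustrative assumptions added here, not part of the article):

```python
# An illustrative sketch of the encoding described above (sympy and the sample
# row below are assumptions added here, not part of the article): polynomial
# constraints for the top row of a Sudoku board, with variables x11 ... x19.

from sympy import symbols, Mul, factorial

xs = symbols("x11 x12 x13 x14 x15 x16 x17 x18 x19")

# "Don't plug in anything other than 1 through 9": one constraint per cell,
# e.g. (x11 - 1)(x11 - 2) ... (x11 - 9) must equal zero.
cell_constraints = [Mul(*[x - k for k in range(1, 10)]) for x in xs]

# "1 through 9 each appear exactly once in the top row": the sum is 45 and the
# product is 9! = 362880, exactly as the text describes.
row_sum_constraint = sum(xs) - 45
row_product_constraint = Mul(*xs) - factorial(9)

# A candidate top row follows the rules exactly when every constraint is zero.
candidate = dict(zip(xs, [3, 1, 4, 5, 9, 2, 6, 8, 7]))   # an arbitrary valid row
constraints = cell_constraints + [row_sum_constraint, row_product_constraint]
print(all(c.subs(candidate) == 0 for c in constraints))   # True
```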
What do we get by doing this complicated translation into algebra? Well, we get to use late-20th century algorithms to figure out what numbers you can plug into the board that satisfy all the rules and all the clues. These algorithms are based on describing the structure of a special collection of polynomials – called an ideal – that these game board clues generate within the larger ring. The algorithms will tell you if there’s no solution to the puzzle. If there are multiple solutions, the algorithms will find them all.

This is a small example where setting up the algebra is harder than just doing the puzzle. But the techniques generalize widely. You can use algebra to tackle problems in artificial intelligence, robotics, cryptography, quantum computing and so much more – all with the same bag of tricks you’d use to solve the Sudoku puzzle or Rubik’s cube.

Courtney Gibbons, Associate Professor of Mathematics, Hamilton College

This article is republished from The Conversation under a Creative Commons license. Read the original article.
