Regulations have cleaned up cars, power plants and factories, leaving cleaner air while economies have grown.
Cavan Images/Josh Campbell via Getty Images
Richard E. Peltier, UMass Amherst
The Trump administration is “reconsidering” more than 30 air pollution regulations, and it offered industries a brief window to apply for exemptions that would allow them to stop following many air quality regulations immediately if approved. All of the exemptions involve rules finalized in 2024 and include regulations for hazardous air pollutants that cause asthma, heart disease and cancer.
The results – if regulations are ultimately rolled back and if those rollbacks and any exemptions stand up to court challenges – could impact air quality across the United States.
“Reconsideration” is the formal process an agency uses to review, and potentially modify, a government regulation. While Environmental Protection Agency Administrator Lee Zeldin provided few details, the breadth of the regulations being reconsidered affects all Americans. They include rules that set limits for pollutants that can harm human health, such as ozone, particulate matter and volatile organic compounds.
Zeldin wrote on March 12, 2025, that his deregulation moves would “roll back trillions in regulatory costs and hidden ‘taxes’ on U.S. families.”
What Zeldin didn’t say is that the economic and health benefits from decades of federal clean air regulations have far outweighed their costs. Some estimates suggest every $1 spent meeting clean air rules has returned $10 in health and economic benefits.
How far America has come, because of regulations
In the early 1970s, thick smog blanketed American cities and acid rain stripped forests bare from the Northeast to the Midwest.
Air pollution wasn’t just a nuisance – it was a public health emergency. But in the decades since, the United States has engineered one of the most successful environmental turnarounds in history.
Thanks to stronger air quality regulations, pollution levels have plummeted, preventing hundreds of thousands of deaths annually. And despite early predictions that these regulations would cripple the economy, the opposite has proven true: The U.S. economy more than doubled in size while pollution fell, showing that clean air and economic growth can – and do – go hand in hand.
The numbers are eye-popping.
An Environmental Protection Agency analysis of the first 20 years of the Clean Air Act, from 1970 to 1990, found the economic benefits of the regulations were about 42 times greater than the costs.
The EPA later estimated that the cost of air quality regulations in the U.S. would be about US$65 billion in 2020, and the benefits, primarily in improved health and increased worker productivity, would be around $2 trillion. Other studies have found similar benefits.
That’s a return of more than 30 to 1, making clean air one of the best investments the country has ever made.
Science-based regulations level the playing field
The turning point came with the passage of the Clean Air Act of 1970, which put in place strict rules on pollutants from industry, vehicles and power plants.
These rules targeted key culprits: lead, ozone, sulfur dioxide, nitrogen oxides and particulate matter – substances that contribute to asthma, heart disease and premature deaths. An example was the removal of lead, which can harm the brain and other organs, from gasoline. That single change resulted in far lower levels of lead in people’s blood, including a 70% drop in U.S. children’s blood-lead levels.
Air quality regulations lowered the amount of lead used in gasoline, which resulted in rapidly declining lead concentrations in the average American between 1976 and 1980. This shows how effective regulations can be at reducing public health risks.
USEPA/Environmental Criteria and Assessment Office (1986)
The results have been extraordinary. Since 1980, emissions of six major air pollutants have dropped by 78%, even as the U.S. economy has more than doubled in size. Cities that were once notorious for their thick, choking smog – such as Los Angeles, Houston and Pittsburgh – now see far cleaner air, while lakes and forests devastated by acid rain in the Northeast have rebounded.
Comparison of growth areas and declining emissions, 1970-2023.
EPA
And most importantly, lives have been saved. The Clean Air Act requires the EPA to periodically estimate the costs and benefits of air quality regulations. In the most recent estimate, released in 2011, the EPA projected that air quality improvements would prevent over 230,000 premature deaths in 2020. That means fewer heart attacks, fewer emergency room visits for asthma, and more years of healthy life for millions of Americans.
The economic payoff
Critics of air quality regulations have long argued that the regulations are too expensive for businesses and consumers. But the data tells a very different story.
EPA studies have confirmed that clean air regulations improve air quality over time. Other studies have shown that the health benefits greatly outweigh the costs. That pays off for the economy. Fewer illnesses mean lower health care costs, and healthier workers mean higher productivity and fewer missed workdays.
The EPA estimated that for every $1 spent on meeting air quality regulations, the United States received $9 in benefits. A separate study by the non-partisan National Bureau of Economic Research in 2024 estimated that each $1 spent on air pollution regulation brought the U.S. economy at least $10 in benefits. And when considering the long-term impact on human health and climate stability, the return is even greater.
Hollywood and downtown Los Angeles in 1984: Smog was a common problem in the 1970s and 1980s.
Ian Dryden/Los Angeles Times/UCLA Archive/Wikimedia Commons, CC BY
The next chapter in clean air
The air Americans breathe today is cleaner, much healthier and safer than it was just a few decades ago.
Yet, despite this remarkable progress, air pollution remains a challenge in some parts of the country. Some urban neighborhoods remain stubbornly polluted because of vehicle emissions and industrial pollution. While urban pollution has declined, wildfire smoke has become a larger influence on poor air quality across the nation.
That means the EPA still has work to do.
If the agency works with environmental scientists, public health experts and industry, and fosters honest scientific consensus, it can continue to protect public health while supporting economic growth. At the same time, it can ensure that future generations enjoy the same clean air and prosperity that regulations have made possible.
By instead considering retracting clean air rules, the EPA is calling into question the expertise of countless scientists who have provided their objective advice over decades to set standards designed to protect human lives. In many cases, industries won’t want to go back to past polluting ways, but lifting clean air rules means future investment might not be as protective. And it increases future regulatory uncertainty for industries.
The past offers a clear lesson: Investing in clean air is not just good for public health – it’s good for the economy. With a track record of saving lives and delivering trillion-dollar benefits, air quality regulations remain one of the greatest policy success stories in American history.
This article, originally published March 12, 2025, has been updated with the administration’s offer of exemptions for industries.
Richard E. Peltier, Professor of Environmental Health Sciences, UMass Amherst
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Immigration enforcement is a key justification for repurposing government data.
Photo by U.S. Immigration and Customs Enforcement via Getty Images
Nicole M. Bennett, Indiana University
A whistleblower at the National Labor Relations Board reported an unusual spike in potentially sensitive data flowing out of the agency’s network in early March 2025 when staffers from the Department of Government Efficiency, which goes by DOGE, were granted access to the agency’s databases. On April 7, the Department of Homeland Security gained access to Internal Revenue Service tax data.
These seemingly unrelated events are examples of recent developments in the transformation of the structure and purpose of federal government data repositories. I am a researcher who studies the intersection of migration, data governance and digital technologies. I’m tracking how data that people provide to U.S. government agencies for public services such as tax filing, health care enrollment, unemployment assistance and education support is increasingly being redirected toward surveillance and law enforcement.
Originally collected to facilitate health care, eligibility for services and the administration of public services, this information is now shared across government agencies and with private companies, reshaping the infrastructure of public services into a mechanism of control. Once confined to separate bureaucracies, data now flows freely through a network of interagency agreements, outsourcing contracts and commercial partnerships built up in recent decades.
These data-sharing arrangements often take place outside public scrutiny, driven by national security justifications, fraud prevention initiatives and digital modernization efforts. The result is that the structure of government is quietly transforming into an integrated surveillance apparatus, capable of monitoring, predicting and flagging behavior at an unprecedented scale.
Executive orders signed by President Donald Trump aim to remove remaining institutional and legal barriers to completing this massive surveillance system.
DOGE and the private sector
Central to this transformation is DOGE, which is tasked via an executive order to “promote inter-operability between agency networks and systems, ensure data integrity, and facilitate responsible data collection and synchronization.” An additional executive order calls for the federal government to eliminate its information silos.
By building interoperable systems, DOGE can enable real-time, cross-agency access to sensitive information and create a centralized database on people within the U.S. These developments are framed as administrative streamlining but lay the groundwork for mass surveillance.
Key to this data repurposing are public-private partnerships. The DHS and other agencies have turned to third-party contractors and data brokers to bypass direct restrictions. These intermediaries also consolidate data from social media, utility companies, supermarkets and many other sources, enabling enforcement agencies to construct detailed digital profiles of people without explicit consent or judicial oversight.
Palantir, a private data firm and prominent federal contractor, supplies investigative platforms to agencies such as Immigration and Customs Enforcement, the Department of Defense, the Centers for Disease Control and Prevention and the Internal Revenue Service. These platforms aggregate data from various sources – driver’s license photos, social services, financial information, educational data – and present it in centralized dashboards designed for predictive policing and algorithmic profiling. These tools extend government reach in ways that challenge existing norms of privacy and consent.
The role of AI
Artificial intelligence has further accelerated this shift.
Predictive algorithms now scan vast amounts of data to generate risk scores, detect anomalies and flag potential threats.
These systems ingest data from school enrollment records, housing applications, utility usage and even social media, all made available through contracts with data brokers and tech companies. Because these systems rely on machine learning, their inner workings are often proprietary, unexplainable and beyond meaningful public accountability.
Sometimes the results are inaccurate, generated by AI hallucinations – responses AI systems produce that sound convincing but are incorrect, made up or irrelevant. Minor data discrepancies can lead to major consequences: job loss, denial of benefits and wrongful targeting in law enforcement operations. Once flagged, individuals rarely have a clear pathway to contest the system’s conclusions.
Digital profiling
Participation in civic life, applying for a loan, seeking disaster relief and requesting student aid now contribute to a person’s digital footprint. Government entities could later interpret that data in ways that allow them to deny access to assistance. Data collected under the banner of care could be mined for evidence to justify placing someone under surveillance. And with growing dependence on private contractors, the boundaries between public governance and corporate surveillance continue to erode.
Artificial intelligence, facial recognition systems and predictive profiling systems lack oversight. They also disproportionately affect low-income individuals, immigrants and people of color, who are more frequently flagged as risks.
Initially built for benefits verification or crisis response, these data systems now feed into broader surveillance networks. The implications are profound. What began as a system targeting noncitizens and fraud suspects could easily be generalized to everyone in the country.
Eyes on everyone
This is not merely a question of data privacy. It is a broader transformation in the logic of governance. Systems once designed for administration have become tools for tracking and predicting people’s behavior. In this new paradigm, oversight is sparse and accountability is minimal.
AI allows for the interpretation of behavioral patterns at scale without direct interrogation or verification. Inferences replace facts. Correlations replace testimony.
The risk extends to everyone. While these technologies are often first deployed at the margins of society – against migrants, welfare recipients or those deemed “high risk” – there’s little to limit their scope. As the infrastructure expands, so does its reach into the lives of all citizens.
With every form submitted, interaction logged and device used, a digital profile deepens, often out of sight. The infrastructure for pervasive surveillance is in place. What remains uncertain is how far it will be allowed to go.
Nicole M. Bennett, Ph.D. Candidate in Geography and Assistant Director at the Center for Refugee Studies, Indiana University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Red skies in August, longer fire seasons and checking air quality before taking my toddler to the park. This has become the new norm in the western United States as wildfires become more frequent, larger and more catastrophic.
As an ecologist at the University of Colorado Boulder, I know that fires are part of the natural processes that forests need to stay healthy. But the combined effects of a warmer and drier climate, more people living in fire-prone areas and vegetation and debris built up over years of fire suppression are leading to more severe fires that spread faster. And that’s putting humans, ecosystems and economies at risk.
To help prevent catastrophic fires, the U.S. Forest Service issued a 10-year strategy in 2022 that includes scaling up the use of controlled burns and other techniques to remove excess plant growth and dry, dead materials that fuel wildfires.
However, the Forest Service’s wildfire management activities have been thrown into turmoil in 2025 with funding cuts and disruptions and uncertainty from the federal government.
The planet just saw its hottest year on record. If spring and summer 2025 are also dry and hot, conditions could be prime for severe fires again.
More severe fires harm forest recovery and people
Today’s severe wildfires have been pushing societies, emergency response systems and forests beyond what they have evolved to handle.
Extreme fires have burned into cities, including destroying thousands of homes in the Los Angeles area in 2025 and near Boulder, Colorado, in 2021. They threaten downstream public drinking water by increasing sediments and contaminants in water supplies, as well as infrastructure, air quality and rural economies. They also increase the risk of flooding and mudslides from soil erosion. And they undermine efforts to mitigate climate change by releasing carbon stored in these ecosystems.
In some cases, fires burned so hot and deep into the soil that the forests are not growing back.
While many species are adapted to survive low-level fires, severe blazes can damage the seeds and cones needed for forests to regrow. My team has seen this trend outside of Fort Collins, Colorado, where four years after the Cameron Peak fire, forests have still not come back the way ecologists would expect them to under past, less severe fires. Returning to a strategy of fire suppression – or trying to “go toe-to-toe with every fire” – will make these cases more common.
Parts of Cameron Peak, burned in a severe fire in 2020, still showed little evidence of recovery in 2024. Efforts have been underway to try to replant parts of the burned areas by hand.
Bella Oleksy/University of Colorado
Proactive wildfire management can help reduce the risk to forests and property.
Measures such as prescribed burns have proven to be effective for maintaining healthy forests and reducing the severity of subsequent wildfires. A recent review found that selective thinning followed by prescribed fire reduced subsequent fire severity by 72% on average, and prescribed fire on its own reduced severity by 62%.
Prescribed burns and forest thinning tend to reduce the risk of extremely destructive wildfires.
Kimberley T. Davis et al., Forest Ecology and Management, 2024, CC BY
But managing forests well requires knowing how forests are changing, where trees are dying and where undergrowth has built up and increased fire hazards. And, for federal lands, these are some of the jobs that are being targeted by the Trump administration.
Some of the Forest Service staff who were fired or put in limbo by the Trump administration are those who do research or collect and communicate critical data about forests and fire risk. Other fired staff provided support so crews could clear flammable debris and carry out fuel treatments such as prescribed burns, thinning forests and building fire breaks.
Losing people in these roles is like firing all primary care doctors and leaving only EMTs. Both are clearly needed. As many people know from emergency room bills, preventing emergencies is less costly than dealing with the damage later.
Logging is not a long-term fire solution
The Trump administration cited “wildfire risk reduction” when it issued an emergency order to increase logging in national forests by 25%.
But private – unregulated – forest management looks a lot different than managing forests to prevent destructive fires.
Logging, depending on the practice, can involve clear-cutting trees and other techniques that compromise soils. Exposing a forest’s soils and dead vegetation to more sunlight also dries them out, which can increase fire risk in the near term.
Forest-thinning operations involve carefully removing young trees and brush that could easily burn, with a goal of creating conditions less likely to send fire into the crowns of trees.
AP Photo/Godofredo A. Vásquez
In general, logging that focuses on extracting the highest-value trees leaves thinner trees that are more vulnerable to fires. A study in the Pacific Northwest found that replanting logged land with the same age and size of trees can lead to more severe fires in the future.
Research and data are essential
For many people in the western U.S., these risks hit close to home.
I’ve seen neighborhoods burn and friends and family displaced, and I have contended with regular air quality warnings and red flag days signaling a high fire risk. I’ve also seen beloved landscapes, such as those on Cameron Peak, transform when conifers that once made up the forest have not regrown.
Recovery has been slow on Cameron Peak after a severe fire in 2020. This photo was taken in 2024.
Bella Oleksy/University of Colorado
My scientific research group and collaborations with other scientists have been helping to identify cost-effective solutions. That includes which fuel-treatment methods are most effective, which types of forests and conditions they work best in and how often they are needed. We’re also planning research projects to better understand which forests are at greatest risk of not recovering after fires.
This sort of research is what robust, cost-effective land management is based on.
When careful, evidence-based forest management is replaced with a heavy emphasis on suppressing every fire or clear-cutting forests, I worry that human lives, property and economies, as well as the natural legacy of public lands left to every American, are at risk.
Laura Dee, Associate Professor of Ecology, University of Colorado Boulder
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Not all children learn to read in the same way, but schools tend to adopt a single approach to literacy.
luckyvector/iStock via Getty Images Plus
K. Dara Hill, University of Michigan-Dearborn
Five years after the pandemic forced children into remote instruction, two-thirds of U.S. fourth graders still cannot read at grade level. Reading scores are 2 percentage points below 2022 levels and 4 percentage points below 2019 levels.
This data, from the 2024 report of the National Assessment of Educational Progress, a national assessment often called “the nation’s report card,” has educators scrambling to boost reading skills.
Many school districts have adopted an evidence-based literacy curriculum called the “science of reading” that features phonics as a critical component.
Phonics strategies begin by teaching children to recognize letters and make their corresponding sounds. Then they advance to manipulating and blending first-letter sounds to read and write simple, consonant-vowel-consonant words – such as combining “b” or “c” with “-at” to make “bat” and “cat.” Eventually, students learn to merge more complex word families and to read them in short stories to improve fluency and comprehension.
Proponents of the curriculum celebrate its grounding in brain science, and the science of reading has been credited with helping Louisiana students outperform their pre-pandemic reading scores last year.
In practice, Louisiana used a variety of science of reading approaches beyond phonics. That’s because different students have different learning needs, for a variety of reasons.
Yet as a scholar of reading and language who has studied literacy in diverse student populations, I see many schools across the U.S. placing a heavy emphasis on the phonics components of the science of reading.
If schools want across-the-board gains in reading achievement, using one reading curriculum to teach every child isn’t the best way. Teachers need the flexibility and autonomy to use various, developmentally appropriate literacy strategies as needed.
Phonics fails some students
Phonics programs often require memorizing word families in word lists. This works well for some children: Research shows that “decoding” strategies such as phonics can support low-achieving readers and learners with dyslexia.
However, some students may struggle with explicit phonics instruction, particularly the growing population of neurodivergent learners with autism spectrum disorder or attention deficit hyperactivity disorder. These students learn and interact differently than their mainstream peers in school and in society. And they tend to have different strengths and challenges when it comes to word recognition, reading fluency and comprehension.
This was the case with my own child. He had been a proficient reader from an early age, but struggles emerged when his school adopted a phonics program to balance out its regular curriculum, a flexible literature-based curriculum called Daily 5 that prioritizes reading fluency and comprehension.
I worked with his first grade teacher to mitigate these challenges. But I realized that his real reading proficiency would likely not have been detected if the school had taught almost exclusively phonics-based reading lessons.
Another weakness of phonics, in my experience, is that it teaches reading in a way that is disconnected from authentic reading experiences. Phonics often directs children to identify short vowel sounds in word lists, rather than encounter them in colorful stories. Evidence shows that exposing children to fun, interesting literature promotes deep comprehension.
Balanced literacy
To support different learning styles, educators can teach reading in multiple ways. This is called balanced literacy, and for decades it was a mainstay in teacher preparation and in classrooms.
Balanced literacy prompts children to learn words encountered in authentic literature during guided, teacher-led read-alouds – versus learning how to decode words in word lists. Teachers use multiple strategies to promote reading acquisition, such as blending the letter sounds in words to support “decoding” while reading.
Another balanced literacy strategy that teachers can apply in phonics-based strategies while reading aloud is called “rhyming word recognition.” The rhyming word strategy is especially effective with stories whose rhymes contribute to the deeper meaning of the story, such as Marc Brown’s “Arthur in a Pickle.”
The rhyming structure of ‘Arthur in a Pickle’ helps children learn to read entire words, versus word parts.
After reading, teachers may have learners arrange letter cards to form words, then tap the letter cards while saying and blending each sound to form the word. Similar phonics strategies include tracing and writing letters to form words that were encountered during reading.
There is no one right way to teach literacy in a developmentally appropriate, balanced literacy framework. There are as many ways as there are students.
STM Daily News is a vibrant news blog dedicated to sharing the brighter side of human experiences. Emphasizing positive, uplifting stories, the site focuses on delivering inspiring, informative, and well-researched content. With a commitment to accurate, fair, and responsible journalism, STM Daily News aims to foster a community of readers passionate about positive change and engaged in meaningful conversations. Join the movement and explore stories that celebrate the positive impacts shaping our world.
https://stmdailynews.com/