News
Can you trust companies that say their plastic products are recyclable? US regulators may crack down on deceptive claims
Keurig was fined for falsely claiming its K-Cup pods were recyclable. Many plastic items labeled recyclable end up as waste. The FTC is revising guidelines to clarify recyclability claims.
Patrick Parenteau, Vermont Law & Graduate School
Plastic is a fast-growing segment of U.S. municipal solid waste, and most of it ends up as waste rather than being recycled. Just 9% of the plastic collected in municipal solid waste was recycled as of 2018, the most recent year for which national data is available. The rest was burned in waste-to-energy plants or buried in landfills.
Manufacturers assert that better recycling is the optimal way to reduce plastic pollution. But critics argue that the industry often exaggerates how readily items can actually be recycled. In September 2024, beverage company Keurig Dr Pepper was fined US$1.5 million for inaccurately claiming that its K-Cup coffee pods were recyclable after two large recycling companies said they could not process the cups. California is suing ExxonMobil, accusing the company of falsely promoting plastic products as recyclable.
Environmental law scholar Patrick Parenteau explains why claims about recyclability have confused consumers, and how forthcoming guidelines from the U.S. Federal Trade Commission may address this problem.
Why do manufacturers need guidance on what ‘recyclable’ means?
Stating that a product is recyclable means that it can be collected, separated or otherwise recovered from the waste stream for reuse or for use in the manufacture of other products. But defining exactly what that means is difficult for several reasons:
- Different U.S. states have different recycling regulations and guidelines, which can affect what is considered recyclable in a given location.
- The availability and quality of recycling infrastructure also varies from place to place. Even if a product technically is recyclable, a local recycling facility may not be able to accept it because its equipment can’t process it.
- If no market demand for the recycled material exists, recycling companies may be unlikely to accept it.
Video: Most plastic goods that consumers put in their recycling bins aren’t recycled, despite the “chasing arrows” label. Critics say manufacturers have deceived the public to avert plastic bans. https://www.youtube.com/embed/RwppgbZwrpg?wmode=transparent&start=0
What is the Federal Trade Commission’s role?
Public concern about plastic pollution has skyrocketed in recent years. A 2020 survey found that globally, 91% of consumers were concerned about plastic waste.
Once plastic enters the environment, it can take 1,000 years or more to decompose, depending on environmental conditions. Exposure through ingestion, inhalation or drinking water poses potential risks to human health and wildlife.
The Federal Trade Commission’s role is to protect the public from deceptive or unfair business practices and unfair methods of competition. Every year, it brings hundreds of cases against individuals and companies for violating consumer protection and competition laws. These cases can involve fraud, scams, identity theft, false advertising, privacy violations, anticompetitive behavior and more.
The FTC publishes references called the Green Guides, which are designed to help marketers avoid making environmental claims that mislead consumers. The guides were first issued in 1992 and were revised in 1996, 1998 and 2012. While the guides themselves are not enforceable, the commission can use them to prove that a claim is deceptive, in violation of federal law.
The guidance they provide includes:
- General principles that apply to all environmental marketing claims
- How consumers are likely to interpret claims, and how marketers can substantiate these claims
- How marketers can qualify their claims to avoid deceiving consumers
The agency monitors environmentally themed marketing for potentially deceptive claims and evaluates compliance with the FTC Act of 1914 by reference to the Green Guides. Marketing inconsistent with the Green Guides may be considered unfair or deceptive under Section 5 of the FTC Act.
Courts also refer to the Green Guides when they evaluate claims for false advertising in private litigation.
Currently, the Green Guides state that marketers should qualify claims that products are recyclable when recycling facilities are not available to at least 60% of consumers or communities where a product is sold.
How is the agency addressing recyclability claims?
The FTC is reviewing the Green Guides and issued a request for public comment on the guides in late 2022. In May 2023, the agency convened a workshop called Talking Trash at the FTC: Recycling Claims and the Green Guides.
This meeting focused on the 60% threshold for unqualified recyclability claims. It also addressed potential confusion created by the “chasing arrows” recycling symbol, which often identifies the type of plastic resin used in a product, using the numbers 1 through 7.
Many critics argue that consumers may see the symbol and assume that a product is recyclable, even though municipal recycling programs are not widely available for some types of resins. Other labels use a version of the symbol for products such as single-use grocery bags that aren’t accepted in most curbside recycling programs but can be dropped off at designated stores for recycling.
The FTC has sought public comments on the specific characteristics that make products recyclable. It also has asked whether unqualified recyclability claims should be allowed when recycling facilities are available to a “substantial majority” of consumers or communities where the item is sold – even if the item is not ultimately recycled because of weak market demand, budgetary constraints or other factors.
What are companies and environmental advocates saying?
Organizations representing environmental interests, recycling businesses and the waste and packaging industries have offered numerous suggestions for updating the Green Guides. For example:
- The U.S. Environmental Protection Agency urged the FTC to increase its threshold for recyclability claims beyond the current 60% rate. The EPA said that products and packaging “should not be considered recyclable without strong end markets in which they can reliably be sold for a price higher than the cost of disposal.” It also recommended requiring companies’ recyclability claims to be reviewed and certified by outside experts.
- The Consumer Brands Association, along with the U.S. Chamber of Commerce, the Plastics Industry Association and other commercial interests, called for more research into public understanding of environmental marketing claims. To help companies avoid making deceptive advertising claims, it urged the FTC to provide more detailed explanations, with examples of acceptable marketing.
- The Association of Plastic Recyclers encouraged the FTC to increase enforcement against deceptive unqualified claims of both recyclability and recycled content. It recommended providing stronger, more prescriptive guidance; publicizing specific examples from the marketplace of deceptive representations; and sending warning letters when companies appear to be making unsubstantiated claims. It also asked the FTC to maintain its current recyclability claim threshold at 60% and to update the Green Guides again within five years instead of 10.
- A coalition of environmental groups, including Greenpeace USA and the Center for Biological Diversity, urged the commission to codify the Green Guides into binding rules. They also argued that for goods that require in-store drop-off, companies should have to prove that processors can capture and recycle at least 75% of the material.
The FTC has not set a date for publishing a final version of the revised Green Guides. All eyes will be on the agency to see how far it is willing to go to police recycling claims by manufacturers in this $90 billion U.S. industry.
Patrick Parenteau, Professor of Law Emeritus, Vermont Law & Graduate School
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The science section of our news blog STM Daily News provides readers with captivating and up-to-date information on the latest scientific discoveries, breakthroughs, and innovations across various fields. We offer engaging and accessible content, ensuring that readers with different levels of scientific knowledge can stay informed. Whether it’s exploring advancements in medicine, astronomy, technology, or environmental sciences, our science section strives to shed light on the intriguing world of scientific exploration and its profound impact on our daily lives. From thought-provoking articles to informative interviews with experts in the field, STM Daily News Science offers a harmonious blend of factual reporting, analysis, and exploration, making it a go-to source for science enthusiasts and curious minds alike. https://stmdailynews.com/category/science/
The Earth
The US natural gas industry is leaking way more methane than previously thought. Here’s why that matters
Research reveals that methane emissions from U.S. natural gas operations are significantly underestimated, with a leak rate of 2.3 percent – a figure that raises serious climate concerns and highlights how difficult these emissions are to measure accurately.
Anthony J. Marchese, Colorado State University and Dan Zimmerle, Colorado State University
Natural gas is displacing coal, which could help fight climate change because burning it produces fewer carbon emissions. But producing and transporting natural gas releases methane, a greenhouse gas that also contributes to climate change. How big is the methane problem?
For the past five years, our research teams at Colorado State University have made thousands of methane emissions measurements at more than 700 separate facilities in the production, gathering, processing, transmission and storage segments of the natural gas supply chain.
This experience has given us a unique perspective regarding the major sources of methane emissions from natural gas and the challenges the industry faces in terms of detecting and reducing, if not eliminating, them.
Our work, along with numerous other research projects, was recently folded into a new study published in the journal Science. This comprehensive snapshot suggests that methane emissions from oil and gas operations are much higher than current EPA estimates.
What’s wrong with methane
One way to quantify the magnitude of the methane leakage is to divide the amount of methane emitted each year by the total amount of methane pumped out of the ground each year from natural gas and oil wells. The EPA currently estimates this methane leak rate to be 1.4 percent. That is, for every cubic foot of natural gas drawn from underground reservoirs, 1.4 percent of it is lost into the atmosphere.
This study synthesized the results from a five-year series of 16 studies coordinated by the environmental advocacy group Environmental Defense Fund (EDF), which involved more than 140 researchers from over 40 institutions and 50 natural gas companies.
The effort brought together scholars based at universities, think tanks and the industry itself to make the most accurate estimate possible of the total amount of methane emitted from all U.S. oil and gas operations. It integrated data from a multitude of recent studies with measurements made on the ground and from the air.
All told, based on the results of the new study, the U.S. oil and gas industry is leaking 13 million metric tons of methane each year, which means the methane leak rate is 2.3 percent. This 60 percent difference between our new estimate and the EPA’s current one can have profound climate consequences.
Methane is a highly potent greenhouse gas, with more than 80 times the climate warming impact of carbon dioxide over the first 20 years after it is released.
An earlier EDF study showed that a methane leak rate of greater than 3 percent would result in no immediate climate benefits from retiring coal-fired power plants in favor of natural gas power plants.
That means even with a 2.3 percent leakage rate, the growing share of U.S. electricity powered by natural gas is doing something to slow the pace of climate change. However, these climate benefits could be far greater.
Also, at a methane leakage rate of 2.3 percent, many other uses of natural gas besides generating electricity are conclusively detrimental for the climate. For example, EDF found that replacing the diesel used in most trucks or the gasoline consumed by most cars with natural gas would require a leakage rate of less than 1.4 percent before there would be any immediate climate benefit.
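To make the arithmetic behind these comparisons concrete, here is a minimal Python sketch that uses only the figures quoted above (13 million metric tons emitted, the 2.3 percent and 1.4 percent leak rates, and the roughly 3 percent and 1.4 percent break-even thresholds). The implied production total is back-calculated from those figures, not reported by the study itself.

```python
# Back-of-the-envelope methane leak-rate arithmetic using the figures quoted
# in the text. The implied production total is derived from those figures,
# not taken directly from the Science study.

emitted_tons = 13_000_000        # metric tons of methane leaked per year (study estimate)
leak_rate_study = 0.023          # 2.3% leak rate from the Science study
leak_rate_epa = 0.014            # 1.4% leak rate in the EPA inventory

# Leak rate = methane emitted / methane pulled out of the ground, so the
# implied annual production is emitted / rate.
implied_production_tons = emitted_tons / leak_rate_study
print(f"Implied annual methane production: {implied_production_tons:,.0f} metric tons")

# How much larger the new estimate is than the EPA's (the "60 percent
# difference" described above; the exact ratio comes out near 64%).
gap = (leak_rate_study - leak_rate_epa) / leak_rate_epa
print(f"Study estimate exceeds EPA estimate by about {gap:.0%}")

# Break-even thresholds cited in the text: ~3% for replacing coal-fired power
# plants, ~1.4% for replacing diesel trucks and gasoline cars.
print("Immediate climate benefit vs. coal power:", leak_rate_study < 0.03)         # True
print("Immediate climate benefit vs. diesel/gasoline:", leak_rate_study < 0.014)   # False
```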
What’s more, some scientists believe that the leakage rate could be even higher than this new estimate.
What causes these leaks
Perhaps you’ve never contemplated the long journey that natural gas travels before you can ignite the burners on the gas stove in your kitchen.
But on top of the 500,000 natural gas wells operating in the U.S. today, there are 2 million miles of pipes and millions of valves, fittings, tanks, compressors and other components operating 24 hours per day, seven days a week to deliver natural gas to your home.
The natural gas you burn when you whip up a batch of pancakes may have traveled 1,000 miles or more as it wended its way through this complicated network. Along the way, there were ample opportunities for some of it to leak into the atmosphere.
Natural gas leaks can be accidental, caused by malfunctioning equipment, but a lot of natural gas is also released intentionally to perform process operations such as opening and closing valves. In addition, the tens of thousands of compressors that increase the pressure and pump the gas along through the network are powered by engines that burn natural gas and their exhaust contains some unburned natural gas.
Since the natural gas delivered to your home is 85 to 95 percent methane, natural gas leaks are predominantly methane. While methane poses the greatest threat to the climate because of its greenhouse gas potency, natural gas contains other hydrocarbons that can degrade regional air quality and are bad for human health.
Inventory tallies vs. aircraft surveillance
The EPA Greenhouse Gas Inventory is done in a way experts like us call a “bottom-up” approach. It entails tallying up all of the nation’s natural gas equipment – from household gas meters to wellpads – estimating an annualized average emission rate for each category, and adding it all up.
There are two challenges to this approach. First, there are no accurate equipment records for many of these categories. Second, when components operate improperly or fail, emissions balloon, making it hard to develop an accurate and meaningful annualized emission rate for each source.
“Top-down” approaches, typically requiring aircraft, are the alternative. They measure methane concentrations upwind and downwind of large geographic areas. But this approach has its own shortcomings.
First, it captures all methane emissions, rather than just the emissions tied to natural gas operations – including the methane from landfills, cows and even the leaves rotting in your backyard. Second, these one-time snapshots may get distorted depending on what’s going on while planes fly around capturing methane data.
Historically, top-down approaches have produced estimates roughly twice as high as bottom-up estimates. Some regional top-down methane leak rate estimates have been as high as 8 percent, while some bottom-up estimates have been as low as 1 percent.
More recent work, including the Science study, has performed coordinated campaigns in which the on-the-ground and aircraft measurements are made concurrently, while carefully modeling emission events.
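For readers who want a feel for how the two accounting styles differ, the sketch below contrasts a bottom-up equipment tally with a top-down flux estimate. All equipment counts, emission factors and concentration numbers are invented placeholders, not values from the EPA inventory or the Science study.

```python
# Illustrative contrast of "bottom-up" vs. "top-down" methane accounting.
# Every number below is a placeholder chosen for illustration only.

# Bottom-up: count each equipment category, multiply by an average
# per-unit emission rate, then sum across categories.
equipment = {
    # category: (unit_count, avg_emissions_per_unit_in_tons_per_year)
    "wellpads":    (500_000, 1.2),
    "compressors": (30_000, 40.0),
    "gas_meters":  (70_000_000, 0.005),
}
bottom_up_total = sum(count * rate for count, rate in equipment.values())
print(f"Bottom-up estimate: {bottom_up_total:,.0f} tons/year")

# Top-down: aircraft measure methane concentrations upwind and downwind of a
# region; the enhancement times a region-specific airflow factor gives a flux.
downwind_ppb, upwind_ppb = 1950.0, 1900.0   # methane concentrations (placeholder)
flux_per_ppb = 86_000.0                     # tons/year per ppb of enhancement, chosen so the
                                            # top-down result lands near twice the bottom-up tally,
                                            # mirroring the historical pattern described in the text
top_down_total = (downwind_ppb - upwind_ppb) * flux_per_ppb
print(f"Top-down estimate:  {top_down_total:,.0f} tons/year")

# Top-down totals also sweep in non-oil-and-gas sources (landfills, cattle),
# one reason they tend to run higher than bottom-up tallies.
```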
Helpful gadgets and sound policy
On a sunny morning in October 2013, our research team pulled up to a natural gas gathering compressor station in Texas. Using a US$80,000 infrared camera, we immediately located an extraordinarily large leak of colorless, odorless methane that had been invisible to the operator, who quickly isolated and fixed the problem.
We then witnessed the methane emissions decline tenfold – the facility leak rate fell from 9.8 percent to 0.7 percent before our eyes.
It is not economically feasible, of course, to equip all natural gas workers with $80,000 cameras, or to hire the drivers required to monitor every wellpad on a daily basis when there are 40,000 oil and gas wells in Weld County, Colorado, alone.
But new technologies can make a difference. Our team at Colorado State University is working with the Department of Energy to evaluate gadgetry that will rapidly detect methane emissions. Some of these devices can be deployed today, including inexpensive sensors that can be monitored remotely.
Technology alone won’t solve the problem, however. We believe that slashing the nation’s methane leak rate will require a collaborative effort between industry and government. And based on our experience in Colorado, which has developed some of the nation’s strictest methane emissions regulations, we find that best practices become standard practices with strong regulations.
We believe that the Trump administration’s efforts to roll back regulations, without regard to whether they are working or not, will not only have profound climate impacts. They will also jeopardize the health and safety of all Americans while undercutting efforts by the natural gas industry to cut back on the pollution it produces.
Anthony J. Marchese, Associate Dean for Academic and Student Affairs, Walter Scott, Jr. College of Engineering; Director, Engines and Energy Conversion Laboratory; Professor, Department of Mechanical Engineering, Colorado State University and Dan Zimmerle, Senior Research Associate and Director of METEC, Colorado State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Science
That Arctic blast can feel brutally cold, but how much colder than ‘normal’ is it really?
Richard B. (Ricky) Rood, University of Michigan
An Arctic blast hitting the central and eastern U.S. in early January 2025 has been creating fiercely cold conditions in many places. Parts of North Dakota dipped to more than 20 degrees below zero, and people as far south as Texas woke up to temperatures in the teens. A snow and ice storm across the middle of the country added to the winter chill.
Forecasters warned that temperatures could be “10 to more than 30 degrees below normal” across much of the eastern two-thirds of the country during the first full week of the year.
But what does “normal” actually mean?
While temperature forecasts are important to help people stay safe, the comparison to “normal” can be quite misleading. That’s because what qualifies as normal in forecasts has been changing rapidly over the years as the planet warms.
Defining normal
One of the most widely used standards for defining a science-based “normal” is a 30-year average of temperature and precipitation. Every 10 years, the National Centers for Environmental Information updates these “normals,” most recently in 2021. The current span considered “normal” is 1991-2020. Five years ago, it was 1981-2010.
But temperatures have been rising over the past century, and the trend has accelerated since about 1980. This warming is fueled by the extraction and burning of fossil fuels, which increase carbon dioxide and methane in the atmosphere. These greenhouse gases trap heat close to the planet’s surface, leading to rising temperatures.
Because global temperatures are warming, what’s considered normal is warming, too.
So, when a 2025 cold snap is reported as the difference between the actual temperature and “normal,” it will appear to be colder and more extreme than if it were compared to an earlier 30-year average.
Thirty years is a significant portion of a human life. For people under age 40 or so, the use of the most recent averaging span might fit with what they have experienced.
But it doesn’t speak to how much the Earth has warmed.
How cold snaps today compare to the past
To see how today’s cold snaps – or today’s warming – compare to a time before global warming began to accelerate, NASA scientists use 1951-1980 as a baseline.
The reason becomes evident when you compare maps.
For example, January 1994 was brutally cold east of the Rocky Mountains. If we compare those 1994 temperatures to today’s “normal” – the 1991-2020 period – the U.S. looks a lot like maps of early January 2025’s temperatures: Large parts of the Midwest and eastern U.S. were more than 7 degrees Fahrenheit (4 degrees Celsius) below “normal,” and some areas were much colder.
But if we compare January 1994 to the 1951-1980 baseline instead, that cold spot in the eastern U.S. isn’t quite as large or extreme.
Temperatures in some parts of the country in January 1994 were roughly 14.2 F (7.9 C) colder than normal when compared with the 1991-2020 average, but only about 12.4 F (6.9 C) colder when compared with the 1951-1980 average.
As a measure of a changing climate, updating the average 30-year baseline every decade makes warming appear smaller than it is, and it makes cold snaps seem more extreme.
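A tiny Python sketch makes this baseline effect visible. The observed temperature and baseline means below are made-up values, chosen only so that the gap between them reproduces the 14.2 F vs. 12.4 F comparison described above.

```python
# How the choice of 30-year baseline changes how "extreme" a cold snap looks.
# All temperatures here are illustrative; only the ~1.8 F gap between the two
# baselines is implied by the article's January 1994 comparison.

observed_temp_f = 20.0                 # hypothetical January temperature at one location

baseline_means_f = {
    "1951-1980 normal": 32.4,          # cooler, pre-acceleration baseline (illustrative)
    "1991-2020 normal": 34.2,          # warmer, current baseline (illustrative)
}

for name, mean in baseline_means_f.items():
    anomaly = observed_temp_f - mean
    print(f"{name}: {anomaly:+.1f} F relative to that baseline")

# The same observed temperature registers as a larger negative anomaly against
# the warmer 1991-2020 normal, which is why modern cold snaps can look more
# extreme than they would against a mid-20th-century baseline.
```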
Conditions for heavy lake-effect snow
The U.S. will continue to see cold air outbreaks in winter, but as the Arctic and the rest of the planet warm, the most frigid temperatures of the past will become less common.
That warming trend helps set up a remarkable situation in the Great Lakes that we’re seeing in January 2025: heavy lake-effect snow across a large area.
As cold Arctic air encroached from the north in January, it encountered a Great Lakes basin where the water temperature was still above 40 F (4.4 C) in many places. Ice covered less than 2% of the lakes’ surface on Jan. 4.
That cold dry air over warmer open water causes evaporation, providing moisture for lake-effect snow. Parts of New York and Ohio along the lakes saw well over a foot of snow in the span of a few days.
The accumulation of heat in the Great Lakes, observed year after year, is leading to fundamental changes in winter weather and the winter economy in the states bordering the lakes.
It’s also a reminder of the persistent and growing presence of global warming, even in the midst of a cold air outbreak.
Richard B. (Ricky) Rood, Professor Emeritus of Climate and Space Sciences and Engineering, University of Michigan
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Community
News coverage boosts giving after disasters – Australian research team’s findings may offer lessons for Los Angeles fires
Media coverage significantly influences charitable donations during disasters by highlighting urgency, personal stories, and the scale of the crisis, shaping public generosity and nonprofit support choices.
Cassandra Chapman, The University of Queensland
In late 2019 and early 2020, a series of devastating wildfires, known as the “black summer” bushfire disaster, left Australia reeling: More than 20% of the country’s forests burned.
As a scholar of the psychology of charitable giving, I have long been interested in the unique emotional response that disasters evoke – often generating an urgent and visceral wish to help.
I wanted to understand how and why people respond to a crisis of this magnitude. For the project, I teamed up with three Australian environmental psychology and collective action experts: Matthew Hornsey, Kelly Fielding and Robyn Gulliver.
We found that international media coverage of disasters can help increase donations. Our findings, which were published in the peer-reviewed academic journal Disasters in 2022, are relevant to the situation in Los Angeles, where severe fires destroyed thousands of homes and businesses in January 2025, devastating many communities.
That recovery could take years.
5 key factors affect generosity
All told, Australian donors gave more than US$397 million, or $640 million in Australian dollars, to support the recovery from the black summer bushfire disaster. The international community also rallied: U.S. and U.K. donors contributed an additional US$2.6 million. These donations were used to fund evacuation centers, support groups for victims, and cash grants for repairs and rebuilding, among other things.
When we surveyed 949 Australians about what influenced their donations and analyzed news articles about the disaster, we found that coverage of disasters significantly increased generosity and influenced which charities drew donations. This may be because news articles directly communicated the need for charitable support.
Using this survey data, we identified key factors that influenced how much money, if any, people donated in response to the bushfire disaster appeals. These five were linked with the amounts Australians donated:
• Scale: The sheer scale of the fires.
• Personal impact: Having been personally affected, knowing people who were affected, or worrying about being affected in the future.
• Climate change beliefs: Believing that climate change is affecting the environment.
• News footage: The dramatic footage of the fires they had seen.
• Stories: The stories of people who had been affected.
Three of these factors – scale, news footage and stories – relate to information people were exposed to in media coverage of the disaster. Further, when we asked people how they chose which charities to support, they said that media coverage was more influential than either their friends and family or direct communication from those same charities.
These findings collectively show how media coverage can powerfully influence both how much people give to disaster relief and which nonprofits they choose to support.
Setting the agenda
In the next phase of our research, we tried to learn how media coverage affects the public’s generosity.
We downloaded every news article we could find about the disaster over the three-month period when the fires raged and analyzed the text of 30,239 news articles using Linguistic Inquiry and Word Count software.
We looked at which kinds of language and concepts were being used in media coverage, and how frequently they were used compared with their use in everyday written language.
In addition to concepts we expected to see, like emergency, heroes and human loss, we found that the concepts of support and money frequently showed up in coverage. Words like “donations,” “help” and “support” occurred in 74% of news articles. Words having to do with money were even more common: They appeared almost twice as often as they do in ordinary written language.
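The sketch below gives a rough sense of the kind of keyword tally involved. It is not the LIWC software the researchers used; the word list, baseline rate and sample snippets are placeholders. It simply counts how often support-related words appear relative to all words in a set of article texts.

```python
# Minimal keyword-frequency sketch (not LIWC). Counts how often
# support-related words appear in article text and compares the share
# against an assumed everyday-language baseline rate.
import re
from collections import Counter

SUPPORT_WORDS = {"donations", "donate", "help", "support", "appeal"}
BASELINE_SUPPORT_RATE = 0.002   # assumed share of support words in everyday text (placeholder)

def support_word_share(article_text: str) -> float:
    """Return the fraction of tokens in the article that are support-related."""
    tokens = re.findall(r"[a-z']+", article_text.lower())
    if not tokens:
        return 0.0
    counts = Counter(tokens)
    support_hits = sum(counts[w] for w in SUPPORT_WORDS)
    return support_hits / len(tokens)

# Placeholder snippets standing in for the downloaded article corpus.
articles = [
    "Donations poured in as volunteers rushed to help fire-hit towns...",
    "The fire front moved east overnight, forcing further evacuations...",
]

shares = [support_word_share(text) for text in articles]
mentions = sum(1 for s in shares if s > 0)
print(f"{mentions}/{len(articles)} articles mention support-related words")
print(f"Average support-word share: {sum(shares)/len(shares):.3%} "
      f"(baseline ~{BASELINE_SUPPORT_RATE:.3%})")
```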
Our findings suggest that news coverage may have helped to set the agenda for the huge charitable response to Australia’s wildfire disaster because the media told people what they should be thinking about in terms of that disaster. In Australia’s case, it was how they could help.
A consideration for the media
We also believe it’s likely that news coverage of disasters like this one serves an agenda-setting function by teaching the public how to think about the crisis.
To the extent that news coverage highlights concepts like support, possibly communicating that donating is a normal response to a crisis, it’s reasonable to expect people to donate more money.
Given that news coverage can influence how much someone donates, as well as which charities they choose to support, nonprofits responding to the Los Angeles fires may wish to encourage media outlets to mention their work in news coverage.
It is likely that being featured in news coverage – especially when calls to action or opportunities to donate are incorporated in an article – would result in more funds being raised for the charity’s response to the disaster.
Cassandra Chapman, Associate Professor, The University of Queensland
This article is republished from The Conversation under a Creative Commons license. Read the original article.