Some school librarians in Florida have found themselves in the midst of controversy over complaints of “obscene” titles in their libraries. Trish233/iStock via Getty Images
Federal judge overturns part of Florida’s book ban law, drawing on nearly 100 years of precedent protecting First Amendment access to ideas
James B. Blasingame, Arizona State University

When a junior at an Orange County public high school in Florida visited the school library to check out a copy of “On the Road” by Jack Kerouac, it wasn’t in its Dewey decimal system-assigned location. The title had been removed from the shelves because of a complaint and, in compliance with Florida House Bill 1069, was gone from the library indefinitely. Kerouac’s quintessential chronicle of the Beat Generation in the 1950s, along with hundreds of other titles, was not available for students to read.

Gov. Ron DeSantis signed the bill into law in July 2023. Under this law, if a parent or community member objected to a book on the grounds that it was obscene or pornographic, the school had to remove that title from the curriculum within five days and hold a public hearing with a special magistrate appointed by the state.

On Aug. 13, 2025, Judge Carlos Mendoza of the U.S. District Court for the Middle District of Florida ruled in Penguin Random House v. Gibson that parts of Florida HB 1069 are unconstitutional and violate students’ First Amendment right of free access to ideas. The plaintiffs who filed the suit included the five largest trade book publishing houses, a group of award-winning authors, the Authors Guild – a professional organization for published authors with over 15,000 members – and the parents of a group of Florida students.

Though the state filed an appeal on Sept. 11, 2025, this is an important ruling on censorship at a time when many states are passing or debating similar laws.

I’ve spent the past 26 years training English language arts teachers at Arizona State University, and 24 years before that teaching high school English. I understand the importance of Mendoza’s ruling for keeping books in classrooms and school libraries. In my experience, every few years the books teachers have chosen to teach come under attack.
I’ve tried to learn as much as I can about the history of censorship in this country and pass it to my students, in order to prepare them for what may lie ahead in their careers as English teachers.
Legal precedent
The August 2025 ruling is in keeping with legal precedent around censorship. Over the years, U.S. courts have established that obscenity can be a legitimate cause for removing a book from the public sphere, but only under limited circumstances.

In the 1933 case of United States v. One Book Called Ulysses, Judge John M. Woolsey declared that James Joyce’s classic novel was not obscene. Woolsey emphasized that works must be considered as a whole, rather than judged by “selected excerpts,” and that reviewers should apply contemporary national standards and think about the effect on the average person.

In 1957, the Supreme Court further clarified First Amendment protections in Roth v. United States. The court held that obscenity falls outside First Amendment protection because it is “utterly without redeeming social importance,” while stressing that sex and obscenity are not synonymous. It defined obscenity as material that, taken as a whole, appeals to a prurient – that is, lascivious – interest in sex in average readers.

The Supreme Court’s 1973 Miller v. California decision created the eponymous Miller test for jurors in obscenity cases. This test incorporates language from the Ulysses and Roth rulings, asking jurors to consider whether the average person, looking at the work as a whole and applying the contemporary standards in their community, would find it lascivious. It also adds the consideration of whether the material in question is of “serious literary, artistic, political, or scientific value” when deciding whether it is obscene.

Another decision that is particularly relevant for teachers and school librarians is 1982’s Island Trees School District v. Pico, a case brought by students against their school board. The Supreme Court ruled that removing books from a school library or curriculum is a violation of the First Amendment if it is an attempt to suppress ideas.
Free access to ideas in books, the court wrote, is sacrosanct: “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion or other matters of opinion.”

These 23 books were removed from Florida school libraries under Florida HB 1069. In his ruling in Penguin Random House v. Gibson, Judge Carlos Mendoza named them and stated, ‘None of these books are obscene.’ Illustration by The Conversation
What this ruling clarifies
In his ruling in August 2025, Mendoza pointed out that many of the removed books are classics with no sexual content at all. This was made possible in part by the design of HB 1069. The law allows anyone from the community to challenge a book simply by filling out a form, at which point the school is mandated to remove that book within five days. To put a book back in circulation, however, the law requires a hearing before the state’s appointed magistrate, and there is no specified deadline by which this hearing must take place.

Mendoza did not strike down the parts of HB 1069 that require school districts to follow a state policy for challenging books. In line with precedent, he also left in place challenges for obscenity using the Miller test and with reference to age-appropriateness for mature content.

The Florida Department of Education argued that HB 1069 is protected as government speech, a legal doctrine holding that when the government itself speaks, it may choose its own message without being subject to First Amendment scrutiny. Mendoza questioned this argument, suggesting that “slapping the label of government speech on book removals only serves to stifle the disfavored viewpoints.”
What this means for schools, in Florida and across the US
How pecans went from ignored trees to a holiday staple – the 8,000-year history of America’s only native major nut crop
Shelley Mitchell, Oklahoma State University

Pecans have a storied history in the United States. Today, American trees produce hundreds of millions of pounds of pecans – 80% of the world’s pecan crop. Most of that crop stays here. Pecans are used to produce pecan milk, butter and oil, but many of the nuts end up in pecan pies.

Throughout history, pecans have been overlooked, poached, cultivated and improved. As they have spread throughout the United States, they have been eaten raw and in recipes. Pecans have grown more popular over the decades, and you will probably encounter them in some form this holiday season.

I’m an extension specialist in Oklahoma, a state consistently ranked fifth in pecan production, behind Georgia, New Mexico, Arizona and Texas. I’ll admit that I am not a fan of the taste of pecans, which leaves more for the squirrels, crows and enthusiastic pecan lovers.
The spread of pecans
The pecan is a nut related to the hickory. Though we call them nuts, pecans are actually a type of fruit called a drupe. Drupes have pits, like the peach and cherry.

Three pecan fruits, which ripen and split open to release pecan nuts, clustered on a pecan tree. IAISI/Moment via Getty Images

The pecan nuts that look like little brown footballs are actually the seed that starts inside the pecan fruit – until the fruit ripens and splits open to release the pecan. They are usually the size of your thumb, and you may need a nutcracker to open them. You can eat them raw or as part of a cooked dish. The pecan derives its name from the Algonquin “pakani,” which means “a nut too hard to crack by hand.”

Rich in fat and easy to transport, pecans traveled with Native Americans throughout what is now the southern United States. They were used for food, medicine and trade as early as 8,000 years ago.

Pecans are native to the southern United States. Elbert L. Little Jr. of the U.S. Department of Agriculture, Forest Service

While pecans had previously spread along travel and trade routes, the first documented purposeful planting of a pecan tree was in New York in 1722. Three years later, George Washington’s estate, Mount Vernon, had pecans planted. Washington loved pecans, and Revolutionary War soldiers said he was constantly eating them.

Meanwhile, no one needed to plant pecans in the South, since they naturally grew along riverbanks and in groves. Pecan trees are alternate bearing: They will have a very large crop one year, followed by one or two very small crops. But because they naturally produced a harvest with no input from farmers, people did not need to actively cultivate them. Locals would harvest nuts for themselves but otherwise ignored the self-sufficient trees. It wasn’t until the late 1800s that people in the pecan’s native range realized the pecan’s potential worth for income and trade.
Harvesting pecans became competitive, and young boys would climb onto precarious tree branches to knock nuts down. One girl was even lifted by a hot air balloon so she could beat on the upper branches and send pecans falling to collectors below. Pecan poaching was a problem in natural groves on private property.
Pecan cultivation begins
Even with so obvious a demand, cultivated orchards in the South were still rare into the 1900s. Pecan trees don’t produce nuts for several years after planting, so their future quality is unknown.

An orchard of pecan trees. Jon Frederick/iStock via Getty Images

To guarantee quality nuts, farmers began using a technique called grafting: They’d join branches from quality trees to another pecan tree’s trunk. The first attempt at grafting pecans was in 1822, but early attempts weren’t very successful. Grafting pecans became popular after an enslaved man named Antoine, who lived on a Louisiana plantation, successfully produced large pecans with tender shells by grafting around 1846. His pecans became the first widely available improved pecan variety. This technique also sped up the production process.

Grafting is a technique that involves connecting the branch of one tree to the trunk of another. Orest Lyzhechka/iStock via Getty Images

The variety was named Centennial because it was introduced to the public 30 years later at the Philadelphia Centennial Exhibition in 1876, alongside the telephone, Heinz ketchup and the right arm of the Statue of Liberty.

To keep pecan quality up and produce consistent annual harvests, today’s pecan growers shake the trees while the nuts are still growing, until about half of the pecans fall off. This reduces the number of nuts so that the tree can put more energy into fewer pecans, which leads to better quality. Shaking also evens out the yield, so that the alternate-bearing characteristic doesn’t create a boom-bust cycle.
US pecan consumption
The French brought the praline dessert with them when they immigrated to Louisiana in the early 1700s. A praline is a flat, creamy candy made with nuts, sugar, butter and cream. Their original recipe used almonds, but at the time, the only nut available in America was the pecan, so pecan pralines were born.

Pralines were originally a French dessert, but Americans began making them with pecans. Jupiterimages/The Image Bank via Getty Images

During the Civil War and world wars, Americans consumed pecans in large quantities because they were a protein-packed alternative when meat was expensive and scarce. One cup of pecan halves has about 9 grams of protein. After the wars, pecan demand declined, resulting in millions of excess pounds at harvest. One effort to increase demand was a national pecan recipe contest in 1924. Over 21,000 submissions came from over 5,000 cooks, with 800 of them published in a book.

Pecan consumption went up with the inclusion of pecans in commercially prepared foods and the start of the mail-order industry in the 1870s, as pecans can be shipped and stored at room temperature. That characteristic also put them on some Apollo missions. Small amounts of pecans contain many vitamins and minerals. They became commonplace in cereals, which touted their health benefits. In 1938, the federal government published the pamphlet Nuts and How to Use Them, which touted pecans’ nutritional value and came with recipes. Food writers suggested using pecans as shortening because they are composed mostly of fat. The government even put a price ceiling on pecans to encourage consumption, but consumers weren’t buying them.
The government ended up buying the surplus pecans and integrating them into the National School Lunch Program.

Today, pecan producers use machines called tree shakers to shake pecans out of the trees. Christine_Kohler/iStock via Getty Images

While you are sitting around the Thanksgiving table this year, you can discuss one of the biggest controversies in the pecan industry: Are they PEE-cans or puh-KAHNS?

Editor’s note: This article was updated to include the amount of protein in a cup of pecans.

Shelley Mitchell, Senior Extension Specialist in Horticulture and Landscape Architecture, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Learning with AI falls short compared to old-fashioned web search
Shiri Melumad, University of Pennsylvania

Since the release of ChatGPT in late 2022, millions of people have started using large language models to access knowledge. And it’s easy to understand their appeal: Ask a question, get a polished synthesis and move on – it feels like effortless learning. However, a new paper I co-authored offers experimental evidence that this ease may come at a cost: When people rely on large language models to summarize information on a topic for them, they tend to develop shallower knowledge about it compared to learning through a standard Google search.

Co-author Jin Ho Yun and I, both professors of marketing, reported this finding in a paper based on seven studies with more than 10,000 participants. Most of the studies used the same basic paradigm: Participants were asked to learn about a topic – such as how to grow a vegetable garden – and were randomly assigned to do so by using either an LLM like ChatGPT or the “old-fashioned way,” by navigating links using a standard Google search. No restrictions were put on how they used the tools; they could search on Google as long as they wanted and could continue to prompt ChatGPT if they felt they wanted more information. Once they completed their research, they were asked to write advice to a friend on the topic based on what they learned.

The data revealed a consistent pattern: People who learned about a topic through an LLM rather than web search felt that they learned less, invested less effort in subsequently writing their advice, and ultimately wrote advice that was shorter, less factual and more generic. In turn, when this advice was presented to an independent sample of readers, who were unaware of which tool had been used to learn about the topic, they found the advice to be less informative and less helpful, and they were less likely to adopt it. We found these differences to be robust across a variety of contexts.
For example, one possible reason LLM users wrote briefer and more generic advice is simply that the LLM results exposed users to less eclectic information than the Google results. To control for this possibility, we conducted an experiment where participants were exposed to an identical set of facts in the results of their Google and ChatGPT searches. Likewise, in another experiment we held constant the search platform – Google – and varied whether participants learned from standard Google results or Google’s AI Overview feature. The findings confirmed that, even when holding the facts and platform constant, learning from synthesized LLM responses led to shallower knowledge compared to gathering, interpreting and synthesizing information for oneself via standard web links.
Why it matters
Why did the use of LLMs appear to diminish learning? One of the most fundamental principles of skill development is that people learn best when they are actively engaged with the material they are trying to learn. When we learn about a topic through Google search, we face much more “friction”: We must navigate different web links, read informational sources, and interpret and synthesize them ourselves. While more challenging, this friction leads to a deeper, more original mental representation of the topic at hand. But with LLMs, this entire process is done on the user’s behalf, transforming learning from an active process into a passive one.
What’s next?
To be clear, we do not believe the solution to these issues is to avoid using LLMs, especially given the undeniable benefits they offer in many contexts. Rather, our message is that people simply need to become smarter, more strategic users of LLMs – which starts by understanding the domains where LLMs are beneficial versus harmful to their goals. Need a quick, factual answer to a question? Feel free to use your favorite AI copilot. But if your aim is to develop deep and generalizable knowledge in an area, relying on LLM syntheses alone will be less helpful.

As part of my research on the psychology of new technology and new media, I am also interested in whether it’s possible to make LLM learning a more active process. In another experiment we tested this by having participants engage with a specialized GPT model that offered real-time web links alongside its synthesized responses. There, however, we found that once participants received an LLM summary, they weren’t motivated to dig deeper into the original sources. The result was that these participants still developed shallower knowledge compared to those who used standard Google search.

Building on this, in my future research I plan to study generative AI tools that impose healthy frictions for learning tasks – specifically, examining which types of guardrails or speed bumps most successfully motivate users to actively learn beyond easy, synthesized answers. Such tools seem particularly critical in secondary education, where a major challenge for educators is how best to equip students to develop foundational reading, writing and math skills while also preparing them for a world where LLMs are likely to be an integral part of daily life.

The Research Brief is a short take on interesting academic work.

Shiri Melumad, Associate Professor of Marketing, University of Pennsylvania

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Dive into “The Knowledge,” where curiosity meets clarity. This playlist, in collaboration with STMDailyNews.com, is designed for viewers who value historical accuracy and insightful learning. Our short videos, ranging from 30 seconds to a minute and a half, make complex subjects easy to grasp in no time. Covering everything from historical events to contemporary processes and entertainment, “The Knowledge” bridges the past with the present. In a world where information is abundant yet often misused, our series aims to guide you through the noise, preserving vital knowledge and truths that shape our lives today. Perfect for curious minds eager to discover the ‘why’ and ‘how’ of everything around us. Subscribe and join in as we explore the facts that matter. https://stmdailynews.com/the-knowledge/
Some exoplanets, like the one shown in this illustration, may have atmospheres that could make them potentially suitable for life. NASA/JPL-Caltech via AP
Beyond the habitable zone: Exoplanet atmospheres are the next clue to finding life on planets orbiting distant stars
Morgan Underwood, Rice University

When astronomers search for planets that could host liquid water on their surface, they start by looking at a star’s habitable zone. Water is a key ingredient for life, and on a planet too close to its star, surface water may boil away; too far, and it could freeze. This zone marks the region in between. But being in this sweet spot doesn’t automatically mean a planet is hospitable to life. Other factors, like whether a planet is geologically active or has processes that regulate gases in its atmosphere, play a role.

The habitable zone provides a useful guide for the search for signs of life on exoplanets – planets outside our solar system orbiting other stars. But what’s in these planets’ atmospheres holds the next clue about whether liquid water – and possibly life – exists beyond Earth.

On Earth, the greenhouse effect, caused by gases like carbon dioxide and water vapor, keeps the planet warm enough for liquid water and life as we know it. Without an atmosphere, Earth’s surface temperature would average around zero degrees Fahrenheit (minus 18 degrees Celsius), far below the freezing point of water. The boundaries of the habitable zone are defined by how much of a greenhouse effect is necessary to maintain surface temperatures that allow liquid water to persist. It’s a balance between sunlight and atmospheric warming.

Many planetary scientists, including me, are seeking to understand whether the processes responsible for regulating Earth’s climate are operating on other habitable zone worlds. We use what we know about Earth’s geology and climate to predict how these processes might appear elsewhere, which is where my geoscience expertise comes in.

Picturing the habitable zone of a solar system analog, with Venus- and Mars-like planets outside of the ‘just right’ temperature zone. NASA
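The minus 18 degrees Celsius figure for an airless Earth comes from a standard energy-balance calculation: the sunlight a planet absorbs is set equal to the heat it radiates away as a blackbody. Here is a minimal sketch in Python; the solar constant and albedo values are standard textbook figures I am supplying, not numbers from the article:

```python
# Energy balance for a planet with no atmosphere:
#   absorbed sunlight  S * (1 - A) / 4  =  radiated heat  sigma * T^4
# Solving for T gives the "equilibrium temperature."
S = 1361.0              # solar constant at Earth's orbit, W/m^2 (textbook value)
A = 0.3                 # Earth's Bond albedo, the fraction of sunlight reflected
sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T_eq = (S * (1 - A) / (4 * sigma)) ** 0.25  # equilibrium temperature in kelvin
T_celsius = T_eq - 273.15
```

Running this gives roughly 255 K, or about minus 18 degrees Celsius, matching the figure above; the roughly 33 degrees of extra warming supplied by Earth's actual greenhouse effect is what lifts the surface above freezing.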
Why the habitable zone?
The habitable zone is a simple and powerful idea, and for good reason. It provides a starting point, directing astronomers to where they might expect to find planets with liquid water, without needing to know every detail about the planet’s atmosphere or history. Its definition is partially informed by what scientists know about Earth’s rocky neighbors. Mars, which lies just outside the outer edge of the habitable zone, shows clear evidence of ancient rivers and lakes where liquid water once flowed. Similarly, Venus is currently too close to the Sun to be within the habitable zone. Yet, some geochemical evidence and modeling studies suggest Venus may have had water in its past, though how much and for how long remains uncertain. These examples show that while the habitable zone is not a perfect predictor of habitability, it provides a useful starting point.
Planetary processes can inform habitability
What the habitable zone doesn’t do is determine whether a planet can sustain habitable conditions over long periods of time. On Earth, a stable climate allowed life to emerge and persist. Liquid water could remain on the surface, giving slow chemical reactions enough time to build the molecules of life and letting early ecosystems develop resilience to change, which reinforced habitability. Life emerged on Earth, then continued to reshape the environments it evolved in, making them more conducive to life. This stability likely unfolded over hundreds of millions of years, as the planet’s surface, oceans and atmosphere worked together as part of a slow but powerful system to regulate Earth’s temperature.

A key part of this system is how Earth recycles inorganic carbon between the atmosphere, surface and oceans over the course of millions of years. Inorganic carbon refers to carbon bound in atmospheric gases, dissolved in seawater or locked in minerals, rather than biological material. This part of the carbon cycle acts like a natural thermostat. When volcanoes release carbon dioxide into the atmosphere, the carbon dioxide molecules trap heat and warm the planet. As temperatures rise, rain and weathering draw carbon out of the air and store it in rocks and oceans. If the planet cools, this process slows down, allowing carbon dioxide, a warming greenhouse gas, to build up in the atmosphere again.

This part of the carbon cycle has helped Earth recover from past ice ages and avoid runaway warming. Even as the Sun has gradually brightened, this cycle has contributed to keeping temperatures on Earth within a range where liquid water and life can persist for long spans of time. Now, scientists are asking whether similar geological processes might operate on other planets, and if so, how they might detect them.
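The thermostat behavior described above can be illustrated with a toy numerical model: volcanoes add carbon dioxide at a constant rate, while weathering removes it faster when the planet is warmer. Every functional form and constant below – the logarithmic greenhouse response, the exponential weathering law, the 288 K and 280 ppm baseline – is a simplifying assumption chosen for illustration, not a value from any study:

```python
import math

def temperature(co2_ppm, co2_ref=280.0, t_ref=288.0, sens=4.3):
    """Toy greenhouse response: each doubling of CO2 warms the planet by `sens` kelvin."""
    return t_ref + sens * math.log2(co2_ppm / co2_ref)

def run_thermostat(co2_start, steps=20000, outgassing=1.0):
    """Step the CO2 budget forward: a constant volcanic source, and a
    temperature-dependent weathering sink that removes CO2 faster on a
    warmer planet. Returns the final (CO2, temperature) state."""
    co2 = co2_start
    for _ in range(steps):
        t = temperature(co2)
        # Weathering sink grows with temperature; it exactly balances
        # the volcanic source when the planet sits at 288 K.
        weathering = outgassing * math.exp((t - 288.0) / 10.0)
        co2 += outgassing - weathering  # net CO2 change per (arbitrary) time step
    return co2, temperature(co2)

# Start the planet too warm (doubled CO2) and too cold (halved CO2):
warm_final = run_thermostat(560.0)
cold_final = run_thermostat(140.0)
```

From either starting point the feedback drives carbon dioxide back toward 280 ppm and the temperature back toward 288 K: warming speeds up the weathering sink, cooling throttles it, and the climate self-corrects, which is exactly the negative feedback the paragraph describes.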
For example, if researchers could observe enough rocky planets in their stars’ habitable zones, they could look for a pattern connecting the amount of sunlight a planet receives and how much carbon dioxide is in its atmosphere. Finding such a pattern may hint that the same kind of carbon-cycling process could be operating elsewhere.

The mix of gases in a planet’s atmosphere is shaped by what’s happening on or below its surface. One study shows that measuring atmospheric carbon dioxide across a number of rocky planets could reveal whether their surfaces are broken into moving plates, like Earth’s, or whether their crusts are more rigid. On Earth, these shifting plates drive volcanism and rock weathering, which are key to carbon cycling.

Simulation of what space telescopes, like the Habitable Worlds Observatory, will capture when looking at distant solar systems. STScI, NASA GSFC
Keeping an eye on distant atmospheres
The next step will be toward gaining a population-level perspective of planets in their stars’ habitable zones. By analyzing atmospheric data from many rocky planets, researchers can look for trends that reveal the influence of underlying planetary processes, such as the carbon cycle. Scientists could then compare these patterns with a planet’s position in the habitable zone. Doing so would allow them to test whether the zone accurately predicts where habitable conditions are possible, or whether some planets maintain conditions suitable for liquid water beyond the zone’s edges.

This kind of approach is especially important given the diversity of exoplanets. Many exoplanets fall into categories that don’t exist in our solar system – such as super-Earths and mini-Neptunes. Others orbit stars smaller and cooler than the Sun. The datasets needed to explore and understand this diversity are just on the horizon.

NASA’s upcoming Habitable Worlds Observatory will be the first space telescope designed specifically to search for signs of habitability and life on planets orbiting other stars. It will directly image Earth-sized planets around Sun-like stars to study their atmospheres in detail.

NASA’s planned Habitable Worlds Observatory will look for exoplanets that could potentially host life.

Instruments on the observatory will analyze starlight passing through these atmospheres to detect gases like carbon dioxide, methane, water vapor and oxygen. As starlight filters through a planet’s atmosphere, different molecules absorb specific wavelengths of light, leaving behind a chemical fingerprint that reveals which gases are present. These compounds offer insight into the processes shaping these worlds. The Habitable Worlds Observatory is under active scientific and engineering development, with a potential launch targeted for the 2040s.
Combined with today’s telescopes, which are increasingly capable of observing the atmospheres of Earth-sized worlds, scientists may soon be able to determine whether the same planetary processes that regulate Earth’s climate are common throughout the galaxy, or uniquely our own.

Morgan Underwood, Ph.D. Candidate in Earth, Environmental and Planetary Sciences, Rice University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
The science section of our news blog STM Daily News provides readers with captivating and up-to-date information on the latest scientific discoveries, breakthroughs, and innovations across various fields. We offer engaging and accessible content, ensuring that readers with different levels of scientific knowledge can stay informed. Whether it’s exploring advancements in medicine, astronomy, technology, or environmental sciences, our science section strives to shed light on the intriguing world of scientific exploration and its profound impact on our daily lives. From thought-provoking articles to informative interviews with experts in the field, STM Daily News Science offers a harmonious blend of factual reporting, analysis, and exploration, making it a go-to source for science enthusiasts and curious minds alike. https://stmdailynews.com/category/science/