The aftermath of floods, hurricanes and other disasters can be hardest on older rural Americans – here’s how families and neighbors can help
Hurricanes, tornadoes and other extreme weather do not respect the boundaries between urban and rural areas. But when a disaster strikes, there are big differences in how well people are able to respond and recover – and older adults in rural areas are especially vulnerable.

If a disaster causes injuries, getting health care can take longer in rural areas. Many rural hospitals have closed, leaving patients traveling longer distances for care. At the same time, rural areas have higher percentages of older adults, a group that is more likely to have chronic health problems that make natural disasters especially dangerous to live through. Medical treatments, such as dialysis, can be disrupted when power goes out or clinics are damaged, and injuries are more likely around property damaged by flooding or powerful winds.

As a sociologist who studies rural issues and directs the Institute of Behavioral Science at the University of Colorado Boulder, I believe that understanding the risks is essential for ensuring healthier lives for older adults. I see many different ways rural communities are helping reduce their vulnerability in disasters.
Disasters disrupt health care, especially in isolated rural regions
According to the U.S. Census Bureau, about 20% of the country's rural population is age 65 and over, compared with only 16% of urban residents. That's about 10 million older adults living in rural areas.

There are three primary reasons rural America has been aging faster than the rest of the country: Young people have been leaving for college and job opportunities, meaning fewer residents are starting new families. Many older rural residents are choosing to "age in place" where they have strong social ties. And some rural areas are gaining older adults who choose to retire there.

An aging population means rural areas tend to have a larger percentage of residents with chronic diseases, such as dementia, heart disease, respiratory illness and diabetes. According to research from the National Council on Aging, nearly 95% of adults age 60 and older have at least one chronic condition, while more than 78% have two or more. Rural areas also have higher rates of death from chronic diseases, particularly heart disease.

At the same time, health care access in rural areas is rapidly declining. Nearly 200 rural hospitals have closed or stopped providing in-patient care since 2005. Over 700 more – one-third of the nation's remaining rural hospitals – were considered to be at risk of closing even before the cuts to Medicaid that the president signed in July 2025.

Hospital closures have left rural residents traveling about 20 miles farther for common in-patient health care services than they did two decades ago, and even farther for specialist care. Those miles might seem trivial, but in emergencies when roads are damaged or flooded, they can mean losing access to care and treatment. After Hurricane Katrina struck New Orleans in 2005, 44% of patients on dialysis missed at least one treatment session, and almost 17% missed three or more.
When Hurricanes Matthew and Florence hit rural Robeson County, North Carolina, in 2016 and 2018, some patients who relied on insulin to manage their blood sugar levels went without insulin for weeks. The county had high rates of poverty and poor health already, and the healthy foods people needed to manage the disease were also hard to find after the storm. Insulin is important for treating diabetes – a chronic disease estimated to affect nearly one-third of adults age 65 and older. But a sufficient supply can be harder to maintain when a disaster knocks out power, because insulin should be kept cool, and medical facilities and drugstores may be harder for patients to reach. Rural residents also often live farther from community centers, schools or other facilities that can serve as cooling centers during heat waves or evacuation centers in times of crisis.
Alzheimer’s disease can make evacuation difficult
Cognitive decline also affects older adults’ ability to manage disasters. Over 11% of Americans age 65 and older – more than 7 million people – have Alzheimer’s disease or related dementia, and the prevalence is higher in rural areas’ older populations compared with urban areas. Caregivers for family members living with dementia may struggle to find time to prepare for disasters. And when disaster strikes, they face unique challenges. Disasters disrupt routines, which can cause agitation for people with Alzheimer’s, and patients may resist evacuation. Living through a disaster can also worsen brain health over the long run. Older adults who lived through the 2011 Great East Japan earthquake and tsunami were found to have greater cognitive decline over the following decade, especially those who lost their homes or jobs, or whose health care routines were disrupted.
Social safety nets are essential
One thing that many rural communities have that helps is a strong social fabric. Those social connections can help reduce older adults' vulnerability when disasters strike. Following severe flooding in Colorado in 2013, social connections helped older adults navigate the maze of paperwork required for disaster aid, and some even provided personal loans.

Community support through churches, like this one whose building was hit by a tornado in rural Argyle, Wis., in 2024, and other groups can help older adults recover from disasters. Ross Harried/NurPhoto via Getty Images

Friends, family and neighbors in rural areas often check in on seniors, particularly those living alone. They can help them develop disaster response plans to ensure older residents have access to medications and medical treatment, and that they have an evacuation plan.

Rural communities and local groups can also help build up older adults' mental and physical health before and after storms by developing educational, social and exercise programs. Better health and social connections can improve resilience, including older adults' ability to respond to alerts and recover after disasters. Ensuring that everyone in the community has that kind of support is important in rural areas and cities alike as storm and flood risks worsen, particularly for older adults.

Lori Hunter, Professor of Sociology, Director of the Institute of Behavioral Science, University of Colorado Boulder

This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Substitute Teacher Who Wanted Blueprints of Our House
A fifth-grade assignment took a strange turn when a substitute teacher asked students to draw schematics of their homes. What followed — a wildly fictional floor plan and a priceless reaction from my mom — turned into one of my funniest childhood memories.
Elementary school memories tend to blend together — cafeteria pizza, playground arguments, the eternal struggle of times tables — but every once in a while, something happens that sticks with you for life. For me, that moment came in the fifth grade during a week when our regular teacher was out, and we cycled through substitute teachers like we were testing models for durability.

By midweek, in walked a substitute with a mysterious, slightly intense energy — the kind of vibe that suggested he either meditated at dawn or worked a graveyard shift doing something he couldn't talk about. We settled into our seats, expecting worksheets or quiet reading time. But nope. He had other plans.

"Today," he announced, "we're going to draw schematics of our houses."

Schematics. Not drawings. Not little houses with smoke coming out of the chimney. Actual blueprint-style schematics. He wanted the layout of our bedrooms, our parents' rooms, and where the pets slept. Every detail.

Now, to be fair, Highlights Magazine did have a feature that month teaching kids how to draw floor plans. So maybe he was just a bit overenthusiastic about cross-curricular learning. Or maybe — and this is my completely hypothetical adult theory — he worked the graveyard shift as a cat burglar gathering intel between heists. Just moonlighting between blueprints.

While the rest of the class tried their best to recreate their actual homes, my imagination sprinted in a totally different direction. The house I drew had:
A massive master bedroom with an oversized bathroom for my parents
Separate bedrooms for us kids on the opposite side of the house
A kitchen placed right in the center like a command center
And the dog — the true VIP — had a luxurious two-story doghouse
I had basically created a dream home designed by a 10-year-old watching too much Fantasy Homes by the Yard.

Later that day, my mom asked the usual question: "So, what did you guys do today?"

"We drew schematics of our house," I said casually.

The look on her face was instant and intense. She wasn't panicked, but there was definitely a "Why does a substitute teacher need to know the exact layout of my home?" expression happening. Parental instincts activated.

But then I showed her my diagram. She stared at it. Blinked. Then sighed with massive relief.

"This isn't our house," she said.

"Nope! I made it up," I replied proudly.

Her shoulders relaxed so much she probably lost five pounds of tension in one instant. If the substitute was secretly planning a heist, my masterpiece of misinformation would have sent him to the wrong house entirely.

Looking back, the whole moment feels like a sitcom setup — a mysterious substitute collecting "house schematics," me creating a completely fictional piece of architecture, and my mom going on a full emotional journey in under 30 seconds.

Maybe he was just excited about the Highlights Magazine floor-plan activity. Or maybe — just maybe — he moonlighted in cat burglary. We'll never know. But if he was, I like to think I threw him completely off the scent.
Enjoy this story?
Check out more nostalgic and humorous stories on STM Daily News and be sure to sign up for our newsletter!
Our Lifestyle section on STM Daily News is a hub of inspiration and practical information, offering a range of articles that touch on various aspects of daily life. From tips on family finances to guides for maintaining health and wellness, we strive to empower our readers with knowledge and resources to enhance their lifestyles. Whether you’re seeking outdoor activity ideas, fashion trends, or travel recommendations, our lifestyle section has got you covered. Visit us today at https://stmdailynews.com/category/lifestyle/ and embark on a journey of discovery and self-improvement.
Latin America’s Religious Shift: More Say ‘Yes’ to God but ‘No’ to Church
New research on 220,000 Latin Americans reveals a paradox: church affiliation dropped from 93% to 82% and attendance is declining, yet personal faith remains strong. Discover why Latin America’s religious decline differs dramatically from Europe and the US.
A woman takes part in a Christ of May procession in Santiago, Chile, parading a relic from a destroyed church’s crucifix through the city. AP Photo/Esteban Felix
Matthew Blanton, The University of Texas at Austin

In a region known for its tumultuous change, one idea remained remarkably consistent for centuries: Latin America is Catholic. The region's 500-year transformation into a Catholic stronghold seemed capped in 2013, when Jorge Mario Bergoglio of Argentina was elected as the first Latin American pope.

Once a missionary outpost, Latin America is now the heart of the Catholic Church. It is home to over 575 million adherents – over 40% of all Catholics worldwide. The next-largest regions are Europe and Africa, each home to 20% of the world's Catholics.

Yet beneath this Catholic dominance, the region's religious landscape is changing. First, Protestant and Pentecostal groups have experienced dramatic growth. In 1970, only 4% of Latin Americans identified as Protestant; by 2014, the share had climbed to almost 20%. But even as Protestant ranks swelled, another trend was quietly gaining ground: a growing share of Latin Americans abandoning institutional faith altogether.

And, as my research shows, the region's religious decline shows a surprising difference from patterns elsewhere. While fewer Latin Americans are identifying with a religion or attending services, personal faith remains strong.

Women known as 'animeras,' who pray for the souls of the deceased, walk to a church for Day of the Dead festivities in Telembi, Ecuador. AP Photo/Carlos Noriega
Religious decline
In 2014, 8% of Latin Americans claimed no religion at all. This number is twice as high as the percentage of people who were raised without a religion, indicating that the growth is recent, coming from people who left the church as adults. However, there had been no comprehensive study of religious change in Latin America since then.

My new research, published in September 2025, draws on two decades of survey data from over 220,000 respondents in 17 Latin American countries. This data comes from the AmericasBarometer, a large, region-wide survey conducted every two years by Vanderbilt University that focuses on democracy, governance and other social issues. Because it asks the same religion questions across countries and over time, it offers an unusually clear view of changing patterns.

Overall, the number of Latin Americans reporting no religious affiliation surged from 7% in 2004 to over 18% in 2023. The share of people who say they are religiously unaffiliated grew in 15 of the 17 countries, and more than doubled in seven. On average, 21% of people in South America say they do not have a religious affiliation, compared with 13% in Mexico and Central America. Uruguay, Chile and Argentina are the three least religious countries in the region. Guatemala, Peru and Paraguay are the most traditionally religious, with fewer than 9% who identify as unaffiliated.

Another question scholars typically use to measure religious decline is how often people go to church. From 2008 to 2023, the share of Latin Americans attending church at least once a month decreased from 67% to 60%. The percentage who never attend, meanwhile, grew from 18% to 25%.

The generational pattern is stark. Among people born in the 1940s, just over half say they attend church regularly. Each subsequent generation shows a steeper decline, dropping to just 35% for those born in the 1990s. Religious affiliation shows a similar trajectory – each generation is less affiliated than the one before.
Personal religiosity
However, in my study, I also examined a lesser-used measure of religiosity – one that tells a different story. That measure is "religious importance": how important people say that religion is in their daily lives. We might think of this as "personal" religiosity, as opposed to the "institutional" religiosity tied to formal congregations and denominations.

People attend a Mass marking the International Day against Drug Abuse and Illicit Trafficking in Buenos Aires, Argentina, on June 26, 2024. AP Photo/Rodrigo Abd

Like church attendance, overall religious importance is high in Latin America. In 2010, roughly 85% of Latin Americans in the 17 countries whose data I analyzed said religion was important in their daily lives. Sixty percent said "very," and 25% said "somewhat." By 2023, the "somewhat important" group declined to 19%, while the "very important" group grew to 64%. Personal religious importance was growing, even as affiliation and church attendance were falling.

Religious importance shows the same generational pattern as affiliation and attendance: Older people tend to report higher levels than younger ones. In 2023, 68% of people born in the 1970s said religion was "very important," compared with 60% of those born in the 1990s. Yet when you compare people at the same age, the pattern reverses. At age 30, 55% of those born in the 1970s rated religion as very important. Compare that with 59% among Latin Americans born in the 1980s, and 62% among those born in the 1990s. If this trend continues, younger generations could eventually show greater personal religious commitment than their elders.
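The same-age comparison described above is the key analytic move: instead of comparing generations as they are today, you compare each birth cohort at the same point in the life course. As a rough illustration only, here is how that comparison might be computed with pandas. The data frame, column names and values are all invented for this sketch; the actual AmericasBarometer variables and coding differ.

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent, recording birth
# decade, age when interviewed, and whether they rated religion "very
# important" (1) or not (0). Values are made up for illustration.
df = pd.DataFrame({
    "birth_decade": [1970, 1970, 1980, 1980, 1990, 1990],
    "age_at_survey": [30, 30, 30, 30, 30, 30],
    "religion_very_important": [1, 0, 1, 1, 1, 1],
})

# Compare cohorts at the SAME age: among respondents interviewed at age 30,
# compute the share saying "very important" within each birth decade.
# A rising share across decades would indicate growing personal religiosity
# among younger cohorts, even if cross-sectional comparisons show the reverse.
same_age = df[df["age_at_survey"] == 30]
share_by_cohort = (
    same_age.groupby("birth_decade")["religion_very_important"].mean()
)
print(share_by_cohort)
```

With real survey data, the same `groupby` on birth cohort, restricted to a fixed age (or age band), reproduces the 55% / 59% / 62% comparison reported in the article.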
Affiliation vs. belief
What we are seeing in Latin America, I'd argue, is a fragmented pattern of religious decline. The authority of religious institutions is waning – fewer people claim a faith; fewer attend services. But personal belief isn't eroding. Religious importance is holding steady, even growing.

This pattern is quite different from Europe and the United States, where institutional decline and personal belief tend to move together. Eighty-six percent of unaffiliated people in Latin America say they believe in God or a higher power. That compares with only 30% in Europe and 69% in the United States. Sizable proportions of unaffiliated Latin Americans also believe in angels, miracles and even that Jesus will return to Earth in their lifetime. In other words, for many Latin Americans, leaving behind a religious label or skipping church does not mean leaving faith behind.

An Aymara Indigenous spiritual guide blesses a statue of baby Jesus with incense after an Epiphany Mass at a Catholic church in La Paz, Bolivia, on Jan. 6, 2025. AP Photo/Juan Karita

This distinctive pattern reflects Latin America's unique history and culture. Since the colonial period, the region has been shaped by a mix of religious traditions. People often combine elements of Indigenous beliefs, Catholic practices and newer Protestant movements, creating personal forms of faith that don't always fit neatly into any one church or institution. Because priests were often scarce in rural areas, Catholicism developed in many communities with little direct oversight from the church. Home rituals, local saints' festivals and lay leaders helped shape religious life in more independent ways.

This reality challenges how scholars typically measure religious change. Traditional frameworks for measuring religious decline, developed from Western European data, rely heavily on religious affiliation and church attendance.
But this approach overlooks vibrant religiosity outside formal structures – and can lead scholars to mistaken conclusions. In short, Latin America reminds us that faith can thrive even as institutions fade.

Matthew Blanton, PhD Candidate, Sociology and Demography, The University of Texas at Austin

This article is republished from The Conversation under a Creative Commons license. Read the original article.
STM Daily News is a vibrant news blog dedicated to sharing the brighter side of human experiences. Emphasizing positive, uplifting stories, the site focuses on delivering inspiring, informative, and well-researched content. With a commitment to accurate, fair, and responsible journalism, STM Daily News aims to foster a community of readers passionate about positive change and engaged in meaningful conversations. Join the movement and explore stories that celebrate the positive impacts shaping our world.
The 8,000-Year History of Pecans: How America’s Only Native Nut Became a Holiday Staple
Discover how pecans went from ignored trees to holiday staples over 8,000 years. Learn about Native American pecan use, the enslaved man who revolutionized pecan grafting, George Washington’s pecan obsession, and why the US produces 80% of the world’s pecans.
How pecans went from ignored trees to a holiday staple – the 8,000-year history of America’s only native major nut crop
Shelley Mitchell, Oklahoma State University

Pecans have a storied history in the United States. Today, American trees produce hundreds of millions of pounds of pecans – 80% of the world's pecan crop. Most of that crop stays here. Pecans are used to produce pecan milk, butter and oil, but many of the nuts end up in pecan pies.

Throughout history, pecans have been overlooked, poached, cultivated and improved. As they have spread throughout the United States, they have been eaten raw and in recipes. Pecans have grown more popular over the decades, and you will probably encounter them in some form this holiday season.

I'm an extension specialist in Oklahoma, a state consistently ranked fifth in pecan production, behind Georgia, New Mexico, Arizona and Texas. I'll admit that I am not a fan of the taste of pecans, which leaves more for the squirrels, crows and enthusiastic pecan lovers.
The spread of pecans
The pecan is a nut related to the hickory. Though we call them nuts, pecans are actually a type of fruit called a drupe. Drupes have pits, like the peach and cherry.

Three pecan fruits, which ripen and split open to release pecan nuts, clustered on a pecan tree. IAISI/Moment via Getty Images

The pecan nuts that look like little brown footballs are actually the seed that starts inside the pecan fruit – until the fruit ripens and splits open to release the pecan. They are usually the size of your thumb, and you may need a nutcracker to open them. You can eat them raw or as part of a cooked dish.

The pecan derives its name from the Algonquin "pakani," which means "a nut too hard to crack by hand." Rich in fat and easy to transport, pecans traveled with Native Americans throughout what is now the southern United States. They were used for food, medicine and trade as early as 8,000 years ago.

Pecans are native to the southern United States. Elbert L. Little Jr. of the U.S. Department of Agriculture, Forest Service

Pecans are native to the southern United States, and while they had previously spread along travel and trade routes, the first documented purposeful planting of a pecan tree was in New York in 1772. Three years later, George Washington's estate, Mount Vernon, had pecans planted. Washington loved pecans, and Revolutionary War soldiers said he was constantly eating them.

Meanwhile, no one needed to plant pecans in the South, since they naturally grew along riverbanks and in groves. Pecan trees are alternate bearing: They will have a very large crop one year, followed by one or two very small crops. But because they naturally produced a harvest with no input from farmers, people did not need to actively cultivate them. Locals would harvest nuts for themselves but otherwise ignored the self-sufficient trees. It wasn't until the late 1800s that people in the pecan's native range realized the pecan's potential worth for income and trade.
Harvesting pecans became competitive, and young boys would climb onto precarious tree branches. One girl was even lifted by a hot air balloon so she could beat on the upper branches of trees and knock the nuts down to collectors below. Pecan poaching was a problem in natural groves on private property.
Pecan cultivation begins
Even with so obvious a demand, cultivated orchards in the South were still rare into the 1900s. Pecan trees don't produce nuts for several years after planting, so their future quality is unknown.

An orchard of pecan trees. Jon Frederick/iStock via Getty Images

To guarantee quality nuts, farmers began using a technique called grafting: they'd join branches from quality trees to another pecan tree's trunk. The first attempts at grafting pecans, in 1822, weren't very successful. Grafting pecans became popular after an enslaved man named Antoine, who lived on a Louisiana plantation, successfully produced large pecans with tender shells by grafting around 1846. His pecans became the first widely available improved pecan variety.

Grafting is a technique that involves connecting the branch of one tree to the trunk of another. Orest Lyzhechka/iStock via Getty Images

The variety was named Centennial because it was introduced to the public 30 years later at the Philadelphia Centennial Exposition in 1876, alongside the telephone, Heinz ketchup and the right arm of the Statue of Liberty. Grafting also sped up the production process.

To keep pecan quality up and produce consistent annual harvests, today's pecan growers shake the trees while the nuts are still growing, until about half of the pecans fall off. This reduces the number of nuts so that the tree can put more energy into fewer pecans, which leads to better quality. Shaking also evens out the yield, so that the alternate-bearing characteristic doesn't create a boom-bust cycle.
US pecan consumption
The French brought the praline dessert with them when they immigrated to Louisiana in the early 1700s. A praline is a flat, creamy candy made with nuts, sugar, butter and cream. Their original recipe used almonds, but at the time, the only nut available in America was the pecan, so pecan pralines were born.

Pralines were originally a French dessert, but Americans began making them with pecans. Jupiterimages/The Image Bank via Getty Images

During the Civil War and world wars, Americans consumed pecans in large quantities because they were a protein-packed alternative when meat was expensive and scarce. One cup of pecan halves has about 9 grams of protein. After the wars, pecan demand declined, resulting in millions of excess pounds at harvest. One effort to increase demand was a national pecan recipe contest in 1924. Over 21,000 submissions came from over 5,000 cooks, with 800 of them published in a book.

Pecan consumption went up with the inclusion of pecans in commercially prepared foods and the start of the mail-order industry in the 1870s, as pecans can be shipped and stored at room temperature. That characteristic also put them on some Apollo missions. Small amounts of pecans contain many vitamins and minerals. They became commonplace in cereals, which touted their health benefits.

In 1938, the federal government published the pamphlet Nuts and How to Use Them, which touted pecans' nutritional value and came with recipes. Food writers suggested using pecans as shortening because they are composed mostly of fat. The government even put a price ceiling on pecans to encourage consumption, but consumers weren't buying them.
The government ended up buying the surplus pecans and integrating them into the National School Lunch Program.

Today, pecan producers use machines called tree shakers to shake pecans out of the trees. Christine_Kohler/iStock via Getty Images

While you are sitting around the Thanksgiving table this year, you can discuss one of the biggest controversies in the pecan industry: Are they PEE-cans or puh-KAHNS?

Editor's note: This article was updated to include the amount of protein in a cup of pecans.

Shelley Mitchell, Senior Extension Specialist in Horticulture and Landscape Architecture, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.