News

The Battle Over Intercity Rail: A Political Showdown in Arizona

Arizona’s political divide over intercity rail: GOP opposes, Dems advocate for sustainable transit solutions.

Photo by Maisy Vi on Pexels.com

Republican lawmakers in Arizona are taking a firm stance against the development of intercity rail, particularly a commuter rail between Phoenix and Tucson. Their recent move to impose stringent conditions on the state Department of Transportation, including barring the acceptance of federal funds for commuter rail, has sparked controversy and division along party lines.

Senator Jake Hoffman, a vocal opponent of the commuter rail project, argues that investing in what he deems outdated technology would be a waste of money, citing low ridership on existing light rail systems. He insists the focus should instead be on enhancing road infrastructure such as the I-10.

On the other side, Governor Katie Hobbs and Democratic lawmakers are advocating for sustainable transportation solutions, including the potential revival of Amtrak service between Phoenix and Tucson. They emphasize the importance of environmental considerations, clean air for future generations, and reducing carbon emissions.

The clash between the two parties reflects a larger debate on transportation priorities and environmental concerns. While Republicans stress individual freedom and the efficiency of personal automobiles, Democrats highlight the need for greener modes of transportation and addressing climate change.

As the legislative battle continues, the fate of intercity rail in Arizona hangs in the balance. The decision on whether to proceed with the project will have far-reaching implications for the state’s transportation infrastructure and environmental policies. Stay tuned as the Senate deliberates on this contentious issue.

https://tucson.com/news/local/government-politics/tucson-phoenix-commuter-train-jake-hoffman/article_32e22568-c9f3-11ee-a111-071dc300ee63.html

The Bridge is a section of the STM Daily News Blog meant for diversity, offering real news stories about bona fide community efforts to perpetuate a greater good. The purpose of The Bridge is to connect the divides that separate us, fostering understanding and empathy among different groups. By highlighting positive initiatives and inspirational actions, The Bridge aims to create a sense of unity and shared purpose. This section brings to light stories of individuals and organizations working tirelessly to promote inclusivity, equality, and mutual respect. Through these narratives, readers are encouraged to appreciate the richness of diverse perspectives and to participate actively in building stronger, more cohesive communities.

https://stmdailynews.com/the-bridge


Discover more from Daily News

Subscribe to get the latest posts sent to your email.


US earthquake safety relies on federal employees’ expertise

The 6.9 magnitude Loma Prieta earthquake near San Francisco in 1989 caused about $6.8 billion in damage and 63 deaths. J.K. Nakata/U.S. Geological Survey
Jonathan P. Stewart, University of California, Los Angeles and Lucy Arendt, St. Norbert College

Earthquakes and the damage they cause are apolitical. Collectively, we either prepare for future earthquakes or the population eventually pays the price. The earthquakes that struck Myanmar on March 28, 2025, collapsing buildings and causing more than 3,000 deaths, were a sobering reminder of the risks and the need for preparation.

In the U.S., this preparation hinges in large part on the expertise of scientists and engineers in federal agencies who develop earthquake hazard models and contribute to the creation of building codes designed to ensure homes, high-rises and other structures won’t collapse when the ground shakes.

Local communities and states decide whether to adopt building code documents. But those documents and other essential resources are developed through programs supported by federal agencies working in partnership with practicing engineers and earthquake experts at universities. This essential federal role is illustrated by two programs that we work closely with as an earthquake engineer and a disaster management expert whose work focuses on seismic risk.

Improving building codes

First, seismologists and earthquake engineers at the U.S. Geological Survey, or USGS, produce the National Seismic Hazard Model. These maps, based on research into earthquake sources such as faults and how seismic waves move through the earth’s crust, are used to determine the forces that structures in each community should be designed to resist. A steering committee of earthquake experts from the private sector and universities works with USGS to ensure that the National Seismic Hazard Model implements the best available science.
In this 2023 update of the national seismic risk map, red areas have the greatest chance of a damaging earthquake occurring within 100 years. USGS
Second, the Federal Emergency Management Agency, FEMA, supports the process for periodically updating building codes. That includes supporting the work of the National Institute of Building Sciences’ Provisions Update Committee, which recommends building code revisions based on investigations of earthquake damage.

More broadly, FEMA, the USGS, the National Institute of Standards and Technology and the National Science Foundation work together through the National Earthquake Hazards Reduction Program to advance earthquake science and turn knowledge of earthquake risks into safer standards, better building design and education. Some of those agencies have been threatened by potential job and funding cuts under the Trump administration, and others face uncertainty regarding continuation of federal support for their work.

It is in large part because of the National Seismic Hazard Model and regularly updated building codes that U.S. buildings designed to meet modern code requirements are considered among the safest in the world, despite substantial seismic hazards in several states. This paradigm has been made possible by the technical expertise and lack of political agendas among the federal staff. Without that professionalism, we believe experts from outside the federal government would be less likely to donate their time.

The impacts of these and other programs are well documented. We can point to the limited fatalities from U.S. earthquakes such as the 1989 Loma Prieta earthquake near San Francisco, the 1994 Northridge earthquake in Los Angeles and the 2001 Nisqually earthquake near Seattle. Powerful earthquakes in countries lacking seismic preparedness, often due to lack of adoption or enforcement of building codes, have produced much greater devastation and loss of life.

The US has long relied on people with expertise

These programs and the federal agencies supporting them have benefited from a high level of staff expertise because hiring and advancement processes have been divorced from politics and focused on qualifications and merit.

This has not always been the case. For much of early U.S. history, federal jobs were awarded through a patronage system, where political loyalty determined employment. As described in “The Federal Civil Service System and The Problem of Bureaucracy,” this system led to widespread corruption and dysfunction, with officials focused more on managing quid pro quo patronage than governing effectively. That peaked in 1881 with President James Garfield’s assassination by Charles Guiteau, a disgruntled supporter who had been denied a government appointment.

The passage of the Pendleton Act by Congress in 1883 shifted federal employment to a merit-based system. This preference for a merit-based system was reinforced in the Civil Service Reform Act of 1978. It states as national policy that “to provide the people of the United States with a competent, honest, and productive workforce … and to improve the quality of public service, Federal personnel management should be implemented consistent with merit system principles.”

The shift away from a patronage system produced a more stable and efficient federal workforce, which has enabled improvements in many critical areas, including seismic safety and disaster response.

Merit-based civil service matters for safety

While the work of these federal employees often goes unnoticed, the benefits are demonstrable and widespread. That becomes most apparent when disasters strike and buildings that meet modern code requirements remain standing. A merit-based civil service is not just a democratic ideal but a proven necessity for the safety and security of the American people, one we hope will continue well into the future. This can be achieved by retaining federal scientists and engineers and supporting the essential work of federal agencies.

This article, originally published March 31, 2025, has been updated with the rising death toll in Myanmar.

Jonathan P. Stewart, Professor of Engineering, University of California, Los Angeles and Lucy Arendt, Professor of Business Administration Management, St. Norbert College

This article is republished from The Conversation under a Creative Commons license. Read the original article.
 


From help to harm: How the government is quietly repurposing everyone’s data for surveillance

Immigration enforcement is a key justification for repurposing government data. Photo by U.S. Immigration and Customs Enforcement via Getty Images
Nicole M. Bennett, Indiana University

A whistleblower at the National Labor Relations Board reported an unusual spike in potentially sensitive data flowing out of the agency’s network in early March 2025, when staffers from the Department of Government Efficiency, which goes by DOGE, were granted access to the agency’s databases. On April 7, the Department of Homeland Security gained access to Internal Revenue Service tax data. These seemingly unrelated events are examples of recent developments in the transformation of the structure and purpose of federal government data repositories.

I am a researcher who studies the intersection of migration, data governance and digital technologies. I’m tracking how data that people provide to U.S. government agencies for public services such as tax filing, health care enrollment, unemployment assistance and education support is increasingly being redirected toward surveillance and law enforcement.

Originally collected to facilitate health care, eligibility for services and the administration of public services, this information is now shared across government agencies and with private companies, reshaping the infrastructure of public services into a mechanism of control. Once confined to separate bureaucracies, data now flows freely through a network of interagency agreements, outsourcing contracts and commercial partnerships built up in recent decades.

These data-sharing arrangements often take place outside public scrutiny, driven by national security justifications, fraud prevention initiatives and digital modernization efforts. The result is that the structure of government is quietly transforming into an integrated surveillance apparatus, capable of monitoring, predicting and flagging behavior at an unprecedented scale. Executive orders signed by President Donald Trump aim to remove remaining institutional and legal barriers to completing this massive surveillance system.

DOGE and the private sector

Central to this transformation is DOGE, which is tasked via an executive order to “promote inter-operability between agency networks and systems, ensure data integrity, and facilitate responsible data collection and synchronization.” An additional executive order calls for the federal government to eliminate its information silos. By building interoperable systems, DOGE can enable real-time, cross-agency access to sensitive information and create a centralized database on people within the U.S. These developments are framed as administrative streamlining but lay the groundwork for mass surveillance.

Key to this data repurposing are public-private partnerships. The DHS and other agencies have turned to third-party contractors and data brokers to bypass direct restrictions. These intermediaries also consolidate data from social media, utility companies, supermarkets and many other sources, enabling enforcement agencies to construct detailed digital profiles of people without explicit consent or judicial oversight.

Palantir, a private data firm and prominent federal contractor, supplies investigative platforms to agencies such as Immigration and Customs Enforcement, the Department of Defense, the Centers for Disease Control and Prevention and the Internal Revenue Service. These platforms aggregate data from various sources – driver’s license photos, social services, financial information, educational data – and present it in centralized dashboards designed for predictive policing and algorithmic profiling. These tools extend government reach in ways that challenge existing norms of privacy and consent.

The role of AI

Artificial intelligence has further accelerated this shift. Predictive algorithms now scan vast amounts of data to generate risk scores, detect anomalies and flag potential threats. These systems ingest data from school enrollment records, housing applications, utility usage and even social media, all made available through contracts with data brokers and tech companies. Because these systems rely on machine learning, their inner workings are often proprietary, unexplainable and beyond meaningful public accountability.
Sometimes the results are inaccurate, generated by AI hallucinations – responses AI systems produce that sound convincing but are incorrect, made up or irrelevant. Minor data discrepancies can lead to major consequences: job loss, denial of benefits and wrongful targeting in law enforcement operations. Once flagged, individuals rarely have a clear pathway to contest the system’s conclusions.

Digital profiling

Participation in civic life, applying for a loan, seeking disaster relief and requesting student aid now contribute to a person’s digital footprint. Government entities could later interpret that data in ways that allow them to deny access to assistance. Data collected under the banner of care could be mined for evidence to justify placing someone under surveillance. And with growing dependence on private contractors, the boundaries between public governance and corporate surveillance continue to erode.

Artificial intelligence, facial recognition systems and predictive profiling systems lack oversight. They also disproportionately affect low-income individuals, immigrants and people of color, who are more frequently flagged as risks. Initially built for benefits verification or crisis response, these data systems now feed into broader surveillance networks. The implications are profound. What began as a system targeting noncitizens and fraud suspects could easily be generalized to everyone in the country.

Eyes on everyone

This is not merely a question of data privacy. It is a broader transformation in the logic of governance. Systems once designed for administration have become tools for tracking and predicting people’s behavior. In this new paradigm, oversight is sparse and accountability is minimal. AI allows for the interpretation of behavioral patterns at scale without direct interrogation or verification. Inferences replace facts. Correlations replace testimony.

The risk extends to everyone. While these technologies are often first deployed at the margins of society – against migrants, welfare recipients or those deemed “high risk” – there’s little to limit their scope. As the infrastructure expands, so does its reach into the lives of all citizens. With every form submitted, interaction logged and device used, a digital profile deepens, often out of sight. The infrastructure for pervasive surveillance is in place. What remains uncertain is how far it will be allowed to go.

Nicole M. Bennett, Ph.D. Candidate in Geography and Assistant Director at the Center for Refugee Studies, Indiana University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Controlled burns reduce wildfire risk, but they require trained staff and funding − this could be a rough year

Prescribed burns like this one are intentional, controlled fires used to clear out dry grass and underbrush that could fuel more destructive wildfires. Ethan Swope/Getty Images
Laura Dee, University of Colorado Boulder

Controlled burns

Red skies in August, longer fire seasons and checking air quality before taking my toddler to the park. This has become the new norm in the western United States as wildfires become more frequent, larger and more catastrophic.

As an ecologist at the University of Colorado Boulder, I know that fires are part of the natural processes that forests need to stay healthy. But the combined effects of a warmer and drier climate, more people living in fire-prone areas and vegetation and debris built up over years of fire suppression are leading to more severe fires that spread faster. And that’s putting humans, ecosystems and economies at risk.

To help prevent catastrophic fires, the U.S. Forest Service issued a 10-year strategy in 2022 that includes scaling up the use of controlled burns and other techniques to remove excess plant growth and dry, dead materials that fuel wildfires. However, the Forest Service’s wildfire management activities have been thrown into turmoil in 2025 with funding cuts and disruptions and uncertainty from the federal government. The planet just saw its hottest year on record. If spring and summer 2025 are also dry and hot, conditions could be prime for severe fires again.

More severe fires harm forest recovery and people

Today’s severe wildfires have been pushing societies, emergency response systems and forests beyond what they have evolved to handle. Extreme fires have burned into cities, including destroying thousands of homes in the Los Angeles area in 2025 and near Boulder, Colorado, in 2021. They threaten downstream public drinking water by increasing sediments and contaminants in water supplies, as well as infrastructure, air quality and rural economies. They also increase the risk of flooding and mudslides from soil erosion. And they undermine efforts to mitigate climate change by releasing carbon stored in these ecosystems.

In some cases, fires burned so hot and deep into the soil that the forests are not growing back. While many species are adapted to survive low-level fires, severe blazes can damage the seeds and cones needed for forests to regrow. My team has seen this trend outside of Fort Collins, Colorado, where four years after the Cameron Peak fire, forests have still not come back the way ecologists would expect them to under past, less severe fires. Returning to a strategy of fire suppression, or trying to “go toe-to-toe with every fire,” will make these cases more common.
Parts of Cameron Peak, burned in a severe fire in 2020, still showed little evidence of recovery in 2024. Efforts have been underway to try to replant parts of the burned areas by hand. Bella Oleksy/University of Colorado
Proactive wildfire management can help reduce the risk to forests and property. Measures such as prescribed burns have proven to be effective for maintaining healthy forests and reducing the severity of subsequent wildfires. A recent review found that selective thinning followed by prescribed fire reduced subsequent fire severity by 72% on average, and prescribed fire on its own reduced severity by 62%.
Prescribed burns and forest thinning tend to reduce the risk of extremely destructive wildfires. Kimberley T. Davis, et al., Forest Ecology and Management, 2024, CC BY
But managing forests well requires knowing how forests are changing, where trees are dying and where undergrowth has built up and increased fire hazards. And, for federal lands, these are some of the jobs that are being targeted by the Trump administration.

Some of the Forest Service staff who were fired or put in limbo by the Trump administration are those who do research or collect and communicate critical data about forests and fire risk. Other fired staff provided support so crews could clear flammable debris and carry out fuel treatments such as prescribed burns, thinning forests and building fire breaks.

Losing people in these roles is like firing all primary care doctors and leaving only EMTs. Both are clearly needed. As many people know from emergency room bills, preventing emergencies is less costly than dealing with the damage later.

Logging is not a long-term fire solution

The Trump administration cited “wildfire risk reduction” when it issued an emergency order to increase logging in national forests by 25%. But private, unregulated forest management looks a lot different than managing forests to prevent destructive fires. Logging, depending on the practice, can involve clear-cutting trees and other techniques that compromise soils. Exposing a forest’s soils and dead vegetation to more sunlight also dries them out, which can increase fire risk in the near term.
Forest-thinning operations involve carefully removing young trees and brush that could easily burn, with a goal of creating conditions less likely to send fire into the crowns of trees. AP Photo/Godofredo A. Vásquez
In general, logging that focuses on extracting the highest-value trees leaves thinner trees that are more vulnerable to fires. A study in the Pacific Northwest found that replanting logged land with the same age and size of trees can lead to more severe fires in the future.

Research and data are essential

For many people in the western U.S., these risks hit close to home. I’ve seen neighborhoods burn and friends and family displaced, and I have contended with regular air quality warnings and red flag days signaling a high fire risk. I’ve also seen beloved landscapes, such as those on Cameron Peak, transform when conifers that once made up the forest have not regrown.
Recovery has been slow on Cameron Peak after a severe fire in 2020. This photo was taken in 2024. Bella Oleksy/University of Colorado
My scientific research group and collaborations with other scientists have been helping to identify cost-effective solutions. That includes which fuel-treatment methods are most effective, which types of forests and conditions they work best in and how often they are needed. We’re also planning research projects to better understand which forests are at greatest risk of not recovering after fires. This sort of research is what robust, cost-effective land management is based on.

When careful, evidence-based forest management is replaced with a heavy emphasis on suppressing every fire or clear-cutting forests, I worry that human lives, property and economies, as well as the natural legacy of public lands left to every American, are at risk.

Laura Dee, Associate Professor of Ecology, University of Colorado Boulder

This article is republished from The Conversation under a Creative Commons license. Read the original article.
