
Mass deportations don’t keep out ‘bad genes’ − they use scientific racism to justify biased immigration policies

Anti-immigration policies were a key talking point for Republican candidates. Matt Rourke/AP Photo

Shoumita Dasgupta, Boston University

Threats of mass deportations loom on the post-2024 election horizon. Some supporters claim these will protect the country from immigrants who bring “bad genes” into America. But this is a misguided use of the language of science to give a sheen of legitimacy to unscientific claims.

Politicians invoke genetics to confirm false stereotypes that immigrants are more violent than native-born citizens as a result of biological differences. This is despite the fact that immigrants living in the country with or without legal authorization have significantly lower crime and violent crime rates than U.S. citizens. Moreover, there is no strong genetic evidence to support a biological predisposition for committing violent acts.

As a geneticist and a child of immigrants, I study the intersection of biology and bias. I am also author of the book “Where Biology Ends and Bias Begins: Lessons on Belonging from Our DNA.” What is clear from my professional work is that this line of thinking – attempting to use science to explain human difference in ways that reinforce social hierarchies – isn’t new. It takes the playbooks of genetic essentialism and scientific racism and applies them to public policy.

The fallacy of genetic essentialism

Genetic essentialism is the concept that genes alone are the reason why someone develops a specific trait or behaves a certain way. For instance, a genetic essentialist would say that a person’s athleticism, intelligence, personality and a range of other traits are encoded entirely in their DNA. They ignore the influence that sports training, material resources and learned behaviors have on these traits.

When used to explain differences between populations, genetic essentialism discounts the role that structural biases – inequities deeply ingrained in how systems operate – play in individual differences. Structural biases create a playing field that advantages one group over another from the start.

For instance, studies seeking to identify a gene for violent behavior may use measurements that are themselves biased. If arrest or incarceration rates were used as evidence of violence, study findings would be affected by discriminatory practices in the policing and criminal justice systems that more harshly penalize people of color.

Studies trying to disentangle the relative effects of genetic and structural factors on specific traits face similar biases. For example, mental health outcomes are influenced by the identity-related stress that racial, sexual and gender minorities experience. Similarly, socioeconomic outcomes are shaped by the effects of redlining and segregation on generational wealth.

Genetics of educational attainment

As another example of behavioral genetics, consider a 2018 study on the genetics of educational attainment – in other words, whether certain genes were associated with years of schooling completed. The researchers were careful to communicate their results as pertaining specifically to educational attainment. They highlighted that genetic scores explained only about 11% to 13% of the variance – meaning, 87% to 89% of differences in educational attainment were due to influences other than genetics.

However, some popular press coverage oversimplified their findings as identifying genes for intelligence, even though the scientists did not directly measure intelligence – nor is such a direct measurement possible.

The effects of racial segregation in schools continue to be reflected in educational attainment gaps. Wilfredo Lee/AP Photo

Educational attainment can reflect everything from generational wealth to racial biases in education. A student with access to tutors their parents paid for has fewer educational obstacles than a student who has to work after school to make ends meet. Likewise, school punishment practices that are biased against students from certain backgrounds can set them on a harmful trajectory known as the school-to-prison pipeline.

Genetic studies are not conducted in a vacuum, and social influences can confound analyses seeking to focus on biological effects. In fact, some scientists think of genes as potential controls to allow more careful study of the nongenetic factors accounting for the remaining 87% to 89% of differences in educational attainment.


Intentional misinterpretation of these observations on educational attainment has led some to conclude that Black students are simply not as intelligent as their white counterparts. They argue that these differences are genetically encoded and immutable. However, when the effects of wealth gaps and school segregation are accounted for, test score gaps substantially narrow. Importantly, educational attainment gaps actually invert, with models predicting that Black students complete more years of school than white students.

Slipping to scientific racism

This brings us to scientific racism: the way in which science is contorted to support preexisting views about the superiority of the white race over all others.

American physician Samuel Morton was one of the original forebears of scientific racism. He was interested in providing “evidence” to support his belief that Caucasians were the most intelligent of all races. To do this, he collected skulls and categorized them into five racial groupings he believed were derived from separate creation events. He measured skull volume as an indicator of intelligence.

Samuel Morton and his colleagues used average skull volume to support their theory of white supremacy. Morton et al/U.S. National Library of Medicine via Internet Archive

When comparing the averages from each group, his results supported his original theory. However, if he had instead focused on the array of skull volumes in his collection, he would have seen substantial overlap among the groupings. That is, each group had a range of small to large skulls. Morton’s singular focus on proving his beliefs from the outset likely influenced his favored analytical approach. In any case, there is no meaningful correlation between brain volume and intelligence.

Similar beliefs are at play when white supremacists manipulate data to create a scientific basis for their claims that white people are more intelligent than Black people. These tampered results appear in dark corners of the internet where they are shared in fringe journals, far-right social media memes and racist manifestos.

To be clear, there is no evidence that genetic differences related to intelligence or cognitive performance exist between racial groups. Instead, this is another argument growing out of replacement theory, the conspiracy theory that Jews and Western elites are deliberately replacing white populations with populations of color. Adherents believe that people of color are genetically inferior but are reproducing and immigrating at greater rates, and so threatening white power.

Human genetic variation

Scientists have systematically studied human genetic variation for decades, looking at differences in the DNA of people around the world. These studies definitively demonstrate that we are far more alike than different. The vast majority of common genetic variation is found across populations, and very few rare variants are specific to an individual group.

This may seem unexpected. Looking at the world around you, you’ll observe some differences between racially defined groups, such as skin tone and hair texture. However, there is no place in the world where you’ll be able to draw a line that cleanly separates people with dark skin tone from those with light skin tone. Skin color varies continuously across the globe, and a range of skin tones are present within any individual group.

Importantly, variation in one genetic trait is not predictive of other genetic traits. That is, you can’t extrapolate conclusions about traits such as disease predisposition from the genes that influence skin color. Even if the fallacy of genetic essentialism were true and cognitive ability were primarily a biological trait – which it is not – it would not be possible to connect an observed skin tone to predicted intelligence.

President-elect Donald Trump pledged to use the military to carry out mass deportations.

Misappropriating genetics

While science does not support genetic essentialism or other underpinnings of replacement theory, this exact rationale has made its way into national immigration policy.

These policies grew directly out of the American eugenics movement, which sought to build a supposedly better human race through social engineering based on “race science.” Zoologist Charles Davenport created the Cold Spring Harbor Eugenics Record Office in 1910 to pursue his interests in evolution, breeding and human heredity. There, he and his colleagues collected records of American families, documenting their traits and ascribing genetic bases to them.


Harry Laughlin, a high school teacher Davenport recruited to serve as superintendent of the office, was later appointed as the expert eugenics agent for the U.S. congressional Committee on Immigration and Naturalization. He commissioned studies to document race-based trends in so-called biological traits such as intelligence, inventiveness and feeble-mindedness, erroneously concluding that observed patterns were due to genetic differences between populations. His findings were used to inform U.S. immigration quotas, which were set higher for populations deemed to have good genes and lower for those with undesirable traits.

These policies were codified in the Immigration Restriction Act of 1924. In signing the act, President Calvin Coolidge declared, “America must remain American,” paraphrasing a popular Ku Klux Klan slogan. This law severely restricted immigration from Asia and implemented strict quotas for immigrants from Southern and Eastern Europe. It also established elements of the immigration system that remain in place in 2025, including the visa system and the Border Patrol. With the passage of the Immigration Restriction Act, anti-Semitism and xenophobia became the law of the land.

Coming full circle, genetic essentialism and racism continue to drive present-day rhetoric using “bad genes” to justify mass deportations of people deemed harmful to American society. Politicians and tech moguls are employing a combination of racism, willful misunderstanding of genetic science and political power to promote their own social agendas.

Ten people were killed in the 2022 Buffalo shooting. Kent Nishimura/Los Angeles Times via Getty Images

Politicians and hate groups have often weaponized genetics, leading to violent events carried out in the name of white supremacy. These include the 2017 Charlottesville Unite the Right rally, the 2019 Christchurch shootings of Muslims at two mosques, and the 2022 Buffalo massacre of Black customers at a neighborhood grocery store.

A better understanding of science and history can empower scientists, policymakers and others to reject unscientific claims and protect vulnerable members of society targeted by racism.

Shoumita Dasgupta, Professor of Medicine, Assistant Dean of Diversity & Inclusion, Boston University

This article is republished from The Conversation under a Creative Commons license. Read the original article.




HBCUs Do More Than Boost Opportunity — Research Suggests They Can Also Help Reduce Incarceration Risk

Historically Black colleges and universities (HBCUs) play a crucial role in supporting Black students’ educational and socioeconomic advancement. By providing affordable education and mentorship, HBCUs help reduce crime rates among graduates. Despite funding challenges, their impact includes higher graduation rates and economic mobility, which help break cycles of poverty and incarceration.

Jackson State University students attend an event in Mississippi in October 2025. Aron Smith/Jackson State University via Getty Images

Historically Black colleges and universities do more than offer Black youths a pathway to opportunity and success – I teach criminology, and my research suggests another benefit

Andrea Hagan, Loyola University New Orleans

Historically Black colleges and universities, often known as HBCUs, are well known for their deep roots in U.S. higher education and proven effectiveness at graduating Black students who go on to become professionally successful.

HBCUs are colleges and universities that were established before 1964, with the mission of educating Black Americans, though now anyone can attend.

As a criminology instructor who has spent 13 years studying the relationship between educational trajectories and criminal justice – and a Black woman who grew up in the South and attended an HBCU – I believe that HBCUs offer another often overlooked benefit.

They give young people, especially Black people, a pathway in higher education that they might not otherwise receive. By opening doors to education, jobs and mentorship, HBCUs disrupt the conditions that can cause young people – especially Black people – to get lost in the criminal justice system.

The U.S. incarcerates approximately 1.6 million people. Black Americans are locked up at five times the rate of white Americans. This disparity starts young: Black teenagers are 5.6 times more likely to be placed in juvenile detention than white teenagers, and people who are incarcerated as juveniles are nearly four times more likely to be incarcerated as adults. Overall, the vast majority of Black people are not incarcerated.

Attending an HBCU, or any other university, does not guarantee a stable financial future. And not graduating from high school or college certainly does not mean that someone will become incarcerated.

But research shows that education, especially a college degree, is closely linked to lower crime rates. College graduates who do commit crimes reoffend at rates below 6%, while people who drop out of high school return to prison at rates around 75%.

This is why I believe HBCUs in particular have an important role to play in helping young Black people avoid this path.

Spelman College graduates arrive at their commencement ceremony in May 2025 in College Park, Ga. Paras Griffin/Getty Images

Understanding HBCUs

Today, there are roughly 100 HBCUs in 19 states, as well as the District of Columbia and the U.S. Virgin Islands. The schools are a mix of public schools and private, nonprofit colleges and universities.

HBCUs make up just 3% of the country’s colleges and universities. But their graduates include 40% of Black engineers, 50% of Black lawyers and 70% of Black doctors in the United States.

Most HBCUs are located in Southern and mid-Atlantic states – a legacy of when segregation barred Black students from attending most colleges and universities.


Many HBCUs are located in rural Southern communities in particular, where residents tend to live in poverty and have limited educational opportunities.

Attending a local HBCU is often one of the most practical ways these prospective students can get a degree – in part because HBCUs are often more affordable than other four-year college options.

The average annual tuition for an in-state student at a public HBCU is roughly US$7,700 per year – well below the national average, which ranges from $12,000 at public schools to $45,000 at private schools. Some public HBCUs charge as little as $1,000 in annual tuition for in-state students.

Schools like Coppin State University in Baltimore and the University of Maryland Eastern Shore also offer in-state rates to out-of-state students from places that do not have HBCUs nearby.

Despite their focus on Black students, HBCUs are increasingly diverse.

In 2022, non-Black students made up 24% of the student population at HBCUs. By comparison, non-Black students made up just 15% of HBCU enrollment in 1976.

HBCUs also enroll low-income students, regardless of race, at three times the rate that predominantly white colleges do.

Upward mobility

Research shows completing high school reduces arrest rates by 11% to 12% for both property and violent crimes, regardless of race or economic background.

College takes this effect further.

Studies have found that college enrollment helps young people with histories of delinquency to stop committing crimes. Completing a four-year degree reduces the likelihood of criminal behavior by 43% to 48%, compared to those who started college but did not finish.


A few long-recognized reasons help explain this pattern. Education increases earning potential, making crime a riskier and less attractive option for people with a degree. Education also encourages long-term thinking, strengthens ties to employers and communities, and builds problem-solving skills that help people navigate challenges.

I have seen firsthand, through my own experiences growing up in the South and teaching students, how HBCUs can help move Black students out of poverty. These schools stand out among other colleges in terms of how effectively they graduate low-income Black students and move them into the middle class, outcomes that research links to reduced criminal behavior.

When researchers rank colleges by whether and how their students improve their socioeconomic status, income and wealth over time, more than half of the highest-performing schools are HBCUs.

Black students who attend HBCUs are 30% more likely to earn a degree than Black students who attend colleges that are not HBCUs. Black HBCU graduates are also likely to earn more money than Black non-HBCU college graduates.

This matters because poverty is one of the strongest predictors of whether someone will commit a crime.

When colleges and universities graduate students who earn middle-class incomes, they help break what researchers call the cycle of intergenerational poverty and incarceration. This pattern describes how children of incarcerated parents are six times more likely than their peers to end up in the justice system.

An ongoing money problem

Despite their benefits, HBCUs have chronically struggled with funding. In recent decades, state governments have not given Black land-grant universities – meaning public colleges originally created through federal legislation to serve Black students during segregation – at least $12.8 billion the federal government said they were owed.

Recent federal support for HBCUs has been mixed, as the Trump administration has made widespread cuts to many universities and colleges.

In April 2025, President Donald Trump signed an executive order renewing the White House Initiative on HBCUs, a federal effort to help support these schools. At the time, he said that Black colleges had no reason to fear cuts.

But days later, Trump’s proposed 2026 budget included $64 million in cuts to Howard University, one of the oldest HBCUs.


In September 2025, the Trump administration redirected $435 million to HBCUs by cutting funds from grant programs that had supported Hispanic-serving institutions and other colleges that have a large proportion of Hispanic or other minority students.

People gather on Howard University’s campus during its annual homecoming event in October 2016. Cheriss May/NurPhoto via Getty Images

The context that matters

The U.S. criminal justice system disproportionately affects Black people at every stage – from arrests to incarceration. Black Americans make up about 13% of the U.S. population but account for roughly 37% of all people in U.S. jails and prisons.

According to the National Academies of Sciences, the lifetime risk of imprisonment for Black men born between 1975 and 1979, and with less than a high school education, was about 68% – meaning nearly 7 in 10 in that group experienced incarceration at least once.

I have seen firsthand that when Black students from low-income backgrounds enroll at HBCUs, they become more likely to complete a degree and achieve the kind of financial stability that research shows helps reduce the risk of becoming caught up in the criminal justice system.

Andrea Hagan, Instructor of Criminology & Justice, Loyola University New Orleans

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Bridge is a section of the STM Daily News Blog meant for diversity, offering real news stories about bona fide community efforts to perpetuate a greater good. The purpose of The Bridge is to connect the divides that separate us, fostering understanding and empathy among different groups. By highlighting positive initiatives and inspirational actions, The Bridge aims to create a sense of unity and shared purpose. This section brings to light stories of individuals and organizations working tirelessly to promote inclusivity, equality, and mutual respect. Through these narratives, readers are encouraged to appreciate the richness of diverse perspectives and to participate actively in building stronger, more cohesive communities.

https://stmdailynews.com/the-bridge



Vaccine mandates misinformation: 2 experts explain the true role of slavery and racism in the history of public health policy – and the growing threat ignorance poses today

Vaccine mandates misinformation: Florida’s vaccination rates decline as the state plans to eliminate mandates. Experts warn this could deepen health disparities, undermine public trust, and threaten community health, especially given the history of racism in vaccination practices.

Vaccination rates in Florida schools have dipped below the threshold for immunity to certain preventable diseases. Suzi Media Production/iStock via Getty Images Plus


Lauren MacIvor Thompson, Kennesaw State University and Stacie Kershner, Georgia State University

On Sept. 3, 2025, Florida announced its plans to be the first state to eliminate vaccine mandates for its citizens, including those for children to attend school.

Current Florida law and the state’s Department of Health require that children who attend day care or public school be immunized for polio, diphtheria, rubeola, rubella, pertussis and other communicable diseases. Dr. Joseph Ladapo, Florida’s surgeon general and a professor of medicine at the University of Florida, has stated that “every last one” of these decades-old vaccine requirements “is wrong and drips with disdain and slavery.”

As experts on the history of American medicine and vaccine law and policy, we took immediate note of Ladapo’s use of the word “slavery.”

There is certainly a complicated history of race and vaccines in the United States. But, in our view, invoking slavery as a way to justify the elimination of vaccines and vaccine mandates will accelerate mistrust and present a major threat to public health, especially given existing racial health disparities. It also erases Black Americans’ key work in centuries of American public health initiatives, including vaccination campaigns.

What’s clear: Vaccines and mandates save human lives

Evidence and data show that vaccines work, as do mandates, in keeping Americans healthy. The World Health Organization reported in a landmark 2024 study that vaccines have saved more than 154 million lives globally in just the past 50 years.

In the United States, vaccines for children are one of the top public health achievements of the 20th century. Rates of eight of the most common vaccine-preventable diseases in school-age children dropped by 97% or more from pre-vaccine levels, preventing an estimated 1,129,000 deaths and resulting in direct savings of US$540 billion and societal savings of $2.7 trillion.

History of vaccine mandates in the United States

Vaccine mandates in the United States date to the Colonial period and have a complex history. During the American Revolution, George Washington required that his troops undergo inoculation – the predecessor of vaccination – against smallpox.

To prevent outbreaks of this debilitating, disfiguring and deadly disease, state and local governments implemented smallpox inoculation and vaccination campaigns into the early 1900s. They targeted various groups, including enslaved people, immigrants, people living in tenement and other crowded housing conditions, manual laborers and others, forcibly vaccinating those who could not provide proof of prior vaccination.

Although religious exemptions were not recognized by law until the 1960s, some resisted these vaccination campaigns from the beginning, and 19th-century anti-vaccination societies urged the rollback of state laws requiring vaccination.

By the turn of the 20th century, however, the U.S. Supreme Court began to intervene in matters of public health and vaccination. The court upheld vaccine mandates in Jacobson v. Massachusetts in 1905, striking a balance between individual rights and the need to protect the public’s health. In Zucht v. King in 1922, the court again ruled in favor of vaccine mandates, this time for school attendance.


Vaccine mandates expanded by the middle of the 20th century to include vaccines for many dangerous childhood diseases, such as polio, measles, rubella and pertussis. When Jonas Salk’s polio vaccine became available, families waited in long lines for hours to receive it, hoping to prevent their children from having to experience paralysis or life in an iron lung.

Scientific studies in the 1970s demonstrated that state declines in measles cases were correlated with enforcement of school vaccine mandates. The federal Childhood Immunization Initiative launched in the late 1970s helped educate the public on the importance of vaccines and encouraged enforcement. All states had mandatory vaccine requirements for public school entry by 1980, and data over the past several decades continues to demonstrate the importance of these laws for public health.

Most parents also continue to support school mandates. A survey conducted in July and August 2025 by The Washington Post and the Kaiser Family Foundation finds that 81% of parents support laws requiring vaccines for school.

Black Americans’ long fight for public health equity

Despite the proven success of vaccines and the importance of vaccine mandates in maintaining high vaccination rates, there is a vocal anti-vaccine minority in the U.S. that has gained traction since the COVID-19 pandemic.

Misinformation proliferates both online and off. Some of the misinformation originates in the historical realities of vaccines and social policy in the United States.

When Ladapo, the Florida surgeon general, invoked the term “slavery” to refer to vaccine mandates, he may have been referring to the history of racism in the medical field, such as the U.S. Public Health Service Untreated Syphilis Study at Tuskegee. The study, which started in 1932 and spanned four decades, involved hundreds of Black men who were recruited without their knowledge or consent so that researchers could study the effects of untreated syphilis. Investigators misled the participants about the nature of the study and actively withheld treatment – including penicillin, which became the standard therapy in the late 1940s – in order to observe the disease’s unchecked progression in the men’s bodies.

Today, the study is remembered as one of the most egregious instances of racism and unethical experimentation in American medicine. Its participants had enrolled in the study because it was advertised as a chance to receive expert medical care but, instead, were subjected to lies and painful “treatments.”

The 40-year untreated syphilis study at Tuskegee ended in 1972. National Archives Catalog/Centers for Disease Control and Prevention

Despite these experiences in the medical system, Black Americans have long advocated for better health care, connecting it to the larger struggle for racial equality.

Vaccination is no exception. Despite the fact that they were often the subjects of forced inoculation, enslaved people helped to lead the first American public health initiatives around epidemic disease. Historians’ research on smallpox and slavery, for example, has found that inoculation was widely accepted and practiced by West Africans by the early 1700s, and that enslaved people brought the practice to the Colonies.

Although his role is often downplayed, an African man known as Onesimus introduced his enslaver Cotton Mather to inoculation.

Throughout the next century, enslaved people often continued to inoculate each other to prevent smallpox outbreaks, and enslaved and free people of African descent played critical roles in keeping their own communities as healthy as possible in the face of violence, racism and brutality. The modern Civil Rights Movement explicitly drew on this history and centered health equity for Black Americans as one of its key tenets, including working to provide access to vaccines for preventable diseases.


In our view, Ladapo’s reference to vaccines as “slavery” ignores this important and nuanced history, especially Black Americans’ role in the history of preventing communicable disease with vaccines.

Puritan slave owner Cotton Mather learned about smallpox inoculation from one of his slaves, an African man named Onesimus. benoitb/DigitalVision Vectors via Getty Images

Lessons to learn from Tuskegee

Ladapo’s word choice also runs the risk of perpetuating the rightful mistrust that continues to exist in communities of color about vaccines and the American health system more broadly. Studies show that lingering effects of Tuskegee and other instances of medical racism have had real consequences for the health and vaccination rates of Black Americans.

A large body of evidence shows the existence of persistent health disparities for Black people in the United States compared with their white counterparts, leading to shorter lifespans, higher rates of maternal and infant mortality and higher rates of communicable and chronic diseases, with worse outcomes.

Eliminating vaccine mandates in Florida and expanding exemptions in other states will continue to widen these already existing disparities that stem from past public health wrongs.

There is an opportunity here, however, for health officials, not just in Florida but across the nation, to work together to learn from the past in making American public health better for everyone.

Rather than weakening vaccine mandates, national, state and local public health guidance can focus on expanding access and communicating trustworthy information about vaccines for all Americans. Policymakers can acknowledge the complicated history of vaccines, public health and race, while also recognizing how advancements in science and medicine have given us the opportunity to eradicate many of these diseases in the United States today.

Lauren MacIvor Thompson, Assistant Professor of History and Interdisciplinary Studies, Kennesaw State University and Stacie Kershner, Deputy Director of the Center for Law, Health & Society, Georgia State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Bridge is a section of the STM Daily News Blog meant for diversity, offering real news stories about bona fide community efforts to perpetuate a greater good. The purpose of The Bridge is to connect the divides that separate us, fostering understanding and empathy among different groups. By highlighting positive initiatives and inspirational actions, The Bridge aims to create a sense of unity and shared purpose. This section brings to light stories of individuals and organizations working tirelessly to promote inclusivity, equality, and mutual respect. Through these narratives, readers are encouraged to appreciate the richness of diverse perspectives and to participate actively in building stronger, more cohesive communities.

https://stmdailynews.com/the-bridge


The Bridge

Muslim men have often been portrayed as ‘terrorists’ or ‘fanatics’ on TV shows, but Muslim-led storytelling is trying to change that narrative

Published

on

Muslim men
Hulu’s comedy-drama series ‘Ramy,’ created by actor-comedian Ramy Youssef, follows a young Egyptian-American Muslim navigating life’s challenges. Youssef, center, appears at a press conference in 2019. Frederick M. Brown/Getty Images

Tazeen M. Ali, Washington University in St. Louis

For over a century, Hollywood has tended to portray Muslim men through a remarkably narrow lens: as terrorists, villains or dangerous outsiders. From shows such as “24” and “Homeland” to procedural dramas such as “Law and Order,” this portrayal has seldom allowed for complexity or relatability. Such depictions reinforce Orientalist stereotypes – a colonial worldview that treats cultures in the East as exotic, irrational or even dangerous.

However, recent years have seen a noticeable increase in Muslim-led storytelling across platforms in the U.S. and U.K. While still a minority, these stories depart from decades of misrepresentation. As a scholar of Islam and gender who has conducted research on masculinity, sexuality and national belonging in Muslim entertainment media, I analyze a new wave of critically acclaimed shows where Muslim characters are at the center of the narrative.

Historical stereotypes

Scholar of media and race Jack Shaheen has documented the systematic vilification of Arabs and Muslims in Western media. In his 2001 book “Reel Bad Arabs,” he analyzed over a thousand films and found that the vast majority depicted Arab and Muslim men almost exclusively as fanatics, oil-rich villains and misogynists.
‘Reel Bad Arabs’ documentary.
More recently, a 2021 study from the University of Southern California’s Annenberg Inclusion Initiative looked at 200 popular movies and found that Muslim characters were either completely missing or shown as violent.

Despite the persistence of negative portrayals of Muslims on television amid rising Islamophobia, the post-9/11 climate also saw the introduction of more diverse Muslim characters. Such portrayals promoted the idea of the U.S. as a tolerant, liberal society. Scholar of popular culture Evelyn Alsultany writes that Hollywood introduced Muslim characters who were often law-abiding citizens or patriotic allies. She explains that despite these positive attempts, these characters were still depicted in simplistic ways, as either “good Muslims” or “bad Muslims.”

The “good Muslim/bad Muslim” framework was coined by scholar of postcolonialism Mahmood Mamdani to describe how Muslims are understood across this binary. The “good Muslims” distance themselves from their faith and align themselves with Western liberal values to gain acceptance. Expanding on this theme, Islamic studies scholar Samah Choudhury explains how the mainstream success of South Asian Muslim male comedians such as Hasan Minhaj, Kumail Nanjiani and Aziz Ansari is shaped by their adoption of secular ideals.

Even so-called “positive” characters, such as Muslim FBI agents or loyal informants in shows like “NCIS” or “Homeland,” ultimately served to normalize state surveillance and justify the global war on terrorism, the campaign initiated by the U.S. following the Sept. 11, 2001, terrorist attacks. These brown and sometimes Black Muslim characters are portrayed as “good” only when aligned with U.S. state power.

Effort in contemporary television

Hulu’s comedy-drama series “Ramy” is a milestone in Muslim storytelling. Created by actor-comedian Ramy Youssef, the series, which debuted in 2019, follows a young Egyptian-American Muslim navigating family, faith and relationships in New Jersey.

“Ramy” is devoid of storylines about national security. Instead, the show foregrounds its main character’s grappling with religiosity, dating and identity. Moreover, as I have argued elsewhere, the protagonist’s religious devotion is never a punchline but a part of his everyday experience. For instance, Ramy prays five times a day, both at the mosque and at home; fasts during Ramadan; and abstains from alcohol as a matter of Islamic observance. At the same time, he also partakes in hookup culture and wrestles with guilt for falling short of Islamic ideals. By showcasing this duality, the show illuminates internal debates within American Muslim communities, including on gendered norms around marriage and sexual ethics.

Across the Atlantic, the BBC comedy series “Man Like Mobeen,” created by comedian-actor Guz Khan, offers a layered portrayal of Muslim life in inner-city Birmingham, England. The show follows Mobeen, a reformed British Pakistani gangster, striving and often failing to leave his criminal past behind and live as a devout Muslim while raising his teenage sister.

The show explores the struggles of the working class. It situates Muslim communities within broader class and racial dynamics whereby working-class Black and brown men are vulnerable to racial profiling by law enforcement and to gang violence. With incisive and dark humor, it challenges British racism against Muslims and offers social and political commentary on U.K. society. This includes critiques of British far-right movements and their racism, as well as the failures of the National Health Service.

Muslim women on screen

The flip side of stereotypical portrayals of Muslim men as violent and misogynist is the equally reductive portrayal of Muslim women as passive or oppressed. When Muslim women appear on screen, they are often presented as submissive or “liberated” only by a white non-Muslim male romantic interest. This process of liberation usually involves removing their hijab or distancing themselves from Islam.

A refreshing departure from such storytelling norms can be found in the British Channel 4 comedy “We Are Lady Parts,” created by filmmaker and writer Nida Manzoor, which debuted in 2021. The show follows an all-female Muslim punk band in London. The bandmates are funny, creative and rebellious. While they defy Western views of Muslim women, they do not appear to be written solely to shatter stereotypes. They reflect the contradictions that many Muslims live with, juggling faith, identity and politics in their music. The band’s songs include feminist themes but are diverse, subverting Islamophobic stereotypes with humor in songs like “Voldemort Under My Headscarf” or lusting after a love interest in “Bashir With the Good Beard.”
‘Voldemort Under My Headscarf,’ a song from the music comedy ‘We Are Lady Parts.’
The band members are also often seen engaged in ritual prayer together, a unified display of worship among women who otherwise have very different personalities, fashion sensibilities and goals in life. The show also addresses queerness, Islamophobia and intergenerational conflict with nuance and humor.

I explore all of these themes in further detail in my forthcoming book, in which I examine how this new wave of Muslim media offers insights about the lived religious experiences of American and British Muslims.

Narrative authority

What unites these series is their rejection of reductive and stereotypical narratives. Muslim characters in these shows are not defined by violence, trauma or assimilation. Nor do they serve as spokespeople for all Muslims; they are written as flawed and evolving individuals.

This wave of nuanced portrayals of Muslim life includes other recent productions such as Netflix’s 2022 series “Mo” and Hulu’s 2025 reality series “Muslim Matchmaker,” which centers real people whose lives and romantic journeys showcase American Muslim life in authentic ways. Muslims in the show are depicted as having various professions, levels of faith and life experiences.

These series and their creators signal that real progress comes when Muslim voices are telling their own stories, not simply reacting to the gaze of outsiders or the pressures of political headlines. By foregrounding daily ritual, spiritual aspiration and even awkwardness and desire, “Ramy,” “Man Like Mobeen” and “We Are Lady Parts” all refuse the burden of “representation.” By moving away from the binary of “threatening other” versus “assimilated citizen,” these shows challenge the legacy of Orientalism, offering characters who reflect the complex realities of Muslim lives: messy, joyful and evolving.

Tazeen M. Ali, Assistant Professor of Religion and Politics, Washington University in St. Louis

This article is republished from The Conversation under a Creative Commons license. Read the original article.
