NASA Selects 21 New Learning Projects to Engage Students in STEM
Last Updated on July 27, 2024 by Daily News Staff
Credits: NASA
NASA is awarding more than $3.8 million to 21 museums, science centers, and other informal education institutions for projects designed to bring the excitement of space science to communities across the nation and broaden student participation in STEM (science, technology, engineering, and mathematics).
Projects were selected for NASA’s Teams Engaging Affiliated Museums and Informal Institutions (TEAM II) program and TEAM II Community Anchor Awards. Both are funded through NASA’s Next Generation STEM (Next Gen STEM), which supports kindergarten through 12th grade students, caregivers, and formal and informal educators in engaging the Artemis Generation in the agency’s missions and discoveries. The selected projects will engage their communities in a wide variety of STEM topics, from aeronautics and Earth science to human space exploration.
TEAM II: NASA-Based Learning Opportunities
NASA’s vision for TEAM II is to enhance the capability of informal education institutions to host NASA-based learning activities while increasing the institutions’ capacity to use innovative tools and platforms to bring NASA resources to students. The agency has selected four institutions to receive approximately $3.2 million in cooperative agreements for projects they will implement during the next three years.
The selected institutions and their proposed projects are:
- Universities Space Research Association, Columbia, Maryland
  Virtual Trips to Extreme Environments
- Michigan Science Center, Detroit, Michigan
  Urban Skies – Equitable Universe: Using Open Space to Empower Youth to Explore Their Solar System and Beyond
- Museum of Science, Boston, Massachusetts
  UNITED (Unveiling NASA’s Inspirational Tales of Exploration and Discovery)
- University Corporation for Atmospheric Research, Boulder, Colorado
  Using a Network of Ozone Bioindicator Gardens to Engage Communities on Air Quality and NASA’s TEMPO Mission
Community Anchors: Local Connections to NASA
The designation as a Community Anchor recognizes institutions as local hubs bringing NASA STEM and space science to students and families in traditionally underserved areas. The agency has selected 17 institutions to receive more than $660,000 in grants to help make these one- to two-year projects a reality, enhancing their local impact and strengthening their ability to build sustainable connections between their communities and NASA.
The selected institutions and their proposed projects are:
- St. Anna’s Episcopal Church, New Orleans, Louisiana
  Communicating Our Future For Education Expansion (COFFEE)
- Frontiers of Flight Museum, Inc., Dallas, Texas
  Youth STEM Initiative – STEM Leaders in Education
- Children’s Museum of Indianapolis, Inc., Indianapolis, Indiana
  Our Earth From Above
- Pacific Science Center Foundation, Seattle, Washington
  Connecting Youth to the Journey of Human Space Flight
- National Space Science & Technology Institute, Colorado Springs, Colorado
  Mobile Earth + Space Observatory Science Experiences for Engaging Rural Students
- Board of Regents of the University of Nebraska, Lincoln, Nebraska
  Because I’m Earth it: A NebrASkA Experience
- Pajarito Environmental Education Center, Los Alamos, New Mexico
  Exploring STEM Opportunities from New Mexico to the Solar System
- Scienceworks Hands-On Museum, Ashland, Oregon
  ScienceWorks Robotics in Space Program
- City of Manhattan, Kansas
  Flying Cleaner and Faster: Connecting Kansas Kids to the Future of Aviation
- Northern Kentucky University, Highland Heights, Kentucky
  Afterschool NASA Production Club
- Utah State University, Logan, Utah
  4-H Moon to Mars Tetrathlon
- New York Hall of Science, Queens, New York
  Connecting Communities to Real Time Astronomy Phenomena: Solar Eclipse 2024
- Monterey Institute for Research In Astronomy, Marina, California
  MIRA la Luna: Igniting Interest in STEM for Middle School Students of the Salinas Valley
- Infinity Science Center, Inc., Pearlington, Mississippi
  Outreach STEM Education: Bringing NASA STEM Education to local communities through local county library systems and INFINITY Science Center
- Sierra Nevada Journeys, Reno, Nevada
  NASA Family STEM Nights
- Union Station Kansas City, Inc., Kansas City, Missouri
  Union Station Kansas City Inc NASA Team II Proposal
- Eugene Science Center Inc., Eugene, Oregon
  Sky’s The Limit: Access to Portable Planetarium Experiences for Rural and Title I Schools to Address Disparity in STEM Proficiency
Next Gen STEM is a project within NASA’s Office of STEM Engagement, which develops unique resources and experiences to spark student interest in STEM and build a skilled and diverse next generation workforce. For the latest NASA STEM events, activities, and news, visit:
Source: NASA
Our Lifestyle section on STM Daily News is a hub of inspiration and practical information, offering a range of articles that touch on various aspects of daily life. From tips on family finances to guides for maintaining health and wellness, we strive to empower our readers with knowledge and resources to enhance their lifestyles. Whether you’re seeking outdoor activity ideas, fashion trends, or travel recommendations, our lifestyle section has got you covered. Visit us today at https://stmdailynews.com/category/lifestyle/ and embark on a journey of discovery and self-improvement.
Discover more from Daily News
Subscribe to get the latest posts sent to your email.
Discord Launches Teen-by-Default Settings Globally: What’s Changing (and Why It Matters)
Discord is launching teen-by-default settings globally in early March, adding privacy-forward age assurance, tighter access to age-gated spaces, and new default messaging and content filters.
Discord is rolling out a major shift in how its platform handles teen safety: teen-appropriate settings will become the default experience for all new and existing users worldwide, with age verification required to unlock certain settings and access sensitive or age-gated spaces.
The update is set to begin as a phased global rollout in early March, and Discord says the goal is to strengthen age-appropriate protections while still preserving the privacy, community, and meaningful connection that have made the platform a go-to for gaming and interest-based groups.
Teen-by-default, globally (starting in March)
Discord says the new defaults will apply to all users, not just new signups. In practice, that means accounts will start with a more protective baseline, and verified adults will have more flexibility to adjust settings or access age-restricted content.
Discord is also introducing an age-verification (age assurance) step that may be required to:
- Change certain communication settings
- Access sensitive content
- Enter age-restricted channels, servers, or commands
- Use select message request features
“Nowhere is our safety work more important than when it comes to teen users,” said Savannah Badalich, Head of Product Policy at Discord, adding that the company is building on its existing safety architecture with teen safety principles at the core.
Privacy-forward age assurance: how Discord says it will work
A big part of the announcement is Discord’s attempt to thread the needle between safety and privacy.
Users will be able to choose from multiple methods, including:
- Facial age estimation (video selfie)
- Submitting identification to vendor partners
Discord also says it will implement its age inference model, a background system designed to help determine whether an account belongs to an adult without always requiring users to verify their age. Some users may be asked to use multiple methods if more information is needed to assign an age group.
Discord highlighted several privacy protections in its approach:
- On-device processing: Video selfies for facial age estimation never leave a user’s device.
- Quick deletion: Identity documents submitted to vendor partners are deleted quickly (in most cases, immediately after age confirmation).
- Straightforward verification: In most cases, users complete the process once and their Discord experience adapts to their verified age group.
- Private status: A user’s age verification status cannot be seen by other users.
After completing a chosen method, Discord says users will receive confirmation via a direct message from Discord’s official account. A user’s assigned age group can also be viewed in My Account settings, and users can appeal by retrying the process.
Discord also notes it prompts users to age-assure only within Discord and currently does not send emails or text messages about its age assurance process or results.
What’s changing in the default safety settings
Starting in early March, Discord says it will assign new default settings designed to support age-appropriate experiences while keeping privacy front and center. Highlights include:
- Content filters: Users must be age-assured as adults to unblur sensitive content or turn the setting off.
- Age-gated spaces: Only age-assured adults can access age-restricted channels, servers, and app commands.
- Message Request Inbox: DMs from people a user may not know are routed to a separate inbox by default; only age-assured adults can modify this setting.
- Friend request alerts: People will receive warning prompts for friend requests from users they may not know.
- Stage restrictions: Only age-assured adults may speak on stage in servers.
Discord notes it launched a teen-by-default experience in the UK and Australia last year, and says this global rollout builds on that approach to deliver consistent protections worldwide.
Giving teens a seat at the table: Discord Teen Council
Along with the safety updates, Discord also announced recruitment for its inaugural Discord Teen Council, a teen advisory body intended to bring authentic teen perspectives into how Discord shapes their experience.
Discord says the Teen Council will consist of 10–12 teens and will help inform future product features, policies, and educational resources.
- Who can apply: Teens ages 13–17
- Apply by: May 1, 2026
The bigger picture
Discord says these updates build on its broader safety ecosystem, including tools and resources such as Family Center, Teen Safety Assist, a Warning System, and more.
Whether you’re a parent, a teen user, or an adult who uses Discord for gaming communities and group chats, the headline is simple: the default experience is becoming more restrictive, and adult access will increasingly depend on age assurance.
Source: PRNewswire
Dive into “The Knowledge,” where curiosity meets clarity. This playlist, in collaboration with STMDailyNews.com, is designed for viewers who value historical accuracy and insightful learning. Our short videos, ranging from 30 seconds to a minute and a half, make complex subjects easy to grasp in no time. Covering everything from historical events to contemporary processes and entertainment, “The Knowledge” bridges the past with the present. In a world where information is abundant yet often misused, our series aims to guide you through the noise, preserving vital knowledge and truths that shape our lives today. Perfect for curious minds eager to discover the ‘why’ and ‘how’ of everything around us. Subscribe and join in as we explore the facts that matter. https://stmdailynews.com/the-knowledge/
Special Education Is Turning to AI to Fill Staffing Gaps—But Privacy and Bias Risks Remain
With special education staffing shortages worsening, schools are using AI to draft IEPs, support training, and assist assessments. Experts warn the benefits come with major risks—privacy, bias, and trust.
Seth King, University of Iowa
In special education in the U.S., funding is scarce and personnel shortages are pervasive, leaving many school districts struggling to hire qualified and willing practitioners.
Amid these long-standing challenges, there is rising interest in using artificial intelligence tools to help close some of the gaps that districts currently face and lower labor costs.
Over 7 million children receive federally funded entitlements under the Individuals with Disabilities Education Act, which guarantees students access to instruction tailored to their unique physical and psychological needs, as well as legal processes that allow families to negotiate support. Special education involves a range of professionals, including rehabilitation specialists, speech-language pathologists and classroom teaching assistants. But these specialists are in short supply, despite the proven need for their services.
As an associate professor in special education who works with AI, I see its potential and its pitfalls. While AI systems may be able to reduce administrative burdens, deliver expert guidance and help overwhelmed professionals manage their caseloads, they can also present ethical challenges – ranging from machine bias to broader issues of trust in automated systems. They also risk amplifying existing problems with how special ed services are delivered.
Yet some in the field are opting to test out AI tools, rather than waiting for a perfect solution.
A faster IEP, but how individualized?
AI is already shaping special education planning, personnel preparation and assessment.
One example is the individualized education program, or IEP, the primary instrument for guiding which services a child receives. An IEP draws on a range of assessments and other data to describe a child’s strengths, determine their needs and set measurable goals. Every part of this process depends on trained professionals.
But persistent workforce shortages mean districts often struggle to complete assessments, update plans and integrate input from parents. Most districts develop IEPs using software that requires practitioners to choose from a generalized set of rote responses or options, leading to a level of standardization that can fail to meet a child’s true individual needs.
Preliminary research has shown that large language models such as ChatGPT can be adept at generating key special education documents such as IEPs by drawing on multiple data sources, including information from students and families. Chatbots that can quickly craft IEPs could potentially help special education practitioners better meet the needs of individual children and their families. Some professional organizations in special education have even encouraged educators to use AI for documents such as lesson plans.
Training and diagnosing disabilities
There is also potential for AI systems to help support professional training and development. My own work on personnel development combines several AI applications with virtual reality to enable practitioners to rehearse instructional routines before working directly with children. Here, AI can function as a practical extension of existing training models, offering repeated practice and structured support in ways that are difficult to sustain with limited personnel.
Some districts have begun using AI for assessments, which can involve a range of academic, cognitive and medical evaluations. AI applications that pair automatic speech recognition and language processing are now being employed in computer-mediated oral reading assessments to score tests of student reading ability.
Practitioners often struggle to make sense of the volume of data that schools collect. AI-driven machine learning tools also can help here, by identifying patterns that may not be immediately visible to educators for evaluation or instructional decision-making. Such support may be especially useful in diagnosing disabilities such as autism or learning disabilities, where masking, variable presentation and incomplete histories can make interpretation difficult. My ongoing research shows that current AI can make predictions based on data likely to be available in some districts.
Privacy and trust concerns
There are serious ethical – and practical – questions about these AI-supported interventions, ranging from risks to students’ privacy to machine bias and deeper issues tied to family trust. Some hinge on the question of whether or not AI systems can deliver services that truly comply with existing law.
The Individuals with Disabilities Education Act requires nondiscriminatory methods of evaluating disabilities to avoid inappropriately identifying students for services or neglecting to serve those who qualify. And the Family Educational Rights and Privacy Act explicitly protects students’ data privacy and the rights of parents to access and hold their children’s data.
What happens if an AI system uses biased data or methods to generate a recommendation for a child? What if a child’s data is misused or leaked by an AI system? Using AI systems to perform some of the functions described above puts families in a position where they are expected to put their faith not only in their school district and its special education personnel, but also in commercial AI systems, the inner workings of which are largely inscrutable.
These ethical qualms are hardly unique to special ed; many have been raised in other fields and addressed by early adopters. For example, while automatic speech recognition, or ASR, systems have struggled to accurately assess accented English, many vendors now train their systems to accommodate specific ethnic and regional accents.
But ongoing research work suggests that some ASR systems are limited in their capacity to accommodate speech differences associated with disabilities, account for classroom noise, and distinguish between different voices. While these issues may be addressed through technical improvement in the future, they are consequential at present.
Embedded bias
At first glance, machine learning models might appear to improve on traditional clinical decision-making. Yet AI models must be trained on existing data, meaning their decisions may continue to reflect long-standing biases in how disabilities have been identified.
Indeed, research has shown that AI systems are routinely hobbled by biases within both training data and system design. AI models can also introduce new biases, either by missing subtle information revealed during in-person evaluations or by overrepresenting characteristics of groups included in the training data.
Such concerns, defenders might argue, are addressed by safeguards already embedded in federal law. Families have considerable latitude in what they agree to, and can opt for alternatives, provided they are aware they can direct the IEP process.
By the same token, using AI tools to build IEPs or lessons may seem like an obvious improvement over underdeveloped or perfunctory plans. Yet true individualization would require feeding protected data into large language models, which could violate privacy regulations. And while AI applications can readily produce better-looking IEPs and other paperwork, this does not necessarily result in improved services.
Filling the gap
Indeed, it is not yet clear whether AI provides a standard of care equivalent to the high-quality, conventional treatment to which children with disabilities are entitled under federal law.
The Supreme Court in 2017 rejected the notion that the Individuals with Disabilities Education Act merely entitles students to trivial, “de minimis” progress, which weakens one of the primary rationales for pursuing AI – that it can meet a minimum standard of care and practice. And since AI really has not been empirically evaluated at scale, it has not been proved that it adequately meets the low bar of simply improving beyond the flawed status quo.
But this does not change the reality of limited resources. For better or worse, AI is already being used to fill the gap between what the law requires and what the system actually provides.
Seth King, Associate Professor of Special Education, University of Iowa
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Smart Gaming: How Parents Can Keep Kids Safe Online
Parents can enhance kids’ safety during online gaming by using privacy settings, researching games, enabling age checks, keeping personal information private, and utilizing parental controls and security tools.
Last Updated on January 21, 2026 by Daily News Staff
(Family Features) Playing video games can be a fun, social experience. However, online gaming also poses real risks, especially for kids. As a parent, you don’t necessarily need to be a gamer yourself to help keep your children safe when the controller is in their hands.
Consider taking proactive steps like these to create a healthy online gaming environment for kids of all ages.
Check System Privacy Settings
As a first line of defense – before your child even starts gaming – spend some time in the device or console privacy settings. Here you can turn off sharing, disable location tracking, limit microphone and camera access and restrict how other users can interact with your child’s profile. Similarly, many games and platforms include built-in privacy settings that can be tailored to your child’s age and online experience. These settings may allow you to limit who can view your child’s profile or send a friend request, message or voice chat.
Research Games
Because not all games are created equal, look up game ratings through a service such as ESRB before buying or downloading to understand the maturity level of the game and determine if it’s appropriate for your child. To take it a step further, read reviews from other parents or watch gameplay videos to see if you deem not only the content but also the social interaction acceptable.
Use Facial Age Estimation
Online platforms are increasingly looking for ways to keep users safe, and that includes added levels of verification. As part of a multilayered approach to safety, Roblox is the first online gaming platform to require age checks for users of all ages to access chat features, enabling age-appropriate communication and limiting conversations between adults and minors. These age checks are designed to be fast, easy and secure, using Facial Age Estimation technology directly within the app.
“Our commitment to safety is rooted in delivering the highest level of protection for our users,” said Matt Kaufman, chief safety officer at Roblox. “By building proactive, age-based barriers, we can empower users to create and connect in ways that are both safe and appropriate.”
Once age-checked, users are assigned to one of six age groups: under 9, 9-12, 13-15, 16-17, 18-20 or 21 and older, ensuring conversations are safe and age appropriate. Age checks are optional; however, features like chat will not be accessible unless an age check is completed. Chat is also turned off by default for children under age 9, unless a parent provides consent after an age check.
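For readers curious how the six-bucket assignment above works in practice, it amounts to a simple threshold lookup. The sketch below is illustrative only: the function name and boundary handling are assumptions for this example, not Roblox's actual implementation.

```python
def age_group(age: int) -> str:
    """Map a verified age to one of the six age groups described above.

    Illustrative sketch only; not Roblox's actual code. Assumes `age`
    is a whole number of years (0 or greater).
    """
    if age < 9:
        return "under 9"
    elif age <= 12:
        return "9-12"
    elif age <= 15:
        return "13-15"
    elif age <= 17:
        return "16-17"
    elif age <= 20:
        return "18-20"
    else:
        return "21 and older"

# Example: a 12-year-old lands in the "9-12" group, so chat is on by
# default but limited to age-appropriate conversations.
print(age_group(12))
```

The point of the bucketing is that the platform never needs to store or act on an exact age once the check is done; only the group matters for which features unlock.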
Keep Personal Information Private
It’s seldom a bad idea to be extra cautious when interacting with strangers online, even if they seem friendly enough while playing the game. Teach children what information not to share, including their full name, address, birthday, school name, phone number, email address, passwords or any photos that may contain any personal information (like a house number or school logo) in the background. Also encourage a screen name and generic avatar for added privacy.
Turn on Parental Controls
Designed to allow parents a supervisory role in their child’s online gaming experience, parental controls on many platforms include the ability to set schedules and limit playtime, restrict access to certain content or social features, require a password for purchases or set a spending limit.
Avoid Clicking Unfamiliar Links
Player profiles and in-game chats may include links to external sites, including those promising rewards or cheat codes. Because they can be used to gain access to personal information, remind your children to ask an adult before clicking any unfamiliar links while gaming so they can be verified as trustworthy.
Employ Privacy and Security Tools
While system or console-specific settings allow parents to set content restrictions, approve downloads, manage friends lists and more, additional layers of security are sometimes necessary. Extra safeguards such as antivirus and internet security software, DNS (domain name system) filtering and two-factor authentication can also be enabled to help keep kids safe online.
For more tools to help parents make informed decisions and support their children’s gaming experience, visit corp.roblox.com/safety.
Photo courtesy of Shutterstock (father and daughter playing video game)
Source: Roblox
