Tech

T-Mobile, MeetMo, and NantStudios Win Prestigious 2025 Lumiere Award for Revolutionary Las Vegas Grand Prix Formula One Fan Experience


Radiant Images’ 360° 12K plate-capture vehicle. (Image: MeetMo)

The world of motorsports just took a giant leap into the future! Excitement is in the air as T-Mobile, MeetMo, and NantStudios have clinched the illustrious 2025 Lumiere Award for Best Interactive Experience from the Advanced Imaging Society. This accolade is in recognition of their pioneering immersive video experience for fans at the celebrated Las Vegas Grand Prix!

A Game-Changing Experience

Imagine being able to step onto a racetrack from the comfort of your own home, enveloped in a 360-degree augmented reality tour of the circuit, all captured in breathtaking 12K footage. Thanks to this remarkable collaboration, fans can now enjoy a race experience like never before, made possible by a spectacular fusion of 5G technology, virtual production, and artificial intelligence.


“By combining T-Mobile’s 5G Advanced Network Solutions with our real-time collaboration technology, we’ve created an immersive experience that brings fans closer to the action than ever before,” said Michael Mansouri, CEO of Radiant Images and MeetMo. His enthusiasm is shared by many, as this innovative project is seen as a quantum leap forward in the way motorsports are experienced.

The Technical Marvel Behind the Magic

Highlighting their technological finesse, the project transformed over 1.5TB of data into a stunningly interactive experience in mere hours—a feat that previously would have taken months. The journey began at the NantStudios headquarters in Los Angeles, where more than 10 minutes of ultra-high definition, immersive sequences were blended with telemetry and driver animation data captured tirelessly by Radiant Images’ crews in Las Vegas.

The astounding speed and efficiency were primarily powered by T-Mobile’s robust 5G infrastructure, allowing for rapid data transfers back and forth, ensuring seamless integration into the interactive app that fans could access. Chris Melus, VP of Product Management for T-Mobile’s Business Group, proudly remarked, “This collaboration broke new ground for immersive fan engagement.”

The Power of 5G

The integration of T-Mobile’s advanced network solutions turned the Las Vegas Grand Prix into a case study of innovation. With real-time capture and transmission capabilities utilizing Radiant Images’ cutting-edge 360° 12K camera car, production crews were able to capture immersive video feeds and transmit them instantaneously over the 5G network. This meant remote camera control and instant footage reviews, drastically cutting production time and resources.

Moreover, the seamless AR integration—thanks to the creative minds at NantStudios and their work with Unreal Engine—allowed the blending of virtual and real-world elements. Fans were treated to augmented reality overlays displaying real-time data, such as dashboard metrics and telemetry, all transmitted through the reliable 5G network.

Future of Fan Engagement

As Jim Chabin, President of the Advanced Imaging Society, eloquently noted, the remarkable work at the Las Vegas Grand Prix has set new standards for interactive sports entertainment. The recognition given to this innovative team underscores their commitment to pushing the envelope in immersive experiences.


Gary Marshall, Vice President of Virtual Production at NantStudios, also highlighted the project’s importance: “This recognition underscores NantStudios’ legacy of pioneering real-time VFX and virtual production achievements, reaffirming our position as a leader in modern virtual production.”

F1 Las Vegas Grand Prix Fan Experience – Drive the Las Vegas Grand Prix Strip Circuit

The 2025 Lumiere Award is not just a trophy; it symbolizes the melding of creativity and technology in a way that elevates the fan experience to new heights. The collaboration between T-Mobile, MeetMo, and NantStudios exemplifies a thrilling future where motorsports become more accessible, engaging, and immersive. It’s an exciting time to be a fan, and the development teams behind this innovation have truly set a new standard for content creators everywhere.

With such defining moments in sports entertainment, we can’t help but wonder what spectacular innovations lie ahead. Buckle up; it’s going to be a wild ride!

About the Companies

MeetMo
MeetMo.io is revolutionizing how creative professionals collaborate by combining video conferencing, live streaming, and AI automation into a single, intuitive platform. With persistent virtual meeting rooms that adapt to users over time, our platform evolves into a true collaborative partner, enhancing creativity and productivity. For more information please visit: https://www.meetmo.io

Radiant Images
Radiant Images is a globally acclaimed, award-winning technology provider specializing in innovative tools and solutions for the media and entertainment industries. The company focuses on advancing cinema, immersive media, and live production. https://www.radiantimages.com

T-Mobile
T-Mobile US, Inc. (NASDAQ: TMUS) is America’s supercharged Un-carrier, delivering an advanced 4G LTE and transformative nationwide 5G network that will offer reliable connectivity for all. T-Mobile’s customers benefit from its unmatched combination of value and quality, unwavering obsession with offering them the best possible service experience and indisputable drive for disruption that creates competition and innovation in wireless and beyond. Based in Bellevue, Wash., T-Mobile provides services through its subsidiaries and operates its flagship brands, T-Mobile, Metro by T-Mobile and Mint Mobile. For more information please visit: https://www.t-mobile.com


NantStudios
NantStudios is the first real time-native, full-service production house; re-imagined from the ground up to deliver exceptional creative results through next generation technologies like Virtual Production. For more information please visit: https://nantstudios.com

SOURCE MeetMo

Looking for an entertainment experience that transcends the ordinary? Look no further than STM Daily News Blog’s vibrant Entertainment section. Immerse yourself in the captivating world of indie films, streaming and podcasts, movie reviews, music, expos, venues, and theme and amusement parks. Discover hidden cinematic gems, binge-worthy series and addictive podcasts, gain insights into the latest releases with our movie reviews, explore the latest trends in music, dive into the vibrant atmosphere of expos, and embark on thrilling adventures in breathtaking venues and theme parks. Join us at STM Entertainment and let your entertainment journey begin! https://stmdailynews.com/category/entertainment/




Hal Machina is a passionate writer, blogger, and self-proclaimed journalist who explores the intersection of science, tech, and futurism. Join him on a journey into innovative ideas and groundbreaking discoveries!


Tech

How close are quantum computers to being really useful? Podcast

Quantum computers could revolutionize science by solving complex problems. However, scaling and error correction remain significant challenges before achieving practical applications.


Image: Audio und verbung/Shutterstock

Gemma Ware, The Conversation

Quantum computers have the potential to solve big scientific problems that are beyond the reach of today’s most powerful supercomputers, such as discovering new antibiotics or developing new materials.

But to achieve these breakthroughs, quantum computers will need to perform better than today’s best classical computers at solving real-world problems. And they’re not quite there yet. So what is still holding quantum computing back from becoming useful?

In this episode of The Conversation Weekly podcast, we speak to quantum computing expert Daniel Lidar at the University of Southern California in the US about what problems scientists are still wrestling with when it comes to scaling up quantum computing, and how close they are to overcoming them.

https://cdn.theconversation.com/infographics/561/4fbbd099d631750693d02bac632430b71b37cd5f/site/index.html

Quantum computers harness the power of quantum mechanics, the laws that govern subatomic particles. Instead of the classical bits of information used by microchips inside traditional computers, which are either a 0 or a 1, the chips in quantum computers use qubits, which can be both 0 and 1 at the same time or anywhere in between. Daniel Lidar explains:

“Put a lot of these qubits together and all of a sudden you have a computer that can simultaneously represent many, many different possibilities …  and that is the starting point for the speed up that we can get from quantum computing.”
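To make that concrete, here is a minimal Python sketch (illustrative only, not a real quantum simulator) of why the number of simultaneously represented possibilities grows so fast: a single qubit is described by two complex amplitudes, and every extra qubit doubles how many amplitudes are needed.

```python
import numpy as np

# A qubit is a normalized pair of complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition
print(np.abs(plus) ** 2)  # [0.5 0.5]: equal chance of measuring 0 or 1

# n qubits together are described by 2**n amplitudes, one per bit string.
for n in (1, 2, 10, 30):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```

Thirty qubits already take more than a billion amplitudes to describe, which is why even modest quantum machines are hard for classical computers to simulate.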

Faulty qubits

One of the biggest problems scientists face is how to scale up quantum computing power. Qubits are notoriously prone to errors – which means that they can quickly revert to being either a 0 or a 1, and so lose their advantage over classical computers.

Scientists have focused on trying to solve these errors through the concept of redundancy – linking strings of physical qubits together into what’s called a “logical qubit” to try and maximise the number of steps in a computation. And, little by little, they’re getting there.
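Real quantum error correction (such as the surface codes Google has demonstrated) is far more sophisticated, but the underlying redundancy idea can be illustrated with a classical toy: copy a bit several times and take a majority vote. In the Python sketch below, the per-copy error rate is an assumption chosen purely for illustration; the point is that the voted result fails less often than any single copy.

```python
import random

def logical_error_rate(p_physical, n_copies=3, trials=50_000):
    """Estimate how often a majority vote over n noisy copies is wrong."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p_physical for _ in range(n_copies))
        if flips > n_copies // 2:  # more than half the copies corrupted
            failures += 1
    return failures / trials

p = 0.05  # assumed per-copy error rate, for illustration only
for n in (1, 3, 5, 7):
    print(f"{n} copies -> logical error rate ~ {logical_error_rate(p, n):.4f}")
```

This is the flavour of the “beyond breakeven” result described below: once errors are rare enough, adding redundancy makes the logical unit more reliable than its parts.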


In December 2024, Google announced that its new quantum chip, Willow, had demonstrated what’s called “beyond breakeven”, when its logical qubits worked better than the constituent parts and even kept on improving as it scaled up.

Lidar says right now the development of this technology is happening very fast:

“For quantum computing to scale and to take off is going to still take some real science breakthroughs, some real engineering breakthroughs, and probably overcoming some yet unforeseen surprises before we get to the point of true quantum utility. With that caution in mind, I think it’s still very fair to say that we are going to see truly functional, practical quantum computers kicking into gear, helping us solve real-life problems, within the next decade or so.”

Listen to Lidar explain more about how quantum computers and quantum error correction work on The Conversation Weekly podcast.


This episode of The Conversation Weekly was written and produced by Gemma Ware with assistance from Katie Flood and Mend Mariwany. Sound design was by Michelle Macklem, and theme music by Neeta Sarl.

Clips in this episode from Google Quantum AI and 10 Hours Channel.

You can find us on Instagram at theconversationdotcom or via e-mail. You can also subscribe to The Conversation’s free daily e-mail here.

Listen to The Conversation Weekly via any of the apps listed above, download it directly via our RSS feed or find out how else to listen here.


Gemma Ware, Host, The Conversation Weekly Podcast, The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Tech

Why building big AIs costs billions – and how Chinese startup DeepSeek dramatically changed the calculus


DeepSeek burst on the scene – and may be bursting some bubbles. AP Photo/Andy Wong

Ambuj Tewari, University of Michigan

State-of-the-art artificial intelligence systems like OpenAI’s ChatGPT, Google’s Gemini and Anthropic’s Claude have captured the public imagination by producing fluent text in multiple languages in response to user prompts. Those companies have also captured headlines with the huge sums they’ve invested to build ever more powerful models.

An AI startup from China, DeepSeek, has upset expectations about how much money is needed to build the latest and greatest AIs. In the process, they’ve cast doubt on the billions of dollars of investment by the big AI players.

I study machine learning. DeepSeek’s disruptive debut comes down not to any stunning technological breakthrough but to a time-honored practice: finding efficiencies. In a field that consumes vast computing resources, that has proved to be significant.

Where the costs are

Developing such powerful AI systems begins with building a large language model. A large language model predicts the next word given previous words. For example, if the beginning of a sentence is “The theory of relativity was discovered by Albert,” a large language model might predict that the next word is “Einstein.” Large language models are trained to become good at such predictions in a process called pretraining.
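As a toy illustration of next-word prediction, here is a bigram model in Python – a drastically simplified stand-in for a large language model – that predicts the next word from counts of which word followed which in a tiny corpus.

```python
from collections import Counter, defaultdict

corpus = ("the theory of relativity was discovered by albert einstein . "
          "the theory of evolution was proposed by charles darwin .").split()

# Count how often each word follows each previous word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(predict_next("albert"))  # {'einstein': 1.0}
print(predict_next("theory"))  # {'of': 1.0}
```

A real large language model does the same job with a neural network, a vocabulary of tens of thousands of tokens and training text on the order of trillions of words.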

Pretraining requires a lot of data and computing power. The companies collect data by crawling the web and scanning books. Computing is usually powered by graphics processing units, or GPUs. Why graphics? It turns out that both computer graphics and the artificial neural networks that underlie large language models rely on the same area of mathematics known as linear algebra. Large language models internally store hundreds of billions of numbers called parameters or weights. It is these weights that are modified during pretraining.

https://www.youtube.com/embed/MJQIQJYxey4?wmode=transparent&start=0

Large language models consume huge amounts of computing resources, which in turn means lots of energy.
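The connection to linear algebra is easy to see in code. The short sketch below (layer sizes are made up for illustration) shows the operation GPUs spend nearly all their time on when training or running a network: multiplying activations by a weight matrix.

```python
import numpy as np

# One layer of a neural network transforms an activation vector with a
# weight matrix; GPUs are built to do this multiply massively in parallel.
W = np.random.randn(4096, 4096).astype(np.float32)  # one layer's weights
x = np.random.randn(4096).astype(np.float32)        # input activations
y = np.maximum(W @ x, 0.0)                          # matrix multiply + ReLU
print(W.size, "weights in just this one layer")     # ~16.8 million
```

Stack dozens of such layers and widen the matrices, and the parameter count quickly climbs into the billions.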

Pretraining is, however, not enough to yield a consumer product like ChatGPT. A pretrained large language model is usually not good at following human instructions. It might also not be aligned with human preferences. For example, it might output harmful or abusive language, both of which are present in text on the web.

The pretrained model therefore usually goes through additional stages of training. One such stage is instruction tuning where the model is shown examples of human instructions and expected responses. After instruction tuning comes a stage called reinforcement learning from human feedback. In this stage, human annotators are shown multiple large language model responses to the same prompt. The annotators are then asked to point out which response they prefer.
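One common way to turn those human preferences into a training signal is the Bradley-Terry formulation used in InstructGPT-style reinforcement learning from human feedback (individual labs’ exact recipes vary): train a reward model so the preferred response scores higher than the rejected one. A minimal sketch:

```python
import math

def preference_loss(r_preferred, r_rejected):
    """Bradley-Terry style loss for training a reward model from a human
    preference pair: small when the preferred response already scores
    higher, large when the model ranks the pair the wrong way round."""
    return -math.log(1.0 / (1.0 + math.exp(r_rejected - r_preferred)))

print(preference_loss(2.0, -1.0))  # ~0.049: agrees with the human
print(preference_loss(-1.0, 2.0))  # ~3.049: disagrees, so a large penalty
```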


It is easy to see how costs add up when building an AI model: hiring top-quality AI talent, building a data center with thousands of GPUs, collecting data for pretraining, and running pretraining on GPUs. Additionally, there are costs involved in data collection and computation in the instruction tuning and reinforcement learning from human feedback stages.

All included, costs for building a cutting-edge AI model can soar up to US$100 million. GPU training is a significant component of the total cost.

The expenditure does not stop when the model is ready. When the model is deployed and responds to user prompts, it uses more computation, known as test-time or inference-time compute. Test-time compute also needs GPUs. In December 2024, OpenAI announced a new phenomenon they saw with their latest model, o1: as test-time compute increased, the model got better at logical reasoning tasks such as math olympiad and competitive coding problems.

Slimming down resource consumption

Thus it seemed that the path to building the best AI models in the world was to invest in more computation during both training and inference. But then DeepSeek entered the fray and bucked this trend.

DeepSeek sent shockwaves through the tech financial ecosystem.

Their V-series models, culminating in the V3 model, used a series of optimizations to make training cutting-edge AI models significantly more economical. Their technical report states that it took them less than $6 million to train V3. They admit that this cost does not include the costs of hiring the team, doing the research, trying out various ideas and collecting data. But $6 million is still an impressively small figure for training a model that rivals leading AI models developed at much higher cost.

The reduction in costs was not due to a single magic bullet. It was a combination of many smart engineering choices including using fewer bits to represent model weights, innovation in the neural network architecture, and reducing communication overhead as data is passed around between GPUs.
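The first of those choices – fewer bits per weight – is straightforward to demonstrate. DeepSeek’s technical report describes low-precision FP8 training; the numpy sketch below uses a simpler int8 scheme purely to illustrate the memory saving, and the small round-trip error quantization trades for it.

```python
import numpy as np

def quantize_int8(w):
    """Store float32 weights as int8 plus one float scale per tensor --
    an illustration of 'fewer bits per weight', not DeepSeek's scheme."""
    scale = np.abs(w).max() / 127.0
    return np.round(w / scale).astype(np.int8), scale

w = np.random.randn(4096).astype(np.float32)   # a mock row of weights
q, scale = quantize_int8(w)
restored = q.astype(np.float32) * scale
print(f"{w.nbytes} bytes -> {q.nbytes} bytes, "
      f"mean round-trip error {np.abs(w - restored).mean():.5f}")
```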

It is interesting to note that due to U.S. export restrictions on China, the DeepSeek team did not have access to high-performance GPUs like the Nvidia H100. Instead they used Nvidia H800 GPUs, which Nvidia designed to have lower performance so that they comply with U.S. export restrictions. Working with this limitation seems to have unleashed even more ingenuity from the DeepSeek team.


DeepSeek also innovated to make inference cheaper, reducing the cost of running the model. Moreover, they released a model called R1 that is comparable to OpenAI’s o1 model on reasoning tasks.

They released all the model weights for V3 and R1 publicly. Anyone can download and further improve or customize their models. Furthermore, DeepSeek released their models under the permissive MIT license, which allows others to use the models for personal, academic or commercial purposes with minimal restrictions.

Resetting expectations

DeepSeek has fundamentally altered the landscape of large AI models. An open weights model trained economically is now on par with more expensive and closed models that require paid subscription plans.

The research community and the stock market will need some time to adjust to this new reality.

Ambuj Tewari, Professor of Statistics, University of Michigan

This article is republished from The Conversation under a Creative Commons license. Read the original article.


STM Daily News is a vibrant news blog dedicated to sharing the brighter side of human experiences. Emphasizing positive, uplifting stories, the site focuses on delivering inspiring, informative, and well-researched content. With a commitment to accurate, fair, and responsible journalism, STM Daily News aims to foster a community of readers passionate about positive change and engaged in meaningful conversations. Join the movement and explore stories that celebrate the positive impacts shaping our world.

https://stmdailynews.com/



Lifestyle

Biden helped bring science out of the lab and into the community − emphasizing research focused on solutions


Biden began his presidency in the throes of the COVID-19 pandemic. Evan Vucci/AP Photo

Arthur Daemmrich, Arizona State University

President Joe Biden was inaugurated in January 2021 amid a devastating pandemic, with over 24 million COVID-19 cases and more than 400,000 deaths in the U.S. recorded at that point.

Operation Warp Speed, initiated by the Trump administration in May 2020, meant an effective vaccine was becoming available. Biden quickly announced a plan to immunize 100 million Americans over the next three months. By the end of April 2021, 145 million Americans – nearly half the population – had received one vaccine dose, and 103 million were considered fully vaccinated. Science and technology policymakers celebrated this coordination across science, industry and government to address a real-world crisis as a 21st-century Manhattan Project.

From my perspective as a scholar of science and technology policy, Biden’s legacy includes structural, institutional and practical changes to how science is conducted. Building on approaches developed over the course of many years, the administration elevated the status of science in the government and fostered community participation in research.

Raising science’s profile in government

The U.S. has no single ministry of science and technology. Instead, agencies and offices across the executive branch carry out scientific research at several national labs and fund research by other institutions. By elevating the White House Office of Science and Technology Policy to a Cabinet-level organization for the first time in its history, Biden gave the agency greater influence in federal decision-making and coordination.

Formally established in 1976, the agency provides the president and senior staff with scientific and technical advice, bringing science to bear on executive policies. Biden’s inclusion of the agency’s director in his Cabinet was a strong signal about the elevated role science and technology would play in the administration’s solutions to major societal challenges.

Under Biden, the Office of Science and Technology Policy established guidelines that agencies across the government would follow as they implemented major legislation. This included developing technologies that remove carbon dioxide from the atmosphere to address climate change, rebuilding America’s chip industry, and managing the rollout of AI technologies.

The CHIPS and Science Act of 2022 boosted research and manufacture of semiconductor chips in the U.S. Narumon Bowonkitwanchai/Moment via Getty Images

Instead of treating the ethical and societal dimensions of scientific and technological change as separate from research and development, the agency advocated for a more integrated approach. This was reflected in the appointment of social scientist Alondra Nelson as the agency’s first deputy director for science and society, and science policy expert Kei Koizumi as principal deputy director for policy. Ethical and societal considerations were added as evaluation criteria for grants. And initiatives such as the AI bill of rights and frameworks for research integrity and open science further encouraged all federal agencies to consider the social effects of their research.

The Office of Science and Technology Policy also introduced new ways for agencies to consult with communities, including Native Nations, rural Americans and people of color, in order to avoid known biases in science and technology research. For example, the agency issued government-wide guidance to recognize and include Indigenous knowledge in federal programs. Agencies such as the Department of Energy have incorporated public perspectives while rolling out atmospheric carbon dioxide removal technologies and building new hydrogen hubs.


Use-inspired research

A long-standing criticism of U.S. science funding is that it often fails to answer questions of societal importance. Members of Congress and policy analysts have argued that funded projects instead overly emphasize basic research in areas that advance the careers of researchers.

In response, the Biden administration established the technology, innovation and partnerships directorate at the National Science Foundation in March 2022.

The directorate uses social science approaches to help focus scientific research and technology on their potential uses and effects on society. For example, engineers developing future energy technologies could start by consulting with the community about local needs and opportunities, rather than pitching their preferred solution after years of laboratory work. Genetic researchers could share both knowledge and financial benefits with the communities that provided the researchers with data.

Fundamentally, “use-inspired” research aims to reconnect scientists and engineers with the people and communities their work ultimately affects, going beyond publication in a journal accessible only to academics.

The technology, innovation and partnerships directorate established initiatives to support regional projects and multidisciplinary partnerships bringing together researchers, entrepreneurs and community organizations. These programs, such as the regional innovation engines and convergence accelerator, seek to balance the traditional process of grant proposals written and evaluated by academics with broader societal demand for affordable health and environmental solutions. This work is particularly key to parts of the country that have not yet seen visible gains from decades of federally sponsored research, such as regions encompassing western North Carolina, northern South Carolina, eastern Tennessee and southwest Virginia.

Community-based scientific research

The Biden administration also worked to involve communities in science not just as research consultants but also as active participants.

Scientific research and technology-based innovation are often considered the exclusive domain of experts from elite universities or national labs. Yet, many communities are eager to conduct research, and they have insights to contribute. There is a decades-long history of citizen science initiatives, such as birdwatchers contributing data to national environmental surveys and community groups collecting industrial emissions data that officials can use to make regulations more cost effective.


Going further, the Biden administration carried out experiments to create research projects in a way that involved community members, local colleges and federal agencies as more equal partners.

Collaboration between the community, academia, industry and government can lead to more effective solutions. Deb Cohn-Orbach/UCG/Universal Images Group via Getty Images

For example, the Justice40 initiative asked people from across the country, including rural and small-town Americans, to identify local environmental justice issues and potential solutions.

The National Institutes of Health’s ComPASS program funded community organizations to test and scale successful health interventions, such as identifying pregnant women with complex medical needs and connecting them to specialized care.

And the National Science Foundation’s Civic Innovation Challenge required academic researchers to work with local organizations to address local concerns, improving the community’s technical skills and knowledge.

Frontiers of science and technology policy

Researchers often cite the 1945 report Science: The Endless Frontier, written by former Office of Scientific Research and Development head Vannevar Bush, to describe the core rationales for using American taxpayer money to fund basic science. Under this model, funding science would lead to three key outcomes: a secure national defense, improved health, and economic prosperity. The report, however, says little about how to go from basic science to desired societal outcomes. It also makes no mention of scientists sharing responsibility for the direction and impact of their work.

The 80th anniversary of Bush’s report in 2025 offers an opportunity to move science out into society. At present, major government initiatives are following a technology push model that focuses efforts on only one or a few products and involves little consideration of consumer and market demand. Research has repeatedly demonstrated that consumer or societal pull, which attracts development of products that enhance quality of life, is key to successful uptake of new technologies and their longevity.

Future administrations can further advance science and address major societal challenges by considering how ready society is to take up new technologies and increasing collaboration between government and civil society.

Arthur Daemmrich, Professor of Practice in the School for the Future of Innovation in Society, Arizona State University


This article is republished from The Conversation under a Creative Commons license. Read the original article.

The science section of our news blog STM Daily News provides readers with captivating and up-to-date information on the latest scientific discoveries, breakthroughs, and innovations across various fields. We offer engaging and accessible content, ensuring that readers with different levels of scientific knowledge can stay informed. Whether it’s exploring advancements in medicine, astronomy, technology, or environmental sciences, our science section strives to shed light on the intriguing world of scientific exploration and its profound impact on our daily lives. From thought-provoking articles to informative interviews with experts in the field, STM Daily News Science offers a harmonious blend of factual reporting, analysis, and exploration, making it a go-to source for science enthusiasts and curious minds alike. https://stmdailynews.com/category/science/

