State-of-the-art artificial intelligence systems like OpenAI’s ChatGPT, Google’s Gemini and Anthropic’s Claude have captured the public imagination by producing fluent text in multiple languages in response to user prompts. Those companies have also captured headlines with the huge sums they’ve invested to build ever more powerful models.
An AI startup from China, DeepSeek, has upset expectations about how much money is needed to build the latest and greatest AIs. In the process, they’ve cast doubt on the billions of dollars of investment by the big AI players.
I study machine learning. DeepSeek’s disruptive debut comes down not to any stunning technological breakthrough but to a time-honored practice: finding efficiencies. In a field that consumes vast computing resources, that has proved to be significant.
Where the costs are
Developing such powerful AI systems begins with building a large language model. A large language model predicts the next word given previous words. For example, if the beginning of a sentence is “The theory of relativity was discovered by Albert,” a large language model might predict that the next word is “Einstein.” Large language models are trained to become good at such predictions in a process called pretraining.
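To make the idea concrete, here is a minimal, self-contained sketch in Python. It is not any company's actual model; the candidate words and scores are made up purely to illustrate how a language model turns scores over possible next words into a prediction.

```python
# A minimal sketch of next-word prediction: a language model assigns a
# probability to each candidate next word and picks (or samples) from that
# distribution. The words and scores below are invented for illustration.
import math

def softmax(scores):
    """Turn raw scores (logits) into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical continuations a trained model might consider after seeing
# "The theory of relativity was discovered by Albert".
candidates = ["Einstein", "Camus", "Hall", "the"]
logits = [9.1, 2.3, 1.7, 0.4]   # made-up numbers

probs = softmax(logits)
for word, p in sorted(zip(candidates, probs), key=lambda x: -x[1]):
    print(f"{word:10s} {p:.3f}")
# The highest-probability word ("Einstein" here) is the model's prediction.
```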
Pretraining requires a lot of data and computing power. The companies collect data by crawling the web and scanning books. Computing is usually powered by graphics processing units, or GPUs. Why graphics? It turns out that both computer graphics and the artificial neural networks that underlie large language models rely on the same area of mathematics known as linear algebra. Large language models internally store hundreds of billions of numbers called parameters or weights. It is these weights that are modified during pretraining. Large language models consume huge amounts of computing resources, which in turn means lots of energy.
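For readers curious why linear algebra, and therefore GPUs, sits at the heart of this, the toy sketch below shows that a single neural-network layer boils down to a matrix-vector multiplication. The dimensions here are made up and vastly smaller than any real model.

```python
# A drastically simplified sketch: one layer of a neural network is
# essentially a matrix-vector multiplication, and real models chain many
# such layers over billions of weights.
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 8                                       # real models use thousands
W = rng.standard_normal((hidden_size, hidden_size))   # the layer's weights
x = rng.standard_normal(hidden_size)                  # an input vector

h = np.maximum(W @ x, 0.0)   # matrix multiply followed by a ReLU nonlinearity
print(h)

# Pretraining repeatedly nudges the numbers in W (the "parameters") so the
# model's next-word predictions improve; GPUs excel at exactly this kind of
# bulk matrix arithmetic.
```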
Pretraining is, however, not enough to yield a consumer product like ChatGPT. A pretrained large language model is usually not good at following human instructions. It might also not be aligned with human preferences. For example, it might output harmful or abusive language, both of which are present in text on the web.
The pretrained model therefore usually goes through additional stages of training. One such stage is instruction tuning where the model is shown examples of human instructions and expected responses. After instruction tuning comes a stage called reinforcement learning from human feedback. In this stage, human annotators are shown multiple large language model responses to the same prompt. The annotators are then asked to point out which response they prefer.
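A rough sketch of what that preference data might look like is below. The prompt, responses and field names are invented for illustration and are not drawn from any real dataset; the point is simply that annotator choices become structured records used in later training.

```python
# A toy example of the kind of record produced in the human-feedback stage:
# annotators see two model responses to the same prompt and mark which one
# they prefer. These illustrative records later help train a model that
# scores responses.
preference_data = [
    {
        "prompt": "Explain photosynthesis to a 10-year-old.",
        "response_a": "Plants use sunlight to turn water and air into food.",
        "response_b": "Photosynthesis is glucose production via chloroplasts.",
        "preferred": "a",   # the annotator found response_a more helpful
    },
]

for record in preference_data:
    chosen = record["response_" + record["preferred"]]
    rejected = record["response_b" if record["preferred"] == "a" else "response_a"]
    print("chosen:  ", chosen)
    print("rejected:", rejected)
```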
It is easy to see how costs add up when building an AI model: hiring top-quality AI talent, building a data center with thousands of GPUs, collecting data for pretraining, and running pretraining on GPUs. Additionally, there are costs involved in data collection and computation in the instruction tuning and reinforcement learning from human feedback stages.
All included, costs for building a cutting-edge AI model can soar up to US$100 million, with GPU training a significant component of the total.
The expenditure does not stop when the model is ready. When the model is deployed and responds to user prompts, it uses additional computation known as test-time or inference-time compute. Test-time compute also needs GPUs. In December 2024, OpenAI announced a new phenomenon it saw with its latest model, o1: as test-time compute increased, the model got better at logical reasoning tasks such as math olympiad and competitive coding problems.
Slimming down resource consumption
Thus it seemed that the path to building the best AI models in the world was to invest in more computation during both training and inference. But then DeepSeek entered the fray and bucked this trend.
DeepSeek's debut sent shockwaves through the tech industry and financial markets.
Their V-series models, culminating in the V3 model, used a series of optimizations to make training cutting-edge AI models significantly more economical. Their technical report states that it took them less than $6 million to train V3. They acknowledge that this figure does not include the costs of hiring the team, doing the research, trying out various ideas and collecting data. But $6 million is still an impressively small figure for training a model that rivals leading AI models developed at much higher cost.
The reduction in costs was not due to a single magic bullet. It was a combination of many smart engineering choices including using fewer bits to represent model weights, innovation in the neural network architecture, and reducing communication overhead as data is passed around between GPUs.
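As one illustration of the "fewer bits" idea: DeepSeek's report describes training with low-precision 8-bit number formats. The snippet below is not their method; it is a simple int8 quantization sketch that shows the general trade-off of storing each weight in 8 bits instead of 32 in exchange for a small rounding error.

```python
# Illustrative only: quantize float32 weights to int8 to show how using
# fewer bits per weight shrinks memory at the cost of some precision.
import numpy as np

rng = np.random.default_rng(1)
weights_fp32 = rng.standard_normal(1_000_000).astype(np.float32)

scale = np.abs(weights_fp32).max() / 127.0               # map the range onto int8
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)
weights_restored = weights_int8.astype(np.float32) * scale

print("memory (float32):", weights_fp32.nbytes // 1024, "KiB")
print("memory (int8):   ", weights_int8.nbytes // 1024, "KiB")   # roughly 4x smaller
print("mean rounding error:", np.abs(weights_fp32 - weights_restored).mean())
```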
It is interesting to note that due to U.S. export restrictions on China, the DeepSeek team did not have access to high performance GPUs like the Nvidia H100. Instead they used Nvidia H800 GPUs, which Nvidia designed to be lower performance so that they comply with U.S. export restrictions. Working with this limitation seems to have unleashed even more ingenuity from the DeepSeek team.
DeepSeek also innovated to make inference cheaper, reducing the cost of running the model. Moreover, they released a model called R1 that is comparable to OpenAI’s o1 model on reasoning tasks.
They released all the model weights for V3 and R1 publicly. Anyone can download and further improve or customize their models. Furthermore, DeepSeek released their models under the permissive MIT license, which allows others to use the models for personal, academic or commercial purposes with minimal restrictions.
Resetting expectations
DeepSeek has fundamentally altered the landscape of large AI models. An open weights model trained economically is now on par with more expensive and closed models that require paid subscription plans.
The research community and the stock market will need some time to adjust to this new reality.
Several missions have already attempted to land on the lunar surface in 2025, with more to come.
AP Photo
Zhenbo Wang, University of Tennessee
Half a century after the Apollo astronauts left the last bootprints in lunar dust, the Moon has once again become a destination of fierce ambition and delicate engineering.
This time, it’s not just superpowers racing to plant flags, but also private companies, multinational partnerships and robotic scouts aiming to unlock the Moon’s secrets and lay the groundwork for future human return.
So far in 2025, lunar exploration has surged forward. Several notable missions have launched toward or landed on the Moon. Each has navigated the long journey through space and the even trickier descent to the Moon’s surface or into orbit with varying degrees of success. Together, these missions reflect both the promise and difficulty of returning to the Moon in this new space race defined by innovation, competition and collaboration.
As an aerospace engineer specializing in guidance, navigation and control technologies, I’m deeply interested in how each mission – whether successful or not – adds to scientists’ collective understanding. These missions can help engineers learn to navigate the complexities of space, operate in hostile lunar environments and steadily advance toward a sustainable human presence on the Moon.
Why is landing on the Moon so hard?
Lunar exploration remains one of the most technically demanding frontiers in modern spaceflight. Choosing a landing site involves complex trade-offs between scientific interest, terrain safety and Sun exposure.
The lunar south pole is an especially attractive area, as it could contain water in the form of ice in shadowed craters, a critical resource for future missions. Other sites may hold clues about volcanic activity on the Moon or the solar system’s early history.
Each mission trajectory must be calculated with precision to make sure the craft arrives and descends at the right time and place. Engineers must account for the Moon’s constantly changing position in its orbit around Earth, the timing of launch windows and the gravitational forces acting on the spacecraft throughout its journey.
They also need to carefully plan the spacecraft’s path so that it arrives at the right angle and speed for a safe approach. Even small miscalculations early on can lead to major errors in landing location – or a missed opportunity entirely.
Once on the surface, the landers need to survive extreme swings in temperature – from highs over 250 degrees Fahrenheit (121 degrees Celsius) in daylight down to lows of -208 F (-133 C) at night – as well as dust, radiation and delayed communication with Earth. The spacecraft’s power systems, heat control, landing legs and communication links must all function perfectly. Meanwhile, these landers must avoid hazardous terrain and rely on sunlight to power their instruments and recharge their batteries.
These challenges help explain why many landers have crashed or experienced partial failures, even though the technology has come a long way since the Apollo era.
Commercial companies face the same technical hurdles as government agencies but often with tighter budgets, smaller teams and less heritage hardware. Unlike government missions, which can draw on decades of institutional experience and infrastructure, many commercial lunar efforts are navigating these challenges for the first time.
Successful landings and hard lessons for CLPS
Several lunar missions launched this year belong to NASA’s Commercial Lunar Payload Services program. CLPS is an initiative that contracts private companies to deliver science and technology payloads to the Moon. Its aim is to accelerate exploration while lowering costs and encouraging commercial innovation.
An artist’s rendering of Firefly Aerospace’s Blue Ghost lander, which navigated and avoided hazards during its final descent to the surface. NASA/GSFC/Rani Gran/Wikimedia Commons
The first Moon mission of 2025, Firefly Aerospace’s Blue Ghost Mission 1, launched in January and successfully landed in early March.
The lander survived the harsh lunar day and transmitted data for nearly two weeks before losing power during the freezing lunar night – a typical operational limit for most unheated lunar landers.
Blue Ghost demonstrated how commercial landers can shoulder critical parts of NASA’s Artemis program, which aims to return astronauts to the Moon later this decade.
The second CLPS launch of the year, Intuitive Machines’ IM-2 mission, launched in late February. It targeted a scientifically intriguing site near the Moon’s south pole region.
An artist’s rendering of Intuitive Machines’ IM-2 mission, which was scheduled to land near the lunar south pole for an in-situ resource utilization demonstration on the Moon. NASA/Intuitive Machines
The Nova-C lander, named Athena, touched down on March 6 close to the south pole. However, during the landing process, Athena tipped over. Since it landed on its side in a crater with uneven terrain, it couldn’t deploy its solar panels to generate power, which ended the mission early.
While Athena’s tipped-over landing meant it couldn’t do all the scientific explorations it had planned, the data it returned is still valuable for understanding how future landers can avoid similar fates on the rugged polar terrain.
Not all lunar missions need to land. NASA’s Lunar Trailblazer, a small lunar orbiter launched in February alongside IM-2, was intended to orbit the Moon and map the form, abundance and distribution of water in the form of ice, especially in shadowed craters near the poles.
Shortly after launch, however, NASA lost contact with the spacecraft. Engineers suspect the spacecraft may have experienced a power issue, potentially leaving its batteries depleted.
NASA is continuing recovery efforts, hoping that the spacecraft’s solar panels may recharge in May and June.
An artist’s rendering of NASA’s Lunar Trailblazer spacecraft. If recovered, it will orbit the Moon to measure the form and distribution of water on the lunar surface. Lockheed Martin Space
Ongoing and future missions
Launched on the same day as the Blue Ghost mission in January, Japanese company ispace’s Hakuto-R Mission 2 (Resilience) is on its way to the Moon and has successfully entered lunar orbit.
The lander carried out a successful flyby of the Moon on Feb. 15, with an expected landing in early June. Although launched at the same time, Resilience took a longer trajectory than Blue Ghost to save energy. This maneuver also allowed the spacecraft to collect bonus science observations while looping around the Moon.
The mission, if successful, will advance Japan’s commercial space sector and prove an important comeback for ispace after its first lunar lander crashed during its final descent in 2023.
The Resilience lunar lander days before its launch in the payload processing facility at the U.S. Space Force station. The Resilience lander has completed its Earth orbit and a lunar flyby. It is now completing a low-energy transfer orbit and entering an orbit around the Moon. Business Wire
The rest of 2025 promises a busy lunar calendar. Intuitive Machines plans to launch IM-3 in late 2025 to test more advanced instruments and potentially deliver NASA scientific experiments to the Moon.
The European Space Agency’s Lunar Pathfinder will establish a dedicated lunar communications satellite, making it easier for future missions, especially those operating on the far side or poles, to stay in touch with Earth.
Meanwhile, Astrobotic’s Griffin Mission-1 is scheduled to deliver NASA’s VIPER rover to the Moon’s south pole, where it will directly search for ice beneath the surface.
Together, these missions represent an increasingly international and commercial approach to lunar science and exploration.
As the world turns its attention to the Moon, every mission – whether triumph or setback – brings humanity closer to a permanent return to our closest celestial neighbor.

Zhenbo Wang, Associate Professor of Mechanical and Aerospace Engineering, University of Tennessee
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Who owns the Moon?
Henglein and Steets/Getty Images
Scott Shackelford, Indiana University
Private industries have helped drop the cost of launching rockets, satellites and other equipment into space to historic lows. That has boosted interest in developing space – both for mining raw materials such as silicon for solar panels and oxygen for rocket fuel, and for potentially relocating polluting industries off the Earth. But the rules are not clear about who would profit if, for instance, a U.S. company like SpaceX colonized Mars or established a Moon base.
At the moment, no company – or nation – is yet ready to claim or take advantage of private property in space. But the US$350 billion space industry could change quickly. Several companies are already planning to explore the Moon to find raw materials like water; Helium-3, which is potentially useful in fusion nuclear reactors; and rare earth elements, which are invaluable for manufacturing electronics. What they might find, and how easy the material is to bring back to Earth, remains to be seen.
Anticipating additional commercial interest, the Trump administration has created new rules through an executive order following a 2015 law change for how those companies might profit from operations on the Moon, asteroids and other planets. Those rules conflict with a longstanding international treaty the U.S. has generally followed but never formally joined. The administration also is planning to encourage other nations to adopt this new U.S. perspective on space mining.
As a scholar of space law and policy – and a proud sci-fi nerd – I believe the international community could find new ways to peacefully govern space from examples here on our planet, including deep seabed mining and Antarctica.
A 2015 meeting of the International Seabed Authority. AP Photo/David McFadden
Who owns space?
In general, regions of Earth beyond any one nation’s control – like the high seas, the atmosphere and Antarctica – have been viewed by the international community as globally shared resources. That principle applied to space, too, until President Donald Trump’s executive order specifically rejected the idea that space was any sort of “global commons” shared among all nations and peoples of the Earth.
This step is the latest in a series of decisions by U.S. presidents over the last 40 years that have signaled the country’s decreasing willingness to share these types of resources, especially through an international body like the United Nations.
That is one reason why the U.S. has not ratified the U.N. Convention on the Law of the Sea, for example, which was agreed to in 1982 and took effect in 1994.
A similar story played out regarding the Moon.
Moon Treaty and international space law
Over the decades, the U.S. has sought to use its space policy in various ways. President John F. Kennedy, for example, considered turning the Apollo Moon-landing program into a joint U.S.-Soviet mission to promote peace between the superpowers.
Lyndon Johnson’s administration similarly saw space as a shared region, and in 1967 signed the Outer Space Treaty, which proclaimed that space was the “province of all mankind.” However, that treaty didn’t say anything about mining on the Moon – so when the U.S. landed there in 1969, the international community called for regulations.
The U.N.’s eventual Moon Treaty declared the Moon the “common heritage of mankind,” and sought shared international control over resources found there.
However, that plan wasn’t very popular among advocates for a more commercial final frontier. In the U.S., a nonprofit group in favor of space colonization opposed the treaty, fearing it would discourage private investment. The treaty failed ratification in the U.S. Senate. Only 18 nations have, in fact, ratified the Moon Treaty, among them Mexico and Australia, none of them major space-faring powers. But even though many countries seem to agree that the Moon Treaty isn’t the right way to handle lunar property rights, that doesn’t mean they agree on what they actually should do.
Finding profit in space
As space launches got cheaper, the U.S. SPACE Act, passed in 2015, gave U.S. companies the right to mine materials from asteroids for profit. That conflicts with the shared-resources view of the 1967 Outer Space Treaty.
Since then, there have been further political efforts to remove perceived legal hurdles to space mining. In 2017, a Republican congressman sought to formalize the U.S. rejection of space as any sort of common property, proposing a bill that said, “outer space shall not be considered a global commons.” That bill died, but it was reintroduced in 2019 and is currently awaiting action in the House.
A new space race?
Allowing private control of space resources could launch a new space race, in which wealthy companies, likely from developed countries, could take control of crucial resources – like ice on the Moon, which could supply water for people or to fuel rockets – and profit handsomely.
That, in turn, would increase the likelihood of a military arms race, with the U.S., Russia and China developing weapons to defend their citizens’ space assets.
Antarctica, a continent that by international agreement hosts no armed military activity and is dedicated to scientific inquiry. NASA/JPL
Applying lessons from the deep, and Antarctica
In finding common ground, and charting a path forward, it is useful to consider lessons from other frontiers. The Moon Treaty tried to set up a system for sharing the benefits of Moon mining similar to how an existing system handled mining the deep sea.
The International Seabed Authority is a U.N. body that lets nations and private firms develop resources from the deep seabed so long as they share the proceeds, particularly with landlocked developing nations. It is recognized by more than 160 nations, though the U.S. is a notable holdout.
Environmental groups have criticized the Authority for not doing enough to safeguard fragile marine environments, but the overall model of sharing the wealth from a collective resource could still be useful. For instance, the Authority’s participants are working on a new code of ethics for deep-sea mining that would emphasize environmental sustainability. Those provisions could be mirrored on other worlds.
Similarly, the global management of Antarctica has useful parallels with the Moon. The entire continent is governed by a treaty that has avoided conflict since 1959 by freezing national territorial claims and barring military and commercial activities. Instead, the continent is reserved for “peaceful purposes” and “scientific investigation.”
A similar approach could become the core of a second attempt at a Moon Treaty, and could even accommodate a provision for commercial activity along the lines of the deep-sea mining rules. In so doing, we must also learn from what has not worked in the past, such as ignoring the interests of the private sector and the developing world. Advocates are correct that defining property rights is an important precursor, but it is not a binary choice between a “global commons” and private property. Rather, there is a universe of rights that deserve consideration and that could provide a proper foundation for sustainable development.
But coming to an international agreement would take time, energy and a widespread willingness to view resources as common assets that should be collectively governed. All those ingredients are in short supply in a world where many countries are becoming more isolationist.
For the immediate future, other countries may or may not follow the U.S. lead, and its influence, toward privatizing space. Japan seems interested, as does Luxembourg, but China and Russia are concerned about their national security, and the European Space Agency is more inclined toward working collectively. Without better coordination, it seems likely that eventually peaceful, sustainable development of off-world resources will give way to competing claims, despite readily available examples of how to avoid conflict.
Scott Shackelford, Associate Professor of Business Law and Ethics; Executive Director, Ostrom Workshop; Cybersecurity Program Chair, IU-Bloomington, Indiana University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
A Michigan startup backed by Jeff Bezos is challenging the luxury EV market with a no-frills approach that’s capturing nationwide attention
When most people think of electric vehicles, images of Tesla’s sleek Model S or Ford’s high-tech Lightning come to mind – along with their premium price tags. But Slate Auto, a Troy, Michigan-based startup, is taking a radically different approach that’s got 100,000 Americans reaching for their wallets.
The Anti-Luxury EV
Slate’s electric pickup truck starts in the $25,000 to $27,500 range, making it one of the most affordable EVs ever announced. But here’s the catch – and the genius – behind the strategy: the company is stripping away everything that typically drives up EV costs.
No paint. No stereo system. No touchscreens. Not even power windows.
The message behind the approach is clear: build the truck America actually needs, not the one Silicon Valley thinks it wants, and let the product speak for itself.
Backed by Billions, Built for the Masses
Despite its bare-bones approach, Slate Auto isn’t a garage startup. The company has secured approximately $700 million in funding from heavyweight investors including Jeff Bezos, Mark Walter and Thomas Tull. This financial backing gives it the resources to challenge established automakers while maintaining its commitment to affordability.
The company plans to manufacture their trucks at a former Donnelly factory in Warsaw, Indiana, with production expected to begin in late 2026.
Market Response: 100,000 and Counting
Within just two weeks of opening reservations, Slate collected 100,000 orders at $50 each – generating $5 million in immediate revenue and demonstrating significant pent-up demand for affordable electric vehicles.
This response suggests that while the automotive industry has been focused on premium EVs loaded with features, there’s a massive market of consumers who simply want reliable, affordable electric transportation.
The Customization Philosophy
Slate’s minimalist approach isn’t just about cost-cutting – it’s about empowerment. By delivering a basic platform, they’re enabling customers to customize their trucks according to their specific needs and budgets. Want paint? Add it yourself or have it done locally. Need a sound system? Install exactly what you want.
This philosophy extends the vehicle’s lifecycle, as second and third owners can continue customizing and upgrading, potentially increasing long-term customer satisfaction and resale value.
What This Means for the EV Market
Slate Auto’s approach represents a fundamental shift in EV strategy. While competitors race to add more features, screens, and luxury appointments, Slate is proving that sometimes less really is more.
For communities like Phoenix, where transportation costs significantly impact family budgets, a $25,000 electric truck could be transformative. Small businesses, contractors, and everyday families who’ve been priced out of the EV market suddenly have an entry point.
Image Credit: Slate Auto
Slate Auto: Looking Ahead
As Slate moves toward their 2026 production timeline, they face the typical challenges of any automotive startup: scaling manufacturing, maintaining quality, and delivering on promises. However, their reservation numbers suggest they’ve identified a genuine market need that established automakers have overlooked.
The success or failure of Slate’s minimalist approach could reshape how the entire industry thinks about electric vehicles. Are consumers really demanding luxury features, or do they just want affordable, reliable electric transportation?
We’ll be following Slate Auto’s progress closely as they work toward production, bringing you updates on this potentially game-changing approach to electric vehicles.

STM Daily News will continue covering Slate Auto’s development and the broader evolution of affordable electric vehicles. Have thoughts on Slate’s approach? We’d love to hear from our readers about what features matter most in an electric vehicle.
STM Daily News is a vibrant news blog dedicated to sharing the brighter side of human experiences. Emphasizing positive, uplifting stories, the site focuses on delivering inspiring, informative, and well-researched content. With a commitment to accurate, fair, and responsible journalism, STM Daily News aims to foster a community of readers passionate about positive change and engaged in meaningful conversations. Join the movement and explore stories that celebrate the positive impacts shaping our world.