How close are quantum computers to being really useful? – podcast
Quantum computers could revolutionize science by solving complex problems. However, significant hurdles in scaling and error correction remain before they reach practical applications.
Quantum computers have the potential to solve big scientific problems that are beyond the reach of today’s most powerful supercomputers, such as discovering new antibiotics or developing new materials.
But to achieve these breakthroughs, quantum computers will need to perform better than today’s best classical computers at solving real-world problems. And they’re not quite there yet. So what is still holding quantum computing back from becoming useful?
In this episode of The Conversation Weekly podcast, we speak to quantum computing expert Daniel Lidar at the University of Southern California in the US about what problems scientists are still wrestling with when it comes to scaling up quantum computing, and how close they are to overcoming them.
Quantum computers harness the power of quantum mechanics, the laws that govern subatomic particles. Instead of the classical bits of information used by microchips inside traditional computers, which are either a 0 or a 1, the chips in quantum computers use qubits, which can be both 0 and 1 at the same time or anywhere in between. Daniel Lidar explains:
“Put a lot of these qubits together and all of a sudden you have a computer that can simultaneously represent many, many different possibilities … and that is the starting point for the speed up that we can get from quantum computing.”
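To make that idea concrete, here is a minimal sketch (ours, not from the episode) that simulates a small qubit register classically with NumPy: putting every qubit into superposition spreads the amplitude across all 2^n basis states at once.

```python
# Simulating a tiny qubit register classically with NumPy (our sketch).
import numpy as np

n = 3                                   # number of qubits
state = np.zeros(2**n, dtype=complex)   # 2**n amplitudes describe the register
state[0] = 1.0                          # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard: equal mix of 0 and 1

op = H
for _ in range(n - 1):
    op = np.kron(op, H)                 # apply a Hadamard to every qubit
state = op @ state

print(np.round(state, 3))  # eight equal amplitudes: the register holds
                           # every 3-bit possibility at once
```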
Faulty qubits
One of the biggest problems scientists face is how to scale up quantum computing power. Qubits are notoriously prone to errors – they can quickly revert to being either a 0 or a 1, and so lose their advantage over classical computers.
Scientists have focused on tackling these errors through redundancy – linking strings of physical qubits together into what’s called a “logical qubit” – to try to maximise the number of steps in a computation. And, little by little, they’re getting there.
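As an illustration of the redundancy idea (ours, not Google’s actual scheme), here is the simplest possible example: a classical three-bit repetition code, where a majority vote recovers the logical bit unless errors hit at least two of the three physical bits. Real quantum error-correcting codes are far more elaborate, but the protective principle is the same.

```python
# A classical 3-bit repetition code -- the simplest ancestor of the
# "logical qubit" idea described above. Our illustration, not the
# far more elaborate quantum codes used in real devices.
import random

def encode(bit):
    return [bit, bit, bit]  # one logical bit -> three physical bits

def add_noise(bits, p=0.1):
    # Independently flip each physical bit with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote

logical = 1
print(decode(add_noise(encode(logical))))  # usually 1: decoding fails only
                                           # if 2+ of the 3 bits flip
```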
In December 2024, Google announced that its new quantum chip, Willow, had demonstrated what’s called “beyond breakeven”, when its logical qubits worked better than the constituent parts and even kept on improving as it scaled up.
Lidar says right now the development of this technology is happening very fast:
“For quantum computing to scale and to take off is going to still take some real science breakthroughs, some real engineering breakthroughs, and probably overcoming some yet unforeseen surprises before we get to the point of true quantum utility. With that caution in mind, I think it’s still very fair to say that we are going to see truly functional, practical quantum computers kicking into gear, helping us solve real-life problems, within the next decade or so.”
Listen to Lidar explain more about how quantum computers and quantum error correction work on The Conversation Weekly podcast.
This episode of The Conversation Weekly was written and produced by Gemma Ware with assistance from Katie Flood and Mend Mariwany. Sound design was by Michelle Macklem, and theme music by Neeta Sarl.
Clips in this episode from Google Quantum AI and 10 Hours Channel.
You can find us on Instagram at theconversationdotcom or via e-mail. You can also subscribe to The Conversation’s free daily e-mail here.
Listen to The Conversation Weekly via your favourite podcast app, download it directly via our RSS feed or find out how else to listen here.
Gemma Ware, Host, The Conversation Weekly Podcast, The Conversation
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Why building big AIs costs billions – and how Chinese startup DeepSeek dramatically changed the calculus
Ambuj Tewari, University of Michigan
State-of-the-art artificial intelligence systems like OpenAI’s ChatGPT, Google’s Gemini and Anthropic’s Claude have captured the public imagination by producing fluent text in multiple languages in response to user prompts. Those companies have also captured headlines with the huge sums they’ve invested to build ever more powerful models.
An AI startup from China, DeepSeek, has upset expectations about how much money is needed to build the latest and greatest AIs. In the process, they’ve cast doubt on the billions of dollars of investment by the big AI players.
I study machine learning. DeepSeek’s disruptive debut comes down not to any stunning technological breakthrough but to a time-honored practice: finding efficiencies. In a field that consumes vast computing resources, that has proved to be significant.
Where the costs are
Developing such powerful AI systems begins with building a large language model. A large language model predicts the next word given previous words. For example, if the beginning of a sentence is “The theory of relativity was discovered by Albert,” a large language model might predict that the next word is “Einstein.” Large language models are trained to become good at such predictions in a process called pretraining.
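As a toy illustration (ours, not from the article) of that prediction step: a trained model assigns a score, or logit, to each candidate next word, and a softmax turns those scores into probabilities.

```python
# Next-word prediction as scores plus a softmax (toy numbers).
import numpy as np

vocab = ["Einstein", "Newton", "Curie"]
logits = np.array([5.0, 2.0, 1.0])  # scores a trained model might produce

probs = np.exp(logits - logits.max())
probs /= probs.sum()                # softmax: scores -> probabilities

for word, p in zip(vocab, probs):
    print(f"{word}: {p:.2f}")       # "Einstein" dominates, as in the example
```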
Pretraining requires a lot of data and computing power. The companies collect data by crawling the web and scanning books. Computing is usually powered by graphics processing units, or GPUs. Why graphics? It turns out that both computer graphics and the artificial neural networks that underlie large language models rely on the same area of mathematics, known as linear algebra. Large language models internally store hundreds of billions of numbers, called parameters or weights; it is these weights that are modified during pretraining. Large language models consume huge amounts of computing resources, which in turn means lots of energy.
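To see why linear algebra is central (a sketch of ours, with made-up sizes): a single neural-network layer is essentially one matrix-vector product – exactly the operation GPUs are built to accelerate – and the matrix entries are the weights adjusted during pretraining.

```python
# One neural-network layer as a matrix-vector product (made-up sizes).
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(1024, 1024))  # the layer's weights -- the numbers
b = rng.normal(size=1024)          # adjusted during pretraining
x = rng.normal(size=1024)          # input activations

y = np.maximum(W @ x + b, 0.0)     # linear algebra plus a simple nonlinearity
print(y.shape)                     # real models stack thousands of such layers
```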
Pretraining is, however, not enough to yield a consumer product like ChatGPT. A pretrained large language model is usually not good at following human instructions. It might also not be aligned with human preferences. For example, it might output harmful or abusive language, both of which are present in text on the web.
The pretrained model therefore usually goes through additional stages of training. One such stage is instruction tuning, where the model is shown examples of human instructions and expected responses. After instruction tuning comes a stage called reinforcement learning from human feedback. In this stage, human annotators are shown multiple large language model responses to the same prompt and asked which response they prefer.
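As a hedged sketch of how those preferences become a training signal (a common Bradley-Terry-style formulation; the article does not specify any lab’s exact loss): a reward model is trained so that the annotator-preferred response scores higher than the rejected one.

```python
# A Bradley-Terry-style preference loss (illustrative formulation).
import numpy as np

def preference_loss(score_preferred, score_rejected):
    # Negative log of the probability that the reward model ranks the
    # annotator-preferred response above the rejected one.
    margin = score_preferred - score_rejected
    return -np.log(1.0 / (1.0 + np.exp(-margin)))

print(preference_loss(2.0, -1.0))  # small loss: model agrees with annotator
print(preference_loss(-1.0, 2.0))  # large loss: model disagrees
```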
It is easy to see how costs add up when building an AI model: hiring top-quality AI talent, building a data center with thousands of GPUs, collecting data for pretraining, and running pretraining on GPUs. Additionally, there are costs involved in data collection and computation in the instruction tuning and reinforcement learning from human feedback stages.
All included, costs for building a cutting-edge AI model can soar up to US$100 million. GPU training is a significant component of the total cost.
The expenditure does not stop when the model is ready. When the model is deployed and responds to user prompts, it uses additional computation, known as test-time or inference-time compute. Test-time compute also needs GPUs. In December 2024, OpenAI announced a new phenomenon they saw with their latest model o1: as test-time compute increased, the model got better at logical reasoning tasks such as math olympiad and competitive coding problems.
Slimming down resource consumption
Thus it seemed that the path to building the best AI models in the world was to invest in more computation during both training and inference. But then DeepSeek entered the fray and bucked this trend.
Their V-series models, culminating in the V3 model, used a series of optimizations to make training cutting-edge AI models significantly more economical. Their technical report states that it took them less than $6 million to train V3. They admit that this figure does not include the costs of hiring the team, doing the research, trying out various ideas and collecting data. But $6 million is still an impressively small figure for training a model that rivals leading AI models developed at much higher cost.
The reduction in costs was not due to a single magic bullet. It was a combination of many smart engineering choices including using fewer bits to represent model weights, innovation in the neural network architecture, and reducing communication overhead as data is passed around between GPUs.
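To illustrate the “fewer bits” lever (our sketch; DeepSeek’s actual low-precision scheme differs in detail): symmetric int8 quantization stores each 32-bit weight in 8 bits, shrinking both memory use and the data shuttled between GPUs, at the cost of a small approximation error.

```python
# Symmetric int8 weight quantization (illustrative, not DeepSeek's recipe).
import numpy as np

weights = np.random.default_rng(0).normal(size=5).astype(np.float32)

scale = np.abs(weights).max() / 127.0            # map the range onto int8
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
restored = q.astype(np.float32) * scale          # approximate reconstruction

print(weights)
print(restored)  # close to the originals, but 4x smaller to store and move
```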
It is interesting to note that due to U.S. export restrictions on China, the DeepSeek team did not have access to high performance GPUs like the Nvidia H100. Instead they used Nvidia H800 GPUs, which Nvidia designed to be lower performance so that they comply with U.S. export restrictions. Working with this limitation seems to have unleashed even more ingenuity from the DeepSeek team.
DeepSeek also innovated to make inference cheaper, reducing the cost of running the model. Moreover, they released a model called R1 that is comparable to OpenAI’s o1 model on reasoning tasks.
They released all the model weights for V3 and R1 publicly. Anyone can download and further improve or customize their models. Furthermore, DeepSeek released their models under the permissive MIT license, which allows others to use the models for personal, academic or commercial purposes with minimal restrictions.
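In practice, that openness means the weights can be fetched with standard tooling. A hedged sketch using the huggingface_hub library follows; the repository id is an assumption, and the full checkpoints run to hundreds of gigabytes, so this is illustrative rather than something to run casually.

```python
# Fetching openly released weights with standard tooling. The repo id is
# an assumption (check the model hub), and the checkpoints are hundreds
# of gigabytes -- illustrative, not something to run casually.
from huggingface_hub import snapshot_download

local_dir = snapshot_download("deepseek-ai/DeepSeek-R1")  # assumed repo id
print(local_dir)  # local folder now containing the model weights
```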
Resetting expectations
DeepSeek has fundamentally altered the landscape of large AI models. An open weights model trained economically is now on par with more expensive and closed models that require paid subscription plans.
The research community and the stock market will need some time to adjust to this new reality.
Ambuj Tewari, Professor of Statistics, University of Michigan
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Biden helped bring science out of the lab and into the community − emphasizing research focused on solutions
Arthur Daemmrich, Arizona State University
President Joe Biden was inaugurated in January 2021 amid a devastating pandemic, with over 24 million COVID-19 cases and more than 400,000 deaths in the U.S. recorded at that point.
Operation Warp Speed, initiated by the Trump administration in May 2020, meant an effective vaccine was becoming available. Biden quickly announced a plan to immunize 100 million Americans over the next three months. By the end of April 2021, 145 million Americans – nearly half the population – had received one vaccine dose, and 103 million were considered fully vaccinated. Science and technology policymakers celebrated this coordination across science, industry and government to address a real-world crisis as a 21st-century Manhattan Project.
From my perspective as a scholar of science and technology policy, Biden’s legacy includes structural, institutional and practical changes to how science is conducted. Building on approaches developed over the course of many years, the administration elevated the status of science in the government and fostered community participation in research.
Raising science’s profile in government
The U.S. has no single ministry of science and technology. Instead, agencies and offices across the executive branch carry out scientific research at several national labs and fund research by other institutions. By elevating the White House Office of Science and Technology Policy to a Cabinet-level organization for the first time in its history, Biden gave the agency greater influence in federal decision-making and coordination.
Formally established in 1976, the agency provides the president and senior staff with scientific and technical advice, bringing science to bear on executive policies. Biden’s inclusion of the agency’s director in his Cabinet was a strong signal about the elevated role science and technology would play in the administration’s solutions to major societal challenges.
Under Biden, the Office of Science and Technology Policy established guidelines that agencies across the government would follow as they implemented major legislation. This included developing technologies that remove carbon dioxide from the atmosphere to address climate change, rebuilding America’s chip industry, and managing the rollout of AI technologies.
Instead of treating the ethical and societal dimensions of scientific and technological change as separate from research and development, the agency advocated for a more integrated approach. This was reflected in the appointment of social scientist Alondra Nelson as the agency’s first deputy director for science and society, and science policy expert Kei Koizumi as principal deputy director for policy. Ethical and societal considerations were added as evaluation criteria for grants. And initiatives such as the AI bill of rights and frameworks for research integrity and open science further encouraged all federal agencies to consider the social effects of their research.
The Office of Science and Technology Policy also introduced new ways for agencies to consult with communities, including Native Nations, rural Americans and people of color, in order to avoid known biases in science and technology research. For example, the agency issued government-wide guidance to recognize and include Indigenous knowledge in federal programs. Agencies such as the Department of Energy have incorporated public perspectives while rolling out atmospheric carbon dioxide removal technologies and building new hydrogen hubs.
Use-inspired research
A long-standing criticism of U.S. science funding is that it often fails to answer questions of societal importance. Members of Congress and policy analysts have argued that funded projects instead overly emphasize basic research in areas that advance the careers of researchers.
In response, the Biden administration established the technology, innovation and partnerships directorate at the National Science Foundation in March 2022.
The directorate uses social science approaches to help focus scientific research and technology on their potential uses and effects on society. For example, engineers developing future energy technologies could start by consulting with the community about local needs and opportunities, rather than pitching their preferred solution after years of laboratory work. Genetic researchers could share both knowledge and financial benefits with the communities that provided the researchers with data.
Fundamentally, “use-inspired” research aims to reconnect scientists and engineers with the people and communities their work ultimately affects, going beyond publication in a journal accessible only to academics.
The technology, innovation and partnerships directorate established initiatives to support regional projects and multidisciplinary partnerships bringing together researchers, entrepreneurs and community organizations. These programs, such as the regional innovation engines and convergence accelerator, seek to balance the traditional process of grant proposals written and evaluated by academics with broader societal demand for affordable health and environmental solutions. This work is particularly key to parts of the country that have not yet seen visible gains from decades of federally sponsored research, such as regions encompassing western North Carolina, northern South Carolina, eastern Tennessee and southwest Virginia.
Community-based scientific research
The Biden administration also worked to involve communities in science not just as research consultants but also as active participants.
Scientific research and technology-based innovation are often considered the exclusive domain of experts from elite universities or national labs. Yet, many communities are eager to conduct research, and they have insights to contribute. There is a decades-long history of citizen science initiatives, such as birdwatchers contributing data to national environmental surveys and community groups collecting industrial emissions data that officials can use to make regulations more cost effective.
Going further, the Biden administration carried out experiments to create research projects in a way that involved community members, local colleges and federal agencies as more equal partners.
For example, the Justice40 initiative asked people from across the country, including rural and small-town Americans, to identify local environmental justice issues and potential solutions.
The National Institutes of Health’s ComPASS program funded community organizations to test and scale successful health interventions, such as identifying pregnant women with complex medical needs and connecting them to specialized care.
And the National Science Foundation’s Civic Innovation Challenge required academic researchers to work with local organizations to address local concerns, improving the community’s technical skills and knowledge.
Frontiers of science and technology policy
Researchers often cite the 1945 report Science: The Endless Frontier, written by former Office of Scientific Research and Development head Vannevar Bush, to describe the core rationales for using American taxpayer money to fund basic science. Under this model, funding science would lead to three key outcomes: a secure national defense, improved health, and economic prosperity. The report, however, says little about how to go from basic science to desired societal outcomes. It also makes no mention of scientists sharing responsibility for the direction and impact of their work.
The 80th anniversary of Bush’s report in 2025 offers an opportunity to move science out into society. At present, major government initiatives are following a technology push model that focuses efforts on only one or a few products and involves little consideration of consumer and market demand. Research has repeatedly demonstrated that consumer or societal pull, which attracts development of products that enhance quality of life, is key to successful uptake of new technologies and their longevity.
Future administrations can further advance science and address major societal challenges by considering how ready society is to take up new technologies and increasing collaboration between government and civil society.
Arthur Daemmrich, Professor of Practice in the School for the Future of Innovation in Society, Arizona State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Joe Biden’s record on science and tech: Investments and regulation for vaccines, broadband, microchips and AI
The Biden administration’s focus on science and technology has led to substantial investments in semiconductor manufacturing and clean energy, aiming to enhance U.S. competitiveness and innovation while addressing public health challenges.
Mark Zachary Taylor, Georgia Institute of Technology
In evaluating the outgoing Biden administration, much news has focused on inflation, immigration or Hunter’s laptop. But as an expert on national competitiveness in science and technology, I have a somewhat different emphasis. My research shows that U.S. prosperity and security depend heavily on the country’s ability to produce cutting-edge science and tech.
So, how did the Biden administration perform along these lines?
Advancing pandemic science and tech
President Joe Biden’s immediate challenge after inauguration was to end the COVID-19 pandemic and then shift the economy back to normal operations.
First, he threw the weight of his administration behind vaccine production and distribution. Thanks to President Donald Trump’s Operation Warp Speed, inoculations had begun in mid-December 2020. But there had been no national rollout, and no plans existed for one. When Biden took office, only about 5% of Americans had been vaccinated.
The Biden administration collaborated with private retail chains to build up cold storage and distribution capacity. To ensure adequate vaccine supply, Biden worked to support the major pharmaceutical manufacturers. And throughout, Biden conducted a public relations campaign to inform, educate and motivate Americans to get vaccinated.
Within the first 10 weeks of Biden’s presidency, one-third of the U.S. population had received at least one dose, half by the end of May, and over 70% by year’s end. And as Americans got vaccinated, travel bans were lifted, schools came back into session, and business gradually returned to normal.
A later study found that Biden’s vaccination program prevented more than 3.2 million American deaths and 18.5 million hospitalizations, and saved US$1.15 trillion in medical costs and lost economic output.
In the wake of the economic distress caused by the COVID-19 pandemic, Biden signed two bills with direct and widespread impacts on science and technology. Previous administrations had promised infrastructure investments, but Biden delivered. The Infrastructure Investment and Jobs Act, passed with bipartisan support during late 2021, provided $1.2 trillion for infrastructure of all types.
Rather than just rebuilding, the act prioritized technological upgrades: clean water, clean energy, rural high-speed internet, modernization of public transit and airports, and electric grid reliability.
In August 2022, Biden signed the Inflation Reduction Act, totaling $739 billion in tax credits and direct expenditures. This was the largest climate change legislation in U.S. history. It implemented a vast panoply of subsidies and incentives to develop and distribute the science and tech necessary for clean and renewable energy, environmental conservation and to address climate change.
Science and tech marquees and sleepers
Some Biden administration science and technology achievements have been fairly obvious. For example, Biden successfully pushed for increased federal research and development funding. Federal R&D dollars jumped by 25% from 2021 to 2024. Recipients included the National Science Foundation, Department of Energy, NASA and the Department of Defense. In addition, Biden oversaw investment in emerging technologies, such as AI, and their responsible governance.
Biden also retained or raised Trump’s tariffs and continued his predecessor’s skepticism of new free-trade agreements, thereby cementing a protectionist turn in American trade policy. His addition was protectionist industrial policy – subsidies for domestic manufacturing and innovation, as well as “buy-American” mandates.
Other accomplishments have been more under the radar. For example, within the National Science Foundation, Biden created a Directorate for Technology, Innovation and Partnerships to improve U.S. economic competitiveness. Its tasks are to speed the development of breakthrough technologies, to accelerate their transition into the marketplace, and to reskill and upskill American workers into high-quality jobs with better wages.
Biden implemented policies aimed at strengthening and improving federal scientific integrity to help citizens feel they can trust federally funded science and its use. He also advanced new measures to improve research security, aimed at keeping federally funded research from being improperly obtained by foreign entities.
The CHIPS & Science Act
The jewel in the crown of Biden’s science and tech agenda was the bipartisan Creating Helpful Incentives to Produce Semiconductors (CHIPS) and Science Act, meant to strengthen U.S. manufacturing capabilities in advanced semiconductor chips. It has awarded about $40 billion to American chip producers, prompting an additional $450 billion in private investment in over 90 new manufacturing projects across 28 states.
Directed at everything from advanced packaging to memory chips, the CHIPS Act’s subsidies have reduced the private costs of domestic semiconductor production. CHIPS also pushes for these new manufacturing jobs to go to American workers at good pay. Whereas the U.S. manufactured few of the most advanced chips just two years ago, the industry expects the United States to possess 28% of global capacity by 2032.
Less well known are the “science” parts of the CHIPS Act. For example, it invested half a billion dollars in dozens of regional innovation and technology hubs across the country. These hubs focus on a broad range of strategic sectors, including critical materials, sustainable polymers, precision medicine and medical devices. Over 30 tech hubs have already been designated, such as the Elevate Quantum Tech Hub in Denver and the Wisconsin Biohealth Tech Hub.
The CHIPS Act also aims to broaden participation in science. It does so by improving the tracking and funding of research and STEM education to hitherto underrepresented Americans – by district, occupation, ethnicity, gender, institution and socioeconomic background. It also attempts to extend the impact of federally funded research to tackle global challenges, such as supply chain disruptions, resource waste and energy security.
Missed opportunities and future possibilities
Despite these achievements, the Biden administration has faced criticism on the science and tech front. Some critics allege that U.S. research security still does not properly defend American science and technology against theft or counterfeiting by rivals.
Others insist that federal R&D spending remains too low. In particular, they call for more investment in U.S. research infrastructure – such as up-to-date laboratories and data systems – and emerging technologies.
The administration’s government-centered approach to AI has also drawn criticism as stifling and wrong-headed.
Personally, I am agnostic on these issues, but they are legitimate concerns. In my opinion, science and technology investments take considerable time to pan out, so early judgments of Biden’s success or failure are probably premature.
Nevertheless, the next administration has its work cut out for it. International cooperation will likely be key. The most vexing global problems require science and technology advances that are beyond the ability of any single country. The challenge is for the United States to collaborate in ways that complement American competitiveness.
National priorities will likely include the development of productive and ethical AI that helps the U.S. to be more competitive, as well as a new quantum computing industry. Neuroscience and “healthspan” research also hold considerable promise for improving U.S. competitiveness while transforming Americans’ life satisfaction.
Keeping the whole American science and technology enterprise rigorous will require two elements from the federal government: more resources and a competitive environment. American greatness will depend on President-elect Trump’s ability to deliver them.
Mark Zachary Taylor, Associate Professor of Public Policy, Georgia Institute of Technology
This article is republished from The Conversation under a Creative Commons license. Read the original article.