Qualcomm announced Monday that it will release new artificial intelligence accelerator chips, marking new competition for Nvidia, which has so far dominated the market for AI semiconductors.
The stock soared 15% following the news.
The AI chips are a shift from Qualcomm, which has thus far focused on semiconductors for wireless connectivity and mobile devices, not massive data centers.
Qualcomm said that both the AI200, which will go on sale in 2026, and the AI250, planned for 2027, can come in a system that fills up a full, liquid-cooled server rack.
Qualcomm is matching Nvidia and AMD, which offer their graphics processing units, or GPUs, in full-rack systems that allow as many as 72 chips to act as one computer. AI labs need that computing power to run the most advanced models.
Qualcomm’s data center chips are based on Hexagon neural processing units, or NPUs, the AI components in the company’s smartphone chips.
“We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level,” Durga Malladi, Qualcomm’s general manager for data center and edge, said on a call with reporters last week.
The entry of Qualcomm into the data center world marks new competition in the fastest-growing market in technology: equipment for new AI-focused server farms.
Nearly $6.7 trillion in capital expenditures will be spent on data centers through 2030, with the majority going to systems based around AI chips, according to a McKinsey estimate.
The industry has been dominated by Nvidia, whose GPUs hold more than 90% of the market so far and whose sales have driven the company to a market cap of over $4.5 trillion. Nvidia’s chips were used to train OpenAI’s GPTs, the large language models used in ChatGPT.
But companies such as OpenAI have been looking for alternatives, and earlier this month the startup announced plans to buy chips from the second-place GPU maker, AMD, and potentially take a stake in the company. Other companies, such as Google, Amazon and Microsoft, are also developing their own AI accelerators for their cloud services.
Qualcomm said its chips are focusing on inference, or running AI models, instead of training, which is how labs such as OpenAI create new AI capabilities by processing terabytes of data.
The chipmaker said that its rack-scale systems would ultimately cost less to operate for customers such as cloud service providers, and that a rack uses 160 kilowatts, which is comparable to the high power draw from some Nvidia GPU racks.
Malladi said Qualcomm would also sell its AI chips and other parts separately, especially for clients such as hyperscalers that prefer to design their own racks. He said other AI chip companies, such as Nvidia or AMD, could even become clients for some of Qualcomm’s data center parts, such as its central processing unit, or CPU.
“What we have tried to do is make sure that our customers are in a position to either take all of it or say, ‘I’m going to mix and match,’” Malladi said.
The company declined to comment on the price of the chips, cards or racks, or on how many NPUs could be installed in a single rack. In May, Qualcomm announced a partnership with Saudi Arabia’s Humain to supply data centers in the region with AI inferencing chips. Humain will be a Qualcomm customer, committing to deploy as many systems as can use 200 megawatts of power.
Qualcomm said its AI chips have advantages over other accelerators in terms of power consumption, cost of ownership, and a new approach to the way memory is handled. It said its AI cards support 768 gigabytes of memory, which is higher than offerings from Nvidia and AMD.
David Sacks, White House AI and Crypto Czar, attends a meeting of the White House Task Force on Artificial Intelligence (AI) Education in the East Room at the White House in Washington, D.C., U.S., September 4, 2025.
“The U.S. has at least 5 major frontier model companies. If one fails, others will take its place,” Sacks wrote in a post on X.
Sacks’ comments came after OpenAI CFO Sarah Friar said Wednesday that the startup wants to establish an ecosystem of private equity, banks and a federal “backstop” or “guarantee” that could help the company finance its infrastructure investments.
She softened her stance later in a LinkedIn post and said OpenAI is not seeking a government backstop for its infrastructure commitments. She said her use of the word “backstop” clouded her point.
“As the full clip of my answer shows, I was making the point that American strength in technology will come from building real industrial capacity which requires the private sector and government playing their part,” Friar wrote.
The White House did not immediately respond to CNBC’s request for comment. OpenAI directed CNBC to Friar’s LinkedIn post.
Sacks said the Trump Administration does want to make permitting and power generation easier, and that the goal is to facilitate rapid infrastructure buildouts without raising residential electricity rates.
“To give benefit of the doubt, I don’t think anyone was actually asking for a bailout. (That would be ridiculous.),” he wrote.
Mustafa Suleyman, CEO of Microsoft AI and then CEO and co-founder of Inflection AI, speaks during the Axios BFD event in New York on Oct. 12, 2023.
Microsoft on Thursday said it’s forming a team that will be tasked with performing advanced artificial intelligence research.
Mustafa Suleyman, CEO of the Microsoft AI group that includes Bing and the Copilot assistant, announced the formation of the MAI Superintelligence Team, and said in a blog post that he’ll be leading it.
“We are doing this to solve real concrete problems and do it in such a way that it remains grounded and controllable,” Suleyman wrote. “We are not building an ill-defined and ethereal superintelligence; we are building a practical technology explicitly designed only to serve humanity.”
The decision comes months after Facebook parent Meta spent billions to hire talent for its new Meta Superintelligence Labs unit that’s working on research and products. The term superintelligence typically refers to machines deemed more intelligent than the smartest people.
Suleyman was a co-founder of AI lab DeepMind, which Google bought in 2014. After leaving Google in 2022, he co-founded and led AI startup Inflection. Microsoft hired Suleyman and several other Inflection employees last year.
Top technology companies have rushed to hire leading AI engineers and researchers, augmenting their products with generative AI capabilities. The boom started with OpenAI’s launch of ChatGPT in 2022.
Microsoft uses OpenAI models in Bing and Copilot, while OpenAI runs workloads in Microsoft’s Azure cloud. Microsoft also owns a $135 billion equity stake in OpenAI following a restructuring.
Microsoft has taken steps to reduce its dependence on OpenAI. After the Inflection deal, the software company also began drawing on models from Google and from Anthropic, which was founded by former OpenAI executives.
The new Microsoft AI research group will focus on building useful AI companions that can help people in education and other domains, Suleyman wrote in his blog post. It will also pursue narrow areas such as medicine and renewable energy production.
“We’ll have expert level performance at the full range of diagnostics, alongside highly capable planning and prediction in operational clinical settings,” Suleyman wrote.
As investors and analysts are increasingly voicing their concerns about overspending on AI without a clear path to profits, Suleyman said he wants “to make clear that we are not building a superintelligence at any cost, with no limits.”
Doordash’s stock plummeted toward its worst session ever as investors rejected the company’s aggressive spending strategy.
The food delivery platform said it plans to shell out “several hundred million dollars” next year on new product initiatives like autonomous delivery and a new global tech stack.
These plans will improve its product globally, but involve “direct and opportunity costs” in the short run, Doordash said.
CEO Tony Xu defended the company’s spending decisions during the earnings call with analysts and said Doordash is running the business as it always has — to solve problems for customers in the highest quality ways.
“Our track record in investing in the areas that we currently have operating … have suggested that we’ve had some success in repeating this playbook, and we’re doing this now for future growth,” he said.
In recent months, Doordash has spent big money to open new markets and boost optionality for customers as it battles industry competitors such as Uber, and as worries mount about a slowdown in consumer discretionary spending.
This year, the California-based company purchased restaurant booking platform SevenRooms for $1.2 billion and acquired British food delivery firm Deliveroo in a deal worth $3.9 billion. Doordash also launched an autonomous delivery robot known as Dot in September and new DashMart fulfillment services for retailers.
The length and breadth of these investments will remain a key issue for the company’s shares, wrote Wells Fargo analyst Ken Gawrelski.
“In our view, this is one of the best operational management teams in the sector and longer duration investors are likely to remain supportive through this period,” he wrote. “However, given inconsistent disclosure, we believe patience may be required.”
Doordash’s third-quarter profit totaled 55 cents per share, falling short of the 69 cents per share forecast by LSEG. Revenue grew 27% from a year ago to $3.45 billion, above Wall Street’s $3.36 billion estimate.
The company expects fourth-quarter adjusted EBITDA between $710 million and $810 million, with a midpoint of $760 million. Analysts polled by FactSet expected $806.8 million.
Doordash expects Deliveroo to add $45 million to adjusted EBITDA in the fourth quarter and about $200 million in 2026.