After a quarter where Nvidia’s sales nearly doubled, investors and analysts are wondering how long the chipmaker can keep this kind of growth going now that it has a $140 billion annual revenue run rate.
Those hopes rest on Blackwell, Nvidia’s name for a family of server products built around its next-generation AI chip.
CEO Jensen Huang and CFO Colette Kress gave investors several new data points on how Blackwell’s launch is shaping up on a call with analysts on Wednesday. The duo emphasized that the rollout is on track, and they signaled that Blackwell sales over the next few quarters will be limited by how many chips and systems Nvidia can make, not how much it can sell.
“Blackwell production is in full steam,” Huang said. “We will deliver this quarter more Blackwells than we had previously estimated.”
The company’s positive comments on Blackwell are one reason the stock is down only 1%, even though Nvidia missed the elevated expectations of bullish investors who had anticipated it would significantly exceed its own forecasts.
Huang and Kress’s comments also addressed fears about shipment delays that were spurred by reports that said Nvidia was making ongoing engineering changes to its systems to address problems.
Some of Nvidia’s most important end customers have already received Blackwell chips, the company confirmed on Wednesday. Microsoft, Oracle and OpenAI have posted pictures of Blackwell-based server racks on their social media accounts, and the company said 13,000 Blackwell chips have already been shipped to customers.
“There’s still a lot of engineering that happens at this point,” Huang said. “But as you see from all of the systems that are being stood up, Blackwell is in great shape.”
Those sample chips aren’t the bulk of the shipments that the company is expecting to make. They’re early versions intended to allow customers to start testing and get their systems and software ready for the volume shipments, which will start in Nvidia’s current quarter.
“We’ll ship more Blackwells next quarter than this [quarter], and we’ll ship more Blackwells the quarter after that than our first quarter,” Huang said.
In July, Nvidia said it expected “several billion dollars” of Blackwell revenue in its current quarter, and on Wednesday, the company said it expects the amount of Blackwell sales for this quarter to be higher than its original forecast. Huang also said that Microsoft will soon start to preview its Blackwell-based systems to cloud customers.
A limiting factor in producing more Blackwell systems is the volume of components that Nvidia’s suppliers can provide, Huang said. It also takes time to ramp up a manufacturing process that has gone from zero shipments to billions of dollars of shipments in a matter of months.
“It is the case that demand exceeds our supply, and that’s expected as we’re in the beginnings of this generative AI revolution,” Huang said.
“Almost every company in the world seems to be involved in our supply chain,” Huang said.
Nvidia said that Blackwell’s gross margins will be lower in the coming months than the 73.5% it reported in the third quarter, but the company said that margin will increase as the product matures. Huang pointed out that Blackwell comes as just the chip itself or in configurations that include an entire rack and other components.
Nvidia’s overall message on Wednesday was that its new Blackwell chip is in short supply because companies like OpenAI need the fastest GPUs available as quickly as possible to develop next-generation AI models. As Blackwell rolls out, Nvidia’s current AI chips, which it calls Hopper, will be relegated to serving AI models, not creating new ones. Nvidia said that Blackwell sales will eventually exceed those of Hopper.
“You see now that at the tail end of the last generation of foundation models, we’re at about 100,000 Hoppers,” Huang said. “The next generation starts at 100,000 Blackwells.”
OpenAI has signed a deal to buy $38 billion worth of capacity from Amazon Web Services, its first contract with the leader in cloud infrastructure and the latest sign that the $500 billion artificial intelligence startup is no longer reliant on Microsoft.
Under the agreement announced on Monday, OpenAI will immediately begin running workloads on AWS infrastructure, tapping hundreds of thousands of Nvidia’s graphics processing units (GPUs) in the U.S., with plans to expand capacity in the coming years.
Amazon stock climbed about 5% following the news.
The first phase of the deal will use existing AWS data centers, and Amazon will eventually build out additional infrastructure for OpenAI.
“It’s completely separate capacity that we’re putting down,” said Dave Brown, vice president of compute and machine learning services at AWS, in an interview. “Some of that capacity is already available, and OpenAI is making use of that.”
OpenAI has been on a dealmaking spree of late, announcing roughly $1.4 trillion worth of buildout agreements with companies including Nvidia, Broadcom, Oracle and Google — prompting skeptics to warn of an AI bubble and question whether the country has the power and resources needed to turn the ambitious promises into reality.
Until this year, OpenAI had an exclusive cloud agreement with Microsoft, which first backed the company in 2019 and has invested a total of $13 billion. In January, Microsoft said it would no longer be the exclusive cloud provider for OpenAI, and was moving to an arrangement where it would have right of first refusal for new requests.
Last week, Microsoft’s preferential status expired under its newly negotiated commercial terms with OpenAI, freeing the ChatGPT creator to partner more widely with the other hyperscalers. Even before that, OpenAI forged cloud deals with Oracle and Google, but AWS is by far the market leader.
“Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said in Monday’s release. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
OpenAI will still be spending heavily with Microsoft, reaffirming that commitment by saying last week that it will purchase an incremental $250 billion of Azure services.
For Amazon, the pact is significant both in the size and scale of the deal itself and because the cloud giant has close ties to OpenAI rival Anthropic. Amazon has invested billions of dollars in Anthropic, and is currently constructing an $11 billion data center campus in New Carlisle, Indiana, that’s designed exclusively for Anthropic workloads.
“The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads,” AWS CEO Matt Garman said in the release.
In its earnings report last week, Amazon reported more than 20% year-over-year revenue growth at AWS, beating analyst estimates. But growth was faster at Microsoft and Google, which reported cloud expansion of 40% and 34%, respectively.
Starting on Nvidia
The current agreement with OpenAI is explicitly for use of Nvidia chips, including two popular Blackwell models, but there’s potential to incorporate additional silicon down the road. Amazon’s custom-built Trainium chip is being used by Anthropic in the new facility.
“We like Trainium because we’re able to give customers something that gives them better price performance and honestly gives them choice,” Brown said, adding that he can’t provide any details on “anything we’ve done with OpenAI on Trainium at this point.”
The infrastructure will support both inference — such as powering ChatGPT’s real-time responses — and training of next-generation frontier models. OpenAI can expand with AWS as needed over the next seven years, but no plans beyond 2026 have been finalized.
OpenAI CEO Sam Altman (L) shakes hands with Microsoft Chief Technology Officer and Executive VP of Artificial Intelligence Kevin Scott during the Microsoft Build conference at the Seattle Convention Center Summit Building in Seattle, Washington, U.S., on May 21, 2024.
Jason Redmond | AFP | Getty Images
OpenAI’s foundation models, including so-called open-weight options, are already available on Bedrock, AWS’s managed service for accessing leading AI systems.
Companies including Peloton, Thomson Reuters, Comscore, and Triomics use OpenAI models on AWS for tasks ranging from coding and mathematical problem solving to scientific analysis and agentic workflows.
Monday’s announcement establishes a more direct relationship.
“As part of this deal, OpenAI is a customer of AWS,” Brown said. “They’ve committed to buying compute capacity from us, and we’re charging OpenAI for that capacity. It’s very, very straightforward.”
For OpenAI, the most highly valued private AI company, the AWS agreement is another step in getting ready to eventually go public. By diversifying its cloud partners and locking in long-term capacity across providers, OpenAI is signaling both independence and operational maturity.
Altman acknowledged in a recent livestream that an IPO is “the most likely path” given OpenAI’s capital needs. CFO Sarah Friar has echoed that sentiment, framing the recent corporate restructuring as a necessary step toward going public.
MongoDB CEO Dev Ittycheria arrives at the Allen & Co. Media and Technology Conference in Sun Valley, Idaho, on July 9, 2025.
David Paul Morris | Bloomberg | Getty Images
Database software maker MongoDB said on Monday that CEO Dev Ittycheria is stepping down from the top job after an 11-year run.
Chirantan “CJ” Desai, who has spent the past year as president of product and engineering at Cloudflare, is replacing Ittycheria, effective Nov. 10, MongoDB said. Ittycheria will remain on the company’s board.
“Earlier this year, I would say as part of our normal succession planning process, the board asked me about my long-term plans and whether I could commit for another five years as CEO,” Ittycheria told CNBC in an interview. “I thought long and hard about it, and I talked to my family, I talked to the board and ultimately realized I couldn’t make that kind of decision.”
Before joining MongoDB, Ittycheria was president of BMC, which bought his company BladeLogic for $854 million in 2008. As BladeLogic’s co-founder and CEO, Ittycheria took the company public in 2007. He’s also been an investor at venture firms OpenView and Greylock.
Ittycheria led MongoDB’s IPO in 2017, three years after taking the helm. The company won over individual software developers with a document-based database architecture that could store a variety of data, challenging market incumbents like Oracle.
Under Ittycheria, the company prioritized cloud subscriptions, landed multi-year deals, partnered with rival cloud providers Amazon and Microsoft and expanded the software’s capabilities into generative artificial intelligence.
MongoDB’s stock closed on Friday at $359.82, representing a fifteenfold gain since the IPO and lifting the company’s market cap to almost $30 billion. MongoDB’s net loss in the July quarter narrowed to $47 million from $54.5 million a year earlier, while revenue rose 24% to $591 million.
Cloudflare said in a filing on Thursday that Desai would step down on Nov. 7 to become CEO “at another notable, publicly-traded company.” Desai previously served as operating chief at ServiceNow, resigning in July 2024 after the software company found a policy violation related to the hiring of the U.S. Army’s chief information officer. Earlier, Desai held leadership positions at EMC and Symantec.
“We talked to people close to ServiceNow, as well as other people who know CJ really well, and we felt very, very comfortable that CJ is the right person to lead MongoDB in this next era,” Ittycheria said.
Desai, whose first job out of college was at Oracle, said he will split his time between New York and the San Francisco area.
MongoDB also said it expects to exceed the high end of its guidance ranges for revenue and adjusted earnings per share in the fiscal third quarter. The top end of those ranges was 79 cents per share in earnings and $592 million in revenue.
Desai said he’s “looking forward to grow MongoDB to $5 billion-plus in a durable, profitable way, in revenues, and most importantly, to be the gold standard for modern database technology, no matter what kind of workloads exist.” He did not offer a timeline for the revenue goal.
Executives will discuss the leadership change on a conference call with analysts at 10 a.m. ET.
Jensen Huang, CEO of Nvidia, speaks during the 2025 Asia-Pacific Economic Cooperation (APEC) CEO Summit in Gyeongju, South Korea, October 31, 2025.
Kim Soo-hyeon | Reuters
Microsoft said Monday it has secured export licenses to ship Nvidia chips to the United Arab Emirates in a move that could accelerate the Gulf’s lofty AI ambitions.
The tech giant said it is the first company under U.S. President Donald Trump‘s administration to secure such licenses from the Commerce Department and that the approval, granted in September, was based on “updated and stringent technology safeguards.”
The licenses enable the company to ship the equivalent of 60,400 additional A100 chips, in the form of Nvidia’s more advanced GB300 GPUs.
“While the chips are powerful and the numbers are large, more important is their positive impact across the UAE,” Microsoft said in a blog post. “We’re using these GPUs to provide access to advanced AI models from OpenAI, Anthropic, open-source providers, and Microsoft itself.”
Nvidia shares climbed 3% Monday. Microsoft stock rose slightly.
Azad Zangana, head of GCC macroeconomic analysis at Oxford Economics, said in a note that Nvidia’s chips are “crucial” for the UAE’s push to be a major global player in AI.
“Access to the world’s leading AI chips provides the hardware that will give developers the leading edge that is needed in an incredibly competitive global landscape,” Zangana wrote.
There is a “very important” relationship between the UAE and U.S. governments that has spanned multiple administrations, Microsoft President Brad Smith told CNBC’s Dan Murphy at the ADIPEC conference in Abu Dhabi.
“We’re very grateful to the Secretary of Commerce Howard Lutnick, and the work that he has championed to enable export licenses to be made available to us,” Smith said. “That builds as well on the relationships we had with Secretary [Marco] Rubio when he was in the Senate and Democrats as well. [It] takes two parties to govern, and we keep that in mind.”
Microsoft also announced it will increase its investment in the UAE, bringing its total contribution to $15.2 billion by the end of this decade.
That includes a $1.5 billion equity investment in AI firm G42 and more than $5.5 billion in capital expenses for the expansion of Microsoft’s AI and cloud infrastructure projects in the region.
“We’re really investing in trust, and I think it’s that combination of technology, talent and trust that you’re seeing come together here in the UAE, around AI, around technology, but really the future of the whole economy,” Smith said.