Nvidia stock surged close to a $1 trillion market cap in after-hours trading Wednesday after it reported a shockingly strong forward outlook and CEO Jensen Huang said the company was going to have a “giant record year.”
Sales are up because of spiking demand for the graphics processors (GPUs) that Nvidia makes, which power AI applications like those at Google, Microsoft, and OpenAI.
Demand for AI chips in datacenters spurred Nvidia to guide to $11 billion in sales during the current quarter, blowing away analyst estimates of $7.15 billion.
“The flashpoint was generative AI,” Huang said in an interview with CNBC. “We know that CPU scaling has slowed, we know that accelerated computing is the path forward, and then the killer app showed up.”
Nvidia believes it’s riding a distinct shift in how computers are built that could result in even more growth — parts for data centers could even become a $1 trillion market, Huang says.
Historically, the most important part in a computer or server had been the central processor, or CPU. That market was dominated by Intel, with AMD as its chief rival.
With the advent of AI applications that require a lot of computing power, the GPU is taking center stage, and the most advanced systems are using as many as eight GPUs to one CPU. Nvidia currently dominates the market for AI GPUs.
“The data center of the past, which was largely CPUs for file retrieval, is going to be, in the future, generative data,” Huang said. “Instead of retrieving data, you’re going to retrieve some data, but you’ve got to generate most of the data using AI.”
“So instead of millions of CPUs, you’ll have a lot fewer CPUs, but they will be connected to millions of GPUs,” Huang continued.
For example, Nvidia’s own DGX systems, which are essentially an AI computer for training in one box, use eight of Nvidia’s high-end H100 GPUs, and only two CPUs.
Google’s A3 supercomputer pairs eight H100 GPUs alongside a single high-end Xeon processor made by Intel.
That’s one reason why Nvidia’s data center business grew 14% during the first calendar quarter versus flat growth for AMD’s data center unit and a decline of 39% in Intel’s AI and Data Center business unit.
Plus, Nvidia’s GPUs tend to be more expensive than many central processors. Intel’s most recent generation of Xeon CPUs can cost as much as $17,000 at list price. A single Nvidia H100 can sell for $40,000 on the secondary market.
Nvidia will face increased competition as the market for AI chips heats up. AMD has a competitive GPU business, especially in gaming, and Intel has its own line of GPUs as well. Startups are building new kinds of chips specifically for AI, and mobile-focused companies like Qualcomm and Apple keep pushing the technology so that one day it might be able to run in your pocket, not in a giant server farm. Google and Amazon are designing their own AI chips.
But Nvidia’s high-end GPUs remain the chip of choice for current companies building applications like ChatGPT, which are expensive to train by processing terabytes of data, and are expensive to run later in a process called “inference,” which uses the model to generate text, images, or make predictions.
Analysts say that Nvidia remains in the lead for AI chips because of its proprietary software that makes it easier to use all of the GPU hardware features for AI applications.
Huang said on Wednesday that the company’s software would not be easy to replicate.
“You have to engineer all of the software and all of the libraries and all of the algorithms, integrate them into and optimize the frameworks, and optimize it for the architecture, not just one chip but the architecture of an entire data center,” Huang said on a call with analysts.
Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.
Nvidia’s GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.
Amazon considered building data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn’t have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.
“They would take up too much data center floor space or increase water usage substantially,” Brown said. “And while some of these solutions could work for lower volumes at other providers, there simply wouldn’t be enough liquid-cooling capacity to support our scale.”
Instead, Amazon engineers devised the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.
Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia’s design for dense computing power. Nvidia’s GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.
Computing clusters based on Nvidia’s GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world’s largest supplier of cloud infrastructure.
Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company’s bottom line. In the first quarter, AWS delivered the widest operating margin since at least 2014, and the unit is responsible for most of Amazon’s net income.
Microsoft, the second-largest cloud provider, has followed Amazon’s lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.
Bitcoin hit a fresh record on Wednesday afternoon as an Nvidia-led rally in equities helped push the price of the cryptocurrency higher into the stock market close.
The price of bitcoin was last up 1.9%, trading at $110,947.49, according to Coin Metrics. Just before 4:00 p.m. ET, it hit a high of $112,052.24, surpassing its May 22 record of $111,999.
The flagship cryptocurrency has been trading in a tight range for several weeks despite billions of dollars flowing into bitcoin exchange traded funds. Bitcoin purchases by public companies outpaced ETF inflows in the second quarter. Still, bitcoin is up just 2% in the past month.
On Wednesday, tech stocks rallied as Nvidia became the first company to briefly touch $4 trillion in market capitalization. In the same session, investors appeared to shrug off the latest tariff developments from President Donald Trump. The tech-heavy Nasdaq Composite notched a record close.
While institutions broadly have embraced bitcoin’s “digital gold” narrative, it is still a risk asset that rises and falls alongside stocks depending on what’s driving investor sentiment. When the market is in risk-on mode and investors buy growth-oriented assets like tech stocks, bitcoin and crypto tend to rally with them.
Investors have been expecting bitcoin to reach new records in the second half of the year as corporate treasuries accelerate their bitcoin buying sprees and Congress gets closer to passing crypto legislation.
Perplexity AI on Wednesday launched a new artificial intelligence-powered web browser called Comet in the startup’s latest effort to compete in the consumer internet market against companies like Google and Microsoft.
Comet will allow users to connect with enterprise applications like Slack and ask complex questions via voice and text, according to a brief demo video Perplexity released on Wednesday.
The browser is available to Perplexity Max subscribers, and the company said invite-only access will roll out to a waitlist over the summer. Perplexity Max costs users $200 per month.
“We built Comet to let the internet do what it has been begging to do: to amplify our intelligence,” Perplexity wrote in a blog post on Wednesday.
Perplexity is best known for its AI-powered search engine that gives users simple answers to questions and links out to the original source material on the web. After the company was accused of plagiarizing content from media outlets, it launched a revenue-sharing model with publishers last year.
In May, Perplexity was in late-stage talks to raise $500 million at a $14 billion valuation, a source familiar with the matter confirmed to CNBC. The startup was also approached by Meta earlier this year about a potential acquisition, but the companies did not finalize a deal.
“We will continue to launch new features and functionality for Comet, improve experiences based on your feedback, and focus relentlessly–as we always have–on building accurate and trustworthy AI that fuels human curiosity,” Perplexity said Wednesday.