Jensen Huang, chief executive officer of Nvidia Corp., speaks to members of the media prior to the keynote address at the Nvidia AI summit in Washington, DC, US, on Tuesday, Oct. 28, 2025.
Kent Nishimura | Bloomberg | Getty Images
Nvidia CEO Jensen Huang said at the company’s GTC conference on Tuesday that its Blackwell graphics processing units — the company’s fastest AI chips — are now in full production in Arizona.
Previously, Nvidia’s fastest GPUs were solely manufactured in Taiwan.
Huang said that President Donald Trump had asked him nine months ago to bring manufacturing back to U.S. shores.
“The first thing that President Trump asked me for is bring manufacturing back,” Huang said. “Bring manufacturing back because it’s necessary for national security. Bring manufacturing back because we want the jobs. We want that part of the economy.”
Earlier this month, Nvidia and Taiwan Semiconductor Manufacturing Company announced that the first Blackwell wafers had been produced at a facility in Phoenix, Arizona. Wafers are the base material onto which semiconductors are etched.
Nvidia said in a video that Blackwell-based systems will now be assembled in the U.S., too.
Much of what the company announced Tuesday at its conference in Washington was aimed at policymakers, meant to convince them of the essential role Nvidia plays and that restricting its exports would hurt U.S. interests.
Speaking on a panel before his keynote Tuesday, Huang said Nvidia was holding its conference in Washington to allow Trump to attend, according to CNBC’s Kristina Partsinevelos, but the president is currently traveling in Asia.
Trump said on Tuesday that he planned to meet with Huang on Wednesday, according to a Reuters report.
Demand for the company’s GPUs remains high, with 6 million Blackwell GPUs shipped in the last four quarters, Huang said Tuesday. Nvidia expects $500 billion in GPU sales between the Blackwell generation and next year’s Rubin chips combined, he added.
Cell networks ‘built on foreign technologies’
Additionally, Huang said Tuesday that Nvidia would partner with Finland-based Nokia to build gear for telecommunications, an industry that he said was worth $3 trillion. As part of the partnership, Nvidia will take a $1 billion stake in Nokia.
Huang said that Nvidia is building chips for 5G and 6G base stations because it’s important to have wireless networks based on American technology.
“Thank you for helping the United States bring telecommunication technology back to America,” Huang said to Nokia CEO Justin Hotard during his speech.
The deal is an appeal to Western policymakers who have long had concerns about the amount of technology from China’s Huawei that is used for cellular networks around the world.
“Our fundamental communication fabric is built on foreign technologies,” Huang said. “That has to stop, and we have an opportunity to do that, especially during this fundamental platform shift.”
Nokia will use Nvidia chips in its future base stations, the pricey computers that distribute cellular signals. Gear from Huawei, the market leader, was effectively banned in the U.S. in 2018, leaving Nokia and Ericsson as the primary equipment vendors for U.S. networks.
Huang said that Nokia would be using a new product called Nvidia ARC that combines its Grace CPU, a Blackwell GPU and the company’s networking parts. He added that AI delivered over next-generation 6G networks could help operate robots and deliver more accurate weather forecasts.
Stakes are high
The location of the conference carries significance as Nvidia makes the case that it is a core part of the “U.S. technology stack.”
Huang has argued that it would be better for American interests if Chinese AI developers got used to U.S. technology like Nvidia’s chips, rather than forcing the Chinese to develop their own AI chips.
“Nvidia is a proud American company building the U.S. AI infrastructure that will ensure our country leads the world in shaping the future of innovation,” Kari Briski, Nvidia’s vice president of generative AI software for enterprise, told reporters on a Monday call.
The stakes are high for Nvidia. U.S. export restrictions have already cost Nvidia billions of dollars in lost sales.
In April, the U.S. government informed Nvidia that its H20 chip, which was specially designed to comply with U.S. export controls, would require a license to ship to China. In May, Nvidia said it would have recorded about $10.5 billion in H20 sales over two quarters if the government hadn’t imposed the license requirement.
Then, in July, Huang visited Trump in Washington and again tried to persuade him and other administration officials that it is in U.S. interests to ship Nvidia chips to China. The Trump administration said it would approve license requests for the H20, but that Nvidia would have to pay the U.S. government 15% of China sales.
Still, Nvidia’s China business isn’t yet back on track.
Earlier this month, Huang said at a financial conference that Nvidia is currently “100% out of China” and has no market share there. While Nvidia said it would receive licenses for the H20 chip, the company hasn’t revealed a newer chip for China based on its current generation of Blackwell GPUs.
Quantum computing
Many of Nvidia’s announcements on Tuesday were partnerships intended to signal that the company works with a variety of U.S. companies.
Among those announcements was NVQLink, a new way to connect quantum chips to Nvidia’s GPUs.
A U.S. lead in quantum computing is important to policymakers because military officials worry that a foreign adversary could spy on military communications if it gets a working quantum computer first.
Nvidia officials said in a Monday call that the company’s chips can be used to correct errors that pop up during quantum computing and advance the technology. Nvidia said that 17 different quantum computing startups would produce hardware compatible with NVQLink.
“Researchers will be able to do more than just error correction,” Huang said Tuesday. “They will also be able to orchestrate quantum devices and AI supercomputers to run quantum GPU applications.”
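What that error-correction step involves, loosely: a quantum processor streams measurement results, called syndromes, to a classical computer, which must decode them quickly enough to catch errors before they accumulate. The sketch below is a generic, textbook-style illustration of that idea using a toy three-bit repetition code and majority-vote decoding; it is not based on NVQLink or any Nvidia software, and all names in it are illustrative only.

```python
# Toy illustration of classical error-correction decoding (a generic
# textbook example; nothing here reflects NVQLink or Nvidia's APIs).
import random

def encode(bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit, bit, bit]

def apply_noise(codeword, flip_prob=0.1):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return int(sum(codeword) >= 2)

if __name__ == "__main__":
    random.seed(0)
    trials = 100_000
    failures = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
    # With flip_prob = 0.1, theory predicts a logical error rate of about
    # 3p^2 - 2p^3, roughly 2.8%, well below the 10% physical error rate.
    print(f"logical error rate: {failures / trials:.4f}")
```

Real quantum codes are far more elaborate, but the decoding workload is the kind of classical computation Nvidia says its GPUs can take on.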
Nvidia also said it will partner with the Department of Energy to build seven new supercomputers.
OpenAI CEO Sam Altman (L) speaks with Microsoft Chief Technology Officer and Executive VP of Artificial Intelligence Kevin Scott during the Microsoft Build conference at Microsoft headquarters in Redmond, Washington, on May 21, 2024.
Jason Redmond | AFP | Getty Images
Investors can’t get enough of artificial intelligence, despite worries over the sector’s excessively high valuations.
Both Apple and Microsoft topped a market capitalization of $4 trillion after their shares rose. It was the first time Apple hit that milestone, though the stock closed just shy of that level.
Tech companies can’t get enough of each other, either.
Nvidia announced a $1 billion investment in Nokia, which the Finnish company said will go toward developing its AI plans. For those, like me, who remember Nokia as a company that made the most desirable and bulletproof phones: it primarily produces cellular equipment now.
Meanwhile, with its 27% stake in OpenAI’s for-profit business, Microsoft is potentially sitting on a goldmine, provided AI finds its footing as a sustainable, revenue-generating business in the long run. OpenAI on Tuesday announced it had completed its restructuring, with its nonprofit retaining a controlling stake in the for-profit arm.
It’s not just Microsoft. Investors who have poured money into tech could gain big. As Cathie Wood of Ark Invest says, “If our expectations for AI … are correct, we are at the very beginning of a technology revolution.”
And finally…
Jerome Powell, chairman of the US Federal Reserve, during the International Monetary Fund (IMF) and World Bank Fall meetings at the IMF headquarters in Washington, DC, US, on Thursday, Oct. 16, 2025.
Markets are assigning a nearly 100% probability that the Federal Open Market Committee will approve a second consecutive quarter percentage point, or 25 basis point, reduction in the federal funds rate. The overnight lending benchmark is currently targeted in a range of 4% to 4.25%.
Beyond that, policymakers are likely to debate, among other things, the future path of reductions, the challenges posed by a lack of economic data and the timetable for ending the reduction in the Fed’s asset portfolio of Treasurys and mortgage-backed securities.
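For reference, the arithmetic behind that expected move is simple; the sketch below applies a 25 basis point cut to the target range cited above.

```python
# A 25 basis point cut applied to the current 4%-4.25% federal funds
# target range (figures as cited above; 1 basis point = 0.01 percentage point).
cut_bps = 25
current_range = (4.00, 4.25)  # percent
new_range = tuple(round(r - cut_bps / 100, 2) for r in current_range)
print(f"New target range: {new_range[0]:.2f}%-{new_range[1]:.2f}%")
# -> New target range: 3.75%-4.00%
```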
A man walks past a logo of SK Hynix at the lobby of the company’s Bundang office in Seongnam on January 29, 2021.
Jung Yeon-Je | AFP | Getty Images
South Korea’s SK Hynix on Wednesday posted record quarterly revenue and profit, boosted by strong demand for its high-bandwidth memory used in generative AI chipsets.
Here are SK Hynix’s third-quarter results versus LSEG SmartEstimates, which are weighted toward forecasts from analysts who are more consistently accurate:
Revenue: 24.45 trillion won ($17.13 billion) vs. 24.73 trillion won
Operating profit: 11.38 trillion won vs. 11.39 trillion won
Revenue rose about 39% in the September quarter compared with the same period a year earlier, while operating profit surged 62% year on year.
On a quarter-on-quarter basis, revenue was up 10%, while operating profit grew 24%.
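As a rough consistency check, the prior-period figures implied by those growth rates can be backed out from the reported quarter. The sketch below does that arithmetic; the derived numbers are approximate, since the percentages are rounded, and they are not taken from SK Hynix’s filings.

```python
# Back out the implied prior-period figures from this quarter's results
# and the growth rates reported above (approximate; rates are rounded).
revenue_q3 = 24.45      # trillion won, September quarter
op_profit_q3 = 11.38    # trillion won

revenue_year_ago = revenue_q3 / 1.39      # ~39% year-on-year growth
op_profit_year_ago = op_profit_q3 / 1.62  # ~62% year-on-year growth
revenue_prior_q = revenue_q3 / 1.10       # ~10% quarter-on-quarter growth
op_profit_prior_q = op_profit_q3 / 1.24   # ~24% quarter-on-quarter growth

print(f"Implied year-ago revenue:          {revenue_year_ago:.2f} trillion won")
print(f"Implied year-ago operating profit: {op_profit_year_ago:.2f} trillion won")
print(f"Implied prior-quarter revenue:     {revenue_prior_q:.2f} trillion won")
print(f"Implied prior-quarter op profit:   {op_profit_prior_q:.2f} trillion won")
```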
SK Hynix makes memory chips that are used to store data and can be found in everything from servers to consumer devices such as smartphones and laptops.
The company has benefited from a boom in artificial intelligence as a key supplier of high-bandwidth memory, or HBM, chips used to power AI data center servers.
“As demand across the memory segment has soared due to customers’ expanding investments in AI infrastructure, SK Hynix once again surpassed the record-high performance of the previous quarter due to increased sales of high value-added products,” SK Hynix said in its earnings release.
HBM falls into the broader category of dynamic random access memory, or DRAM — a type of semiconductor memory used to store data and program code that can be found in PCs, workstations and servers.
SK Hynix has set itself apart in the DRAM market by getting an early lead in HBM and establishing itself as the main supplier to the world’s leading AI chip designer, Nvidia.
However, its main competitors, U.S.-based Micron and South Korea-based tech giant Samsung, have been working to catch up in the space.
“With the innovation of AI technology, the memory market has shifted to a new paradigm and demand has begun to spread to all product areas,” SK Hynix Chief Financial Officer Kim Woohyun said in the earnings release.
“We will continue to strengthen our AI memory leadership by responding to customer demand through market-leading products and differentiated technological capabilities,” he added.
MS Hwang, research director at Counterpoint Research, told CNBC that the HBM market is expected to keep booming over the next few years, reaching around $43 billion by 2027 and giving strong earnings leverage to memory manufacturers such as SK Hynix.
“[F]or SK Hynix to continue generating profits, it’ll be important for the company to maintain and enhance its competitive edge,” he added.
A report from Counterpoint Research earlier this month showed that SK Hynix held a leading 38% share of the DRAM market by revenue in the second quarter of the year, extending its gains after overtaking Samsung in the first quarter.
The report added that the global HBM market grew 178% year over year in the second quarter, and SK Hynix dominated the space with a 64% share.
Celestica CEO Rob Mionis explained how his company designs and manufactures infrastructure that enables artificial intelligence in a Tuesday interview with CNBC’s Jim Cramer.
“If AI is a speeding freight train, we’re laying the tracks ahead of the freight train,” Mionis said.
He pushed back against the notion that the AI boom is a bubble, saying that the technology has gone from a “nice to have” to a “must have.”
Celestica reported earnings Monday after close, managing to beat estimates and raise its full-year outlook. The stock hit a 52-week high during Tuesday’s session and closed up more than 8%. Celestica has had a huge run over the past several months, and shares are currently up 253.68% year-to-date.
Mionis described some of Celestica’s business strategies, including how the Canadian outfit chose to move away from commodity markets and into design and manufacturing. He told Cramer that choice “has paid off in spades” for his company.
Celestica’s focus on design and manufacturing enables the company to “consistently execute at scale,” he added.
He detailed Celestica’s data center work, saying the company makes high-speed networking and storage systems for hyperscalers, digital native companies and other enterprise names.
Mionis praised the company’s partnership with semiconductor maker Broadcom, saying Celestica uses Broadcom’s silicon in a lot of its designs.
“What it means for us is when they launch a new piece of silicon — so the Tomahawk 6 is their 1.6 terabyte silicon — when they launch that into the marketplace, they’ll work with us to develop products, and those products end up in the major hyperscalers.”
Disclaimer: The CNBC Investing Club Charitable Trust owns shares of Broadcom.