Developers in the world of artificial intelligence can’t get enough of Nvidia’s processors. Demand is so strong that the company said late Wednesday that revenue in the current quarter will jump 170% to roughly $16 billion.
Nvidia shares rose more than 2% on Thursday before slumping towards the end of the day to finish flat and miss a record close, while the broader market had a rough day.
There’s a flipside to the story. AMD, Nvidia’s main rival in the market for graphics processing units (GPUs), is falling further behind, while chip giant Intel continues to miss out on the hottest trend in technology.
Shares of AMD and Intel fell 7% and 4%, respectively, following Nvidia’s fiscal second-quarter earnings announcement.
Nvidia’s blowout report, along with comments from executives suggesting that demand will remain high through next year, is giving investors reason to ask whether the company has any serious competition when it comes to making the kind of GPUs needed to build and run large AI models.
Nvidia’s success also signals a shift in the market for data center chips. The most important — and generally most expensive — part of a data center buildout is no longer tied to central processors, or CPUs, made by Intel or AMD. Rather, it’s the AI-accelerating GPUs that big cloud companies are buying.
Alphabet, Amazon, Meta and Microsoft are snapping up Nvidia’s next-generation processors, which are so profitable that the company’s adjusted gross margin increased 25.3 percentage points to 71.2% in the period.
“NVDA Data Center revenues are now expected to be more than double INTC+AMD Data Center revenues combined, underscoring the growing importance of accelerators for today’s Data Center customers,” Deutsche Bank analyst Ross Seymore wrote in a note on Thursday.
Nvidia is now expected to post $12 billion in data center sales in the current quarter, according to FactSet data. Intel’s data center group is expected to post $4 billion in revenue, while analysts project AMD’s division will generate sales of $1.64 billion.
AMD and Intel are trying to stay relevant in the AI market, but it’s a struggle.
Intel CEO Pat Gelsinger said on the chipmaker’s earnings call in July that the company still sees “persistent weakness” in all segments of its business through year-end and that cloud companies were focusing more on securing graphics processors for AI than on Intel’s central processors. Intel’s next high-end data center GPU, called Falcon Shores, is expected to be released in 2025; the chip it had planned for 2023 was cancelled.
AMD said on Thursday it acquired a French AI software firm called Mipsology. The company is also working on its own software suite for AI developers called ROCm to compete with Nvidia’s CUDA offering.
Like Intel, AMD faces a timing challenge. Earlier this year, it announced a new flagship AI chip, the MI300. But it’s currently only being shipped in small quantities, a process called “sampling.” The chip will hit the market next year.
“There is no meaningful competition for Nvidia’s high-performance GPUs until AMD starts shipping its new AI accelerators in high volumes in early 2024,” said Raj Joshi, senior vice president at Moody’s Investors Services, in an email.
The window is closing. While AMD and Intel are developing AI technology, they may find that all their big prospective customers have filled up on Nvidia chips before they can start shipping in large quantities.
“AI spending will be a material driver for several companies in our coverage,” Morgan Stanley analyst Joseph Moore wrote in a report. Moore cited AMD, Marvell and Intel as “having strong AI prospects.”
“But for those companies,” he wrote, “AI strength is going to be offset by a crowding out of the budget.”
Cloud computing startup Lambda announced on Monday a multibillion-dollar deal with Microsoft for artificial intelligence infrastructure powered by tens of thousands of Nvidia chips.
The agreement comes as Lambda benefits from surging consumer demand for AI-powered services, including AI chatbots and assistants, CEO Stephen Balaban told CNBC’s “Money Movers” on Monday.
“We’re in the middle of probably the largest technology buildout that we’ve ever seen,” Balaban said. “The industry is going really well right now, and there’s just a lot of people who are using ChatGPT and Claude and the different AI services that are out there.”
Balaban said the partnership will continue the two companies’ long-term relationship, which goes back to 2018.
A specific dollar amount was not disclosed in the deal announcement.
Founded in 2012, Lambda provides cloud services and software for training and deploying AI models, serving more than 200,000 developers, and also rents out servers powered by Nvidia’s graphics processing units.
The new infrastructure for Microsoft will include Nvidia GB300 NVL72 systems, which are also deployed by hyperscaler CoreWeave, according to a release.
“We love Nvidia’s product,” Balaban said. “They have the best accelerator product on the market.”
The company operates dozens of data centers and plans to continue leasing data centers while also constructing its own infrastructure, Balaban said.
Earlier in October, Lambda announced plans to open an AI factory in Kansas City in 2026. The site is expected to launch with 24 megawatts of capacity, with the potential to scale to more than 100 megawatts.
Tesla has been ordered to provide records to U.S. federal auto safety regulators as part of a sweeping investigation into possible safety defects in the company’s flush-mounted, retractable door handles, which can leave people trapped inside vehicles.
Owners said they were unable to enter or exit their cars due to battery power loss and other conditions that impeded normal use of the door handles.
In some cases, owners’ children were trapped inside hot vehicles, requiring first responder interventions or breaking windows to open the doors.
NHTSA’s Office of Defects Investigation said it had “received 16 reports of exterior door handles becoming inoperative due to low 12VDC battery voltage in certain MY 2021 Tesla Model Y vehicles,” as of Oct. 27, 2025.
The agency began the electronic door handles investigation into Tesla following a Bloomberg report bringing incidents to light. The news agency reported that people were injured or died after becoming trapped in Tesla vehicles after collisions or battery power losses that prevented doors from opening normally.
Tesla design chief Franz von Holzhausen has said in subsequent press interviews that the company will change the design of its door handles.
Tesla competitors, including Rivian, are also reconsidering flush-mounted, or retractable door handle designs.
Volkswagen CEO Thomas Schäfer recently said his company’s customers don’t even want flush-mounted, electronic door handles, and that VW has no plans to adopt them.
Meanwhile, China is expected to implement new vehicle safety standards for door handles, including a requirement for more clearly marked, accessible and easier-to-use emergency interior door-release mechanisms.
China’s Ministry of Industry and Information Technology has released draft standards and comments are open through November 22.
The NHTSA Tesla probe seeks records concerning all “2021 Tesla Model Y vehicles manufactured for sale or lease in the United States,” as well as “peer vehicles,” including Tesla Model 3 and Model Y vehicles from model years 2017 to 2022, and “systems related to opening doors, including door handles, door latches, 12VDC batteries, software” and other components.
Tesla has until Dec. 10 to provide the records.
While Tesla can seek an extension on the deadline from NHTSA, it may face fines of “$27,874 per violation per day, with a maximum of $139,356,994” if the company either fails to or refuses to “respond completely, accurately, or in a timely manner” to NHTSA’s information requests, the agency cautioned in its letter.
Amazon went from “Magnificent Seven” zero to hero in a matter of days. First, it was blowout earnings on Thursday night, followed by a 9.6% stock surge the next day. Then, on Monday, it was a big cloud deal with OpenAI, and the stock soared another 4.5%. Amazon came into last week’s earnings print as the worst-performing Mag 7 stock in 2025. Now, it is up more than 16% year to date and hit another all-time high Monday.

“Amazon just completely refuted” concerns about slowing growth in its cloud unit, Amazon Web Services, Jim Cramer said Monday on “Squawk on the Street,” shortly after it was announced that Amazon secured a $38 billion commitment from OpenAI to use AWS cloud infrastructure for additional computing power. The partnership signals that the ChatGPT creator is no longer relying solely on Microsoft’s cloud service, Azure.

Under the deal, OpenAI will immediately begin running workloads on AWS, tapping hundreds of thousands of Nvidia graphics processing units (GPUs) across U.S. data centers. It will begin with existing capacity, then expand over time, with AWS planning to build out new infrastructure specifically for OpenAI.

“This deal is very exciting,” Jim said, adding that AWS is “no longer a pitiful helpless giant versus everybody else.” AWS growth accelerated to 20% in the latest quarter from 17.5% in the prior period. “Let’s just take that growth rate even more,” predicted Jim, who has been saying for weeks that he believed in Amazon CEO Andy Jassy and was standing by the stock.

Amazon emphasized its own chips, but “a lot of it is because they have so much Nvidia compute,” Jim said Monday during the Morning Meeting for Club members. AWS already owns a huge number of Nvidia chips, the specialized processors that power AI models like ChatGPT. These chips, known as GPUs, are what make it possible to train and run massive AI systems quickly.

Normally, we would trim Amazon on such a two-session rally.
But this AI boom is different, as Jim wrote in his Sunday column. The Mag 7 can’t be thought of as a cohort. We must treat each one as its own story. Jim said the “Mag 7 is too much of the market, get out” thesis is a money-losing, false narrative, which was certainly playing out Monday.

While the AWS deal is smaller (as if $38 billion is small in this age of AI spending), it underscores that OpenAI is heading toward a multi-cloud future. Until January, Microsoft was OpenAI’s exclusive cloud partner. That later shifted to a right of first refusal, which expired last week. Microsoft reaffirmed its role in the newly recapitalized OpenAI with a $250 billion commitment from OpenAI to keep scaling on Azure. OpenAI has also signed cloud deals with Google and Oracle.

The timing of the AWS deal comes as Amazon doubles down on expanding cloud capacity. During Amazon’s third-quarter earnings call last week, the company said it has been focused on accelerating capacity for AWS, noting that it’s on track to double its overall capacity by the end of 2027. That capacity consists of power, data centers and chips, primarily its custom Trainium silicon and Nvidia GPUs.

(Jim Cramer’s Charitable Trust is long AMZN, NVDA, MSFT. See here for a full list of the stocks.)

As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade. Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust’s portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade.

THE ABOVE INVESTING CLUB INFORMATION IS SUBJECT TO OUR TERMS AND CONDITIONS AND PRIVACY POLICY, TOGETHER WITH OUR DISCLAIMER. NO FIDUCIARY OBLIGATION OR DUTY EXISTS, OR IS CREATED, BY VIRTUE OF YOUR RECEIPT OF ANY INFORMATION PROVIDED IN CONNECTION WITH THE INVESTING CLUB. NO SPECIFIC OUTCOME OR PROFIT IS GUARANTEED.