A Samsung Electronics Co. 12-layer HBM3E, top, and other DDR modules arranged in Seoul, South Korea, on Thursday, April 4, 2024. Samsung’s profit rebounded sharply in the first quarter of 2024, reflecting a turnaround in the company’s pivotal semiconductor division and robust sales of Galaxy S24 smartphones. Photographer: SeongJoon Cho/Bloomberg via Getty Images
High-performance memory chips are likely to remain in tight supply this year, as explosive AI demand drives a shortage of these chips, according to analysts.
SK Hynix and Micron – two of the world’s largest memory chip suppliers – have sold out of high-bandwidth memory chips for 2024, and their stock for 2025 is nearly sold out as well, according to the firms.
“We expect the general memory supply to remain tight throughout 2024,” Kazunori Ito, director of equity research at Morningstar, said in a report last week.
Demand for AI chipsets has boosted the high-end memory chip market, hugely benefiting firms such as Samsung Electronics and SK Hynix, the world’s top two memory chipmakers. While SK Hynix already supplies chips to Nvidia, the chip designer is reportedly considering Samsung as a potential supplier too.
High-performance memory chips play a crucial role in training large language models (LLMs) such as OpenAI’s ChatGPT, which sent AI adoption skyrocketing. LLMs need these chips to remember details from past conversations with users, as well as their preferences, in order to generate human-like responses to queries.
“The manufacturing of these chips is more complex, and ramping up production has been difficult. This likely sets up shortages through the rest of 2024 and through much of 2025,” said William Bailey, director at Nasdaq IR Intelligence.
HBM’s production cycle is 1.5 to 2 months longer than that of the DDR5 memory chips commonly found in personal computers and servers, market intelligence firm TrendForce said in March.
Samsung said during its first-quarter earnings call in April that its HBM bit supply in 2024 “expanded by more than threefold versus last year.” Bit supply refers to the total amount of data, measured in bits, that the memory chips a company ships can store.
“And we have already completed discussions with our customers with that committed supply. In 2025, we will continue to expand supply by at least two times or more year on year, and we’re already in smooth talks with our customers on that supply,” Samsung said.
Micron didn’t respond to CNBC’s request for comment.
“The big buyers of AI chips – firms like Meta and Microsoft – have signaled they plan to keep pouring resources into building AI infrastructure. This means they will be buying large volumes of AI chips, including HBM, at least through 2024,” said Chris Miller, author of “Chip War,” a book on the semiconductor industry.
Chipmakers are in a fierce race to manufacture the most advanced memory chips on the market and capitalize on the AI boom.
SK Hynix said at a press conference earlier this month that it would begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter. Samsung Electronics, which was the first in the industry to ship samples of the chip, plans to do so within the second quarter.
“Currently Samsung is ahead in 12-layer HBM3E sampling process. If they can get qualification earlier than its peers, I assume it can get majority shares in end-2024 and 2025,” said SK Kim, executive director and analyst at Daiwa Securities.
Shares of grocery delivery service Instacart dropped about 7% in extended trading on Wednesday, following a report that said the U.S. Federal Trade Commission has begun an investigation into the company’s pricing practices.
The FTC sent a civil investigative demand to Instacart, Reuters reported, citing unnamed people.
A study released last week found that prices for the same products at the same supermarkets that work with Instacart can run around 7% higher on the platform, which can result in over $1,000 in extra annual costs for customers. Instacart responded by saying that retailers determine the prices listed in its app.
In 2022, Instacart spent $59 million to acquire Eversight, a company specializing in artificial intelligence-driven pricing and promotions for retailers and consumer packaged goods. Instacart sought to “create compelling savings opportunities for customers in real-time” with Eversight, according to a regulatory filing.
The FTC and Instacart did not immediately respond to requests for comment.
Jim Cramer implores Amazon not to engage in “sham-like” circular AI deals that remind him of the kind of speculation that fueled the 1990s dotcom bubble, which burst more than two decades ago.

According to multiple reports on Wednesday, Amazon is in talks about a potential $10 billion investment in OpenAI in exchange for the ChatGPT creator agreeing to use the cloud giant’s custom AI chips.

“They really need Trainium chips sold so badly that they give somebody $10 billion to buy them,” Jim said during the Club’s Morning Meeting on Wednesday. “I would love to see them not play this game.”

“I really respect Amazon, and this shocks me that they’re willing to put up with this,” Jim said on “Squawk on the Street” earlier Wednesday. “You can’t do these deals. These deals are not real.”

Over the past several years, many investors have been sounding the alarm over the growing levels of AI-related spending by megacap hyperscalers competing in the so-called AI arms race. The push for AI requires the buildout of data centers and high-performance chips to run the systems. Jim said the current spate of interconnected investment activity is similar to deals struck in the lead-up to the year 2000.

“The market is not going to let this happen,” Jim predicted, calling the stock market a “cruel taskmaster,” in a stark warning about the excess that drove the tech-heavy Nasdaq to a then-record high in March 2000 and the 78% crash over the 2½ years that followed.

OpenAI has been on a deal spree in 2025, securing massive amounts of computing power from firms including Nvidia, Advanced Micro Devices, Oracle and Amazon’s cloud unit. That has amounted to roughly $1.4 trillion in infrastructure commitments by the AI startup in recent months. Jim recently referred to OpenAI’s deal activity as “2000 in a nutshell,” as it continues to make aggressive, leveraged bets, raising concerns about an AI bubble.

(Jim Cramer’s Charitable Trust is long AMZN, NVDA. See here for a full list of the stocks.)
Rohit Prasad, Senior VP & Head Scientist for Alexa, Amazon, on Centre Stage during day one of Web Summit 2022 at the Altice Arena in Lisbon, Portugal.
Ben McShane | Sportsfile | Getty Images
Rohit Prasad, a top Amazon executive who oversees its artificial general intelligence unit, is leaving at the end of this year, the company confirmed Wednesday.
As part of the move, Amazon CEO Andy Jassy said the company is reorganizing the AGI unit under a more expansive division that will also include its silicon development and quantum computing teams. The new division will be led by Peter DeSantis, a 27-year veteran of Amazon who currently serves as a senior vice president in its cloud unit.