Memory chips are at the center of all devices, helping store and access data in smartphones, computers and the servers training generative artificial intelligence models.
Just three companies make more than 90% of the world’s dynamic random-access memory, or DRAM, chips. Samsung and SK Hynix are both headquartered in South Korea, leaving Idaho-based Micron as the only manufacturer in the U.S., a distinction that has made it the latest target of China’s bans on U.S. technologies.
About a quarter of Micron’s revenue comes from China, and “about half that revenue is at risk,” Micron CEO Sanjay Mehrotra told CNBC in an interview.
Meanwhile, Micron is doubling down on U.S. manufacturing. Its current leading-edge chips are made in Japan and Taiwan, but Micron is aiming to bring advanced memory production to the U.S. starting in 2026 with a new $15 billion chip fabrication plant in Boise, Idaho. Micron celebrated its 45th anniversary in October by pouring the first cement at the new fab.
The facility is located next to Micron’s huge research and development facility, where CNBC got a behind-the-scenes tour.
Micron’s existing research and development facility in Boise, Idaho, shown here on Oct. 6, 2023.
Ben Farrar
“Memory is very cost-sensitive and we have to get economies of scale to mass produce our chips on a level that meets the market demands,” said Scott Gatzemeier, Micron’s corporate vice president of front end U.S. expansion.
DRAM and NAND memory chips are a cheaper type of semiconductor than the high-powered central processing units from Intel and AMD and graphics processing units that sparked Nvidia’s growth. But multiple memory chips are needed to support each GPU or CPU, so making memory requires more fab space.
That’s why Micron is planning the biggest chip project in U.S. history, spending $100 billion over 20 years to build four 600,000-square-foot fabs in upstate New York.
Mehrotra told CNBC that Micron’s goal is to vastly increase the U.S. share of DRAM production, which he said currently sits at just 2%. That production comes from Micron’s fab in Manassas, Virginia. The company is getting assistance from the federal CHIPS and Science Act, which offers billions of dollars to incentivize domestic production.
“With Micron’s investments through CHIPS support in Boise, Idaho, as well as in Syracuse, New York, that 2% over the course of nearly 20 years will be changing to about 15% of the worldwide production coming from the U.S.,” Mehrotra said.
The U.S. share of overall chip manufacturing has plummeted from 37% to 12% in the last three decades, largely because it costs at least 20% more to build and operate a new fab in the U.S. than in Asia. Labor is also cheaper there, the supply chain is more accessible and government incentives have been far greater. That’s why the CHIPS and Science Act set aside $52.7 billion for companies that manufacture in the U.S.
Senate Majority Leader Chuck Schumer, D-N.Y., co-sponsored the bill.
“When it came to chips so essential to everything we do, we had lost that edge,” Schumer told CNBC in an interview. “And if we didn’t get back that edge, not just on chips but on science broadly, we would no longer be the No. 1 economic power in the world.”
Micron and at least 460 other companies have applied for funds from the CHIPS Act. States are also offering incentives to entice chip companies. Micron told CNBC it’s eligible for up to $5.5 billion from the state of New York for the four fabs it’s building just north of Syracuse. New York Gov. Kathy Hochul signed the state’s Green CHIPS Act into law last year.
“If they hadn’t passed the CHIPS and Science Act first, I don’t think it would have been as many incentives as necessary,” Hochul said. “I knew I had to woo them, talk about our incentives, but also we get out of it 50,000 jobs. That’s a good deal for us any day of the week.”
These promises come on the heels of a major price slump for memory chips, which led to layoffs at Micron and SK Hynix, and resulted in Samsung slashing production. Now, Micron is betting big that the memory market will grow.
“The large language learning models and other things like that continue to increase large demand,” Gatzemeier said.
“We’re now moving into things like FaceTime, higher resolution images, movies on demand,” he said. “All of that requires more and more memory to be made available.”
Micron says construction in New York will begin at the end of 2024 and chip production there will start in 2027. With both Idaho and New York fabs online, Mehrotra told CNBC that Micron plans to increase the share of chips it makes in the U.S. from 10% to nearly 60% in the next two decades.
Micron CEO Sanjay Mehrotra shows CNBC’s Katie Tarasov a 300mm silicon wafer at the memory company’s San Jose office on Oct. 2, 2023.
Kent Kessinger
‘Feast or famine’
Micron was founded in 1978 by three chip engineers, along with one of their twin brothers, in the basement of a dental office in Boise. By 1980, it was building its first fab and a year later was pumping out a revolutionarily small 64K DRAM chip. These chips, used for storing bits of data that can be quickly accessed by a CPU, ended up in many of the early PCs.
Gatzemeier, who joined as an intern in 1997, explained the two main kinds of memory: DRAM and NAND.
DRAM is “volatile memory, which means that when the power is removed, it loses all of its information. It’s very fast but has to be, and it sits near the CPU and it’s used for real-time processing,” he said. “NAND flash memory is what’s in your SSDs or your storage cards. And NAND flash is nonvolatile, meaning it’ll still store your memory even when the power’s removed.”
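The volatile-versus-nonvolatile distinction Gatzemeier describes can be illustrated in a few lines of code. In this sketch, a plain in-memory dictionary stands in for DRAM (gone when the "power" — here, the object — is removed), while a file on disk stands in for NAND-backed storage that survives. The names and file path are illustrative, not anything from Micron.

```python
import os
import tempfile

# DRAM-like: volatile — the data lives only in the process's memory.
volatile_store = {"frame": "pixel data"}

# NAND-like: nonvolatile — the data is persisted to flash-backed storage.
path = os.path.join(tempfile.gettempdir(), "nand_demo.txt")
with open(path, "w") as f:
    f.write("pixel data")

# Simulate power loss: the in-memory copy is destroyed...
del volatile_store

# ...but the persisted copy is still readable afterward.
with open(path) as f:
    print(f.read())  # prints "pixel data"
```

The analogy is loose — real DRAM loses its charge within milliseconds without refresh, and NAND endurance and wear-leveling add complications — but it captures why DRAM sits next to the CPU for speed while NAND holds data at rest.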
Micron went public in 1984. Memory was a crowded field, but over the years, it has whittled down to just three top players.
“The name of the game is high performance and low cost at the same time,” said Patrick Moorhead, CEO of Moor Insights and Strategy. “Otherwise, you’re going to be blasted out of the market.”
When it comes to the biggest type of memory, DRAM, Samsung is by far the leader, followed by SK Hynix and then Micron. Micron has made 11 acquisitions since 1998, including Texas Instruments’ memory division, Numonyx, Elpida and Inotera.
“For a very long period, they had not invested in a new fab,” said Gaurav Gupta, an analyst at Gartner. “But they were still able to retain their market share by acquiring other smaller memory firms, which were either going out of business or bankrupt.”
Unlike many kinds of chips, memory wasn’t in short supply during the chip shortage. Micron and its competitors saw a major upswing in the pandemic-fueled boom in consumer electronics. Micron’s profits then fell significantly due to weakened demand for PCs and smartphones and a chip oversupply that led to lower prices. It’s a downturn that has affected much of the chip industry.
“When I look at this market over the past 30 years, it’s always feast or famine,” Moorhead said. “We have an oversupply now. But guess what? Give it a couple of months and we will be in an undersupply and prices will go up.”
Even amid the downturn, Mehrotra is optimistic about the growth of Micron’s smartphone business. It supplies memory in phones from Apple, Motorola, Asus and more.
“The mix of smartphones is going more and more toward higher-end smartphones, toward the flagship smartphones, which require more memory as well,” Mehrotra said. “When we look ahead at 2024, we actually expect that year-over-year total worldwide smartphone unit sales will increase.”
Micron is also focused on rapid growth markets such as automotive and AI. The next generation of its most advanced product, High Bandwidth Memory, is set for volume production next year. HBM helps AI models such as ChatGPT remember past conversations and user preferences to generate more humanlike responses.
“It is able to pack 50% more memory capacity in a memory cube,” Mehrotra said. “It is able to give you 50% faster performance and is able to give you about 2.5 times better power and performance efficiency. And these are all the elements that are critically important in AI applications.”
Banned in China
Micron is facing one particularly large challenge. In May, China’s cybersecurity administration banned some of its sales to key China infrastructure projects, saying it failed a security review. Last year, the U.S. barred chip companies from supplying China with certain key technologies.
“Micron is absolutely just a pawn in this game right now,” Moorhead said. “They weren’t the first and they were not the last.”
Mehrotra offered a more diplomatic take.
“It’s very important for U.S. and China to provide an environment to the businesses so that they can invest in a predictable manner,” he said. “And what I can also tell you is that Micron, of course, is totally committed to bringing the value of its technology and products and manufacturing scale to the benefit of our customers across various end markets in China.”
“Micron is obviously trying to diversify its base,” Gartner’s Gupta said. “It has testing and packaging facilities in China. And obviously they are trying to move, diversify out of China.”
China can still rely on chips from Samsung, SK Hynix and smaller Chinese memory makers. That’s because memory is considered a commodity, meaning it’s relatively easy to switch between products from different companies. But that’s not guaranteed to last.
“When we get back to the boom days and Hynix and Samsung can’t fulfill all the volumes, you might see China diving back into Micron and suddenly lifting any restrictions,” Moorhead said.
Moorhead added that China’s cybersecurity risk accusation about Micron is “a front.”
“Compared to a CPU or a GPU system, it’s pretty hard to embed something nefarious into something like storage or memory,” he said. “That would be technology that I have never heard of.”
“We think China was being very nasty about this to Micron,” Schumer told CNBC ahead of the visit. “China’s upset with the Biden administration’s very smart prohibition of selling certain types of chip manufacturing equipment to China. But we’re going to stick up for Micron.”
This also isn’t the first time Micron has been at the center of U.S.-China tensions. In 2018, the U.S. accused Chinese chip company Fujian Jinhua of stealing intellectual property from Micron, a claim the Chinese company denied.
With no slowdown in geopolitical tension, Micron is instead focusing on U.S. expansion. Water and power were both significant reasons Micron settled on New York for its biggest project.
A rendering of the four memory chip fabs Micron plans to build north of Syracuse, New York, spending $100 billion over the next 20 years.
Micron
“Not just the Finger Lakes, but two Great Lakes: Lake Erie and Lake Ontario,” Hochul said. “There’s plentiful water and low-cost power generated primarily by hydroelectric and wind and solar. So we’re ready for it. We know it’s going to be a transition, but that’s what we want to do.”
Micron said each of its new fabs will use the equivalent of 25 Olympic-size swimming pools’ worth of water each day, with a goal of reusing or recycling 75% of that. The company will also use about as much energy as is needed to power some 25,000 homes.
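To put the pool comparison in concrete units: taking the standard Olympic pool volume of 2,500 cubic meters (an assumption on our part — Micron gives only the pool figure), the per-fab daily numbers work out roughly as follows.

```python
POOL_M3 = 2_500          # standard Olympic pool volume, cubic meters (assumed)
POOLS_PER_DAY = 25       # Micron's stated daily usage per fab

daily_use_m3 = POOLS_PER_DAY * POOL_M3           # 62,500 m^3 per day
daily_use_liters = daily_use_m3 * 1_000          # 62.5 million liters per day

recycled_liters = daily_use_liters * 0.75        # Micron's 75% reuse/recycle goal
net_draw_liters = daily_use_liters - recycled_liters

print(f"Gross use: {daily_use_liters:,.0f} L/day")
print(f"Net fresh-water draw if the goal is met: {net_draw_liters:,.0f} L/day")
```

If the 75% target is hit, the net fresh-water draw per fab would be on the order of 15–16 million liters a day — still large, which is why proximity to the Great Lakes mattered in the site selection.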
“The energy costs are, interestingly enough, lower in the United States than most parts of the world,” Moorhead said. “People are more expensive in the United States, and so is the materials and the cost to build that factory. But that gap is narrowing over time.”
“That won’t happen in New York because we already have a legacy,” Hochul said. “We have Wolfspeed, we have GlobalFoundries. So this is not a new industry to us.”
Micron runs a Chip Camp in Boise for middle schoolers, which Gatzemeier’s daughter attended over the summer, and is investing in university programs to feed the pipeline for future semiconductor engineers.
“We’re actively starting our hiring ramp now,” Gatzemeier said. “We’ve started aggressively targeting all the universities. We’re also really going to draw on the global resources that Micron has across the world and bring in some of that semiconductor expertise to help train these new team members.”
Bitcoin was far and away the best-performing asset class in 2024 as new exchange-traded funds ushered in more widespread adoption and hopes for deregulation under a new presidential administration lifted digital assets to record levels.
But owning cryptocurrency also came with its usual unpredictability and dizzying swings, as this month’s trading clearly illustrates. Bitcoin has more than doubled in price since starting the year in the $40,000 range and last traded near $95,500. Ether has scored a nearly 50% year-to-date gain and last traded around the $3,400 level.
Bitcoin and ether since the start of 2024
The most prosperous stretch of the year occurred in the weeks following the U.S. presidential election. By mid-December, the cryptocurrency had rocketed above $108,000 for the first time, fueled by optimism that President-elect Donald Trump‘s victory over Vice President Kamala Harris would open the door for greater regulatory clarity and send new money rushing into the sector.
Since then, however, prices have eased. Bitcoin is negative for the month, hurt by the expectation that the Federal Reserve’s rate cuts will roll out at a slower-than-anticipated pace. The market has also faced a stretch of apparent profit-taking and choppiness into the end of the year.
The year began with a strong boost of confidence from the introduction in January of new ETFs that hold the cryptocurrency. The funds, which are pitched by asset managers as a simpler way for investors to access bitcoin, have pulled in tens of billions of dollars of cash this year. The iShares Bitcoin Trust ETF (IBIT) now has more than $50 billion in assets.
MicroStrategy shares this year
Ether ETFs joined the excitement in July. The demand for those funds has not been as strong as for their bitcoin counterparts, but the category has still attracted more than $2 billion in net inflows in less than six months, according to FactSet.
Strong tailwinds for cryptocurrencies also lifted connected stocks to record levels. Bitcoin proxy MicroStrategy has surged 388% since the start of the year, while Coinbase and Robinhood have rallied about 47% and 200%, respectively. MicroStrategy shares have climbed further since mid-December, when the company was added to the Nasdaq 100 index.
Some mining stocks, however, haven’t performed as well, with Mara Holdings and Riot Platforms on track for double-digit year-to-date losses. The drop in mining stocks may be a direct result of this year’s bitcoin halving, which reduced the block rewards. Along with transaction fees, this is one of the most significant ways miners make money.
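The halving that squeezed miners is mechanical: the Bitcoin protocol cuts the per-block subsidy in half every 210,000 blocks. A minimal sketch of that schedule, using well-known protocol constants rather than anything from this article:

```python
def block_subsidy(height: int) -> float:
    """Bitcoin block subsidy in BTC at a given block height."""
    INITIAL_SUBSIDY = 50.0       # subsidy at the genesis block, in BTC
    HALVING_INTERVAL = 210_000   # blocks between halvings (~4 years)

    halvings = height // HALVING_INTERVAL
    if halvings >= 64:           # the subsidy rounds down to zero eventually
        return 0.0
    return INITIAL_SUBSIDY / (2 ** halvings)

print(block_subsidy(0))         # 50.0 BTC in 2009
print(block_subsidy(840_000))   # 3.125 BTC — the April 2024 halving
```

The April 2024 event was the fourth halving, dropping the reward from 6.25 to 3.125 BTC per block, which cut miners’ subsidy revenue in half overnight and helps explain the pressure on mining stocks the article notes.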
— CNBC’s Jesse Pound contributed reporting.
Hock Tan, CEO of Broadcom (L) and former CEO of Intel, Pat Gelsinger.
Reuters | CNBC
It was a big year for silicon in Silicon Valley — but a brutal one for the company most responsible for the area’s moniker.
Intel, the 56-year-old chipmaker co-founded by industry pioneers Gordon Moore and Robert Noyce and legendary investor Arthur Rock, had its worst year since going public in 1971, losing 61% of its value.
The opposite story unfolded at Broadcom, the chip conglomerate run by CEO Hock Tan and headquartered in Palo Alto, California, about 15 miles from Intel’s Santa Clara campus.
Broadcom’s stock price soared 111% in 2024 as of Monday’s close, its best performance ever. The current company is the product of a 2015 acquisition by Avago, which went public in 2009.
The driving force behind the diverging narratives was artificial intelligence. Broadcom rode the AI train, while Intel largely missed it. The changing fortunes of the two chipmakers underscores the fleeting nature of leadership in the tech industry and how a few key decisions can result in hundreds of billions — or even trillions — of dollars in market cap shifts.
Broadcom develops custom chips for Google and other huge cloud companies. It also makes essential networking gear that large server clusters need to tie thousands of AI chips together. Within AI, Broadcom has largely been overshadowed by Nvidia, whose graphics processing units, or GPUs, power most of the large language models being developed at OpenAI, Microsoft, Google and Amazon and also enable the heftiest AI workloads.
Despite having a lower profile, Broadcom’s accelerator chips, which the company calls XPUs, have become a key piece of the AI ecosystem.
“Why it’s really shooting up is because they’re talking about AI, AI, AI, AI,” Eric Ross, chief investment strategist at Cascend, told CNBC’s “Squawk Box” earlier this month.
Intel, which for decades was the dominant U.S. chipmaker, has been mostly shut out of AI. Its server chips lag far behind Nvidia’s, and the company has also lost market share to longtime rival Advanced Micro Devices while spending heavily on new factories.
Intel’s board ousted Pat Gelsinger from the CEO role on Dec. 1, after a tumultuous four-year tenure.
“I think someone more innovative might have seen the AI wave coming,” Paul Argenti, professor of management at Dartmouth’s Tuck School of Business, said in an interview on “Squawk Box” after the announcement.
An Intel spokesperson declined to comment.
Broadcom is now worth about $1.1 trillion and is the eighth U.S. tech company to cross the trillion-dollar mark. It’s the second most valuable chip company, behind Nvidia, which has driven the AI boom to a $3.4 trillion valuation, trailing only Apple among all public companies. Nvidia’s stock price soared 178% this year, but actually did better in 2023, when it gained 239%.
Until four years ago, Intel was the world’s most valuable chipmaker, nearing a $300 billion market cap in early 2020. The company is now worth about $85 billion, just got booted off the Dow Jones Industrial Average — replaced by Nvidia — and has been in talks to sell off core parts of its business. Intel now ranks 15th in market cap among semiconductor companies globally.
‘Not meant for everybody’
Following the Avago-Broadcom merger in 2015, the combined company’s biggest business was chips for TV set-top boxes and broadband routers. Broadcom still makes Wi-Fi chips used in laptops as well as the iPhone and other smartphones.
After a failed bid to buy mobile chip giant Qualcomm in 2018, Broadcom turned its attention to software companies. The capstone of its spending spree came in 2022 with the announced acquisition of server virtualization software vendor VMware for $61 billion. Software accounted for 41% of Broadcom’s $14 billion in revenue in the most recent quarter, thanks in part to VMware.
What’s exciting Wall Street is Broadcom’s role working with cloud providers to build custom chips for AI. The company’s XPUs are generally simpler and less expensive to operate than Nvidia’s GPUs, and they’re designed to run specific AI programs efficiently.
Cloud vendors and other large internet companies are spending billions of dollars a year on Nvidia’s GPUs so they can build their own models and run AI workloads for customers. Broadcom’s success with custom chips is setting up an AI spending showdown with Nvidia, as hyperscale cloud companies look to differentiate their products and services from their rivals.
Broadcom’s chips aren’t for everyone, as only a handful of companies can afford to design and build their own custom processors.
“You have to be a Google, you have to be a Meta, you have to be a Microsoft or an Oracle to be able to use those chips,” Piper Sandler analyst Harsh Kumar told CNBC’s “Squawk on the Street” on Dec. 13, a day after Broadcom’s earnings. “These chips are not meant for everybody.”
While 2024 has been a breakout year for Broadcom — AI revenue increased 220% — the month of December has put it in record territory. The stock is up 45% for the month as of Monday’s close, 16 percentage points better than its prior best month.
On the company’s earnings call on Dec. 12, Tan told investors that Broadcom had doubled shipments of its XPUs to its three hyperscale providers. The most well known of the bunch is Google, which counts on the technology for its Tensor Processing Units, or TPUs, used to train Apple’s AI software released this year. The other two customers, according to analysts, are TikTok parent ByteDance and Meta.
Tan said that within about two years, companies could spend between $60 billion and $90 billion on XPUs.
“In 2027, we believe each of them plans to deploy 1 million XPU clusters across a single fabric,” Tan said of the three hyperscale customers.
In addition to AI chips, AI server clusters need powerful networking parts to train the most advanced models. Networking chips for AI accounted for 76% of Broadcom’s $4.5 billion of networking sales in the fourth quarter.
Broadcom said that, in total, about 40% of its $30.1 billion in 2024 semiconductor sales were related to AI, and that AI revenue would increase 65% in the first quarter to $3.8 billion.
“The degree of success amongst the hyperscalers in their initiatives here is clearly an area up for debate,” Cantor analyst C.J. Muse, who recommends buying Broadcom shares, wrote in a report on Dec. 18. “But any way you slice it, the focus here will continue to be a meaningful boon for those levered to custom silicon.”
Intel’s very bad year
Prior to 2024, Intel’s worst year on the market was 1974, when the stock sank 57%.
The seeds for the company’s latest stumbles were planted years ago, as Intel missed out on mobile chips to Qualcomm, ARM and Apple.
Rival AMD started taking market share in the critical PC and server CPU markets thanks to its productive manufacturing relationship with Taiwan Semiconductor Manufacturing Company. Intel’s manufacturing process has been a notch behind for years, leading to slower and less power-efficient central processing units, or CPUs.
But Intel’s most costly whiff is in AI — and it’s a big reason Gelsinger was removed.
Nvidia’s GPUs, originally created for video games, have become the critical hardware in the development of power-hungry AI models. Intel’s CPU, formerly the most important and expensive part in a server, has become an afterthought in an AI server. The GPUs Nvidia will ship in 2025 don’t even need an Intel CPU — many of them are paired to an Nvidia-designed ARM-based chip.
As Nvidia has reported revenue growth of at least 94% for the past six quarters, Intel has been forced into downsizing mode. Sales have declined in nine of the past 11 periods. Intel announced in August that it was cutting 15,000 jobs, or about 15% of its workforce.
“We are working to create a leaner, simpler, more agile Intel,” board Chair Frank Yeary said in a Dec. 2 press release announcing Gelsinger’s departure.
A big problem for Intel is that it lacks a comprehensive AI strategy. It’s touted the AI capabilities on its laptop chips to investors, and released an Nvidia competitor called Gaudi 3. But neither the company’s AI PC initiative nor its Gaudi chips have gained much traction in the market. Intel’s Gaudi 3 sales missed the company’s own $500 million target for this year.
Late next year, Intel will release a new AI chip code-named Falcon Shores. It won’t be built on the Gaudi 3 architecture; instead, it will be a GPU.
“Is it going to be wonderful? No, but it is a good first step in getting the platform done,” Intel interim co-CEO Michelle Holthaus said at a financial conference held by Barclays on Dec. 12.
Holthaus and fellow interim co-CEO David Zinsner have vowed to focus on Intel’s products, leaving the fate of Intel’s costly foundry division unclear.
Before he left, Gelsinger championed a strategy that involved Intel both finding its footing in the semiconductor market and manufacturing chips to compete with TSMC. In June, at a conference in Taipei, Gelsinger told CNBC that when its factories get up and running, Intel wanted to build “everybody’s AI chips,” and give companies such as Nvidia and Broadcom an alternative to TSMC.
Intel said in September that it plans to turn its foundry business into an independent unit with its own board and the potential to raise outside capital. But for now, Intel’s primary client is Intel. The company said it didn’t expect meaningful sales from external customers until 2027.
At the Barclays event this month, Zinsner said the separate board for the foundry business is “getting stood up today.” More broadly, he indicated that the company is looking to remove complexity and associated costs wherever possible.
“We are going to constantly be scrutinizing where we’re spending money, making sure that we’re getting the appropriate return,” Zinsner said.
The World Artificial Intelligence Conference in Shanghai in July 2023.
Aly Song | Reuters
Alibaba is cutting prices on its large language models by up to 85%, the Chinese tech giant announced Tuesday.
The Hangzhou-based e-commerce firm’s cloud computing division, Alibaba Cloud, said in a WeChat post that it’s offering the price cuts on its visual language model, Qwen-VL, which is designed to perceive and understand both texts and images.
Shares of Alibaba didn’t move much on the announcement, closing 0.5% higher on the final trading day of the year in Hong Kong.
Nevertheless, the price cuts demonstrate how the race among China’s technology giants to win more business for their nascent artificial intelligence products is intensifying.
Major Chinese tech firms including Alibaba, Tencent, Baidu, JD.com, Huawei and TikTok parent company ByteDance have all launched their own large language models over the past 18 months, looking to capitalize on the hype around the technology.
It’s not the first time Alibaba has announced price cuts to incentivize businesses to use its AI products. In February, the company announced price reductions of as much as 55% on a wide range of core cloud products. More recently, in May, the company reduced prices on its Qwen AI model by as much as 97% in a bid to boost demand.
Large language models, or LLMs for short, are AI models that are trained on vast quantities of data to generate humanlike responses to user queries and prompts. They are the bedrock for today’s generative AI systems, like Microsoft-backed startup OpenAI’s popular AI chatbot, ChatGPT.
In Alibaba’s case, the company is focusing its LLM efforts on the enterprise segment rather than launching a consumer AI chatbot like OpenAI’s ChatGPT. In May, the company said its Qwen models have been deployed by over 90,000 enterprise users.