

Chip engineer handling a wafer.

Sinology | Moment | Getty Images

With the U.S. restricting China from buying advanced semiconductors used in artificial intelligence development, Beijing is placing hopes on domestic alternatives such as Huawei. 

The task has been made more challenging by the fact that U.S. curbs not only inhibit China’s access to the world’s most advanced chips, but also restrict access to the technologies vital for creating an AI chip ecosystem.

Those constraints span the entire semiconductor value chain, ranging from design and manufacturing equipment used to produce AI chips to supporting elements such as memory chips. 

Beijing has mobilized tens of billions of dollars to try to fill those gaps, but while it has been able to “brute force” its way into some breakthroughs, it still has a long way to go, according to experts. 

“U.S. export controls on advanced Nvidia AI chips have incentivized China’s industry to develop alternatives, while also making it more difficult for domestic firms to do so,” said Paul Triolo, partner and senior vice president for China at advisory firm DGA-Albright Stonebridge Group.

Here’s how China stacks up against the rest of the world in four key segments needed to build AI chips. 

AI chip design

Nvidia is regarded as the world’s leading AI chip company, but it’s important to understand that it doesn’t actually manufacture the physical chips that are used for AI training and computing.

Rather, the company designs AI chips, or more precisely, graphics processing units. Orders for the company’s proprietary GPU designs are then sent to chip foundries — manufacturers that specialize in the mass production of other companies’ semiconductor products.

While American competitors such as AMD and Broadcom offer varying alternatives, GPU designs from Nvidia are widely recognized as the industry standard. The demand for Nvidia chips is so strong that Chinese customers have continued to buy any of the company’s chips they can get their hands on.

But Nvidia is grappling with Washington’s tightening restrictions. The company revealed in April that additional curbs had prevented it from selling its H20 processor to Chinese clients.

Nvidia’s H20 was a less sophisticated version of its H100 processor, designed specifically to skirt previous export controls. Nevertheless, experts say, it was still more advanced than anything available domestically. But China hopes to change that. 

In response to restrictions, more Chinese semiconductor players have been entering the AI processor arena. They’ve included a wide array of upstarts, such as Enflame Technology and Biren Technology, seeking to soak up billions of dollars in GPU demand left by Nvidia.

But no Chinese firm appears closer to providing a true alternative to Nvidia than Huawei’s chip design arm, HiSilicon. 

Huawei’s most advanced GPU in mass production is its Ascend 910B. The next-generation Ascend 910C was reportedly expected to begin mass shipments as early as May, though no updates have emerged. 

Dylan Patel, founder, CEO and chief analyst at SemiAnalysis, told CNBC that while the Ascend chips remain behind Nvidia, they show that Huawei has been making significant progress. 

“Compared to Nvidia’s export-restricted chips, the performance gap between Huawei and the H20 is less than a full generation. Huawei is not far behind the products Nvidia is permitted to sell into China,” Patel said.

He added that the 910B was two years behind Nvidia as of last year, while the Ascend 910C is only a year behind. 

But while that suggests China’s GPU design capabilities have made great strides, design is just one aspect that stands in the way of creating a competitive AI chip ecosystem.

AI chip fabrication

To manufacture its GPUs, Nvidia relies on TSMC, the world’s largest contract chip foundry, which produces most of the world’s advanced chips.

TSMC complies with U.S. chip controls and is also barred from taking any chip orders from companies on the U.S. trade blacklist. Huawei was placed on the list in 2019.

That has led Chinese chip designers like Huawei to enlist local chip foundries, the largest of which is SMIC.

SMIC is far behind TSMC — it’s officially known to be able to produce 7-nanometer chips, which require less advanced technology than TSMC’s 3-nanometer production. Smaller nanometer sizes lead to greater chip processing power and efficiency.

There are signs that SMIC has made progress. The company is suspected to have been behind a 5-nanometer 5G chip for Huawei’s Mate 60 Pro, which rocked confidence in U.S. chip controls in 2023. The company, however, has a long way to go before it can mass-produce advanced GPUs in a cost-efficient manner.

According to independent chip and technology analyst Ray Wang, SMIC’s known operation capacity is dwarfed by TSMC’s. 

“Huawei is a very good chip design company, but they are still without good domestic chipmakers,” Wang said, noting that Huawei is reportedly working on its own fabrication capabilities. 

But the lack of key manufacturing equipment stands in the way of both companies.

Advanced chip equipment

SMIC’s ability to fulfill Huawei’s GPU requirements is limited by the familiar problem of export controls, but in this case, from the Netherlands. 

While the Netherlands may not have any prominent semiconductor designers or manufacturers, it’s home to ASML, the world’s leading supplier of advanced chipmaking equipment — machines that use light or electron beams to transfer complex patterns onto silicon wafers, forming the basis of microchips.

In accordance with U.S. export controls, the country has agreed to block the sale of ASML’s most advanced extreme ultraviolet (EUV) lithography machines. The tools are critical to making advanced GPUs at scale and cost-effectively.

EUV is the most significant barrier for Chinese advanced chip production, according to Jeff Koch, an analyst at SemiAnalysis. “They have most of the other tooling available, but lithography is limiting their ability to scale towards 3nm and below process nodes,” he told CNBC.

SMIC has found methods to work around lithography restrictions using ASML’s less advanced deep ultraviolet lithography systems, which have seen comparatively fewer restrictions.

Through this “brute forcing,” producing chips at 7 nm is doable, but the yields are not good, and the strategy is likely reaching its limit, Koch said, adding that “at current yields it appears SMIC cannot produce enough domestic accelerators to meet demand.”

SiCarrier Technologies, a Chinese company working on lithography technology, has reportedly been linked to Huawei.

But imitating existing lithography tools could take years, if not decades, to achieve, Koch said. Instead, China is likely to pursue other technologies and different lithography techniques to push innovation rather than imitation, he added.

AI memory components

While GPUs are often identified as the most critical components in AI computing, they’re far from the only ones. To carry out AI training and computing, GPUs must work alongside memory chips, which store data within a broader “chipset.”

In AI applications, a specific type of memory known as high bandwidth memory, or HBM, has become the industry standard. South Korea’s SK Hynix has taken the industry lead in HBM. Other companies in the field include Samsung and U.S.-based Micron.

“High bandwidth memory at this stage of AI progression has become essential for training and running AI models,” said analyst Wang.

As with the Netherlands, South Korea is cooperating with U.S.-led chip restrictions and began complying with fresh curbs on the sale of certain HBM memory chips to China in December. 

In response, Chinese memory chip maker ChangXin Memory Technologies, or CXMT, in partnership with chip-packaging and testing company Tongfu Microelectronics, is in the early stages of producing HBM, according to a report by Reuters.

According to Wang, CXMT is expected to be three to four years behind global leaders in HBM development, and it faces major roadblocks, including export controls on chipmaking equipment.

SemiAnalysis estimated in April that CXMT remained a year away from ramping up to any reasonable volume.

Chinese foundry Wuhan Xinxin Semiconductor Manufacturing is reportedly building a factory to produce HBM wafers. A report from SCMP said that Huawei Technologies had partnered with the firm in producing HBM chips, although the companies did not confirm the partnership.

Huawei has leaned on HBM stockpiles from suppliers like Samsung for use in its Ascend 910C AI processor, SemiAnalysis said in an April report, noting that while the chip was designed domestically, it still relies on foreign products obtained prior to or despite restrictions.

“Whether it be HBM from Samsung, wafers from TSMC, or equipment from America, Netherlands, and Japan, there is a big reliance on foreign industry,” SemiAnalysis said.


Scale AI’s Alexandr Wang confirms departure for Meta as part of $14.3 billion deal


Alexandr Wang, CEO of Scale AI, speaks on CNBC’s “Squawk Box” outside the World Economic Forum in Davos, Switzerland, on Jan. 23, 2025.

Gerry Miller | CNBC

Scale AI founder Alexandr Wang told employees in a memo on Thursday that he’s leaving for Meta, confirming reports from earlier in the week about his departure and a large investment from the social networking company.

Meta is pumping $14.3 billion into Scale AI as part of the deal, and will have a 49% stake in the artificial intelligence startup, but will not have any voting power, a Scale AI spokesperson said.

“As you’ve probably gathered from recent news, opportunities of this magnitude often come at a cost,” Wang wrote in the memo that he shared on X. “In this instance, that cost is my departure. It has been the absolute greatest pleasure of my life to serve as your CEO.”

Scale AI is promoting Jason Droege, the chief strategy officer, to the CEO role. Droege was previously a venture partner at Benchmark and an Uber vice president.  

A small number of Scale AI employees will also join Meta as part of the agreement, Wang wrote.

A Meta spokesperson confirmed that the company has finalized its “strategic partnership and investment in Scale AI.”

“As part of this, we will deepen the work we do together producing data for AI models and Alexandr Wang will join Meta to work on our superintelligence efforts,” the spokesperson said. “We will share more about this effort and the great people joining this team in the coming weeks.”

Meta’s big bet on Wang fits into CEO Mark Zuckerberg’s plans to bolster his company’s AI efforts amid fierce competition from OpenAI and Google-parent Alphabet. Zuckerberg has made AI his company’s top priority for 2025, but has grown increasingly frustrated with his team, particularly as Meta’s latest version of its flagship Llama AI models received a tepid response from developers, CNBC reported earlier this week.

Although Zuckerberg has traditionally placed long-standing employees into high-ranking positions, he decided that the outsider Wang would be better suited to oversee AI initiatives deemed crucial for the company.

Scale AI counts a number of Meta rivals as customers, including Google, Microsoft and OpenAI. Meta is one of Scale AI’s biggest clients.

The Scale AI spokesperson said that Meta’s investment and hiring of Wang will not impact the startup’s customers, and that Meta will not be privy to any of its business information or data.

WATCH: Meta’s one of AI’s leaders, not a laggard, says Futurum Group CEO Daniel Newman


Scale AI plans to promote strategy chief Droege to CEO as founder Wang heads for Meta


FILE PHOTO: Jason Droege speaks at the WSJTECH live conference in Laguna Beach, California, U.S. October 22, 2019.

Mike Blake | Reuters

Scale AI plans to promote Chief Strategy Officer Jason Droege to serve as its new CEO, with founder Alexandr Wang heading to Meta as part of a multibillion-dollar deal with the company, CNBC has confirmed.

Meta is finalizing a $14 billion investment into artificial intelligence startup Scale AI, CNBC reported earlier this week. Wang will help lead a new AI research lab at Meta and will be joined by some of his colleagues. The New York Times was first to report about the new AI lab.

Bloomberg first reported that Droege was picked to be the new CEO. CNBC confirmed Scale AI’s plans with a person familiar with the matter who asked not to be named because of confidentiality. Scale AI and Droege didn’t respond to CNBC’s requests for comment.

Droege joined Scale AI in August of 2024, according to his LinkedIn profile. Prior to his role at the startup, he served as a venture partner at Benchmark and a vice president at Uber.

Founded in 2016, Scale AI has achieved a high profile in the industry by helping major tech companies like OpenAI, Google and Microsoft prepare data they use to train cutting-edge AI models. 

Meta has been pouring billions of dollars into AI, but CEO Mark Zuckerberg has been frustrated with its progress. Zuckerberg will be counting on Wang to better execute Meta’s AI ambitions following the tepid reception of the company’s latest Llama AI models.

Meta will take a 49% stake in Scale AI with its investment, The Information reported.

–CNBC’s Jonathan Vanian contributed to this report


Oracle shares pop 13% to record high on earnings beat, cloud optimism


Larry Ellison, Oracle’s co-founder, chief technology officer and chairman, at right, and U.S. President Donald Trump share a laugh as Ellison uses a stool to stand on as he speaks during a news conference in the Roosevelt Room of the White House in Washington on Jan. 21, 2025. Trump announced an investment in artificial intelligence (AI) infrastructure and took questions on a range of topics including his presidential pardons of Jan. 6 defendants, the war in Ukraine, cryptocurrencies and other topics.

Andrew Harnik | Getty Images News | Getty Images

Oracle shares soared 13% on Thursday to a record close, after the database software vendor issued robust earnings and a strong forecast, fueled by growth in its cloud business.

Revenue climbed 11% year over year during the fiscal fourth quarter to $15.9 billion, topping the $15.59 billion average estimate, according to LSEG. Adjusted earnings per share of $1.70 exceeded the average analyst estimate of $1.64.

“All told, ORCL has entered an entirely new wave of enterprise popularity that it has not seen since the Internet era in the late 90s,” Piper Sandler analysts wrote in a note to clients. The firm was one of several to lift its price target on the stock, raising its prediction to $190 from $130.

Oracle has been making headway in the cloud infrastructure market to challenge Amazon, Google and Microsoft. It’s still small by comparison, with $3 billion in cloud revenue during the May quarter, compared with over $12 billion for Google, which counts productivity software subscriptions and cloud infrastructure sales when reporting cloud metrics. But Oracle’s business is growing faster.

Future expansion can also come from sales of Oracle’s database on clouds other than its own.

“The growth rate in multi-cloud is astonishing,” Oracle Chairman Larry Ellison said on Wednesday’s conference call with analysts. “In other words, our database is now moving very rapidly to the cloud, I think because – a few reasons, because the database has now all these AI capabilities, but also, quite frankly, now people can get it in whatever cloud they want.”

Remaining performance obligations, a measurement of money that’s expected to be recognized as revenue in the future, sat at $138 billion, up 41% from a year earlier. Oracle CEO Safra Catz said RPO will likely more than double in the 2026 fiscal year, which ends in May 2026. Revenue for the new fiscal year should come in above $67 billion, she said. That’s higher than LSEG’s $65.18 billion consensus.

Gains from OpenAI’s Stargate artificial intelligence data center project, targeting $500 billion in investments over four years, are not yet included in forecasts.

“If Stargate turns out to be everything as advertised, then we’ve understated our RPO growth,” Ellison said.

For fiscal 2029, revenue should be above the $104 billion target the company set in September, Catz said.

Still, the company faces the challenge of meeting client demand in cloud.

“Demand continues to dramatically outstrip supply,” Catz said, though she added that the company isn’t having trouble sourcing Nvidia graphics processing units.

Analysts at RBC, who recommend holding the stock, raised their price target to $195 from $145. But they noted that, “with the backdrop of continued capacity constraints, we struggle to see a path to meaningful acceleration in the near term.”

WATCH: Oracle shares hit record high

