China is focusing on large language models (LLMs) in the artificial intelligence space. 

Blackdovfx | Istock | Getty Images

China’s attempts to dominate the world of artificial intelligence could be paying off, with industry insiders and technology analysts telling CNBC that Chinese AI models are already hugely popular and are keeping pace with — and even surpassing — those from the U.S. in terms of performance.

AI has become the latest battleground between the U.S. and China, with both sides considering it a strategic technology. Washington continues to restrict China’s access to leading-edge chips designed to help power artificial intelligence amid fears that the technology could threaten U.S. national security.

That has led China to pursue its own approach to boosting the appeal and performance of its AI models, including open-sourcing its technology and developing its own super-fast software and chips.

China is creating popular LLMs

On Hugging Face, a repository of LLMs, Chinese LLMs are the most downloaded, according to Tiezhen Wang, a machine learning engineer at the company. Qwen, a family of AI models created by Chinese e-commerce giant Alibaba, is the most popular on Hugging Face, he said.

“Qwen is rapidly gaining popularity due to its outstanding performance on competitive benchmarks,” Wang told CNBC by email.

He added that Qwen has a “highly favorable licensing model” which means it can be used by companies without the need for “extensive legal reviews.”

Qwen comes in various sizes, measured in parameters, the units by which scale is counted in the world of LLMs. Models with more parameters are more powerful but have higher computational costs, while smaller ones are cheaper to run.

“Regardless of the size you choose, Qwen is likely to be one of the best-performing models available right now,” Wang added.
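
For developers, the barrier to trying one of these open-weight models is low: the weights can be downloaded straight from Hugging Face. The snippet below is a minimal sketch, not an official example; it uses the open-source transformers library, assumes the transformers, torch and accelerate packages are installed on a machine with enough memory, and picks an illustrative Qwen model ID and prompt rather than anything cited in this article.

```python
# Minimal sketch: loading an open-weight Qwen chat model from Hugging Face.
# Assumptions: transformers, torch and accelerate are installed, and the model ID
# below is one of the smaller sizes in the Qwen family; swap in a larger or
# smaller variant depending on the hardware available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"  # illustrative choice of model size

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the precision the checkpoint was saved in
    device_map="auto",    # spread layers across available GPUs/CPU
)

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Explain what an open-weight LLM is in one sentence."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=64)
reply = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```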

DeepSeek, a start-up, also made waves recently with a model called DeepSeek-R1. DeepSeek said last month that its R1 model competes with OpenAI’s o1 — a model designed for reasoning or solving more complex tasks.

These companies claim that their models can compete with other open-source offerings like Meta‘s Llama, as well as closed LLMs such as those from OpenAI, across various functions.

“In the last year, we’ve seen the rise of open source Chinese contributions to AI with really strong performance, low cost to serve and high throughput,” Grace Isford, a partner at Lux Capital, told CNBC by email.

China pushes open source to go global

Open sourcing a technology serves a number of purposes, including driving innovation as more developers have access to it, as well as building a community around a product.

It is not only Chinese firms that have launched open-source LLMs. Facebook parent Meta, as well as European start-up Mistral, also have open-source versions of AI models.

But with the technology industry caught in the crosshairs of the geopolitical battle between Washington and Beijing, open-source LLMs give Chinese firms another advantage: enabling their models to be used globally.

“Chinese companies would like to see their models used outside of China, so this is definitively a way for companies to become global players in the AI space,” Paul Triolo, a partner at global advisory firm DGA Group, told CNBC by email.

While the focus is on AI models right now, there is also debate over what applications will be built on top of them — and who will dominate this global internet landscape going forward.

“If you assume these frontier base AI models are table stakes, it’s about what these models are used for, like accelerating frontier science and engineering technology,” Lux Capital’s Isford said.

Today’s AI models have been compared to operating systems, such as Microsoft’s Windows, Google’s Android and Apple’s iOS, with the potential to dominate a market the way those companies do in mobile and PCs.

If that comparison holds, the stakes for building a dominant LLM are even higher.

“They [Chinese companies] perceive LLMs as the center of future tech ecosystems,” Xin Sun, senior lecturer in Chinese and East Asian business at King’s College London, told CNBC by email.

“Their future business models will rely on developers joining their ecosystems, developing new applications based on the LLMs, and attracting users and data from which profits can be generated subsequently through various means, including but far beyond directing users to use their cloud services,” Sun added.

Chip restrictions cast doubt over China’s AI future

AI models are trained on vast amounts of data, requiring huge amounts of computing power. Currently, Nvidia is the leading designer of the chips required for this, known as graphics processing units (GPUs).

Most of the leading AI companies train their systems on Nvidia’s highest-performance chips, but that isn’t the case in China.

Over the past year or so, the U.S. has ramped up export restrictions on advanced semiconductors and chipmaking equipment to China. That means Nvidia’s leading-edge chips cannot be exported to the country, and the company has had to create sanctions-compliant semiconductors for export.

Despite these curbs, Chinese firms have still managed to launch advanced AI models.

“Major Chinese technology platforms currently have sufficient access to computing power to continue to improve models. This is because they have stockpiled large numbers of Nvidia GPUs and are also leveraging domestic GPUs from Huawei and other firms,” DGA Group’s Triolo said.

Indeed, Chinese companies have been boosting efforts to create viable alternatives to Nvidia. Huawei has been one of the leading players in pursuit of this goal in China, while firms like Baidu and Alibaba have also been investing in semiconductor design.

“However, the gap in terms of advanced hardware compute will become greater over time, particularly next year as Nvidia rolls out its Blackwell-based systems that are restricted for export to China,” Triolo said.

Lux Capital’s Isford flagged that China has been “systematically investing and growing their whole domestic AI infrastructure stack outside of Nvidia with high-performance AI chips from companies like Baidu.”

“Whether or not Nvidia chips are banned in China will not prevent China from investing and building their own infrastructure to build and train AI models,” she added.

Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates

The letters AI, which stand for “artificial intelligence,” stand at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.

Julian Stratenschulte | Picture Alliance | Getty Images

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.

Nvidia’s GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn’t have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

“They would take up too much data center floor space or increase water usage substantially,” Brown said. “And while some of these solutions could work for lower volumes at other providers, there simply wouldn’t be enough liquid-cooling capacity to support our scale.”

Instead, Amazon engineers developed the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service through computing instances that go by the name P6e, Brown wrote in a blog post. The new systems are built around Nvidia’s design for dense computing power: the GB200 NVL72, which packs 72 Nvidia Blackwell GPUs into a single rack, wired together to train and run large AI models.

Computing clusters based on Nvidia’s GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world’s largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company’s bottom line. In the first quarter, AWS delivered the widest operating margin since at least 2014, and the unit is responsible for most of Amazon’s net income.

Microsoft, the second largest cloud provider, has followed Amazon’s lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.

WATCH: AWS announces latest CPU chip, will deliver record networking speed

Bitcoin rises to fresh record above $112,000, helped by Nvidia-led tech rally

The logo of the cryptocurrency Bitcoin can be seen on a coin in front of a Bitcoin chart.

Silas Stein | Picture Alliance | Getty Images

Bitcoin hit a fresh record on Wednesday afternoon as an Nvidia-led rally in equities helped push the price of the cryptocurrency higher into the stock market close.

The price of bitcoin was last up 1.9%, trading at $110,947.49, according to Coin Metrics. Just before 4:00 p.m. ET, it hit a high of $112,052.24, surpassing its May 22 record of $111,999.

The flagship cryptocurrency has been trading in a tight range for several weeks despite billions of dollars flowing into bitcoin exchange traded funds. Bitcoin purchases by public companies outpaced ETF inflows in the second quarter. Still, bitcoin is up just 2% in the past month.

On Wednesday, tech stocks rallied as Nvidia became the first company to briefly touch $4 trillion in market capitalization. In the same session, investors appeared to shrug off the latest tariff developments from President Donald Trump. The tech-heavy Nasdaq Composite notched a record close.

While institutions broadly have embraced bitcoin’s “digital gold” narrative, it is still a risk asset that rises and falls alongside stocks depending on what’s driving investor sentiment. When the market is in risk-on mode and investors buy growth-oriented assets like tech stocks, bitcoin and crypto tend to rally with them.

Investors have been expecting bitcoin to reach new records in the second half of the year as corporate treasuries accelerate their bitcoin buying sprees and Congress gets closer to passing crypto legislation.

Perplexity launches AI-powered web browser for select group of subscribers

Dado Ruvic | Reuters

Perplexity AI on Wednesday launched a new artificial intelligence-powered web browser called Comet in the startup’s latest effort to compete in the consumer internet market against companies like Google and Microsoft.

Comet will allow users to connect with enterprise applications like Slack and ask complex questions via voice and text, according to a brief demo video Perplexity released on Wednesday.

The browser is available to Perplexity Max subscribers, and the company said invite-only access will roll out to a waitlist over the summer. Perplexity Max costs users $200 per month.

“We built Comet to let the internet do what it has been begging to do: to amplify our intelligence,” Perplexity wrote in a blog post on Wednesday.

Perplexity is best known for its AI-powered search engine that gives users simple answers to questions and links out to the original source material on the web. After the company was accused of plagiarizing content from media outlets, it launched a revenue-sharing model with publishers last year.

In May, Perplexity was in late-stage talks to raise $500 million at a $14 billion valuation, a source familiar with the matter confirmed to CNBC. The startup was also approached by Meta earlier this year about a potential acquisition, but the companies did not finalize a deal.

“We will continue to launch new features and functionality for Comet, improve experiences based on your feedback, and focus relentlessly–as we always have–on building accurate and trustworthy AI that fuels human curiosity,” Perplexity said Wednesday.

WATCH: Perplexity CEO on AI race: The market of providing answers to questions will become a commodity
