Nvidia CEO Jensen Huang arrives to attend the opening ceremony of Siliconware Precision Industries Co. (SPIL)’s Tan Ke Plant site in Taichung, Taiwan Jan. 16, 2025. 

Ann Wang | Reuters

Nvidia announced new chips for building and deploying artificial intelligence models at its annual GTC conference on Tuesday. 

CEO Jensen Huang revealed Blackwell Ultra, a family of chips shipping in the second half of this year, as well as Vera Rubin, the company’s next-generation graphics processing unit, or GPU, that is expected to ship in 2026.

Nvidia’s sales are up more than sixfold since its business was transformed by the release of OpenAI’s ChatGPT in late 2022. That’s because its “big GPUs” hold most of the market for developing advanced AI models, a process called training.

Software developers and investors are closely watching the company’s new chips to see if they offer enough additional performance and efficiency to convince the company’s biggest end customers — cloud companies including Microsoft, Google and Amazon — to continue spending billions of dollars to build data centers based around Nvidia chips.

“This last year is where almost the entire world got involved. The computational requirement, the scaling law of AI, is more resilient, and in fact, is hyper-accelerated,” Huang said.

Tuesday’s announcements are also a test of Nvidia’s new annual release cadence. The company is striving to announce a new chip family every year. Before the AI boom, Nvidia released new chip architectures every other year.

The GTC conference in San Jose, California, is also a show of strength for Nvidia. 

The event, Nvidia’s second in-person conference since the pandemic, is expected to have 25,000 attendees and hundreds of companies discussing the ways they use the company’s hardware for AI. That includes Waymo, Microsoft and Ford, among others. General Motors also announced that it will use Nvidia’s service for its next-generation vehicles.

The chip architecture after Rubin will be named after physicist Richard Feynman, Nvidia said on Tuesday, continuing its tradition of naming chip families after scientists. Nvidia’s Feynman chips are expected to be available in 2028, according to a slide displayed by Huang.

Nvidia will also showcase its other products and services at the event. 

For example, Nvidia announced new laptops and desktops using its chips, including two AI-focused PCs called DGX Spark and DGX Station that will be able to run large AI models such as Llama or DeepSeek. The company also announced updates to its networking parts for tying hundreds or thousands of GPUs together so they work as one, as well as a software package called Dynamo that helps users get the most out of their chips.

Jensen Huang, co-founder and chief executive officer of Nvidia Corp., speaks during the Nvidia GPU Technology Conference (GTC) in San Jose, California, US, on Tuesday, March 18, 2025. 

David Paul Morris | Bloomberg | Getty Images

Vera Rubin

Nvidia expects to start shipping systems on its next-generation GPU family in the second half of 2026. 

The system has two main components: a CPU, called Vera, and a new GPU design, called Rubin. It’s named after astronomer Vera Rubin.

Vera is Nvidia’s first custom CPU design, the company said, and it is based on a core design the company has named Olympus.

Previously when it needed CPUs, Nvidia used an off-the-shelf design from Arm. Companies that have developed custom Arm core designs, such as Qualcomm and Apple, say that they can be more tailored and unlock better performance.

The custom Vera design will be twice as fast as the CPU used in last year’s Grace Blackwell chips, the company said. 

When paired with Vera, Rubin can manage 50 petaflops while doing inference, more than double the 20 petaflops for the company’s current Blackwell chips. Rubin can also support as much as 288 gigabytes of fast memory, which is one of the core specs that AI developers watch.

Nvidia is also making a change to what it calls a GPU. Rubin is actually two GPUs, Nvidia said. 

The Blackwell GPU, which is currently on the market, is actually two separate chips that were assembled together and made to work as one chip.

Starting with Rubin, Nvidia will say that when it combines two or more dies to make a single chip, it will refer to them as separate GPUs. In the second half of 2027, Nvidia plans to release a “Rubin Next” chip that combines four dies to make a single chip, doubling the speed of Rubin, and it will refer to that as four GPUs.

Nvidia said that will come in a rack called Vera Rubin NVL144. Previous versions of Nvidia’s rack were called NVL72.


Blackwell Ultra

Nvidia also announced new versions of its Blackwell family of chips that it calls Blackwell Ultra.

That chip will be able to produce more tokens per second, which means that the chip can generate more content in the same amount of time as its predecessor, the company said in a briefing.

Nvidia says that means cloud providers can use Blackwell Ultra to offer a premium AI service for time-sensitive applications, allowing them to make as much as 50 times the revenue from the new chips as they did from the Hopper generation, which shipped in 2023.

Blackwell Ultra will come in a version called GB300, which pairs two GPUs with an Nvidia Arm CPU, and a GPU-only version called B300. It will also come in configurations with eight GPUs in a single server blade and in a rack version with 72 Blackwell chips.

The top four cloud companies have deployed three times the number of Blackwell chips as Hopper chips, Nvidia said.



Musk says Tesla is expanding Austin robotaxi service, adding Grok to cars


Tesla CEO Elon Musk attends an opening ceremony for Tesla China-made Model Y program in Shanghai, China, on Jan. 7, 2020.

Aly Song | Reuters

Tesla CEO Elon Musk said the company is expanding its robotaxi service area and bringing xAI’s Grok to vehicles as it rolled out a new iteration of the artificial intelligence chatbot.

Tesla shares gained about 3%.

Musk said on X that Grok, his AI chatbot that praised Adolf Hitler and posted a barrage of antisemitic comments recently, will be available in Tesla vehicles “next week at the latest.”

xAI officially launched the Grok 4 update overnight as the company continued to face backlash for the vitriol written by the chatbot.

In response to a user post on his social media platform X, Musk said the company is expanding its Austin, Texas, robotaxi service area this weekend. He also said Tesla is awaiting regulatory approval for a launch in the Bay Area “probably in a month or two.”


The expansion of robotaxi and Grok integration comes at a fraught time for Musk and his empire.

Tesla set its annual shareholder meeting for Nov. 6, a Thursday filing showed. A group of investors recently called on the electric vehicle company to schedule the meeting.

Its last shareholder meeting was in June 2024, as Musk established himself as a major backer of President Donald Trump‘s reelection campaign. Musk later led the Trump administration’s Department of Government Efficiency, known as DOGE.

After stepping down from DOGE at the end of May, Musk has openly feuded with Trump on social media over the major tax bill, with the president suggesting the government look at cutting contracts for Musk’s companies.

Shares have tanked from their post-election high over investor concerns that the public fight could hamper Tesla. Slowing sales and rising competition also stifled some investor appetite.

Tesla shares fell Monday, with the company losing $68 billion in value after Musk continued to blast Trump’s “Big Beautiful Bill” and said he was establishing his own political party, the “America Party.”

The world’s richest man suffered another blow Wednesday when Linda Yaccarino stepped down as CEO of his social media platform X, leaving the role after a turbulent two years for the company.


Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates


The letters AI, which stands for “artificial intelligence,” stand at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.

Julian Stratenschulte | Picture Alliance | Getty Images

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.

Nvidia’s GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn’t have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

“They would take up too much data center floor space or increase water usage substantially,” Brown said. “And while some of these solutions could work for lower volumes at other providers, there simply wouldn’t be enough liquid-cooling capacity to support our scale.”

Instead, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia’s design for dense computing power. Nvidia’s GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.

Computing clusters based on Nvidia’s GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world’s largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company’s bottom line. In the first quarter, AWS delivered the widest operating margin since at least 2014, and the unit is responsible for most of Amazon’s net income.

Microsoft, the second-largest cloud provider, has followed Amazon’s lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.



Bitcoin rises to fresh record above $112,000, helped by Nvidia-led tech rally


The logo of the cryptocurrency Bitcoin can be seen on a coin in front of a Bitcoin chart.

Silas Stein | Picture Alliance | Getty Images

Bitcoin hit a fresh record on Wednesday afternoon as an Nvidia-led rally in equities helped push the price of the cryptocurrency higher into the stock market close.

The price of bitcoin was last up 1.9%, trading at $110,947.49, according to Coin Metrics. Just before 4:00 p.m. ET, it hit a high of $112,052.24, surpassing its May 22 record of $111,999.

The flagship cryptocurrency has been trading in a tight range for several weeks despite billions of dollars flowing into bitcoin exchange traded funds. Bitcoin purchases by public companies outpaced ETF inflows in the second quarter. Still, bitcoin is up just 2% in the past month.


On Wednesday, tech stocks rallied as Nvidia became the first company to briefly touch $4 trillion in market capitalization. In the same session, investors appeared to shrug off the latest tariff developments from President Donald Trump. The tech-heavy Nasdaq Composite notched a record close.

While institutions broadly have embraced bitcoin’s “digital gold” narrative, it is still a risk asset that rises and falls alongside stocks depending on what’s driving investor sentiment. When the market is in risk-on mode and investors buy growth-oriented assets like tech stocks, bitcoin and crypto tend to rally with them.

Investors have been expecting bitcoin to reach new records in the second half of the year as corporate treasuries accelerate their bitcoin buying sprees and Congress gets closer to passing crypto legislation.

