
A customer tries on the Apple Vision Pro headset during the product launch at an Apple Store in New York City on Feb. 2, 2024.

Angela Weiss | AFP | Getty Images

The Vision Pro, the new virtual reality headset from Apple, can transport you to Hawaii or the surface of the moon.

It displays high-resolution computer graphics a few millimeters from the user’s eyes, all while allowing the user to control a desktop-like interface using their eyes and subtle hand gestures. The Vision Pro provides a preview of what using a computer could be like in five years, early adopters say.

The Vision Pro starts at $3,499. After adding storage and accessories such as straps, the whole package can cost as much as $4,500.

That’s a lot more expensive than competing headsets, such as Meta’s Quest 3, which starts at $499. It’s pricier than Meta’s high-end headset, the Quest Pro, which starts at $999. It’s also more expensive, even after controlling for inflation, than the first iPad ($499) or the first iPhone ($499 with a two-year contract).

The Vision Pro includes lots of pricey state-of-the-art parts. One estimate from research firm Omdia puts the “bill of materials” for the headset at $1,542, and that doesn’t include the costs of research and development, packaging, marketing or Apple’s profit margin.

The most expensive part in the headset is the 1.25-inch Sony Semiconductor display that goes in front of the user’s eye.

It’s a key component that helps the virtual experience feel more realistic than previous consumer headsets. The displays have a lot of pixels and lifelike colors, and are built with state-of-the-art manufacturing techniques.

Apple pays about $228 for the “Micro OLED” displays it uses, according to the Omdia estimate. Each Vision Pro needs two of them, one for each eye. Sony Semiconductor declined CNBC’s request to comment for this story.

The Vision Pro displays are the latest example of Apple embracing a new kind of display technology at a larger scale and earlier than the rest of the electronics industry.

Apple’s use of LCD touchscreens for the first iPhone in 2007, and its later transition to organic light-emitting diode, or OLED, displays with the iPhone X in 2017, upended existing supply chains and, after Apple shipped millions of units, ultimately drove down the cost of those parts for the entire industry.

Apple has a massive effect on the display industry, said Jacky Qiu, co-founder of OTI Lumionics, which makes materials for manufacturing micro LED panels. Display makers fight for Apple’s business, he said, and winning or losing it can make or break these companies.

“Apple is now the biggest player in terms of OLEDs, in terms of displays. They are the ones that are basically taking all the high-margin displays, all the stuff that is the high-spec type of stuff that is allowing the panel makers today to become profitable,” Qiu said.

“You look at the display business, you either work for Apple and make the iPhone screens and you’re profitable, or you don’t, and you lose money. It’s as brutal as that,” Qiu said.

Micro OLED

The Vision Pro’s displays are a defining feature. They’re packed with pixels and are sharper than any competing headset.

The displays were one of the main things Meta CEO Mark Zuckerberg complimented when comparing his company’s $499 Quest 3 headset with Apple’s headset.

“Apple’s screen does have a higher resolution and that’s really nice,” Zuckerberg said in a video posted on his Instagram page, while saying that Quest’s screens are brighter.

“What’s so revolutionary about the OLED displays that are in the Vision Pro, the difference between Micro OLED and the OLED that you find on a television in your living room is that the pixels are actually a lot denser, they’re smaller and they’re more compact,” said Wayne Rickard, CEO of Terecircuits, a company that makes materials and techniques for display manufacturing.

An Apple Vision Pro headset is displayed during the product release at an Apple Store in New York City on Feb. 2, 2024.

Angela Weiss | AFP | Getty Images

According to a teardown analysis from repair firm iFixit, each Vision Pro display has a resolution of 3,660 by 3,200 pixels. That’s more pixels per eye than the entire iPhone 15 screen, which has a resolution of 2,556 by 1,179 pixels. Meta’s Quest 3 comes in at 2,064 by 2,208 pixels per eye.

The Vision Pro’s screens are also much smaller than the iPhone’s, which means the pixels are packed closer together and are more difficult to manufacture. The Vision Pro displays have 3,386 pixels per inch, versus about 460 pixels per inch on the iPhone 15’s display.

In total, Apple says the Vision Pro’s displays have more than 23 million total pixels.

They’re some of the densest displays ever built. According to iFixit, 54 Vision Pro pixels can fit in a single iPhone pixel, and each pixel is about 7.5 microns from the next pixel, a measurement called “pixel pitch,” according to Apple’s specifications.
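For a sense of how those figures fit together, here is a minimal sketch in Python that reproduces the numbers cited above from the per-eye resolution, the 7.5-micron pixel pitch and the iPhone 15’s roughly 460 pixels per inch. The variable names are ours, used only for illustration.

```python
# Back-of-the-envelope check of the display figures cited above.
# Inputs are the numbers reported by iFixit and Apple; names are illustrative.
MICRONS_PER_INCH = 25_400

per_eye_resolution = (3660, 3200)   # Vision Pro resolution per eye (iFixit)
vision_pro_pitch_um = 7.5           # pixel pitch in microns (Apple's specs)
iphone_15_ppi = 460                 # approximate iPhone 15 pixel density

# Two displays at 3,660 x 3,200 each: about 23.4 million pixels in total,
# matching Apple's "more than 23 million."
total_pixels = 2 * per_eye_resolution[0] * per_eye_resolution[1]

# A 7.5-micron pitch works out to roughly 3,387 pixels per inch,
# in line with the ~3,386 ppi figure above.
vision_pro_ppi = MICRONS_PER_INCH / vision_pro_pitch_um

# By area, roughly 54 Vision Pro pixels fit inside one iPhone 15 pixel.
iphone_pitch_um = MICRONS_PER_INCH / iphone_15_ppi
pixels_per_iphone_pixel = (iphone_pitch_um / vision_pro_pitch_um) ** 2

print(f"{total_pixels:,} total pixels")
print(f"{vision_pro_ppi:.0f} pixels per inch")
print(f"{pixels_per_iphone_pixel:.0f} Vision Pro pixels per iPhone pixel")
```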

The Apple Vision Pro home screen.

Todd Haselton | CNBC

“With Micro LEDs in particular, it can get down to about below 10 microns. For comparison, a red blood cell might be about 20 microns, so half the size of a red blood cell,” Rickard said.

Apple opted for high-resolution displays so that the headset’s passthrough mode, which uses outward-facing cameras to show video of the real world inside the headset, comes closer to simulating reality. The extra resolution also helps users read text and numbers in virtual reality, and it removes the “screen door” effect of other headsets, where the gaps between individual pixels are visible.

VR headsets need pixel-dense displays because the user’s eyes are so close to the screen. TVs have significantly fewer pixels, but it doesn’t matter because viewers are feet away.

The production of this kind of display requires cutting-edge manufacturing. For example, most displays are built on a backplane made out of glass. The Vision Pro displays are so pixel-dense that they use a silicon backplane, much like a semiconductor.

‘An incredible amount of technology packed into the product’

The new Apple Vision Pro headset is displayed during the Apple Worldwide Developers Conference in Cupertino, California, on June 5, 2023.

Justin Sullivan | Getty Images

The second most expensive part in the Vision Pro is the main processor package, which pairs Apple’s M2 chip, the same chip used in the MacBook Air, with the R1, a custom chip that handles the device’s video feeds and other sensor data.

Bill of materials estimates don’t take into account research and development costs, packaging or shipping. They also exclude capital expenditures that can add up-front costs to big parts orders. Still, they give people in the manufacturing world a useful sense of how expensive the parts in any given device are.

Display technologies embraced by Apple typically come down in price after Apple makes them mainstream and as multiple suppliers compete for business.

“South Korean suppliers like Samsung Display and LG Display have shown their interest in this technology. Chinese suppliers like Seeya and BOE are also [producing] small-scale mass-produced [OLED on silicon] products,” said Jay Shao, Omdia analyst for displays, in an email. He expects the cost of Vision Pro-spec screens to come down in the coming years.

Apple declined to comment, but CEO Tim Cook has made clear he is not a fan of cost estimates and teardowns. “I’ve never seen one that’s even close to accurate,” he said on an earnings call in 2015.

Apple doesn’t typically discuss its suppliers, but in February, Cook was asked about the device’s price tag on an earnings call.

“If you look at it from a price point of view, there’s an incredible amount of technology packed into the product,” Cook said.

He mentioned some of the most expensive parts in the device and emphasized the R&D costs that Apple spent developing it.

“There’s 5,000 patents in the product, and it’s built on many innovations that Apple has spent multiple years on from silicon to displays and significant AI and machine learning. All the hand tracking, the room mapping, all of this stuff is driven by AI, and so we’re incredibly excited about it,” Cook continued.


Amazon Kuiper second satellite launch postponed by ULA due to rocket booster issue


A United Launch Alliance Atlas V rocket is shown on its launch pad carrying Amazon’s Project Kuiper internet network satellites as the vehicle is prepared for launch at the Cape Canaveral Space Force Station in Cape Canaveral, Florida, U.S., April 28, 2025.

Steve Nesius | Reuters

United Launch Alliance on Monday was forced to delay the second flight carrying a batch of Amazon‘s Project Kuiper internet satellites because of a problem with the rocket booster.

With roughly 30 minutes left in the countdown, ULA announced it was scrubbing the launch due to an issue with “an elevated purge temperature” within its Atlas V rocket’s booster engine. The company said it will provide a new launch date at a later point.

“Possible issue with a GN2 purge line that cannot be resolved inside the count,” ULA CEO Tory Bruno said in a post on Bluesky. “We will need to stand down for today. We’ll sort it and be back.”

The launch from Florida’s Space Coast had been set for last Friday, but was rescheduled to Monday at 1:25 p.m. ET due to inclement weather.


Amazon in April successfully sent 27 Kuiper internet satellites into low Earth orbit, a region of space within 1,200 miles of the Earth’s surface. The second voyage will send “another 27 satellites into orbit, bringing our total constellation size to 54 satellites,” Amazon said in a blog post.

Kuiper is the latest entrant in the burgeoning satellite internet industry, which aims to beam high-speed internet to the ground from orbit. The industry is currently dominated by Elon Musk’s SpaceX, which operates Starlink. Other competitors include SoftBank-backed OneWeb and Viasat.

Amazon is targeting a constellation of 3,236 satellites. The company has to meet a Federal Communications Commission deadline to launch half of that constellation, or 1,618 satellites, by July 2026.


Google issues apology, incident report for hourslong cloud outage


Thomas Kurian, CEO of Google Cloud, speaks at a cloud computing conference held by the company in 2019.

Michael Short | Bloomberg | Getty Images

Google apologized for a major outage that the company said was caused by multiple layers of flawed recent updates.

The company released an incident report late on Friday that explained hours of downtime on Thursday. More than 70 Google cloud services stopped working properly across the globe, knocking down or disrupting dozens of third-party services, including Cloudflare, OpenAI and Shopify. Gmail, Google Calendar, Google Drive, Google Meet and other first-party products also malfunctioned.

“We deeply apologize for the impact this outage has had,” Google wrote in the incident report. “Google Cloud customers and their users trust their businesses to Google, and we will do better. We apologize for the impact this has had not only on our customers’ businesses and their users but also on the trust of our systems. We are committed to making improvements to help avoid outages like this moving forward.”

Thomas Kurian, CEO of Google’s cloud unit, also posted about the outage in an X post on Thursday, saying “we regret the disruption this caused our customers.”

Google in May added a new feature to its “quota policy checks” for evaluating automated incoming requests, but the new feature wasn’t immediately tested in real-world situations, the company wrote in the incident report. As a result, the company’s systems didn’t know how to properly handle data from the new feature, which included blank entries. Those blank entries were then sent out to all Google Cloud data center regions, which prompted the crashes, the company wrote.

Engineers figured out the issue in 10 minutes, according to the company. However, the entire incident went on for seven hours after that, with the crash leading to an overload in some larger regions.

When it released the feature, Google did not use feature flags, an increasingly common industry practice in which new code is rolled out gradually to a small share of traffic, limiting the impact if problems occur. Feature flags would have caught the issue before the feature became widely available, Google said.
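For readers unfamiliar with the practice, here is a minimal, hypothetical sketch of a feature flag in Python. The flag name, rollout percentage and stub checks are illustrative assumptions, not a description of Google’s actual systems; the point is only that new code can be limited to a small slice of traffic until it has proven itself.

```python
import hashlib

# Hypothetical illustration of a feature flag: route only a small share of
# requests through a new code path so a bug surfaces before a full rollout.
# The flag name, percentage and stub checks below are illustrative only.

ROLLOUT_PERCENT = 1  # start with 1% of traffic and widen gradually


def flag_enabled(flag_name: str, request_id: str, rollout_percent: int) -> bool:
    """Deterministically bucket a request so it always sees the same code path."""
    digest = hashlib.sha256(f"{flag_name}:{request_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < rollout_percent


def legacy_check(payload: dict) -> bool:
    """Stand-in for the existing, proven validation path."""
    return bool(payload)


def new_policy_check(payload: dict) -> bool:
    """Stand-in for the new validation path being rolled out behind the flag."""
    return all(value is not None for value in payload.values())


def check_request(request_id: str, payload: dict) -> bool:
    if flag_enabled("new_policy_check", request_id, ROLLOUT_PERCENT):
        return new_policy_check(payload)  # the small, flagged slice of traffic
    return legacy_check(payload)          # everyone else stays on the old path


if __name__ == "__main__":
    print(check_request("req-12345", {"project": "demo", "quota": 100}))
```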

Going forward, Google will change its architecture so that if one component fails, the rest of the system can keep operating without crashing, the company said. Google said it will also audit all systems and improve its communications “both automated and human, so our customers get the information they need asap to react to issues.”

— CNBC’s Jordan Novet contributed to this report.


AMD shares rise 9% after analysts say they expect a ‘snapback’ for chipmaker


AMD CEO Lisa Su unveils the AMD vision for Advancing AI.

Courtesy: AMD

Shares of Advanced Micro Devices rose nearly 9% on Monday after analysts at Piper Sandler lifted their price target on the stock on optimism about the chipmaker’s latest product announcement.

The analysts said they see a snapback for AMD’s graphics processing units, or GPUs, in the fourth quarter. That’s when they expect the chipmaker to be through the bulk of the $800 million in charges that AMD said it would incur as a result of a new U.S. license requirement that applies to exports of semiconductors to China and other countries. 

Last week, AMD revealed its next-generation artificial intelligence chips, the Instinct MI400 series. Notably, the company unveiled a full-server rack called Helios that enables thousands of the chips to be tied together. That chip system is expected to be important for AI customers such as cloud companies and developers of large language models. 

AMD CEO Lisa Su showed the products on stage at an event in San Jose, California, alongside OpenAI CEO Sam Altman, who said they sounded “totally crazy.”

“Overall, we are enthused with the product launches at the AMD event this week, specifically the Helios rack, which we think is pivotal for AMD Instinct growth,” the analysts wrote in their note. 

Piper Sandler raised its price target on AMD from $125 to $140.

The stock jumped past $126 on Monday to close at its highest level since Jan. 7, before President Donald Trump announced sweeping new tariffs and AMD warned of the chip control charges.
