Apple Vision Pro review: Here's what you need to know

It’s night. I’m at a lake near Oregon’s Mount Hood, sitting on the beach. Jazz music is playing as I write. I’m not in the real world.

Well, I sort of am. 

I’m wearing Apple’s new Vision Pro headset, which looks like a fancy pair of glowing ski goggles.

Apple’s long-awaited headset, which starts at $3,500, launches in the U.S. on Friday. It’s the company’s first major new gadget to hit the market since the Apple Watch debuted in April 2015. I’ve been testing it for nearly a week. While it has some shortcomings, it’s easily the most fun new product I’ve tried out in years.

Analysts don’t expect the Vision Pro to drive massive amounts of revenue initially. UBS anticipates Apple will ship about 400,000 headsets, leading to a “relatively immaterial” $1.4 billion in revenue this year. However, I’m convinced that if Apple eventually sells cheaper versions, we’ll see millions of people using them in the coming years.

Apple Vision Pro home screen. Here I’m on top of a mountain in Hawaii.

Todd Haselton | CNBC

The Vision Pro offers a new kind of experience that Apple calls “spatial computing.” You sit in your world while looking at a digital one, and then plop different apps around you. You can work, play games, watch movies or surf the web.

Thanks to very sharp displays, and a full M2 processor that’s usually found in Macs, the Vision Pro has the power to do a lot of what you’d expect from an Apple device. There’s a dedicated App Store for Vision Pro apps, but you can also install more than a million iPhone or iPad apps. Or pair it with your Mac and work while looking at a 4K display inside the goggles.

I’m only scratching the surface of the capabilities, but here’s the gist: This is an entirely new type of computing, providing a whole new world of experiences. It feels like the future.

Here’s what you need to know:

What’s good

Apple Vision Pro

Source: Apple

I was skeptical when I first met with Apple to see the Vision Pro. Companies have been trying to do virtual reality and augmented reality and mixed reality or gobbledygook reality for years.

Sometimes it’s cool, but most of the time I’m done after an hour or so.

With the Vision Pro, there are three key parts that come into play. It has super sharp and colorful screens, it allows you to see the world around you by default using “passthrough” technology, and it has a fast processor.

Text is super crisp on the Apple Vision Pro

Todd Haselton | CNBC

The displays help remove the “screendoor” effect that’s common in lower-cost headsets like the Meta Quest 3. That’s where you can see the pixels as you look through a headset. You can easily read text on a website or a book on the Vision Pro. And I was able to watch movies, including in 3D, on screens bigger and nicer than any TV in my house.

Apple Vision Pro.

Source: Apple

The Quest 3 and other headsets also have passthrough. But Apple’s works better. It’s clearer and sharper, enough so that I can comfortably see the room around me in full color and without any lag, though I still can’t read my phone. And I love how you can turn the small digital crown, just like on the Apple Watch or AirPods Max, to adjust the volume or transport yourself into a fully 3D landscape.

You can select different scenes to surround you.

Todd Haselton | CNBC

Virtual travel is a nice touch. You can work or watch movies in Hawaii, by a lake, in White Sands or at Joshua Tree. They’re all relaxing environments with calming sounds and slow animations, like clouds moving across the sky, that help you feel like you’re almost there.

Navigation is easy once you get the hang of it. This reminds me a bit of the iPhone moment, when Apple launched its multitouch display and changed how we interact with phones that had largely been navigated with a stylus, touchpad or keyboard. There aren’t any controllers here. The headset uses sensors to track your eyes (and even verify your identity when you’re making purchases online or in the App Store). Apple has a quick setup process that aligns the headset to your eyes and then has you look at a series of dots, pinching your fingers as you go to calibrate. If you wear glasses, Apple also sells inserts that pop into the headset.

It’s incredibly accurate. You just look where you want to go and then tap your thumb and index finger to select a button or app. There’s a white bar at the bottom of every app, for example, that you can grab to pull and push around. You can adjust the size of any app by looking at the corner and then dragging it out or in at a diagonal angle. And you can swipe through photos or scroll websites by holding your index finger and thumb together while pulling up or down.

Likewise, you zoom in and out by holding those fingers on both hands and pulling outward or inward. You don’t have to flail your hands in front of you. The headset’s external cameras can detect your fingers down in your lap. You can be subtle.

Apple Vision Pro with a bunch of apps open.

Todd Haselton | CNBC

It packs a punch.

I launched more than a dozen apps around me. There’s no point in doing more, because you can’t see it all. I loved setting it up with a browser in front of me, music next to me and a TV screen above it all. But the world is yours to customize. You can open mail and a browser or leave Slack open next to a Word document with your calendar on the other side. Put your text messages on the ceiling if you want. It’s a completely new way to multitask.

Multitasking with the Apple Vision Pro

Todd Haselton | CNBC

A note: My screenshots show apps askew. But, in the headset, they’re all perfectly level.

I didn’t run into any slowdowns during my time with the Vision Pro. Part of that is due to how Apple renders content. It’s technically only sharpening the areas of the screen where you’re focusing, leaving the other areas blurry. That’s why some of the screenshots here look out of focus around the sides. Inside the headset, it’s all super crisp. It’s called foveated rendering, and it allows for optimized processing.

Gaming on the Apple Vision Pro is a lot of fun.

Todd Haselton | CNBC

I loved watching movies with the headset. I lounged on my couch, put up a huge screen across the wall of my living room and watched an hour of “Barbie” and the first two episodes of “Masters of the Air” before the battery was at about 5%. Another night I watched “Greyhound.” I used the NBA app, which was updated to work on the Vision Pro, to stream four games at once, with the main game in the middle and others pinned to the sides. It’s wild.

With the NBA app I could watch a bunch of games at once.

Todd Haselton | CNBC

Apple also has some specially recorded content that’s so sharp you feel like you’re standing right there next to a rendered dinosaur or a video of a rhinoceros. There’s a slightly terrifying clip with a woman walking on a tightrope between cliffs. Don’t watch if you’re afraid of heights. The clips show the type of content third parties will eventually be able to record and publish to the headset. I imagine sports highlights or even sitting courtside at a live game.

The Disney+ app is fun. You can watch movies in one of about four different 3D landscapes. I sat in a racer on Tatooine and watched a bit of a Star Wars movie, but then switched over to watch “Spider-Man: Into the Spider-verse” in 3D. Unlike 3D TVs and movies, which generally flopped, the effects work well in the headset. It’s neat, but I still prefer watching movies in 2D. It feels more natural to me.

Apple Vision Pro FaceTiming and multitasking.

Todd Haselton | CNBC

FaceTime works well. You see a clear video of the person you’re calling on a screen in front of you. But they don’t see you. Or, not the real you. They see a 3D-rendered version of you called a digital Persona. It’s still in beta, and mine looked like a much older version of me. My colleague thought I looked like an 80-year-old man. My wife laughed.

You create a Persona by selecting an option in the settings menu and then removing the headset and following screens on the external display. It asks you to look up, look down, look left, look right, smile, smile with teeth, and close your eyes. Then, in seconds, it creates a 3D Persona.

My digital Persona from the Apple Vision Pro. I think I look great!

It looks more human than the cartoony avatars on other headsets. I spoke with people over FaceTime who were also using Personas, and it’s much easier to hold a conversation without feeling like you’re two goofy avatars trying to talk. You could hold a real meeting in your pajamas while your Persona is in work attire. Personas also carry over to other apps like WebEx.

You can see my persona’s eyes on the screen here.

Jay Yarow | CNBC

Your Persona’s eyes can appear on the outside display. Someone will see glowing effects on the outside of the headset if you have screens up in front of you. If they begin talking to you while you’re in an immersive view, like one of the landscapes I mentioned earlier, they’ll fade into view so you can see them. As you look at them, the eyes of your digital Persona become visible on the outside of the headset. It looks like you’re wearing a snorkeling mask.

In real life, I just took the headset off when my wife came in to chat.

The built-in speakers are great. They get nice and loud and support spatial audio, so if you turn your head away from the movie in front of you, the sound stays in the same place, much like if you were watching a real TV. Music and movies sounded fantastic, with full surround sound. People can hear the audio coming out of the headset, though, so you’ll want to use AirPods in public.

Photos in the Apple Vision Pro

Todd Haselton | CNBC

I love the “spatial photos” you can capture using the cameras on the outside of the Vision Pro or with the latest iPhone 15 Pro and iPhone 15 Pro Max. The camera creates a 3D version of a photo or video. I filmed my 4-month-old daughter eating and my dog’s 9th birthday, for example, in hopes that I’ll be able to come back and relive some of those moments. I wish I had recorded some of these videos when my stepfather was alive because it’d feel like he was in the room with me. Some people might see it as a gimmick, but I found it moving.

Lastly, the build quality is superb. Apple used top-of-the-line glass, screens and metals. It feels like a premium headset and it’s comfortable to wear. My only complaint is that I had to be deliberate to hold it by the metal frame. The padded inserts pop off their magnets if you try to grab them. Those could be stronger, but they were designed to be easily removed so people could share the headset by popping in their own inserts.

What’s bad

Apple Vision Pro

Todd Haselton | CNBC

Apple’s apps work well. You’ll find Notes, Music, Safari, Podcasts, Photos, Apple TV+, Maps and more. Other apps include SkyGuide and Disney+ and there are Apple Arcade games. Many more are coming, as most apps haven’t yet been built specifically for Vision Pro.

The Vision Pro supports more than a million iPhone and iPad apps. But you need to search for each app individually and some of them aren’t available. Netflix and Spotify haven’t been shy about not supporting the Vision Pro, though you can easily access either using the browser. Still, there are lots of others that I couldn’t find: 1Password isn’t there, which made logging into some apps a bit of a pain. You won’t find Uber, DoorDash (but there’s GrubHub!) or Amazon. None of Google’s apps are here, including YouTube TV, though it works fine in the browser.

SkyGuide in the Apple Vision Pro is fantastic.

Todd Haselton | CNBC

Popular games like Diablo Immortal and Genshin Impact aren’t available. Facebook’s apps aren’t here, so no Instagram. These are just a few I noticed.

Some work well, though. I didn’t have any issues with the X iPad app, for example. CNBC’s app worked fine. Others, like Amazon Prime Video, exist but aren’t great. A bug shows a big box in the middle of the screen when you’re watching a movie, but a fix is coming.

Some apps aren’t available yet because their developers are still optimizing them and eliminating bugs.

X on the Apple Vision Pro

Todd Haselton | CNBC

Apple’s iCloud Keychain was sometimes buggy in iPad apps. It’s Apple’s version of 1Password, and I rely on it to enter my usernames and passwords. It generally works fine. But if you have two logins for an app, like my wife and I do for Amazon Prime Video or Peacock, the app locks up when you try to select a different login. I informed Apple of the bug.

The floating keyboard is useful for searches or typing quick messages, but you won’t be able to type very fast at first. You look at each letter on a digital keyboard and select it, or reach out and tap the digital keyboard. I got faster during my time with the Vision Pro, but nowhere near as quick as I am on my iPhone or a real keyboard. You can also use Siri voice-to-text to respond to iMessages or enter URLs in the browser (and launch apps). Still, you’re going to want a physical keyboard if you have to do a lot of typing.

There’s also the battery pack, which plugs into the headset with a proprietary connector that you twist in. I don’t mind it. The pack worked fine, but it would be a lot easier if the battery were embedded in the headset, though that would add weight.

Should you buy it?

Apple Vision Pro

Todd Haselton | CNBC

I’d buy the Vision Pro right now if I had an extra $3,500. I’d even consider trading in my iPad Pro and MacBook Pro to offset the cost since the headset gives me a lot of the same capabilities. But that’s not an option.

You’ll definitely love it for movies. I think a lot of people will also really enjoy being able to read the news and browse the web while having a huge TV screen open and lounging on their couch. Some may find they can work in it. I did. It’s fun.

Apple’s real opportunity will materialize when it finds a way to mass-produce the Vision Pro for closer to $2,000, or less. Until then, it may be a niche product. But the experience blows everything else out of the water. It’s Apple’s most exciting product in years and the strongest sign yet that this will become a new way of computing.


Workday shares sink on subscription revenue guidance concerns


The Workday Inc. pop-up pavilion ahead of the World Economic Forum (WEF) in Davos, Switzerland, on Saturday, Jan. 19, 2025.

Hollie Adams | Bloomberg | Getty Images

Shares of software maker Workday dropped as much as 10% on Wednesday as analysts lowered their price targets, citing a lack of upside after the company revised its full-year subscription revenue forecast.

Many software stocks have been under pressure in 2025 as commentators have worried that generative artificial intelligence tools that can quickly write lines of code might pose risks to incumbents.

This year, Workday has announced the launch of several AI agents and expanded its offerings through startup acquisitions. Earlier this month, Workday completed the $1.1 billion purchase of AI and learning software company Sana.

Despite those moves, Workday’s third-quarter earnings report on Tuesday failed to impress Wall Street.

The company called for $8.83 billion in subscription revenue for the fiscal year that will end in January 2026, implying 14.4% growth, but the figure was up just $13 million from the company’s guidance in August. The new number includes contributions from Sana and a contract with the U.S. Defense Intelligence Agency, Workday finance chief Zane Rowe told analysts on a conference call.

“Investors were likely looking for more of a beat-and-raise quarter,” Cantor Fitzgerald analysts Matt VanVliet and Mason Marion wrote in a note to clients. They have the equivalent of a buy rating on Workday stock. The new number, they wrote, “borders on a slight guide down.” The analysts held their 12-month price target on Workday stock at $280.

Stifel, with a hold rating on the stock, lowered its Workday target to $235 from $255.

“It does not appear that the underlying momentum of the business is showing any signs of stabilization,” Stifel’s Brad Reback and Robert Galvin wrote in a note.

Reback and Galvin said Workday implied that growth from its 12-month subscription revenue backlog will continue to slow when excluding the impact of acquisitions. They expect the trend to continue even as customers sign up for Workday’s AI products.

The outcome was “like turkey without the gravy,” Evercore analysts, with the equivalent of a buy rating on the stock, wrote in the title of their note.

Analysts at RBC, which also has the equivalent of a buy rating on Workday shares, lowered their price target to $320 from $340. Despite the mixed guidance, they wrote in a note to clients, results for the fiscal third quarter did exceed consensus. Plus, AI products contributed over 1.5 percentage points of annualized revenue growth, Workday CEO Carl Eschenbach said on Tuesday’s conference call.

“We remain encouraged by early AI momentum,” the RBC analysts wrote.

WATCH: AI will drive the market higher in 2026, says Citizens’ Mark Lehmann


MIT study finds AI can already replace 11.7% of U.S. workforce


Massachusetts Institute of Technology on Wednesday released a study that found that artificial intelligence can already replace 11.7% of the U.S. labor market, or as much as $1.2 trillion in wages across finance, health care and professional services.

The study was conducted using a labor simulation tool called the Iceberg Index, which was created by MIT and Oak Ridge National Laboratory. The index simulates how 151 million U.S. workers interact across the country and how they are affected by AI and corresponding policy.

The Iceberg Index, which was announced earlier this year, offers a forward-looking view of how AI may reshape the labor market, not just in coastal tech hubs but across every state in the country. For lawmakers preparing billion-dollar reskilling and training investments, the index offers a detailed map of where disruption is forming down to the zip code.

“Basically, we are creating a digital twin for the U.S. labor market,” said Prasanna Balaprakash, ORNL director and co-leader of the research. ORNL is a Department of Energy research center in eastern Tennessee, home to the Frontier supercomputer, which powers many large-scale modeling efforts.

The index runs population-level experiments, revealing how AI reshapes tasks, skills and labor flows long before those changes show up in the real economy, Balaprakash said.

The index treats the 151 million workers as individual agents, each tagged with skills, tasks, occupation and location. It maps more than 32,000 skills across 923 occupations in 3,000 counties, then measures where current AI systems can already perform those skills.

What the researchers found is that the visible tip of the iceberg — the layoffs and role shifts in tech, computing and information technology — represents just 2.2% of total wage exposure, or about $211 billion. Beneath the surface lies the total exposure, the $1.2 trillion in wages, and that includes routine functions in human resources, logistics, finance, and office administration. Those are areas sometimes overlooked in automation forecasts.

The index is not a prediction engine about exactly when or where jobs will be lost, the researchers said. Instead, it’s meant to give a skills-centered snapshot of what today’s AI systems can already do, and give policymakers a structured way to explore what-if scenarios before they commit real money and legislation.

The researchers partnered with state governments to run proactive simulations. Tennessee, North Carolina and Utah helped validate the model using their own labor data and have begun building policy scenarios using the platform.


Tennessee moved first, citing the Iceberg Index in its official AI Workforce Action Plan released this month. Utah state leaders are preparing to release a similar report based on Iceberg’s modeling.

North Carolina state Sen. DeAndrea Salvador, who has worked closely with MIT on the project, said what drew her to the research is how it surfaces effects that traditional tools miss. She added that one of the most useful features is the ability to drill down to local detail.

“One of the things that you can go down to is county-specific data to essentially say, within a certain census block, here are the skills that is currently happening now and then matching those skills with what are the likelihood of them being automated or augmented, and what could that mean in terms of the shifts in the state’s GDP in that area, but also in employment,” she said.

Salvador said that kind of simulation work is especially valuable as states stand up overlapping AI task forces and working groups.

The Iceberg Index also challenges a common assumption about AI risk — that it will stay confined to tech roles in coastal hubs. The index’s simulations show exposed occupations spread across all 50 states, including inland and rural regions that are often left out of the AI conversation.

To address that gap, the Iceberg team has built an interactive simulation environment that allows states to experiment with different policy levers — from shifting workforce dollars and tweaking training programs to exploring how changes in technology adoption might affect local employment and gross domestic product.

“Project Iceberg enables policymakers and business leaders to identify exposure hotspots, prioritize training and infrastructure investments, and test interventions before committing billions to implementation,” the report says.

Balaprakash, who also serves on the Tennessee Artificial Intelligence Advisory Council, shared state-specific findings with the governor’s team and the state’s AI director. He said many of Tennessee’s core sectors — health care, nuclear energy, manufacturing and transportation — still depend heavily on physical work, which offers some insulation from purely digital automation. The question, he said, is how to use new technologies such as robotics and AI assistants to strengthen those industries rather than hollow them out.

For now, the team is positioning Iceberg not as a finished product but as a sandbox that states can use to prepare for AI’s impact on their workforces.

“It is really aimed towards getting in and starting to try out different scenarios,” Salvador said.

WATCH: Amazon targets middle managers in mass layoffs, memo suggests more cuts coming as AI thins Big Tech
