Apple Vision Pro Full Specs, Features & Details
Today Apple finally revealed Vision Pro, its $3500 "spatial computer" launching in 2024. Here's a rundown of its announced specs and features.
Vision Pro is technically opaque, but uses high resolution color cameras to show you the real world and a display on the front to show your eyes to other people.
It's an ultra high-end headset packing the highest resolution displays and most sensors of any AR/VR product to date. It introduces features never before shipped, and its visionOS fuses eye tracking and hand tracking for intuitive input while rethinking the lines between 2D, AR, and VR.
Of course, a spec sheet only tells part of the story. You should also read our hands-on impressions of Vision Pro.
Vision Pro has a custom aluminum alloy frame supporting a curved "three-dimensionally formed" laminated glass front plate to achieve a relatively thin design.
What keeps Vision Pro lighter than headsets like Quest 2 is the separation of the battery from the headset. Some other headsets like Meta Quest Pro and Pico 4 have their battery in the rear of the strap, but Apple's design takes it off your head entirely with an external battery tethered to a magnetic connector on the left of the headband.
Apple claims the external battery lasts 2 hours, based on its own mix of test conditions.
Alternatively, Vision Pro can be used perpetually by plugging the battery into a USB-C power source. Apple hasn't yet gone into detail about how much power is needed.
Vision Pro features dual micro-OLED displays with unprecedented pixel density. Apple says each is "the size of a postage stamp" yet together they have 23 million pixels.
Apple didn't reveal the exact resolution, but 23 million total pixels would suggest a per-eye resolution of around 3400×3400 for a square aspect ratio, or around 3680×3140 for the 7:6 aspect ratio the marketing renders show.
We don't actually know the real aspect ratio of the displays, however, and it's also conceivable Apple's 23 million figure only counts the pixels visible to the lenses.
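For the curious, here's a rough sketch of where those per-eye estimates come from, assuming the 23 million pixels are split evenly between the two panels (which Apple hasn't confirmed):

```python
# Back-of-the-envelope estimate of per-eye resolution from Apple's
# "23 million pixels" figure. The aspect ratios are assumptions.

def per_eye_resolution(total_pixels: float, aspect_w: float, aspect_h: float) -> tuple[int, int]:
    """Split the total pixel count across two eyes and solve
    width * height = pixels_per_eye with width / height = aspect_w / aspect_h."""
    pixels_per_eye = total_pixels / 2
    height = (pixels_per_eye * aspect_h / aspect_w) ** 0.5
    width = height * aspect_w / aspect_h
    return round(width), round(height)

print(per_eye_resolution(23_000_000, 1, 1))  # (3391, 3391) -> "around 3400×3400"
print(per_eye_resolution(23_000_000, 7, 6))  # (3663, 3140) -> roughly the 7:6 estimate
```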
The normal display refresh rate is 90Hz, but this automatically increases to 96Hz when watching 24fps content to avoid frame pacing judder.
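The reason 96Hz helps is simple arithmetic: 96 is an exact multiple of 24, while 90 is not, so at 90Hz each movie frame can't be shown for a consistent number of refreshes:

```python
# Why Vision Pro bumps to 96Hz for 24fps video: 96 divides evenly by 24,
# so every movie frame is held for exactly 4 refreshes. At 90Hz the cadence
# is uneven (frames alternate between 3 and 4 refreshes), which reads as judder.

def refreshes_per_frame(display_hz: float, content_fps: float) -> float:
    return display_hz / content_fps

print(refreshes_per_frame(90, 24))  # 3.75 -> uneven 3-4-4-4 style cadence
print(refreshes_per_frame(96, 24))  # 4.0  -> perfectly even pacing
```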
Apple confirmed Vision Pro's displays support wide color gamut and high dynamic range, but didn't reveal detailed specs like peak brightness.
Vision Pro has a total of twelve cameras, a TrueDepth sensor, a LiDAR sensor, and six microphones.
Six of the twelve cameras are under the front glass.
Two of these provide high resolution color for the headset's passthrough view of the real world, streaming "over one billion color pixels per second."
The other four front cameras are used for headset positional tracking, and likely have fisheye lenses.
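As a back-of-the-envelope sanity check on that "over one billion color pixels per second" claim for the two passthrough cameras, here's what it would imply per camera; the 90fps figure is our assumption (matching the displays), not a published spec:

```python
# Sanity check on "over one billion color pixels per second" for the two
# passthrough cameras, assuming they run at 90fps to match the displays.

def implied_megapixels_per_camera(pixels_per_second: float, cameras: int, fps: float) -> float:
    return pixels_per_second / cameras / fps / 1e6

print(implied_megapixels_per_camera(1e9, 2, 90))  # ~5.6 megapixels per camera
```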
One purpose of the TrueDepth sensor is hand tracking. Apple describes the hand tracking quality as "so precise, it completely frees up your hands from needing clumsy hardware controllers."
Vision Pro lacks any kind of tracked controllers, though it supports playing traditional iPad games on a virtual screen with a gamepad.
The LiDAR sensor is used to perform real time 3D meshing of your environment, in conjunction with the other front cameras.
Apps can leverage this mesh but don't get access to actual camera data. One use case example Apple gave was virtual objects casting shadows on real tables, but this only scratches the surface of what should be possible.
Two downward-facing cameras track your face, while four internal IR cameras beside them track your eyes, helped by a ring of LED illuminators around the lenses.
Vision Pro's eye tracking serves three purposes: authentication, foveated rendering, and driving your FaceTime avatar.
Apple is calling its new iris scanning authentication OpticID, following the naming scheme of TouchID and FaceID from its other devices. OpticID is how you unlock Vision Pro, and it also works with Apple Pay purchases and password autofill. As with TouchID and FaceID, the biometric data powering OpticID is processed on-device by a Secure Enclave Processor.
Foveated rendering is a technique where only the small region of the display your eyes are currently looking at is rendered at full resolution, while the rest is rendered at lower resolution. It leverages the fact that our eyes only see in high resolution at the very center of our vision, the fovea. The GPU resources freed up can be used for better performance, higher rendering resolution, or increased graphics settings.
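To illustrate why this saves so much GPU work, here's a toy estimate of the shading cost. The region sizes and scales are invented for illustration; Apple hasn't published its actual parameters:

```python
# Toy estimate of foveated rendering savings with made-up parameters.

def shading_cost(foveal_fraction: float, periphery_scale: float) -> float:
    """Fraction of full-resolution shading work when only `foveal_fraction` of
    the image is rendered at full resolution and the rest is rendered at
    `periphery_scale` resolution per axis."""
    periphery_fraction = 1.0 - foveal_fraction
    return foveal_fraction + periphery_fraction * periphery_scale ** 2

# e.g. 10% of the frame at full resolution, the rest at half resolution per axis:
print(shading_cost(0.10, 0.5))  # 0.325 -> roughly a third of the pixel work
```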
Finally, eye tracking combines with the downward-facing cameras to track your facial expressions in real time to drive your FaceTime Persona, Apple's take on photorealistic avatars. Meta has been showing off research toward this for over four years now, but it looks like Apple will be the first to ship, albeit not at the same quality as Meta's research.
To fuse the input from all these cameras, sensors, and microphones together, Apple developed a custom chip it calls R1.
Apple claims R1 "virtually eliminates lag, streaming new images to the displays within 12 milliseconds."
For comparison, the founder of the French headset startup Lynx claims Meta Quest Pro's passthrough latency is 35-60 milliseconds. It's unclear if this is a like-for-like comparison though.
Vision Pro has only two physical controls, both on the top: a button to capture "spatial videos" and "spatial photos" at any moment, and a Digital Crown.
Pressing the Digital Crown brings up the system Home View. But turning it controls your level of immersion, all the way from full AR to full VR. If you go halfway, for example, you'll see VR in front of you and AR behind you.
On existing headsets like Meta Quest and Pico 4, passthrough is a toggle option, meaning you have to choose between full and no immersion. Apple wants to instead let you choose exactly how engaged to be with your real surroundings.
A feature unique to Vision Pro is an external display that shows your eyes to other people in the room and indicates how aware of them you are. Apple calls this technology EyeSight.
When you're in an AR app, EyeSight shows a colored pattern in front of your eyes, and when you're in a VR app it shows only the pattern, with your eyes invisible.
When someone comes close to you, Vision Pro will show a cutout of the person and EyeSight will reveal your eyes to them.
EyeSight uses a curved OLED panel with a lenticular lens so the perspective of your eyes looks correct regardless of the angle people are looking at the headset from. Apple claims this is effectively "a 3D display that makes the device look transparent".
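Conceptually, a lenticular display works by rendering several views of the wearer's eyes and directing a different one to each viewing angle. Here's a minimal sketch of that mapping, purely illustrative and with made-up numbers, not Apple's implementation:

```python
# Conceptual model of a lenticular display: the observer's angle to the panel
# determines which of several pre-rendered views of the eyes they see.

def visible_view(observer_angle_deg: float, viewing_cone_deg: float, num_views: int) -> int:
    """Map an observer's horizontal angle (0 = straight on) to one of the
    pre-rendered views spread across the viewing cone."""
    half_cone = viewing_cone_deg / 2
    clamped = max(-half_cone, min(half_cone, observer_angle_deg))
    t = (clamped + half_cone) / viewing_cone_deg  # 0..1 across the cone
    return min(int(t * num_views), num_views - 1)

print(visible_view(0, 60, 8))    # 4 -> someone straight ahead sees a central view
print(visible_view(-25, 60, 8))  # 0 -> someone far to the left sees a left-angle view
```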
Apple described making sure that you're "never isolated from the people around you" as one of its "foundational design goals" for Vision Pro, and the company sees this as a clear differentiator from fully opaque headsets like Meta's Quest line.
Vision Pro is powered by the same Apple Silicon M2 chip used in recent Macs.
Apple says this delivers "unparalleled standalone performance", and allows Vision Pro to "maintain a comfortable temperature and run virtually silent."
Compared to the rumored specs of the next-generation Qualcomm Snapdragon chipset Meta Quest 3 will use, Apple's M2 should deliver roughly 30-50% faster single-threaded CPU performance and roughly 80% faster multi-threaded performance. The GPU is also more powerful, but this is much harder to estimate, as benchmarks vary wildly and often don't reflect real-world performance.
However, without knowing the exact clock speeds of the processors in each headset, this is only a very rough comparison. In practice, Vision Pro should also have far more GPU power effectively available than Quest 3, since it uses foveated rendering.
visionOS is Apple's custom "spatial operating system" for Vision Pro, and presumably for future headsets in the Vision line too.
Apple describes visionOS as "familiar, yet groundbreaking." It presents you with floating 2D apps that you scroll through with a flick of your fingers. You select menu items just by looking at them, and tap your fingers together to click.
Many of Apple's first party apps and services are available in visionOS, including Notes, Messages, Safari, Keynote, Photos, FaceTime, and Apple Music.
Rather than just existing within their 2D frame, many of Apple's apps "become spatial," taking up space around you. In FaceTime group calls for example, each person's webcam view becomes its own floating rectangle, with spatial audio coming from each person. Apple also gave the example of being able to pull out 3D models from messages into real space.
Vision Pro also lets you expand your Mac's display into a huge virtual display wirelessly just by looking at it.
A major focus of visionOS is watching movies and TV shows on a huge virtual screen, including support for viewing 3D movies from Apple's library with depth.
Vision Pro has "audio pods" on its sides, each with two drivers. Apple describes it as the "most advanced Spatial Audio system ever."
If you have an iPhone with a TrueDepth FaceID sensor you can scan your face to enable Personalized Spatial Audio, where the system will tailor the sound to your head and ear geometry for the most precise possible spatial audio.
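Apple hasn't detailed how Personalized Spatial Audio works internally, but the general industry technique is binaural rendering with head-related transfer functions (HRTFs) tailored to the listener. A minimal sketch of that idea:

```python
# Generic binaural rendering sketch: a mono source is filtered through per-ear
# head-related impulse responses (HRIRs). Personalized spatial audio generally
# means using HRIRs tailored to your head and ear shape; Apple hasn't published
# the specifics of its own pipeline.

import numpy as np

def binauralize(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """Return a (samples, 2) stereo signal from a mono source and per-ear HRIRs."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)

# Toy example with placeholder impulse responses (real HRIRs come from
# measurement or, in Apple's case, estimation from a face scan):
source = np.random.randn(48_000)            # one second of audio at 48kHz
hrir_l = np.array([1.0, 0.3, 0.1])          # made-up left-ear filter
hrir_r = np.array([0.6, 0.4, 0.2])          # made-up right-ear filter
print(binauralize(source, hrir_l, hrir_r).shape)  # (48002, 2)
```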
Vision Pro also uses a technique called Audio Ray Tracing, where it scans the features and materials of your space to "precisely match sound to your room." This technique is also used in Apple's HomePod speakers.
Apple claims Vision Pro buyers will be "totally convinced that sounds are coming from the environment around you."
Apple Vision Pro will go on sale in the US in early 2024 starting at $3500. It'll be available online and in Apple Stores.
Apple says more countries will get Vision Pro "later next year," but didn't disclose exactly which countries.
Update June 8: added display refresh rates and a link to our hands-on impressions.
[Specs comparison table: Apple Vision Pro vs. Meta Quest Pro, covering launch, operating system, lens adjustment, display type, total pixels, HDR, refresh rates, chipset, color cameras, depth sensors, eye & face tracking, battery location, battery life, front display, microphones, authentication, tracked controllers, and price.]