After the whirlwind of team activity that was our initial Vision Pro teardown, we’ve been digging much, much deeper into those dual displays, the multitude of sensors, the lenses, and that beautifully-overengineered battery pack. And not to spoil the surprise, but we’ve found that those dual displays are utterly incredible: You can fit more than 50 Vision Pro pixels into the space of a single iPhone 15 Pro pixel. Yes, you read that right.
Apple says that the Vision Pro’s displays offer “more pixels than a 4K TV for each eye.” But what exactly does 4K, or any K, mean this close to your eyeballs? Your phone has a much higher pixel density than your TV, for example, and yet you cannot see the pixels on either in normal use. So let’s get into the details, guided by our senior tech writer Arthur Shi.
First up, let’s talk about the two measurements that matter. Pixels per inch, or PPI, is what it sounds like—the number of pixels crammed into each linear inch of the screen. It’s an absolute measure, based on the physical properties of the device.
Then there are pixels per degree (PPD). PPD takes into account your distance from the screen. The closer your eyes get to the screen, the easier it is for you to make out individual pixels, which is why your phone’s screen needs a higher resolution than your TV, and why a movie at the theater looks fine, even though it may only be 2K. So what does that mean for the Vision Pro?
Pixels Per Inch, Where Size Matters
Each lens assembly has a pancake lens array, a housing with embedded eye-tracking cameras, and a display panel. The display panels are most likely made by Sony—possibly a custom version of their microOLED displays.
The display doesn’t light up from edge to edge, so we’re only counting the lit portion of each panel. The lit display area measures around 27.5 mm wide by 24 mm high, or about one inch by one inch. Using an Evident Scientific DSX1000 microscope, we measured the pixels at 7.5 μm (the size of a red blood cell). Each pixel is roughly a square: red and green sub-pixels stacked on top of each other, with a double-sized blue sub-pixel on the side. With those measurements, the lit area totals 3660 px by 3200 px. That equates to 11,712,000 pixels smushed into roughly one square inch!
But the corners are cut off, which reduces the pixel count even further. It’s asymmetrical, with triangular corner cut-off areas of 6.95 mm², 11.52 mm², 9.9 mm², and 10.15 mm², totaling 38.52 mm² of deactivated screen. Compare that to the total lit area of around 660 mm², and we see that the corners cut off about 5.8% of the total, leaving roughly 11,028,000 visible pixels per panel. Counting both panels and a margin of error on our side, that’s close to the 23,000,000 pixels Apple claims.
Divide the pixel count along each edge by that edge’s length, and you can calculate the PPI (pixels per inch), which as mentioned is a measurement of pixel density. The Vision Pro comes in at a stunning 3,386 PPI. This, says Arthur, “is AMAZING PPI!” (his caps, not mine).
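If you want to check our work, here’s the arithmetic in a quick Python sketch. The inputs are our own measurements from above; everything else is plain math.

```python
# Pixel-count and PPI arithmetic from our microscope measurements:
# lit area ~27.5 mm x 24 mm, pixel pitch 7.5 um, four triangular
# corner cut-offs totaling 38.52 mm^2.

MM_PER_INCH = 25.4

pitch_mm = 7.5 / 1000                     # 7.5 um pixel pitch
px_w, px_h = 3660, 3200                   # measured pixel grid
total_px = px_w * px_h                    # full rectangular grid

lit_area_mm2 = 27.5 * 24                  # ~660 mm^2
corner_mm2 = 6.95 + 11.52 + 9.9 + 10.15   # 38.52 mm^2 deactivated
visible_px = total_px * (1 - corner_mm2 / lit_area_mm2)

ppi = MM_PER_INCH / pitch_mm              # pixels per inch from the pitch

print(f"full grid: {total_px:,}")                  # 11,712,000
print(f"visible per panel: {visible_px:,.0f}")     # ~11 million
print(f"both panels: {2 * visible_px:,.0f}")       # ~22 million
print(f"PPI: {int(ppi)}")                          # 3386
```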
That many pixels must make the Vision Pro 4K, right? Not quite: the panel’s horizontal resolution doesn’t reach the consumer 4K UHD standard of 3,840 pixels wide. In short, this is a seriously high-res display, but it’s not technically 4K, which is why Apple didn’t simply call these 4K panels.
It’s certainly the highest-density display that we’ve ever seen. For comparison, the iPhone 15 Pro Max has around 460 PPI, which means you can fit ~54 Vision Pro pixels into a single iPhone pixel.
Or how about the 12.9-inch iPad Pro? That has 264 PPI, making the Vision Pro 12.8x finer. For a direct comparison to another VR headset, the HTC Vive Pro 2 has ~950 PPI (4896 px × 2448 px combined), less than a third of the Vision Pro’s density, and the Meta Quest 3 has ~1218 PPI. Don’t worry, it gets weirder.
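Since PPI is a linear density, the number of finer pixels that fit inside one coarser pixel is the square of the PPI ratio. A quick sketch, using the approximate PPI figures above:

```python
# How many Vision Pro pixels fit inside one pixel of another display?
# PPI is linear, so the area ratio is the square of the PPI ratio.
# All PPI figures are approximate.

def pixels_per_pixel(ppi_fine: float, ppi_coarse: float) -> float:
    """Fine-display pixels that fit in one coarse-display pixel (by area)."""
    return (ppi_fine / ppi_coarse) ** 2

VISION_PRO_PPI = 3386

print(pixels_per_pixel(VISION_PRO_PPI, 460))   # iPhone 15 Pro Max: ~54
print(pixels_per_pixel(VISION_PRO_PPI, 264))   # 12.9" iPad Pro: ~165 by area
print(VISION_PRO_PPI / 264)                    # linear: ~12.8x finer
```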
Pixels Per Degree: It’s All About the Angle
A high pixel density alone doesn’t guarantee great visual resolution. Consider a 65″ 4K TV, which has a “paltry” 68 PPI. But your 4K movie still looks great on it. Why? Because you’re sitting pretty far away from the TV, so each pixel occupies only a tiny slice of your field of view. The farther away you sit, the smaller each pixel appears. Think of how the Jumbotron at your local sports venue doesn’t look pixelated when you’re sitting across the stadium.
Because of this, VR engineers like to use a slightly fancier metric to measure display “goodness”: angular resolution, measured in pixels per degree, or PPD. This is the number of horizontal pixels per degree of viewing angle, and it lets us compare between displays with different resolutions, designed to be viewed at different distances.
But even that isn’t the whole story, for a variety of reasons:
- The PPD changes between the center of the screen and its edges.
- Lenses intentionally distort and tweak the PPDs in wonky ways.
- Stereoscopic views affect how many “pixels” are seen, muddying calculations.
With a rough measurement of 100° FOV (Field of View), we estimate the Vision Pro to have an average of 34 PPD. In comparison, a 65″ 4K TV viewed 6.5 feet away is 95 average PPD, and the iPhone 15 Pro Max held 1 foot away is 94 average PPD. Even though the pixel density and physical dimensions are drastically different between the iPhone and the 65” TV, they end up having similar PPD simply because of how they’re viewed.
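If you want to play with the numbers yourself, here’s a simplified sketch of the math. The assumptions are ours: average PPD is just horizontal pixels divided by horizontal field of view, the TV is about 56.7 inches wide, and lens distortion is ignored entirely (which is part of why the Vision Pro’s real-world average lands closer to 34 than this naive figure).

```python
# Back-of-the-envelope pixels-per-degree (PPD) calculator.
import math

def avg_ppd_flat(h_pixels: int, width_in: float, distance_in: float) -> float:
    """Average PPD for a flat screen viewed head-on: pixels / FOV in degrees."""
    fov_deg = math.degrees(2 * math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 65" 4K TV (~56.7" wide) viewed from 6.5 feet (78 inches)
print(avg_ppd_flat(3840, 56.7, 78))   # ~96 PPD, in line with the ~95 above

# Vision Pro: ~3660 horizontal pixels spread over ~100 degrees of FOV
print(3660 / 100)                     # ~37 PPD before lens distortion
```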
Tl;dr: The Vision Pro may have an ultra-high resolution (PPI) display, but because it’s so close to the eye, it has low angular resolution.
As a final example, friend of iFixit Karl Guttag has a blog post explaining why using the Vision Pro as a replacement for a good monitor is “ridiculous.” The gist is that when you beam the Mac’s output into the Vision Pro, and suspend it in the air in front of you, you’re only using a small portion of the available pixels. Even if the Vision Pro’s displays were really 4K, you’d only be using a small central section, with the rest of the pixels used to show the surrounding room.
This, says Karl, makes for a virtual Mac display with low enough PPD to see individual pixels—not quite the standard desktop display experience. So while you will totally be able to use your Mac in your virtual world, you will prefer an actual 4K or 5K monitor for fine work.
Then again, even if you shell out 200 bucks for the huge carrying case, the Vision Pro is way more portable than an Apple Studio Display.
Wearing glasses under your VR headset is less than ideal, so Apple’s answer is to add accessory lenses inside. For the Vision Pro, those lenses are made by noted German optics specialist ZEISS and snap into place magnetically.
Each lens comes with a pairing code, although this isn’t quite the same kind of parts pairing we usually deal with. These codes don’t lock a component to a single device. If you want to remove your prescription lens inserts from your own Vision Pro and drop them into a friend’s unit, you can. You’ll just have to pair them with the new headset by inputting the same code. The Vision Pro they’re being swapped into has to know the prescription of those lenses and calibrate accordingly, hence the little QR-type code you scan by looking at it during the setup process.
Still, it is parts pairing in that a secret handshake code is required to install lenses—giving ZEISS a solid monopoly, at least for now. It also means that any typos or shipping snafus might render your new lenses useless, at least temporarily.
You’d think that submitting a prescription would be enough to get the Vision Pro calibrated with your lenses—even if Apple is doing some fancy footwork—but without transparency, we can’t be sure.
The Vision Pro poses another small problem for glasses wearers. Contrary to some reports, Apple says that corrective lenses are available for most conditions, including astigmatism (which we weren’t sure about in part one), and they also offer bifocals and progressives. But if your prescription includes a prism value, you’re out of luck. Prism correction is used to treat diplopia, or double vision. The easiest way to see whether your prescription is supported is to use ZEISS’s online tool.
The theme of this device appears to be that well-tailored comfort is fairly simple to achieve, as long as you go through the official channels.
By Apple logic, a complex device requires an equally complex battery solution. The hefty battery bank—$200 from Apple if you want to buy it separately—is both super simple and hilariously over-engineered.
Looking like a bulked-up first-gen iPhone, the case is milled out of a single chunk of aluminum, and the lid snaps into place with firm perimeter clips, leaving little to no seam for us to pry at. We needed a hammer and chisel to open it up! Adhesive also lines the lid, just to make sure you get the message: this pack is not designed to be opened.
As for the battery cells themselves, Apple’s using three iPhone-battery-sized packs stacked atop each other, connected in series. We pulled out an iPhone 15 Plus battery to compare and found that it’s ever-so-slightly smaller in area while being a tad thicker.
The cells in our Vision Pro pack are listed at 15.36 Wh apiece, suggesting 46.08 Wh of total capacity. That doesn’t quite track with the 35.9 Wh rating etched on the pack’s (gorgeous) aluminum enclosure. At first glance, it looks like Apple is undershooting the watt-hour rating by over 20 percent. Apple’s no stranger to battery life debacles, so there’s a chance they’re purposefully undercharging the cells for longevity—the same reason they just introduced an 80% charging limit on the iPhone 15 Pro. Or perhaps they’re calculating Wh differently, accounting for thermal losses, or something else.
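Here’s that math, assuming the three cells really do hold their listed 15.36 Wh each:

```python
# Cell-capacity vs. label-rating arithmetic for the battery pack.
cell_wh = 15.36          # listed capacity per cell
cells = 3                # three cells in series
raw_wh = cell_wh * cells # raw cell capacity

rated_wh = 35.9          # rating etched on the aluminum enclosure
shortfall = 1 - rated_wh / raw_wh

print(f"{raw_wh:.2f} Wh of cells, {rated_wh} Wh rated "
      f"({shortfall:.0%} under the cell total)")
```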
Apple is clearly obsessive about the user experience with this battery, packing in temperature sensors and an accelerometer (so it displays the charge LED when you hold it up, and might even detect when you are wearing the pack). If you’re putting a battery in your pocket for two hours straight, then you really, really don’t want it to get hot.
The pack also outputs a non-USB-standard 13 volts to keep up with the Vision Pro’s processing demands, which is one explanation for the bespoke “big Lightning” cable—so you don’t accidentally plug other devices in and fry them. It also explains why you can’t just plug the headset straight into a USB-C battery pack. In fact, the Vision Pro’s battery pack has enough tech to act as an uninterruptible power supply, feeding the headset clean, consistent power even when plugged into the wall.
All in all, it seems like Apple is taking the risks of wearing a battery really, really seriously, rather than trying to pack in as much juice as possible. This is also a great illustration of why easily swappable batteries are a good idea—you don’t need to go wild with the Watt-hours if you can just grab a spare, even if it’s a $200 spare.
Apple highlighted the Vision Pro’s M2 chip and the new R1 chip in their announcement. But what about the rest? And did you hear about the abnormally complex charging board in the battery pack?
We’ve got you covered! Head over here to check out all the chips in the Vision Pro.
Apple’s biggest advantage over other headset makers comes in the form of sensors, kind of. Yes, anyone can slap a bunch of cameras, a LiDAR scanner, and all that other stuff into a headset, but Apple’s sensors have a secret weapon: years of experience analyzing, interpreting, and fusing complex sensor data and multiple iterations of sensor design.
Remember when Apple added LiDAR to the iPhone 12 Pro and iPad Pro in 2020? Yes, it makes for better low-light photos, distance measurements, and accessibility features for the visually impaired. But we have a hunch that Apple had another motive: The LiDAR sensor in the iPad Pro let Apple test AR features in a pretty low-stakes environment, putting the hardware into mass production, gaining both valuable expertise and user feedback.
The front-facing Face ID TrueDepth camera is another great example of Apple’s sensor tech. The Face ID array has a laser that projects infrared dots onto your face, a flood illuminator that bathes your face in IR, and an IR camera that can see it all. This is processed to create a 3D map of your face. Apple got so good at this that it uses the Face ID array to scan your ears, in order to create a custom 3D model that tailors AirPods’ spatial audio to your ear shape.
The result? Room mapping without the need for guardrails. Face mapping for the (admittedly somewhat creepy) Memoji V2 aka Persona. And no need for handheld controllers.
Another Apple specialty is accelerometers—and the interpretation thereof. You can find them in the iPhone, in the Apple Watch (where it can detect whether you have fallen and call emergency services), and even in the original HomePod. When it detects that it has been moved, the HomePod re-listens to the room and recalibrates its audio accordingly. The AirPods have accelerometers to detect your tap-commands. All of this requires advanced interpretation of sensor data, which Apple has gotten good at year after year.
We might be taking a closer look at those sensors, but it’s not the hardware that really matters in this case. It’s the way Apple has honed the production and reliability of these sensors over the years and integrated them with software designed to milk the most from them.
Coming up with a repairability score for the Vision Pro is just as challenging as calculating the subjective resolution of the displays.
On the one hand, there’s a lot to like here. The battery is modular, so it’s trivial to swap in a fully charged one, or replace a flagging unit after a year or two of hard use (though the headset powers down during the swap, so it’s not a true hot-swap).
Equally modular are the side straps. Granted, these contain speakers that are virtually impossible to excavate, but the combined assembly is modular and pops off with a SIM-card tool. The power port is integrated into one of the band assemblies, so it’s also semi-modular. We also like the easy-to-fit lens inserts and the magnetically attached light shields.
“The super-modularity—to a level where it’s less of a component and more of a feature, like different bands—makes it so that even Apple has how-to’s on its site,” says iFixit repairability engineer Carsten Frauenheim.
We’re really relieved to see that all of the parts that touch your skin are easily removable, including the light seal and light seal cushion. When we score for repairability, we look for easy access to consumable and easily breakable parts. Notably, the optics, screens, and moving parts for the inter-pupillary distance adjustment are on the eye-side as opposed to behind the fragile front glass. Prying up the fabric layer from its clips to access the adjustment is considerably less risky than trying to remove the expensive 3D glass in one go.
But on the other hand, getting to the front cameras and sensors—or anything behind the front glass—is a real headache. That glass cover needs a lot of careful finessing with a heat gun and multiple prying tools to remove in one piece. Sure, if you kill the EyeSight while trying to break in, it’s not the end of the world. The real danger is that cracking the glass might render the sensors behind it useless, simply by blocking their view.
Our repair scoring heavily weights parts like the screen and battery that are necessary for the function of the device. It’s hard to argue that the EyeSight display is critical to the device’s function. But the external sensors are essential, and if you break the glass and obscure them, the device will stop functioning correctly.
So how does the Apple Vision Pro compare on the repairability scale to head-mounted displays (HMDs) from competitors? Arguably the most important comparison is to Meta’s Quest 2 and Quest 3 headsets, which together dominate the VR and AR (collectively XR) markets with a roughly 70% market share.
The answer to that question, along with almost anything related to XR hardware, is that it’s complicated.
Let’s take the Quest 2 and Quest 3 as examples. Both are designed with a front-loaded headset secured by a replaceable harness, much like the Vision Pro. What differentiates Meta’s headsets from most others is that they’re also standalone devices, unlike the Valve Index, HTC Vive, and PS VR2. But that’s where the similarities between the Quest 2 & 3 and the Vision Pro end.
Where the Vision Pro has an external battery pack, both the Quest 2 and Quest 3 have their lithium polymer batteries buried inside the device, to the point that it’s one of the last things you’d remove in a highly complex repair. A battery replacement is the most likely and common repair, and in that category, the Vision Pro is a clear winner. That’s because the Vision Pro’s battery puck, while an integral part of the otherwise unpowered device, is an external component.
Where the Vision Pro fails is the fragility of the front glass, combined with the complexity of the User Interface (UI). Imagine you trip over the non-MagSafe battery cable and the headset tumbles onto your equally beautiful hardwood floor, shattering the glass EyeSight cover.
Even if every sensor is still functional, they’ll be blinded. When cracked glass interrupts the line of sight of the external cameras, the LiDAR sensor, and the IR emitters, your hands won’t track correctly. Since there are no controllers (unlike most other HMDs), anyone with broken front glass will have to lean on accessibility features like voice control.
The Quest 2 and Quest 3 fare much better on durability. For one, the exterior shell is made of plastic, which is far less fragile than glass. The cameras are recessed in their own notch on the case, making them a completely separate module from the rest of the HMD and allowing for easier repair and replacement.
Similar to the Vision Pro, the Quest Pro buries its cameras and sensors under the front plastic, but that front plastic is held in place by clips that simply pop off, allowing for easy and cheap repairs.
When we give a repairability score, we pick which components to evaluate based on what we’re seeing in the rest of the product category. But XR hardware is so bleeding edge that everything from the exterior build to the basic means of navigation still varies wildly from device to device and generation to generation.
For example, we spent a long time talking about how to handle the Vision Pro’s lack of controllers. For other HMDs, we’ve considered how easily replaceable the controller batteries and buttons will be. Should we expect that fewer XR devices will have controllers going forward? The Quest 2 is still the most common VR headset on the market and will likely remain so for a few years, owing to its accessible price point. On the other hand, the Vision Pro (and its intuitive UI) is likely to set the tone for future headsets.
This presents a challenge for scoring on the repairability scale: weighing today’s relatively simple hardware against tomorrow’s complex, factory-calibrated sensors requires a compromise. We have to consider the dominant hardware currently on the market while making sure we don’t unfairly penalize the intuitive technology of tomorrow’s HMDs.
Okay, we’re done philosophizing for now—so, what does this pricey piece of ultra-dense tech earn repair-wise? While we don’t know enough to apply a concrete score, we’re comfortable assigning a provisional 4/10 on the repairability scale.
The bottom line? It’s wild to look at an Apple device this complex and see a replaceable battery. Yes, it’s maybe closer to a power supply, and yes, it’s expensive. But when it comes down to it, Apple built a device intending for parts to be replaced. That’s pretty major. The bar is really that low, folks—there are still HMDs without replaceable pads. Do we wish that the battery pack had replaceable cells? Of course. Is this pretty impressive for Apple’s first foray? Also yes.
Visions of the Future
We’ve yet to complete testing on many aspects. Currently, iFixit engineering intern Chayton Ritter has Apple’s proprietary battery cable splayed out on a breadboard, and is trying to determine what sort of electronic handshakes are required to make the Vision Pro accept battery packs.
And we’ve also been busy swapping internal parts between two headsets to check for the presence of iFixit’s nemesis—parts-pairing. Nothing conclusive so far, but we’d love to hear your experiences while we investigate.
While the future of XR may not yet be here, we think that’s a good thing. By being conscious of the tech we buy, we encourage manufacturers to make better tech. By holding manufacturers accountable for what they create, we can ensure that repairability is considered at an early stage of the design process, and incorporated wherever possible. Our hope is that, by the time these things get shrunken into a pair of eyeglasses, repairability won’t be something tacked on at the end, but a fundamental design tenet. If face goggles really are the future of computing, then we need to get this right from the very beginning.