Why The Apple Vision Pro Feels REAL


Hi, this is Wayne again with today’s topic: “Why The Apple Vision Pro Feels REAL”.
The main thing that makes the Apple Vision Pro different from the VR headsets that came before it is that, even though it has a fully opaque covering in front of your eyes, it feels like you’re looking right through it. Obviously, part of this is Apple’s choice of cameras and screens, but a major contributor is its absurdly low latency, that is, the delay between light hitting the cameras on the outside of the headset and the image being displayed on the internal screens. Apple says that 12 milliseconds is about the most latency the human brain can tolerate before you start to notice a mismatch between a camera feed and what’s actually going on in the physical world, and lo and behold, the listed latency of the Vision Pro is 12 milliseconds. Although this sounds a bit like marketing speak, early reviews of the headset have been positive, with reviewers noting little to no perceptible latency.

So how did they do this? Apple has apparently pulled it off with a chip they’re calling the R1, along with a special operating system, visionOS. Essentially, visionOS is partially a real-time operating system, which makes it quite a bit different from more familiar platforms like Windows or iOS. A real-time operating system, or RTOS, is used in situations where low latency is absolutely paramount. Think, for example, of the avionics systems that use real-time data to help control airliners like the Boeing 787, or the software inside self-driving cars that needs to react to hazards as fast as or faster than a human driver, if those ever become a real thing. But how does an RTOS accomplish this? We’ll tell you right after we thank Paperlike for sponsoring this video.
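As a quick preview of the core idea, here’s a toy sketch in plain Python (purely illustrative, nothing to do with actual visionOS code) comparing a scheduler that runs tasks in arrival order with one that always runs the highest-priority task first, the way an RTOS would. The task names and timings are made up:

```python
# Toy tasks: (priority, name, runtime_ms); lower number = higher priority.
# Names and timings are invented for illustration, not visionOS internals.
TASKS = [
    (3, "app_ui_update", 8.0),
    (1, "camera_passthrough", 2.0),  # the latency-critical task
    (2, "hand_tracking", 3.0),
]

def completion_time(tasks, target):
    """Return when `target` finishes if tasks run to completion in the given order."""
    elapsed = 0.0
    for _, name, runtime in tasks:
        elapsed += runtime
        if name == target:
            return elapsed
    raise ValueError(f"unknown task: {target}")

# General-purpose style: tasks simply wait their turn in arrival order.
fifo_time = completion_time(TASKS, "camera_passthrough")

# RTOS style: the highest-priority task always goes first.
rtos_time = completion_time(sorted(TASKS), "camera_passthrough")

print(f"arrival order: {fifo_time} ms, priority-first: {rtos_time} ms")
```

In the arrival-order run, the latency-critical task sits behind whatever happened to be queued ahead of it; in the priority-first run, it completes right away.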

If you’re looking for a screen protector for your iPad, Paperlike has got you covered; it’s got your iPad covered, too. The Paperlike 2.1 is manufactured in Switzerland and is designed to help you write and draw on your iPad just like you would on paper. It uses an exclusive microbead technology called Nanodots to emulate the stroke resistance of paper without sacrificing screen clarity. Make sure to check out Paperlike at the link below.

Fundamentally, a real-time operating system enforces rules that guarantee certain tasks are completed within a fixed amount of time. Instead of a situation where most of the running processes have to wait their turn to go through the processor, an RTOS makes sure important tasks are completed first. An RTOS is also often deterministic, a fancy way of saying that the same inputs will result in the same outputs every time. In more concrete terms, this means that the RTOS has to ensure that every task going through the processor will meet its time constraint, no matter what other tasks try to interrupt it. Looking a bit more specifically at the Vision Pro, the headset can keep the visual pass-through of your actual surroundings working even if the rest of visionOS crashes. It appears that the R1 chip is running an RTOS that handles data related to depth, eye tracking, and head tracking, while the rest of visionOS’s non-real-time software is handled elsewhere, namely by the more general-purpose

Why The Apple Vision Pro Feels REAL

M2 Chip that you can find in Apple’s current lineup MacBooks. This makes sense, as real-time operating systems are usually quite small, think of a few megabytes of code instead of the many gigabyte behemoths that run our PCS. It’S a safe bet that the code running on the R1 is fairly lightweight and focused only on keeping the crucial low latency aspects of the headset working properly, allowing it to be more reliable.
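Those timing guarantees can actually be checked ahead of time with schedulability tests. One classic example is the Liu–Layland bound for rate-monotonic scheduling: a set of periodic tasks is guaranteed to meet every deadline if total CPU utilization stays below n(2^(1/n) − 1). Here’s a minimal sketch in Python; the compute times and periods are invented for illustration, not Vision Pro figures:

```python
def rm_schedulable(tasks):
    """Liu-Layland sufficient test for rate-monotonic scheduling.

    tasks: list of (worst_case_compute_ms, period_ms) pairs.
    Returns True if the set is guaranteed to meet every deadline.
    """
    n = len(tasks)
    utilization = sum(c / p for c, p in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# Hypothetical periodic tasks (made-up numbers, not Vision Pro internals):
sensor_tasks = [
    (2.0, 11.0),  # a pass-through frame roughly every 11 ms (~90 Hz)
    (1.0, 10.0),  # eye tracking
    (1.5, 20.0),  # head tracking
]
print(rm_schedulable(sensor_tasks))                 # comfortably under the bound
print(rm_schedulable([(9.0, 10.0), (5.0, 10.0)]))  # overloaded set, no guarantee
```

Passing the test means that no matter how the tasks interleave at runtime, every one of them finishes before its next period starts, which is exactly the kind of determinism described above.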

That reliability should hold even if something goes wrong with the rest of the headset’s software. Apple hasn’t given us tons of details on the R1, but we do know that it has 256 GB per second of memory bandwidth, the same as AMD’s RX 6600 XT GPU. Although this might seem like a huge amount for a hyper-specialized chip running a small RTOS, the chip does have to process data from 13 different cameras, seven different microphones, and six different sensors all at once, and all that data flowing into the chip adds up.
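For a rough sense of how that data adds up, here’s some back-of-the-envelope arithmetic in Python. The resolution, frame rate, and bytes-per-pixel below are placeholder guesses, not published specs:

```python
def stream_rate_gbps(width, height, bytes_per_pixel, fps):
    """Raw data rate of one uncompressed video stream, in GB/s."""
    return width * height * bytes_per_pixel * fps / 1e9

# Placeholder guesses, NOT published Vision Pro specs:
NUM_CAMERAS = 13          # the camera count mentioned above
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 2
FPS = 90

camera_rate = NUM_CAMERAS * stream_rate_gbps(WIDTH, HEIGHT, BYTES_PER_PIXEL, FPS)
print(f"raw camera input: ~{camera_rate:.1f} GB/s")
```

Even with generous guesses, the raw input streams are only a small slice of the chip’s memory-bandwidth budget; presumably the headroom is there because each frame gets read and rewritten many times as it moves through the processing pipeline.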

So even though RTOSes tend to be smaller in terms of absolute size, they can still handle huge amounts of data, and that’s a godsend if you’ve just spent 3,500 bucks on a Vision Pro, because you don’t want to be losing your lunch just because of excessive latency. Thanks for watching, guys! If you liked this video, hit like, hit subscribe, and hit up one of our previous videos. If you’re into the nitty-gritty like this, maybe you’d like our video explaining what a kernel is. Not talking popcorn, though; that is delicious.