What the iPhone 15’s Spatial Video Feature Means for the Vision Pro

Hi, this is Wayne again, with today's topic: "What the iPhone 15's Spatial Video Feature Means for the Vision Pro."
Apple's biggest new product in years, the Vision Pro, a spatial computing device for your head, isn't coming out until early 2024. But at Apple's iPhone and watch event, I was curious whether there would be any hints or new bits of information about that product. It turns out there were: one small one, and one that seemed kind of like a hint. So here's what happened.

First of all, the iPhone 15 Pro is going to have something that's related to the Vision Pro. Apple announced that the cameras on the 15 Pro will be able to record spatial videos, which will be viewable on the Vision Pro next year. So what is a spatial video? Basically, it's a 3D video clip, and this type of technology has existed before in various forms. Google experimented with VR180 and even made separate cameras for it; I remember reviewing one. You could post those clips to YouTube and also view them in VR. I'm not sure how Apple's version is going to work, but I did see a clip of a spatial video during my Vision Pro demo back at WWDC, and they look really nice: very immersive and 3D, kind of like ghostly snippets of memories.

Apple's original pitch was that you'd wear the Vision Pro on your head to record these special moments in your family's life and then play them back later on the headset. I don't want to wear a Vision Pro on my head at my kid's birthday party, but the idea of using a phone to do it? That's a lot more normal. That's what you would do.

Anyhow, what we don't know is how that's actually going to work. Is this going to be a separate video format that you toggle on the iPhone 15 Pro, recording spatial video as opposed to regular video? If that's the case, then I'm not sure which one I would toggle; I would usually default to the one I could share with everyone else. It's going to be frustrating to think about recording a video clip that isn't compatible with the Vision Pro, or one that's only viewable on the Vision Pro and that I then can't share with my family.

So I'm really curious how Apple works that out. They already have Live Photos, and they already have a lot of photos with depth information in them that are compatible with plenty of other apps. So I guess we'll see. I think it's going to be a way for them to test out how that relationship works, and then once the Vision Pro launches, at some point that technology is going to trickle down to the other phones too. Hopefully it'll feel nice and integrated, and like an optional thing. So that was the big one. A small one that I noticed involved the Apple Watch. Now, the Apple Watch is not Vision Pro compatible.

However, there's something that really made me pay attention: Double Tap. This feature, which lets you tap your fingers together to open things or perform little actions on your watch, actually already existed as an accessibility feature; you can turn it on right now on your Apple Watch. On the new Apple Watch, though, it's supposed to work better and more reliably, without draining battery life, and it's being launched as an everyday feature. What's interesting is that the tap looks a lot like the type of taps I was doing with the gestural interface on the Vision Pro.

At some point, the watch is going to work with the Vision Pro. I'm just going to say that, because all the plans for AR and VR that I've seen talk about watches being an essential part. Meta is looking at neural-input wristbands and watches as a way of connecting with future AR glasses. Once these things get smaller and more mobile, it makes sense that you'd have something like this on your wrist, especially with haptics to give you a sense of feedback. So right now you're just looking at taps on an Apple Watch, but what's going to happen next? We're supposed to get some sort of new Apple Watch next year, an "Apple Watch X" according to reports, which could be a whole new redesign. Is that the moment they introduce some way for it to interact with the Vision Pro? It seems likely, but I'm not going to skip too far ahead. I do think Apple is going to try to lay out a little more gestural-interface familiarity across its products. How are you going to get used to using the Vision Pro unless you start having moments that feel like, "Oh, I've done that before"? It may start with the Apple Watch, and it may start bleeding over into some other places. Watch that space, so to speak, because I feel like there are going to be some interesting things happening.
So we still don't know the specific release date for the Vision Pro, and I didn't get a chance to demo it again, but there were definitely hints and talk about it, starting with spatial memories on the iPhone 15 Pro. Anyway, those are some of my thoughts from the Apple event in Cupertino. If you have any questions or comments, or things you're curious about with mixed reality and the Vision Pro, let me know in the comments, and make sure to like and subscribe. Thanks.