Hi, this is Wayne again with a topic: “Why the iPhone’s Features Are Always ‘Late’”.
Hey, what’s up, MKBHD here, and yesterday was Apple’s WWDC, their Worldwide Developers Conference, the big software event every year where they go over a bunch of new features for all their OSes. We got new iOS 15 stuff, we got new iPadOS 15 stuff, and we got macOS Monterey, plus a bunch of other stuff. Now, I do plan on making a separate video going over some of it in more detail. Especially, I want to talk about iPadOS 15, because I think they made some interesting changes, some of which did what I wanted and some of which didn’t, so make sure you subscribe to see that when it does come out. But I couldn’t help but notice yesterday, watching all these new features get unveiled for the iPhone in iOS 15,
that a lot of them were familiar, because we’ve seen them before on Android. Now, this isn’t new at all. It happens at basically every single developer event: at every WWDC we see features in the iPhone that we’ve already seen on Android, and at every Google I/O we also see stuff that we’ve seen in previous iPhones. It’s fine. I actually just made a video, my last one, about why companies should copy the good stuff from each other. But there is a trend that Apple’s features mainly come later, and also, arguably, typically better. You probably all remember how widgets, literally widgets, and an app drawer and picture-in-picture were all famously late to iOS 14 last year, after being on Android for a while.
Well, there was a bunch of new iOS 15 features announced, many of which are really cool, and many of which have also been on Android for a while, just slightly different. I think iOS’s new Live Text feature is the perfect example of this. Okay, so Live Text takes any image in your gallery or in the camera, recognizes any text inside of it, and then just lets you long press to copy and paste that real-life text anywhere else you want.
It feels kind of like magic that you can just lift handwritten text or printed text, labels, etc., and put them anywhere else in any other app. If you took a picture of a business with a sign out front, you can long press the text on that sign and look them up or call their phone number, and it’ll even do some more complex image recognition, where if you take a picture of a pet or something, it’ll recognize that it’s a dog, and you can long press the dog and it’ll tell you what breed of dog it is. So that’s pretty cool. Now, I don’t know if you’ve ever used Google Lens on an Android phone, but it’s basically the exact same thing as all the stuff we’ve just described, and it’s been around for a while. You can point Google Lens at literally anything with text in it, give it a second, and it’ll let you copy and paste that real-life text, again, into whatever you want on your phone, translate it, Google search it, etc. That’s built into the viewfinder of a lot of phones, and on phones like the Pixel that use Google Photos as the default photos app, you can hit the Lens button at any point to find and copy and paste any text, or look up similar images with object recognition. It isn’t always perfect, but it is impressively accurate, and it does way more than just pets, too.
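Apple hasn’t said exactly how Live Text is built under the hood, but it behaves a lot like the on-device text recognition developers have had through Apple’s Vision framework for a couple of years now. Just as a rough, hedged sketch of that kind of pipeline, not Apple’s actual implementation, and with the function name and completion handling made up for illustration:

import Vision
import UIKit

// Recognize text in a photo entirely on-device, roughly the kind of
// machinery a Live-Text-style feature could sit on top of.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected text region.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate   // slower, but better for signs, labels, handwriting

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}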
Now, on most Android phones it isn’t quite as seamless-feeling as Apple’s: if I’m just looking at a photo in the gallery and I see a phone number on a sign, long pressing that phone number and being able to call right away is a slight step faster than having to go through hitting the Google Lens button and copying and pasting. But I feel like you can almost guarantee that will now be lifted into Google’s next version. FaceTime also got a bunch of neat little features, like spatial audio, a blurred-background portrait mode, and different microphone modes, so you can choose Standard, Voice Isolation, or Wide Spectrum depending on background noise. But probably the biggest addition is called SharePlay, which lets you watch things and share things, like your screen, inside of FaceTime. Now, you’ve been able to screen share inside of Google Meet, Zoom, Teams, or whatever else you use forever, but FaceTime now being able to do this is a little more well integrated than the others, because here you’re now listening to Apple Music
with this nice UI and it’s all synced up, or you can watch TV shows or movies with synced playback controls. So if somebody pauses, it pauses for everyone; if somebody fast forwards to show you something, it fast forwards for both of you, so everyone can see the same thing at the same time. It’s super easy to stay in sync. And, of course, developer conference gold: SharePlay will have a new API, so any developer that makes a media app will be able to plug into this and make it work in FaceTime. I think you already have HBO, ESPN, and Disney on board, and I imagine things like Netflix, Spotify, and hopefully YouTube in the future.
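Apple didn’t walk through any code on stage, but that SharePlay API lives in a new framework called GroupActivities. As a rough, hedged sketch of what adopting it might look like for a hypothetical video app, with the activity name and metadata invented for illustration:

import GroupActivities

// A hypothetical "watch together" activity a video app could define for SharePlay.
struct WatchMovieActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"
        meta.type = .watchTogether   // tells FaceTime this is synced video playback
        return meta
    }
}

// When the user taps "Watch Together" during a FaceTime call,
// offer to start the shared activity for everyone on the call.
func startWatchingTogether() async {
    let activity = WatchMovieActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate()   // kicks off the group session for all participants
    case .activationDisabled, .cancelled:
        break                                // not on a call, or the user backed out
    @unknown default:
        break
    }
}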
But the point is, all this stuff is better than the typical screen sharing while you play a video, where you’re getting like 5 to 10 fps depending on your internet connection; it’s just not quite as good as it being built in the way it is here. Oh, also, by the way, small detail: you can now share FaceTime links to anyone, literally anyone who can click a link, which means you can join a FaceTime call from the web on an Android phone or a Windows desktop. I guess now we can officially say we have FaceTime on Android. Never thought I’d say that, but you can say it now. Apple also added on-device voice recognition, so it works offline and much faster, which feels like a huge step up for Siri, which, you guessed it, is catching up to the Google Assistant and the Google voice recognition that’s been on Pixels since 2019, which I’ve been absolutely loving.
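On the developer side, Apple has actually exposed offline speech recognition through the Speech framework for a while, and my assumption is that Siri’s new on-device recognition is built on the same kind of machinery. A minimal, hedged sketch of forcing recognition to stay on-device, with the audio file URL as a placeholder:

import Speech

// Transcribe an audio file without sending anything to Apple's servers.
func transcribeOffline(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        request.requiresOnDeviceRecognition = true   // keep recognition fully offline

        recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}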
They also improved their photo Memories feature, which strongly resembles Google Photos’ Memories feature, and they added a lot of really cool street-level features to Apple Maps, which continues to make big strides toward catching up to and matching Google Maps. There’s way more Street View-level information, there’s an AR view, and a few select cities get these really detailed maps where 3D models of landmarks are dropped in. So yeah, clearly you can see the trend: a lot of these new features that’ll show up on the iPhone have already been in Google’s versions for a while. It seems like every time the iPhone gets new features, they’ve been in some Android phone somewhere before. So why? It’s not like Apple is just choosing to be later all the time; there are good reasons for it.
It turns out that Google is, of course, an incredible software company, and that’s why they’ve been so innovative there for a long time. But Apple’s teams also have a distinct focus on this ecosystem thing that we’ve talked about. I’ve made an entire video about the ecosystem; I’ll link it below the like button if you want to watch it. But it’s a huge focus for Apple, not just to make good new stuff, but to make everything plug into each other seamlessly, because of course it makes the products better, but it also makes it harder to leave the ecosystem.
They think about this constantly. People might like their iPhones but really love their AirPods, so it’s harder to leave the iPhone because of how well it works with the AirPods and how much they love them. Or maybe it’s AirTags, or iMessage, or AirDrop with the Mac. There are tons of these, and it’s not just devices; it’s also the increasing number of services they make too. It’s Apple News,
it’s Apple TV, it’s iMessage, it’s FaceTime. So while Google’s teams can be ridiculously innovative, because the teams are a little more siloed and they get to work sort of without the constraint of having to talk to each other all the time, they will often churn out amazing, incredible new features that just don’t talk to anything else. Meanwhile, Apple’s teams, even if they have the exact same idea at the exact same time as Google’s teams, will often have this constraint of having to work with the rest of the ecosystem and plug into as many different things as possible, which often multiplies the amount of time needed to make the thing work.
But the end result is typically something that’s got some functionality or some plugs that the other versions don’t. So with notifications, for example, I mean, Android notifications have just been better than iOS’s since the beginning of time, just facts. But iOS 15 did get some new notification stuff, and one of the new features I really liked is called Focus modes, and this is sick. As someone who likes customization, it kind of feels like multiple profiles on your iPhone, or at least as close as they’ve ever gotten to it. You can set your phone to a Do Not Disturb mode, a Sleep mode, a Personal mode, or a Work mode, and when your Focus mode is set to Work,
for example, you only get notifications from your work apps and your work contacts, and when it’s set to Personal, it grays out all of your work apps and work contacts and has a separate set of things you can do, and you can even set a new home screen for your non-work Focus mode. Now, there have been some Android phones with a work profile mode, so that’s not super new, but the cherry on top is that when you set that Focus mode on your iPhone, all of your other iCloud devices will also switch to that same Work Focus mode and have the same restrictions. So you set it on your iPhone, it can even be triggered by a time of day or a location or whatever, and then your iPad, your Mac, your watch, everything is now also locked to that same set of restrictions, and your iMessage status will now show as being in a Work Focus mode to everyone who usually connects with you over iMessage. That’s a lot of extra work to make all of that work, but now, hey, everything’s working together as you’d expect it to.
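That iMessage status piece has a developer-facing side too: as far as I can tell, iOS 15 lets communication apps ask whether the person has notifications silenced by a Focus mode through the Intents framework. A small, hedged sketch of what checking that shared status might look like, with the exact API shape being an assumption on my part:

import Intents

// Ask whether the user currently has a Focus mode silencing notifications,
// the same status FaceTime and iMessage surface to contacts.
func checkFocusStatus() {
    INFocusStatusCenter.default.requestAuthorization { status in
        guard status == .authorized else { return }
        let isFocused = INFocusStatusCenter.default.focusStatus.isFocused ?? false
        print("User is in a Focus mode: \(isFocused)")
    }
}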
You saw how they tied iMessage into Apple News, where if someone sends you an article in Messages, it’ll appear in the For You section of the Apple News app. I don’t see Google adding seamless integration between Google Messages and Google News. I mean, maybe it will eventually, but those apps and the teams that make them don’t really seem to work together at all. That Continuity feature you saw, of dragging your cursor from your Mac to your iPad and back seamlessly, and literally dragging and dropping files between them, is one of the coolest, slickest ecosystem flex features I think I’ve ever seen, and I don’t know how many years we would have to wait to see Google doing that with a Chrome OS laptop and an Android tablet, but I wouldn’t hold my breath. That’s just one of the things that Apple does so well, that they take extra time for, but that ends up being really great.
Dieter from The Verge did a really great video recently, which you should watch, on how Apple has sort of been laying the groundwork for stuff like that over the years. But the bottom line is, it’s what we’ve come to expect, so yeah, we wait for it. That’s just a fun theme I figured I would highlight. Also, if you’ve been hanging out in the official MKBHD Discord, you already know this, but we’ve been chatting over all the new stuff that Apple announced at WWDC. If you haven’t already joined over there, I’ll leave a link below, and I’m also gonna host a Discord Stage channel event in our Discord today, this afternoon.
I’ll put a time below, and I’ll have some YouTube friends in there, and we’ll talk about some of these WWDC announcements, but also the themes of which features we saw on Android and in places we’ve seen before, versus what’s new in iOS 15 and on the iPad. It’ll be fun; we should hang out and chat over there. But also, for those of you in the YouTube comments section, let me know this: do you have a preference between super innovative, bleeding-edge new features versus slightly later but more well-polished and plugged-in features? Let me know what you think of that. Okay, either way, that’s been it. Thanks for watching. Catch you guys in the next one. Peace.