Portrait Mode: Explained!

Hi, this is Wayne again with the topic “Portrait Mode: Explained!”.
This is a picture that was taken with a very high-end camera, so it will help illustrate the effect we’re trying to achieve. Let me bring up the picture: in this beautiful portrait, the gentleman in the front is pin sharp in focus and the background has a beautiful blur. This is called shallow depth of field. It’s the signature of a great camera, often one with a very big sensor, like a full-frame sensor, or a really big, fast lens.

The quality of that background blur is what’s called bokeh, and usually the higher the quality of the bokeh, the more advanced and higher quality the lens and camera system. So our goal is to try to do something like this using the two cameras in the iPhone 7 Plus. This is an incredible breakthrough. Now I’m going to show you the first picture we’ve ever shown the world of a depth-of-field photo taken from an iPhone 7 Plus with this new feature. The picture I showed you before was taken on an iPhone 7 Plus in portrait mode. That’s probably a phrase you’ve heard a lot more lately: portrait mode, Live Focus, that depth effect over here, shallow depth of field over there. The first time we really saw portrait mode start to actually make an impact on smartphones was on an iPhone; of course, it was the iPhone 7 Plus.

This thing came out with dual cameras, one normal and one telephoto, and a beta portrait mode in iOS 10. It was garbage when it first came out, it really was trash, but Apple stuck with it, worked on it, and a couple of software updates later it’s gotten better and better.

So now here we are in 2017, and pretty much every other phone coming out has these dual cameras; it feels more like a rule than an exception. We actually kind of wonder, when a new phone comes out and it doesn’t have dual cameras, what the deal is. So here’s the question: how good has portrait mode actually gotten through these software updates, and how well does it actually compare to a legit professional camera like this one here, the Hasselblad X1D?

This is the type of big-sensor camera that, like Apple said, they’re emulating. It has a medium format sensor, and it’s amazing. Now, medium format refers to the sensor size. You may have heard of full-frame cameras with their huge 36 by 24 millimeter sensors; well, medium format sensors are even larger than full frame, at 43.8 by 32.9 millimeters. This sensor is incredible.
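For a sense of scale, here’s a quick back-of-the-envelope comparison of those two sensor areas, using just the dimensions quoted above:

```python
# Sensor dimensions in millimeters, as quoted above.
full_frame_area = 36.0 * 24.0        # 864 mm^2
medium_format_area = 43.8 * 32.9     # ~1441 mm^2

print(f"Medium format has roughly {medium_format_area / full_frame_area:.1f}x "
      f"the area of full frame")     # ~1.7x the light-gathering surface
```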

It also recently got the highest-ever score on DxOMark, a 102, if you’re into that. But bottom line: I’ve had this camera for a couple of months now and I love it. I’ve taken the best photos of my life with it; you can check the Flickr, I’ll link it below. So that’s what we’re comparing against. Here I have the iPhone X, Galaxy Note 8, and Google Pixel 2, all of which have really, really high-end cameras, and of course they all have a portrait mode to try to emulate what this guy does. So let’s see how they stack up.

So here’s what I would consider your standard portrait mode example: the same photo, same angle, from the iPhone X, Galaxy Note 8, Pixel 2, and the Hasselblad. You can see some differences right off the bat when I put them all side by side, especially, of course, how well the Hasselblad does it. And here’s just another one so you get a better idea: iPhone X, Galaxy Note 8, Pixel 2, and the Hasselblad. What you’ll start to notice is that portrait mode on these smartphone cameras does a pretty good job of replicating the general effect of a shallow depth of field, but of course none of them is perfect, and there are a couple of main differences in how they handle it. Number one is what I’ll call edge detection. This camera doesn’t need edge-detection software because it simply has a plane of focus: if your subject is out of that plane, too close or too far away, it’ll be blurred, and everything in the plane will be sharp, simple as that. But a smartphone sensor is so small and the lens is so wide-angle that for most normal photos almost everything is going to be in focus. So to create that nice blur on the background, you either need to get really close to the subject, or the phone needs some kind of edge detection or depth sensing to determine what’s in the foreground and keep it sharp, and what’s in the background and blur it out. Now, different phones do this in different ways.
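Before getting into how each phone senses depth, here’s a minimal sketch of the compositing step itself: keep the subject sharp, blur everything else. It assumes a pre-made subject mask is already available from some segmentation or depth-sensing step; the file names and blur strength are just placeholders for illustration.

```python
import cv2
import numpy as np

# Load the photo and a subject mask (white = subject, black = background).
# On a real phone the mask comes from depth sensing or segmentation;
# here it's just a hypothetical pre-made file.
image = cv2.imread("portrait.jpg").astype(np.float32)
mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Blur the whole frame, then composite: take the subject from the sharp
# original and the background from the blurred copy.
blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=15)
mask3 = cv2.merge([mask, mask, mask])            # match the 3 color channels
fake_bokeh = mask3 * image + (1.0 - mask3) * blurred

cv2.imwrite("portrait_mode.jpg", fake_bokeh.astype(np.uint8))
```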

The Galaxy Note 8 and iPhone X both have dual cameras, and they use the difference in perspective between those two cameras to sense distance, the same way we use our two eyes to see in 3D. The Google Pixel 2 is actually one that does this all with split-pixel technology, a tiny difference in viewpoint between the split pixels on the sensor, which is really impressive and is why I called this thing so smart. So while none of these does a perfect replication, as you’ve seen, they all handle it with varying degrees of success. The more you shoot portrait mode on the iPhone X, for example, the more you realize Apple basically just takes the face, makes sure that’s in focus, and then blurs pretty much everything else, not just the background, even things that are supposed to be in the same plane of focus: your ears, hair, shirt, the rest of your body.
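As a loose illustration of that depth-sensing idea, here’s a toy depth-from-disparity calculation. The baseline and focal-length numbers are made up for the example, not the actual iPhone, Note 8, or Pixel values.

```python
# Toy stereo depth: two cameras (or the two halves of a split pixel) see the
# same point shifted sideways by a small amount called the disparity.
# The closer the point, the bigger the shift:
#
#   depth = focal_length * baseline / disparity
focal_length_px = 2800.0   # focal length in pixels (hypothetical)
baseline_mm = 10.0         # distance between the two viewpoints (hypothetical)

for disparity_px in (40.0, 20.0, 5.0):
    depth_mm = focal_length_px * baseline_mm / disparity_px
    print(f"disparity {disparity_px:>5.1f} px  ->  depth ~{depth_mm / 1000:.2f} m")

# Large disparity -> close to the camera -> keep sharp.
# Small disparity -> far away -> candidate for blurring.
```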

On the iPhone, everything outside the face ends up blurred, and it just seems kind of sloppy sometimes. The Galaxy Note 8 actually does a pretty good job with edge detection, but again it’s obviously not perfect. And then you’ll see the Pixel 2 is maybe the most artificial-looking, but also possibly the most pleasing-looking result.

It cuts the edges the most sharply and then goes really far to separate them, sharpening the subject a bit and blurring the background the most. The number two thing, and you might not think about this as much, is blur variation, the differences in the amount of blur. It’s not just as simple as taking the entire background and blurring it all out; that would be simple, but with photos from a high-end camera it’s a bit more complex. The amount of blur actually depends not just on the focal length, but on how far into the background things are, so things that are further from the plane of focus are more blurred and things that are closer are less blurred. Smartphone cameras, believe it or not, are actually getting better at doing this pretty well too (there’s a rough sketch of the idea below). You can see in this shot there’s an actual gradient in the amount of blur you’re supposed to have, and thanks to the depth-sensing abilities it kind of pulls it off. It’s probably handled best by the Pixel 2 here; they all still do it, but it’s the most dramatic on the Pixel 2. It’s not quite what the Hasselblad is doing, but it’s decently close.

And then there are other small things, like distance and brightness and exposure pickiness: portrait mode doesn’t always work on a smartphone. Obviously, put the 90 millimeter lens on the Hasselblad and the background will be blurred every single time, but with portrait mode on a smartphone it’s not exactly universal. Earlier I said the best way to guarantee blurring the background with a smartphone camera is to get the subject really close to it; portrait mode basically just extends that range of background blur to photos a couple of feet away, six to eight feet away. But again, once you get too far away, portrait mode will stop working as well, and the iPhone will tell you about it.
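Circling back to that blur-variation idea, here’s a rough sketch of depth-dependent blur, assuming a per-pixel depth map is already available. Quantizing into a handful of blur levels is just a simplification for the sketch, not how any of these phones actually implement it.

```python
import cv2
import numpy as np

def graded_blur(image, depth, focus_depth, max_sigma=20.0, levels=4):
    """Blur each pixel more the farther it sits from the plane of focus.

    image:       HxWx3 float32 photo
    depth:       HxW float32 depth map, same units as focus_depth
    focus_depth: depth of the subject, i.e. the plane kept sharp
    """
    # Distance from the focal plane, normalized to the 0..1 range.
    distance = np.abs(depth - focus_depth)
    distance = distance / (distance.max() + 1e-6)

    # Precompute a few increasingly blurred copies and pick per pixel from
    # the right one -- a cheap stand-in for true optical bokeh.
    result = image.copy()
    for i in range(1, levels + 1):
        sigma = max_sigma * i / levels
        blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=sigma)
        band = (distance > (i - 1) / levels) & (distance <= i / levels)
        result[band] = blurred[band]
    return result
```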

The iPhone is very vocal about placing your subject within eight feet and telling you when it isn’t going to work; if you’re too far away, it just takes a normal photo with no portrait mode. Same with the Note 8, but it mostly just says it’s due to shooting conditions, so it’s not as specific and you kind of just have to know how to fix it. The Pixel 2 has no warning at all. It doesn’t even give you a preview of the background blur, so you don’t know whether it worked until after you take the shot, let it process, and then look at it. Sometimes it fails and you don’t find out until later. In a shot like this, when you’re too far away, they tend to have a harder time telling the difference between the foreground and the background.

A large-sensor camera still makes a solid amount of bokeh there, but usually your smartphone will just go back to taking a normal photo. The other weird quirk with these is non-human subjects, things that aren’t your typical portrait, or stuff with weird outlines. This artificial intelligence has been training on normal human portraits for so long that it does best with those. It has done pretty well with pets too, interestingly enough, I’ve noticed, but then it breaks down when you get to stuff like this headphone picture. This is what I’m talking about.

It becomes pretty obvious: they get a lot of the outside of the blur down, but then there are holes in the middle and awkward lines and places where, again, edge detection can be weird. But here’s the thing: the biggest difference between these smartphone cameras and the Hasselblad is that smartphone cameras are improving way faster. Smartphones are slowly closing the gap on the whole background blur thing, but they’re also adding unique features that big cameras can’t touch, like artificially changing lighting effects as soon as you take the photo, or tweaking the amount of background blur after the photo was taken. We aren’t getting big new features like this nearly as quickly in DSLRs or mirrorless cameras.

But you can just push a software update to a phone and get new stuff like this all the time, so mobile photography is an awesome place to be right now, because you can do amazing things with your smartphone camera that you couldn’t do a year or two ago. Now, as someone who takes a lot of photos with a lot of different types of cameras, from smartphones to mirrorless cameras to the Hasselblad and everything in between, I’ve always been able to tell the difference between a smartphone photo and a dedicated camera. Always. But this year, with these new cameras, is the closest it’s ever been; it’s the most blurred that line has ever been.

I think you and I will always be able to tell the difference, if you pixel peep enough, between a smartphone camera and a big sensor, just because of the physical constraints of trying to make such a small sensor do such big things, and for that reason the big cameras will always have their place. Professionals will always buy them. But these cameras are getting so good, and they still have the advantage of being small, portable, and always with you, so that getting a really high-quality selfie or a really high-quality photo wherever you are, without having to worry about your big camera, is the biggest advantage of a smartphone. So: DSLRs and dedicated cameras for professionalism and quality, smartphone cameras for portability and ease of use. Bottom line, I think that’s what they’re each good at, and the best camera really is still just the one you have with you. I don’t think that’ll change for a while, but this is what gets me excited about the next year or two of smartphones and smartphone cameras and what they get really good at. So that’s what we have to look forward to, and thanks to these companies’ dedication to this stuff, it’ll only get better from here. Either way, that’s been it! Thank you for watching, talk to you guys in the next one. Peace.