Hi, this is Wayne again with the topic “DxOMark Smartphone Ratings: Explained!”.
“We’re proud to report that Pixel received a rating of 89. That’s the highest rating ever for a smartphone.” “HTC U11’s camera: the highest, and the only one with the highest score.” “DxOMark has issued Pixel 2 an unprecedented score of 98. That’s the highest score of any smartphone camera.”

What’s up guys, MKBHD here. Recently, the Google Pixel 2 came out with the highest ever rating of a smartphone camera on DxOMark. Again. Google tweeted it, Google’s CEO tweeted it, a whole bunch of people put it in their headlines.
It’s shown like a badge of honor every time it happens. But why? I mean, we’ve talked about smartphone cameras in the past, but suddenly, in the past two or three years, DxOMark ratings have become the holy grail of measuring smartphone camera quality. So this is DxOMark, explained. DxO Labs measures and rates a bunch of different cameras, sensors and lenses: some DSLRs, some cine cameras, Canon glass, Nikon glass, all that stuff. And they have an entire separate section dedicated to smartphone cameras.
If you click on the mobile section of the site, you’ll see this hierarchy of smartphone cameras, each of them with a big number rating. This is somewhat accurate, but also a bit misleading: using one single number to describe and encompass all the complexities that go into a smartphone camera is a bit too broad.
It’s just a little bit crazy, in my opinion. So DxO breaks it down into photo and video, and then breaks each of those down into a bunch of subcategories. So if you scroll down far enough in a DxOMark smartphone camera review, underneath the rating you’ll see that breakdown of the sub-scores for photo and for video. Pixel 2, for example, got a 99 for photo and a 96 for video. But you’ll notice these numbers clearly aren’t an average of the sub-scores, so they have this weighting system, essentially their own algorithm, for combining all of the sub-scores into the overall score.
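To make the idea concrete, here’s a minimal sketch of a weighted combination versus a plain average. Note that DxO does not publish its actual algorithm; the category names and weights below are entirely made up for illustration:

```python
# Sketch of a weighted overall score vs. a plain average.
# Category names, sub-scores, and weights are hypothetical --
# DxOMark does not publish its actual weighting algorithm.
sub_scores = {
    "exposure_contrast": 92,
    "color": 88,
    "autofocus": 95,
    "texture": 80,
    "noise": 85,
    "artifacts": 78,
    "flash": 70,
    "zoom": 40,
    "bokeh": 45,
}

# Hypothetical weights (sum to 1.0), heavier on the older core
# categories and lighter on newer ones like zoom and bokeh.
weights = {
    "exposure_contrast": 0.20,
    "color": 0.18,
    "autofocus": 0.18,
    "texture": 0.12,
    "noise": 0.12,
    "artifacts": 0.10,
    "flash": 0.04,
    "zoom": 0.03,
    "bokeh": 0.03,
}

plain_average = sum(sub_scores.values()) / len(sub_scores)
weighted = sum(sub_scores[k] * weights[k] for k in sub_scores)

print(f"plain average:    {plain_average:.1f}")  # -> 74.8
print(f"weighted overall: {weighted:.1f}")       # -> 84.3
```

With weights like these, low scores in lightly weighted categories barely dent the overall number, which is exactly why the published score isn’t the average of the sub-scores.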
So it’s that score that gets published and plastered all over the internet, all over your Twitter feeds. But most of these articles, in fact probably all of them, ignored the sub-scores and how well these cameras did in each category. You’d have to actually go to DxOMark’s site and scroll all the way down and read through the review for that. And if you did, you would find that DxO does some pretty legit, really solid testing.
They try to be scientific and objective wherever they possibly can. They have a bunch of these indoor sets, these controlled environments, for testing things like noise performance, sharpness, color reproduction and detail in the exact same situation for every phone, so it’s repeatable every time. For testing bokeh with portrait mode, they have a foreground subject and a background that’s a certain distance away. They measure autofocus speed and shutter lag with moving subjects, the whole deal. And then they also have a bunch of slightly less exact, but still pretty telling, outdoor tests for things like high dynamic range and seeing color casts, etc. According to their site, they have, quote, “more than 50 challenging and realistic indoor and outdoor scenes.” Then they turn these photo samples into ratings for each category using a panel of experts, they say. And essentially what they do is they take the new image from the smartphone camera and match it with the closest one on a scale, with what’s called an image quality ruler. They have an existing scale with a bunch of different levels of noise, for example, and they match the new sample with where it falls on that scale.
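That “image quality ruler” idea, matching a new sample against a fixed ladder of reference levels, can be sketched roughly like this. The reference noise measurements and scores here are invented; this only illustrates the nearest-match principle:

```python
# Rough sketch of an "image quality ruler": a fixed scale of
# reference levels, where a new sample gets the score of whichever
# reference it falls closest to. All numbers here are invented.
# Each entry: (reference noise measurement, score for that level).
noise_ruler = [
    (0.5, 100),  # nearly noise-free reference
    (1.0, 90),
    (2.0, 75),
    (4.0, 55),
    (8.0, 30),   # very noisy reference
]

def score_noise(measured_noise):
    """Return the score of the ruler level closest to the sample."""
    closest = min(noise_ruler,
                  key=lambda level: abs(level[0] - measured_noise))
    return closest[1]

print(score_noise(1.8))  # closest reference is 2.0 -> prints 75
```

That’s the general trick: a qualitative judgment (“how noisy does this look?”) becomes a number by anchoring it to a pre-scored reference scale.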
So that’s how they turn a qualitative measurement into a quantitative value. But these websites reporting on the new phones when they come out, you know, they don’t really want all that. That’s a little too much for a headline. So DxO is smart, and to make it easier to headline, they summarize everything into one big overall score, which, like I mentioned, is not the average of the sub-scores. So we can assume they’re weighted roughly in that order, with exposure and contrast being the most important, and autofocus and color being almost as important. And this decision of the order that they weight things is subjective. The order that they decide to rank these, and the amount of weight that they put into each one, is entirely up to them. And that’s why you shouldn’t base your whole purchase decision on just the overall score; you should look past the overall rating. Compiling everything into one overall score can be misleading because different characteristics matter to different people. Some people take a lot of low-light photos, or take a lot of photos of people, a lot of selfies, or a lot of landscape shots. Here’s a perfect example: the Galaxy Note 8 got an overall score of 94. The Pixel 2 got an overall score of 98.
If you take a lot of portrait photos, which one are you gonna pick? You’re gonna pick the Galaxy Note 8. You’re gonna throw out the overall score and pick the one that actually has a dedicated second camera for portrait photos. If you look at the photo sub-scores, it got a way higher score for zoom and bokeh. But these are pretty new things, and so they’re not really weighted as heavily, so they don’t have nearly as much of an effect on that overall score. And as a result, the phone that you would pick, because it’s better for what you like, got a lower overall score on the DxOMark scale. You see what I’m getting at? Honestly, most of these smartphone cameras are great when you talk about things like contrast and color and dynamic range. To varying degrees, obviously, but the top ones are all pretty good.
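Here’s a toy example of that point: the same sub-scores under two different weightings can flip which phone “wins.” The sub-score numbers and both weight sets below are illustrative, not DxO’s actual figures:

```python
# Toy example: same sub-scores, two different weightings.
# All numbers are illustrative, not DxOMark's actual data.
phones = {
    "Pixel 2": {"exposure": 98, "autofocus": 97, "zoom": 55, "bokeh": 60},
    "Note 8":  {"exposure": 90, "autofocus": 91, "zoom": 88, "bokeh": 92},
}

def overall(scores, weights):
    """Weighted combination of sub-scores into one overall number."""
    return sum(scores[k] * w for k, w in weights.items())

# A weighting that mostly ignores the newer categories (zoom, bokeh):
default_weights  = {"exposure": 0.45, "autofocus": 0.45,
                    "zoom": 0.05, "bokeh": 0.05}
# A portrait shooter's weighting, emphasizing zoom and bokeh:
portrait_weights = {"exposure": 0.15, "autofocus": 0.15,
                    "zoom": 0.35, "bokeh": 0.35}

for name, w in [("default", default_weights),
                ("portrait", portrait_weights)]:
    winner = max(phones, key=lambda p: overall(phones[p], w))
    print(f"{name} weighting winner: {winner}")
```

Under the first weighting the Pixel-style phone comes out on top; under the portrait weighting the Note-style phone does. Same sub-scores, different “overall winner,” which is the whole argument for looking past the headline number.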
The bigger difference comes with the different modes, with things like portrait mode and long exposure and panorama stitching and all that. Those are what make a bigger difference to the user experience, especially when you see a phone like Pixel 2 doing portrait mode mostly within the software versus with a second camera. Now here’s another reason you may want to look past the overall score: DxO Labs is a consulting company as well as a testing company. For a fee, they will work with smartphone manufacturers before their phone comes out to create a better camera. So they’ll provide their testing software, their hardware and their testing methodology so that you can calibrate your sensor and your image processing pipeline to produce better photos, but also kinda to just do better on their tests. It’s pretty easy to look at this and think they’re just working with you to get a higher score.
Now, I’m not saying that smartphone manufacturers are specifically tuning their cameras to do better on DxOMark instead of actually doing what’s better for the consumer, but it’s a blurry line there. It’s kind of similar to what we see with benchmarks. I mean, most of the things that do better on DxOMark are also better for the camera, but again, it’s tuning it to their exact specs, to do better with their methodology. Basically, if you don’t work with them on tuning your camera, you’ll get a lower score, and if you don’t have that DxOMark money, you’re kind of left out in the cold; they might not even test your phone. So you can take all that with a grain of salt. Obviously, I don’t think DxOMark is changing their test results based on who paid. Their reputation is based on objectivity and scientific accuracy, in a way, so they wouldn’t want to risk that. But the simple fact that they partner up with certain manufacturers and work on certain devices, while they don’t with others, is kind of an awkward relationship. Imagine if Geekbench, the benchmarking suite, partnered with certain manufacturers on improving overall device performance, and then also made a big deal about which one scored better in Geekbench. You see how weird that might be.

Anyway, moving on to another quick point: with the Pixel 2 getting a 98, you’re probably wondering, what’s the best score they’ve ever given to any camera? That would be a 108, and that was the sensor score for the RED Helium 8K with the Super 35 sensor, which you’ve been watching footage from for this whole video. Now, if you’re thinking, “Wait a minute, if combining all the sub-score values makes a value higher than the average, and they gave a 108 to a sensor in a different test, how do you get a score over a hundred?” “Sitting on top of it all with its current sensor, the Helium 8K S35, with a DxOMark score of 108.”
Is that supposed to be out of a hundred? Nope. No, it’s not. DxOMark scores are not out of a hundred. The Pixel 2 did not get a 98 out of 100; it’s not a ninety-eight percent. I see way too many tech news outlets reporting DxO ratings as if they’re close to a perfect score, because they think they’re out of a hundred. They’re not. The fact that they’re close to 100 is great, but that’s purely a coincidence, and cameras are gonna keep getting better on this linear scale, and they’re going to reach and pass a hundred. There’s gonna be a first smartphone to hit a hundred; they’re gonna make a huge deal about that, and you’re gonna see it in every headline. And then there’s gonna be the first smartphone to pass 100; they’re gonna make a big deal about that too, and that’s gonna make a lot of headlines again. But there is no maximum or perfect score. The fact that they’re near 100 right now, like I said, is a pure coincidence, and it’s a result of their weighting system that pushes those numbers up to near 100.
And that’s why, in the past year or two, they kind of get reported more often: because they look like a near-perfect score. But again, it’s just a coincidence; they will keep going above 100. That’s just the way it is for now. So the whole thing about every new smartphone camera being the new highest-ever rating on DxOMark, while it’s kind of annoying right now, that’s supposed to happen; it’s just getting reported more. It’s kind of like when Apple says at their keynotes, “This is the fastest iPhone we’ve ever made.” Well, I hope it’s the fastest iPhone you’ve ever made; you shouldn’t be making phones slower than last year’s. But you don’t see anyone like Geekbench rating these chips on a scale and then giving a new crown to the highest-ever rating every time. It just doesn’t work that way. My problem isn’t with DxOMark, it’s really more with the press. They just kind of have a habit of parroting the new highest overall score and not really looking too far into it, or putting much effort into reporting beyond that. It’s nice when they do, but for the most part, they’re just trying to get their clicks. So if you’re on the other side of this and you’re actually reading these reviews and these ratings, you’ve just got to look a little bit more into it. Actually read the review if you’re thinking about making a purchase. So keep doing your thing, DxOMark. Now, if you want to bundle everything into one overall score that’s representative of the phone, that’s cool. Just try to surface a little bit more information more easily. And if you’re in tech press and you write a new article every time there’s a new highest DxOMark score, but you don’t write a new article every time there’s a new highest Geekbench score: think about what you’re doing, stop it, get some help. So, in summary: DxOMark does real testing.
Yes. But an overall number to describe everything that goes into a camera is a little too broad. Yes. And then DxO also does consulting to improve cameras based on their own algorithms and testing, so maybe take those numbers with a grain of salt. And the scores just happen to be around 100 right now, but that’s purely a coincidence; they are not out of 100. So what we’ve learned here is: ratings are a nice, clean number to point to, one being better than the other, but if you actually want to make a purchase decision, you’ve got to look a little bit deeper than the DxOMark rating. Plus, they’re all pretty incredible anyway.
You should watch the blind test, I’ll link it below. Either way, that’s been it. Thank you for watching. Talk to you guys in the next one. Peace.