Why DxOMark Smartphone Camera Scores are Wrong

Hi, this is Wayne again with the topic "Why DxOMark Smartphone Camera Scores are Wrong".
You've probably heard of DxOMark. It's an independent benchmark that scientifically assesses image quality on phones and cameras, and you see companies bragging about their DxOMark scores all the time, but it doesn't give the right results. Let me explain why. Something I've heard a lot of people saying is that DxOMark doesn't give the right scores because shooting photos in lab conditions doesn't translate to the real world, so to address this first: DxOMark is actually really thorough. They take the phones indoors and outdoors, in the day and at night, and they test everything from zoom to bokeh to flash quality. Being detailed in their testing is not the issue here.

So what is the problem? Well, we can break it down into seven main things. Number seven: smartphone features are evolving really quickly, and this test just isn't built for that. Take the Pixel 3: they scored this phone at 101. DxOMark factored in the new wide-angle front camera, better bokeh and refined zoom, but there is more to it than that.

The Pixel 3 has Top Shot, a really clever feature that lets you fix people's faces after having taken a photo, but because it doesn't affect noise or texture or exposure, it doesn't come into their criteria. It isn't counted, even though it does improve the overall camera experience. An even bigger crime is the exclusion of Night Sight, a feature so powerful that it changes low-light photos from completely average to literally the best you can get. To not include a feature like this, but at the same time praise the Galaxy Note 9 for its really good low-light photography, kind of makes the Pixel 3 review seem unrepresentative. Then we've got withholding.

If you look at the current league table, you'll actually notice quite an odd result. Huawei's P20 Pro is the current market leader, and this is particularly strange because the company has released an improved version of it in the Mate 20 Pro. If you have a flick through, you'll then realize that the reason for this is that the Mate 20 Pro is nowhere to be found, and the supposed reason for this is an interesting one: the Mate 20 Pro scores too highly. It seems like Huawei has made the decision to not bother competing with itself. They'll stay at the top of the leaderboard until some company overtakes the P20 Pro, and then their plan is to release the Mate 20 Pro's score and just be like, well, we're still on top. Clever strategies aside, the point here is that if this list can be adjusted by manufacturers who don't want the results to be public, then it's inherently biased.

It means that you're not seeing a representative league table, but in fact only a list of phones whose manufacturers are happy with the results. Okay, then we've got the fact that the evaluation of cameras leads you into a lot of gray areas. It's quite easy to look at two photos, decide which one is more detailed, and give that phone a higher number based on this. But what if you're analyzing something more complex? How do you assign a number to the fact that Huawei's night mode takes better shots but needs you to keep the phone still for four whole seconds, versus OnePlus's one second? How do you factor in that the Mate 20 Pro has a pretty incredible light

painting mode, even though it's a niche feature that not everyone will use? It's not entirely DxOMark's fault, but the very nature of trying to turn something as subjective as this into a single number is problematic. And unlike a lot of reviews that give a number at the end of their analysis just as a summary, this number is the only thing most people are seeing. Say, for example, I was picking my phone based mainly on its video quality.

Looking at DxOMark's scores, I'd be pushed towards a P20 Pro, which scores a hundred and nine, whereas in actuality the iPhone 8, which scores 92, takes better video and audio. Which leads me on to the next point: microphone quality isn't considered, and given that, when it comes to video, audio is like 30 percent of the game, that feels like quite a major exclusion.
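To make that concrete, here is a minimal sketch with entirely hypothetical sub-scores and weights (not DxOMark's real numbers or methodology) of how collapsing sub-scores into one weighted number can point a video-focused buyer at the wrong phone:

```python
# Illustrative sketch only: the sub-scores and weights below are hypothetical,
# not DxOMark's actual figures or formula. The point is that a single weighted
# average can rank one phone above another even when the other phone is better
# at the one thing a particular buyer cares about.

phones = {
    "Phone A": {"photo": 114, "video": 98},
    "Phone B": {"photo": 90, "video": 102},
}

# Hypothetical weighting that favors photo quality in the overall number.
weights = {"photo": 0.7, "video": 0.3}

def overall(subscores: dict) -> float:
    """Collapse sub-scores into one number with a fixed weighting."""
    return sum(weights[k] * v for k, v in subscores.items())

for name, subscores in phones.items():
    print(name, "overall:", round(overall(subscores), 1),
          "| video:", subscores["video"])

# Phone A wins the overall score (109.2 vs 93.6), but if video is what matters
# to you, Phone B's higher video sub-score says it's the better pick.
```

The weighting is the subjective part: pick different weights and the ranking flips, which is exactly why the single headline number tells an individual buyer so little.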

Okay, now is where the big problems start to come in. There is a conflict of interest. DxOMark's main business is consulting for companies: they come in and advise them on how they can improve their cameras. But the problem is that the very nature of this advice will be how to improve cameras in the ways that we think are important.

Aka, how to do well on our tests. Companies can also purchase the DxO Analyzer, a massive and expensive package of hardware, software and training that allows them to essentially simulate some of the tests DxOMark uses to determine scores, which is genuinely the equivalent of revising for a test

when you already have the answers. OEMs that purchase this can effectively build their products around scoring well in these tests, which, as you can imagine, makes it massively unfair for those that don't. And then this causes a ripple effect: because all these companies are fighting each other to score the highest in this benchmark, it could actually hinder the progress of smartphone cameras. Of course, they will have to make the cameras better if they want to score higher in the tests, but the focus of these improvements will be weighted towards the areas that DxOMark cares about and away from, perhaps, new and more exciting features. The most important thing, though, and we've kind of alluded to this already, is the fact that one number is not enough. The actual analysis and the written text on the website is really thorough and really well researched, but by giving companies this one number that they can use to summarize everything, you're effectively giving them a way to brag about their camera.

That gives very little indication to a consumer of how good it might be for them. So what I think would be more useful is for DxOMark to only issue subcategory scores, so a company could tell people our device is one of the best at handling exposure, or they could tell people our device has one of the highest-rated zoom capabilities. This would remove the subjectivity that comes into deciding an overall score to try and encompass everything in one number, and it would end up meaning that the numbers mean more to the end consumer. Like I said, the analysis you'll find on the DxOMark website is some of the most thorough in the world, but it's more the scoring system

that I think is causing a lot of the problems we're seeing. You get weird results like the 2018 LG G7 being only as good as the 2015 Galaxy S6 Edge, or Xiaomi's Mi Mix 2S beating the iPhone 8 Plus, even though its video and audio quality isn't even close. So what do you guys think? Thanks for watching.