Hi, this is Wayne again, with the topic "How Samsung Phones Are 'Faking' Their Photos."
Samsung has been advertising its phones as having up to 100x Space Zoom capabilities for years now. You can pick up a Samsung phone, zoom a ridiculous amount into the moon, take the shot, and still end up with a result that's somehow crisper than what a $5,000 professional camera setup could capture. It's very impressive, but the internet has apparently just proved they're faking it. Let me explain.

The havoc all started when a Reddit user called ibreakphotos downloaded a high-res shot of the moon from the internet. He then downscaled that image to just 170 x 170 pixels, a resolution so tiny that basically all detail would have disappeared, and then applied a blur to it. What he was trying to do here was manufacture something that contains little actual detail, but that the phone might still identify as a legitimate moon shot, to see if it artificially adds in extra detail. So anyway, he then apparently turned off all the lights, used his Samsung phone to take a photo of this blurry 170 x 170 moon shot, and ended up with this. It's such an incredible difference that he came to the conclusion that Samsung is using its supposed AI Scene Optimizer mode to place craters in spots that, in his shot, were really just a blurry mess; that the phone is not just sharpening up the original image, like Samsung suggests the AI does by taking multiple shots and blending their detail together, but is instead recognizing that it's a moon and simply swapping out his actual photo for a texture of what the moon should actually look like.
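For anyone who wants to reproduce that bait image, the preparation takes only a few lines. Here's a minimal sketch in Python with Pillow; the filenames and the blur radius are my own stand-ins, not values from the original post:

```python
from PIL import Image, ImageFilter

# Load a high-resolution shot of the moon (filename is a placeholder).
moon = Image.open("moon_high_res.jpg")

# Downscale to 170 x 170 px, destroying essentially all fine surface detail.
tiny = moon.resize((170, 170), Image.LANCZOS)

# Blur it so that no crater texture survives at all (radius is a guess).
bait = tiny.filter(ImageFilter.GaussianBlur(radius=4))

# Display this full-screen on a monitor in a dark room and photograph it.
bait.save("moon_bait.png")
```

Anything the phone's photo shows beyond that featureless blob, it had to invent.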
This is a big claim, so I thought I needed to make sure it was for real. I decided to try to capture four photos, each one pushing the boat out a little bit further. Number one was a genuine real-life shot of the moon at night, which went about as expected: I ended up with this rather impressive-looking image, but nothing unusual happening yet, for sure. For number two, though, I wanted to see how easy it would be to fool the phone by using a photo of the moon on a screen instead of the real-life moon.
So I pulled up the same photo I'd taken with my Samsung on my MacBook, placed it onto a dark sky scene, and tried to capture it. I noticed the phone is not super easy to fool: it knows you're never going to be sitting right next to the moon, so if you just sit in front of your screen and use your 1x main camera, it's just going to look like a photo on a screen. But after about five minutes of finding just the right distance, such that I could get to 25x zoom and the moon properly fit within the frame, scene detection did kick in. It recognized the moon and turned this shot into this shot.
Okay, a little bit weird. You would expect a 25x zoom photo of a photo to result in some noticeable quality loss, but this weirdly did not, which does imply some fiddling. Just to make sure of that, I decided to stick my cat's face into the same starry-sky background. Now, this was late at night and my decision-making was getting questionable, but yeah: for this photo, there was a noticeable degradation in quality between the original image and the one the phone took of that image. So something is going on specifically with the moon, which is where photo three came in.
This is where we went further and actually tried to recreate what the Redditor had initially found. I took my original photo, downscaled it from a 2,000 x 2,000 image to a 200 x 200 image to remove almost all detail, added a blur such that there was now no visible moon texture at all, and repeated the steps from the last photo. And this was properly surprising: my Samsung not only churned out a shot with zero detail loss from the source image, but one that actually had more.
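If you'd rather not just eyeball "more detail," you can put a rough number on it. Here's a minimal sketch that scores high-frequency texture in both images, using Pillow's FIND_EDGES filter as a crude edge detector; the filenames are placeholders, and this is only a coarse proxy for detail, not a rigorous measure:

```python
import numpy as np
from PIL import Image, ImageFilter

def detail_score(path: str) -> float:
    """Crude detail proxy: variance of an edge-filtered grayscale image.
    Higher values mean more high-frequency texture."""
    gray = Image.open(path).convert("L")
    edges = gray.filter(ImageFilter.FIND_EDGES)
    return float(np.var(np.asarray(edges, dtype=np.float32)))

# Placeholder filenames for the blurred source and the phone's result.
print("source:", detail_score("moon_bait.png"))
print("phone :", detail_score("samsung_moon_output.jpg"))
```

If the phone's output scores higher than the blurred source, detail has been added that the sensor never received.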
Now, this might not seem surprising. We're quite used to phones dramatically neatening up images after we've taken them, and we know what the real shot looked like, so we already kind of have that image in our minds. But think about it for a second: if all the phone has seen in this instance is this white blob, how on Earth would it know that this dark patch over here is a crater? How would it know that this blurry bit here is moon texture? It is inventing detail out of nowhere, and the only way this makes sense is if the phone is not using the normal image processing it would use for every other shot, but is instead recognizing the moon shape and going, "Oh, I actually know what a moon looks like already; I can use a sneaky trick here." Just to try and 100% confirm that this was happening, I even printed out a blurry version of the moon, hung it from a light by a string, and sure enough,
the phone detected the paper as a moon and added detail that didn't even exist onto a paper cutout. I tried the same thing using my iPhone 14 Pro, and no extra detail was added; and again on my Google Pixel 7 Pro, the photo was just a blur, as you'd expect it to be. So you could say that some level of faking is occurring, but I wanted to see if I could get the phone to completely cheat: to get it to identify a completely random, unrelated object as the moon, and then accidentally slap the entire moon texture on top of it. I tried lighting up a ping-pong ball in a black room; didn't work. I tried suspending a brightly lit piece of blank paper; that didn't work either, but the phone did remove the wire it was hanging from, which was odd, almost like it was trying to turn it into a celestial object.
But the point is, I couldn't completely fool it. Whatever Samsung is doing to the moon images, it's at least pretty good at differentiating photos of the actual moon from photos of brightly lit white objects. But then I had a really interesting idea: I realized how I could test for absolute certain whether Samsung was just replacing your images of the moon or not. (By the way, a sub to the channel would be monumental.) I realized that if I took the blurred-out moon shot that Samsung does recognize, scrubbed out a little piece of detail here, and replaced it with a detail of my own, I'd be able to conclusively see whether Samsung would just ignore my tweaks and overwrite my image with the same normal, correct stock moon image as before.
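Setting up that tweak is again trivial in code. A minimal sketch of the idea, pasting a small patch of my own detail over part of the blurred moon; the coordinates, sizes, and filenames here are all hypothetical:

```python
from PIL import Image

# Placeholders: the blurred moon bait and a small custom patch of detail.
moon = Image.open("moon_bait.png").convert("RGB")
patch = Image.open("custom_patch.png").convert("RGB").resize((30, 30))

# Overwrite one region of the moon disc with the patch, then photograph
# the result off the screen as before. If the phone pastes in a stock
# texture, the patch should vanish; if it enhances the input, it stays.
moon.paste(patch, (70, 60))
moon.save("moon_bait_tweaked.png")
```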
What I actually saw, though, was fascinating. Because the tweaks I'd made were minor, the phone still recognized the object as the moon and added in detail, but instead of completely replacing the image, it did indeed keep the tweaks I'd made, proving that it was in fact using the photo I took as the base. But what really surprised me is that it also identified the image I'd added as being one of the craters on the moon, going as far as to fuse the two images and apply the crater texture to that image. So what's happening here is that Samsung has trained its AI to understand the real texture and details of the moon, using hundreds of thousands of real images to teach it the patterns to look out for. Because of that, the AI can recognize when a moon is present in future shots, and so long as there are enough identifiers for the phone to be able to confirm it, it will use what it's learned about what the moon should look like to analyze
each moon photo from scratch, filling in the bits it recognizes and then improvising how to enhance the rest. It's definitely beyond the level of tweaking that their phones would be able to do for normal, random objects they've never seen before. But it is also better than what a lot of people are accusing them of, which is using some sort of pre-baked image to slap on top of what you take: if an image like this existed, you would be able to find it within the camera app's files, but no one can.

So why exactly is this an issue? Why are people so worked up about a moon photo? Well, there are two separate things I've seen people concerned about here. One is, I guess, the moral question: if a company can do this, especially without being open about it, does it change the sanctity of what it means for something to be a photo, as opposed to a generated image? We're in an age where AI can make anything you ask it to, so do you want your photos, your cherished memories, to just become one of those too? Or do you want them to stay real? On this, I would say: well, I like the idea of pure images, but there is just so much useful stuff that smartphone cameras already do that just would not be possible if we said
no software is allowed to interfere. And the truth is, if that's the stance we took, then phone cameras would be stuck in the last decade in terms of quality. For a while now, if you, for example, zoomed into a signpost sitting far away, your phone has already been recognizing that text is present and specifically using what it knows about text to enhance the legibility. Is that a bad thing, because the text has been enhanced and therefore isn't real? I'd personally rather have sharper text, and richer, more detailed photos in general,
than "here's exactly the scene you shot; it looks bad, but hey, it's genuine." And I don't think I'm in the minority. The second concern people have with this situation is the more sinister accusation: that it means Samsung is deceiving its customers. And it's a valid point. The moon photos have been used as part of Samsung's marketing on multiple occasions, saying
our phones have such good cameras that they can zoom in 100 times on an interplanetary object and still capture a crisp output. And because it's talked about in the marketing, that also trickles through to comparisons and organic content, where people are naturally going to test the phone side by side with other phones in these situations and come to the conclusion that Samsung's cameras have just destroyed everyone else's. When actually, you're using an AI that's specifically trained to make this one use case look incredible; if you try shooting literally anything else, it will just be normal. Is that your camera doing it? I would say no. At the point where you can turn a shot like this into a shot like this, it's got nothing to do with optics. This is a software trick, and really, you could just substitute in your potato laptop webcam: so long as you can get a very blurry base image,
you can still end up with a perfect moon at the end that would beat 90% of other phones on the market. Would I call this a great camera? Um, well, no, not really.
The problem people have is not that Samsung is using AI for its mooning capabilities... learning capabilities, cut that, let's not get demonetized. It's that Samsung is using unique processing algorithms to make just the moon look better than photos in any other situation can, and then selling the feature to people as a camera feature, as opposed to the software feature it actually is. And it's not even a particularly advanced software feature, either. Remember, our view of the moon is locked from Earth, meaning that we basically just see one side of it at all times.
So all the AI had to learn is basically just a flat JPEG image. With all that said, does it actually matter? This whole debacle has become a huge conversation topic; I've not seen a tech news site that isn't talking about it, and obviously anyone naturally inclined to dislike Samsung is seeing this as decisive proof that they're a bunch of crooks. I would say this: as far as Samsung is concerned, yeah, they should have done a better job
communicating what they're doing here with this pretty hidden moon mode. This company has pretty much monopolized the market for moon photos thanks to the way they've marketed it, but because of the way the feature works, they really should have avoided trying to link the output of these moon photos to the camera quality of their phones. At the same time, as far as the end result is concerned, I don't think this is some sort of new scandal. If anything, it's probably a good thing. It's hard to stress just how many different ways smartphones already use software to enhance images: they'll make grass greener, they'll make skies richer, they'll tweak the core features of your face, which, if unsolicited photo tampering is the issue, is a lot more invasive than just adjusting the moon. And then you could also argue that the name of the feature, Scene Optimizer, does kind of indicate that it's about to use some sort of clever AI to enhance your shot; if you still decide you don't want it, you can just turn it off and get a more genuine photo. For the most part, would you rather have a shot like this, or would you rather have a shot like this? Plus, the way I'm looking at it is that if I'm out at night and my eyes can see the moon, then any AI that allows the phone to also see it is, I think, going to capture a memory that's closer to the way I remember it. So even though the process of creating that image might be less authentic, the end photo is arguably more authentic, right?
Sorry for the moon puns. To check out the most elaborate proposal you've ever seen in your entire life, that video is here. I'll catch you there.