Hi, this is Wayne again, with the topic "1000 photos (?) on the Pixel 8 Pro: AI, UI… Oh?".
Google is a software company that sometimes makes hardware, and nowhere is that more apparent than in the Pixel 8 Pro's camera system, where AI, editing, and video are at the forefront. I'm Becca, welcome back to Full Frame. This is the $999 Pixel 8 Pro. It has a Tensor G3 chipset, a flat 120Hz 6.7-inch screen with a peak brightness of 2,400 nits, and, of course, a temperature sensor. But its biggest updates come to the camera system. The most noticeable change is a redesigned UI.
All of the controls have been brought down to the bottom of the screen for easier access and can be adjusted entirely with the right thumb. Even better, though, there is now an always-visible, dedicated photo/video toggle button. Unlike on previous Pixel devices, where every shooting mode was on a single scrollable line, video and photo modes have now been put into their own lines, accessible via this switch. This has naturally caused me to switch between photo and video and back far more often, and that goes hand in hand with Google's increased focus on video quality.
The Pixel 8 Pro's 50-megapixel f/1.68 main sensor is capable of 4K video at 24, 30, and 60 frames per second, and Google claims that it has two times better optical quality and that the lens brings in 21 percent more light. Well, video from this device has pleasing focus falloff and a fair amount of saturation, but switching between lenses is met with jarring frame and color changes. (What is that? Is that the Google Pixel 8? Yeah. Really?) And when compared to the iPhone 15 Pro Max's standard 4K 24-frames-per-second video, the Pixel is just less consistent with its color choices. In low light, it adds a magenta hue, and when light quickly shifts, like it does here, the Pixel is more obviously adjusting to balance everything out, where the iPhone's change is more subtle. Not to mention the Pixel is not able to output 4K ProRes Log. However, I do appreciate the amount of contrast the Pixel holds on to in its current state.
I am not impressed with this camera's low-light video performance; it's a grainy mess. Although video on the Pixel will be getting a boost when Night Sight for video, software that will offload processing to the cloud to digitally enhance video in low light, launches later this year. We haven't had a chance to put this to the test, so stay tuned for more on that this winter. There's also a new Audio Magic Eraser that can reduce background noise in video. Y'all, I am so obsessed with this building. It's just, like, mean, and it's not made out of glass, and it has all of these really cool, small details to it that just add up to this big, imposing, beautiful building. I mean, that is... that is a good-looking building. I love it. I love it, man. I hate that I love a skyscraper. I don't know why I hate that. I'm not going to unpack that right now. This feature is fine.
Google has also finally kicked the overly cool tones, and images don't lean too far into the Samsung oversaturate-everything zone either. When taking photos, the system is also not as afraid of contrast as it once was, allowing shadows on faces to stay shadows, sometimes even more than the iPhone will. Although, much like in video, the iPhone consistently gets the colors correct, unlike the Pixel. The lower aperture also allows for more natural focus falloff, especially when a subject is closer to the lens.
And lastly, I already talked about this, but I can't stress it enough: the video-to-photo toggle is huge, especially when video is finally looking equally as good as, if not better than, still images. The experience of using this camera is far more fun with this simple change. The 48-megapixel f/1.95 ultra-wide camera is seeing a whole centimeter of improvements this year. (Man, look how long my shadow is! Mine's long? No, it's not! No, it's not!) Because of a larger sensor, wider aperture, and a closer focusing distance of 2cm, as opposed to 3cm on the Pixel 7 Pro and Pixel 8, the largest improvements come to the macro mode in the ultra-wide. I get this question a lot: why is the ultra-wide lens used for macro photography? It's because the lens is wider and the sensor is smaller compared to the main lens, so you can have a much smaller minimum focus distance.
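To make that concrete, here's a rough thin-lens sketch. The focal lengths below are illustrative stand-ins I picked for the arithmetic, not the Pixel's actual optics:

```latex
% Thin-lens equation: focusing on an object at distance d_o requires
% an image (lens-to-sensor) distance d_i.
\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}
\qquad\Rightarrow\qquad
d_i = \left(\frac{1}{f} - \frac{1}{d_o}\right)^{-1}
% Illustrative ultra-wide, f = 2.5 mm, object at d_o = 20 mm:
%   d_i = (1/2.5 - 1/20)^{-1} \approx 2.86\,\text{mm},
%   about 0.36 mm of lens travel past infinity focus.
% Illustrative main camera, f = 7 mm, same 20 mm object:
%   d_i = (1/7 - 1/20)^{-1} \approx 10.8\,\text{mm},
%   about 3.8 mm of travel, far more than a phone module can move.
```

The shorter focal length reaches a 2cm subject with a tiny fraction of the lens travel, which is why the wide module is the one that can afford a macro mode.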
Photos taken in macro mode are sharp and they do have a nice falloff, but I really only used it on flowers, and, I don't know, I just struggle to find a need for it. This mode continues to feel gimmicky to me. (Just waiting for this guy to walk by. We'll get him, we'll get him.) The 48-megapixel telephoto has an improved f/2.8 lens that captures more light. It is still a 5x zoom that equates to around 112mm, and much like the iPhone 15 Pro Max's 5x telephoto,
it doesn't hold a candle to the S23 Ultra. Samsung's 10x, 230mm telephoto lens is obviously more zoomed in, but also just plain sharper in good light. Compared to the iPhone 15 Pro Max, though, the Pixel more consistently identifies objects and separates them from the background. Though in perfect light and without a subject in the immediate foreground, photos from these devices' telephoto lenses are hard to differentiate. The problem with the 8 Pro's telephoto lens is its stabilization. By default in video mode, the Pixel 8 Pro has standard stabilization enabled, and this is great for folks who won't dare step into deeper camera settings, but in its attempt to provide a stable image, the camera creates frustration. I often felt like I was fighting the camera to pan or tilt slightly in order to achieve the framing I wanted, and even with stabilization turned off, there's a sort of sway to the image that can be difficult to work with. While these settings ultimately make for a sharper image, I can't get over how frustrating it feels when I use this lens, and it all just made me less likely to play with it at all. But specs aside, this system is so much more about the tools it provides to edit these photos and videos. Say it ain't so, we got manual controls, and this is big. When you take a photo on most modern smartphones,
data from the scene is analyzed and carried down a controlled path to the best outcome, or most objectively, quote-unquote, good photo. Throwing user-generated turns into this path, turns that may or, more likely, may not result in these good photos, is challenging for the system. But look at Google doing what Samsung has done for years and including some manual controls in the native camera app. There is now access to white balance, focus, shutter speed, and ISO settings, along with the ability to quickly switch between 12-megapixel and 50-megapixel photos, RAW or JPEGs, and even override the camera's choice to automatically switch lenses.
You can overexpose at your heart's desire, underexpose the way God intended, and even pretend that this sensor is large enough to actually have a large focus range. Okay, so for someone who just ragged on the iPhone 15 Pro Max for not having these features, I am being a bit harsh. And although you can only get the Pro controls on the Pixel 8 Pro, Google has never claimed that the system is for professionals with a capital P. Instead, it's just another really fun tool to play with, and when I feel like fine-tuning my image, I can just pop in and do that. So yeah, it's a nice feature to have, and I'm stoked that it's here.
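For the curious, here's a minimal sketch of what controls like these map to in Android's public Camera2 API. To be clear, this is not Google Camera's actual implementation (that code isn't public); it's an illustration that assumes you already have an open CameraDevice and a target Surface:

```kotlin
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Builds a still-capture request with auto-exposure, auto-white-balance,
// and autofocus switched off -- roughly what manual controls expose.
fun buildManualRequest(camera: CameraDevice, target: Surface): CaptureRequest {
    val builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
    builder.addTarget(target)

    // Shutter speed and ISO: disable auto-exposure, then set values directly.
    builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF)
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 8_333_333L) // ~1/120 s, in nanoseconds
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, 400)          // ISO 400

    // White balance: pick a preset instead of leaving it on auto.
    builder.set(CaptureRequest.CONTROL_AWB_MODE, CameraMetadata.CONTROL_AWB_MODE_DAYLIGHT)

    // Focus: turn off autofocus and set the distance in diopters (1/meters).
    builder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF)
    builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 2.0f) // focus at 0.5 m

    return builder.build()
}
```

The pattern is the same for every control: each auto mode (AE, AWB, AF) has to be switched off before the corresponding manual value is honored.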
This camera takes photos well, but it can manipulate them even better. Google's Magic Editor now not only allows for erasing objects or people, but you can also fully move things within the frame or change the expression on folks' faces. These AI editing tools are fun, and making them accessible to everyone is great, but it also accelerates the creation of false narratives or realities, especially when you can do this to any photo
you upload to Google Photos. And yes, many folks have pointed out that these photo editing tools are available in many other places, including apps you can install on your phone; Google is just the first phone maker to bake them in. I don't think they're wrong for doing so, but with that approach comes great responsibility, and it is this lack of recognition that these tools could be used for bad that is the problem for me. Google explained to me that IPTC metadata is embedded into images that go through Magic Editor, meaning that if and when (and that is a very big if and when) other companies start adhering to the standard, anyone will be able to see that an image has been manipulated using the software. But we aren't there yet, and these tools, they exist now, and when anyone can just erase or move real things, it sets everybody up for more uncertainty.
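As a rough illustration of what checking for that flag could look like: IPTC's digital source type vocabulary includes a value for composites made with generative tools, and the sketch below just scans an image's embedded metadata text for it. The file name is hypothetical, and a real tool would properly parse the XMP/IPTC blocks rather than string-matching raw bytes:

```kotlin
import java.io.File

// Crude check: look for the IPTC digital source type value that marks
// composites made with generative/algorithmic tools in the file's
// embedded metadata text.
fun looksAiComposited(path: String): Boolean {
    val raw = File(path).readBytes().toString(Charsets.ISO_8859_1)
    return raw.contains("compositeWithTrainedAlgorithmicMedia")
}

fun main() {
    println(looksAiComposited("magic-editor-output.jpg")) // hypothetical file
}
```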
The Pixel 8 Pro's camera still struggles with an old weakness, which is color consistency, and it will come as a surprise to no one that the software and the UI were the most tuned and best parts of the experience. So, at the end of the day, Google remains a software company that sometimes makes hardware, and for every misstep it takes on the hardware side, there are two AI editing tools ready to make it right, or make it up entirely.
Okay, I finally reached that point in my YouTube career where I can say I, uh, I have a dbrand skin. Well, The Verge has a dbrand skin, and in fact, they have two, two dbrand skins. Anyway, link down below, uh, to these things. Uh, I've made it. What can I say?