Give Me 4K or Nothing

Hi, this is Wayne again with the topic “Give Me 4K or Nothing”.
Oh, we shot a video about this today. This is super cool: 4K upscaling for web video. Nvidia just announced that RTX Video Super Resolution is now supported for users of RTX 30 and 40 series GPUs. We were talking earlier in the show about moats; yeah, Nvidia’s additional technology that you get to leverage when you have their GPUs is a moat. Yes, yeah.

That’s a perfect example. So, it’s upscaling for videos played in Chrome and Edge, with support for the RTX 20 series likely to come, so we’ll see how that goes. Anything from 360p to 1440p is supported and can be upscaled up to 4K. 4K upscaling was previously only available on Nvidia’s Shield TV, which didn’t handle high refresh rate video and didn’t go as low as 360p. RTX VSR, that’s what they’re calling it, uses AI upscaling to sharpen low-resolution video while removing compression artifacts. This is wild. I’m not going to spoil the whole video; it’s definitely worth a watch, but I actually preferred the RTX-upscaled 1080p to the native 4K on YouTube.

Whoa. Whoa, yeah, that’s because 4K on YouTube still has a lot of banding. It does; it’s pretty low bitrate. Yes, so with a little bit of sharpening... Did you mess around with Floatplane and smoothing? No, I didn’t think to try that. I’m interested; we should play around with it, because it’s a little different than Floatplane.

I’m sorry, than YouTube! It worked on CBC Gem; it seems to just work on anything, I suspect, which is cool. I just wonder... like, it worked on Netflix, so it worked on DRM-protected content, which I wasn’t sure if it would. Interesting. Yeah, it’s pretty cool. That is actually quite interesting.

I’m hoping... that would be DRM-protected as well, so yeah. Well, okay, no, don’t worry about it. So the missing information, rather than coming from just edge detection and sharpening, is predicted by a neural network that’s trained on large datasets of images at different resolutions, and the low-resolution detail is replaced with that prediction. That’s how it works.
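To make that concrete, here’s a rough sketch of how that kind of training setup generally works; this is a toy illustration, not Nvidia’s actual model, and every name and size in it is made up. You take high-resolution images, downscale them to fake low-resolution inputs, and train a small network to predict the detail back:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy ESPCN-style 4x super-resolution net (illustrative, not RTX VSR)."""
    def __init__(self, scale: int = 4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Predict scale^2 sub-pixel channels, then rearrange into pixels.
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
        )
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, lr: torch.Tensor) -> torch.Tensor:
        return self.shuffle(self.body(lr))

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Training pairs: downscale "high-res" frames to fake low-res inputs.
hr = torch.rand(8, 3, 256, 256)  # stand-in for real video frames
lr = F.interpolate(hr, scale_factor=0.25, mode="bicubic", antialias=True)

pred = model(lr)                 # the network's guess at the missing detail
loss = F.l1_loss(pred, hr)       # compare against the real high-res frame
loss.backward()
optimizer.step()
print(lr.shape, "->", pred.shape)  # (8, 3, 64, 64) -> (8, 3, 256, 256)
```

The point is just that the “missing” pixels come from what the network learned across the whole training set, not from the frame alone, which is also why the training data matters so much for what you end up seeing.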

Welcome to everything these days. It’s pretty impressive. There are some issues, and you guys are going to want to check out the video, but apparently someone in Floatplane chat is using it to watch the stream on Floatplane. Okay, how is it? Yeah, maybe talk about that while I run and go pee; my bladder is going to explode. The show’s too long. Yeah, we’re already at three hours.

I think Dan left to, like, make coffee. Is that what I heard? No, that’s the drainage system for the air conditioner. Oh, that makes sense. I have found a merch message every 22 seconds this stream. Wow. My fingers hurt, my eyes hurt, so I just had to go look at something far away for a while. Yeah, we’re at a little bit over three hours. Okay, I haven’t seen a response yet from Prometheus Awoken, who’s

the person who said they’re using it to watch the stream right now. Is anyone else using it to watch the Floatplane stream right now? Maybe they will respond soon. Oh, there it is: “FP looks good already, but it’s hard to believe this is live.

Quality is insane.” Does this mean we can talk about the Christmas album? I love that... no, I love that comment, because Floatplane already looked good.

Floatplane is great, but the fact that it looks even so much better is fantastic. Yeah. What is the bitrate on Floatplane? So, when people talk about bitrate, on YouTube as well, it’s not really a fixed thing; it’ll vary, right? We have targets and stuff, but it’s variable bitrate, so it’s going to change quite a bit across the course of a video. And, fully self-admitting, there are videos that Linus Tech Tips has uploaded that have parts that don’t really look that good, because however the variable bitrate and compression and all that kind of stuff decided to deal with it, it just didn’t deal with it that well. That’s true for every video platform that does those types of things; we just try to target a better amount than normal, basically.
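For context, constrained variable bitrate in most encoders means setting an average target plus a short-term ceiling rather than one fixed number, and the encoder spends bits unevenly across the video. Here’s a minimal sketch driving ffmpeg from Python; the numbers are illustrative, not Floatplane’s actual settings:

```python
import subprocess

# Constrained VBR: aim for an average bitrate, but allow spikes up to
# maxrate/bufsize on complex scenes. Targets here are made-up examples.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264",
    "-b:v", "8M",       # average target bitrate
    "-maxrate", "12M",  # short-term ceiling for hard scenes
    "-bufsize", "16M",  # rate-control buffer that smooths the spikes
    "-c:a", "copy",
    "output.mp4",
], check=True)
```

Even with a generous target, scenes that need more than the ceiling allows (lots of chaotic motion, fine detail) will come out softer, which is the “parts that don’t look that good” effect described above.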

The response was... uh, where’d it go? Oh no. Basically, it’s better. They said Floatplane video is already really good, but watching it through the Nvidia...

Oh, there it is: “FP looks good already, but it’s hard to believe this is live. Quality is insane.” Wow, that’s really cool, yeah. That’s super cool. Our discussion question is: what does this mean for the future of digital video streaming? I mean, right now? Spoiler:

It consumes up to 300 watts of power, so nothing in the short term, because realistically so much of streaming video consumption is on devices like this, which would light on fire if they were consuming 300 watts of power. But in the very long term, I could see it fundamentally changing the way that we build video streaming infrastructure. Instead of optimizing your stream for what looks best to the eye at a given bitrate, you might start to optimize your stream for what might be most easily interpreted by an AI-enhanced or machine-learning-enhanced player.

So, for example, you might basically be able to encode information into the video, like: just put in a pattern; this is a rock outcropping or whatever, this is grass, fill in the blanks. That could lead to people having very different viewing experiences depending on which dataset their machine-learning-enhanced player was trained on. It could be like, you know that tool Nvidia built where you essentially MS Paint something and it turns into a landscape? Oh yeah, yeah. I could see it being kind of like that, to the point where you could just stream blocks of color and it would just decide things. Yeah, you’d have metadata, like “this actor is angry.”

I don’t think Nvidia is going to build it that way, but I think this same technology could absolutely do that. I bet something you could do is, with a certain pattern, almost QR-code style, tell an interpreter to repeat or to continue. Yeah, so you could have a certain pattern, and then some indicator within that pattern to take it and either continue it, so it changes as it goes but keeps the theme, or just repeat it. So it’s like object-based compression, almost. Whoa. So instead of, like... you could have a wall, like this wall, which they can’t really see, but you could have a more complicated design on a wall. But then, instead of filling all your bitrate with that complicated design, it’s almost like mocap, like motion capture: you could have encoded dots, like object markers, and then instead of streaming the full pixel data you’re actually just rendering it in subsequent frames (there’s a toy sketch of this idea below). It wouldn’t work very well for really fast-paced video, but for something like WAN Show, you could probably get the bitrate down to nearly negligible. And I wonder if there are ways you could communicate something like a confetti cannon to it, because one of the biggest things compression algorithms have a problem with is lots of small things that are moving and shifting. Confetti has always been like this; people noticed it originally when we moved to digital video for TVs and the NFL was playing. They would shoot confetti cannons at the end of the game, and the whole thing would just turn into fuzzy snow. You couldn’t see anything, and people were very confused about it.
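That “pattern plus repeat instruction” idea can be sketched in a few lines. This is a toy illustration of the speculation above, not any real codec, and all the names are made up: instead of sending every pixel of a patterned wall, the “encoder” sends one tile and a repeat instruction, and the “decoder” reconstructs the region.

```python
import numpy as np

# Toy "object-based" compression: ship one texture tile plus an instruction
# instead of every pixel. Purely illustrative; not a real codec.
def encode_region(region: np.ndarray, tile_h: int, tile_w: int) -> dict:
    """Send one representative tile and the region size to repeat it over."""
    return {
        "tile": region[:tile_h, :tile_w].copy(),  # the pattern sample
        "shape": region.shape,                    # where to repeat it
        "mode": "repeat",  # a real scheme might also have a "continue" mode
    }

def decode_region(packet: dict) -> np.ndarray:
    """Rebuild the region by tiling the pattern back out."""
    h, w = packet["shape"]
    th, tw = packet["tile"].shape
    reps = (-(-h // th), -(-w // tw))             # ceiling division
    return np.tile(packet["tile"], reps)[:h, :w]

# A 1080x1920 grayscale "wall" made of a repeating 2x2 checker pattern.
wall = np.tile(np.array([[0, 255], [255, 0]], dtype=np.uint8), (540, 960))
packet = encode_region(wall, tile_h=2, tile_w=2)
rebuilt = decode_region(packet)
print("exact match:", np.array_equal(wall, rebuilt))  # True
print("pixels sent:", packet["tile"].size, "instead of", wall.size)
```

Real codecs already exploit this kind of redundancy with motion vectors and intra prediction; the speculative part here is a player inventing plausible detail from a semantic hint rather than reconstructing exact pixels.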

It’s the same problem we have today, literally the exact same thing with digital video compression: if you put too many things moving in different ways, your quality is just going to tank immediately. So if you could just tell the GPU to generate the confetti, instead of needing to deal with all those changes, that would be very cool; it would be really interesting. YouTube announces some vague new AI tools: something something, create artificial scenes, swap clothing virtually. Sounds like kind of YouTuber stuff. They’re also apparently rolling out a feature where creators can record a Short parallel to another video, similar to TikTok’s duet feature. It also happens to be similar to YouTube’s own long-defunct video responses feature, which was discontinued in September 2013 because it had a click-through rate of 0.0004 percent. At the scale YouTube’s at now, 0.0004 percent could actually be worth having.
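To put that click-through rate in perspective, here’s some back-of-the-envelope arithmetic; the daily view count is an assumed round number for illustration, not an official YouTube statistic:

```python
# Assumed scale figure for illustration only, not an official stat.
daily_views = 5_000_000_000     # suppose billions of video views per day
ctr = 0.0004 / 100              # 0.0004 percent, as a fraction
print(f"{daily_views * ctr:,.0f} click-throughs per day")  # 20,000
```

So even a rate that was a rounding error in 2013 could translate into a real number of interactions at today’s scale.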