Hi, this is Wayne again with today's topic: "The NEW Chip Inside Your Phone! (NPUs)".
AI chips have suddenly become a big selling point for phones, but it might seem a little surprising that your little smartphone, which already has serious limitations on power consumption and heat generation, can run something as seemingly complicated as AI. So how exactly do they pull this off? Well, these neural processing units, or NPUs, are quite a bit different from your phone's main CPU cores. Features like Apple's Neural Engine or the machine learning engine on a Google Tensor chip are highly optimized for AI tasks, but would probably suck at pretty much anything else. It's kind of like how a GPU works: GPUs are much better at rendering graphics than a more general-purpose CPU, but you're not going to run your operating system off of your graphics card. And because neural network workloads are embarrassingly parallel, a relatively small amount of die area dedicated to AI can effectively run machine learning based tasks without sucking down too much power.
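To make that concrete, here's a minimal Kotlin sketch of how an Android app can ask for that dedicated silicon through TensorFlow Lite's NNAPI delegate, which lets the OS route a model to an NPU or DSP when one is available. The model file name and tensor shapes are placeholder assumptions, not from any specific app.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.io.File

fun main() {
    // Ask Android's Neural Networks API to schedule this model on
    // whatever accelerator the phone has (NPU, DSP, GPU), falling
    // back to the CPU if nothing better is available.
    val nnApiDelegate = NnApiDelegate()
    val options = Interpreter.Options().addDelegate(nnApiDelegate)

    // "model.tflite" is a placeholder for any small on-device model.
    val interpreter = Interpreter(File("model.tflite"), options)

    // Shapes depend on the model; a 1x224x224x3 image tensor and a
    // 1000-class output are typical for small vision models.
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }
    interpreter.run(input, output)

    interpreter.close()
    nnApiDelegate.close()
}
```

The delegate is exactly the trade-off described above: the same model runs anywhere, but on a phone with an NPU the heavy math gets offloaded to the dedicated silicon instead of burning CPU cycles.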
But that doesn't answer the question of why there's such a push to put these chips in our phones in the first place. I mean, we hear so much about cloud AI, where neural networks run on powerful servers, so can't we just offload tasks like image optimization and voice recognition to the cloud? Well, the answer lies in how large and complex the AI models are that your device needs to use. Models for common smartphone AI features, such as voice recognition, facial recognition, and some kinds of image correction, are often relatively small, meaning they can be run on-device on a limited amount of silicon. And if these functions can be run locally instead of in the cloud, it's generally better to do so. For example, if you use an Android phone's speech recognition button, you could end up waiting for your phone to send your speech to a server over the internet, waiting for that server to figure out what you're trying to say, and then waiting to get the results back to your phone.
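As a concrete example of keeping that loop local, here's a hedged Kotlin sketch using Android's built-in speech recognizer. EXTRA_PREFER_OFFLINE is a real hint in the platform API, but it's only a hint; whether recognition actually stays on-device depends on the phone and which language models it has installed.

```kotlin
import android.app.Activity
import android.content.Intent
import android.speech.RecognizerIntent

// Sketch, assuming this is called from an Activity: start speech
// recognition and hint that it should run on-device, skipping the
// server round trip described above.
fun startOfflineDictation(activity: Activity, requestCode: Int) {
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(
            RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
        )
        // Hint only: the system may still use the network if no
        // on-device model exists for the current language.
        putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
    }
    activity.startActivityForResult(intent, requestCode)
}
```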
If you could get results right away instead, that would be a big selling point for a modern phone. So even though cloud hardware might be more powerful, the latency advantage of having a chip on your device makes the trade-off worth it, not to mention that it helps protect your privacy by keeping as much of your data on your phone as possible. But when might it not make sense to rely on a phone's NPU? We're going to tell you right after we thank MSI for sponsoring this video. Introducing the MSI MAG 1250G PCIE5 power supply. You can keep your build simple and clean, because this puppy is fully modular. And why not clean up some zeros on that energy bill?
It also has an 80 Plus Gold certification, so you know it's power efficient. Upgrade your PC's power with the MSI MAG 1250G PCIE5; check it out at the link below.
Now, more advanced forms of generative AI aren't quite at the point where you can run them on a phone efficiently. And by generative AI, I mean artificial intelligence that can create new media; think of the stories that get generated by ChatGPT or the AI art from services like Midjourney.
You probably don't expect to run an entire advanced image generation model on a phone, at least with NPUs the size they are now. But what about commonly touted features like Google's Magic Editor on its Pixel lineup? Well, Magic Editor appears to need an internet connection, since the feature uses enough generative AI that the phone has to rely on cloud servers to give you the image you want in a reasonable amount of time. However, less demanding features such as Live Translate can run on-device. Since the idea of AI-specific hardware on consumer devices is still relatively new, tech companies are still trying to figure out exactly where the sweet spot is in terms of which tasks can and should be done on-device versus which ones should be offloaded to the cloud. In fact, lots of AI-as-a-service products don't yet have a clear pathway to monetization. Instead, it's more common for tech firms to roll the features out now, figure out how they work, and then jam them into their business model at some point down the line.
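To illustrate that sweet-spot decision, here's a purely hypothetical Kotlin sketch; the size threshold, model sizes, and task list are made-up illustrations, not numbers from any real product.

```kotlin
// Hypothetical model of the on-device vs. cloud trade-off discussed
// above. All numbers are illustrative, not real specs.
data class AiTask(val name: String, val modelSizeMb: Int, val generative: Boolean)

fun runsOnDevice(task: AiTask, npuBudgetMb: Int = 500): Boolean {
    // Assume heavy generative models (image synthesis, large chatbots)
    // exceed what today's phone NPUs can handle in reasonable time.
    if (task.generative) return false
    // Assume small models (speech, face unlock, translation) fit.
    return task.modelSizeMb <= npuBudgetMb
}

fun main() {
    val tasks = listOf(
        AiTask("voice recognition", modelSizeMb = 80, generative = false),
        AiTask("live translate", modelSizeMb = 150, generative = false),
        AiTask("magic-editor-style inpainting", modelSizeMb = 4000, generative = true)
    )
    for (t in tasks) {
        val where = if (runsOnDevice(t)) "on-device NPU" else "cloud"
        println("${t.name} -> $where")
    }
}
```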
That uncertainty is actually part of the reason that the die areas of NPUs in phones are still relatively small. Hardware manufacturers would rather put just enough inside the phone to enable AI features, then figure out exactly what the use cases are before they dedicate more hardware to AI. You're also seeing this on the desktop and laptop side of things, with both AMD and Intel coming out with consumer processors that include NPUs; the idea is that features like Windows Studio Effects will run on-device, so your video calls look a little bit nicer. But as time goes on, both PC and phone manufacturers are aiming to get more and more AI functions running locally.
You're already seeing the push for this, with both Team Red and Team Blue partnering with a number of outside software developers to make applications that can take advantage of their NPUs. While it remains to be seen which AI features will become mainstays, it's clear that your gadgets are going to have significantly more brain power going forward, for better or for worse. If you guys enjoyed this video, leave a like or a dislike depending on how you feel, check out our video on the hardware that runs ChatGPT if you're looking for something else to watch, and leave a comment if you have a suggestion for a future video. And, of course, don't forget to subscribe.