Hi, this is Wayne again with the topic "Self-Driving Future! Snapdragon Ride & Qualcomm AI Engine".
So when you think of Qualcomm, it's likely you're thinking about this, but they're a lot more than that. In reality, this is Qualcomm. So is this, and this, and even this. And now Qualcomm has taken this new thing out of hiding, something I never expected them to try, and what's even more surprising is how well it actually worked.
So if there are two things I know I'm always going to enjoy, it's going to San Diego and hanging out with Qualcomm. I headed down to take a look at some of the latest projects they're working on. Typically I'm just going there to hear about their latest chipset and how they're pushing that platform forward, what they're doing with AI and cameras and megapixels, but lately they've been throwing some major curveballs. Last time, if you saw that video, we were jumping RC cars over me, and I got a chance to race them from over a mile away with a phone using 5G on their new Snapdragon 888. I got a full race suit and everything.
Like I said, it's never a bad time hanging out with Qualcomm. This time they wanted to show off their latest AI advancements, using what they've learned from the smartphone world, along with their Snapdragon 8 Gen 1, to level up camera software, audio noise cancellation, even autonomous driving. The noise cancellation stuff was incredible; I had a chance to go into a room, which I'll tell you about, that I'll never forget. But the autonomous driving was unbelievable. They showed us a vehicle fitted with their newest Snapdragon Ride platform technology; they're calling it their Snapdragon Ride Vision System. As someone who's experienced autonomous driving in other cars, I was excited to see just how good Qualcomm could make it. Spoiler alert: it was really good. We've seen other Level 2+ and Level 3 autonomous cars out on the road using various technologies: some using lidar, some using radar, some using vision, some using a mixture of all those technologies. Sometimes they work; sometimes they don't.
I generally have equated autonomous driving with a 15-year-old who just got their learner's permit. Sometimes it'll drive perfectly; sometimes it'll try to make a left turn into traffic and might kill you. So I went in with a bit of skepticism about how this was all going to work. I've experienced semi-autonomous and autonomous cars before, so I knew Qualcomm was already working with a lot of car manufacturers on this platform. What I didn't realize was how many, and how far along they were. They're working with Renault, Ferrari, even BMW, and Qualcomm's modems are already in over 150 million vehicles on the road.
Today, your car might even run on Qualcomm solutions without you even knowing it. So what is the Snapdragon Ride platform? It's a set of cloud-connected platforms meant for driver assistance and autonomy. The car gathers data using vision and radar combined: there are cameras mounted around the car, so it's using vision and radar instead of relying on lidar.
I did ask why lidar wasn't in there, and they mentioned the cameras and radar were not only getting the job done, but doing it pretty well. Plus, they clarified that lidar is still really expensive, and they didn't see a good reason to use it for prototypes over vision and radar. It was kind of tough to argue with those results. And because this is all AI, which is really Qualcomm's secret sauce, it's constantly learning and updating with every single drive. Sometimes when you see demos like this, it only works on a closed track or a very controlled environment; we've done these drives in the past where you're inside a building or can only go on a certain loop. With this, we went right onto a busy highway in San Diego with some very heavy traffic, and it handled everything very, very smoothly. What I liked first: this was a real drive on real streets, on real freeways, with real other drivers. There was nothing staged about this. What they did was have a driver sitting in the driver's seat.
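To give a feel for what "vision and radar combined" means, here's a toy Python sketch of fusing one camera detection with one radar return. The data shapes and field names are my own simplification for illustration, not Qualcomm's actual API; the point is just that each sensor contributes what it measures best.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # what the vision model thinks the object is
    bearing_deg: float  # angle off the car's heading (cameras are good at this)

@dataclass
class RadarReturn:
    range_m: float      # distance to the object (radar is good at this)
    speed_mps: float    # relative speed; negative means it is closing on us

@dataclass
class FusedTrack:
    label: str
    bearing_deg: float
    range_m: float
    speed_mps: float

def fuse(cam: CameraDetection, radar: RadarReturn) -> FusedTrack:
    """Take classification and angle from the camera, range and speed
    from the radar, and emit one combined track for the planner."""
    return FusedTrack(cam.label, cam.bearing_deg, radar.range_m, radar.speed_mps)

track = fuse(CameraDetection("car", -3.0), RadarReturn(42.5, -1.2))
# track carries the camera's label/bearing and the radar's range/speed.
```

A real stack associates many detections with many returns per frame and filters them over time, but the division of labor between the two sensors is the same.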
He got us on the freeway, then activated the system. We heard a single ding, and then it was hands-off, and we were able to drive anywhere we wanted to go on the freeway. It changed lanes, it avoided collisions, it let cars pass in front of us, it passed cars that were moving too slow. It drove as a normal, experienced driver would, and that's the best compliment I can give it. Now, they did tell me this can work anywhere they want it to go, but they opted to turn the system off when we got off the freeway. They said it could work on unmapped roads; it didn't have to be already-mapped highways. It could, will, and should work pretty much anywhere. And again, the best compliment I can give this is that it was remarkably unremarkable. I felt like I was just sitting in the seat with somebody else driving, and I think ultimately that's the vision for autonomous driving: that you can sit there and not think about it.
Now, there were a few minutes where I was like, "This thing's driving itself," and that is still a crazy, amazing thing. But once that novelty wore off and the car was just driving, it drove like anybody else, and that was awesome. The fact that Qualcomm had the confidence not only to let me put this on camera, but to take it out on a very crowded, congested San Diego freeway, says a lot about how far they are and how confident they are in the technology. One thing that was really cool that I hadn't seen before was that this car was doing predictive behaviors. We saw a UI that probably isn't a production-intended UI, but we'd see cars up ahead on the cameras, then we'd see arrows next to those cars pointing right or left, meaning the system was predicting those cars were going to go left or right. Now, it didn't always happen, but the car was preparing for those situations, and if it did happen, the car was ready to react, and react very quickly. That was awesome.
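Those predictive arrows amount to estimating whether a neighboring car is drifting toward a lane boundary. Here's a minimal sketch of that idea, assuming the perception stack already gives us each car's lateral offset from its lane center; the sampling rate, window, and threshold are made-up illustrative values, not anything from Qualcomm's system.

```python
def predict_lane_change(lateral_offsets, dt=0.1, threshold_mps=0.3):
    """Guess whether a tracked car is about to change lanes, from a short
    history of its lateral offset from lane center (meters, left positive),
    sampled every `dt` seconds. Returns "left", "right", or "stay"."""
    if len(lateral_offsets) < 2:
        return "stay"
    # Average lateral velocity across the observation window.
    v = (lateral_offsets[-1] - lateral_offsets[0]) / (dt * (len(lateral_offsets) - 1))
    if v > threshold_mps:
        return "left"
    if v < -threshold_mps:
        return "right"
    return "stay"

# Drifting steadily leftward at about 1 m/s: flag a likely left lane change.
print(predict_lane_change([0.0, 0.1, 0.2, 0.3]))  # left
```

A production system would use a learned model over many more cues (turn signals, gap acceptance, surrounding traffic), which is why the arrows don't always pan out, but the car can still pre-position itself for the possibility.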
We were also able to see things like lane changes, the system's visual detections of other vehicles on the road, all those predicted behaviors, and renders of other cars. It's all awesome, it's all very cool and confidence-building to actually see, and this technology is on the road now and only going to get better. So this isn't like Qualcomm's going to be making their own
Qualcomm cars. They explained that their Snapdragon Ride Vision System could be fully adopted by car manufacturers as-is, or they can combine their own drive policy stack to work with it, giving their systems Level 2 or even Level 3 autonomy almost instantly. They're creating a system that can give a solid framework for autonomous driving going forward, one that can grow and adapt as the AI engine continues to get better. And again, I want to say this was not a staged or canned demo. I know this video is sponsored by Qualcomm Technologies, Inc., but we were out on the freeway, they were confident being on the freeway, and anything could have happened out there.
We could have seen intoxicated drivers, we could have seen construction, we could have seen a lot of other driving hazards, and the fact that they were confident enough to let me keep rolling and go out on these freeways says a lot about the technology and how far along it is. So I thought the autonomous stuff was pretty incredible, but it was not the only groundbreaking tech Qualcomm showed me. They gave me a tour behind the scenes of a lot of their labs, things that most people don't see or even think about. So listen,
I appreciate good audio as much as the next guy. When I hear good audio on a phone or headphones, I'm all about it, but I don't spend a lot of time thinking about how companies fine-tune that audio. I just assumed audio experts tweaked a few levels here and there behind the scenes and it's good to go. Obviously I know it's far more complicated than that, but what I did not expect was that Qualcomm takes such a highly scientific approach to the process of audio fidelity and noise cancellation.
Qualcomm's been investing in AI for well over a decade, and they're currently on the seventh generation of the Qualcomm AI Engine. They're putting it in so many things: phones, vacuums, PCs, VR and AR headsets, automotive, even vehicles that have flown to freaking Mars. And they're utilizing a lot more than just the SoCs; they're also using cloud services and 5G to make all these advancements in AI possible. They took us behind some doors that needed multiple key cards to enter. There were some small chambers with robots that looked like robots but had super-realistic human ears. If you think about it, a robot face with human ears is a little bit creepy, but it did show how they test acoustic measurements for Qualcomm audio, and that was awesome. Then I got to go into an anechoic chamber. You've probably seen pictures of these; they're filled with noise-reducing foam. That was nuts. I'd never been in one before. It absorbs sound down to 160 hertz, which basically translates to the quietest place I have ever been. In fact, it's the quietest place in all of Qualcomm. There's a ton of equipment in the room, and when they closed the eight-inch-thick door, you could feel all the sound getting sucked out. When I held my breath, I could hear my own heartbeat, which is a very weird feeling. They use this chamber to measure microphones and speakers to ensure they have the cleanest and most precise audio tuning environment possible, and they're taking this level of audio testing and putting it directly into phones and microphone systems.
So that was cool, but the next room showed how that measurement work gets implemented in real life, using essentially noise cancellation. Listen, we've all been on a ton of Zoom calls, Meet calls, Teams calls, whatever, where there are babies crying, people eating chips in the background, people walking around in the back of a coffee shop. It gets loud. When they turned this on, all that noise went away. They had probably the world's worst white noise machine turned on, all those things you try to drown out, and it was absolutely incredible how well this worked. I mean, just watch this, or listen to this, and know I was seeing it live; it's not like we doctored this in post. This was an actual demo of all that background noise being completely removed without the voice sounding robotic or tinny. "I'll go ahead and do some babies crying. I'll go ahead and talk while I've got some dogs barking. We've got a police siren. Test, one, two, three." And finally, here's kind of the raw audio that you would normally hear. Yeah, can't hear anything. So here's the cleaned-up version: "One, two, three, I'm talking." That's crazy. That's like magic. This is all done using Qualcomm's voice communication suite, with AI baked into the chip handling all of these things, and it was pretty cool to actually see it happen. So next they took us to a basement, which was a weird place to be; it looked a little bit like the Batcave. They showed us, I don't even know how to describe it. Look at this thing.
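That noise suppression runs Qualcomm's own learned models on the AI engine, but the basic shape of frame-by-frame suppression can be illustrated with a much cruder stand-in. Here's a toy energy-based noise gate in Python; the frame length, noise-floor estimate, and margin are all my own illustrative choices, not anything from Qualcomm's actual pipeline.

```python
import math

def rms(frame):
    """Root-mean-square energy of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def noise_gate(samples, frame_len=160, noise_frames=5, margin=2.0):
    """Toy noise gate: estimate the background noise floor from the first
    few frames (assumed to be noise only), then silence any frame whose
    energy doesn't rise clearly above that floor."""
    frames = [samples[i:i + frame_len] for i in range(0, len(samples), frame_len)]
    floor = max(rms(f) for f in frames[:noise_frames])
    out = []
    for f in frames:
        if rms(f) > margin * floor:
            out.extend(f)                # keep frames that look like speech
        else:
            out.extend([0.0] * len(f))   # mute frames that look like noise
    return out

# First five frames are quiet background hiss, the sixth is a loud burst.
quiet = [0.01 * (-1) ** i for i in range(160 * 5)]
loud = [0.5 * (-1) ** i for i in range(160)]
cleaned = noise_gate(quiet + loud)
# The hiss is silenced; the loud burst passes through untouched.
```

A gate this crude chops audio on and off, which is exactly the robotic artifact the demo avoided; real suppressors instead apply learned, per-frequency-band masks so speech survives even while noise underneath it is removed.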
It's what I'd call an orb of speakers that I got to step inside. They use it to test and calibrate how audio is heard, both by human ears and by measurement devices, taking extra steps to ensure high-grade audio quality on their mobile platforms and beyond, again all using their Qualcomm AI Engine. We also saw how their AI engine is being used for cameras, running in the background without you even realizing it. The camera could accurately detect and track every motion I was making, with both body and facial tracking, plus things you would notice on the user side, like fast autofocus and realistic depth of field for video portrait modes. It looked really, really awesome to see, and it's all made possible because of that 7th-gen Qualcomm AI Engine. It's at the core, driving all of these on-device experiences, and not only on the Snapdragon 8 Gen 1 mobile platform.
But to other industries and AI platforms as well. We talk about AI so much it's become a buzzword: it's AI, it's AI, it's AI. Seeing AI constantly improving, and seeing the use cases as a consumer, was awesome. This is going to filter down to you making a phone call from a train station, on an airplane, or while boarding an airplane, where there's a lot of noise, and the person on the other end still hearing you. It might not be something you think about, but it's going to be there and working because of Qualcomm and because of AI. Or the next time you shoot a video using the latest Android phone and it looks crisp, or you're using portrait effects in video and it's doing a great job with the background blur, that's going to be because of AI. Or you're trying to get that shot of your kid who's running really fast, and
the autofocus is quick enough that you get that shot. It's all going to be because of AI. It's all going to be because of these things that Qualcomm is testing. So you might not think about how the product gets made, but you will appreciate it when you actually use it.