The Google Tech You Don’t Know About

Hi, this is Wayne again with the topic “The Google Tech You Don’t Know About”.
We're bringing it old school with this one. What's going on, guys — it's UrAvgConsumer, and today we got a really cool chance to go to the Google headquarters and check out some really interesting stuff they've been working on. We saw the secret things that we're not allowed to talk about. Well, we can talk about... no, no, no! No! No.

You can't believe it — we've already said too much. But no, honestly, we did see a lot of cool stuff, and I'm talking top secret stuff. We got to go into Google's quantum lab and meet all kinds of engineers working with quantum computers, which look like chandeliers but are over a hundred million times faster than the world's most sophisticated computers we have today. You know, we should probably get one of these in for, like, a massive tech unboxing. All jokes aside, this is the kind of stuff the future is made out of, and you probably wouldn't believe me, but these things have to be cooled to a temperature near absolute zero, which is just mind-blowing to think about. But we saw some other stuff as well, like the Google Street View garage tour, where we saw everything from the earliest iterations of the cameras and vehicles they use to map out and capture Street View in Google Maps to the kind of tech they're using now.

It's just crazy to see how far they've come from a quality and size perspective. We also got to tour the X lab, and this is the area where they work on the craziest, most outlandish tech I've ever seen to solve some of the world's largest problems — and it's all really cool stuff. We've been hearing about a lot of the cool things that Google's been working on since, I think, I/O earlier this year. Me personally, I was a really huge fan of the — what is it, Real Tone? Real Tone, which is Pixel 6-only for now, but the idea is that they're using machine learning to better expose for skin tones, because obviously that's a huge problem: these cameras don't really expose well for, you know, darker skin tones, and Google takes that into consideration with this whole new method.
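To make that a bit more concrete, here is a tiny sketch of the general idea — metering exposure against a detected face instead of the whole frame, so the subject's skin lands at a sensible brightness. To be clear, this is not Google's Real Tone pipeline; the function name, the face-box format, and the target luminance below are purely illustrative assumptions.

import numpy as np

# Toy auto-exposure helper: meter off the detected face, not the whole frame.
# Not Google's Real Tone pipeline -- just an illustration of the general idea.
def face_weighted_exposure_gain(image, face_box, target=0.45):
    # image: HxWx3 float array in [0, 1]; face_box: (top, left, bottom, right) pixels;
    # target: desired mean luminance for the face region (hypothetical tuning constant).
    top, left, bottom, right = face_box
    face = image[top:bottom, left:right]

    # Rec. 709 luma as a rough proxy for perceived brightness.
    luma = 0.2126 * face[..., 0] + 0.7152 * face[..., 1] + 0.0722 * face[..., 2]
    mean_luma = float(luma.mean()) + 1e-6  # avoid divide-by-zero on an all-black crop

    # Clamp the correction so a single frame never swings exposure too hard.
    return float(np.clip(target / mean_luma, 0.5, 2.0))

# Example: apply the gain to a stand-in camera frame before tone mapping.
frame = np.random.rand(480, 640, 3)
gain = face_weighted_exposure_gain(frame, (100, 200, 220, 320))
corrected = np.clip(frame * gain, 0.0, 1.0)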

So that's just something that really stood out to me, because I've been complaining, even on the channel, for years now that with cameras it sometimes feels like they ain't made for us. So it just feels really cool that Google's solving problems that a lot of people don't even know exist. Well, I think the thing that a lot of this stems from is the machine learning, right? And I think up until this point a lot of it has been very theoretical — very "oh, you can process blah blah faster." It's not really relevant, yeah.

You don't really see within these keynotes what the benefit is to the end user — how it really is going to impact us and our everyday lives — but now we're starting to see these things come to fruition, which directly impacts us. So machine learning is actually starting to become a huge benefit beyond just the cool, techy data things to hear about. Think about taking a photo: when you're trying to capture stuff on a phone, it's not just snapping a still like it seems — there's a lot of processing that goes on before and after you press that shutter button. Yeah, and when you think about it in the context of what they're doing specifically with skin tone, it's able to go, okay, cool, we're going to adjust this up, down, left, right, whatever the case is, to try to get the most out of each shot, which is so cool — and that's a practical, real-world benefit. Some of the other stuff we saw today, in regards to audio, is another thing being done on device that can help people who may not have what you'd call typical speech still be easily understood. That's the real-world, practical benefit; when you train these neural networks, it makes a big difference, and we're starting to see a lot of this stuff come out.

Yeah, absolutely. What Austin is referring to is actually called Project Relate — even the name itself, you know, "relate," because it can relate to some of the issues a lot of people are having that you just don't know need a solution like what Google offers. It's this cool app that someone with a speech impairment can speak into, kind of like what we see with Google Translate, and there are three different outcomes: you can have it translate the speech into text, or into a synthesized voice that can be more easily understood by folks who aren't used to hearing that kind of speech, or it can be used to control a voice assistant.

What they're doing is essentially an extension of what they're already doing with stuff like Google Assistant, right? Well, essentially, they're making an offshoot of that to then focus on you and your specific speech. So if you have a slightly different way of speaking that maybe isn't being picked up by a normal algorithm that has been trained on quote-unquote normal voices, well, the way they do it with Project Relate is you sit down and you train the AI model itself, and then they're able to build a custom model attached to you, which is linked to your Google account and then processed live on your device. This is not something that would be possible without the massive amount of hardware and software that they're developing, right? I think that alone makes it worth doing — it's very, very cool.
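For anyone curious what "training the AI model on yourself" can look like in code, here is a rough, generic sketch: fine-tuning an off-the-shelf speech-recognition model on a handful of one user's own recordings so it adapts to their way of speaking. This is not Project Relate's actual implementation — the checkpoint name, hyperparameters, and helper functions below are illustrative assumptions.

import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Generic English ASR checkpoint used as a stand-in starting point (assumption).
MODEL_ID = "facebook/wav2vec2-base-960h"
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.freeze_feature_encoder()  # adapt only the upper layers to the new speaker

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

def personalize(user_clips, epochs=3):
    # user_clips: list of (16 kHz mono waveform, reference transcript) pairs that the
    # user records while "training" the model on their own speech.
    model.train()
    for _ in range(epochs):
        for waveform, transcript in user_clips:
            inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")
            labels = processor(text=transcript.upper(), return_tensors="pt").input_ids
            loss = model(input_values=inputs.input_values, labels=labels).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

def transcribe(waveform):
    # After personalization: turn speech into text, which could then be shown as
    # captions, re-voiced by a TTS engine, or passed along to a voice assistant.
    model.eval()
    inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(input_values=inputs.input_values).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]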

So I'm curious: as someone who obviously spends a lot of time with tech, what's your level of comfort with Google and the amount of data they collect? So for me personally, I'm completely comfortable — unless they start synthesizing an image of me and making, like, political videos that I didn't actually shoot. I feel like, hey, listen, if this is going to make my experience better, sure. What about you? I guess I'm a little more unsure of things. So, at its core, Google is an ad company, right? Right. They have a huge amount of incentive to be the information company where you get information of all kinds — to make Google Search better, to make all these bits of information useful, whether it's YouTube or whatever the case is.

But ultimately it does kind of come down to selling ads. Now, I don't think either of us can sit here and tell you with a straight face that that's bad — we would not have jobs if YouTube ads did not exist — but I do think it's worth considering the data you give and how it's being used. Not to say that you should say no to it, but I don't think it's an absolute no-brainer.

It's not automatically, "yes, I'm gonna do this all the time." No, no, that's fair — and I wouldn't say, hey, give them all your data, don't ask questions, just go ahead. Yeah. Um, with what they've done, yeah, there are going to be things that improve my experience based on having voice data.

You know, the voice data I don't worry about too much, because any AI model that wants to learn my voice can just watch about 3,000 videos, so I've kind of given up on my voice being private. Um, you know, we shouldn't give up on the camera, though — it just ran out on us there, yeah. Let me go fix that real quick. Hello, welcome back to the Austin and Judner show — yeah, we're currently taking donations to take the show on the road, so if you'd like... But it was cool to see some of the other things that Google was working on.

We've kind of talked around what we actually saw. Well, no — because we can't talk about the secret things we saw; when we walked out, they mind-wiped us. Now, one big thing that we did get to see was the X lab. The X lab is basically where Google finds a problem and then tries to solve it through technology, and we've seen lots of different things come from this — Google Glass, and Waymo, the self-driving car project.

A lot of what makes Google cool is that they fund these moonshots — designing and making wild, ridiculous things that on the surface don't make sense, right? I mean, we walked through a lab where — again, I don't think we got footage of most of this — they have dummies next to water filtration systems, next to robots, next to underwater cameras. It's like they need to be able to prototype anything, because depending on the problem that comes up, someone goes, hey, you know what? It'd be great if I had internet in the middle of the desert.

How do we do that? I don't know. Oh, let's go build a bunch of weird stuff and see what happens. But I can appreciate that — I can appreciate them looking into problems they can solve for people who you might not even know need those problems solved. It was pretty wild.

But it was actually a really good time. How about the campus? Okay — the Bay View campus is the most outrageous building I've been in in my entire life: solar roof, insanely huge ceilings, thousands of people working inside. Yeah, it's unbelievable what they're doing here, and it took them seven years from start to finish, from the idea to the finished buildings. But yeah, we had a really good time at Google, hanging out and seeing all the crazy stuff that's being worked on. But until the next one, guys — hopefully you enjoyed it. Thank you, Austin Evans. Thank you, Justin — and shout out to Justin holding the camera. I know you're listening, and Justin's tired too, yeah, yeah.

This is an old school video, all right. But yes, huge shout out to both of you guys — I appreciate you, and we'll catch you guys in the next one. Peace.