Hi, this is Wayne again with today’s topic: “Android 12? 2021 Google I/O Keynote Highlights in 10 Minutes”.
All right, so the Google I/O keynote just wrapped up, and I want to take a moment to tell you what they talked about. First, they focused on privacy. Then they covered Android 12 and Wear OS updates, and then they showed us a lot about Android and what new advancements in AI mean for it. We’ll talk about that a little more later. I’m Luke Pollock with Android Authority; let’s take a look at what Google announced at their 2021 keynote presentation.

Starting out the event, Google talked about a couple of new features and updates to Google Maps: the first is safer routing, and the second is more eco-friendly routes, which basically let you be environmentally conscious while you drive. Apart from this, Google also talked about a new system called Smart Canvas, a task-collaboration system similar to Notion or Asana that lets multiple big teams collaborate on single projects, with Google Meet on one side and their documents and other materials on the other. This is pretty cool, and Google is also integrating a bunch of different things, such as built-in noise cancellation and camera zoom adjustment, to make sure users are properly lit and can be heard clearly in every single meeting. This also comes with a couple of new features that let you customize your view and rearrange documents and people however you like, which is really sorely needed in this industry.
Now, this won’t be available just to businesses; it will be available to everyone upon release, but I’m not entirely sure when that will be. From there, Google talked about a handful of different items, such as Google Translate and how interpreter mode is now used four times more than it was last year. They also talked about Google Photos and how it can recognize more information in your photos, but the thing they really wanted to drive home was Google’s new AI technology called LaMDA, which stands for Language Model for Dialogue Applications. Basically, LaMDA is meant to give users a more human-like interaction when they talk to the Google Assistant. Google is still developing the technology, but it could lead to more natural conversations: it responds with logical, consistent answers rather than raw data. So if you asked it how the weather is, it might say it’s getting warmer, so maybe you should wear a t-shirt. They also showed a couple of different interactions, such as talking to Pluto as if the AI were Pluto, describing what it was like to be Pluto. Another example was talking to a paper airplane about what it was like to fly, along with different facts about paper airplanes.
The technology is really early, and Google did stress this, but it’s really cool to see, and I’m curious how well it will be implemented in future years. From there, Google also talked about new AI advancements that let you search for something like “take me to a beautiful location” and have Maps automatically respond with beautiful locations in your area, or say “hey, I’d like to see the specific part of this video where so-and-so did this” and have Google’s AI use that information to jump straight to that point in the video. Google also talked about a bunch of advancements in quantum computing and used the actor Michael Peña to walk you through what you might want to know about it all. I won’t pretend to know what any of it meant, but if you want to learn more, I’ll leave some links down in the description below.

This year, privacy was a big focus for Google, and they talked about a couple of new features, starting with a new secure-by-default mantra, a password-free future, and more phone-based authentication. What this means for you is that there were new updates to things like the password manager, plus a new auto-delete feature that erases your data after 18 months, which will be enabled automatically for new users. There was a new button that deletes your last 15 minutes of search history, and a couple of other updates to the password manager tool that let you import from other password managers and whatnot. Lastly, there was a quick-fix tool that takes a compromised password and has Google update it automatically, so you have no lapse in coverage when it comes to a secure password.
Now, continuing with the Google AI theme, Google talked about a couple of updates to Google Search, the main one being something called MUM. MUM stands for Multitask Unified Model, and it’s apparently a thousand times more powerful than the previous search model, called BERT. It draws on deep world knowledge, both understands and generates language, and is trained across 75-plus languages, and the main thing it’s going to be used for is finding new information in photos or screenshots. So they gave the example of taking a photo of some boots and asking, “Can these be used to hike Mount Everest?” Google will then tell you yes, these can be used, and give you information on where to find them, plus more articles and reviews and whatnot.
Now, whether or not this will be useful in real life remains to be seen, but I could see it being used in a couple of edge cases right now. I’m really interested in film photography, and one of the key things I struggle with is identifying which camera models are in Facebook Marketplace posts. So take, for instance, a post where the photo is a little blurry and I can’t tell the model. This could be really useful: I take a screenshot, Google Lens says, “Oh yeah, that’s a Leica M2,” and then I can look up the rest of the information from there. Some of that integration already exists, but it’s not as set in stone and well developed as Google wants it to be. Now, there were a bunch of other updates that I won’t cover in too much detail, just because that would take too much time.
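If you want to play with that idea yourself today, ML Kit’s on-device image labeling gives a rough taste of it. To be clear, this is not Lens or MUM, just a minimal sketch of the same screenshot-to-labels flow using Google’s publicly available ML Kit library:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Minimal sketch: feed a screenshot into ML Kit's on-device labeler and
// print whatever labels come back with their confidence scores.
fun labelScreenshot(screenshot: Bitmap) {
    val image = InputImage.fromBitmap(screenshot, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            labels.forEach { println("${it.text}: ${it.confidence}") }
        }
        .addOnFailureListener { it.printStackTrace() }
}
```

The stock model only produces generic labels like “camera,” not exact model names like “Leica M2,” which is exactly the gap Google says this new search technology is meant to close.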
So if you want to know more, there will be links in the description, but there was some talk about Google AR and new support for pro athletes, which lets you watch them do their thing in real time in augmented reality. There’s also a new source-vetting feature, “About this result,” that basically helps prevent fake news from being spread, and there are some updates to Maps, including a new Live View feature for indoors and detailed street maps that show you crosswalks if you’re in a big city.

They also had some updates to online shopping, including a new Shopping Graph that basically gives you detailed information about reviews and even the exact SKU. There’s also a new cart feature that lets you see all the shopping carts you have open inside Chrome and decide whether or not you want to check out. Now, a new feature that was quite interesting was an update to Google Photos
called Little Patterns, which basically uses AI to match up very similar photos from over the years or within a group of photos. This is really cool, and there’s also a new feature called Cinematic Moments that uses machine learning to animate photos that have small movement between them, giving you a kind of pseudo-animated photo.
Now, the thing you’ve probably been waiting for: the new updates in Android 12. Android 12 introduces something they’re calling Material You, which treats you as a co-creator and basically lets you alter many different small aspects of the UI. There’s a new feature called color extraction that adapts to whatever you set your wallpaper to and creates a color palette for your entire UI. There’s a new lock screen that shows your notifications, and when you have no notifications, it just shows you a giant clock. There are new animations for when you pick up the device: say I picked up the phone, it would show an animation that lights up the screen from the bottom-left corner to the top right. If I turned the phone on using the power button, the animation lights up the screen from the right to the left.
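For the curious, apps have actually been able to peek at wallpaper-derived colors for a while; Material You builds its full tonal palettes on top of this kind of extraction. Here’s a minimal sketch, not Google’s Material You implementation, just the public WallpaperColors API that has existed since Android 8.1, showing the three swatches the system exposes:

```kotlin
import android.app.Activity
import android.app.WallpaperManager
import android.os.Bundle
import android.util.Log

// Minimal sketch: log the colors the system extracts from the current
// wallpaper. Material You's full palette generation is more involved,
// but this is the extraction surface apps can already see.
class PaletteDemoActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val colors = WallpaperManager.getInstance(this)
            .getWallpaperColors(WallpaperManager.FLAG_SYSTEM) // may be null
        Log.d("PaletteDemo", "primary=${colors?.primaryColor}")
        Log.d("PaletteDemo", "secondary=${colors?.secondaryColor}")
        Log.d("PaletteDemo", "tertiary=${colors?.tertiaryColor}")
    }
}
```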
And finally, we now have Google Assistant integration on Pixel devices that lets you hold down the power button to bring up the Assistant. There are a couple more features, such as a privacy dashboard that lets you see at a glance which apps have access to what, and a new feature that lets you toggle microphone and camera access for the whole device from the Quick Settings panel in the notification shade (there’s a small developer-facing sketch of that after the Wear OS bit below). I won’t go into too much detail about Android 12 as a platform, because we’re going to have a more specific video covering the Android 12 beta that will be released in a couple of hours, so just make sure to keep your eyes out for that video.

Wear OS also saw some changes similar to Android 12, and we’re finally getting some new updates for, honestly, a system that was kind of left in the dust, so to speak. There’s a new unified Wear OS platform developed in partnership with Samsung. It’s unclear whether Wear OS is becoming Tizen or Tizen is becoming Wear OS, but they’re partnering, and there’s a new unified, open-source system that developers can make use of. This new partnership with Samsung aims to tackle a couple of key pain points, the main ones being battery life, performance, and the app ecosystem. There are also going to be major visual changes coming to the platform, along with new tools for developers to build a more user-friendly and competitive ecosystem.
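Circling back to those microphone and camera toggles for a second: on the developer side, Android 12 exposes them through SensorPrivacyManager. A minimal sketch, assuming an API 31+ device; when a toggle is off, the camera delivers black frames and the mic delivers silence, so apps should degrade gracefully rather than crash:

```kotlin
import android.content.Context
import android.hardware.SensorPrivacyManager
import android.os.Build

// Minimal sketch: check whether this device exposes the Android 12
// Quick Settings kill switches for the camera and microphone.
fun logPrivacyToggleSupport(context: Context) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S) return
    val spm = context.getSystemService(SensorPrivacyManager::class.java)
    val camera = spm.supportsSensorToggle(SensorPrivacyManager.Sensors.CAMERA)
    val mic = spm.supportsSensorToggle(SensorPrivacyManager.Sensors.MICROPHONE)
    println("camera toggle=$camera, microphone toggle=$mic")
}
```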
There are a bunch of other small updates to Wear OS, such as YouTube Music and Spotify getting offline support, and Google Maps is getting a new user interface and a turn-by-turn directions feature. The Maps update should arrive in early June 2021, but we haven’t been told exactly when the Spotify support is coming. The keynote then took a more serious turn, as Google talked about its computational photography and how the smartphone industry has had trouble accurately representing people of color and their skin tones. To combat this, Google is working with dozens of photographers and other image experts to build more accurate computational photography that faithfully represents skin tones across more diverse groups of people. These experts are capturing thousands of images that will then head to Google to help diversify the pool that feeds its machine learning processes. Finally, Google is admitting that there is a problem and is doing something about it.
And that’s pretty much it. Google wrapped up the keynote presentation with a couple more things, the first being an AI-based tool that focuses on skin conditions and dermatology. Basically, the way it works is you take three different photos of a skin condition, and Google will list off what it might be, tell you more about it, and lay out your options for health care providers.
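Google didn’t share how the tool works under the hood, but the “three photos in, ranked conditions out” pattern maps neatly onto averaging an image classifier’s scores across the shots. Purely a hypothetical sketch using the TensorFlow Lite Task Library; “skin_model.tflite” is a placeholder, not a real Google model:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.classifier.ImageClassifier

// Hypothetical sketch only: run the same classifier over three photos of
// the same condition and average each label's score, highest first.
// "skin_model.tflite" is a stand-in; Google's real system is unpublished.
fun rankConditions(context: Context, photos: List<Bitmap>): List<Pair<String, Float>> {
    val classifier = ImageClassifier.createFromFile(context, "skin_model.tflite")
    val scores = mutableMapOf<String, MutableList<Float>>()
    for (photo in photos) {
        val results = classifier.classify(TensorImage.fromBitmap(photo))
        for (category in results.first().categories) {
            scores.getOrPut(category.label) { mutableListOf() }.add(category.score)
        }
    }
    return scores.map { (label, s) -> label to s.sum() / s.size }
        .sortedByDescending { it.second }
}
```

Averaging across three angles is one simple way to smooth out a bad shot; Google’s actual system is presumably far more sophisticated, and medically validated on top of that.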
Also, Google teased something called Project Starline, which is a new real-time 3D video calling setup. That doesn’t really make sense in words, so I’ll put something up on screen here, but basically it means you can have a face-to-face video call in a 3D space, so it looks like the person is actually there with you.
I have no idea how this works, and Google really didn’t go in depth on it, but I’m excited to see how the technology is implemented in the future. I mean, heck, even YouTube videos could be viewed this way, and that’s something I think is really cool. Again, that’s going to wrap it up for this video and recap. If there’s anything you found interesting about Android 12, Wear OS, or the new privacy features, do let me know down in the comments below; I’ll be hanging out there for a bit. Again, sorry for the rambling. I’m Luke Pollock with Android Authority, and I’ll catch you in the next video.