Hi, this is Wayne again, with the topic "Snapchat's new AR camera can identify the world around you."
You probably don't think of it this way, but Snapchat is actually one of the most used cameras in the world. More than 5 billion messages are made every day with its camera, and right now that camera is mostly used to send photos and videos between friends. But what if it was more than that? What if it was sort of like a visual search engine — something you could point at, say, a plant, and it would tell you what the plant is, or identify a shirt that you see? That's the idea behind Scan, a feature that Snap is seriously upgrading and putting front and center in its camera.
Let's try it out. Scan does just what the name implies: it scans the world around you through the Snapchat camera. Its main goal is to help you understand more about what you're looking at, like the nutrition info for whatever you're eating or drinking. Based on the scene you're trying to capture, Scan can also suggest augmented reality effects, which Snap calls Lenses, along with specific camera modes and soundtracks. These camera shortcuts let you point the camera at the sky to summon sky-specific Lenses, or conjure an AR companion and music when you're recording dancing. Snap says that Scan can detect hundreds of dog breeds, thousands of plants, millions of songs, and now similar types of clothing you can shop for, all without leaving the app. With a new update to Snapchat, Scan sits directly below the main camera record button. It was buried deeper in the camera interface before, but now Snap thinks it's smart and useful enough to deserve prominent placement.
Snap is mostly working with a bunch of other companies to power Scan right now. The app Vivino powers its wine scanning, PlantSnap powers plant scanning, and so on, and more Scan abilities are coming soon. The website Allrecipes will let you scan a specific food ingredient and see suggestions for recipes to make, and Snap wants to add scanning of furniture and other categories of objects down the road. One big area of focus for Scan is suggesting similar clothes for you to shop for. Snap is building this tech in-house, and based on my experience, it still has a long way to go before it becomes something I think people will want to use often. A Ralph Lauren top? That's not right at all. Snap hopes that if more people use Scan, it will be an important way to discover the AR Lenses its creators make.
That makes sense to me, because it can honestly be overwhelming to find the right Lens in the moment when there are so many being made now. Scan as it exists today honestly needs work, and it can be a little clunky to figure out initially, but the idea is that it will get smarter over time. Is this right? A pink quill? Yeah, that's right! Yeah, okay, got that right.

To train Scan, Snap uses Google's Open Images dataset, its own machine learning models, and Snapchat's Our Story feature, which crowdsources public videos people choose to submit for a specific event or theme. No data from Scan is used for ad targeting, and Snap told me it isn't focused on monetizing the feature yet. It's easy to see how it could make money with more shopping or advertising tie-ins down the road, though. Other companies like Google and Pinterest have tried similar visual search features before, but those haven't really caught on, and there's a similar risk for Scan here. Snap will need to keep improving its functionality and accuracy before it becomes a truly integral part of the app. During my time trying it, there were many instances where Scan incorrectly identified things or didn't work at all, and the camera shortcuts were fun initially, but also confined to only a few Lenses and situations, like a sky scene or dancing. I think Snap has a better shot at making visual search a thing on mobile phones, just because the camera is such a core part of the app experience. Over 170 million people already use Scan at least once a month, and that was before Snap put it front and center in the camera like they are now. You usually open your phone's main camera to take a photo or video and save it for later, but with Snapchat, you're usually opening the camera to have an experience with friends. So I think Scan has more opportunities there.
Scan gets a lot more compelling when you think about a future world with AR smart glasses, like Snap's latest Spectacles. I can easily see myself in a few years wearing Spectacles that can scan things without taking my phone out at all. That's pretty compelling. For now, though, Scan is pretty good but honestly hit or miss, and fairly limited in what it can do. It thinks he's an Alaskan Malamute — that's because of the color; a lot of people ask that. I think he's part coyote. I think it's going to take a while for me to get used to using my phone to scan and try to identify things in the real world, but Snap has an early start at making it normal and, hey, pretty fun. Hey, all, thanks for watching. Snapchat is recommending that I use this AR Lens right now.
So there you have it. What do you think of Scan? Would you use it? Let me know in the comments.