Apple makes iPhones and iPads more accessible with AI | TechCrunch Minute

Hi, this is Wayne again with a topic “Apple makes iPhones and iPads more accessible with AI | TechCrunch Minute”.
Apple is using AI to make iPads and iPhones more accessible. Today is Global Accessibility Awareness Day, so Apple has announced a bunch of new features that should make iPads and iPhones more useful for users with disabilities. If this works as promised, it could be huge for users who are blind, can't use their hands, or have other disabilities. It could benefit able-bodied users as well. The new features include built-in eye tracking for iPhones and iPads, vocal shortcuts for easier hands-free control, and Music Haptics, which lets you experience songs through taps, textures, and vibrations. There are also some interesting upgrades for users in cars, such as Vehicle Motion Cues to help with motion sickness, where you'll see animated dots on the edge of the screen.

The dots move with the motion of the car, and CarPlay is getting voice control. There will be larger, bolder text for color-blind users, and notifications for deaf or hard-of-hearing users when there's a sound like a car horn or a siren. But let's get back to eye tracking, which is one of the most impressive new features. It means you'll be able to use the front-facing camera on your iPhone or iPad, with an A12 chip or later, to navigate your device without any additional hardware or accessories. So you're actually moving your eyes to navigate a website, and it should also work with third-party apps. Under the hood, Apple is using AI to understand what the user is looking at and what gesture they want to perform, such as swiping and tapping. Other features include Dwell Control, which can tell when a person's gaze has settled on something, indicating that they want to select it. If some of this news sounds familiar, it's because tech companies have been making accessibility announcements all week. For example, Google says its Gemini technology will be able to describe images for blind people.

And OpenAI says its latest model can even describe the environment around someone. Startups are trying to address these issues too: ElevenLabs, which makes AI voice technology, recently released an app that can read web pages, PDFs, and other documents, and you can choose from 11 different voices. With every tech company pursuing its own version of AI and releasing product after product, it can feel overwhelming, like, who really wants this? But the benefit of these kinds of assistive features seems really obvious and really huge.

Of course, when you're talking about these kinds of essential tools, the bar gets higher because the consequences are greater if something breaks down. So we'll see who actually meets that bar. Amanda is hosting tomorrow. I'll see you next week.
