Apple machine learning projects span almost every aspect of the company’s activities, but in a new speech at an AI briefing, a senior exec spoke specifically about the benefits for accessibility and health.
Ge Yue, Apple VP and managing director of Apple Greater China, gave her speech at the 2022 World Artificial Intelligence Conference in Shanghai …
Apple has given a rare speech at a global AI gathering, with vice president Ge Yue choosing to concentrate on Machine Learning in accessibility features […]
The company has chosen to illustrate the technology through accessibility features in Apple Watch, and AirPods Pro […]
She said that “machine learning plays a crucial role” in Apple’s hope that its products “can help people innovate and create, and provide the support they need in their daily lives.”
“We believe that the best products in the world should meet everyone’s needs,” she continued. “Accessibility is one of our core values and an important part of all products. We are committed to manufacturing products that are truly suitable for everyone.”
“We know that machine learning can provide disabled users with independence and convenience,” she said, “including people with visual impairments, hearing impairments, physical and motor disabilities, and cognitive impairments.”
Ge Yue gave the example of the AssistiveTouch feature on Apple Watch, which the company introduced last year, alongside eye-tracking on iPad.
To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.
She said that this utilized on-device machine learning.
“This function combines machine learning on the device with information from the built-in sensors of Apple Watch to help detect subtle differences in muscle movement and tendon activity, thus replacing the display tapping.”
Apple views accessibility as one of the company’s core values, and its tech can make a huge difference to the lives of people with disabilities. One reader spoke earlier this year about small things making a big difference.
I always thought it bonkers when using Siri on iPhones: for years users could place a call by saying “Hey Siri, call…”, but until now there’s been no “Hey Siri, end call” command. It led to a lot of daily frustration as I can’t press the red button on the iPhone screen to hang up a phone call, so this prompted me to campaign for it. I’m really glad Apple has listened and resolved the contradiction in iOS 16! Hopefully, it will also be of use to anyone who has their hands full.
That point is one others have echoed: Accessibility features may be aimed primarily at those with disabilities, but can often prove beneficial to a much wider audience.
Apple also sees machine learning having huge potential for future health features, says Ge Yue.
Saying, too, that “our exploration in the field of health has only begun,” she says that Apple believes that “machine learning and sensor technology have unlimited potential in providing health insights and encouraging healthy lifestyles.”
Photo: Xu Haiwei/Unsplash