What does a company that’s already created excellent accessibility solutions and is preparing to introduce a mixed-reality device do when it comes to gesture controls? Make a smart ring to control the experience, apparently.
Apple has invented one ring to rule them all
We think we know that Apple will soon introduce its first take on mixed-reality glasses; it has been developing the UI components for such a launch for years. What we don't know is when the device will appear.
The speculation has dragged on so long that it's unwise to guess at timing, though some say we may see these things this fall, or perhaps in January. Most don't expect Apple's $2,000 goggles to ship until later than that, and the company apparently predicts 1.5 million sales in year one.
Powered by M-series chips, the device will likely combine Apple's existing tools for gesture recognition, voice, movement, and eye tracking. And it looks possible the company has come up with a way to make gesture commands more effective: it recently won a patent for a specialized smart ring system for use with mixed-reality applications.
This is all very Minority Report
The patent, noted by Patently Apple, relates to a smart ring being used alone, in pairs, or along with an Apple Pencil. That last piece seems noteworthy; I can’t help but imagine it means you’ll be able to write with your pencil in mid-air to create documents and other assets within your virtual environment — if not immediately, then certainly down the road.
The rings carry one or more self-mixing interferometry (SMI) sensors that can detect and identify gestures. Think of the clench gesture you can already use to answer calls on your Apple Watch as an example of gesture recognition.
The patent shows several different implementations, including use of a wearable device with its own built-in sensors alongside one or two of these patented ring controllers. It describes these systems as capable of understanding gestures such as pinch, zoom, and rotate.
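To see how two rings could distinguish those gestures, here is a minimal, purely illustrative sketch: it compares the positions of two ring-wearing fingers at the start and end of a movement and labels the result pinch, zoom, or rotate. The function name, coordinate model, and thresholds are all invented for illustration; nothing here reflects Apple's actual patent claims or an SMI sensor pipeline.

```python
import math

# Hypothetical two-ring gesture classifier. Each argument is a pair of
# (x, y) fingertip positions in metres: one per ring, sampled at the
# start and end of a movement. Thresholds are arbitrary assumptions.
def classify_gesture(start, end, dist_thresh=0.02, angle_thresh=0.2):
    (s1, s2), (e1, e2) = start, end

    # Distance between the two fingers before and after the movement.
    d_start = math.dist(s1, s2)
    d_end = math.dist(e1, e2)

    # Orientation of the line joining the two fingers, in radians.
    a_start = math.atan2(s2[1] - s1[1], s2[0] - s1[0])
    a_end = math.atan2(e2[1] - e1[1], e2[0] - e1[0])

    if abs(a_end - a_start) > angle_thresh:
        return "rotate"   # the finger-to-finger axis turned
    if d_end - d_start > dist_thresh:
        return "zoom"     # fingers moved apart
    if d_start - d_end > dist_thresh:
        return "pinch"    # fingers moved together
    return "none"

# Fingers 10cm apart closing to 4cm apart reads as a pinch.
print(classify_gesture(((0, 0), (0.1, 0)), ((0.03, 0), (0.07, 0))))
```

A real system would of course classify continuous sensor streams rather than two snapshots, but the same relative-motion idea — distance change versus orientation change between the two rings — underlies how paired controllers can separate these gestures.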
Why you need a user interface for glasses
In a sense, Apple needs to build something like this. If it wants to nurture a rich developer opportunity, it must create systems capable of delivering experiences as complex as those we get with a mouse and keyboard or MultiTouch.
At the same time, gesture recognition must be accurate, which may be where these Apple rings come into play. Giving its glasses the capacity to understand finger movement and hand gestures seems a natural extension of the work the company has been doing since the invention of MultiTouch, or, indeed, the adoption of the mouse on the Mac.
Existing devices on the market don't offer such nuanced controllers, which limits cutting-edge, real-world uses of such systems. Once the company does introduce a mixed-reality system that provides this kind of usability, it will be offering an enterprise platform for multiple uses: training, navigation, and more.
Build them and use cases will come
In 2021, Deloitte described some of the use cases we can anticipate from such devices, and analysts have been similarly optimistic.
“Apple’s entry into the eyewear market will be the game changer for all participants as the technology gets normalized and popularized. Apple has a long history of disrupting new markets and ultimately growing the addressable market size well beyond initial expectations,” wrote Morgan Stanley.
“The arrival of new smart glasses in late 2022 and throughout 2023 puts AR devices on the cusp of becoming an everyday technology,” said CCS Insight analyst Manning Smith.
All Apple needs is one ring to bind them.
Copyright © 2022 IDG Communications, Inc.