
Watch Apple’s Siri blaze through requests with on-device processing


During the privacy segment of WWDC, Apple talked about moving Siri’s processing from the cloud onto your device, using the “Neural Engine” built into Apple silicon. Having voice processing happen on your phone instead of on one of Apple’s servers is obviously better for privacy, but it can also improve speed and reliability, as Apple showed off in its demo.


The power of on-device learning.

Now let’s see how fast it is when I try it.

Compared to my demo, Apple’s is decidedly snappier, partly because Apple isn’t toggling Airplane Mode on and off for every request the way I am. (My phone still needs an internet connection for the requests that come afterward, but the on-device model doesn’t.) Full disclosure: my demo took a few takes. The first few times, the phone warned me that turning on Airplane Mode would make Siri inaccessible, and I had to tap the switch to turn Airplane Mode back off, since I couldn’t do it with my voice.

Processing Siri requests on-device should also give users more confidence in the privacy of their data: back in 2019, we learned that contractors were listening to some Siri requests, something that wouldn’t happen if those requests were handled by your phone alone. While Apple eventually tried to make that situation right by being more transparent and making Siri recordings opt-in, handling more Siri requests on the phone is a good way to make the service a little more trustworthy.


