Sprinkled throughout Apple's iOS, iPadOS, and macOS releases is a range of apps and enhancements with machine learning at their core. Many weren't mentioned onstage, and some features that almost certainly rely on AI weren't labeled as such, but here's a brief rundown of the most notable ones we've spotted:

Facial recognition for HomeKit: HomeKit-enabled smart cameras and video doorbells will use the faces you've tagged in your Photos library to identify who is at your door, and can even announce visitors by name.

Native sleep tracking for the Apple Watch: It uses machine learning to analyze your movement while you sleep, right down to the subtle motion of breathing. Similar motion analysis also lets the Apple Watch detect new workout types such as dance.
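
Apple doesn't expose the Watch's sleep model to developers, but the sleep records it produces do land in HealthKit, where any authorized app can read them. Here's a minimal Swift sketch of that query; the seven-day window is purely illustrative, and it assumes the app has the HealthKit entitlement and the user has granted read access.

```swift
import HealthKit

// Minimal sketch: read the sleep records that watchOS writes to HealthKit.
// Assumes the HealthKit entitlement is enabled and the user grants read access.
let healthStore = HKHealthStore()
let sleepType = HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!

healthStore.requestAuthorization(toShare: nil, read: [sleepType]) { authorized, _ in
    guard authorized else { return }

    // Illustrative window: the past seven days of sleep samples.
    let start = Calendar.current.date(byAdding: .day, value: -7, to: Date())!
    let predicate = HKQuery.predicateForSamples(withStart: start, end: Date(), options: [])

    let query = HKSampleQuery(sampleType: sleepType,
                              predicate: predicate,
                              limit: HKObjectQueryNoLimit,
                              sortDescriptors: nil) { _, samples, _ in
        for sample in (samples as? [HKCategorySample]) ?? [] {
            let asleep = sample.value == HKCategoryValueSleepAnalysis.asleep.rawValue
            print("\(sample.startDate) to \(sample.endDate): \(asleep ? "asleep" : "in bed")")
        }
    }
    healthStore.execute(query)
}
```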

Handwashing detection: The Apple Watch senses not just the motion but also the sound of handwashing, triggering a countdown timer to make sure you scrub for the recommended amount of time.

Translate app: Thanks to on-device machine learning, the new Translate app works entirely offline. It detects which languages are being spoken and can even translate conversations live.
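
Apple hasn't opened the Translate app's models to third-party developers, at least not in iOS 14, but the kind of on-device language identification the feature depends on is already available through the NaturalLanguage framework. Here's a minimal sketch; the sample sentence is our own, and note that this operates on text, so in a Translate-style pipeline it would run after speech has been transcribed.

```swift
import NaturalLanguage

// Minimal sketch: on-device language identification with the NaturalLanguage
// framework. No network connection is required; everything runs locally.
let recognizer = NLLanguageRecognizer()
recognizer.processString("¿Dónde está la estación de tren más cercana?")

if let language = recognizer.dominantLanguage {
    print("Detected language: \(language.rawValue)")   // "es"
}

// Confidence scores for the top candidate languages.
for (language, confidence) in recognizer.languageHypotheses(withMaximum: 3) {
    print("\(language.rawValue): \(confidence)")
}
```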

Sound alerts in iOS 14: This accessibility feature wasn't mentioned onstage, but it lets your iPhone listen for sounds such as doorbells, sirens, barking dogs, or crying babies and notify you when it detects them.
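
Apple hasn't said which models power this feature, but developers can build the same kind of on-device sound classification with the SoundAnalysis framework. The sketch below assumes a hypothetical Create ML sound classifier called DoorbellClassifier and an audio file to analyze; swap in whatever model and input you actually have.

```swift
import SoundAnalysis
import CoreML

// Observer that receives classification results as the analysis proceeds.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

// Minimal sketch: run a Core ML sound classifier over an audio file.
// `DoorbellClassifier` is a hypothetical Create ML model, not an Apple API.
func classifySounds(in fileURL: URL) throws {
    let model = try DoorbellClassifier(configuration: MLModelConfiguration()).model
    let request = try SNClassifySoundRequest(mlModel: model)

    let analyzer = try SNAudioFileAnalyzer(url: fileURL)
    let observer = SoundObserver()
    try analyzer.add(request, withObserver: observer)
    analyzer.analyze()   // processes the whole file, reporting results to the observer
}
```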

Handwriting recognition for iPad: This wasn't explicitly billed as an AI-powered feature, but we'd bet dollars to donuts that it is. Machine learning excels at visual recognition tasks, and recognizing both Chinese and English characters is exactly that kind of problem.
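
Apple hasn't said whether the iPad feature shares its machinery, but the Vision framework already ships an on-device text recognizer that developers can call for the same kind of task. A minimal sketch, assuming you have a CGImage of the writing; the language list here is illustrative, since supported languages vary by OS version.

```swift
import Vision

// Minimal sketch: on-device text recognition with Vision. Pass in a CGImage
// of handwritten or printed text; results come back as recognized strings.
func recognizeText(in image: CGImage) throws {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            // Take the recognizer's best guess for each detected line of text.
            if let candidate = observation.topCandidates(1).first {
                print("\(candidate.string) (confidence \(candidate.confidence))")
            }
        }
    }
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["zh-Hans", "en-US"]   // illustrative; check what your OS supports

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```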

Takeaway

There are notable absences from this list, most prominently Siri, Apple's perennially disappointing digital assistant. Though Siri is AI-heavy, it mostly received cosmetic updates this year. A revamped design is certainly welcome, but when you compare Siri's overall track record with that of other AI assistants, it's small fry.