AppleInsider is supported by its audience and may earn commission as an Amazon Associate and affiliate partner on qualifying purchases. These affiliate partnerships do not influence our editorial content.
Apple is researching how a future iPhone can save power by automatically pausing playback when you’re not paying attention to media.
Just about every Apple device bar the Apple Pencil can play music or some kind of audio, and most of them have microphones. Apple wants to use those microphones, and many other sensors, to determine whether you're paying attention.
Newly-revealed patent application "Proactive Actions Based on Audio and Body Movement" could be applied to any Apple device, but it is specifically about stopping audio. And it's about stopping it in order to conserve battery power rather than wasting it on something you're no longer interested in.
There are other possibilities, such as displaying information about an artist when you are detected as being interested. But as with most patent applications, this one is focused most on determining your lack of attention, and less on precisely what actions could follow.
Apple even says “various actions may be performed proactively based on the identifying the interest in the content” of the music or other audio being played.
Some of that identifying of interest could come from use of a device’s microphones, but this patent application is about combining information from multiple sources. It’s similar to the way that the new iPhone 14 leverages everything from changes in sounds, acceleration, and even barometric pressure to detect a car crash.
In this case, the device would perhaps start with head movement.
“The method identifies a time-based relationship between one or more elements of the audio and one or more aspects of the body movement,” says Apple. “For example, this may involve determining that a user of the device is bobbing his [or her] head to the beat of the music that is playing aloud in the physical environment.”
The patent application doesn’t say this, but that head-bobbing detection suggests this could utilize AirPods or perhaps even Apple Glass.
Apple also wants to detect body movement, such as dancing — or jumping out of your armchair when the Giants win the Pennant, the Giants win the Pennant.
“In another example, user motion is recognized as an indication of interest based on its type (e.g., corresponding to excited behavior),” continues Apple, “and/or the movement following shortly after the time at which a significant event, e.g., a touchdown, occurs in the audio.”
"The method identifies an interest in content of the audio based on identifying the time-based relationship," says Apple. "For example, this may involve determining that a particular song is playing and that the user is interested in the song based on his or her movement matching the beat of the song."
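The patent doesn't describe any implementation, but the beat-matching idea it outlines could be sketched roughly as follows: compare the timestamps of detected head-motion peaks against the song's beat grid, and infer interest when most peaks land near a beat. Everything here — the function name, the tolerance, and the threshold — is an illustrative assumption, not anything from Apple's filing.

```python
def matches_beat(motion_peaks, beat_times, tolerance=0.1, threshold=0.7):
    """Return True if enough motion peaks fall within `tolerance`
    seconds of some beat -- a crude proxy for 'bobbing to the music'.

    motion_peaks: timestamps (seconds) of detected head bobs
    beat_times:   timestamps (seconds) of the song's beats
    """
    if not motion_peaks:
        return False
    hits = sum(
        1 for peak in motion_peaks
        if any(abs(peak - beat) <= tolerance for beat in beat_times)
    )
    return hits / len(motion_peaks) >= threshold

# A 120 BPM song has a beat every 0.5 seconds.
beats = [i * 0.5 for i in range(20)]

# Head bobs roughly on the beat -> likely interested
print(matches_beat([0.02, 0.51, 1.04, 1.48, 2.01], beats))  # True

# Motion off the beat (say, subway vibration) -> no match
print(matches_beat([0.27, 0.77, 1.22, 1.73], beats))        # False
```

A real system would of course fuse this with the other sensor signals the patent mentions before acting, rather than trusting one heuristic.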
Most of this supposes that it is the Apple device that is playing the music, the podcast, or the live sports coverage. But the patent application also covers detection of external audio, such as when you may be dancing in the street.
That's partly about whether your attention has drifted from whatever else your phone is doing, but mostly about checking context. You may be bopping your head as the iPhone is playing Taylor Swift, but maybe you're on a subway train that happens to be vibrating in 4/4 time.
In that case, the iPhone would use what its microphones pick up to identify ambient environment sounds. That alone surely wouldn't be enough to tell the iPhone you're on the N, Q, R, or W.
Yet if the iPhone identifies the sound of train tracks, and you keep bopping even between the different kind of tracks that Taylor Swift makes, it all helps build up a picture of what you’re concentrating on.
The patent application is credited to three inventors. They include Devin W. Chalmers, whose previous related work has concerned directing a user's attention in an Apple AR environment.