There is already an auto Shazam feature on iPhone, but Apple is researching a way for Shazam to intelligently identify songs based on how your body reacts to music. As per a patent application for the technology, this could work with AirPods, a headset, a heads-up display on a vehicle windshield, and more.
Apple devices could identify songs based on how your body responds to music with auto Shazam
After acquiring Shazam in 2018, Apple added a new Music Recognition toggle to iPhone that can help users quickly figure out which song is playing at a party or on the radio without having to open a separate app. In September last year, it was revealed that the app’s functionality had been used to identify over a billion songs straight from the Control Center in iOS.
In addition to this, you can also enable an auto Shazam functionality that uses your device to listen for and log details of every song it identifies. This feature is exclusive to iOS devices.
Now, Apple is looking to expand auto Shazam. On Thursday, the United States Patent & Trademark Office published a patent application (via Patently Apple) from the Cupertino tech giant that relates to a possible next-generation feature for Shazam.
More importantly, the patent describes an all-new feature that could infer that a user is interested in audio content by detecting a movement, such as a head bob, and then trigger the app to identify the tune you’re enjoying based on how your head moves to the beat.
The method identifies a time-based relationship between one or more elements of the audio and one or more aspects of the body movement based on the first sensor data and the second sensor data.
For example, this may involve determining that a user of the device is bobbing their head to the beat of the music that is playing aloud in the physical environment. Such head bobbing may be recognized as a passive indication of interest in the music.
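To make that idea concrete, here is a minimal conceptual sketch in Swift of how beat timestamps from the ambient audio could be compared against head-motion peaks to decide whether the movement counts as a passive sign of interest. This is not Apple’s code; the type names, tolerance, and threshold values are made up for illustration.

```swift
import Foundation

// Conceptual sketch (not Apple's implementation): decide whether head-bob peaks
// line up with the beat of the audio by comparing the two time series.
struct BeatAlignmentDetector {
    /// Maximum offset (in seconds) for a head-bob peak to count as "on the beat".
    let tolerance: TimeInterval = 0.15

    /// Fraction of beats that must be matched before the movement is treated
    /// as a passive indication of interest in the music.
    let matchThreshold = 0.6

    /// `beatTimes`: timestamps of detected beats in the ambient audio.
    /// `bobTimes`: timestamps of head-motion peaks from a motion sensor.
    func indicatesInterest(beatTimes: [TimeInterval], bobTimes: [TimeInterval]) -> Bool {
        guard !beatTimes.isEmpty, !bobTimes.isEmpty else { return false }

        // Count beats that have a head-bob peak within the tolerance window.
        let matched = beatTimes.filter { beat in
            bobTimes.contains { abs($0 - beat) <= tolerance }
        }.count

        return Double(matched) / Double(beatTimes.count) >= matchThreshold
    }
}

// Example: head bobs roughly every 0.5 s, matching a 120 BPM beat.
let detector = BeatAlignmentDetector()
let beats: [TimeInterval] = stride(from: 0.0, through: 4.0, by: 0.5).map { $0 }
let bobs: [TimeInterval]  = beats.map { $0 + Double.random(in: -0.05...0.05) }
print(detector.indicatesInterest(beatTimes: beats, bobTimes: bobs)) // likely true
```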
As per the patent, this auto Shazam functionality could work with a range of devices, including headphones, an iPhone, a Mixed Reality HMD, an iPad, smart contact lenses, a heads-up display on a vehicle windshield, and more.
Device resources may be used efficiently in determining that a user is interested in audio content. This may involve moving through different power states based on different triggers at the device. For example, audio analysis may be performed selectively, based upon detecting a body movement, e.g., a head bobbing, foot tapping, leap of joy, fist pump, facial reaction, or other movement indicative of user interest.
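As a rough illustration of that power-state idea, again just a conceptual sketch with hypothetical names and thresholds rather than Apple’s implementation, a controller could sit in a low-power state and only kick off the expensive audio-identification step once a cheap motion check fires:

```swift
import Foundation

// Conceptual sketch (not Apple's implementation): the device moves through power
// states and only runs audio identification after a cheap motion check suggests
// the user is reacting to the music.
enum RecognitionState {
    case idle            // low power: only coarse motion monitoring
    case motionDetected  // a head bob / foot tap was noticed
    case analyzingAudio  // high power: microphone capture + song identification
}

final class AutoRecognitionController {
    private(set) var state: RecognitionState = .idle

    /// Hypothetical threshold for treating motion energy as a deliberate movement.
    private let motionThreshold = 0.3

    /// Called with a cheap motion-energy estimate (e.g. from an accelerometer).
    func handleMotionSample(energy: Double) {
        guard state == .idle, energy >= motionThreshold else { return }
        state = .motionDetected
        startAudioAnalysis()
    }

    private func startAudioAnalysis() {
        state = .analyzingAudio
        // Placeholder: audio capture and song matching would run here.
        print("Starting audio identification…")
    }

    /// Return to the low-power state once identification finishes or times out.
    func finishAnalysis() {
        state = .idle
    }
}
```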
As with any Apple patent, it is important to note that the company files many applications every week, so there is no guarantee this technology will ever make its way into Apple’s devices.