Using speech recognition, Spotify will in the future be able to play songs based on users' "emotional state."
Both Apple Music and Spotify already have recommended music playlists, but they’re typically based on what you’ve liked before. Now Spotify wants to gauge what mood you’re in, and play songs that match.
According to BBC News, Spotify has been granted a patent that will let it "make observations" about a user's voice, and about where they are.
“It is common for a media streaming application to include features that provide personalized media recommendations to a user,” Spotify says in its patent, as seen by BBC News. But that requires users “to tediously input answers to multiple queries” about their music preferences.
Spotify instead proposes listening to users' speech and working out what they're doing. It could detect whether music is playing for someone who is "alone, [in a] small group, [or at a] party."
Then it could analyze "intonation, stress, [or] rhythm" to determine whether the user is "happy, angry, sad or neutral." It could also utilize other information it may have about the user, including their "gender, age and accent."
Knowing a user's location and age could presumably let Spotify play music that was especially popular in that area when the user was growing up.
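As a rough illustration of the kind of logic the patent describes, here is a minimal Swift sketch that maps an inferred setting, mood, and listener profile to a playlist suggestion. The labels are taken from the patent's wording, but the types, the recommendation rules, and the era calculation are hypothetical; this is not Spotify's actual system.

```swift
import Foundation

// Labels the patent describes inferring from audio: the social setting
// and the mood derived from intonation, stress, and rhythm.
enum Environment { case alone, smallGroup, party }
enum Mood { case happy, angry, sad, neutral }

// Hypothetical listener profile; the patent also mentions age, gender,
// accent, and location as additional signals.
struct ListenerProfile {
    let age: Int
    let region: String
}

// A toy recommender that turns the inferred signals into a playlist seed.
// Purely illustrative rules, not Spotify's algorithm.
func recommendPlaylist(environment: Environment, mood: Mood, profile: ListenerProfile) -> String {
    switch (environment, mood) {
    case (.party, _):
        return "Upbeat dance hits"
    case (_, .sad):
        return "Mellow acoustic"
    case (_, .happy):
        return "Feel-good favourites"
    default:
        // Fall back to era-based nostalgia: music popular in the listener's
        // region when they were a teenager.
        let birthYear = Calendar.current.component(.year, from: Date()) - profile.age
        return "\(profile.region) hits of \(birthYear + 15)"
    }
}

let playlist = recommendPlaylist(environment: .alone,
                                 mood: .neutral,
                                 profile: ListenerProfile(age: 40, region: "UK"))
print(playlist)  // era-based suggestion, e.g. "UK hits of 2000"
```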
While this patent specifically refers to music, Spotify has also been expanding its non-music offerings. Most recently, it added a series of audiobook versions of classic literary novels.
Apple Music lets users search by mood and then recommends playlists. In iOS 14, you can also browse by mood.