All of this raises the question: What’s the idea behind showing users how they consumed music? A clue might lie in a study the Spotify research team wrote about last May. The researchers showed 10 users personal data profiles built from their Spotify accounts, covering their top songs (of the past month and of all time), top genres, how many playlists they had created, and when they listened to Spotify. They found that laying out a user’s personal listening history actually allowed them to “reflect on their identities as listeners,” and it let them see, for example, whether they only listened to music while working, or whether they had periods of intense obsession with a specific artist.
We care about this kind of insight even though, intuitively, we know it’s just music, and we all know we’re not the only fans of Taylor Swift or Lorde. It might be that seeing a narrative (now with a hint of an emotional arc) built around the songs that shaped your year always feels a bit personal, and at times revealing. (If you feel up to it, you can even let an outside AI judge your Spotify.) A FiveThirtyEight writer once mused that Spotify seemed to know him better than he knew himself.
So how exactly is Spotify doing this? We know the company has reams of data collected from its listeners (381 million monthly active users at last tally). Here are the kinds of analyses it has been running behind the scenes to understand what its users like to hear.
When Spotify was founded in 2006, its aspiration was to be a music library. Personalization came later, when the app’s engineers realized that helping people discover new music they might like could elevate their experience. That could be done by feeding an algorithm information about a user’s listening history, music choices, how long they play certain songs, and how they respond to recommendations (liking, skipping, replaying, or saving them).
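Spotify hasn’t published exactly how it weighs these signals, but a minimal sketch gives a feel for how implicit feedback like plays, skips, replays, and saves might be folded into a single per-track preference score. The field names and weights below are assumptions for illustration, not Spotify’s actual formula.

```python
from dataclasses import dataclass

@dataclass
class TrackInteraction:
    """Implicit feedback a streaming app might log for one user-track pair.
    The fields and weights below are illustrative assumptions, not Spotify's."""
    plays: int               # completed listens
    skips: int               # times skipped early
    replays: int             # deliberate repeats
    saved: bool              # added to library or a playlist
    avg_listen_ratio: float  # fraction of the track typically played (0.0-1.0)

def preference_score(x: TrackInteraction) -> float:
    """Collapse raw interaction counts into one implicit-preference signal."""
    score = 0.0
    score += 1.0 * x.plays
    score += 2.0 * x.replays           # replays signal stronger interest than plays
    score -= 1.5 * x.skips             # early skips count against the track
    score += 3.0 if x.saved else 0.0   # an explicit save is the strongest signal
    score *= 0.5 + x.avg_listen_ratio  # scale by how much of the track gets heard
    return max(score, 0.0)

# Example: a track the user replays and saved vs. one they keep skipping
liked = TrackInteraction(plays=12, skips=1, replays=4, saved=True, avg_listen_ratio=0.9)
skipped = TrackInteraction(plays=2, skips=9, replays=0, saved=False, avg_listen_ratio=0.2)
print(preference_score(liked), preference_score(skipped))
```

Scores like these (however the real system computes them) are the raw material a recommender can rank against.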
“Personalization was an empowering experience for listeners who didn’t have the time or knowledge to create endless unique playlists for every dinner party or road trip,” Oskar Stål, Spotify’s VP of Personalization, said in an October 2021 blog post. “It opened up discovery on a broader level, enabling hundreds of artist discoveries per person per year.”
Spotify’s approach to this type of personalization hinges on two main areas of research: user modeling and intricate musical analysis. The company attempts to model user behavior on the app by finding ways to map in-app activities onto human traits and emotions, and by tethering music experiences to mood and to situational contexts like the time of day, day of the week, or season. Knowing this lets it change what it recommends on a Friday night versus a Tuesday afternoon. Recommended playlists pop up in carousels across your home screen, and there are personalized playlists like Discover Weekly, Daily Mix, and Radio.
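To make the Friday-night-versus-Tuesday-afternoon idea concrete, here is a toy sketch of context-aware re-ranking: long-term affinities learned from listening history get nudged by the current situation. The mood labels, affinity values, and boost table are made up for illustration; they are not Spotify’s features.

```python
from datetime import datetime

# Hypothetical per-user affinities for a few playlist "moods",
# imagined as learned elsewhere from listening history.
USER_MOOD_AFFINITY = {
    "upbeat_party": 0.7,
    "focus_instrumental": 0.9,
    "chill_evening": 0.6,
}

# Illustrative context boosts: which moods fit which situations.
CONTEXT_BOOST = {
    ("friday", "evening"): {"upbeat_party": 0.3, "chill_evening": 0.1},
    ("tuesday", "afternoon"): {"focus_instrumental": 0.3},
}

def context_key(now: datetime) -> tuple[str, str]:
    """Reduce the current moment to (day of week, part of day)."""
    day = now.strftime("%A").lower()
    part = "morning" if now.hour < 12 else "afternoon" if now.hour < 18 else "evening"
    return day, part

def rank_moods(now: datetime) -> list[tuple[str, float]]:
    """Blend long-term user affinity with the current situational context."""
    boosts = CONTEXT_BOOST.get(context_key(now), {})
    scored = {mood: aff + boosts.get(mood, 0.0) for mood, aff in USER_MOOD_AFFINITY.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# A Friday night and a Tuesday afternoon produce different orderings
print(rank_moods(datetime(2021, 12, 3, 20, 0)))   # Friday evening
print(rank_moods(datetime(2021, 11, 30, 15, 0)))  # Tuesday afternoon
```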
Plus, a new feature called “enhance” lets you pull recommendations into a playlist you’ve already created, and just this week, Stål said in a video presentation that the team at Spotify was considering an approach that pairs a human editor with the machine learning algorithm to create audio experiences that could mix and match songs with podcasts and other content. Spotify has even been testing a neural network called CoSeRNN that weighs features such as past listening history and current context to suggest song recommendations that suit the moment.
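CoSeRNN comes from a Spotify research paper, and its actual architecture is more involved than anything shown here. Purely as an illustration of the general idea, the sketch below shows a context-conditioned sequence model: a recurrent network summarizes recently played tracks, the current context is concatenated in, and the resulting user embedding is used to rank candidate tracks by similarity. All dimensions, layer choices, and the random inputs are assumptions, not the published model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextualSessionRecommender(nn.Module):
    """Toy sketch in the spirit of a context-aware sequence recommender:
    a GRU summarizes the user's recent listening, context features are fused in,
    and the output embedding is compared against candidate track embeddings.
    Not Spotify's CoSeRNN, just a simplified illustration of the concept."""

    def __init__(self, track_dim: int = 64, ctx_dim: int = 8, hidden_dim: int = 128):
        super().__init__()
        self.session_rnn = nn.GRU(track_dim, hidden_dim, batch_first=True)
        self.project = nn.Linear(hidden_dim + ctx_dim, track_dim)

    def forward(self, session_tracks: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # session_tracks: (batch, seq_len, track_dim) embeddings of recently played tracks
        # context:        (batch, ctx_dim) features for the current moment
        _, hidden = self.session_rnn(session_tracks)       # (1, batch, hidden_dim)
        user_state = torch.cat([hidden[-1], context], -1)  # fuse history with context
        return self.project(user_state)                    # context-aware user embedding

# Rank candidate tracks for one user by cosine similarity to the user embedding
model = ContextualSessionRecommender()
history = torch.randn(1, 20, 64)   # 20 recently played track embeddings (random here)
context = torch.randn(1, 8)        # e.g. encoded time of day / day of week
candidates = torch.randn(100, 64)  # 100 candidate track embeddings

user_vec = model(history, context)                  # (1, 64)
scores = F.cosine_similarity(user_vec, candidates)  # (100,)
top10 = scores.topk(10).indices                     # indices of the best-matching tracks
```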
Source: https://www.popsci.com/technology/spotify-audio-recommendation-research/