On its grand quest to match or surpass human performance across industries, artificial intelligence (AI) faces certain challenges that come more naturally to humans, such as abstract thought, social cues and spatial reasoning. Perhaps the most difficult human endeavor for machines to master is the experience of art.
Traditional music recommender systems remain quite limited, unable to pick up the underlying emotional tone that only great DJs and talented curators are adept at identifying. But what if machines could be taught to find a song's emotional core, making recommendations faster and with less bias than even the most ardent music fanatic ever could?
Everyone who has ever tried knows how difficult it can be to find the perfect song for a special occasion, or one that perfectly amplifies the mood in a film scene. Choosing the wrong tune can ruin a cherished moment or blunt the impact of a film, regardless of the actors' talent or the strength of the storyline.
The Stockholm-based music company Epidemic Sound knows much about this challenge. The company’s sole purpose is to help individuals and companies soundtrack their stories with quality music. With a music library consisting of thousands of songs that filmmakers can choose from, they help creators find exactly what they need in order to produce the perfect match to generate the right feeling or mood.
Epidemic Sound puts a great deal of effort into assigning multiple tags to each song to describe both its style and its mood. Currently, this is done manually by Epidemic Sound's music curators – a task that is time-consuming and offers no safeguard against human bias or personal taste.
Epidemic Sound sought to reimagine their tagging system to improve efficiency and accuracy while reducing bias. Using the Peltarion Platform, they developed Deep Tagger – an AI model that can accurately tag songs by emotion and style by analyzing spectrograms of raw audio data. Deep Tagger can also find similar songs by ranking them according to mood. Over time, this audio-based clustering will produce richer catalog metadata, which will improve search and discovery for Epidemic Sound's users.
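To make the pipeline concrete, here is a minimal sketch of the two steps described above: turning raw audio into a spectrogram, and ranking songs by similarity. This is purely illustrative – it uses a simple averaged spectrum in place of a trained deep model's embedding, and all names (`embed`, `rank_by_similarity`) are hypothetical, not Deep Tagger's actual API.

```python
import numpy as np

def spectrogram(signal, frame_len=512, hop=256):
    """Magnitude spectrogram of a mono signal via windowed FFT frames."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    window = np.hanning(frame_len)
    return np.abs(np.array([np.fft.rfft(f * window) for f in frames]))

def embed(signal):
    """Collapse a spectrogram to one fixed-size vector.

    A trained model (as in Deep Tagger) would learn this embedding;
    here we simply average energy per frequency bin and normalize."""
    vec = spectrogram(signal).mean(axis=0)
    return vec / (np.linalg.norm(vec) + 1e-9)

def rank_by_similarity(query, catalog):
    """Rank catalog songs by cosine similarity to the query's embedding."""
    q = embed(query)
    scores = {name: float(np.dot(q, embed(sig)))
              for name, sig in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

In this toy form, two songs with similar spectral content (say, two mellow acoustic tracks) land close together, while a bass-heavy track ranks far from both – the same clustering idea, minus the learned notion of mood.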
As Owen Meyers, Music Curator at Epidemic Sound, explains: “With an AI system such as Deep Tagger, our curators can succeed in making even more specific tagging, resulting in the work being much more efficient and less prone to human bias.” Music recommendation based on accurate, descriptive tags is the core of Epidemic Sound’s business, and must be constantly improved for them to stay ahead of the competition.
Because Deep Tagger can tag songs at a finer granularity, users can now make more detailed queries. Meyers explains, “This is particularly important in production music when you're looking for the perfect drop, that feel-good intro or a special rise.” Through Deep Tagger, Epidemic Sound can help its users find songs with the exact musical qualities they are looking for across a whole library of music.
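A fine-grained query like the one Meyers describes amounts to filtering a catalog on multiple tags at once. The sketch below shows the idea with an invented mini-catalog; the song names and tag vocabulary are made up for illustration and are not Epidemic Sound's actual data.

```python
# Hypothetical catalog: each song carries fine-grained section/mood tags.
CATALOG = {
    "Skyline":     {"feel-good", "intro", "uplifting"},
    "Undertow":    {"dark", "drop", "bass-heavy"},
    "First Light": {"feel-good", "rise", "acoustic"},
}

def find_songs(required_tags):
    """Return songs whose tag sets contain every requested tag."""
    return sorted(name for name, tags in CATALOG.items()
                  if required_tags <= tags)  # subset test on sets

# A multi-tag query narrows the catalog to exact matches:
# find_songs({"feel-good", "intro"}) returns only "Skyline".
```

The finer the tags, the more such conjunctive queries can express – which is exactly why granularity matters for production music search.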
So, is it correct to say that Deep Tagger understands art? Probably not. But while AI’s ability to effectively understand art is limited, Deep Tagger has shown that AI can certainly groove and will serve as a great dance partner to filmmakers looking for the ultimate soundtrack to their stories.
Curious about our customer Epidemic Sound? Check out this interview with Oscar Höglund, Epidemic Sound's CEO & Founder.
Photos by Pauline Norden & Epidemic Sound