Is it a horror movie or a romantic comedy? AI can predict based on music alone


Music is an indispensable element of a film: it creates atmosphere and mood, stimulates viewers’ emotional reactions, and significantly shapes how the audience interprets the story.

In a recent article published in PLOS One, a research team from USC Viterbi School of Engineering, led by Professor Shrikanth Narayanan, sought to objectively examine the effect of music on film genres. Their study aimed to determine whether AI-based technology could predict a movie’s genre based solely on the soundtrack.

“By better understanding how music affects a viewer’s perception of a film, we can understand how filmmakers can reach their audiences in more compelling ways,” said Narayanan, University Professor, Niki and Max Nikias Chair in Engineering, Professor of Electrical and Computer Engineering, and Director of the Signal Analysis and Interpretation Laboratory (SAIL) at USC Viterbi.

The idea that different genres of movie are more likely to use certain musical elements in their soundtracks is fairly intuitive: a light romance might feature rich string passages and lush, lyrical melodies, while a horror movie might instead rely on unsettling, piercing frequencies and oddly jarring notes.

But while previous work has qualitatively suggested that different genres of movie have their own sets of musical conventions – conventions that make a romance sound different from a horror film – Narayanan and his team searched for quantitative evidence that elements of a film’s soundtrack could be used to characterize its genre.

The study by Narayanan and his team was the first to apply deep learning models to the music used in a movie to see whether a computer could predict the movie’s genre from the soundtrack alone. They found that these models could accurately categorize a movie’s genre, supporting the idea that musical characteristics can be powerful indicators of how we perceive different movies.

According to Timothy Greer, a Ph.D. student in the computer science department at USC Viterbi who worked with Narayanan on the study, the work could give media companies and creators valuable insight into how music can enhance other forms of media. It could help production companies and music supervisors better understand how to create and place music in television, film, commercials, and documentaries in order to elicit particular emotions in viewers.

In addition to Narayanan and Greer, the study’s research team included Dillon Knox, a Ph.D. student in the Department of Electrical and Computer Engineering, and Benjamin Ma, who graduated from USC in 2021 with a B.S. in computer science, a master’s degree in computer science, and a minor in music production. (Ma was also named one of two 2021 USC Schwarzman Fellows.) The team worked at the Center for Computational Media Intelligence, a research group within SAIL.

Predicting the genre from the soundtrack

In their study, the group looked at a dataset of 110 popular movies released between 2014 and 2019. They used the genre classifications listed on the Internet Movie Database (IMDb) to label each movie as action, comedy, drama, horror, romance, or science fiction, with many films spanning more than one of these genres.
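
To make the multi-genre labeling concrete, here is a minimal sketch (not the authors’ code) of how IMDb-style genre lists could be turned into a binary multi-label target with scikit-learn; the example films are hypothetical.

```python
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical example entries; the study labeled 110 films from IMDb listings.
movie_genres = [
    ["action", "sci-fi"],    # e.g. a superhero blockbuster
    ["drama", "romance"],    # e.g. a love story
    ["horror"],              # a single-genre film
]

mlb = MultiLabelBinarizer(
    classes=["action", "comedy", "drama", "horror", "romance", "sci-fi"]
)
Y = mlb.fit_transform(movie_genres)  # binary matrix, shape (n_movies, 6)
print(mlb.classes_)
print(Y)
```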

They then applied a deep learning network that extracted auditory information, such as timbre, harmony, melody, rhythm, and tone, from each movie’s music and score. The network analyzed these musical characteristics and was able to accurately categorize each film’s genre based on these characteristics alone.
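
As a rough illustration of this kind of pipeline, the sketch below pulls simple proxies for timbre, harmony, and rhythm out of an audio file with librosa and feeds them to a basic multi-label classifier. This is a hedged stand-in, not the deep learning models from the paper, and the file names and variables are assumptions.

```python
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

def audio_features(path: str) -> np.ndarray:
    """Summarize rough proxies for timbre, harmony/tonality, and rhythm."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)        # timbre
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)          # harmony / tonality
    contrast = librosa.feature.spectral_contrast(y=y, sr=sr)  # spectral shape
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)            # rhythm
    return np.concatenate([
        mfcc.mean(axis=1),
        chroma.mean(axis=1),
        contrast.mean(axis=1),
        np.atleast_1d(tempo),
    ])

# X would hold one feature vector per film's audio, Y the binary genre matrix;
# a one-vs-rest logistic regression stands in for the paper's deep models.
# clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
# pred = clf.predict(audio_features("film_audio.wav").reshape(1, -1))
```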

The group also interpreted these models to determine which musical characteristics were most indicative of the differences between genres. The models did not reveal the specific notes or instruments associated with each genre, but they did establish that tonal and timbral characteristics were the most important for predicting a film’s genre.
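
One generic way to ask which feature groups a classifier leans on, shown here purely as an illustration and not as the paper’s interpretation method, is permutation importance: shuffle one feature at a time and measure how much accuracy drops. The data below is synthetic.

```python
import numpy as np
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Synthetic stand-in data: 40 "films", 3 feature groups, 2 genre labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))               # e.g. [tonal, timbral, rhythmic] summaries
Y = np.column_stack([
    (X[:, 0] + X[:, 1] > 0).astype(int),   # genre A driven by tonal + timbral cues
    (X[:, 0] > 0.5).astype(int),           # genre B driven mainly by tonal cues
])
clf = OneVsRestClassifier(LogisticRegression()).fit(X, Y)

result = permutation_importance(clf, X, Y, n_repeats=30, random_state=0)
for name, score in zip(["tonal", "timbral", "rhythmic"], result.importances_mean):
    print(f"{name}: {score:.3f}")   # the rhythmic feature should score near zero here
```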

“Laying this groundwork is really exciting, because now we can be more specific in the types of questions we want to ask about how music is used in movies,” Knox said. “The overall experience of a film is very complicated, and being able to computationally analyze its impact and the choices and trends that go into its construction is very exciting.”

Future directions

Narayanan and his team examined the auditory information from each movie using a technology known as audio fingerprinting, the same technology that allows services like Shazam to identify songs from a database by listening to recordings, even in the presence of sound effects or other background noise. This technology allowed them to see where musical signals occur in a movie and for how long.
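
For intuition, here is a toy sketch of the spectrogram peak-landmark idea behind Shazam-style fingerprinting. It is an illustrative assumption, not the specific fingerprinting system the team used.

```python
import numpy as np
import librosa
from scipy.ndimage import maximum_filter

def spectrogram_peaks(path: str, n_fft: int = 2048, hop: int = 512) -> np.ndarray:
    """Return (frequency_bin, time_frame) coordinates of prominent peaks."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    S_db = librosa.amplitude_to_db(
        np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop)), ref=np.max
    )
    # Keep a point only if it is the loudest in its neighborhood and above a
    # floor; these sparse "landmarks" tend to survive dialogue, sound effects,
    # and other background noise layered over the music.
    is_local_max = maximum_filter(S_db, size=(25, 25)) == S_db
    return np.argwhere(is_local_max & (S_db > -40))

# Matching idea: hash pairs of nearby peaks from a known song, look those
# hashes up in the film's full audio, and consistent time offsets reveal
# where in the movie the song plays and for how long.
```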

“Using audio fingerprinting to listen to all of the film’s audio allowed us to overcome a limitation of previous studies of film music, which typically only looked at a film’s entire soundtrack album without knowing if or when the songs from the album appear in the movie,” Ma said. In the future, the group wants to take advantage of this capability to study how music is used at specific points in a movie and how musical cues shape the film’s narrative as it unfolds.

“With ever-increasing access to film and music, it has never been more crucial to quantitatively study how this medium affects us,” said Greer. “Understanding how music works in conjunction with other forms of media can help us design better viewing experiences and create moving, impactful art.”

