2014-08-11

Co.Labs

This Is Going To Change How You Listen To Music Forever

A new algorithm brings machine intelligence uncomfortably close to the human creative process.



Dr. Lior Shamir began his experiment with a curious question: Could an algorithm “understand” the music of popular artists like the Beatles?

“The results showed that the computer clearly identified that the music of the Beatles has a continuous progression, shifting from one album to the next,” Shamir says. Using nothing but patterns inherent in the music, the computer could accurately tell which album the band had written first, then second, then third. “The sorting was based on audio data alone, without any additional information about the albums,” says Shamir, who is an associate professor at Lawrence Technological University near Detroit, MI.

Identifying the patterns that make songs alike is one of the Holy Grails of machine intelligence, which means Shamir’s algorithm could become one of the most valuable pieces of code in the media industry. Recommender systems are the heart and soul of every music service, and media distributors will pay anything (remember the Netflix Prize?) for technology that can use their large databases to predict what else users are going to like. But when "machine listening" is this freaky accurate, what does that mean for the future of music and the people who live and breathe its creation and curation?

From "Please Please Me" To "Abbey Road"

Shamir’s project began, oddly enough, when he developed an algorithm for use in classifying whale song. “By mining the data we were able to determine that, just like people, whales in different geographic locations have different accents or dialects,” he says. As the project evolved, Shamir had an idea: If the algorithm worked so well for analyzing whale song, how would it fare when it came to analyzing pop music?

“Naturally, when you’re looking at music, you start with the Beatles,” he says. “So that’s exactly what we did.” He populated a database with samples of the Beatles’ music, taken from each of their 13 albums. Using a set of 2,883 numerical content descriptors, he was then able to break the music down into values ranging from pitch and tempo to other patterns we do not ordinarily associate with music.
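Shamir hasn’t published his exact descriptor set here, but the flavor of the step is easy to sketch. The snippet below leans on the open-source librosa library purely for illustration--an assumption, not Shamir’s actual toolchain--and collapses an audio file into one fixed-length vector of numbers covering tempo, pitch content, and timbre.

```python
# Illustrative sketch only: librosa is an assumed stand-in, not the
# descriptor set or toolchain Shamir actually used.
import numpy as np
import librosa

def describe_track(path):
    """Collapse an audio file into one fixed-length descriptor vector."""
    y, sr = librosa.load(path, mono=True)               # raw waveform
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)      # estimated tempo (BPM)
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)    # pitch-class energy over time
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # timbral shape over time
    # Summarize the time-varying features with means and standard
    # deviations so every song yields a vector of the same length.
    return np.hstack([
        np.atleast_1d(tempo),
        chroma.mean(axis=1), chroma.std(axis=1),
        mfcc.mean(axis=1), mfcc.std(axis=1),
    ])
```

Each album then becomes a bag of such vectors, which is all the comparison steps that follow need.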

Once this had been done, Shamir was able to use a variation of the weighted K-Nearest Neighbor algorithm to measure the similarity between any two songs.
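The weighting scheme itself isn’t spelled out here, but in broad strokes a weighted nearest-neighbor comparison works like the toy sketch below: every descriptor carries a weight reflecting how informative it is (the weights are simply assumed to be supplied), and two songs count as similar when their weighted distance is small.

```python
# Toy weighted K-Nearest Neighbor lookup over descriptor vectors.
# The weights are assumed to be given; how Shamir derives his is not
# described in this article.
import numpy as np

def weighted_distance(a, b, weights):
    """Weighted Euclidean distance between two descriptor vectors."""
    return float(np.sqrt(np.sum(weights * (a - b) ** 2)))

def k_nearest(query_vec, library, weights, k=3):
    """Return the k songs in `library` (a dict of name -> vector)
    closest to `query_vec` under the weighted distance."""
    dists = {name: weighted_distance(query_vec, vec, weights)
             for name, vec in library.items()}
    return sorted(dists, key=dists.get)[:k]
```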

“All of these experiments were done in an unsupervised fashion, which means that the computer was not guided by human intervention, but was just asked to provide the best network of similarities between the albums as the computer understood them,” he continues.
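How do you get from a pile of pairwise similarities to an ordering of albums with no human in the loop? One simple unsupervised route--an assumption for illustration, not necessarily the method Shamir used--is to average each album’s song descriptors, compute album-to-album distances, and squash that distance matrix onto a single axis with multidimensional scaling; reading the albums off along that axis yields a progression.

```python
# One possible unsupervised ordering step, sketched with scikit-learn's
# MDS. This is an illustrative assumption, not Shamir's published method.
import numpy as np
from sklearn.manifold import MDS

def order_albums(album_names, distance_matrix):
    """Project a square album-to-album distance matrix onto one
    dimension and return the album names sorted along that axis."""
    embedding = MDS(n_components=1, dissimilarity="precomputed",
                    random_state=0).fit_transform(distance_matrix)
    order = np.argsort(embedding[:, 0])
    return [album_names[i] for i in order]
```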

The sorting of the albums into chronological order was something Shamir was not expecting. Nonetheless it happened--meaning that the algorithm was able to pick 1963’s Please Please Me as the Beatles’ first album, followed by With the Beatles, Beatles for Sale, A Hard Day’s Night, Help!, and Rubber Soul. Following this the algorithm moved on to the group’s psychedelic rock albums Revolver, Sgt. Pepper's Lonely Hearts Club Band, Magical Mystery Tour, and Yellow Submarine, before finishing with The White Album, Let It Be, and Abbey Road.

Revolutionizing Discovery

After the Beatles’ music had been analyzed, Shamir turned his attention to other popular music bands--such as ABBA, U2, and Queen. In each case his algorithm was again able to sort the albums into the order in which they were recorded, despite being given no information other than the music.

It made a few interesting observations along the way. For instance, during the categorization process, it appeared to have made a mistake when it listed the Beatles’ twelfth and final album, Let It Be, before their eleventh, Abbey Road. However, as any Beatles fan knows, the majority of the songs on Let It Be were actually recorded at the start of 1969, before Abbey Road--making the algorithm correct.

The algorithm also proved accurate when it came to determining which songs by another band (Tears for Fears) were played by the original band members, and which were played by their replacements.

“It has the makings of a great music discovery tool,” Shamir says. “It could really help to make different artists’ work more accessible to everyone. There could well be some great musicians out there that you would really like, but whose work you’ve never been exposed to before.”

This is the business model behind a company like BookLamp, which combs through books and has proven capable of dividing them into genres or other subcomponents based on repeated keywords. (BookLamp, not coincidentally, was acquired by Apple for upwards of $10 million--most likely to serve as a competitor to Amazon’s algorithmic X-Ray service, which lets readers see at which points characters or terms appear in a book.)

By turning these tasks of categorization over to an algorithm, a lot of the flaws of human-based review systems like Yelp could be avoided.

The Computer Science Of Hit Songs

All of this skirts around a bigger issue we are just seeing the early stages of: algorithms being used in the creative process. While an increasing number of jobs--from legal work to medical diagnosis--are now routinely carried out by algorithms, creativity is an area that is often thought to be “safe” from automation. Not according to everyone, though.

Several years ago Jason Brown, a professor in the Department of Mathematics and Statistics and the Faculty of Computer Science at Dalhousie University, extracted several mathematical patterns from the music of (once again) the Beatles, and used what he had learned to record some soundalike songs.

More recently, the Iamus project of computer scientist Francisco Vico composed more than 1 billion songs in a wide range of genres by using the power of algorithms. Architect Celestino Soddu similarly uses genetic algorithms to create endless unique and unrepeatable designs by simply entering the “rules” that define a certain type of building or style.

By looking at the patterns in songs--ranging from tempo and time signature to harmonic simplicity--algorithms could be used to help decide whether to sign new pop acts. They could even cut out the middleman completely by taking the components of a hit song on board and generating music by themselves. For example, Lior Shamir feels that over time it should be possible for his algorithm to create new tracks that sound like offcuts from, say, the Beatles’ Revolver album.
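To be clear about how speculative that A&R scenario is: nothing in Shamir’s study builds a hit predictor. But the mechanics are not exotic. A label with a labeled back catalog could, in principle, fit something as plain as a logistic regression over a handful of song features--the sketch below uses entirely invented features, numbers, and labels, just to show the shape of the idea.

```python
# Entirely hypothetical hit-prediction sketch: the features, values,
# and labels below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per song: [tempo_bpm, beats_per_bar, harmonic_simplicity_score]
X = np.array([[118, 4, 0.82],
              [ 96, 4, 0.74],
              [140, 3, 0.35],
              [ 72, 4, 0.61]])
y = np.array([1, 1, 0, 0])  # 1 = charted, 0 = didn't (made-up labels)

model = LogisticRegression().fit(X, y)
# Score an unsigned act's demo track on the same three features.
print(model.predict_proba([[112, 4, 0.78]])[0, 1])
```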

“Theoretically it’s certainly possible,” he says. “The problem right now is the amount of computing power you would need to let the computer do that kind of composition.” That doesn’t mean he’s content to wait for Moore’s Law to catch up, though.

“One way we’re trying to get around this is to create a vector representation of the music in MIDI form,” he continues. “The structure of a MIDI file is much easier for a computer to work with, and reduces the complexity by several orders of magnitude.”
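He doesn’t spell out the representation, but a piano roll is one common way to turn MIDI into a matrix a program can crunch. The sketch below uses the pretty_midi library as an assumed stand-in--not a tool Shamir names--to show roughly what such a reduction looks like.

```python
# Illustrative sketch: pretty_midi is an assumed choice, not a tool
# named by Shamir.
import pretty_midi

def midi_to_piano_roll(path, fs=10):
    """Render a MIDI file as a piano-roll matrix of 128 pitch rows by
    time columns sampled at `fs` frames per second -- a far more
    compact structure than raw audio for a computer to compose with."""
    pm = pretty_midi.PrettyMIDI(path)
    return pm.get_piano_roll(fs=fs)  # numpy array, shape (128, n_frames)
```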

This subject often provokes strong reactions, because it’s perceived as taking away a fundamentally human part of the creative process. After all, if music labels could continue generating new songs in the style of popular artists for relatively little money, why risk hiring unproven (human) acts?

Why This Doesn’t Mean The End (For Humans)

But fortunately not everyone sees it this way. In fact, there are plenty of examples of algorithms being used as a part of the creative process that can help people--whether this means getting unlikely movies funded in Hollywood, or pushing artists to create bold new works.

Epagogix is an example of the former: it works with some of the biggest studios in Hollywood, using a neural network to forecast box office numbers.

“We produce a number very, very early in the development process, and this number can then be taken into account when a studio is budgeting a particular film,” says CEO Nick Meaney. “Based on our forecast, it might be that a certain actor is hired or not hired, for instance, because we’ve predicted that while the movie might be able to make a profit, it will always have a certain ceiling.” In this way, studios can start off by working out how much money a film will make, and then reverse-engineer it to ensure it finds its optimal audience.

Alexis Kirke is another believer in the way humans and algorithms can sit side by side in the creative process. A permanent research fellow in music at the Interdisciplinary Centre for Computer Music Research at the U.K.’s Plymouth University, Kirke thinks pattern-spotting can only be a good thing for broadening the scope of musical exploration.

“The algorithm is not replacing us, it is enhancing us,” he says. “It is not cheating to use the algorithm, because it won’t take long for the most skilled writers and composers to push the algorithm beyond its original intended use, leaving the average ‘cheater’ creatively behind.”

[Image: Flickr user Andy Doyle]





6 Comments

  • tom

    It got it wrong (but mostly right)....The White Album was BEFORE Yellow Submarine, not after - however, YS used a bunch of older songs on it, with just FOUR new ones....I wonder if it would work from SONG to SONG, rather than album to album.....?

  • Mark Glomski

    Maybe now we can settle the question of "did Ringo play drums on all those songs?"

  • Mark Harrison

    What's to settle? No, he did not. Paul sat in for him more than once.

  • Bill Willard

    I'm pretty sure Jim Keltner played on a few tunes as well in the last days.

  • Olga Gulan

    I find that although most popular music is based on the same chord progressions and all the components that are given in the article above, there will always be room for change and innovation in music. I'm not so glad that algorithms are now used for creating or analyzing music; just go back in time and listen to all the music created before the 21st century and compare the value of genius music created entirely by humans. The importance of composing and even being a musician has dropped drastically with computers. The technology and algorithms are fascinating, I admit, but personally I disapprove of computers taking over human creativity.

  • I agree that there will always be room for human agency in creativity, but the question concerns where the line between humans/technology should be drawn. After all, a flute is technology, and it imposes its own limitations/possibilities on the creative process.