Grammys 2012: Would a computer have picked Adele?

Grammys 2012: Adele won six awards at Sunday's Grammys. But Adele should have lost to Bruno Mars, according to a computer algorithm that predicts which songs will be hits and which will be flops.

Toyota Motor Corp's partner robots play instruments at the company's showroom in Tokyo in 2008.

Toru Hanai/Reuters/File

February 13, 2012

Why do artists like Adele and Kelly Clarkson always find themselves topping the charts – and winning Grammy Awards – instead of bands like Mumford & Sons?

New research from a team at the University of Bristol's Intelligent Systems Laboratory, led by Dr. Tijl de Bie, aims to answer that question by employing computer software designed to automatically analyze audio tracks and evaluate their "hit potential."

The algorithm, based on the last 50 years of UK Top 40 singles, examines songs for 23 features, from loudness, tempo, and duration to energy and "danceability." The program estimates a given song's hit potential by comparing its feature values to those of historically successful and unsuccessful singles. The team found they could predict whether a song would make it into the British Top 5 or languish at the bottom of the charts with an accuracy of around 60 percent. That accuracy varies by era, however: in the years around 1980 in particular, the equation had trouble reliably picking hits, suggesting a particularly creative period in pop music.
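Conceptually, this is a standard machine-learning classification task: learn from past hits and flops, then score new songs. The sketch below shows what such a hit-or-flop classifier might look like; the synthetic data, feature weights, and scikit-learn pipeline are illustrative assumptions, not the Bristol team's actual code or feature set.

```python
# A minimal sketch of a hit-or-flop classifier of the kind described above.
# NOT the Bristol team's code; the data and feature effects are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in data: rows are songs, columns are audio features such as
# loudness, tempo, duration, energy, and danceability (23 in the study).
n_songs, n_features = 2000, 23
X = rng.normal(size=(n_songs, n_features))
# Stand-in labels: 1 = reached the Top 5, 0 = languished low on the chart.
y = (0.8 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.5, size=n_songs) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale the features, then fit a simple linear classifier that weighs each
# feature by how strongly it separates past hits from past flops.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

print(f"hit-or-flop accuracy: {model.score(X_test, y_test):.2f}")
```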

Sunday night, Adele won big, taking home six Grammy Awards, including Record of the Year. We put the 2012 Grammy Record of the Year nominees (which are selected by industry luminaries for outstanding quality) into the Bristol team's Score a Hit website. Here's how it ranked these songs:

Adele - "Rolling in the Deep"

Adele's single, according to the algorithm, is squarely in the "Hit" camp. The computer judges have given it a score of 7/12 on their "Songometer."

Bon Iver - "Holocene"

Sorry, Bon Iver. The folk act's quiet and understated "Holocene" likely scored low for danceability and volume; the University of Bristol boffins gave it a measly 2/12 Songometer units.


Mumford & Sons - "The Cave"

"The Cave" doesn't fare well either. The program awards the roots/folk-rock combo Mumford & Sons an embarrassing 3/12, betraying an inability to do "The Robot" dance to banjo picking.

Katy Perry - "Firework"

Another hit! Katy Perry's undeniably commercial "Firework" has been awarded 10/12 by the automaton judge, garnering Perry a strong second-place finish.

Bruno Mars - "Grenade"

"Funky," you can almost hear the judge-bots saying as they bob their clockwork heads to the strong beat and soaring vocal of Bruno Mars's sure hit. "Grenade" scores 10/12 Songometer units – and the robo-Grammy for Record of the Year.

The algorithm's results are shaped by historical trends, the Bristol researchers say. For example, as grandma could probably attest, the loudness of music, both hit and non-hit, has been increasing since the beginning of the data set in 1960, with a particularly large jump in the late nineties. Hit songs in the early 1960s were more likely to be harmonically simple, with complexity trending up sharply until the late 1960s and then slowly falling off through to the present day. Interestingly, danceability wasn't an important factor in a song's hit potential until the late 1970s and early 1980s.
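To give a sense of how such long-run trends might be pulled out of chart data, here is a rough sketch that averages features within five-year bins. The table of songs, its columns, and the synthetic values are hypothetical stand-ins, not the researchers' data or their analysis.

```python
# A rough illustration of era-by-era feature trends, using made-up data
# with 'year', 'loudness', and 'danceability' columns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
years = rng.integers(1960, 2011, size=5000)
songs = pd.DataFrame({
    "year": years,
    # Loudness drifts upward over the decades, as described above.
    "loudness": -15 + 0.15 * (years - 1960) + rng.normal(scale=2, size=5000),
    "danceability": rng.uniform(0, 1, size=5000),
})

# Average each feature within five-year bins to expose long-run trends.
songs["era"] = (songs["year"] // 5) * 5
print(songs.groupby("era")[["loudness", "danceability"]].mean().round(2))
```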

Not all hits conform to the software's preconceived notions, however, and Dr. de Bie notes that social factors can influence a song's sales.

For example, Michael Jackson's "Man in the Mirror" only reached No. 21 in 1988, but topped the charts in 2009 following news of the singer's death. "November Rain," a nine-minute epic from Guns N' Roses, reached No. 1 in 1992 despite its length, lack of danceability, and unconventional-for-the-time harmonic structure, possibly on the strength of its huge-budget music video.

Indeed, research suggests those social factors can matter a great deal.

Researchers have studied the role of peer influence in whether a song becomes a hit. In a study published in the journal Science in 2008, participants were invited to a website featuring 48 unknown songs by different bands. Different groups of users were presented with the music in different ways: some groups saw the songs in random order, while others saw them ranked with clear indicators of each song's popularity among other users, similar to the way Billboard organizes its Hot 100 singles list. In the groups exposed to indicators of the songs' popularity, the inequality in popularity between songs was greatly increased. More surprisingly, the unpredictability of which songs would become popular also increased dramatically.
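A toy simulation conveys the intuition behind that finding: when listeners can see what others have downloaded, popular songs snowball and the gap between winners and losers widens. The setup below (48 songs, download-weighted choices, a Gini-style inequality measure) is an assumption-laden sketch, not a reproduction of the study.

```python
# Toy simulation of social influence on song popularity (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n_songs, n_listeners = 48, 5000

def run(social_influence: bool) -> np.ndarray:
    downloads = np.ones(n_songs)  # start every song with one download
    for _ in range(n_listeners):
        if social_influence:
            p = downloads / downloads.sum()    # popular songs attract more picks
        else:
            p = np.full(n_songs, 1 / n_songs)  # every song equally likely
        downloads[rng.choice(n_songs, p=p)] += 1
    return downloads

def gini(x: np.ndarray) -> float:
    # Inequality measure: 0 = perfectly even, values near 1 = winner-take-all.
    x = np.sort(x)
    n = len(x)
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

print("inequality without social cues:", round(gini(run(False)), 2))
print("inequality with social cues:   ", round(gini(run(True)), 2))
```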

Of course, the Grammy Awards are chosen by industry pundits, not music buyers. But if algorithms like the one developed at Bristol's Intelligent Systems Laboratory could accurately predict commercial success, what would that mean for an industry increasingly focused on market research and focus-group testing? Could computer models like this one day be used to judge the commercial viability of potential pop releases, and what effect would that have on the creativity and inventiveness of major releases? These are difficult questions to answer, but as tools like these become more commonplace, they may matter more and more to the music industry.
