
Thousands of genres and millions of songs can all be described by just three attributes, according to research from several leading universities.
The finding comes from a major research effort conducted by McGill University, the University of Cambridge, Rutgers University, City University of New York, and the Stanford Graduate School of Business. The academic group, led by music psychologist David Greenberg and music researcher Daniel J. Levitin, found that genre labels and descriptive adjectives often overlap and fall short when it comes to accurately describing music.
After extensive statistical analysis involving 102 different pieces of music across 26 genres and sub-genres, the research team distilled the attributes listeners use to describe music into three clusters: Arousal, Valence, and Depth. According to the study, every piece of music, from any era, style, or origin, can be accurately described using those three areas.
These areas are more fully described as follows:
(1) Arousal describes intensity and energy in music;
(2) Valence describes the spectrum of emotions in music (from sad to happy);
(3) Depth describes intellect and sophistication in music.
The research team also discovered that characteristics describing music within a single genre (including both rock and jazz) could be accurately broken down into the same three categories. But at the edges of any genre, things can be very difficult to classify, which is why most music fans have trouble defining their taste in music.
Indeed, the finding explains why most listeners prefer music that spans multiple genres, radically different styles, and divergent eras. As an example, a lower-intensity chill dubstep track might rank low on the Arousal scale, provoke a mellow emotional Valence, and offer only moderate intellectual sophistication. Those levels could match an acoustic coffeehouse-style song, which helps explain why one person likes two seemingly different styles. Even a laid-back trap song could provoke similar rankings.
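To make that comparison concrete, here is a minimal Python sketch of the idea; the track names and the 0-to-1 scores are illustrative assumptions, not figures from the study.

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class TrackProfile:
    """A track scored on the study's three dimensions, on an assumed 0-1 scale."""
    title: str
    arousal: float  # intensity and energy
    valence: float  # emotional tone, from sad (0.0) to happy (1.0)
    depth: float    # intellect and sophistication

def distance(a: TrackProfile, b: TrackProfile) -> float:
    """Euclidean distance in the three-attribute space; smaller means more alike."""
    return sqrt((a.arousal - b.arousal) ** 2
                + (a.valence - b.valence) ** 2
                + (a.depth - b.depth) ** 2)

# Hypothetical scores: a chill dubstep track and an acoustic coffeehouse song
# land close together in this space even though their genre labels differ.
chill_dubstep = TrackProfile("chill dubstep track", arousal=0.30, valence=0.50, depth=0.50)
acoustic_song = TrackProfile("acoustic coffeehouse song", arousal=0.25, valence=0.55, depth=0.50)
arena_rock = TrackProfile("arena rock anthem", arousal=0.90, valence=0.80, depth=0.40)

print(distance(chill_dubstep, acoustic_song))  # small: similar listening profile
print(distance(chill_dubstep, arena_rock))     # large: very different profile
```

On this view, two tracks from unrelated genres can still be nearest neighbors, which is exactly the cross-genre overlap described above.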
“Genre labels are informative but we’re trying to transcend them…”
All of that suggests that genres may be outdated and, more importantly, tied to societal and surface factors that aren’t directly related to the core music itself. “The findings suggest that this may be a useful alternative to grouping music into genres, which is often based on social connotations rather than the attributes of the actual music,” the research group relayed. “It also suggests that those in academia and industry (e.g. Spotify and Pandora) that are already coding music on a multitude of attributes might save time and money by coding music around these three composite categories instead.”
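For a service already tagging tracks on dozens of attributes, that suggestion amounts to collapsing the tags into three composite scores. The sketch below is only an illustration of the idea: the attribute names, the groupings, and the plain averaging are assumptions, not the factor-analysis procedure the researchers actually used.

```python
# Illustrative groupings of fine-grained tags (all assumed to be on a 0-1 scale)
# into the study's three composite categories.
ATTRIBUTE_GROUPS = {
    "arousal": ["energy", "loudness", "tempo", "distortion"],
    "valence": ["happiness", "warmth", "fun"],
    "depth": ["complexity", "lyrical_sophistication", "thoughtfulness"],
}

def composite_scores(tags: dict) -> dict:
    """Average whichever fine-grained tags are present into the three categories."""
    scores = {}
    for category, attributes in ATTRIBUTE_GROUPS.items():
        values = [tags[name] for name in attributes if name in tags]
        scores[category] = sum(values) / len(values) if values else 0.0
    return scores

# A track tagged on many attributes reduces to just three numbers.
print(composite_scores({"energy": 0.2, "tempo": 0.3, "happiness": 0.6, "complexity": 0.5}))
# {'arousal': 0.25, 'valence': 0.6, 'depth': 0.5}
```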
At best, genres are clumsy tools for describing music; at worst, they limit our ability to accurately and completely describe a piece of music. “Genre labels are informative but we’re trying to transcend them and move in a direction that points to the detailed characteristics in music that are driving people’s preferences and emotional reactions,” Greenberg said.
The full study is available here (paywall protected).
Image by Smabs Sputzer, licensed under Creative Commons Attribution 2.0 Generic (CC BY 2.0).
Also, all humans can be categorized by just three attributes:
1. dumb or dumber
2. hot or not
3. love or money ( © Prince, R.I.P.)
A Taoist might say: “If we look at similarities, everything is the same. If we look at differences, everything is different.”
In fact, he just did.
Actual musical attributes may do even better than genre or fuzzy notions like “sophistication” or “depth.” A hundred years ago, musicologists thought that organizing music by the mechanism used to produce the sounds might be a good idea. In the age of electronics, however, that became meaningless and even impossible. Genre has been too polluted by marketers simply creating labels to distinguish products or convey a message or brand to a market, so it’s also meaningless.
F major, triple meter, cross-rhythms, preponderance of octatonicism, and myriad other melodic, harmonic, rhythmic and sonic constructs and functions exist to aptly describe and categorize music. Yet they remain virtually unknown to those who would create useful categorization algorithms.
Some future generation may discover them again, though.
Think about this: The first two describe the state of mind of a listener, not the music they’re listening to. Therefore, they are utterly useless.
One person’s arousal is another’s deathly boredom, even when listening to exactly the same music.
Useful and helpful.