Secret mathematical patterns revealed in Bach’s music

Physicists found that the music of Johann Sebastian Bach contains mathematical patterns that help convey information

SCIENTIFIC AMERICAN

Baroque German composer Johann Sebastian Bach produced music that is so scrupulously structured that it’s often compared to math. Although few among us are emotionally affected by mathematics, Bach’s works—and music in general—move us. It’s more than sound; it’s a message. And now, thanks to tools from information theory, researchers are starting to understand how Bach’s music gets that message across.

By representing scores as simple networks of dots, called nodes, connected by lines, called edges, scientists quantified the information conveyed by hundreds of Bach’s compositions. An analysis of these musical networks published on February 2 in Physical Review Research revealed that Bach’s many musical styles, such as chorales and toccatas, differed markedly in how much information they communicated—and that the musical networks contained structures that could make their messages easier for human listeners to understand.
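The network representation described above can be sketched in a few lines of Python. This is a simplified illustration, not the study's actual pipeline: the melody is a made-up toy sequence, and treating each pitch as a node with directed, count-weighted edges between consecutive notes is one plausible reading of "dots connected by lines."

```python
from collections import defaultdict

def build_note_network(notes):
    """Build a directed transition network from a note sequence.

    Nodes are pitches; an edge (a, b) records how often note a
    is immediately followed by note b.
    """
    edges = defaultdict(int)
    for a, b in zip(notes, notes[1:]):
        edges[(a, b)] += 1
    return dict(edges)

# Hypothetical toy melody, not an actual Bach excerpt:
melody = ["C", "E", "G", "E", "C", "G", "C", "E"]
network = build_note_network(melody)
print(network)  # {('C', 'E'): 2, ('E', 'G'): 1, ('G', 'E'): 1, ...}
```

Once a score is reduced to such a network, its structure (how many distinct transitions exist, and how evenly they are used) becomes something that can be measured and compared across pieces.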

“I just found the idea really cool,” says physicist Suman Kulkarni of the University of Pennsylvania, lead author of the new study. “We used tools from physics without making assumptions about the musical pieces, just starting with this simple representation and seeing what that can tell us about the information that is being conveyed.”

Researchers quantified the information content of everything from simple sequences to tangled networks using information entropy, a concept introduced by mathematician Claude Shannon in 1948.

As its name suggests, information entropy is mathematically and conceptually related to thermodynamic entropy. It can be thought of as a measure of how surprising a message is—where a “message” can be anything that conveys information, from a sequence of numbers to a piece of music. That perspective may feel counterintuitive, given that, colloquially, information is often equated with certainty. But the key insight of information entropy is that learning something you already know isn’t learning at all.
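Shannon's measure of surprise can be made concrete with a short sketch. Assuming symbols are drawn independently with probabilities estimated from their frequencies (a simplification of how entropy would be applied to real music), a perfectly predictable message carries zero bits, while a message whose symbols are all equally likely carries the maximum:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("AAAAAAAA"))  # 0.0 — nothing surprising, no information
print(shannon_entropy("ABCDABCD"))  # 2.0 — four equally likely symbols
```

The repeated "A" message tells a listener nothing they couldn't already predict, which is exactly the sense in which learning something you already know isn't learning at all.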