
Decoding refers to the extraction of meaning, and in music meaning comes at a number of different levels. First, in the case of precomposed, notated music, there is the meaning of the notes on a page. The score has to be decoded for a musician to know how to play the piece that has been set down in front of him/her. Separate decoding processes are thought to occur for notated pitch and notated rhythm. Second, there is the meaning of the musical progressions, or structure. This sort of decoding helps the performer to make musical decisions and the listener to make sense of the work, though many large-scale structural markers are difficult to appreciate without detailed notational analysis. Taking a different perspective, the notion of decoding can also be related to emotion. Musical works are often attributed an emotional valence, such as joy or sadness, which is assumed to be perceptible to the majority of the listening audience. Nonetheless, performances of the same work can sound very different due to a performer's choice of emotional expression. The decoding of music thus includes notation and structure, while the decoding of emotion includes valence, arousal, and expressivity.

Decoding of Music

Reading from a musical score is a common task for many musicians. It consists of decoding a complex series of symbols composed of arbitrary shapes, arranged so that left/right spatial information conveys duration and up/down spatial information conveys pitch. The shape of a particular note bears no iconic relation to its duration, but the distance between notes conveys relative length; a larger gap after a note shows that it is played for longer. Furthermore, the vertical position of a note conveys its relative pitch, with a note closer to the top of the staff being higher.

Several theories of how music is read have been put forward, primarily suggesting that note sequences are grouped and compared to patterns held in long-term memory. Behavioral studies and eye-tracking technology have revealed a great deal about how people read music. It has been found that music reading is easier in a familiar genre, and that chunking notes into groups facilitates good sight-reading. Furthermore, how a note is decoded depends more on what it represents than on its visual presentation, with a two-beat note at a fast tempo being read in the same way as a one-beat note at a slow tempo. Nonetheless, the spacing and style of notation do have an effect. Perhaps most striking is the finding that musicians cannot help but read musical patterns, in much the same way that people cannot help reading the words that they see.

How are pitch and rhythm decoded? Research by Daniele Schön and Mireille Besson suggests that these core elements of music are decoded in different ways. Using electroencephalography (EEG), they found that rhythm decoding does not affect pitch decoding and vice versa, though pitch is decoded more quickly. Furthermore, pitch and rhythm are associated with different neural circuitry. Several case studies of musicians suffering a brain injury report a subsequent inability to read either pitch or rhythm (one being more strongly impaired than the other), indicating a degree of dissociation. Heschl's gyrus and the planum temporale are thought to be relevant to pitch processing; the study of rhythm impairment has revealed less clear-cut processing centers, though both sides of the cerebral cortex seem to be involved, as well as the cerebellum.

...
