Decoding the dynamic representation of musical pitch from human brain activity

N. Sankaran*, W. F. Thompson, S. Carlile, T. A. Carlson

*Corresponding author for this work

Research output: Contribution to journal › Article › Research › peer-review

13 Citations (Scopus)
34 Downloads (Pure)


In music, the perception of a pitch is governed largely by its tonal function within the preceding harmonic structure. While behavioral research has advanced our understanding of the perceptual representation of musical pitch, relatively little is known about its representational structure in the brain. Using magnetoencephalography (MEG), we recorded evoked neural responses to different tones presented within a tonal context. Multivariate pattern analysis (MVPA) was applied to "decode" the stimulus that listeners heard based on the underlying neural activity. We then characterized the structure of the brain's representation using decoding accuracy as a proxy for representational distance, and compared this structure to several well-established perceptual and acoustic models. The observed neural representation was best accounted for by a model based on the Standard Tonal Hierarchy, whereby differences in the neural encoding of musical pitches correspond to differences in their perceived stability. By confirming that perceptual differences mirror those in the underlying neuronal population coding, our results provide a crucial link in understanding the cognitive foundations of musical pitch across psychological and neural domains.
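The analysis logic described above — pairwise decoding accuracies between stimuli serving as a neural dissimilarity matrix, which is then compared against a candidate model — can be sketched as follows. This is a hypothetical illustration, not the authors' code: the simulated data, the choice of a logistic-regression classifier, and the placeholder model matrix are all assumptions for demonstration.

```python
# Hypothetical sketch: pairwise decoding accuracy as representational
# distance, rank-correlated with a model dissimilarity matrix.
# All data below are simulated; stand-ins for real MEG patterns and
# for a perceptual model such as the Standard Tonal Hierarchy.
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_tones, n_trials, n_features = 4, 40, 20  # e.g. 4 probe tones, MEG sensor features

# Simulated single-trial response patterns, one (trials x features) array per tone
patterns = [rng.normal(i * 0.1, 1.0, (n_trials, n_features)) for i in range(n_tones)]

# Pairwise decoding: cross-validated accuracy for each tone pair
# becomes the entry of a symmetric neural dissimilarity matrix.
neural_rdm = np.zeros((n_tones, n_tones))
for i, j in combinations(range(n_tones), 2):
    X = np.vstack([patterns[i], patterns[j]])
    y = np.array([0] * n_trials + [1] * n_trials)
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    neural_rdm[i, j] = neural_rdm[j, i] = acc

# Compare neural structure to a model matrix over the upper triangle only,
# using a rank correlation (Spearman) as in typical RSA-style comparisons.
model_rdm = rng.random((n_tones, n_tones))  # placeholder for a perceptual model
iu = np.triu_indices(n_tones, k=1)
rho, p = spearmanr(neural_rdm[iu], model_rdm[iu])
print(round(float(rho), 3))
```

In practice one would compute such a matrix at each MEG time point, yielding a time-resolved picture of how the neural representation unfolds, and compare several candidate models rather than one.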

Original language: English
Article number: 839
Journal: Scientific Reports
Issue number: 1
Publication status: Published - 1 Dec 2018
Externally published: Yes


