In two experiments, we examined whether the facial expressions of singers affect judgments of musical structure. In Experiment 1, a performer was recorded singing each of three intervals. Participants were shown the visual recording (no sound) and judged the size of the interval the performer was (imagined to be) singing. Judgments were made under five conditions of occlusion: no occlusion; occlusion of the mouth; occlusion of the eyes and eyebrows; occlusion of the mouth, eyes, and eyebrows; and occlusion of the entire face (only head movements visible). The results indicated that participants could decode pitch relations from facial expressions alone. Examination of the occlusion conditions indicated that participants could differentiate intervals based on eyebrow movements alone, and even based on head movements. In Experiment 2, we recorded a musician singing thirteen versions of the last phrase of “Silent Night.” Versions differed in the pitch of the final tone, which was either the (expected) tonic of the song or one of the other tones of the chromatic scale, including the tonic one octave above the expected note of the song. Participants were shown the visual recordings of the performances (no sound) and judged the “goodness of fit” of the final note, as conveyed in the facial expressions of the singer. Mean ratings closely corresponded to the standard major tonal hierarchy, suggesting that the singer successfully communicated tonal structure through the use of facial expressions.
Title of host publication: Proceedings of the 9th International Conference on Music Perception and Cognition (ICMPC9)
Editors: M. Baroni, A. R. Addessi, M. Costa
Publisher: The Society for Music Perception and Cognition (SMPC)
Publication status: Published - 2006