Model
Digital Document
Publisher
Florida Atlantic University
Description
This dissertation studies the neural basis of song, a universal human behavior. The relationship between words and melodies in the perception of song was investigated at the phonological, semantic, melodic, and rhythmic levels of processing, using the fine temporal resolution of electroencephalography (EEG). The observations reported here may shed light on a ubiquitous human experience and also inform the debate over whether language and music share neural resources or recruit domain-specific neural mechanisms.

Experiment 1 was designed to determine whether the words and melody in song are processed interactively or independently. Participants listened to sung words in which the melodies and/or the words were similar or different, and performed a same/different task while attending to the linguistic and musical dimensions in separate blocks of trials. Event-related potentials (ERPs) and behavioral data converged in showing interactive processing between the linguistic and musical dimensions of sung words, regardless of the direction of attention. In particular, the N400 component, a well-established marker of semantic processing, was modulated by musical melody. The observation that variations in musical features affect lexico-semantic processing in sung language was a novel finding with implications for shared neural resources between language and music.

Experiment 2 was designed to test the idea that well-aligned text-settings, in which strong syllables occur on strong beats, capture listeners' attention and help them understand song lyrics. EEG was recorded while participants listened to sung sentences whose linguistic stress patterns were well-aligned, misaligned, or variably aligned with the musical meter, and performed a lexical decision task on subsequently presented visual targets.
Member of