Research Portfolio

Learning and Memory for Cadences

Top: Half Cadence—Haydn, String Quartet in F, Op. 76/2, i, mm. 15–20. Bottom: The cadential “tree of life,” estimated by the neighbor-joining method for the 245 cadences from a corpus of Haydn’s string quartets and then partitioned into the five cadence categories from Caplin’s typology (1998).

How do we know a passage of music is coming to an end? According to researchers in the learning sciences, much of what we understand about the conventionalized ending formulæ that characterize music of the classical style, what theorists have for centuries called cadences, is acquired implicitly over the course of many years. This project examined that claim using corpus-analytic and experimental techniques, drawing on theories of implicit (statistical) learning, schema theory, and expectation. The figure above shows that a computational model simulating the behavior of listeners can replicate Caplin’s cadence typology: by adapting algorithms from biology that visualize the genetic similarities between species as phylogenetic trees, it classifies a corpus of cadences from Haydn’s string quartets into a cadential “tree of life.”
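For readers curious about the mechanics, the sketch below shows how a neighbor-joining tree can be built from a matrix of pairwise distances between cadences using Biopython. The cadence labels and distance values are hypothetical placeholders; the feature encoding and distance measure used in the actual study are not reproduced here.

```python
# Minimal sketch: build a neighbor-joining tree from a (hypothetical) matrix of
# pairwise distances between cadences, then print a text rendering of the tree.
# Requires Biopython (pip install biopython).
from Bio import Phylo
from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor

# Hypothetical cadence labels (PAC = perfect authentic, IAC = imperfect
# authentic, HC = half cadence, DC = deceptive cadence).
names = ["PAC_1", "PAC_2", "IAC_1", "HC_1", "DC_1"]

# Lower-triangular distance matrix (diagonal included): illustrative distances
# between feature vectors encoding the content of each cadence.
matrix = [
    [0.0],
    [0.1, 0.0],
    [0.4, 0.5, 0.0],
    [0.8, 0.7, 0.6, 0.0],
    [0.9, 0.8, 0.7, 0.3, 0.0],
]

dm = DistanceMatrix(names, matrix)
tree = DistanceTreeConstructor().nj(dm)   # neighbor-joining, as in the figure
Phylo.draw_ascii(tree)                    # text rendering of the "tree of life"
```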

Publications

Sears, David. 2016. The Classical Cadence as a Closing Schema: Learning, Memory, and Perception. PhD Dissertation, McGill University. pdf

Sears, David. 2015. “The Perception of Cadential Closure.” In What is a Cadence? Theoretical and Analytical Perspectives on Cadences in the Classical Repertoire, edited by Markus Neuwirth and Pieter Bergé, 251–83. Leuven: Leuven University Press. pdf

Sears, David, William E. Caplin, and Stephen McAdams. 2014. “Perceiving the Classical Cadence.” Music Perception 31 (5): 397–417. pdf

Sears, David, Marcus T. Pearce, William E. Caplin, and Stephen McAdams. Submitted. “Simulating Melodic and Harmonic Expectations for Tonal Cadences Using Probabilistic Models.”


Psychophysiological Responses to Music

Participants listening to a string quartet at Tanna Schulich Hall in a psychophysiology experiment conducted using the Music Perception and Cognition Lab’s Audience Response System.

Psychophysiological measures like skin conductance, blood volume pulse, and electromyography of the facial musculature provide direct, objective evidence of autonomic and somato-visceral activation, so researchers often use the signals derived from these measures as converging evidence for emotional experience. In collaboration with members of the Music Perception and Cognition Lab, I designed several experiments to examine the psychophysiological correlates of emotional experience during music listening. I also wrote a set of Matlab functions to pre-process and visualize biosignal data, included here.
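The original pre-processing functions were written in Matlab and are linked above. Purely as an illustration of the kind of steps involved, the Python sketch below low-pass filters a synthetic skin-conductance signal and plots the raw and filtered versions; the sampling rate and filter settings are assumptions for the example, not the lab’s actual parameters.

```python
# Illustrative analogue of a typical biosignal pre-processing step:
# zero-phase low-pass filtering of a skin-conductance signal.
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import butter, filtfilt

def preprocess_scr(signal, fs, cutoff_hz=1.0, order=4):
    """Low-pass filter a skin-conductance signal to remove high-frequency noise."""
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, signal)  # filtfilt avoids phase distortion

# Synthetic example: 60 s of noisy skin conductance sampled at 256 Hz
fs = 256
t = np.arange(0, 60, 1 / fs)
raw = 5 + 0.5 * np.sin(2 * np.pi * 0.05 * t) + 0.1 * np.random.randn(t.size)

smoothed = preprocess_scr(raw, fs)

plt.plot(t, raw, alpha=0.4, label="raw")
plt.plot(t, smoothed, label="filtered")
plt.xlabel("Time (s)")
plt.ylabel("Skin conductance (µS)")
plt.legend()
plt.show()
```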

Publications

Chuen, Lorraine, David Sears, and Stephen McAdams. 2016. “Psychophysiological Responses to Auditory Change.” Psychophysiology 53 (6): 891–904. pdf

Trochidis, Kostas, David Sears, Diêu-Ly Trân, and Stephen McAdams. 2013. “Psychophysiological Measures of Emotional Response to Romantic Orchestral Music and their Musical and Acoustic Correlates.” In From Sounds to Music and Emotions, edited by Mitsuko Aramaki, Mathieu Barthet, Richard Kronland-Martinet, and Sølvi Ystad, 44–57. Berlin: Springer Berlin Heidelberg. pdf

Ogg, Mattson, David Sears, Manuela Marin, and Stephen McAdams. Submitted. “Psychophysiological Indices of Music-Evoked Emotions in Musicians.”


Sensorimotor Synchronization

Dancers performing Swan Lake (link).

Humans demonstrate a remarkable ability to synchronize to a beat: during music listening we clap, tap, and dance with relative ease, yet the perceptual and cognitive mechanisms underlying beat synchronization are not well understood. In collaboration with researchers in music technology and information sciences, I investigated beat salience (a measure of the perceptual prominence of the beat) in both perceptual and motor-synchronization contexts, using electroacoustic music exhibiting a range of beat-salience levels.
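In the studies below, beat salience was assessed with listener judgments. Purely as an illustration, the sketch below computes a crude computational proxy for it with librosa: how much the onset-strength envelope at the tracked beat positions stands out from its average level. The function name and the ratio used are my own illustrative choices, not a measure taken from these papers.

```python
# Rough computational proxy for beat salience: ratio of onset strength at the
# tracked beat positions to the mean onset strength over the whole excerpt.
import librosa

def beat_salience_proxy(path):
    y, sr = librosa.load(path)
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    tempo, beats = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)
    # Larger values suggest beats that stand out more from the surrounding activity.
    return float(onset_env[beats].mean() / (onset_env.mean() + 1e-9))

# Example (hypothetical file name):
# print(beat_salience_proxy("excerpt.wav"))
```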

Publications

Weigl, David M., David Sears, Jason A. Hockman, Stephen McAdams, and Catherine Guastavino. 2013. “Investigating the Effects of Beat Salience on Beat Synchronization Judgements During Music Listening.” Paper presented at the Society for Music Perception and Cognition (SMPC), Toronto, Ontario, August 8–11.

Weigl, David M., David Sears, and Catherine Guastavino. 2013. “Examining the Reliability and Predictive Validity of Beat Salience Judgements.” Poster presented at the Cognitively Based Music Informatics Research Conference (CogMIR), Toronto, Ontario, August 7.


Popular Music Analysis

A screenshot of the MarPanning stereo panning spectrogram of Radiohead’s “Paranoid Android.”

A number of scholars have commented on the inadequacy of Western notation for representing parameters that remain essential to the composition and reception of popular recorded music, such as dynamics, tempo, texture, timbre, and spatial position. To visualize a few of these parameters, Moore and Dockwray conceptualized a stereophonic track in terms of the ‘sound-box,’ a four-dimensional virtual space whose dimensions are stereo space, pitch (frequency), loudness, and time. To visualize this sound-box space with signal-processing methods, I applied a technique proposed by Avendano that extracts amplitude-panning information from a stereophonic recording using spectrograms of the left and right channels. Tzanetakis et al. recently created an elegant animation of Avendano’s equations in the Marsyas framework, which they call MarPanning. The figure above presents a screenshot from the MarPanning animation of Radiohead’s “Paranoid Android.” To see a complete animation of the first two minutes, click here.
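As a rough illustration of the panning-analysis idea, the sketch below compares left- and right-channel spectrograms and maps each time-frequency bin to a value between -1 (panned hard left) and 1 (panned hard right), following the spirit of Avendano’s similarity measure. The window size and the exact mapping are simplified assumptions, and MarPanning itself is implemented in the Marsyas framework rather than in Python.

```python
# Simplified panning analysis: compare left- and right-channel spectrograms and
# assign each time-frequency bin a panning value in [-1, 1].
import numpy as np
from scipy.signal import stft

def panning_spectrogram(left, right, fs, nperseg=2048):
    _, _, XL = stft(left, fs=fs, nperseg=nperseg)
    _, _, XR = stft(right, fs=fs, nperseg=nperseg)

    # Similarity measure: 1 when a bin has equal energy in both channels
    # (centre-panned), approaching 0 when it appears in only one channel.
    eps = 1e-12
    psi = 2 * np.abs(XL * np.conj(XR)) / (np.abs(XL) ** 2 + np.abs(XR) ** 2 + eps)

    # The dominant channel determines the left/right direction of each bin.
    direction = np.sign(np.abs(XR) - np.abs(XL))
    return (1 - psi) * direction  # panning index per time-frequency bin
```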

Publications

Sears, David. 2013. “Transcribing the Sound-box: Perceptual Principles and Compositional Choices.” Paper presented at the International Conference on Analyzing Popular Music (popMac), Liverpool, United Kingdom, July 2–4. pdf