Digital Instrumentality and Automation

[Thursday @ 4:00pm – 5:00pm, Room 9]

Daniele Shlomit Sofer

“Reason and Measurement in Automated Music Analysis”


Abstract

Many of music’s technologies began with other intended applications. The hand-cranked, grooved metal cylinder and stylus employed to produce sound in Edison’s early phonograph were inspired by myograph instruments used in science and medicine. The phonograph’s predecessor, the kymograph, was first used to cut into the physical bodies of animals to measure heart rate, blood flow, and other functions of life (Mundy 2018). Nineteenth-century psychiatrists used acceleromyographs in human trials to monitor heart rates, physical convulsions, and vocal amplitude in sexually aroused women to diagnose the now largely dismissed malady of hysteria (Lomas 2012; Sofer 2018). Ethnomusicologists, too, employed tools like the HP6A tone arm and magnetic pickup from Harvard’s Collection of Historical Scientific Instruments (CHSI) to capture evidence of sounds made by Indigenous populations and otherwise ethnically and socially foreign musical phenomena. Just as vivisection—cutting into the physical bodies of animals—made scientists feel closer to finding the meaning and source of life, the graphic trace employed in human trials and later in music presented a semblance of reality more believable than human observation alone.

Presenting an overview of historical instruments such as the kymograph, acceleromyograph, and the HP6A, this paper tracks a parallel history of electronically distilled graphics in both medicine and music. The paper identifies sexual pathology and scientific racism as predecessors to today’s computational analysis of music and sound in such wide-reaching applications as algorithmic optimization and recommender systems for streaming services, the voice spectrogram analysis employed by the Department of Justice (Lorenzen 1980), and the robot-assisted acoustic measurement performed by first responders (Martinsson 2022). Like the early instruments of science and medicine, law enforcement, government agencies, and corporations employ automating tools to quantify sonic and physical aspects as proof of more abstract phenomena like emotion, sensation, and sonic perception. The talk illuminates leaps in logic between the measurement and the interpretation of scientific data to show how historical assumptions about the validity of measurement come at the expense of critical evaluation and ethical transparency in automated musical applications.

Biography

Dr. Daniele Shlomit Sofer (they/them) is Assistant Professor of Music Theory and Music Technology at the University of Dayton and co-founder of the LGBTQ+ Music Study Group. As a researcher, they examine various means of making music with electronic mediation through a lens of gender, sexuality, and race. Their monograph on this subject, Sex Sounds: Vectors of Difference in Electronic Music, was published by MIT Press in 2022. They earned a BA in music performance and honors at SUNY New Paltz, completed degrees at two other SUNY schools, and earned a PhD in musicology at Kunstuniversität Graz in Austria. Their current research investigates how musical technologies inherit and express racial and gendered biases.

Andrew McPherson

“Digital Musical Instrument Design as Critical Engineering Practice” 


Abstract

Musical instruments carry many identities, depending on one’s disciplinary perspective: an instrument acts as a transducer from action to sound, as an extension of the mind and body, and as a product (and co-producer) of a cultural environment. It might be tempting to organise these views as progressive stages of enlightenment, widening the lens from mere technical objects to prostheses to broadly defined “contexts for musicking” (Waters, 2021; Small, 1998). However, where design is a co-equal goal with analysis, the loop must be closed such that ecological insights inform future technical practice.

Every technology contains inscribed values, obvious and subtle (Akrich, 1992). Sensitive and entrepreneurial engineers have long shaped electronic music, from Robert Moog (Pinch and Trocco, 2004) to Roger Linn (Morrison, 2022) to Miller Puckette and David Zicarelli (Snape and Born, 2022), to name just a few. MIDI, codified by an industrial consortium reflecting the computational constraints of the 1980s, continues to exert global influence through ideology imported from the piano keyboard (Diduck, 2018; Dolan, 2012). Meanwhile, academic research in instrument design grapples with tensions between its scientific roots (Gurevich, 2016; Born, 2022) and emergent epistemological and political issues (Hayes and Marquez-Borbon, 2020; Morreale et al., 2020), while struggling with the uptake and longevity of new instruments (Morreale and McPherson, 2017).

This talk asks how engineering and design should best participate in the emerging anthropological and ecological discourse around instruments. What, ultimately, should a critically engaged designer do differently? I propose that instruments should not be viewed as self-contained objects at all, but as entanglements: complex and irreducible webs of relationships between humans and things (Frauenberger, 2019; Waters, 2021). What is then needed is a theory of entanglement design, a situated techno-cultural practice that uses engineering as one tool amongst many to create and perturb entanglements without uncritically adopting a scientific logic that aspires to objectivity through careful isolation.

The talk will present early work on RUDIMENTS, a recently commenced five-year ERC/UKRI-funded project. I will discuss case studies of value discovery in existing music technologies (Lepri and McPherson, 2022), and I hope to start a conversation on how we might gain foresight into the cultural implications of future engineering decisions.

Biography

Andrew McPherson is a musical instrument designer, composer, engineer and viola player. He is Chair (Professor) in Design Engineering and Music in the Dyson School of Design Engineering at Imperial College London. He leads the Augmented Instruments Laboratory, a research team investigating new musical instruments, performer-instrument interaction, embedded hardware systems, and the cultural implications of technology. His digital and augmented instruments are widely used by performers; for example, the magnetic resonator piano, an augmented acoustic piano, has been used in over 30 compositions, several commercial albums and a major film score. His research has led to two spinout companies following successful crowdfunding campaigns, including Bela, an open-source embedded hardware platform for creating rich and responsive interactive audio systems. He holds an ERC Consolidator Grant and a Senior Research Fellowship from the Royal Academy of Engineering.