
Immersive Conducting

This project began in 2019, when Professor of Music Heather Buchman and Instructional Designer and VR/AR Technologist Ben Salzman began collaborating on a method of using 360° spatial video to better prepare students in the conducting class for the auditory and visual experience of conducting an orchestra.


Heather Buchman conducting the Hamilton College orchestra

Alongside four others, including Anthony Reyes, Caius Arias, and James Reynolds, I was interested in developing a more realistic experience for students in the conducting class. Using the Unity game engine, we built a virtual environment from the 360° footage of the orchestra and developed a virtual baton that students could control while wearing VR headsets.


Things I Found Interesting

In hindsight, I think what really engaged me in the project was the idea of trying to capture the relationship between the conductor and the orchestra. My experience with conductors, prior to this project, had only ever been on the receiving end. I played oboe in an orchestra for ten years and played under several different conductors, though I never gave much thought to the conductor's role during that time.

For example, if a piece of music needs to be played delicately, the conductor communicates that through a particular set of gestures. Developing an environment that can understand and respond to those gestures was the central challenge of this project. 
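Our Unity build handled this in C#, but one piece of the idea can be sketched in Python. A common simplification (and an assumption here, not our exact implementation) is to treat the beat point, or ictus, as the moment the baton tip's vertical motion reverses from downward to upward:

```python
# Hypothetical sketch: finding beat points from baton-tip height samples.
# A beat is assumed to occur where vertical velocity flips from downward
# to upward with sufficient speed. The threshold and sample rate are
# invented values, not ones from our project.

def detect_beats(heights, dt=1.0 / 90.0, min_speed=0.2):
    """Return sample indices where the baton's vertical velocity crosses
    from clearly negative (moving down) to positive (moving up)."""
    beats = []
    prev_v = 0.0
    for i in range(1, len(heights)):
        v = (heights[i] - heights[i - 1]) / dt
        if prev_v < -min_speed and v > 0:
            beats.append(i)
        prev_v = v
    return beats
```

A real system would also need to smooth the tracking data and classify the gesture's character (sharp versus gentle), but the reversal test above is the core of turning continuous motion into discrete beats.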

Had we continued this project, these elements would have been my priorities:

  • Cuing: Separating the audio file into stems for the different sections of the orchestra (the brass, the strings, and so on). This would allow greater control over playback, as the virtual conductor could cue each section's entrance. It would not offer as much control as splitting the audio by individual instrument, but that would increase the logistical complexity of the playback.

  • Inputting New Music: Each recording needs to be accompanied by a timestamp array indicating the placement of the different beats. This is not particularly complex, as frameworks exist for beat tracking (the librosa Python module, for example). The challenge would be in communicating elements of the sheet music, such as time signature changes, to the Unity engine, as different time signatures are conducted with different gesture patterns.
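The cuing idea above could be modeled as a playback gate over section stems. The sketch below is a hypothetical illustration, not code from the project; the section names and mixer interface are invented:

```python
# Hypothetical sketch of section-level cuing: each orchestral section has
# its own audio stem, muted until the conductor cues it in.

class SectionMixer:
    SECTIONS = ("strings", "woodwinds", "brass", "percussion")

    def __init__(self):
        # All sections start muted; a cue gesture unmutes a section.
        self.active = {name: False for name in self.SECTIONS}

    def cue(self, section):
        if section not in self.active:
            raise ValueError(f"unknown section: {section}")
        self.active[section] = True

    def mix(self, stems):
        """Sum only the cued stems (stems: dict of section -> sample list)."""
        length = max(len(s) for s in stems.values())
        out = [0.0] * length
        for name, samples in stems.items():
            if self.active.get(name):
                for i, x in enumerate(samples):
                    out[i] += x
        return out
```

In a real engine the "mix" step would happen on the audio thread with fades rather than hard gates, but the structure is the same: cuing a section is just flipping its stem from silent to audible at the right beat.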
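For the second point, librosa's beat tracker can produce the timestamp array; the remaining step is pairing those timestamps with time signature information so the engine knows which beat of the bar each one is. A minimal sketch, with invented data and the simplifying assumption that every signature change lands on a downbeat:

```python
# Hypothetical sketch: labeling a beat-timestamp array (as a beat tracker
# like librosa's can produce) with bar and beat numbers, given a list of
# time signature changes. All data here is invented for illustration.

def label_beats(beat_times, signature_changes):
    """signature_changes: list of (beat_index, beats_per_bar) pairs.
    Returns (time, bar, beat_in_bar) tuples, bars and beats from 1.
    Assumes each signature change falls on a downbeat."""
    changes = dict(signature_changes)
    per_bar = 4  # assume 4/4 until the first listed change
    labeled = []
    bar, beat = 1, 1
    for i, t in enumerate(beat_times):
        if i in changes:
            per_bar = changes[i]
        labeled.append((t, bar, beat))
        beat += 1
        if beat > per_bar:
            bar += 1
            beat = 1
    return labeled
```

With this labeling in hand, the engine could select the appropriate conducting pattern (a four-beat pattern versus a three-beat pattern, say) for each bar, which is exactly the information a plain timestamp array lacks.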
