Flock to Music
Flock to Music uses custom modeling software to imitate the behavior of complex natural phenomena (such as flocking). This data is then interpreted into a variety of musical parameters: frequency, rhythm, amplitude, chords, modes and timbral combination.
My music has been inspired by nature and human behavior for forty years. With Flock to Music, I am analyzing complex natural and social behaviors and extracting data that I can then use to compose new works. My first project explored the complex natural behavior of “flocking”, but a composition could just as well interpret urban data, such as traffic flow in different demographic areas. These phenomena are interpreted through data analysis or data models, and the results are used to control specific musical parameters.
The interpretation of data is expressed in several aspects of Live Structures compositions: in the overall structure of the work, in the audible shapes and gestures of the music, and in the details of the orchestration, the pitches and the rhythm. The same data also controls the Max/MSP tool when live processing is used (Live Textures).
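To make the idea concrete, here is a minimal sketch of how flocking data can drive musical parameters. The flocking model is a standard boids-style simulation, and the mapping (flock centroid height to pitch, flock spread to amplitude) is purely illustrative; the function names, coefficients, and mappings below are assumptions for this sketch, not the project's actual software.

```python
import math
import random

def flock_step(positions, velocities, cohesion=0.01, separation=0.05,
               alignment=0.05, min_dist=1.0):
    """One update of a minimal 2-D boids-style flocking model."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n          # flock centroid
    cy = sum(p[1] for p in positions) / n
    avx = sum(v[0] for v in velocities) / n        # average heading
    avy = sum(v[1] for v in velocities) / n
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        # Cohesion: steer toward the flock centroid.
        vx += (cx - x) * cohesion
        vy += (cy - y) * cohesion
        # Alignment: match the average heading.
        vx += (avx - vx) * alignment
        vy += (avy - vy) * alignment
        # Separation: push away from very close neighbours.
        for ox, oy in positions:
            d = math.hypot(x - ox, y - oy)
            if 0 < d < min_dist:
                vx += (x - ox) / d * separation
                vy += (y - oy) / d * separation
        new_vel.append((vx, vy))
        new_pos.append((x + vx, y + vy))
    return new_pos, new_vel

def flock_to_pitch_and_amplitude(positions):
    """Illustrative mapping: centroid height -> MIDI pitch (clamped to a
    playable range), flock spread -> amplitude (tighter flock = louder)."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    spread = sum(math.hypot(x - cx, y - cy) for x, y in positions) / n
    pitch = max(36, min(84, int(60 + cy)))
    amplitude = 1.0 / (1.0 + spread)               # in (0, 1]
    return pitch, amplitude

random.seed(1)
pos = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(20)]
vel = [(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)) for _ in range(20)]
for _ in range(50):
    pos, vel = flock_step(pos, vel)
pitch, amp = flock_to_pitch_and_amplitude(pos)
```

In a live setting, values like `pitch` and `amp` would be streamed to the processing environment (e.g. into Max/MSP) rather than computed offline as here.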
TEAM
Flock to Music is being developed in collaboration with Jeffrey Ens under the supervision of Philippe Pasquier at Metacreation Lab, SIAT, Simon Fraser University, Vancouver, BC, Canada.
IMPETUS
My music has always been inspired by nature and human behavior. Instead of interpreting the world I live in intuitively, I chose to create tools that interpret data from various sources. In the case of Flock to Music, we are creating models that imitate complex natural and social behaviors.
PROCESS
- In Flock to Music, musical sections of different lengths are initially created by controlling the parameters of the model.
- The data from the model is then sent to a tool that can translate the information into specific musical parameters.
- The interpretation of data is expressed in multiple aspects of the composition: in the overall structure of the work, in the audible shapes and gestures of the music, and in the details of the orchestration, including pitches and rhythm.
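The first step above, shaping the overall form by assigning a model parameter setting and a length to each section, could be sketched as follows. The section names, parameter values, and step duration are hypothetical, chosen only to illustrate how a section plan becomes the large-scale structure of a piece.

```python
# Each section pairs a flocking-model setting with a length in simulation
# steps; the plan itself becomes the overall form of the piece.
SECTION_PLAN = [
    {"name": "A",  "steps": 120, "cohesion": 0.02},   # tight, converging flock
    {"name": "B",  "steps": 240, "cohesion": 0.005},  # looser, wandering flock
    {"name": "A'", "steps": 90,  "cohesion": 0.03},   # varied return
]

def section_boundaries(plan, step_duration=0.25):
    """Return (name, start_time, end_time) in seconds for each section,
    given a fixed duration per simulation step."""
    out, t = [], 0.0
    for sec in plan:
        dur = sec["steps"] * step_duration
        out.append((sec["name"], t, t + dur))
        t += dur
    return out

bounds = section_boundaries(SECTION_PLAN)
```

Changing a parameter such as `cohesion` per section would then give each part of the piece its own characteristic flocking behavior, and hence its own musical character.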
GOALS
- To create a system that can explore various forms of group behavior.
- To provide a simpler and more direct way to create musical scores.
- To connect this data interpretation to Ocular Scores to create visual scores.
CONCLUSION
My first project explored the complex natural behavior of “flocking”, but a composition could just as well interpret urban data, such as traffic flow in different demographic areas or crowd behavior during a riot.
DOCUMENTATION