Traditional instruments make sound by transforming the energy from our interaction with them (striking, scraping, blowing) into acoustic energy. When we perform on electronic instruments, though, our energy is transformed into data, which is then used to control the different dimensions of sound.
I touch my iPad, it reports the position of my finger (in X and Y), and I can use those two numbers to control music, dynamically changing pitch, volume, timbre, or any number of other dimensions.
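As a rough sketch of that idea (the function name and the specific pitch/volume mapping here are my own illustrative choices, not anything from the video), here's what turning two touch coordinates into control data might look like in Python:

```python
def touch_to_control(x, y, pitch_range=(48, 84)):
    """Map a normalized touch position (x, y, each in 0..1) to music parameters.

    Horizontal position controls pitch (as a MIDI note number);
    vertical position controls volume (0.0 to 1.0).
    """
    lo, hi = pitch_range
    pitch = round(lo + x * (hi - lo))  # left edge = low note, right edge = high note
    volume = y                         # bottom = silent, top = full volume
    return pitch, volume

# A touch in the middle of the surface:
print(touch_to_control(0.5, 0.5))  # (66, 0.5)
```

The point is that the interface only hands you numbers; what those numbers *mean* musically is entirely up to the mapping you design.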
In this video (and the rest of the playlist), I dig into “data-driven instruments”: the different paradigm of using a digital interface to create numbers that we then use as control information for the dimensions of music (with a little demo in Max/MSP).
Here are some examples of data-driven instrument performances from other artists: