Video presentation I made for the 2024 “Explainable AI for the Arts” (XAIxArts) Workshop, part of the ACM Creativity and Cognition Conference 2024.
A lot of these points I’ve discussed elsewhere (see playlist below), but this quickie presentation brings together these ideas, focusing on the aesthetic potential of this approach.
Check out the complete playlist for more hands-on creation of neurons and neural networks:
Using Holon.ist, OSC, and Pure Data to send my heart rate from my Apple Watch to the Arturia DrumBrute Impact.
I got an Apple Watch a couple of weeks ago, and, of course, on the very first day I started looking into apps that would allow me to send the data coming from my watch out as OSC messages. After some poking around I found Holon.ist by Holonic Systems, and decided to give it a try.
There’s way more in this app than I actually needed (I just wanted to get all the physical data out to my laptop), but it suits the purpose. So in this video, I show a quick demo of getting my heart rate as an OSC message into my laptop running Pd, and then converting it into a MIDI clock for the DrumBrute Impact.
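The heart-rate-to-clock conversion is simple arithmetic: MIDI clock runs at 24 pulses per quarter note, so if each heartbeat counts as one quarter note, the tick interval falls out of the BPM directly. A quick Python sketch of that math (the function name is mine, not part of the patch):

```python
def midi_clock_interval_ms(heart_rate_bpm: float) -> float:
    """Milliseconds between 0xF8 MIDI clock bytes, treating each
    heartbeat as one quarter note at 24 pulses per quarter note."""
    return 60_000.0 / (heart_rate_bpm * 24)

# At a resting 60 BPM, that's a clock byte roughly every 41.7 ms.
```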
Sending out raw MIDI data in Max/MSP with [midiout] for system messages and other live control.
Here, I use the [midiout] object in Max to send individual “note on” and “note off” messages, using our knowledge of the MIDI protocol. We can then expand that to algorithmic MIDI control of sequences in the Arturia DrumBrute Impact, including adjusting the clock and the song position pointer for funky, chaotic beats.
0:00 Intro
0:30 [midiout]
0:59 Basic concept – Note On
3:16 Note Off
4:32 Pitch Bend Change
5:32 Exploring Algorithmic Control
6:47 Controlling Sequencers (DrumBrute Impact)
7:09 MIDI Clock Message
9:01 Algorithmic Clock Control
10:03 Start, Stop, and Continue MIDI Messages
11:29 Playing with the Song Position Pointer
13:30 Bringing back the Drunk [metro]
15:00 Closing / Next Steps
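The byte-level messages used here come straight from the MIDI 1.0 protocol. For reference, a small Python sketch of how those messages are assembled (the helper names are mine, not objects from the Max patch):

```python
# Raw MIDI messages as byte lists, per the MIDI 1.0 spec.

def note_on(channel, note, velocity):
    # Status byte 0x90 ORed with the channel number (0-15)
    return [0x90 | channel, note, velocity]

def note_off(channel, note, velocity=0):
    return [0x80 | channel, note, velocity]

def song_position(sixteenths):
    # 0xF2 carries a 14-bit count of 16th notes since the start
    # of the song, least significant 7 bits first
    return [0xF2, sixteenths & 0x7F, (sixteenths >> 7) & 0x7F]

# Single-byte system real-time messages
MIDI_CLOCK, MIDI_START, MIDI_CONTINUE, MIDI_STOP = 0xF8, 0xFA, 0xFB, 0xFC
```

Each list gets sent byte-by-byte into [midiout] — e.g. note_on(0, 60, 100) for middle C on channel 1.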
Doing some “samplecrushing” (downsampling) in Pure Data Vanilla to create dynamic aliasing artifacts.
0:00 Setting up [samphold~]
0:28 Simple downsampling and aliasing
0:55 Building a sequencer
2:33 Making the samplecrush dynamic
3:13 Making it stereo
3:50 Trying different timings and ranges
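Conceptually, [samphold~] driven by a slower phasor just holds each input sample for several sample periods, which is what creates the aliasing. A rough Python sketch of that downsampling (the function name is mine, not from the patch):

```python
def samplecrush(signal, hold):
    """Hold every `hold`-th sample for `hold` sample periods,
    imitating [samphold~] clocked by a slower phasor."""
    out = []
    held = 0.0
    for i, s in enumerate(signal):
        if i % hold == 0:
            held = s  # grab a new sample only every `hold` samples
        out.append(held)
    return out
```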
A failed attempt at machine learning for real-time sound design in Pure Data Vanilla.
I’ve previously shown artificial neurons and neural networks in Pd, but here I attempted to take things a step further and make a cybernetic system that demonstrates machine learning. It went okay, not great.
This system has a “target” waveform (what we’re trying to produce). The neuron takes in several different waveforms, combines them (with a nonlinearity), compares the result to the target waveform, and attempts to adjust its weights accordingly.
While it fails to reproduce the waveform in most cases, the resulting audio of a poorly-designed AI failing might still hold expressive possibilities.
0:00 Intro / Concept
1:35 Single-Neuron Patch Explanation
3:23 The “Learning” Part
5:46 A (Moderate) Success!
7:00 Trying with Multiple Inputs
10:07 Neural Network Failure
12:20 Closing Thoughts, Next Steps
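The video doesn’t pin down a specific update rule, but the conventional way to sketch “compare to the target and adjust” for a single tanh neuron is the delta rule. Here’s an illustrative Python version (names, learning rate, and the rule itself are my assumptions, not the patch):

```python
import math

def train_neuron(waves, target, lr=0.2, epochs=300):
    """Delta-rule training for one tanh neuron: for each sample,
    compute the output, compare it to the target, and nudge each
    input weight in the direction that reduces the error."""
    weights = [0.0] * len(waves)
    for _ in range(epochs):
        for i in range(len(target)):
            x = [w[i] for w in waves]
            y = math.tanh(sum(wj * xj for wj, xj in zip(weights, x)))
            err = target[i] - y
            for j in range(len(weights)):
                # (1 - y^2) is the derivative of tanh at the output
                weights[j] += lr * err * (1.0 - y * y) * x[j]
    return weights
```

With a target the neuron can actually represent, this converges; with an arbitrary waveform (as in the video), it mostly can’t — hence the expressive failure.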
More music and sound with neurons and neural networks here:
Building a comb filter in Pure Data Vanilla from scratch.
A comb filter is a filter created by adding a delayed signal to itself, creating constructive and destructive interference of frequencies based on the length of the delay. All we have to do is delay the signal a little bit, feed it back into itself (pre-delay), and we get that pleasing, high-tech robotic resonance effect.
There’s no talking on this one, just building the patch, and listening to it go.
0:00 Playing back a recorded file
0:35 Looping the file
1:00 Setting up the delay
2:08 Frequency controls for the filter
2:52 Setting the range
3:48 Automatic random frequency
4:25 Commenting the code
5:39 Playing with settings
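The whole effect is one difference equation, y[n] = x[n] + g · y[n − D]. A minimal Python sketch of a feedback comb (variable names are mine, not from the patch):

```python
def feedback_comb(x, delay, feedback):
    """y[n] = x[n] + feedback * y[n - delay].
    Resonant peaks land at multiples of sample_rate / delay."""
    y = []
    for n, xn in enumerate(x):
        fb = feedback * y[n - delay] if n >= delay else 0.0
        y.append(xn + fb)
    return y
```

Shortening the delay pushes the resonant peaks up in frequency, which is why sweeping the delay time sweeps the pitch of that robotic resonance.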
Creating an ambient music machine in Pure Data Vanilla with a “clamping VCA” that adds subtle distortion, imitating the envelopes in the Roland TR-808.
I made a clamping VCA in Reaktor a few weeks back, and now here’s another example in Pd. Normally, an amplitude envelope in a synth is a control signal that scales the amplitude of the waveform. With a “clamping VCA,” though, instead of scaling the waveform’s amplitude, we clip it at the envelope’s current value. This means that when the VCA is all the way up, it sounds the same, but during the attack and release we get subtle (or perhaps not-so-subtle) distortion added to our waveform.
I use [clip~] in Pd to achieve this effect, stealing the idea from Noise Engineering’s “Sinclastic Empulatrix” module, which, in turn, stole the idea from the Roland TR-808 drum machine’s cymbal envelopes.
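The core of the effect is a one-liner: clip the signal at the envelope’s current value instead of multiplying by it. A minimal Python sketch of what [clip~] does per sample (the function name is mine):

```python
def clamping_vca(sample, env):
    """Instead of multiplying by the envelope (a normal VCA),
    clip the waveform at +/- env, like Pd's [clip~]: at full
    level the signal passes untouched, but low envelope values
    flatten the wave's peaks and add distortion."""
    return max(-env, min(env, sample))
```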
Doing some live processing of sleigh bells in Pure Data to create an “Interactive Holiday Noise Music System.”
Since it’s mid-December, let’s make some holiday music. If you’re sick of the standard cloying Muzak fare, though, you can make your own feedback delay sample-crushing interactive music system in Pure Data in an afternoon.
The main point here is getting a “trigger” from the audio input crossing a loudness threshold. Once we have that trigger, we can use it to make changes in the live processing of a sound, and to trigger other sounds too. It’s a simple idea, but its effectiveness depends on what those changes are and how we play with the system.
0:00 Demo
0:26 Introduction / Goals
1:23 Input Monitoring
2:41 Direct (“Dry”) Output
4:08 Feature Extraction with [sigmund~]
6:55 Amplitude as Trigger
8:43 Triggering Changes in Delay
12:44 Sample-Crushing
17:03 Triggering an Oscillator
19:37 Oscillators into Harmony
23:35 Putting it all together
25:33 Closing Thoughts
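The trigger logic itself fits in a few lines. Here’s an illustrative Python version with hysteresis added — a lower re-arm threshold so one loud jingle doesn’t retrigger on every sample. The names are mine, and the patch may handle debouncing differently:

```python
def threshold_triggers(amplitudes, high, low):
    """Emit a trigger index when the amplitude envelope crosses
    `high` going up; re-arm only after it falls back below `low`."""
    triggers, armed = [], True
    for i, a in enumerate(amplitudes):
        if armed and a >= high:
            triggers.append(i)
            armed = False          # don't retrigger while still loud
        elif not armed and a <= low:
            armed = True           # quiet again: ready for next hit
    return triggers
```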
Coding (well, “patching”) an artificial neural network in Pure Data Vanilla to create some generative ambient filter pings.
From zero to neural network in about ten minutes!
In audio terms, an artificial neuron is just a nonlinear mixer, and, to create a network of these neurons, all we need to do is run them into each other. So, in this video, I do just that: we make our neuron, duplicate it out until we have 20 of them, and then send some LFOs through that neural network. In the end, we use the output to trigger filter “pings” of five different notes.
There’s not really any kind of true artificial intelligence (or “deep learning”) in this neural network, because the output of the network, while it is fed back, doesn’t go back and affect the weights of the inputs in the individual neurons. For actual machine learning, we would need some kind of desired goal (e.g. playing a Beethoven symphony or a major scale). Here, we just let the neural network provide us with some outputs for some Pure Data generative ambient pings. Add some delay, and you’re all set.
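The “nonlinear mixer” view of a neuron is easy to sketch outside of Pd. An illustrative Python version of a single layer fed by LFO samples (the names and random weights are mine, not the patch’s):

```python
import math
import random

def neuron(inputs, weights):
    """An audio-rate artificial neuron is just a nonlinear mixer:
    a weighted sum squashed by a nonlinearity (here, tanh)."""
    return math.tanh(sum(w * x for w, x in zip(weights, inputs)))

def layer(inputs, weight_rows):
    # A "network" is just neurons whose outputs feed other neurons
    return [neuron(inputs, row) for row in weight_rows]

# Two LFO samples into a layer of three fixed-random-weight neurons
random.seed(1)
weights = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
lfos = [math.sin(0.1), math.sin(0.07)]
out = layer(lfos, weights)
```

Because tanh keeps every output in (−1, 1), the layers can be stacked or fed back without the signal blowing up.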
There’s no talking on this one, just building the patch, and listening to it go.
0:00 Demo
0:12 Building an artificial neuron
2:00 Networking our neurons
3:47 Feeding LFOs into the network
4:20 Checking the output of the network
5:00 Pinging filters with [threshold~]
8:55 Adding some feedback
10:18 Commenting our code
12:47 Playing with the network
Creating an artificial neuron in Pd:
Pinging Filters in Pd:
More no-talking Pure Data jams and patch-from-scratch videos: