My students getting started in Kyma often ask me how they can integrate it into music production in their DAW. Now, there are a lot of good reasons to get away from your DAW sometimes, and experiment with different workflows (including those built into Kyma), but let’s set those aside for the moment. With a Dante AVIO USB ($129) and the Dante Virtual Soundcard ($29), you can set up two channels of 48K digital audio in and out of your Paca(rana), enabling you to use it like a plug-in effect with low latency and without any conversion to and from analog.
I demonstrate this in Logic Pro X, but it should work in Pro Tools, Reaper, Ableton, or your other DAW of choice.
Over the last year, I’ve put together a collection of YouTube videos on Pure Data Vanilla for musicians, with no previous programming experience required.
Originally, I was just making these videos for a class, but I quickly found there was an audience for Pd tutorials like this, and my videos expanded beyond the class materials to generative music patches, live databending glitch beats, and algorithmic 80s synthwave.
I’ve put together a three-part series on getting started with circuit-bending, from initial testing and opening up toys to a completed alien instrument.
(Part 3 coming next week)
Circuit-bending is the creative customization of consumer electronics with the goal of inventing new unique devices for sound-making, visuals, or other expressive goals. I’m a composer and sound-designer, not an electrical engineer, so my goal is to find fun sounds that I can use in creative ways (rather than any kind of serious circuit design).
For more of my creative electronics projects, look here:
In order not to bury the lede, let me start by saying that Plurality Spring, a new game piece by Paul Turowski and me, is available for free download:
In the game, players use acoustic musical performance (via the computer’s microphone) to control robots exploring an orb in deep space.
This idea of a “game piece,” where a musical work emerges from performers playing within a set of rules, draws from historical models like John Zorn’s Cobra, or Christian Wolff’s For 1, 2, or 3 People. Since Plurality Spring is a digital video game, though, the performers’ live audio mixes with the in-game sound to create a kind of augmented reality performance piece (whether you perform in front of an audience or just on your own).
One of the most enjoyable things about the collaboration was our shared willingness to be flexible about the game as it evolved.
The game was originally about this “kid” following a glowing orb. The players directly controlled the orb, but not the kid.
Here is a screenshot of the early prototype:
It’s cute! It’s neat! But without picking apart our whole creative process (maybe another time…), we ultimately ended up with something very different, and much better.
Our “kid” became three robots, and the orb is no longer something players chase; it’s a traversable planet. Our aesthetic ideas, too, became much more developed, and we ended up with a really pleasing balance of cute and “gritty” in our final visual and audio design.
We premiered the piece on March 24th, 2017 at the Open Circuit Festival in Liverpool, and now it’s available for everyone.
Play it on your own! Perform it for an audience! Share your videos!
We’re excited to hear what kind of AR music you create.
More thoughts on the Kyma symposium later (I’m still processing a fascinating remark from Christian Vogel, who said, “I’ve started thinking of my studio like a network rather than a chain”).
For now, I have some catching up to do.
At the end of my performance in Leicester, the game displayed a message, a little prematurely, that one could “download the game at simonhutchinson.com.”
Notice that I blame the game for this.
So, with apologies for the delay, here is the game (available in Web, Mac, Windows, and Linux versions):
By necessity, the audio and graphics have been simplified for this standalone version. The piece I performed at the symposium sent OSC messages from the game in Unity to Kyma, and, in order to make an “anyone can play” distribution, I had to bounce out the audio and bring those files into Unity. There’s less nuance in the real-time audio, but I’m sure this is a compromise that game developers must make all the time.
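For the curious, OSC messages like the ones the original Unity-to-Kyma version sent are just small binary packets over UDP. As a rough sketch (the address `/amp` and the single float argument are my invented example, not the piece’s actual messages), one can be built by hand:

```python
import struct

def osc_message(address, value):
    """Encode a minimal OSC message with one float argument.

    OSC strings are null-terminated and padded to a multiple of
    4 bytes; floats are big-endian 32-bit. This covers only the
    simplest case (one address, one float), as an illustration.
    """
    def pad(data):
        # Pad with 1-4 nulls so the length is a multiple of 4
        # (a string already 4 bytes long still gets a null terminator).
        return data + b"\x00" * (4 - len(data) % 4)

    return (pad(address.encode("ascii"))   # address pattern, e.g. "/amp"
            + pad(b",f")                   # type tag: one float follows
            + struct.pack(">f", value))    # big-endian 32-bit float
```

A packet like this would then be sent to Kyma’s listening port with an ordinary UDP socket; libraries such as python-osc wrap all of this, but the format itself is small enough to see whole.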
If you’re interested in the original work, you can see a video of a “studio” performance here:
Thanks again to those of you who donated through the Kickstarter, and, if you didn’t reserve your advance copy, you can now get the album on Bandcamp, CD Baby, iTunes, Amazon, Spotify or most other places where you might buy music.
The album already received a great review in the March issue of textura, praising the duo’s performance and describing bioMechanics as a “bold opener” where “sheets of metallic noise as well as beat patterns interact with the duo’s acoustic sonorities, making for something of a showstopper, even if it’s just seven minutes long.”
It’s a great CD all around, and a great addition to the collection of serious fans of new music or unique chamber wind repertoire.
Since last time, I added MIDI control and transferred things to a smaller circuit board (and Arduino) in order to fit everything back in the original case.
So what does this little synth do? It takes MIDI notes (from a keyboard or DAW) and uses them to trigger the sounds built in to the Intellivoice chip, which, as it turns out, consist mostly of numbers.
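The core mapping is simple enough to sketch. This isn’t my Arduino code, but a hypothetical illustration of the idea: a note-on message selects a word from the chip’s small resident vocabulary (the word list and base note here are assumptions for the example).

```python
# Hypothetical sketch: map MIDI note-on messages to words in a speech
# chip's resident vocabulary. The vocabulary and note range are
# illustrative assumptions, not the actual Intellivoice firmware.

WORDS = ["zero", "one", "two", "three", "four",
         "five", "six", "seven", "eight", "nine"]  # assumed word list
BASE_NOTE = 60  # assume middle C triggers the first word

def word_for_midi(status, note, velocity):
    """Return the word to trigger for a raw MIDI message, or None."""
    # Note-on is status 0x9n; note-on with velocity 0 acts as note-off.
    is_note_on = (status & 0xF0) == 0x90 and velocity > 0
    if not is_note_on:
        return None
    index = note - BASE_NOTE
    if 0 <= index < len(WORDS):
        return WORDS[index]
    return None  # notes outside the mapped range are ignored
```

On the Arduino, the same logic runs in the MIDI receive callback, with the selected word’s index written to the speech chip instead of returned.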
Check it out:
I might look into a few more tweaks. Currently, each word will always play to the end, even if another word is performed before it’s finished. I’m not sure if this is a property of the Intellivoice chip, or something I could fix in my Arduino program.
Why pursue a project like this one?
Games and gaming hardware are mass-produced devices with planned obsolescence and few serviceable parts. By “hacking” and customizing gaming hardware we regain personal ownership of these devices, and we can turn obsolete equipment into performable, expressive instruments.
An earlier version of this synth actually appears in the electronics of my new commissioned work, Hiraizumi (more about the piece here and here), and I think the hacked Intellivoice fits perfectly into the post-digital, cyberpunk aesthetic.
Hoping to “veg out” a bit upon returning from holiday travel, I put on one of the new Ghost in the Shell animated films.
It’s always funny how ideas connect, but, earlier in the day, I had been chatting with my friend and colleague, composer Aaron Rosenberg, about my current composition project, Hiraizumi, and, in talking about the piece, he had referred to the electronic parts as “sci-fi moments” (fondly, of course).
I previously wrote about how I was crafting the electronics to evoke distortions in digitally-mediated memory, and, revisiting Masamune Shirow’s world of Ghost in the Shell, primed to think about sci-fi music, I realized that my wind ensemble piece falls into the cyberpunk genre.
In Ghost in the Shell, memory (and its fallibility) is a recurring theme. People with cybernetic brains are able to directly access the internet, but this connection opens people up to having their minds and memories directly changed (and possibly hacked), and interacting with others on the web breaks down the border between self and others.
If one’s self is defined by memories and experiences, inaccurate memories (or memories curated by Facebook) might reduce the sense of individuality. This loss of individuality and the dehumanizing effects of technology are common cyberpunk themes.
As the protagonist, Kusanagi, says:
There are countless ingredients that make up the human body and mind, like all the components that make up me as an individual with my own personality. Sure I have a face and voice to distinguish myself from others, but my thoughts and memories are unique only to me, and I carry a sense of my own destiny. Each of those things are just a small part of it. I collect information to use in my own way. All of that blends to create a mixture that forms me and gives rise to my conscience. I feel confined, only free to expand myself within boundaries.
Hiraizumi’s cyberpunk elements seemed even clearer when, looking back on my sound-design choices in the electronics, I discovered moments that seemed inspired by sounds from Ghost in the Shell and Vangelis’s score to Blade Runner, another cyberpunk film that questions ideas of self and identity.
That all said, I wasn’t thinking about Ghost in the Shell when I started work on Hiraizumi, but I am a fan of Japanese cyberpunk, and these things are all rattling around in my head.
Created without conscious intent, these cyberpunk themes are a byproduct of the expressive goals of the piece, and I look forward to where the music will take me as I finish my journey to the double bar.
(As an end note, I should also mention that it’s no surprise that one of my old musical heroes, Cornelius, did the soundtrack for the new Ghost in the Shell series.)