If you missed my live stream a few weeks back, here is an unedited YouTube replay of my solo multimedia concert. The performance is a combination of live arrangement and live playing, featuring original electronic music with real-time visuals.
16:50 “Light Runner” live performance of song from a forthcoming album
22:34 End Program with audio excerpt from the song “They Walk Among Us” from album Reboot
As you can see, some of these tracks will be released in the future. Follow or subscribe on my Bandcamp site to get notified of album and single releases.
All the music and sound is from Elektron Analog Four MKI, Octatrack MKII, and Digitone. The machines are not synced. I mostly play directly from the machines, but also play notes on the Analog Four via the Novation Launchkey Mini MK3 keyboard and on the Digitone via the Launchpad Pro MK3 grid. I also use the Launchkey Mini MK3 along with a Logidy UM3 pedal to control Resolume Arena 7 in real-time. Visual sources come from live camera input and various other sources.
The performance got some nice mentions by synth publications.
I got an amazingly kind mention on the Synth and Software Facebook page.
Synth & Software is an online publication and social media community for electronic music enthusiasts—anyone who enjoys listening or who plays, programs, or aspires to make music with synthesizers, samplers, DAWs, audio processors, and other instruments and software…
I view this entire rig as a multimedia “instrument”. One goal in designing this “instrument” was to accommodate the following use cases from a single performance cockpit, so that my performance flow would not change whether I’m performing:
Music sets in person
Music + visuals sets in person
Music + visuals sets via live stream
Music + visuals sets in person, simulcast via live stream
When I’m performing in person I don’t normally use headphones and don’t look at the monitor on my laptop. Instead I listen to the P.A. or monitors, and I look at the wall/screen/scrim where the visuals are being projected. This incorporates the performance space as part of the vibe of the performance.
My streaming equivalent of this is to use a big screen right in front of me and perform with reference monitors at near concert volumes. Performing this way is much closer to performing on stage in front of an audience, which puts more positive energy into the performance.
Conceptual Overview of 2020 Rig
Performance Cockpit – The region outlined with aqua highlighter is my performance cockpit where I play, perform, control live arrangements, and visuals.
Music Machines – All sound and music is produced via the following Elektron machines:
Analog Four – Analog synth with 4 synth tracks and effects
Octatrack – Performance sampler, mixer, looper, effects processor, live arrangement of playback when needed
Digitone – Digital FM synth with 4 synth tracks and effects
Live Camera for Visual Synthesis – I compose the camera shots in real-time by moving a Logitech HD webcam, which serves as a source for my visual software, Resolume
Novation Launchkey Mini MK3 – Keys are used to play the Analog Four through hardware MIDI. RGB pads use custom MIDI modes to control Resolume through USB MIDI.
Novation Launchpad Pro MK3 – Velocity and pressure sensitive pads are used to play the Digitone through hardware MIDI. I have custom control modes that I can also use to control Resolume via USB MIDI.
Audio to the house / Audio interface with Loopback for Streaming – Yamaha AG06
Laptop for Visuals and Streaming – Microsoft Surface Laptop 3, Windows 10
Visual Software – Resolume Arena 7 software
Streaming Software – OBS Studio software for streaming
Streaming Endpoint – YouTube…
Using MIDI Controllers to Decouple Visual Arrangement from Song Arrangement
In past incarnations of my multimedia show I controlled the arrangement and automation of visual effects in Resolume with Ableton Live “dummy clips” and automation envelopes. This meant the arrangement for visuals was implicitly locked to the musical arrangement. For this new rig I wanted to decouple the visuals from the audio arrangement. This allows me to perform the arrangement of the visuals manually, making the experience more variable and unique across repeat performances.
I now control the visual arrangement with the RGB pads on the Novation Launchkey Mini MK3.
Alternatively I can use the Logidy foot pedal for hands-free arrangement.
While I’m pretty busy playing and doing live arrangement, I can also use the Launchpad Pro MK3 with its custom MIDI modes and velocity-sensitive fader grid feature to control parameters in Resolume.
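If you’re curious how little data is involved in this kind of mapping, here’s a minimal sketch in Python of the raw Control Change bytes a pad or fader grid emits. The helper names are my own for illustration, not anything from my actual rig; the point is that Resolume can MIDI-map a parameter to listen for three-byte messages like these.

```python
def cc_message(channel: int, control: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI Control Change message.

    Resolume can MIDI-map an effect parameter to a CC number,
    so a pad or fader just needs to emit bytes shaped like this.
    """
    if not (0 <= channel <= 15 and 0 <= control <= 127 and 0 <= value <= 127):
        raise ValueError("channel 0-15, control and value 0-127")
    # Status byte 0xB0 = Control Change, low nibble = channel
    return bytes([0xB0 | channel, control, value])


def fader_to_cc(channel: int, control: int, step: int, steps: int = 8) -> bytes:
    """Scale a fader-grid step (0..steps-1) to the full 0-127 CC range."""
    value = round(step * 127 / (steps - 1))
    return cc_message(channel, control, value)
```

In practice you would hand these bytes to a MIDI library’s output port rather than build them by hand; each pad press or fader move is just one tiny message like this.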
FYI I’ve written past posts on these controllers in these categories:
My set list is made up of songs spread across each machine and I often perform an entire song using only one machine. In a few cases I work across machines. For example, I perform the majority of my song “Disconnected” on the Octatrack and play the leads on the Analog Four.
The machines are not synced and are not connected via MIDI. This allows complete freedom to do live arrangement and change my set list on the fly. I might have corresponding patterns with matching tempos across machines, but I select and trigger them on the fly so I can play around with the pocket, tap tempo to shift or even change tempos, or free-form play to add more tension.
Having no cross-machine dependency also means that if a machine were to go down during a performance or have to be rebooted (which has never happened), the song and show can go on.
Fits in One Case and a Backpack
Another design goal for the rig was it had to fit into a Pelican 1510 case and backpack. This allows me to fly with the rig as well as load in to local shows with only one trip from the car.
I flew with an earlier version of this rig to Asheville, NC to perform at Mountain Skies in 2019 and managed to jam all this into a Pelican 1510.
With the addition of the new controllers I’m sure I could fit it all in the Pelican with some overflow into my backpack.
Thanks for following along on this long read. If you’d like to get blog posts via email, opt in below.
Also swing by my Bandcamp page if you’d like to listen to some of my audio recordings, and my YouTube page to check out more videos.
My friend and mastering engineer Steve Turnidge just posted this on his FB page. It’s a video of the “Yellow Magic Orchestra – TECHNOPOLIS [Live] (Jun. 2, 1980) HD”. This is so great on so many levels with a super tight full band performance with some fun vocoding, an old-school modular, and the classic “unlit cig while playing on TV” punk move.
I think I listened to this like 5 times before moving on to a video of “TONG POO – YMO 1979 LIVE at THE GREEK THEATRE” which is the encore from this show!
Well there goes your morning just watching all the “Up Next” videos. You are welcome.
Last month I participated in #jamuary and created 7 new songs, which I’ve posted as videos to this YouTube playlist. It was really inspiring and freeing to just go for it and finish these short musical themes.
Via @truecuckoo … “The idea with Jamuary is to get going with what you have, and to help get comfortable sharing and performing little jams, and to get to know each other and appreciate what everyone is doing in the electronic music independent scene. Post a little music snippet every day of jamuary, or for as many of the days that you have the energy to. Browse the hashtag and get inspired. Get to know musicians you had never heard about, and perhaps even make some friends along the way.”
Below are the individual vids (click on the pics of the vids if you are reading this via email). They are numbered based on the day in January on which they were released.
Let me know if you have a favorite in the comments.
Day 04 Featuring the iPad and Resolume
I created the song from scratch in Korg Gadget. I’m still amazed at how fantastic these virtual Korg synths sound!
In this piece I used Korg’s emulations of the ARP Odyssey (ARP ODYSSEi aka Lexington) and Korg Mono/Poly (MonoPoly aka Montpellier) for the leads. The bit-crushed bells are from the gadget synth Helsinki. Drums are from various drum gadgets. The earth-shaking smooth bass is from an emulation of the Korg MS-20 (iMS-20 aka Memphis). I performed, arranged, and mixed the piece in Gadget, then rendered it out to AudioShare.
From there I normalized the audio and then exported it to Dropbox. To create the video I took a screen recording of the audio playing in the AudioShare player on the iPad. I then trimmed that and used it as a source for Resolume, performing real-time synthesis with that video as a source.
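For anyone unfamiliar with normalization: it simply scales the whole recording uniformly so the loudest peak hits a target level, without changing the mix balance. A toy sketch of peak normalization in plain Python over a list of samples (apps like AudioShare do this on actual audio buffers):

```python
def peak_normalize(samples, target_peak=1.0):
    """Scale all samples by one gain factor so the largest
    absolute value equals target_peak."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)  # silence: nothing to scale
    gain = target_peak / peak
    return [s * gain for s in samples]
```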
Day 05 Featuring the iPad and Edradour 10 Year Single Malt Scotch
My #jamuary2020 Day 05 is improvised piano with Frippertronics looping and additional live sampling, powered by Edradour 10 Year single malt scotch 🥃 😋.
I’m using a gorgeous virtual grand piano from the Korg Module app. I’m feeding that into the fantastic Audio Damage Enzo looper to do Frippertronics looping. Once I get some music rolling I live sample the looper plus live playing into the Samplr app. All this is being run through the gorgeous Eventide Blackhole reverb.
Day 06 on iPad and Launchpad Under the Watchful Eye of Gort
My #jamuary2020 Day 06 – under the watchful eye of Gort – is once again made all on one @Apple iPad Air with no external synths or audio processing. #aum is the host and the musical bed is a string part improvised with the lovely @sugarbytesofficial Factory synth into an instance of the most amazing @audio.damage Enzo looper, plus a beat I created with the most groovy @olympianoiseco Patterning 2.
I improvise a lead on @cmepro Xkey Air keys over the bed with the killer @korgofficial iMono/Poly synth and start fading in parts with the Launchpad X. My lead and the musical bed start to fill up effects buffers in @sugarbytesofficial Turnado, and I then improvise and control slicing in real-time with the Launchpad X. The @Eventideaudio Blackhole appears throughout the signal flow to give the piece some sense of space.
Day 07 Performed on iPad at a Local Coffee Shop
I recorded my #jamuary2020 Day 07 synth improv video from Bittersweet Cafe in Louisville, CO while enjoying a cappuccino ☺. Once again this is on an @Apple iPad Air with no external synths or audio processing. #AUM is the host.
I started off by improvising a loop with a bell pad sound from the @sugarbytesofficial Factory synth, performed with GeoShred Pro as a MIDI controller and looped with the @audio.damage Enzo looper. I then used the @audio.damage granular synth Quantum to make a textural bed underneath. The source is an original field recording I made on New Year’s Eve 2018 in a tube station in London: about 200 people singing the same song as they jam into a tight tunnel to access the tracks below ground. I built a loop from it using @audio.damage Enzo. The @Eventideaudio Blackhole appears throughout the signal flow to give the piece some sense of space.
I then live-programmed the drums and bass elements using @olympianoiseco Patterning 2, with the circular grid and real-time recording on the pads on the glass. I finished off with a little distorted lead using GeoShred Pro with the Factory synth.
A tech note – this was all battery-powered and wireless: I used AirPods Pro to monitor the iPad and used AUM to digitally record the session. A GoPro Hero 7 was used for the video. It all fit into a small sling bag 😎.
Jamuary 2020 Day 13 – Nord Lead 4 Morph Mania
For my #jamuary2020 Day 13 I’m using only my trusty @nordkeyboards Nord Lead 4 with 3 custom presets from “init” arranged into a “performance” with a split.
For those not familiar with the Nord Lead 4, it has 4 complete synthesizers, each with its own FX. Presets called “programs” can be placed into one of four “slots”. Slots can be combined into “performances” where you can perform with all four slots in concert.
For EACH synth instance you can create both impulse and continuous morphs, which allow you to save states for pretty much any parameter of the preset in a slot without changing presets 😮. You can save and trigger 7 impulse morphs per slot with the impulse morph buttons on the left, next to the wooden pitch stick. Morph buttons are momentary switches, but you can latch them as I do at the 26-second mark. Continuous morphs are mapped to the groovy pumice mod wheel (or can be mapped to an expression pedal).
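Conceptually, a continuous morph is just an interpolation between two saved parameter states, driven by the mod wheel position. Here’s a rough behavioral sketch in Python – an illustration of the idea, not how the Nord firmware actually works:

```python
def continuous_morph(base: dict, morphed: dict, wheel: float) -> dict:
    """Interpolate every morphable parameter from its base value toward
    its saved morph value as the mod wheel moves from 0.0 to 1.0."""
    return {name: base[name] + (morphed[name] - base[name]) * wheel
            for name in base}
```

An impulse morph is the same idea with the position snapped straight to 1.0 while the button is held (or latched).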
I used this custom set of presets and “performance” when I performed as a special guest for the Boulder Laptop Orchestra (BLOrK) at the @cu_atlas Atlas Black Box at @cuboulder back in 2017. For this performance I used only the Nord Lead 4 with 14 programs each with a large number of morph states. Video here https://youtu.be/ScpItttfGlY.
My Day 29 #Jamuary is a #MusiqueConcrète piece I improvised using ONLY samples I recorded at my favorite barber shop @rockbarbers (http://www.rockbarbers.com/). A big thanks to Rock Barbers for allowing me to bring my @zoomsoundlab H1n recorder to do a field recording session at their place 😀.
I made this using a @microsoft Surface Laptop 3 on Windows 10 with @Ableton Live, #Push2, Wavetable, one drum rack, and 3 instances of @audio.damage Quanta. The play-by-play below applies to the long version, which is on YouTube here.
I start off playing an instance of Ableton Wavetable. I made a custom preset by importing the sound of a barber shaving cream hot lather dispenser to make a custom wavetable. The right waveform is barber shears. Further #SoundDesign from there to make it all an instrument. I play this on an Ableton #Push2 controller and record the notes I’m playing on-the-fly to create a MIDI loop. A bit of Echo and Reverb on this.
Once I get that going I move on to the 2nd track, which uses the most amazing @audio.damage Quanta granular synthesizer, this time with a sample of barber shears. I improvise and record notes into a loop. I sweeten this using the Amazing Noises Outer Spaces @c74connect Max for Live reverb.
I move on to a 3rd track. Again I’m using Quanta. This time I’m doing granular synthesis using a field recording of the cash register drawer opening. The ding bell sound played across the keyboard and stretched with granular synthesis sounds a bit like chimes. Once again I use Outer Spaces as my reverb.
I move on to the 4th track, using Quanta yet again; this time the sample is the sound of a comb being banged inside a barber comb jar, with Ableton Echo for effects.
I move on to the 5th track and bring things to a crescendo using an Ableton Drum Rack loaded with samples of the footrest being dropped on an old barber chair, the sound of the razor strap on that old chair being jiggled, the din of the shop, shears, an electric clipper, the comb banging on the jar, and a blow dryer. I then start to dial back the energy and eventually return to where I started, at Ableton Wavetable with a bit of barber shears.
I Hear Your Signals is album #2 in a series of alien invasion concept albums containing original electronica, dark ambient, and experimental tracks. This second album in the series, released in 2010, is a retelling of Album #1 – but – from the alien point of view, with inverted emotional curves!
My cover of “Derezzed” (Mile High Edit) was really an illustration of the show concept I designed and performed from 2010-2015. While that show was performed with all original music, I thought it would be fun to do a cover with the same show design – because – well, I love Daft Punk. So let’s start with that…
Video “Derezzed” Cover (Mile High Edit)
I re-sequenced the song from scratch (no samples from the original were used). I performed the music and visuals live in a single take with no edits.
My rig at the time was Ableton Live, Percussa AudioCubes, Moog Etherwave Theremin Plus, and Tenori-On sending MIDI notes to an Ableton rack with VSTs. I used the Machine to control the arrangement in Ableton on-the-fly, to live-sequence the parts I wasn’t playing, and to trigger clips in Live to automate changes in Resolume. The AudioCubes were used to trigger and control effects through gestures, as well as to add 4 more dimensions of gestural control over effects for the Theremin. I used the Tenori-On to improvise a lead :^) More on all this below…
Background & Show Design
I created the show concept in support of my original alien invasion sci-fi theme concept albums Reboot, I Hear Your Signals, and – at the time – the forthcoming Fear Cannot Save Us.
Note: These albums are available on Spotify, Apple Music and most other outlets. The albums are free “name your price” on Bandcamp.
My design goal was to have the audience more easily relate to what I was doing with a complex and abstract rig by incorporating controllers that offered real-time visual feedback. This would allow them to correlate what I was doing with what they were hearing from the speakers. As the show evolved I amplified this further with projected visuals with real-time FX using only live camera input.
Visually, the show design was inspired in part by Tron and John Carpenter’s “They Live”. I also used additional lighting to create shadows inspired by German Expressionist films like “The Cabinet of Dr. Caligari“.
Here is an example of a live performance. More videos from shows at the bottom of the post.
I’m playing lead synth parts on “Gonna Rise” up from the album Fear Cannot Save Us (Spotify, iTunes).
It’s a custom synced-oscillator preset built from init on the Elektron Analog Four.
Speaking of Elektron, this was a shakedown voyage for my Elektron trinity rig of Analog Four + Octatrack MKII + Digitone + Xkey Air 25 (Kenton USB host), which I’ll be using for the Mountain Skies electronic music festival in Asheville, NC in May.
I got my first set of Percussa AudioCubes in late 2009. They have informed and inspired so many of my songs and have been an integral element of my live performances throughout the years. With my 10th anniversary approaching this year I thought it would be fun to look back at some of my favorite moments with the AudioCubes to date and also offer some thoughts on where I’m heading next with the AudioCubes.
The AudioCubes are a collection of wireless intelligent light emitting objects, capable of detecting each other’s location and orientation, and user gesture (Wikipedia)
Before we go any further I first wanted to offer huge thanks to Bert and Celine from Percussa who have provided musicians and multimedia artists with such an innovative, transformative, and expressive solution that allows us to build custom signature “instruments” for studio, performances, and installations. I especially appreciate their willingness to listen and incorporate feedback year after year to support artists.
The Evolution of AudioCubes in My Video and Pics
AudioCubes have gone through several revisions, with the latest version, the Wireless AudioCubes PRO, featuring full wireless communication and high end materials and components.
Through the years I’ve upgraded my cubes and all my cubes are now AudioCubes PRO so you’ll see various generations in the pics and videos.
I've seen a lot of Rick Wakeman videos on the YouTubes, but this one is pretty damn fantastic. It's a solo performance live at Montreux in 2003 while touring with Yes - https://youtu.be/cx5ovZVGcd0. He looks pretty exhausted by the end of this blistering 5-minute-plus performance. Fantastic!
This song was performed using Tenori-On as a MIDI controller and live sequencer driving racks for synths hosted in Ableton Live. The visuals are generated on-the-fly using live camera input as the only source with real-time visual FX.
[project ruori](http://www.ruori.org) for close-up video footage
Paul A Vnuck Jr. for opening still photo
Greg and Hong Waltzer for hosting such a wonderful event
Edward B Siedzik for running sound
Anna E Siedzik-Torres for running lights
I stumbled on to a live video of this cool piece called “Amphis” by Luke Abbott. Enjoy.
Luke Abbott live performance of 'Amphis', filmed at Wysing Arts Centre on 8th March 2014. 'Amphis' features on the forthcoming Luke Abbott album 'Wysing Forest', to be released by Border Community on 23rd June 2014.
Listener/Viewer Notes This video is in HD and I captured the audio at full fidelity right from my sound card, so listen with some good headphones or on a good system and select HD for full-screen viewing. The video and audio were captured in one continuous take with no content edits.
Composer Notes To fit the back-story of my album, I set out to compose a song that sounded a bit alien in origin. To liberate myself from my typical compositional instrument, the keyboard, I decided to compose and perform the textures and melodies using only spatial controllers. In this case I used a Moog Etherwave Theremin and a Percussa AudioCube. Once I got going with this notion I really got into using 6 dimensions of spatial control to go “Hendrix” with the Theremin. The title of the song has many meanings, one of which should be obvious to Theremin fans.
Producer Notes I'm routing the Theremin's analog signal into Ableton Live and then converting the signal from pitch to MIDI in real-time. This signal is routed to various virtual instruments hosted in Live. I then use a Percussa AudioCube in Sensor mode to add 4 additional dimensions of modulation in real-time – so 6 dimensions of spatial control in all. I change the signal routing of the Theremin to send MIDI to different virtual instruments on the fly using the Novation Launchpad.
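The pitch-to-MIDI step boils down to the standard equal-temperament formula: MIDI note 69 is A4 at 440 Hz, and each semitone is a factor of 2^(1/12). A minimal sketch of that conversion (the real-time pitch detection itself is handled inside Live and isn’t shown here):

```python
import math

def freq_to_midi(freq_hz: float) -> int:
    """Convert a detected pitch in Hz to the nearest MIDI note number.
    MIDI note 69 = A4 = 440 Hz; 12 semitones per octave."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))
```

So a Theremin gliding from A4 up an octave to A5 would produce a stream of notes from 69 to 81, which Live then routes to whichever virtual instrument is active.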