Places where I or my words have somehow appeared:
Music Tectonics Conference, 2021: Panelist, “The New Wave of Music Creation”
Music Tectonics Conference, 2020: Panel Moderator, “Innovations in Music Making and Listening”
NAMM/A3E, 2020: “MIDI 2.0: A Developer Deep Dive”
Audio Developer Conference, 2019: Co-presenter on MIDI 2
Audio Engineering Society 147th Pro Audio Convention, New York, 2019: “Vibrary: A Consumer-Trainable Music Tagging Utility” (co-author)
SXSW Interactive, 2019: Workshop, “Native Cross-Platform Development With C++ & JUCE”
NAMM/A3E 2019, panelist: “Cloud Integration Strategy: Server vs Service”
Summer NAMM, 2018, panelist: “Developing Creative Software for the Social Media Era Musician”
NAMM/A3E, 2018, panelist: “App Integration: Utilizing iOS and Android to Enhance Your Product Line”
NAMM/A3E, 2018, panelist: “Developing Music & Audio Applications with JUCE”
South by Southwest Interactive Festival 2014, panelist: “Software Project Estimation: Three Firms Light Up the Dark Art”
Mix Magazine, on MIDI 2
Music Tectonics Podcast, October 2020, discussing all kinds of future-facing things touching music and tech.
AudioGeek 11 podcast on MIDI 2
The Audio Programmer virtual meetup (slides/voice)
Resolution Magazine on MIDI 2
Reverb.com on MIDI 2
MusicRadar on MIDI 2 (this one is especially good)
MusicTech on MIDI 2
Performer Magazine on MIDI 2
Bobby Owsinski’s Inner Circle Podcast (mostly MIDI 2 with some meandering thoughts…)
The Feature Story on MIDI 2
Managing Devices with XML-RPC (Dr Dobb’s, April 2003). First time I was paid to write.
(Mostly about the JUCE application framework)
MIDI 2.0 Scope (on MIDI.org) — overview of a developer tool I created for working with MIDI 2.0 channel voice messages.
A C++ Class Factory for JUCE — I needed to be able to instantiate C++ classes at runtime from their name, and didn’t like the approach that most examples use.
An RPC Framework for JUCE — a client needed me to break their existing monolithic application into two separate processes. This post talks about some classes I wrote to handle the communication between those processes.
Developing Audio Applications with JUCE part 1, part 2, part 3. I wrote these while working my way through the JUCE documentation, which at the time was spartan; there weren’t as many excellent tutorials covering this ground as there are now. The code these posts point at has aged badly, both because it targets a very old JUCE version and because it takes some approaches I no longer advocate (don’t use the sample code as a pattern for communicating between audio processors and UI code!). As an overall orientation to “these are the big pieces you need to understand, and this is how they connect to each other if you want to write an audio host application,” it still mostly holds up.