Vixen 3 MIDI input

jchuchla

Supporting Member
We've had various requests over the years to add MIDI support to Vixen. In most cases, the request was very vague. I'd like to get a feel for how you think you'd use such a feature. I have my own preconceived notions, but I want to make sure I'm thinking about all of the use cases as we map out a plan for such a feature.

There are two primary use cases that come to mind. One would be to use existing MIDI files; the other would be to use a live MIDI stream as an input while sequencing. A third use case might be to actually play a show as a live performance with a MIDI keyboard.

A MIDI stream carries several kinds of data. There's the obvious note data, but there's also controller information and program information, as well as machine control commands and extended data. Knowing how it would be used would steer us toward which data we're interested in and how we translate it into Vixen.

So if this is of interest to you, let's get some discussion going to help us plan the roadmap. I think this would be a fairly straightforward project as long as we have a clear plan.
 
I'm an xLights user (only option, since I'm on a Mac). However, based on my heavy usage of a MIDI file in one song, there is one thing I would have liked to see in a sequencer:

Instead of simply turning LEDs on/off in response to the MIDI note, it would be nice to have the ability to control what effect is actually played during the note. For example, in Carol of the Bells, for my icicles, I would have liked the ability to do a "drop" effect when each note is played. In my Let It Go intro, I would like the ability to fade out each note. Currently, I had to create all those effects manually.

https://vimeo.com/user60637531

I think having access to the extended data, beyond just the note data, would also be useful. I would presume that some of this information would correlate to the various notes played by various instruments. This could allow a user to assign different MIDI instruments to different props. For people who are more musically inclined, working with MIDI files might be a better solution for a more complicated/technical sequence.

I really liked using the MIDI file in xLights, but I definitely think there's a lot more potential that could be developed around using a MIDI file (in any sequencer).

As far as having a live MIDI connection: I think that could be a lot of fun as well (thinking along the lines of the piano scene in the movie Big with Tom Hanks: having a keyboard that kids could jump on to play with the lights, or some variation of Dance Dance Revolution).
 
We had some developer conversations about what would map to what. Our thought was that it wouldn't really be practical to derive which effect to use from the MIDI data; there just isn't a natural translation. That being said, I wouldn't want it to be simple on/off either. I think we'd map it to pulses instead, at least initially. This would use the note to determine which element to apply the effect to, and the MIDI velocity to determine the intensity. In Vixen, it's very easy to swap out effects. I think the value of the MIDI integration would really be in the positioning of the effects, not necessarily the selection of them. So I'd envision the MIDI import/recording part being a first pass to get the effects onto the timeline. Then you'd be done with the MIDI part, and you'd go back and swap out the effects with what you really want, and add color and such to it all.
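To make that translation concrete, here's a rough sketch (in Python, not Vixen code) of what "note picks the element, velocity picks the intensity" might look like. The event format, the `note_to_element` map, and the pulse dictionary are all invented for illustration:

```python
# Hypothetical sketch: pairing note-on/note-off events into pulse
# placements. The note selects which element the pulse lands on,
# and the velocity (0-127 in MIDI) becomes the intensity.

def events_to_pulses(events, note_to_element):
    """events: list of (time_ms, kind, note, velocity) tuples, sorted
    by time, where kind is 'on' or 'off'. Returns one pulse per note."""
    open_notes = {}  # note -> (start_time_ms, velocity)
    pulses = []
    for time_ms, kind, note, velocity in events:
        if kind == 'on' and velocity > 0:
            open_notes[note] = (time_ms, velocity)
        elif note in open_notes:  # note-off (or note-on with velocity 0)
            start_ms, vel = open_notes.pop(note)
            pulses.append({
                'element': note_to_element.get(note, 'unmapped'),
                'start_ms': start_ms,
                'length_ms': time_ms - start_ms,
                'intensity': vel / 127.0,  # scale MIDI velocity to 0.0-1.0
            })
    return pulses
```

Swapping a pulse for a chase afterward would then just be a matter of replacing the effect on each placement, which is the "second pass" described above.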

MIDI supports 16 simultaneous "channels" of data. Each channel typically represents an instrument, so we could have a separate map for each channel.

There's also the concept of a program change. In MIDI, this is typically used to tell an electronic instrument to change its sound: in other words, change the piano to a harpsichord, or something to that effect. We could interpret these program change messages as an instruction to change the map. So, for example, instead of mapping notes to your snowflakes, it could change the mapping to your stars.
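As a sketch of that idea (again Python, not Vixen code, with made-up map contents), a program change would just swap which note map subsequent notes are routed through:

```python
# Hypothetical sketch: Program Change messages switch the active
# note map, so the same note lands on different props afterward.

def route_notes(events, maps, initial_program=0):
    """events: list of ('program', number) or ('note', number) tuples,
    in playback order. maps: program number -> {note: element name}.
    Returns a (note, element) pair for every note event."""
    current = maps[initial_program]
    routed = []
    for kind, value in events:
        if kind == 'program':
            current = maps.get(value, current)  # unknown program: keep map
        else:
            routed.append((value, current.get(value, 'unmapped')))
    return routed
```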

There's lots of ideas, we're trying to turn them into a plan.
 
Yeah, I agree that you can't derive which effect to use from the MIDI data. However, I think it would be useful to have options for which effect to apply to each note (i.e., fade/chase over the course of the note, or have the intensity linked to the note velocity). It sounds good to have the positioning of the effects based on MIDI, but allow user selection of the effect that's applied.

As for the MIDI velocity: I think xLights only has the note info. That would be a good example of "additional data" that would be useful to have.

As far as having to go back and swap out the effect for what you really want: that's basically what I had to do, and it's doable, but tedious. Essentially, there are two parts that would be great: 1) use the MIDI file to place the notes (and timing track), and 2) be able to assign which effect is played at each note. Midway through sequencing, I REALLY wanted the ability to go back and edit my MIDI file. However, I knew I couldn't, as I had so many manual effects already placed. So the ability to assign an effect to a note would really be a powerful combo.
 
OK, don't take this as me being argumentative; it's purely constructive. I want to get a handle on the desired workflow.

If you want to assign the effect played at each note, it sounds like it would need to prompt you for an effect decision at every note. Is that really what you're thinking? To me that sounds painful!

Keep in mind that a lot of this is probably fundamentally different because of the different xLights/Vixen workflows. (And I am trying to keep in mind the videos of the sequences you recently posted to make educated guesses about your workflow.)

If you're using note data, I would think that you're working in clusters of the same thing. So if they're piano notes, they'd all be the same effect, just varying in time, intensity, and element target. They probably wouldn't even change color much. Maybe it changes from section to section of the song, but overall, I would think it'd be done in groups. Anyone I know who's currently banging out notes for sequences is doing it with simple set level or pulse effects. So I'm not getting why you'd want to select an effect per note. If you're doing that, you might as well be placing the effects by hand.

Thinking back to your Carol of the Bells video, most of that uses just two different effects. At least, if you were doing it in Vixen, it would only be two different effects: chase and pulse. And in that video you always do whole sections with the same effect. If the notes were all placed by MIDI, you'd literally only be left with a few minutes of work to swap some sections from pulses to chases (or vice versa).

So there must be something I'm not understanding in what you're saying.

Vixen doesn't have a "timing track" concept. I'm not 100% sure of the differences in usage, but the closest analog we have is "marks" and "mark collections". That's how we do beats and bars, but also any other reference points. MIDI does have a concept of tempo and sync. We could probably record that information into a mark collection, though typically the sequencer would be the master of the sync and tempo, so I'm not sure how that would apply. Rarely would a MIDI file match a recorded audio file, so I can't see how there'd be any meaningful correlation between MIDI timing and audio timing (unless, of course, the audio file is generated from the MIDI file itself).
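For what it's worth, the tempo-to-marks part is mechanically simple. A sketch (Python, not Vixen code), assuming a single fixed tempo, which real MIDI files often don't have; tempo changes mid-song are exactly why the audio rarely lines up without manual tweaks:

```python
# Hypothetical sketch: deriving beat mark times from a MIDI
# "Set Tempo" value. Assumes one fixed tempo for the whole file.

def beat_marks(tempo_us_per_beat, beat_count):
    """tempo_us_per_beat: the MIDI Set Tempo payload, in microseconds
    per quarter note (500000 = 120 BPM). Returns one mark time, in
    seconds, per beat."""
    return [beat * tempo_us_per_beat / 1_000_000
            for beat in range(beat_count)]
```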

I'm not looking at xLights at all for how to implement this. I'm only mentioning xLights here so I can understand your point of view and how you did it there.
 
Yeah, I think some of the confusion is the workflow differences between Vixen and xLights, and the fact that I have no concept of Vixen, so what I'm saying may not translate over well.

Using my example in Carol of the Bells (to make things simpler): I assigned each icicle to a specific MIDI note, and that's one single effect on each icicle over the duration of the entire sequence. However, this effect can ONLY turn the icicle on/off based on the note in the MIDI file. What would be beneficial would be the option to define exactly what effect is played at each note. Instead of just on/off (for each icicle) at the appropriate note, I wish I could have had the ability to assign the effect that is always applied to that icicle based on the note: either a drop/chase effect or a fade-to-off effect (based on the duration of the note).

So I wouldn't necessarily want a different effect played at each note, just the ability to select WHICH effect is played (at all the notes). In xLights, I can't simply "swap out" the effect. I had to use the MIDI-applied effect to see where the on/off is, and manually place my drop effect over each note (for all the notes). This was the time-consuming part (and the part that prevented me from being able to go back and make any changes to the original MIDI file).

And true, most MIDI files won't actually match the recorded audio file. However, it was a pretty simple task for me to go into a MIDI editor and make the minor tweaks necessary to align the MIDI and the recorded file. I even added some additional notes that weren't in the original MIDI file, but that I knew I wanted in my sequence. I was basically using a MIDI editor to edit my sequence (and it really was a lot easier than trying to edit in xLights, other than the fact that each note was limited to on/off).
 
BTW, did you ever answer in the other thread which version of carol of the bells that was? (artist, album, name?) I haven't been able to find it.

Oh yes, this would be a completely different workflow. In Vixen, each note would be its own effect. We'd be creating some sort of mapping interface to assign an element or group to each note. (A Vixen group/element = an xLights model/node.) If you wanted a whole icicle to turn on as a whole, you'd use a pulse effect. If you wanted the icicle to drip, you'd use a chase effect. So if you wanted to change all of the "notes" in a whole verse, or a 16-bar section or whatever, you'd just select those effects over that timespan and swap them out for another effect. That's all regular existing Vixen stuff. Only the creation of the effects from MIDI would be new.

Quite honestly, after seeing your sequences, I'm surprised you did that in xLights. Your sequences look more like Vixen sequences than xLights sequences.
 
Carol Of the Bells, George Winston, December (album)

Yeah, for that sequence, I was really wondering how other sequencing software would handle what I was wanting to do. And quite honestly, it wasn't *that* time-consuming to drop in all my effects, using the MIDI effect as a guide. It's just that I couldn't alter the MIDI file once I started (or if I did, I had to be very careful about what I changed).

I'll be curious to see how it gets implemented in Vixen, and to hear what other methods people can conceive of for using a MIDI file, especially if you're able to utilize the additional data in the file.

Right now, I think the xLights piano effect (the only one I've been able to find that can use a MIDI file) is really geared to displaying a piano on a matrix. The way I used it wasn't the way it was intended to be used. Basically, that whole sequence was just an experiment, all the way from making the icicles (economical, waterproof, diffuse), to the wiring schematic (how to simplify a wiring schematic that works for random placement in a tree), to how to adapt a MIDI file to a recorded audio file and get the lights sequenced the way I wanted. And it all only took a week.
 
Aah yes, George Winston. I hadn't thought to search that one.

I would think that we'd see more use from recording live MIDI input rather than reading files. I know people have asked for that. I could see people using a MIDI keyboard or other controller and "playing their effects into the sequence".

On the Vixen side, we try pretty hard to keep our mechanisms as abstract as possible so they can be as flexible as possible to fit many use cases. That's why we like to hear all the ideas before we start working on it.
 
Hi Jon,

This is something I've been interested in since learning about Vixen. I've been keeping an eye on http://bugs.vixenlights.com/browse/VIX-411 and even had a go at messing around with the Live context and a MIDI library in my Git fork. To be honest, I was a little overwhelmed seeing the Vixen source as a hobbyist developer, and I haven't made much progress/had the time recently.

I have a Novation Launchpad Pro (a 10x10 RGB MIDI grid controller) and a Novation Launch Control (16 MIDI encoder knobs plus some buttons) I'd like to use with Vixen. A common feature in many audio applications is 'MIDI learn': a button is clicked (or a global hotkey pressed) to activate learn mode, then a control on the UI is selected, followed by pressing a MIDI note or turning an encoder, and then finally exiting learn mode. The MIDI control is then linked to that UI control (which could be highlighted to indicate it is mapped). There is also a screen to edit/remove mappings.
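The learn flow above can be sketched in a few lines. This is Python pseudocode, not anything from Vixen or the Novation SDK; the class and method names are invented:

```python
# Hypothetical sketch of a 'MIDI learn' binder: while a UI control is
# pending, the next incoming MIDI message gets bound to it; otherwise
# incoming messages are looked up in the existing mappings.

class MidiLearn:
    def __init__(self):
        self.mappings = {}    # (channel, number) -> UI control id
        self.learning = None  # UI control waiting for a MIDI message

    def start_learn(self, ui_control):
        """User clicked 'learn' and selected a UI control."""
        self.learning = ui_control

    def on_midi(self, channel, number):
        """Called for each incoming note or CC message. Returns the
        mapped UI control, or None while learning / when unmapped."""
        if self.learning is not None:
            self.mappings[(channel, number)] = self.learning
            self.learning = None  # exit learn mode after one binding
            return None
        return self.mappings.get((channel, number))
```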

To play live, the way I imagined it could work is a slightly modified version of the sequencer: effect blocks could be drawn on the timeline as normal and then be triggered by a MIDI event (using the learn mode described above). It should be possible to trigger multiple effect blocks at once. Combine that with being able to adjust the effect parameters in real time with other MIDI encoders or notes, and you have a whole lot of flexibility.

To record a live performance, you could write a MIDI file of the incoming events, which could trigger the effects again on subsequent playback. In fact, you could make it work like a loop pedal, where additional MIDI data is merged in each time you run through, tweaking parameters and building up the sequence. After completing a sequence in this manner, you could then render your live sequence to a regular timeline-based Vixen sequence, where further adjustments could be made. I can't see it being easy to go back the other way, but would that really be needed?
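The loop-pedal merging part is straightforward if each pass's events are time-stamped. A sketch (Python, event format invented for illustration):

```python
# Hypothetical sketch of loop-pedal style recording: each pass through
# the song produces a sorted list of timestamped events, and the passes
# are merged into one time-ordered take.
import heapq

def merge_takes(*takes):
    """Each take is a list of (time_ms, event) tuples already sorted
    by time. Returns a single merged, time-ordered list."""
    return list(heapq.merge(*takes, key=lambda ev: ev[0]))
```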

In a separate project of mine, I'm using the RGB lights on the Launchpad to display a preview of what the lights are doing, much like the existing built-in preview window. I plan to have a 'palette mode' whereby I can quickly select a colour from the Launchpad; it could also be used to render gradients/brightness curves/effect previews in Vixen.

Happy to clarify my rambling or help in any way, thanks for the great work you guys do!

Cheers

Jez
 
Jon,

Has any progress been made on this development? I have been searching for a way to speed up sequencing, and this seems to be the most relevant discussion I have found. I can see the MIDI incorporation being very helpful if it allows the user to sequence on the fly while listening to the audio track, save that sequence, and then develop it further with subsequent iterations.

My idea of the workflow is to focus on one or two props at a time and add their effects to the timeline via key presses while listening to the music. I just have a simple Renard SS24 setup, and I feel it is very time-consuming and exhausting to sequence an entire song.

I also think this could be achieved solely through Vixen, without the need for extra hardware, simply by adding the ability to assign a handful of keys on the computer keyboard to specific effects on specific channels, so the user can place them on the timeline the same way they can put markers down in real time. And maybe I'm being greedy, but if the preview window could reflect the key presses, that would be cool too. This could be pretty intuitive and make sequencing a lot more fun!

-Brent
 
We haven't done anything with the MIDI functionality yet. It hasn't bubbled up to the top of the to-do list.

Because we use the keyboard for so many other things already, it wouldn't be very practical to use the keyboard as an input device.
 