I'm south in Springville (just south of Provo). Where are you guys???
Jimboha
I live just over the hill from you. PM me your address... I'd love to see the creations of others in this group who live close by.
Sent from my iPhone using Tapatalk
Well, my wife and I drove up to Sandy to see this show.
Wow, so how does this work? Is the piano player there every night, or is it played from a recording?
I suspect the light show was completed without the piano player, and that good timing and editing made it look like the piano was actually controlling the lights. Just my skeptical 2 pennies.
OK, I have this partway done, with proof-of-concept code working. I have a patch to fppd which reads from a MIDI input and turns FPP Pixel Overlay models on/off based on the notes coming in on the MIDI port. I tested it with a virtual MIDI device, but it should work the same when I get a chance to test with a USB MIDI adapter after it arrives in the mail from Amazon.
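The core of that patch, as described, is a mapping from MIDI note numbers to overlay models. Here's a minimal sketch of that idea in Python; the names (`NOTE_TO_MODEL`, `handle_midi_event`, the model names) are illustrative assumptions, not FPP's actual API:

```python
# Hypothetical sketch of mapping MIDI notes to Pixel Overlay on/off state.
# Model names and the mapping itself are made up for illustration.
NOTE_TO_MODEL = {
    60: "Arch-1",  # middle C
    62: "Arch-2",
    64: "Arch-3",
}

# Current on/off state of each mapped model.
model_state = {name: False for name in NOTE_TO_MODEL.values()}

def handle_midi_event(status, note, velocity):
    """Turn a model on/off from a raw MIDI channel voice message.

    status: MIDI status byte (0x90-0x9F = note-on, 0x80-0x8F = note-off).
    Per the MIDI spec, a note-on with velocity 0 is treated as note-off.
    """
    model = NOTE_TO_MODEL.get(note)
    if model is None:
        return  # note not mapped to any overlay model
    msg_type = status & 0xF0
    if msg_type == 0x90 and velocity > 0:
        model_state[model] = True   # real code would enable the overlay here
    elif msg_type == 0x80 or (msg_type == 0x90 and velocity == 0):
        model_state[model] = False  # real code would disable it here

# Example: note-on for middle C lights "Arch-1", note-off clears it.
handle_midi_event(0x90, 60, 100)
print(model_state["Arch-1"])  # True
handle_midi_event(0x80, 60, 0)
print(model_state["Arch-1"])  # False
```

In the real patch the state change would call into fppd's overlay code rather than update a dictionary, but the note-lookup-then-toggle flow is the same shape.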
Tonight, I went to our Yamaha YDP-181 and had my son play one of his Christmas recital songs so I could record it. I saved the MIDI recording to a USB stick and copied it to the FPP system and then played the .MID file on the virtual MIDI device using ALSA's 'aplaymidi'. fppd read the data from the virtual MIDI input and I was able to show my son that FPP was making my arches light up to match the notes he had just played on the keyboard. Once I get the USB cable, I should be able to hook the keyboard up sending live data to FPP. Our indoor tree with 200 pixels on it is about 5 feet away from the Yamaha, so I plan on testing the code live there first.
I still need to add some more enhancements and some form of UI. Currently, each note maps to a single Pixel Overlay model, and the code requires pixel models. The pixel colors and brightness are determined by the velocity from the MIDI data. I want to set it up so that you can have multiple Pixel Overlay models per note, and allow single-channel models in addition to pixel models. For pixel models, I'd like to have a few options, such as whether to hard-code a color or intensity, or derive them from the velocity or perhaps the note.
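The velocity-driven color/brightness idea can be sketched like this; both function names are hypothetical, and the note-derived hue is just one possible version of the "determine color by the note" option mentioned above:

```python
import colorsys

def velocity_to_brightness(velocity):
    """Scale MIDI velocity (0-127) linearly to an 8-bit brightness (0-255)."""
    return min(255, round(velocity * 255 / 127))

def note_to_color(note, velocity):
    """Pick an RGB color from the note's pitch class, scaled by velocity.

    Hue comes from note % 12, so octaves of the same note share a color;
    velocity controls overall intensity. This is only one of the schemes
    the post describes (vs. a hard-coded color or intensity).
    """
    hue = (note % 12) / 12.0  # 12 pitch classes spread around the color wheel
    value = velocity_to_brightness(velocity) / 255.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return (round(r * 255), round(g * 255), round(b * 255))

# Example: middle C at full velocity lands on a full-brightness red.
print(note_to_color(60, 127))  # (255, 0, 0)
```

A lookup table per pitch class would work just as well; the HSV conversion is only a compact way to get twelve distinct colors.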
CaptainMurdoch,
I was wondering how your efforts are coming along and if you were able to come up with some form of UI?
I just noticed this thread. It's not really related to what Chris is doing, but we've had this project on the Vixen radar for a while. As Chris mentions, it's a pretty trivial task of mapping inputs to models/elements.
Yeah, everyone is going to have a pianist sitting in front of their house when it's below freezing! It's a great demonstration of the potential of light shows, but practically speaking it seems ridiculous!