Output Module Development

P. Short

Is there a simple hello-world example (or other examples) available for writing output modules?
 
A controller where, among other things, the data is sent in a non-standard order over WiFi. One of the intentions is to minimize the effects of missing (or greatly delayed) packets by having the central node send a certain percentage of the data several seconds before it would be displayed, with the display nodes storing that data. If there were a disruption, the display node would have a certain percentage of the data needed for display stored in memory, and it would output a degraded display (perhaps 10 or 20 frames per second instead of 40). The core idea is to avoid, as much as possible, the situation where the visual display freezes for several seconds during a disruption, without increasing the amount of WiFi traffic.
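
Roughly what I have in mind, as a minimal Python sketch (the class and method names are just illustrative, not from any actual controller firmware):

# Sketch of the degraded-playback idea: the display node keeps a cache of
# frames that the central node sent ahead of time, and falls back to it
# whenever the live frame for the current slot never arrives.

class DisplayNode:
    def __init__(self, frames_per_second=40):
        self.fps = frames_per_second
        self.live = {}    # (second, slot) -> frame data received on time
        self.cache = {}   # (second, slot) -> frame data sent seconds in advance

    def receive(self, second, slot, data, sent_early=False):
        # The central node marks a small percentage of packets as early data
        # for a future second; everything else is live data for "now".
        (self.cache if sent_early else self.live)[(second, slot)] = data

    def frame_for(self, second, slot, last_output):
        # Prefer live data; fall back to pre-sent data; otherwise hold the
        # previous output, which is what produces the degraded 10-20 fps look
        # instead of a multi-second freeze.
        key = (second, slot)
        if key in self.live:
            return self.live[key]
        if key in self.cache:
            return self.cache[key]
        return last_output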
 
I made a tutorial a long time ago for an effect module, back when Vixen 3 was still in beta. I doubt things work the same way now, but if there is nothing else, maybe things didn't change too much.


Unfortunately I never updated it because I decided at the time Vixen 3 wasn't for me. If you do check it out and it is useless, let me know. I'll pull the videos if that's the case.
 
My understanding is that the output controllers are set to a rate, say 20 Hz. Every 50 ms they wake up and sample their channel data from the Vixen engine. I don't think the design lets you get at data before or after the current frame.
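
Roughly, as a generic illustration of that fixed-rate design (get_current_frame and send_to_hardware are stand-ins, not Vixen's actual API):

import time

# The controller only ever sees the frame for "now", so there is no hook
# for fetching future or past data.

def run_output(get_current_frame, send_to_hardware, rate_hz=20):
    period = 1.0 / rate_hz           # 50 ms at 20 Hz
    next_wakeup = time.monotonic()
    while True:
        frame = get_current_frame()  # channel data for the current frame only
        send_to_hardware(frame)
        next_wakeup += period
        time.sleep(max(0.0, next_wakeup - time.monotonic()))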
 
Shoot. I would've liked to make more use of Vixen's native capabilities.

However, all is not lost. I've written Linux programs (that run on native Linux machines, native FreeBSD machines, or under WSL) that either process the .csv export file or peek into Vixen's .tim and SystemConfig.xml files.
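
As a rough sketch of the sort of .csv processing involved (this assumes one row per frame of integer channel values, which may not match the exact layout of a given export):

import csv

def load_frames(path):
    # Returns a list of frames, each frame being a list of channel intensities.
    frames = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            frames.append([int(v) for v in row if v.strip() != ""])
    return frames

# Example usage: frames = load_frames("export.csv")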

I will now retreat into my cave for a while.
 
This is a situation where it would likely be better to write an output for FPP and let Vixen just send standard E1.31/DDP to FPP. Since you are going to want to run your show using FPP anyway, that is where the output capabilities are needed.
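
For illustration, a bare-bones DDP push looks something like this (the header layout follows my reading of the DDP spec, and the IP address is just an example; verify against FPP before relying on it):

import socket
import struct

DDP_PORT = 4048  # standard DDP UDP port

def send_ddp(sock, dest_ip, pixel_data, offset=0):
    header = struct.pack(
        "!BBBBIH",
        0x41,             # version 1 + push flag
        0x00,             # sequence number (0 = not used)
        0x00,             # data type (0 = undefined)
        0x01,             # destination ID 1 = default output device
        offset,           # byte offset into the destination's channel data
        len(pixel_data))  # payload length in bytes
    sock.sendto(header + pixel_data, (dest_ip, DDP_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_ddp(sock, "192.168.1.50", bytes([255, 0, 0] * 250))  # 250 red pixels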
 
Storing that data in advance would need a local SD card, or you would need to add more local RAM (a few GB), which is actually more difficult than it sounds.
 
@MartinMueller2003
I'm not sure that I agree with you.

Suppose the refresh rate is 40 Hz (i.e. 25 ms per update). During each second the pixel output device would need to receive 40 complete output frames' worth of data per port (e.g. 750 bytes per frame for a single-port device with 250 pixels). Suppose that of those 40 frames it receives (in the no-packet-loss scenario) in that second, frame 0 is actually data for the first frame in the next second, frame 8 is actually for the 8th frame two seconds in the future, frame 16 is for the 16th frame three seconds in the future, and so forth, with frame 32 being for the 32nd frame five seconds into the future. This means that the intended output data for those frames was received over the prior seconds and stored in RAM.

The pixel output device has everything that it needs for smooth output in the absence of packet loss. It has received 35/40 of the frames for this second of output in the immediate time frame, and the other 5/40 were received over the prior five seconds. If there were a total blackout for one second, it would have 5/40 of the total number of frames that it would otherwise output spread out over that second, and would need to interpolate for the remaining 35/40 of the output frames; the following second would be missing 1/40 of the frames that would be output. If the blackout lasted two seconds, the first of those seconds would be as before, and the second of those seconds would be missing 36/40 of its output frames (the other four were received over the four seconds prior to the blackout and stored in memory).

The total amount of RAM needed for this scheme is 750 * (5+4+3+2+1) = 11,250 bytes, which is hopefully available on the ESP8266 CPU. The net effect of this scheme is that the pixel output device can continue to output data, albeit at a reduced frame rate, during an entire 5-second blackout. Having more RAM obviously increases the blackout time that can be overcome, or alternatively the quality of output during and after blackouts.
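
Written out as a quick sanity check of that arithmetic (slot numbers follow the 0/8/16/24/32 example above):

# Slots 0, 8, 16, 24 and 32 of each second carry data for 1, 2, 3, 4 and 5
# seconds in the future, so a frame sent N seconds early sits in RAM for
# N seconds.

FRAME_BYTES = 750            # 250 RGB pixels on a single port
LOOKAHEAD = {0: 1, 8: 2, 16: 3, 24: 4, 32: 5}   # slot -> seconds early

buffered_frames = sum(LOOKAHEAD.values())        # 5+4+3+2+1 = 15 frames held
print("RAM needed:", FRAME_BYTES * buffered_frames, "bytes")   # 11250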

Since I have my own home-brew controller designs and Linux-based code for sending data to those controllers, I'll play around with these concepts. As noted above, it's not likely that this will be feasible in the Vixen 3 streaming environment, and it is not needed in the local-storage situation.
 
You misunderstand my intent. WiFi is NOT a real-time medium. My intent is to say that the frame information should be stored in advance on the destination devices and then output as needed, i.e. FSEQ files with sync messages.
 
I'm well aware of your intent; it has been quite clear over many posts over many months, and I'm largely in agreement with it. Streaming is fraught with difficulties, such as those encountered by Paddler in the thread in the 'Pixels and Pixel controllers' forum. However, I think that it's fun to investigate alternative approaches that might go against conventional wisdom.
 