It’s been a while since I posted! Earlier in the year, we had a few unexpected expenses come up in our family, so I started spending my spare evenings doing custom freelance programming to help cover them. I’ve been doing this for a few months now, and it has helped us out.
God continues to bring new visitors to this blog and I have been able to return emails, phone calls, Zooms, and help so many people implement the ideas and software that I’ve created here. It is truly a blessing to see how God has used this little blog I started a few years ago.
I’m excited to share a new project that I have been working on with my team: control of our Canon XF cameras through a Stream Deck. We have a couple of these cameras here at my church, the Canon XF705:
I have been mentoring the guys who work part-time in A/V here with me on how to write code, and specifically how to write modules for the Companion project that we use so heavily here. We decided it would be great to have control of these particular cameras at our shader station, alongside the shader controls for our Marshall cameras (I wrote about that here) and our broadcast cameras.
These Canon cameras come with a LAN port (you can also use Wi-Fi) and run a small web server called the Browser Remote, which gives you full control of all the camera functions, from focus/zoom/iris/gain all the way to recording, white balance, and shutter control. If there’s a button on the camera, chances are you can control it from the Browser Remote. You can even see a live preview of the camera!
The built-in Browser Remote functions of the Canon XF series.
So we started doing some digging and realized that there is an internal API on the camera that returns a lot of the data as simple JSON. Once you initiate a login request to the camera, it returns an authentication token, which must be sent along with every subsequent request.
For feedbacks on the camera state, we simply poll the camera every second or so. The Browser Remote page itself seems to do this as well, so we just emulated that.
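The login-then-poll pattern is simple enough to sketch. This is a hedged illustration, not the module’s actual code: the /login and /status paths, the token header name, and the JSON field names are placeholders for whatever the camera’s internal API really uses (you can discover the real ones by watching the Browser Remote’s requests in your browser’s dev tools).

```javascript
// Sketch of the login-then-poll pattern. Endpoint paths, the header
// name, and JSON fields are hypothetical placeholders, not Canon's real API.

// Attach the auth token returned at login to every subsequent request.
function authHeaders(token) {
  return { 'X-Auth-Token': token }; // placeholder header name
}

async function run(cameraHost) {
  // 1. Initiate a login request; the camera answers with a token.
  const loginRes = await fetch(`http://${cameraHost}/login`, { method: 'POST' });
  const { token } = await loginRes.json();

  // 2. Poll the camera state every second, like the Browser Remote page does.
  setInterval(async () => {
    const res = await fetch(`http://${cameraHost}/status`, {
      headers: authHeaders(token),
    });
    const state = await res.json(); // e.g. iris, gain, recording status
    console.log(state);
  }, 1000);
}

// run('192.168.1.50'); // uncomment with your camera's IP
```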
The Browser Remote unfortunately only allows one user to be logged in at a time, so while our Companion module is in use, the actual Browser Remote page can’t be used. For our purposes, that’s not really an issue, since we just want button control of the iris/gain functions when we use these cameras during live services. Now I don’t have to ask my operators to iris up or down; I can do it right from the Stream Deck!
Here’s a little walkthrough video that shows the module in action:
The module will soon be a part of the Companion beta builds, so if you have a Canon XF series camera, go check it out!
At my church, we have 4 of these Marshall CV503 cameras:
Marshall CV503 Miniature Camera
We use them during services to capture shots of the instruments (drums, keys, etc.) and whatever is happening on stage. They are great little action-style cameras, and they have SDI out on them so they are super easy to integrate into our video system.
They have a lot of adjustment options via a joystick-style controller at the camera, but obviously that’s challenging to use during a service if we need to adjust a camera’s exposure. The menu is an OSD that shows up on the live output. Plus, the cameras are all over the stage, and we can’t walk out there during the service!
While I wish they were IP-controllable directly, this particular model does not have that option. They do, however, come with RS-485 serial connectors.
So we decided to create a remote shading system using a stream deck running Bitfocus Companion. The Marshall cameras support the VISCA protocol over RS-485. In fact, if you’re a Windows user, Marshall provides free software to control the cameras over RS-485.
Marshall provides this program for control if you have Windows and want to connect your cameras directly to that computer.
We don’t use a lot of Windows computers around here, and that program requires the computer running it to be physically connected to the cameras via serial. That’s not ideal for us, because the cameras are on a stage and our computers typically are not. Marshall also makes a nice hardware RCP, but we didn’t want to pay for that.
So we did what you probably already guessed: we put in a Raspberry Pi with a USB-to-RS-485 adapter that we could control remotely.
We have several wallplates across the stage with network tie lines that feed back to a patchbay in the rack room. So we made cables that connect to the RS-485 ports at each camera and run back to an RJ45 port on a wall plate. We utilized the blue/white-blue pair of the CAT6 cable. We used that pair because those are data pins in a normal network connection, which means that if someone ever accidentally connected the line straight to a switch or something, there would not be any unintended voltage hitting the cameras.
Each camera is set to its own camera ID (1-4), and the matching baud rate of 9600 (the default). Then in the rack room, we made a custom loom to take the 4 connections and bring them into a jack, which then feeds into the USB to RS-485 adapter on the Pi.
The Pi is a Pi 4 with 4GB of RAM. Honestly, for what this thing is doing, we probably could have run it off a Pi Zero, but I wanted it hardwired to my network, and the bigger Pis come with built-in Ethernet ports.
DSD TECH SH-U10 USB to RS485 Converter with CP2102 Chip
When connected, it shows up as serial port /dev/ttyUSB0. We originally planned to use the socat program in Linux to listen for UDP traffic coming from Companion.
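The socat invocation we tried looked something like this (the port number here is arbitrary, and the device path and baud rate should match your adapter and cameras):

```shell
# Listen for UDP on port 52381 and copy each datagram's raw bytes
# straight to the RS-485 adapter's serial port (9600 baud, raw mode).
socat -u UDP-LISTEN:52381 FILE:/dev/ttyUSB0,b9600,raw
```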
To actually send the UDP data, we’re using the Sony VISCA module already built into Companion. The Marshall cameras use the same protocol over RS-485.
Using the socat method, we quickly found that it would only listen to UDP traffic coming from one instance of the module. We needed 4 instances of the Companion module, because we have 4 cameras, each with a different camera ID.
However, that’s nothing a small Node.js program can’t solve. So I wrote a program that opens the specified UDP port, opens the specified serial port, and sends any data received on that UDP port straight to the serial port. You just configure a new instance in Companion for each camera, with the IP of the Pi running the udp-to-serial program and the camera ID that you configured at the Marshall camera.
Here’s a video that shows it all in action:
If you want to try this out for yourself, I’ve made the udp-to-serial repository available here:
If you’ve not heard of Proclaim, it is presentation software similar in concept to ProPresenter. I’m not a user, but several people have written in asking how they could control it with a stream deck, so I thought I would share a quick post on how to do it with Companion and the midi-relay software.
This walkthrough will be on the Mac platform.
First, on the Proclaim computer, open the application “Audio MIDI Setup”.
The Audio MIDI Setup program.
Now, in the Audio MIDI Setup application, go to Window > Show MIDI Studio.
Double-click the IAC Driver.
Make sure the “Device is online” checkbox is checked, and click Apply.
Now that the IAC Driver is enabled, you need to download midi-relay on the Proclaim computer. You can get it here: https://github.com/josephdadams/midi-relay. It is up to you whether you run it directly within Node or use a compiled binary; the results are the same.
Once midi-relay is running, you’ll see the terminal output window showing the available MIDI ports.
You can see the IAC Driver Bus 1 listed here.
Now open Companion. It can be running on the same Proclaim computer, or another computer on the same network. In the Web GUI, create a new instance of the midi-relay module.
Search for “midi” in the search bar and the “Tech Ministry MIDI Relay” module should show up.
In the configuration tab, type in the IP address of the computer running midi-relay. If the same computer is running Companion as Proclaim (and midi-relay), you can type in 127.0.0.1.
Now create a new button with a midi-relay action. Choose “IAC Driver Bus 1” for the MIDI port, and set the other MIDI values as you like. Proclaim will detect them in the next step, so the channel, note, and velocity are not too important, as long as the note is unique for each action you want to take (previous slide, next slide, etc.).
Now in Proclaim, go to Settings, and click the MIDI Input tab. Click “Add Command”.
Select the command you want to be able to control from Companion. Here, I’ve chosen “Previous Slide”.
There are a lot of options you can control within Proclaim!
Once you select a command, Proclaim will start listening for the MIDI message.
Now go back to the Companion GUI and click “Test Actions” on your button.
Proclaim will detect the MIDI message and apply it to the command.
Repeat this for all the commands you want to control from your streamdeck with Companion and midi-relay.
That’s it! I hope that is helpful! As always, if you need some help along the way, don’t hesitate to reach out to me. If this post or others have helped you, take a minute and learn more about me.
I have had a few people ask if I could post another, more detailed walkthrough on setting up midi-relay to control Chroma-Q Vista (formerly owned by Jands) with their stream decks.
You will need Bitfocus Companion installed and running on a computer or device (it can be the same computer running Vista, or another computer on the network).
To set it all up:
First, you will need to set up the loop-back MIDI port. Open Audio MIDI Setup. It’s in Applications > Utilities.
In the Audio MIDI Setup window, choose Window from the top menu, then Show MIDI Studio.
This opens the MIDI Studio window. You will see a few options here such as Bluetooth, IAC Driver, and Network. Depending on how you may have configured MIDI ports in the past, the number of devices here can vary.
Double-click the IAC Driver device. This opens the Properties window. The main thing you need to do is check the “Device is online” checkbox (if it’s not already checked). You may also want to change the device name to Vista.
You can close out all of the Audio MIDI Setup windows now.
Now you need to start midi-relay. Open a Terminal window and change directory to the folder containing the midi-relay executable. I put mine in a subfolder of my Documents folder. It’s important to run the executable with the Terminal’s working directory set to the folder the executable is in, or things may not work correctly. Once you’ve changed to the correct folder, you can drag the executable from Finder into the Terminal window, or type its name manually. Hit Enter to run it.
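In the Terminal, the whole step amounts to something like this (the folder and binary name here are just examples; use whatever you named your download):

```shell
# Change to the folder that holds the midi-relay executable, then run it.
cd ~/Documents/midi-relay
./midi-relay-mac
```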
When midi-relay starts up, it will give you a read-out in the console of all the available MIDI in/out ports. You should now have one that says Vista Bus 1.
Open Vista. Go to the User Preferences menu by selecting File > User Preferences.
Go to the MIDI tab.
Under the MIDI Show Control section, set the Device ID to 0 (zero).
Under the External MIDI Ports section, check the box next to the Vista Bus 1 MIDI port.
Click OK.
In Vista, right click on the cue list you want to use with MIDI control, and choose Properties.
Go to the MIDI tab.
Now open the Companion Web GUI on the computer that is running Companion.
Add a new instance by searching for Tech Ministry MIDI Relay.
In the instance configuration, type in the IP address of the computer running Vista and midi-relay. If you’re running Companion on the same computer, you can use IP address 127.0.0.1.
Click Apply Changes.
To send a MIDI Note On and advance a cuelist:
Add a new button in Companion.
Add a new action to that button, using the midi-relay action, Send Note On.
Under the options for this action, choose the Vista Bus 1 for the MIDI port.
By default, it will send channel 0, note A0 (21), with a velocity of 100. Vista does not look for a specific velocity value, only the channel and note. Vista will listen on any channel by default, but if you set a specific channel in the Vista MIDI settings, you will need to make sure you send the matching channel from Companion.
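For reference, a MIDI Note On message is just three bytes on the wire: a status byte of 0x90 plus the channel number, then the note number, then the velocity. A quick sketch:

```javascript
// Build the three raw bytes of a MIDI Note On message.
// status = 0x90 | channel (channels 0-15); note and velocity are 0-127.
function noteOn(channel, note, velocity) {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

// Companion's default for this action: channel 0, note A0 (21), velocity 100.
console.log(noteOn(0, 21, 100)); // [ 144, 21, 100 ]
```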
Go back to Vista and, in the Cuelist Properties MIDI tab, click Learn next to the Play item. The Play command is what advances a cuelist. The Learn function listens for incoming MIDI notes and makes setting the MIDI note slightly easier (and it proves that everything works). You can also just set the note manually if you want.
Go back to Companion and click Test Actions (or press the physical button on your stream deck if you’re using one). The Learn box in Vista will go away, and you’ll see that the note you sent from Companion is now populated in the Vista settings.
Now every time you press that button in Companion, it will advance that cuelist. If you have multiple cuelists, you will need to use different MIDI note values.
To send a MIDI Show Control message to go to a specific cue in a cuelist:
Add a new button in Companion.
Add a new action to that button, using the midi-relay action, Send MSC Command.
Choose Vista Bus 1 for the MIDI port.
The default Device ID is 0 (zero) but if you changed that in Vista, make sure it matches here.
The Command Format should be Lighting – General and the Command should be Go.
The Cue field should be the specific Cue Number in Vista of the Cuelist you want to control.
The Cue List field should be the specific Cuelist Number in Vista.
Now every time you press that button in Companion, it will go to that specific cue in that specific cuelist.
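Under the hood, that MSC Go action is a short SysEx message. This sketch shows the byte layout as I understand the MSC spec (Lighting – General is command format 0x01, Go is command 0x01, and the cue and cuelist numbers are sent as ASCII digits separated by a 0x00 delimiter); treat it as an illustration rather than a reference implementation:

```javascript
// Build a MIDI Show Control "Go" SysEx message.
// Layout: F0 7F <deviceId> 02 <commandFormat> <command> <data> F7
function mscGo(deviceId, cue, cueList) {
  const ascii = (s) => [...s].map((c) => c.charCodeAt(0));
  return [
    0xf0, 0x7f,        // SysEx start, universal real-time
    deviceId & 0x7f,   // device ID (0 in our Vista setup)
    0x02,              // sub-ID: MIDI Show Control
    0x01,              // command format: Lighting - General
    0x01,              // command: Go
    ...ascii(cue),     // cue number as ASCII digits
    0x00,              // delimiter
    ...ascii(cueList), // cuelist number as ASCII digits
    0xf7,              // SysEx end
  ];
}

console.log(mscGo(0, '1', '2')); // [ 240, 127, 0, 2, 1, 1, 49, 0, 50, 247 ]
```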
Here’s a walkthrough video of these steps:
[wpvideo HZriRGlS]
I hope this is helpful! If you’re using MIDI relay, feel free to drop a comment and share how it is working for you!
A few months ago, I picked up this nifty device called a Stream Deck made by Elgato Gaming. It’s a 15-button USB keyboard with LCD buttons. It’s primarily marketed towards gamers who live stream so they can have quick access to commands and functions as they stream. The programmer in me couldn’t resist trying it out to help us with our production setup.
The Stream Deck sells for about $140.
Using the base software provided, I was fairly quickly able to implement a workflow that gives volunteers easy access to buttons that fire commands on our Ross Dashboard production control ecosystem. If you’ve not used Dashboard before, you can read about how we use it at my church here. It’s fairly easy to set up a custom panel in Dashboard that runs an HTTP web server on a specific port, which in turn allows you to “click” a button on the panel by calling that button’s trigger ID remotely via a specific URL.
Using the “URL” method provided in the base software, we are able to make web calls to the Dashboard custom panels to fire the commands. All the logic/code remains in Dashboard, and this just becomes a method of executing those commands remotely via an HTTP request.
Here is a screenshot of the base software provided by Elgato. It’s very functional as is.
We used the base software for a few months without issue, but quickly realized the limitation of not having bi-directional communication between our Dashboard production control and the individual Stream Decks. For example, several of our commands act as “toggles,” meaning a few different states can represent the current status of a device. If only one person were making changes or operating the system, it wouldn’t be a huge issue; that person would hopefully remember which button they pressed last. But when there are a lot of moving parts and multiple people controlling systems, the ability to update status on all devices becomes very helpful.
Enter Node.js. People smarter than me took the time to write a base Node.js library to control the Stream Deck. I hadn’t written in Node.js before, but being a programmer, I was ready to learn something new. I downloaded and installed the necessary libraries, an IDE, etc., and quickly whipped up some code using the base library to control our Stream Decks. In just a few hours, I had something operational running from the command line. I then spent a couple of weeks refining it, and now we have a fully functional, self-contained app that runs on Mac, Windows, or Linux. It’s packaged using Electron, which is freely available.
This was the quick icon I whipped up for the software.
My controller software uses a base JSON file that defines the button structure of the Stream Deck. This makes it very flexible and expandable as our needs grow, since I can just modify the JSON file to change the button structure.
Here’s an example screenshot of one of my button files, written in JavaScript Object Notation. This allows me to add or remove buttons very easily and also programmatically.
The software then parses that JSON and builds the buttons on the Stream Deck in real time. If a button has a trigger action assigned, the command is sent to the corresponding device. I’ve written support for several protocols, including Dashboard web calls, RossTalk (good for sending messages to your Ross equipment), OSC, VideoHub routing, and more. You can even do internal things like jumping from one button page to another or changing button images during actions. Each button can support an unlimited number of actions, which I call triggers.
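To give a feel for the structure, a button file might look something like this (the field names here are illustrative, sketching the idea rather than my software’s exact schema):

```json
[
  {
    "page": 1,
    "position": 0,
    "label": "CAM 1",
    "image": "icons/cam1.png",
    "triggers": [
      { "type": "dashboard", "device": "productionControl", "triggerId": "btnCam1" },
      { "type": "internal", "action": "setImage", "image": "icons/cam1-active.png" }
    ]
  }
]
```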
The app runs completely in the tray with a simple context menu.
It also supports defining a set of devices, so if there’s a device you want to send messages to often, you can define the device in a separate file along with its host, port, type, etc. and then only refer to that device in the button structure. That way, if any of those related variables change, you only have to change it in one place.
The software also runs a basic TCP listener on a specific port, and this is where the bi-directional communication comes into play. Any time a command runs on the master Dashboard custom panel production control, it relays a message to the remote Stream Deck via the TCP listener, which updates the state of the button.
The settings menu allows you to choose the button/device files you want to use as well as whether the TCP listener service and notifications are turned on.
A sample notification that can appear when a button is pressed. You can determine which triggers send notifications.
This means that we can run commands from any originating location, whether it is the web-based production control (that I’m still developing), one of the remote Dashboard panels that connects to Production Control, one of the Stream Decks (we currently have 2 of them, one in each control room), or even the Master Control panel and every device will receive an updated status.
I also added a “Virtual Deck” option, which allows you to operate the software with or without having a physical Stream Deck attached. You can also choose to have the Virtual Deck operate independently of your physical Stream Deck, so it’s like having two decks in one!
Here is a screenshot of the Virtual Deck in action.
Here is what some buttons that have been toggled look like on the Virtual Deck. It’s very clear to see the current state of those buttons!
I am making this software freely available to anyone who can benefit from it. My hope is that the local church can make use of this to allow their volunteers to more easily operate tech equipment during services.
I’ve only built a Mac binary, but you can easily package it for Windows or Linux if needed.
I am working on an Editor function right now that will let you add and edit buttons without having to write them in JSON, but until then, you’ll have to write the JSON yourself. Here’s a good tutorial on JSON if you need help: https://www.codecademy.com/courses/javascript-beginner-en-xTAfX/0/1
If I can help you out along the way, don’t hesitate to reach out!