ScreenDeck v2.0 is here! Multiple decks, hotkey support, and more

I published my first version of ScreenDeck at the end of last year and now I am excited to release version 2.0!

Here’s what’s new!

Multiple Decks

You’re no longer stuck with just one screen deck. You can launch as many as you want, each with its own layout and size. You can make them read-only by disabling button presses.

Profiles

Now you can save and instantly switch between different layouts. Whether you’re controlling slides on Sunday or streaming on Wednesday, just create, save, and then pick the profile you need.

X/Y Button Mapping

Instead of “keys per row” and “total keys,” everything is now based on columns and rows.

Hotkey Support

You can now assign global hotkeys to any button on any ScreenDeck. Press the assigned key combo on your keyboard and trigger any action, even when the deck is hidden!

Background Customization

Each deck can have its own background color and opacity.

Button or Encoder Mode

Turn any button into an encoder-style dial by right-clicking on that key. Great for volume control, brightness, or cycling through options!

Window Memory

Decks will remember their position, size, and settings—no need to rearrange every time you start the app.


Download it now from the GitHub Releases page.

Need a custom Companion module or app? Hit me up!

gamepad-io: Using game controllers, the Web Gamepad API, Electron, socket.io, and a custom module to use any game pad or controller as a Satellite surface for Companion

I started on this over six months ago, and I'm excited to finally release it! Last year, I was approached by a programming client who had a simple ask: “Can I use an Xbox controller with Companion?” This of course got my wheels turning, for a few reasons. One, it’s cool. Two, it hadn’t been done before. Three, game controllers are inexpensive, and that excited me for anyone trying to do more while spending less, like so many of us in church tech ministry.

My first response to this client was: “I can make this – but I want to release it as an open source project for the community. You’ll effectively help to sponsor the creation of a new resource for everyone, while gaining the custom solution you need.”

They agreed, and I got started. I first created a small Electron app that runs a renderer process with a Chromium window that accesses the Web Gamepad API. Any game controller connected to the computer (whether wired or wireless) will show up once a button on it is pressed.

All of this controller data is tracked and stored, and then sent via socket.io to any connected client. The app itself could be used to send this controller data to anything. In my case, I made a Companion module that listens to this socket and then the module itself emulates a Companion surface.
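At its core, the renderer side can be sketched like this. This is a simplified sketch, not gamepad-io's actual source: the socket event name and the shape of the state object are my assumptions for illustration.

```javascript
// Sketch: sample the Web Gamepad API each animation frame and forward
// controller state over a socket.io connection. Event name and payload
// shape are illustrative assumptions, not gamepad-io's real protocol.

// Reduce a Gamepad object to a plain, serializable snapshot.
function serializePad(pad) {
  return {
    index: pad.index,
    id: pad.id,
    buttons: pad.buttons.map((b) => ({ pressed: b.pressed, value: b.value })),
    axes: Array.from(pad.axes),
  };
}

// In the renderer: the Gamepad API has no button events, so state must
// be polled. A controller only appears after a button press, so start
// polling on the 'gamepadconnected' event.
function startPolling(socket) {
  function tick() {
    for (const pad of navigator.getGamepads()) {
      if (pad) socket.emit('gamepad_state', serializePad(pad)); // assumed event name
    }
    requestAnimationFrame(tick);
  }
  window.addEventListener('gamepadconnected', tick);
}
```

Because the snapshot is plain JSON, any socket.io client (not just the Companion module) can consume it.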

My client needed a lot of variables and configuration options for what they were doing. Every button and axis movement is stored as a variable in the module. And, if you enable the option to use as a surface, you can press controller buttons and it will then in turn press the assigned button in Companion! I’ve been stress testing this with my client for a few months now and am super pleased with how well it works.

Here’s a video of it in action:

The module is in the Companion betas now, and you can download the app for free from my GitHub repository: http://github.com/josephdadams/gamepad-io/releases

It’s open source, so if you see something that could be better, submit a pull request!

Using Ross Dashboard and the Companion Satellite API to create a virtual touch surface on a Ross Video Ultritouch

My church, Fellowship Greenville, has been building a second campus now for a little over a year. It’s been an exciting process. The new auditorium will feature a control room much like what we have at our existing campus.

One of the newer pieces of equipment we are putting in is a Ross Video UltriTouch HR. It’s essentially a 2RU touch-screen computer running Ross Dashboard. (I’ve written about Ross Dashboard before if you’d like to read more.) Dashboard is a very flexible program that lets you build very custom interfaces to control your gear. We used it heavily until I started investing a lot of time in Companion.

Once I knew we were getting one of these, I knew right away that I wanted to be able to use it as a satellite surface for Companion. Taking what I learned from my ScreenDeck project, and my OGScript knowledge (Ross’s flavor of Java/JavaScript that powers the custom panels in Dashboard), I was able to make this:

It was pretty easy to get simple buttons with text on them, and to get the colors of the buttons to match Companion button colors. But I wanted the buttons to look like Companion buttons, and that took some work. Dashboard doesn’t have any image-editing libraries that I’m aware of, so I had to get creative. The image data coming from Companion is base64-encoded 8-bit RGB. I reached out to the Ross staff on their forums, and they quickly got back to me with a helpful decoder function. It was similar to the one I had already written to decode the base64-encoded text data that comes from the Companion Satellite API.

Once I was able to decode it back to the binary RGB data, it was “simply” a matter of writing a function that saves these as bitmap files in a folder local to the panel and then changing the style of the button to show the new bitmap image.

And there we have it! I’m looking forward to using this on our UltriTouch as well as the TouchDrive touch screen.

The panel supports turning the bitmaps on/off, setting the button size, total keys, keys per row, and of course the IP/port to Companion. The satellite port is changeable on the Dashboard side but is currently fixed in Companion to 16622.

If you’re a Ross Dashboard user and want to tinker with the panel, I’ve made it available via GitHub on my RossDashboardPanels repository, where I have shared some other panels as well.

If you ever need any custom Dashboard panels created (or Companion modules!), I do this for hire on the side to support my family. You can reach out to me via my website, josephadams.dev.

Streamlining Electron App Development with AI: Building a Virtual Stream Deck for Bitfocus Companion using the Satellite API

On the side from my full-time job in ministry, I do coding work for hire. It’s one of the ways I provide for my family. I’ve had opportunities to create custom dashboard panels, modules for Bitfocus Companion, and lots of other bespoke solutions for whatever people need. (Hit me up if you ever need anything!)

One of the tools in my tool belt that I use regularly when coding is GitHub Copilot. It’s $10 a month and saves me so much time. Never heard of it?

GitHub Copilot is an AI-powered coding assistant developed by GitHub and OpenAI, designed to help developers write code faster and more efficiently. Integrated directly into popular code editors like Visual Studio Code, Copilot suggests code snippets, functions, and even entire blocks of code in real time as you type, based on the context of your project. It supports multiple programming languages and leverages a vast amount of open-source code to provide relevant suggestions, making it a valuable tool for both beginners and experienced developers looking to speed up their workflow, reduce errors, and explore new coding approaches.

It seriously saves me a lot of time by suggesting approaches and workflows I might never have thought of, without steering me toward things I wouldn’t have done myself. After using it for a year and a half, I have it trained well on the way I like to code.

Recently, I also signed up for OpenAI’s ChatGPT Plus plan. It’s $20 a month. I may not keep subscribing long term, but I’m trying it out. It gives me access to GPT-4o, DALL-E, and all of their other tools. I used it to help me decipher a protocol for some paid work I was doing, and it saved me time. These tools are not at a point where I can just hand them the whole job and get a perfect response. But guiding them through the process in steps? I can get helpful responses that way.

After I was done with my protocol project, I simply asked ChatGPT, “give me a boilerplate typescript Electron app using my example”. I’ve shared several of my Electron apps before. It’s my preferred framework for cross-platform apps (meaning they can run on macOS, Windows, and Linux desktops). I wanted to see if I could guide ChatGPT through the process of giving me a new template to help take some projects further and implement standards and practices that I might not be aware of.

One particular project I’ve wanted to work on for a while now is something I’m calling ScreenDeck. It’s essentially a screen-based stream deck for Bitfocus Companion that uses the built-in Satellite API to create virtual surfaces.

Every good project needs a logo, right?

I know the browser based emulator exists, but I wanted something that ran a little more “native looking” on the OS and could always sit on top of other windows so it’s immediately accessible.

I had started on it over a year ago, but the small nuances and things to code just felt overwhelming to implement in my “spare time”. However, together with my AI tools, I was able to quickly craft a new boilerplate template and apply it to the ScreenDeck project I had started a long time ago, and come up with a working solution in just a few days. It was a lot of back and forth with the chat, prompting it to craft more and more refined responses.

Like many of my other projects, I’m releasing ScreenDeck open source with the hopes that it will help the community – especially churches.

Here’s a simple 4 button, 1 per row, deck.
The settings allow you to configure how it looks, how many buttons, whether it’s always on top, etc. You can even change the bitmap size to create HUGE buttons!
Here’s a standard “Stream Deck XL” layout.
Some of the context menu options.
Because it uses the Satellite API in Companion, it shows up as a physical surface in Companion!
Because Companion sees it as a surface, this means you can do anything with it that you’d do to any physical surface.

You can download it here: http://github.com/josephdadams/screendeck

It’s available for macOS, Windows, and Linux desktops!

Here’s a video showing it in action!

Using midi-relay to send MIDI feedback to Companion for control

I was recently hired to add a new feature to my midi-relay software that allows you to capture MIDI data coming in on any MIDI port and send it back to Bitfocus Companion, where you can store it as a variable or even use the MIDI data as a satellite surface to control buttons directly.

Here’s a video walkthrough:

This should enable a lot of fun things with MIDI and automation!

You can download the software from my GitHub page: https://github.com/josephdadams/midi-relay/releases/tag/v3.2.0

It’s free so go check it out. If the things I create are helpful to you, a donation to my family is always appreciated: https://techministry.blog/about/

Notify production team members remotely using open source software and low cost USB busy lights

At my church, we have a couple of these:

They’re great. Expensive, but they work well.

The problem for us is that anytime anyone presses the Call light on the intercom party line, every flasher on that party line lights up. This means we can really only have one unique flasher per line.

Sometimes, we want or need to get a specific person/position’s attention.

I created some software to help with this. It’s called beacon.

It’s a small app that runs in the system tray and hosts a network API so you can signal a USB busy light, such as the Luxafor Flag or ThingM blink(1). Or, if you don’t have or don’t want a physical signal light, you can use an on-screen dot instead.

I’ve designed this to work in tandem with a custom module for Bitfocus Companion, but since it does have a full API, you can implement any third-party integrations that you like. All of the documentation is on the GitHub repository: https://github.com/josephdadams/beacon

You can set a beacon to stay a solid color, fade to a new color, flash a color, and more. You can send custom notifications to the user’s window as well as play tones and sounds.
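As a sketch of what a third-party integration might look like, here is a minimal Node.js client. The port, endpoint path, and JSON field names below are my assumptions for illustration only, not beacon's documented API; check the repository docs for the real routes.

```javascript
// Hypothetical beacon client sketch. Port, path, and payload fields are
// illustrative assumptions; consult the beacon repository for the
// actual API.

// Build the JSON body for a "flash" request.
function buildFlashPayload(color, rateMs = 500) {
  return { type: 'flash', color, rate: rateMs };
}

// Send it to a beacon instance on the network (Node 18+ global fetch).
async function flashBeacon(host, color) {
  await fetch(`http://${host}:8080/api/beacon`, { // assumed port/path
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildFlashPayload(color)),
  });
}
```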

Here’s a video of the project in action to show you how you can use it:

Go check it out today!

https://github.com/josephdadams/beacon

Controlling Planning Center LIVE with a stream deck, with timers and other variables

If you use Companion and are in tech ministry, you have probably used my PCO Services LIVE module. While in the process of converting this module to the new API we are using for Companion 3.0, I gave it an overhaul and added lots of new features!

Here is a video that shows how it works:

Go check it out for yourself!

Controlling a Canon XF series camera using a stream deck and Companion by reverse-engineering the Canon Browser Remote

It’s been a while since I posted! Earlier in the year, we had a few unexpected expenses come up in our family. I started spending my spare time in the evenings doing custom freelance programming to help meet those needs. I have been doing this for a few months now, which has helped us out.

God continues to bring new visitors to this blog and I have been able to return emails, phone calls, Zooms, and help so many people implement the ideas and software that I’ve created here. It is truly a blessing to see how God has used this little blog I started a few years ago.

I’m excited to share a new project that I have been working on with my team: Control of our Canon XF cameras through a stream deck. We have a couple of these cameras here at my church, the Canon XF 705 series:

I have been mentoring the guys who work part time in A/V here with me on how to write code and specifically code modules for the Companion project that we use so heavily here. We decided it would be great if we had control of these particular cameras at our shader station alongside the shader control of our Marshall cameras (I wrote about that here) and our broadcast cameras.

These Canon cameras have a LAN port (you can also use Wi-Fi) and run a little web server called the Browser Remote, which gives you full control of all the camera functions, from focus/zoom/iris/gain all the way to recording, white balance, and shutter control. If there’s a button on the camera, chances are you can control it from the Browser Remote. You can even see a live preview of the camera!

The built-in Browser Remote functions of the Canon XF series.

So we started doing some digging, and realized that there is an internal API on the camera that returns a lot of the data in simple JSON sets. Once you initiate a login request to the camera, it returns an authentication token, which must be sent along with every future request.

For feedbacks on the camera state, we simply poll the camera every second or so. The browser remote page itself seems to do this as well, so we just emulated that.
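The login-then-poll flow can be sketched like this in Node.js. The endpoint paths, header, and JSON field names are placeholders I made up for illustration; the real routes were reverse-engineered from the Browser Remote and live in the module source.

```javascript
// Sketch of the auth + polling flow. Paths and field names below are
// illustrative placeholders, not the camera's real API.

// Compare two polled state snapshots so only changed values trigger
// feedback updates in Companion.
function changedFields(prev, next) {
  const changed = {};
  for (const key of Object.keys(next)) {
    if (prev[key] !== next[key]) changed[key] = next[key];
  }
  return changed;
}

async function connectAndPoll(cameraIp, onChange) {
  // 1. Log in once; the camera hands back an auth token that must
  //    accompany every subsequent request.
  const login = await fetch(`http://${cameraIp}/api/login`); // assumed path
  const { token } = await login.json();                      // assumed field

  // 2. Poll roughly once a second, like the Browser Remote page does.
  let last = {};
  setInterval(async () => {
    const res = await fetch(`http://${cameraIp}/api/status`, { // assumed path
      headers: { Authorization: token },
    });
    const state = await res.json();
    onChange(changedFields(last, state)); // e.g. only the iris value moved
    last = state;
  }, 1000);
}
```

Diffing the polled state keeps the module from re-rendering every feedback on every poll cycle.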

The browser remote unfortunately only allows one user at a time to be logged in, so when our Companion module is in use, the actual browser remote page can’t be used. But for our purposes, that’s not really an issue since we just want to have button control of the iris/gain functions when we use these cameras during live services. Now I don’t have to ask my operators to iris up or down, I can just do it right from the stream deck!

Here’s a little walkthrough video that shows the module in action:

The module will soon be a part of the Companion beta builds, so if you have a Canon XF series camera, go check it out!

Using a stream deck and a Raspberry Pi to create a remote control panel to adjust Marshall cameras over IP with RS-485 control

At my church, we have 4 of these cameras: Marshall CV503

Marshall CV503 Miniature Camera

We use them during services to capture shots of the instruments (drums, keys, etc.) and whatever is happening on stage. They are great little action-style cameras, and they have SDI out on them so they are super easy to integrate into our video system.

They have a lot of adjustment options to them via a local joystick-style controller at the camera, but obviously, that’s challenging to use during a service if we needed to adjust the camera’s exposure. The menu is OSD and shows up on the live output. Plus they’re all over the stage and we can’t walk there during the service!

While I wish they were IP-controllable directly, this particular model does not have that option. They do, however, come with RS-485 serial connectors.

So we decided to create a remote shading system using a stream deck running Bitfocus Companion. The Marshall cameras support the VISCA protocol over RS-485. In fact, if you’re a Windows user, Marshall provides free software to control the cameras over RS-485.

Marshall provides this control program if you have Windows and want to connect your cameras directly to that computer.

We don’t use a lot of Windows computers around here, and that program requires that the computer running the software be physically connected to the cameras via serial. Not ideal for us, because the cameras are on a stage and our computers typically are not. Marshall also makes a nice hardware RCP, but we didn’t want to pay for that.

So we did what you probably already guessed – put in a Raspberry Pi with a USB to RS-485 adapter that we could control remotely.

We have several wallplates across the stage with network tie lines on them that feed back to the rack room in a patchbay. So we made cables that connect to the RS-485 ports at each camera that then go back to a wall plate into a RJ45 port. We utilized the blue/white-blue pair on CAT6 cable. We used that pair because these are data pins in a normal network connection, which means if someone ever accidentally connected it straight to a switch or something, there would not be any unintended voltage hitting the cameras.

Each camera is set to its own camera ID (1-4), and the matching baud rate of 9600 (the default). Then in the rack room, we made a custom loom to take the 4 connections and bring them into a jack, which then feeds into the USB to RS-485 adapter on the Pi.

The Pi is a Model 4 with 4 GB of RAM. Honestly, for what this thing is doing, we probably could have run it off a Pi Zero, but I wanted it hardwired to my network, and the bigger Pis come with Ethernet ports built in.

I bought this adapter off Amazon:

DSD TECH SH-U10 USB to RS485 Converter with CP2102 Chip

When connected, it shows up as serial port /dev/ttyUSB0. We originally planned to use the socat program in Linux to listen for UDP traffic coming from Companion:

sudo socat -v UDP4-LISTEN:52381 open:/dev/ttyUSB0,raw,nonblock,waitlock=/tmp/s0.lock,echo=1,b9600,crnl

To actually send the UDP data, we’re using the Sony VISCA module already built into Companion. The Marshall cameras use the same protocol over RS-485.

Using the socat method, we quickly found that it would only listen to UDP traffic coming from one instance of the module. We need 4 instances of the Companion module because we have 4 cameras, each with a different camera ID.

However, this is nothing a small Node.js program can’t solve. So I wrote a program that opens the specified UDP port, opens the specified serial port, and sends any data received at that UDP port straight to the serial port. You just configure a new instance in Companion for each camera with the same IP of the Pi running the udp-to-serial program and the camera ID that you configured at the Marshall camera.

Here’s a video that shows it all in action:

If you want to try this out for yourself, I’ve made the udp-to-serial repository available here:

http://github.com/josephdadams/udp-to-serial

Tally Arbiter 1.3 – Support for sending tally data to the cloud, feedback/control on a stream deck, and tally output on an M5StickC Arduino

If you haven’t read about my Tally Arbiter project, you can read about it here and here. Today I’m excited to release version 1.3 which offers some exciting new features!

First, Tally Arbiter Cloud! Now you can send tally data from your local instance of Tally Arbiter to a server in the cloud. Anyone can connect to your cloud server without having to tunnel into your private production network. And, if you are doing remote production with switchers in multiple physical locations or networks, each location can run an instance of Tally Arbiter and the cloud server can aggregate all of the data together in real time! All you need in order to make a connection is a Cloud Key provided by the local client that matches on the server. Keys can be made and revoked at any time.

I’ve set up an Amazon EC2 instance running Ubuntu, with Tally Arbiter running on it. I set a cloud key and set up a cloud destination on my local server to send the data to the server running on EC2. Now, I can log into my EC2 server’s Tally Arbiter web interface and view the tally data from anywhere without having to VPN to the church network. This will make it easy for volunteers to use their personal phones to view tally without having to be in the private network.

Here is a video to show it in action:

Second, Feedbacks and Control through Bitfocus Companion on your stream deck! Companion 2.1 is out now, and if you run the latest build, you can use the new “TechMinistry Tally Arbiter” module to view live tally data from Tally Arbiter on any button on your stream deck. It also supports the ability to “flash” any connected listener client.

Third, a new tally listener client – the M5StickC! This is an inexpensive Arduino ESP32 “finger computer”. A friend of mine in the UK recommended this to me for a possible integration with the project. I bought mine off Amazon for $20 but you can buy them directly from the manufacturer for less than $10. It is a portable, easy-to-use, open source, IoT development board.

Programming this thing was fun because the code is all in C++, which I haven’t used since high school. The power of WebSockets and the socket.io protocol means that this microcontroller can connect to my Tally Arbiter server and communicate the same way any of the other listening clients do.

Here’s a video to show how it works and how to program one:

Version 1.3 of Tally Arbiter also comes with some other perhaps less exciting but still helpful updates:

  • All Settings, REST API, and Producer page now require a Basic Auth username/password to access.
  • In the settings or producer page, if you mouse over the preview and program boxes, Tally Arbiter will show you which sources currently have that device in that bus.
  • The settings page will now show the number of device sources and device actions assigned to a device in the list.
  • Sources will now attempt to auto-reconnect if the connection is lost with a max retry of 5 times.

Lastly, I’ve set up a website for this project to help others who want to share about it. You can access it at: http://www.tallyarbiter.com

You can get the source code for Tally Arbiter and the listener clients from the GitHub repository: http://github.com/josephdadams/tallyarbiter

100% free and ready for you to use!

My hope is that this project enables churches and any organization that needs tally for their productions to attain it at a lower cost. I’ve put a lot of hours into developing this free software. If it is helpful to you, please let me know!