
Combine Different Productions in Separate Machines


Introduction

This guide focuses on using more than one computer in the production process without relying on the ready-made Multi-Machine solutions built into the camera compounds. Using multiple computers typically has the benefit of sharing the GPU load and, in some cases, simplifies the management of separate projects.
The configuration described here is not a Multi-Machine setup. Instead, it involves two separate machines, each running and controlling Aximmetry independently.

This guide also complements our existing documentation on combining Augmented Reality (AR) with green-screen or LED Wall production (AR in Green and LED Wall Projects) and on using the Aximmetry and Unreal engines together (Aximmetry and Unreal Combined Render).

Examples

One potential scenario involves dedicating one computer to managing an LED Wall project and a separate computer to rendering an Augmented Reality (AR) overlay on top of it.
This setup represents just one of many possible applications. For instance, rather than using AR, one might employ a virtual or tracked camera to overlay billboards. It is also possible to start with a green-screen camera on one computer and then overlay additional green-screen billboards from a second machine, effectively increasing the maximum number of supported billboards. This could be extended further by rendering AR objects or displaying video overlays.
Moreover, even more complex applications are conceivable; for example, one computer could play the current scene while the other loads the next scene in the production queue. This arrangement ensures seamless transitions between scenes, eliminating the hiccup of loading new scenes on the computer that is running the live production, while letting you preview the setup of the next scene. Undoubtedly, there are many other use cases we have yet to imagine.

LED Wall and AR

In the case of LED Wall and AR, there are two distinct scenarios:

  1. Direct Connection Scenario: In this setup, the studio camera is directly connected to the second machine (Machine 2), which is responsible for rendering the AR elements. A diagram representation of this configuration is provided here:

    However, this solution will not work with Digital Extension.
    NOTE: You still have the option to connect the studio camera's video feed to the first machine with an SDI or HDMI splitter. This enables you to monitor the output and thus set up the LED Wall more easily.
  2. Digital Extension Scenario: For productions requiring Digital Extensions, the studio camera's video feed must be routed back to the first machine to be combined with Digital Extension. Subsequently, the mixed output is forwarded to Machine 2 for the AR overlay. This process is depicted in the following diagram:

    The downside of this scenario is that the two machines must have a video connection.

In both scenarios, Machine 1 is tasked with running an LED Wall camera compound, while Machine 2 manages an AR camera compound. Neither machine uses the "+ AR" compounds.

For the final picture or recording, use the output from the AR machine to ensure the inclusion of augmented reality elements. The LED Wall machine only needs to output the video for the LED wall(s).

Setting It Up

Start Aximmetry on both machines. Ensure that both of them run at the same Frame Rate.

The machines must be controlled independently. If you need synchronized switching of any element in both scenes, you can use an OSC remote-control app and send the same OSC signal to both machines.
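As an illustration, the synchronized switch could also be triggered from a small script instead of a dedicated OSC app. The sketch below builds a minimal OSC packet by hand using only the Python standard library and sends it to both machines over UDP; the address `/scene/next`, port 8000, and IP addresses are hypothetical placeholders for your own setup, and the OSC address must match whatever you map in Aximmetry's controls.

```python
import socket
import struct

def osc_message(address: str, value: int) -> bytes:
    """Build a minimal OSC message carrying a single int32 argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    # address pattern + type tag string (",i" = one int32) + big-endian int32
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

def send_to_all(address: str, value: int, hosts, port=8000):
    """Send the same OSC message to every listed machine over UDP."""
    packet = osc_message(address, value)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host in hosts:
        sock.sendto(packet, (host, port))
    sock.close()

# Hypothetical IPs of the two Aximmetry machines -- substitute your own
send_to_all("/scene/next", 1, ["192.168.1.10", "192.168.1.11"])
```

Because UDP is fire-and-forget, both machines receive the switch command at effectively the same moment, which is what keeps the two scenes in step.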

Tracking

To ensure that camera tracking information is accurately shared between both machines:

  1. If you use a TCP-based tracking system (e.g. Ncam):
    Simply specify the same IP (where the Ncam server runs) for both machines.
  2. If you use a UDP-based tracking system (e.g. Stype, Mo-sys, TrackMen, or any Free-D-based system):
    Adjust the system's settings to broadcast data to both Aximmetry machines. If direct broadcasting to multiple machines is not supported, proceed as in option 3.
  3. If you use a USB-based tracking system (e.g. HTC Vive, Antilatency, Intel Realsense):
    Direct data sharing between machines is not possible; use Aximmetry's internal system to forward the tracking information via UDP or TCP.
    Connect the tracking system to the primary machine involved in the initial stage of production.
    • In the case of LED Wall compounds:
      Go to the INPUT panel, and specify the IP of the other machine in the Tracking Fwd Aux IP property:
    • In the case of other than LED Wall compounds:
      Use a Camera Tracking module, set up the tracking device(s) in it, specify the Tracking Fwd Aux IP, and connect it to a Force Execution module:
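If your UDP-based tracking system can send to only a single IP and you prefer not to use Aximmetry's built-in forwarding, a small relay script running anywhere on the network can duplicate the packets for both machines. This is a minimal sketch under the assumption that the tracking device can be pointed at the relay's IP; the port 40000 and the target addresses are placeholders for your own setup, and the packets are forwarded byte-for-byte unchanged so the protocol (e.g. Free-D) does not matter.

```python
import socket

def forward(packet: bytes, targets, sock):
    """Re-send one tracking packet, unchanged, to every target machine."""
    for target in targets:
        sock.sendto(packet, target)

def relay_forever(listen_port, targets):
    """Receive UDP tracking packets and duplicate them to all targets."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", listen_port))          # the tracking device sends here
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        packet, _ = rx.recvfrom(2048)   # tracking packets are small
        forward(packet, targets, tx)

# To run the relay (hypothetical port and IPs -- substitute your own machines):
#   relay_forever(40000, [("192.168.1.10", 40000), ("192.168.1.11", 40000)])
```

Because the relay adds one extra network hop, measure and compensate for the added delay per machine, as noted in the Tracking Delay advice below.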

Take note of the (Tracking) Fwd Port value. Alter this number if required.

On the secondary machine, open Edit / Manage Devices and create an AxiBridge device using the Fwd Port number mentioned earlier:


Then simply use this device as the Tracking Device:

NOTE: You do not have to specify any Tracking Mode or Zoom Device. All the zoom, focus, and lens distortion information will be transmitted along with other data via the AxiBridge protocol. This is true even if you use a separate Zoom Device on the source machine.
NOTE: If using the Camera Tracking module to forward the tracking, you also have the option to forward it via AxiBridge TCP. However, we recommend UDP (AxiBridge) over TCP (AxiBridge TCP) due to UDP's lower latency.

Adjust the Tracking Delay setting individually on each machine, as the synchronization will likely differ between them.

Video Connection

There are various ways to send video between the machines. For the best quality and the least lag, we recommend a physical SDI or HDMI connection between the machines.
