Using two machines: all graphics are in Unreal Engine
If you want to use Unreal Engine graphics both for the LED walls and for AR, running both on the same PC is not yet supported. However, it can easily be done with two PCs: one runs the LED rendering, the other runs the AR. No video has to be sent between the two machines, since the final background graphics are already displayed on the LED wall; you simply have to route the camera signal to both machines.
This method also has the benefit of distributing the GPU load (which can already be high for both the LED and the AR rendering), and of easier management of the two UE projects: one for the LED wall, one for the AR.
Please do the following.
Start Aximmetry on both machines. (Please do not be confused: this is NOT the multi-machine configuration; you simply use two Aximmetry instances on two machines independently.) Ensure that both of them run at the same frame rate, either by setting it in Preferences / Rendering, or by selecting Sync on SDI cards with the same video mode. Normally this frame rate should match the camera's, and ideally the LED wall's as well.
Plug the SDI / HDMI signal coming from the camera into both machines. You can use an SDI splitter (e.g. Blackmagic Design SDI Distribution Mini Converter) or an HDMI splitter, depending on your setup.
Ensure that the tracking information of the camera gets into both machines:
a) If you use a TCP-based tracking system (e.g. Ncam), simply specify the same IP address (where the Ncam server runs) on both machines.
b) If you use a UDP-based tracking system (e.g. Stype, Mo-sys, TrackMen, or any Free-D-based system), configure the system so that it sends the data to both of your Aximmetry machines. If the tracking system does not allow that, follow option c).
c) If you use a USB-based tracking system (e.g. HTC Vive, Antilatency, Intel RealSense), or if for any reason you cannot send the data to both machines, then you have to use Aximmetry's internal system for forwarding the tracking information to the other machine via UDP. Plug the tracking system into the machine that will run the LED rendering. Load your LED scene, go to the INPUT 1 panel, and specify the IP of the AR machine in the Tracking Fwd Aux IP property.
Also make a note of the port number in the Tracking Fwd Port property (you can change it if necessary).
On the other machine (which will run the AR scene) go to Manage Devices / Camera Tracking and define an AxiBridge device specifying the same port number as above.
Then simply use this device as the Tracking Device in the AR scene.
NOTE that you do not have to specify any Tracking Mode or Zoom Device on the AR side. All the zoom/focus and lens distortion information will be transmitted along with other data via the AxiBridge protocol. This is true even if you use a separate Zoom Device on the source machine.
Please note that with either method you'll have to adjust the Tracking Delay on both machines independently, because the required values can differ.
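At the network level, forwarding UDP tracking data to a second machine (as in options b and c above) amounts to duplicating each datagram to multiple destinations. The following is only an illustrative sketch of that idea, not Aximmetry's actual implementation or the AxiBridge protocol; the socket setup and packet size are assumptions:

```python
import socket

def relay_packets(src, destinations, packet_count):
    """Read UDP datagrams from the bound socket 'src' and re-send each one
    unchanged to every (host, port) pair in 'destinations'."""
    out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for _ in range(packet_count):
            data, _ = src.recvfrom(2048)  # tracking packets are small (Free-D: 29 bytes)
            for dest in destinations:
                out.sendto(data, dest)
    finally:
        out.close()
```

In a real relay you would bind `src` to the port the tracker sends to, list both Aximmetry machines as destinations, and loop indefinitely rather than for a fixed `packet_count`.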
Load the LED scene on one machine and the AR scene on the other. You have to control them independently. If you need synchronized switching of any element in both scenes, use an OSC remote control app and send the OSC signal to both machines.
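To illustrate the synchronized-switching idea, the sketch below hand-encodes a minimal OSC 1.0 message with a single int32 argument and sends it to both machines over UDP. The address `/scene/next`, the IPs, and port 8000 are placeholders; the actual OSC addresses and ports depend on how you map the controls in your scenes:

```python
import socket
import struct

def osc_message(address, value):
    """Encode a minimal OSC message carrying one int32 argument
    (just enough of the OSC 1.0 wire format for a simple trigger)."""
    def osc_string(s):
        b = s.encode("ascii")
        return b + b"\x00" * (4 - len(b) % 4)  # null-terminate, pad to a 4-byte boundary
    return osc_string(address) + osc_string(",i") + struct.pack(">i", value)

def send_to_all(address, value, machines):
    """Send the same OSC message to every (host, port) in 'machines'."""
    msg = osc_message(address, value)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in machines:
        sock.sendto(msg, (host, port))
    sock.close()

# Hypothetical usage: trigger the same control on both machines at once
# send_to_all("/scene/next", 1, [("192.168.1.10", 8000), ("192.168.1.11", 8000)])
```

In practice you would more likely use a ready-made OSC remote app or a library such as python-osc; the point is simply that one controller fans the same message out to both machines.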
You can leave the output of the LED machine unused: for broadcasting, the final picture is the output of the AR machine. Likewise, any local recording with Aximmetry has to be done on the AR machine.
Using a single machine: AR graphics are in Aximmetry's native engine
You can choose to build the AR elements in Aximmetry's native 3D engine. In this case, you can do the mixing with a single machine.
For that, start with the Render_General compound, as in any other case when you build a native 3D scene. Then simply connect its output to the AR Overlay pin of LEDWallCam. The camera inside Render_General is controlled via tunnels from LEDWallCam, and thus will always be positioned automatically in sync with the Unreal camera.
Then you can attach any desired 3D elements to the Children pin.