
Using LED walls for virtual production



How the term “LED wall” is used in Aximmetry

On the Aximmetry control boards the term “LED Wall” means a contiguous segment of your LED wall structure that can be described as either

  • a flat rectangle
  • or a curved rectangle (where the curve is an arc of a regular circle)

Usually an LED wall structure can be divided into 3-4 segments that fall into the categories above. Typical setups are:

  • a corner of 3 LED walls: 2 for the walls at a right angle and 1 for the floor
  • 1 wall (flat or curved) for the main front display, 2 for the sides at right angles and 1 for the ceiling (the latter 3 usually only serve as ambient light / reflection sources).

These are only examples, you can use any configuration.

If a curved rectangle is too long (e.g. a 270° circular display) then you have to split it into 3-4 segments in order for it to be rendered correctly.

You might also have to further split an LED wall (typically the front one) if you need very high-resolution rendering for it (e.g. 2 x 4K). In this case you might decide to split the GPU load among two or more PCs.

Please note that these segments do not necessarily have to be separated in terms of physical connection (HDMI or DP). Depending on the setup of your LED wall processor you can even transmit the entire image via a single HDMI/DP connection. The imaginary separation is only needed by Aximmetry in order to render all the parts of the LED wall correctly. The only cases in which you need multiple physical connections are when you use multiple PCs for rendering, or when the bandwidth of a single HDMI/DP connection is not enough.

Finally, an “LED wall” can actually be a flat TV or a projector if that suits your scenario better.

In the rest of this documentation we use the term “LED wall” in the sense described above.

Render passes, single- vs. multi-machine configuration

In order to provide the correct image content for all LED walls, Aximmetry performs the following render passes:

  • "Frustum" rendering: the image portion that the camera currently actually sees on the LED wall. This should be done in the highest necessary resolution and quality.
  • "Fill" rendering: one pass for each LED wall, providing the content for the parts the camera does not currently see. Since the main purpose of these image parts is to provide an ambient lighting and reflection environment, they can be rendered at reduced quality.

Depending on the number of LED walls and the resolution reduction of the Fill passes you might find a single PC enough to render all the content.

But if you need very high-resolution rendering for all the Fill passes you might need multiple PCs to perform it. In this case you have to specify which LED wall is rendered by which PC. Please note that the "Frustum" pass always has to be performed by all the PCs.

A Freeze option is also available for the Fill rendering which allows using a single machine even if you need high resolution Fill content.

We'll discuss the setup of these cases later.

Camera inputs

You can shoot the scene using a single or multiple physical cameras.

Note that when using LED walls the signal coming out of a camera is already the final composite of the real foreground and the virtual background. Therefore you can record the picture on the camera itself or feed the signal directly into your studio equipment for broadcasting/recording.

However there are a few reasons why it is still recommended to wire the signal back to the Aximmetry PC as well:

  • It is much easier to set up the position of the LED walls while watching the camera picture. This is the preferred way, see below.
  • You might want to record the final content with Aximmetry.
  • You might want to make adjustments or any post-processing on the image.
  • You might want to composite further elements on the image like overlays, PIP, text, channel logo etc.
  • When you use the Green Frustum mode (see below) you might still want to see a previz of what the final image will look like.
  • You use realtime digital extension (see below).
  • You use multiple cameras. Please consider that even if you record all camera signals separately, you won’t be able to switch between them arbitrarily during a later editing session, because LED walls can only display the virtual background from the perspective of one selected camera at a time. Aximmetry UI provides the means for switching the camera input and the background projection in sync.

Camera tracking

Each camera must have spatial information. It is normally provided by a camera tracking system, but it can also be specified manually if you want a fixed camera.

On the Aximmetry control boards you can switch between the cameras. The virtual background is always projected onto the LED walls from the tracked position of the currently selected camera. 

Note that the final image will only be correct when it is seen from the perspective of the selected camera. When it is seen from the angle of other cameras or by the studio personnel it might look weird, but this is normal.

The example LED wall configuration

Throughout this documentation we will use the following arrangement as an example. On every screenshot the visible settings will be in accordance with this arrangement.

LED wall properties

Front wall: 8m x 3.5m, curved, 2816 x 1232 pixels

2 rear walls: 2.5m x 3m, flat, 520 x 624 pixels each

UsingLedWallsForVirtualProduction Image1.png

Single-machine output frame arrangement

The LED wall processor receives the image of all the three walls via a single DP cable within a 3856 x 1232 pixel frame in the following arrangement:

UsingLedWallsForVirtualProduction Image2.png
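The arithmetic behind this merged frame can be sketched as follows. This is a minimal Python illustration, not Aximmetry code; the left-to-right packing order and the (x, 0) placements are assumptions for illustration, the actual layout being the one shown in the screenshot above.

```python
# Sketch: packing the three wall images side by side into one output frame.
# The left-to-right order is an assumption for illustration only.
walls = [("front", 2816, 1232), ("rear-left", 520, 624), ("rear-right", 520, 624)]

offsets = {}
x = 0
for name, w, h in walls:
    offsets[name] = (x, 0)   # each image placed at (x, 0) in the merged frame
    x += w

frame_width = x
frame_height = max(h for _, _, h in walls)
print(offsets)                    # placement of each wall image
print(frame_width, frame_height)  # 3856 1232
```

The per-wall offsets correspond to the placements you would later configure in the Placer Precise modules.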

Multi-machine output frame arrangement

In order to demonstrate a multi-machine configuration we split the front wall into two halves, 1408 x 1232 pixels each. The right half will be rendered by the central machine, the left half by a second machine, and the 2 rear walls by a third machine.

UsingLedWallsForVirtualProduction Image3.png

The central machine sends its part into the LED wall processor via a DP cable in a 1408 x 1232 pixel frame:

UsingLedWallsForVirtualProduction Image4.png

The second machine sends the left part via another DP cable, also in a 1408 x 1232 pixel frame:

UsingLedWallsForVirtualProduction Image5.png

The third machine sends the two rear images through a single DP cable. They are arranged in a 1040 x 624 pixel frame:

UsingLedWallsForVirtualProduction Image6.png
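As a quick sanity check of the sizes used in this multi-machine example (plain arithmetic, not Aximmetry code):

```python
# Quick arithmetic check of the multi-machine frame sizes in this example.
front_w, front_h = 2816, 1232
rear_w, rear_h = 520, 624

half_w = front_w // 2          # front wall split into two halves
assert half_w == 1408 and front_h == 1232   # per-half frame, machines 1 and 2

third_frame_w = 2 * rear_w     # two rear walls side by side on machine 3
assert third_frame_w == 1040 and rear_h == 624
print(half_w, third_frame_w)
```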

Startup Configuration

Single-machine configuration

Preview output

It is recommended to have a secondary monitor attached to your PC for preview purposes. If you have one, select it as output #1.

UsingLedWallsForVirtualProduction Image7.png

If you do not have one, skip this step; you’ll be able to see the preview on one of the preview panels of Composer as well.

Final program output

If you need to forward the final output into your studio system for broadcasting/recording, or you simply want to watch it on a dedicated monitor, assign #2 to one of your SDI outputs. Pay attention to setting the correct frame rate, matching both the camera input and the LED wall output (see below).

UsingLedWallsForVirtualProduction Image8.png

LED wall output(s)

Normally you can send all the LED wall images merged into a large frame through a single HDMI or DP cable, and the LED wall processor will send everything to the right place.

The LED wall processor will report the total size of the frame and also the expected frame rate to your PCs. This will appear as a monitor of the right size both in Windows and in Aximmetry.

In this case select this output as #3 and set Sync on it.

UsingLedWallsForVirtualProduction Image9.png

If you send the pictures through multiple HDMI/DP cables, assign #3, #4, #5, etc. to these outputs, and set Sync on the first one.

UsingLedWallsForVirtualProduction Image10.png

Multi-machine configuration

Regarding what a multi-machine configuration is and how to assign remote engines, please consult this documentation. Here we only discuss the LED wall output related setup.

It is up to you how many PCs you use and which LED wall(s) each PC renders. E.g. you can use a separate PC for each LED wall, or you can use 2 PCs, each rendering 2 LED walls, etc.

Central machine setup

The central machine will handle all the camera and tracking inputs, the final output, and the output(s) of its intended subset of the LED walls. The setup is basically identical to the single-machine case.

UsingLedWallsForVirtualProduction Image11.png

The next step is defining the two other machines as remote renderer engines. Let's suppose they are located at the IPs and on the LAN.

UsingLedWallsForVirtualProduction Image12.png

Remote engines setup

The following setups have to be done on the second and third machines themselves by starting the Aximmetry Renderer Config app.

Each remote engine will handle its intended subset of the LED walls only. If they're driven via a single HDMI/DP cable then assign #1 to the corresponding output. (In the case of multiple cables use the #1, #2, #3 etc indices.)

Second machine

UsingLedWallsForVirtualProduction Image13.png

Third machine

UsingLedWallsForVirtualProduction Image14.png

Channel matrix

After starting the Composer on the central machine go to Edit menu / Preferences, then Channel Matrix. Turn off Unified multi-machine setup. You have to set it up so that channels 1 and 2 go to the preview and final outputs of the central machine, while channels 3, 4, 5 go to the LED wall outputs of the corresponding machines. It should look like this:

UsingLedWallsForVirtualProduction Image15.png

This is a typical scenario with two remote machines and is interpreted as follows:

  • Channel 1, 2, 3 go to the outputs #1, #2, #3 of the local (central) machine
  • Channel 4 goes to the output #1 of the remote machine at
  • Channel 5 goes to the output #1 of the remote machine at

Rendering frame rate

The system rendering frame rate must match the refresh frequency of the LED wall system (and normally should match the camera frame rate as well). Start Composer and go to Edit menu / Preferences, then the Rendering section, and set the Frame rate.

UsingLedWallsForVirtualProduction Image16.png

Please note that the Frame size setting has no effect here since the Frustum and Fill rendering frame size will be specified individually on the control boards, see below.

Setting up an Unreal Engine scene

In your Aximmetry projects folder find the


folder and copy it into the Content subfolder of your Unreal project.

UsingLedWallsForVirtualProduction Image17.png

Load your project into Unreal Editor for Aximmetry.

Find the following blueprint in your Content and drag and drop it into the scene. The position does not matter.

UsingLedWallsForVirtualProduction Image18.png

UsingLedWallsForVirtualProduction Image19.png

Cook the scene for Windows.

Start Aximmetry DE if you haven’t started yet.

Create a new compound.

Drag and drop your uproject file into the Flow Editor. You will get:

UsingLedWallsForVirtualProduction Image20.png

Go to File Browser and find


Drag and drop it into the Flow Editor.

Connect everything in this way:

UsingLedWallsForVirtualProduction Image21.png

Setting up an Aximmetry native engine scene

For rendering the scene you have to provide one main camera for the Frustum rendering and one camera per LED wall used for the Fill rendering.

It's recommended to start by using the


compound from the beginning and change its internals according to your needs.

You will also need the



All these have to be connected like this:

UsingLedWallsForVirtualProduction Image22.png

In order to edit your scene you can use the Free camera mode.

UsingLedWallsForVirtualProduction Image23.png  UsingLedWallsForVirtualProduction Image24.png

In this mode you'll only see your 3D scene on the preview monitor and you can edit it normally. When you have finished editing and want to continue with setting up the LED walls, simply switch back to normal mode.

UsingLedWallsForVirtualProduction Image25.png

Setting up the inputs

Camera inputs

As we discussed earlier, it's highly recommended to wire your camera signal back into Aximmetry. For that you have to go to the INPUTS control board and specify an input device and its video mode:

UsingLedWallsForVirtualProduction Image26.png    UsingLedWallsForVirtualProduction Image27.png

Of course you can use a Mapped device as well as demonstrated in other tutorials.

On the preview and the output monitors you'll immediately see the camera image.

If you use multiple cameras repeat this step for INPUT 2, INPUT 3 as well. In order to select which camera's image is seen on the preview and final output go to the CAMERAS control board and use the CAM buttons:

UsingLedWallsForVirtualProduction Image28.png

UsingLedWallsForVirtualProduction Image29.png

Tracking input

For each camera you also have to specify a tracking device that reports the position of the given camera.

UsingLedWallsForVirtualProduction Image30.png  UsingLedWallsForVirtualProduction Image31.png

If you use a tracking device that needs lens data created with Aximmetry's own Camera Calibration tool you also have to select the lens file in the Tracking Mode property:

UsingLedWallsForVirtualProduction Image32.png

If you use an independent device for zoom and focus encoding you have to specify it as well:

UsingLedWallsForVirtualProduction Image33.png

For further info on using camera tracking devices and lens calibration please consult this and this documentation.

Tracking delay

Tracking information is used for two different purposes:

  • to determine the position from which the virtual graphics are projected onto the LED walls
  • to help with specifying the LED wall positions in 3D space using helper AR virtual graphics.

For the first one no delay is used at all, since we have to minimize the time elapsed until the graphics on the LED walls are updated according to the current camera position.

For the latter one it has to be ensured that the real camera image and the helper graphics move together at the same time. For that you usually have to specify a delay for the tracking data:

UsingLedWallsForVirtualProduction Image34.png

It's measured in frames and can be a fractional value if necessary.
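As a side note, if you happen to know your system latency in milliseconds you can convert it to a frame-based delay value with simple arithmetic. This is an illustrative sketch only; the frame rates used below are example assumptions.

```python
# Sketch: converting a tracking delay between frames and milliseconds.
def frames_to_ms(delay_frames: float, fps: float) -> float:
    return delay_frames * 1000.0 / fps

def ms_to_frames(delay_ms: float, fps: float) -> float:
    return delay_ms * fps / 1000.0

# e.g. a 2.5-frame delay at 50 fps corresponds to 50 ms of latency
print(frames_to_ms(2.5, 50))   # 50.0
print(ms_to_frames(40, 25))    # 1.0
```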

The right value has to be found by trial and error. Go to the LEDWALLS control board and select the STUDIO view mode:

UsingLedWallsForVirtualProduction Image35.png  UsingLedWallsForVirtualProduction Image36.png

You'll see a virtual checker floor pattern appearing over the real image. You can use a chair or the LED walls themselves or any other object to check the synchronization of the movement.

UsingLedWallsForVirtualProduction Image37.png

Move the camera while adjusting the delay value to find the correct one.

The Zoom Delay setting is only needed if you use an independent Zoom Device.

Setting up the LED walls

Go to the LEDWALLS control board.

This particular compound allows using a maximum of 4 LED walls. Each of them has a row in the control board.

Selecting how many LED walls are used

With the On/Off button at the beginning of each row you specify which of them are used to describe your LED wall structure. According to our example we'll use only three of them.

UsingLedWallsForVirtualProduction Image38.png

Specifying the sizes

The first step is specifying the size and pixel resolution of each LED wall. Select the LED Wall X panels one by one and set the properties.

LED Wall 1

UsingLedWallsForVirtualProduction Image39.png

LED Wall 2

UsingLedWallsForVirtualProduction Image40.png

LED Wall 3

UsingLedWallsForVirtualProduction Image41.png

Building the arrangement of the output frame

The images that should be sent out to the LED walls are produced on the LED Wall X output pins of the LEDWallCam compound.

In our example we're using a single output (a single DP cable) for sending all 3 images. To achieve the arrangement we described earlier, a series of Placer Precise modules has to be used.

UsingLedWallsForVirtualProduction Image42.png

UsingLedWallsForVirtualProduction Image43.png  UsingLedWallsForVirtualProduction Image44.png

UsingLedWallsForVirtualProduction Image45.png  UsingLedWallsForVirtualProduction Image46.png

UsingLedWallsForVirtualProduction Image47.png  UsingLedWallsForVirtualProduction Image48.png

The multi-machine case

In our multi-machine example you also have to activate LED Wall 4 and set the sizes like this:

LED Wall 1 and 4

UsingLedWallsForVirtualProduction Image49.png

The frame arrangement can be done like this. LED Wall 1 and 4 can go directly to the corresponding outputs; only 2 and 3 need to be merged into a single frame.

UsingLedWallsForVirtualProduction Image50.png

UsingLedWallsForVirtualProduction Image51.png    UsingLedWallsForVirtualProduction Image52.png

UsingLedWallsForVirtualProduction Image53.png    UsingLedWallsForVirtualProduction Image54.png

Checking the output arrangement

In order to see if all the size, resolution and frame arrangement settings are correct, select the STUDIO view mode.

UsingLedWallsForVirtualProduction Image55.png

This will send a checker pattern to all LED walls.

UsingLedWallsForVirtualProduction Image56.png

UsingLedWallsForVirtualProduction Image57.png

Your settings are correct if:

  • The big index numbers at the centers are the correct ones.
  • You can see the red border on all four edges of all LED walls. The border must be 5 cm thick everywhere.
  • Each numbered square of the checker pattern has to be 50 cm x 50 cm. Use a measuring tape to check that.

Note that the numbering of the checker pattern also reflects the dimensions of the LED wall in meters.
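For the example front wall the expected pattern can be worked out with simple arithmetic (an illustrative sketch, not part of Aximmetry):

```python
# Sketch: what 50 cm checker squares imply for the example front wall
# (8 m x 3.5 m at 2816 x 1232 pixels).
wall_w_m, wall_h_m = 8.0, 3.5
res_x = 2816
square_m = 0.5                               # each numbered square is 50 cm

squares_x = wall_w_m / square_m              # squares across the wall
squares_y = wall_h_m / square_m              # squares down the wall
px_per_square = res_x / wall_w_m * square_m  # pixels per 50 cm square

print(squares_x, squares_y)   # 16.0 7.0
print(px_per_square)          # 176.0
```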

Specifying the LED wall positions

Remain in the STUDIO view mode.

UsingLedWallsForVirtualProduction Image58.png

Placing a flat LED wall

In our example we start with LED Wall 2, since it is a flat one.

Rotate the camera so that you can see the bottom center of the LED wall.

The goal here is to put the virtual image of the LED wall in Aximmetry into a good match with the image of the real LED wall using the helper graphics displayed on the preview monitor.

Initially the virtual image will be at the origin of the virtual space, so it might not even appear on the screen, or at best it will be in a very wrong position.

UsingLedWallsForVirtualProduction Image59.png

To help with the first step of the placement you can use the Put In Front feature of the LED Wall X panel.

UsingLedWallsForVirtualProduction Image60.png

UsingLedWallsForVirtualProduction Image61.png

By pressing the Trigger, the virtual image of the LED wall is placed exactly in front of the camera at the specified Put Distance.

UsingLedWallsForVirtualProduction Image62.png

This makes any further positioning much easier.

The LED wall usually has a gap between its bottom and the floor. We have to compensate for it in Aximmetry, so measure the gap.

UsingLedWallsForVirtualProduction Image63.png

In our case it is 5 cm, let's enter it into the Y position field of the LED wall.

UsingLedWallsForVirtualProduction Image64.png

Now you can use the usual 3D editing tool of Aximmetry to move the virtual image into a matching position. Select your preview monitor for editing.

UsingLedWallsForVirtualProduction Image65.png

(If you are not using a fullscreen monitor, but a preview panel of Composer, please select Preview 1, 2 etc. accordingly)

Then select the LED Wall X panel.

UsingLedWallsForVirtualProduction Image66.png

A move handle will appear at the bottom of the virtual image. Use it to move the wall until it roughly matches the center of the real LED wall. Pay attention to not moving the wall along the Y axis at all.

UsingLedWallsForVirtualProduction Image67.png

Switch to the Rotate mode either by clicking the Rot button or pressing E.

UsingLedWallsForVirtualProduction Image68.png

Rotate the wall until its bottom is parallel with the bottom of the real LED wall. Pay attention to rotating only around the Y axis.

UsingLedWallsForVirtualProduction Image69.png

Now switch back to the Position mode either by clicking the Pos button or pressing W.

UsingLedWallsForVirtualProduction Image70.png

Move the wall until you reach a good match.

UsingLedWallsForVirtualProduction Image71.png

Please note that depending on the quality of the tracking and the lens calibration you may not reach a perfect match. If you get a slight ghost image it will not affect the quality of your final composition.

However if you cannot reach a reasonably close match then you might either have a lens calibration problem or your tracking is not set up properly. E.g. a typical error is that the virtual LED wall appears to be bigger or smaller than the real one. This usually means that the virtual floor does not match the real one (the tracking system's zero plane is not at the real floor).

In our example the same procedure has to be done for LED Wall 3, since it's also a flat one.

Placing a curved LED wall

Our example LED Wall 1 is a curved one, so let's set that up.

First perform the Put In Front step, then measure and enter the Y position as described above.

The first goal is aligning a flat virtual wall to the tangent of the curved LED wall that goes through the center of the LED wall. A number of tricks can be used to achieve this. One example is stretching a rope or a cable between the endpoints of the curve, creating a straight line.

UsingLedWallsForVirtualProduction Image72.png

Then use moving and rotating to align the flat wall to the rope.

UsingLedWallsForVirtualProduction Image73.png

Then, without any further rotating, move the wall to align its center with the real LED wall's center. You can use the checkerboard pattern to determine where the center of each wall is.

UsingLedWallsForVirtualProduction Image74.png

Now rotate the camera so that you can see one of the side edges of the LED wall.

UsingLedWallsForVirtualProduction Image75.png

Adjust the Radius parameter of the LED Wall until you get a match.

UsingLedWallsForVirtualProduction Image76.png

UsingLedWallsForVirtualProduction Image77.png

Please note that Radius has a minimum value: the one that describes a half circle. As long as the property is below that value you will see a flat LED wall. Only after crossing that minimum value will you start to see the curve change with the parameter.
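Assuming the wall width is measured along the arc (an assumption made here for illustration), the minimum radius and the angle the wall subtends can be sketched like this:

```python
import math

# Sketch: a half circle satisfies width = pi * r, so r_min = width / pi.
# Radius values below r_min cannot describe the wall's curve.
def min_radius(wall_width_m: float) -> float:
    return wall_width_m / math.pi

def arc_angle_deg(wall_width_m: float, radius_m: float) -> float:
    # the angle the wall subtends at a given radius (radius >= r_min)
    return math.degrees(wall_width_m / radius_m)

print(round(min_radius(8.0), 3))          # ~2.546 m for the 8 m front wall
print(round(arc_angle_deg(8.0, 5.0), 1))  # ~91.7 degrees at a 5 m radius
```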

Studio free view

By turning on the mouse button of the STUDIO panel you'll get a free view of your virtual LED wall setup. You can move around freely with the mouse. You can also see the current position of your tracked camera.

UsingLedWallsForVirtualProduction Image78.png

UsingLedWallsForVirtualProduction Image79.png

Precise aligning of touching LED walls

If two LED walls are touching you might need a more precise placement to avoid tearing in the rendered image.

E.g. in our multi-machine example we split the central curved LED wall into halves.

UsingLedWallsForVirtualProduction Image80.png

In this scenario you have to define the two halves as two separate curved LED walls and it takes extra attention to align them precisely to each other.

First you can use the Studio free view described above. From a top view you can see the small misalignments and can adjust the rotation and position parameters of either LED wall to resolve them.

UsingLedWallsForVirtualProduction Image81.png

Secondly, in the final view any significant misalignment will be readily visible. You can make some final adjustments in this view as well. For that, turn off the SCENE viewing mode.

UsingLedWallsForVirtualProduction Image82.png

UsingLedWallsForVirtualProduction Image83.png

To see the misalignment more readily you can also use the checker pattern of the LUT panel. 

UsingLedWallsForVirtualProduction Image84.png

UsingLedWallsForVirtualProduction Image85.png

This pattern only appears within the camera frustum, therefore you have to rotate your camera so that it points to the problematic area.

Having finished with the settings, simply turn off the LUT mode to see the final result.

Rendering parameters

If you turn off the SCENE viewing mode, you'll see the final virtual background rendered on each LED wall from the right perspective.

UsingLedWallsForVirtualProduction Image86.png

UsingLedWallsForVirtualProduction Image87.png

UsingLedWallsForVirtualProduction Image88.png

Positioning the virtual scene

Go to the INPUTS control board.

Using the SCENE panel’s Base Cam Transf property you can set the relative position of the virtual and real space thus determining which part of the virtual scene is seen on the LED walls and from what angle.

UsingLedWallsForVirtualProduction Image89.png

UsingLedWallsForVirtualProduction Image90.png  UsingLedWallsForVirtualProduction Image91.png

Frustum rendering

The system performs a separate render pass for producing the image portion that the camera currently actually sees on the LED wall in order to get the highest possible quality for that portion.

UsingLedWallsForVirtualProduction Image92.png

On the LEDWALLS control board the FRUSTUM panel provides options for this render pass.

UsingLedWallsForVirtualProduction Image93.png

UsingLedWallsForVirtualProduction Image94.png


The most important parameter is the pixel resolution. It's up to you to select a resolution that is good enough for your production while not putting too much load on the GPU. In general it is recommended to use the vertical resolution of your main LED wall. The horizontal resolution is calculated automatically from the provided Aspect Ratio.

UsingLedWallsForVirtualProduction Image95.png
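The relation described above can be sketched as follows. This is illustrative Python only; the exact rounding Aximmetry applies is not documented here, and the 16:9 value is just an example.

```python
# Sketch: deriving the horizontal frustum resolution from the vertical
# resolution and the Aspect Ratio, as described in the text above.
def frustum_size(vertical_res: int, aspect_ratio: float) -> tuple[int, int]:
    horizontal = round(vertical_res * aspect_ratio)
    return horizontal, vertical_res

print(frustum_size(1232, 16 / 9))   # (2190, 1232)
```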

Edge Expand

There's always a certain amount of delay until any change in the position / orientation of the camera is reflected on the LED wall. Therefore it's not enough to render exactly the frustum that the camera currently sees, because any movement of the camera would cause the rendered image to slip out of the camera's field of view.

In order to compensate for this issue you can specify an Edge Expand value that expands the area that is actually rendered. The faster the camera movements you want to compensate for, the larger the Edge Expand you have to specify.

UsingLedWallsForVirtualProduction Image96.png

By default the system renders this enlarged view using the same pixel resolution you specified. This can lead to a drop in the quality of the central part the camera actually sees. If you find this drop too noticeable you can select the Preserve Resolution option, which will increase the pixel resolution proportionally with the amount of expansion.

UsingLedWallsForVirtualProduction Image97.png

Use this feature with care, because it can cause a significant increase in the GPU load when used with a larger Edge Expand.

It's also worth noting that Edge Expand is not only needed because of the delay. The camera lens will always have some degree of distortion. The frustum area projected onto the LED wall will never be seen as a perfect rectangle through the camera. Its edges can shrink inwards, which has to be compensated for by expanding the area.
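One possible way to picture the interplay of Edge Expand and Preserve Resolution is the following hypothetical model. The exact formula Aximmetry uses may differ; this assumes the expand value is a fraction added on each edge, purely for illustration.

```python
# Sketch: how Preserve Resolution could scale the render size with Edge Expand.
def render_size(base_w: int, base_h: int, edge_expand: float,
                preserve_resolution: bool) -> tuple[int, int]:
    if not preserve_resolution:
        return base_w, base_h          # enlarged view squeezed into same pixels
    scale = 1.0 + 2.0 * edge_expand    # expansion applied on both edges
    return round(base_w * scale), round(base_h * scale)

print(render_size(2190, 1232, 0.1, False))  # (2190, 1232)
print(render_size(2190, 1232, 0.1, True))   # (2628, 1478)
```

This makes the GPU-load warning above concrete: with Preserve Resolution on, even a modest expand value grows the pixel count noticeably.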

Virtual lens parameters

The following parameters can be set arbitrarily, independently of the actual settings of the real camera.

UsingLedWallsForVirtualProduction Image98.png

Per-LED-wall options

You can make color adjustments of the frustum image for each LED wall independently using the ADJUSTER X panels.

UsingLedWallsForVirtualProduction Image99.png

Normally you want to use LED walls with the same characteristics, so you won't need independent settings. In this case simply select all the ADJUSTER panels and set their properties together.

If you select a LED Wall X panel you can turn the display of the frustum image on/off individually. However this option is only meaningful if you use a multi-machine configuration; see more info later.

UsingLedWallsForVirtualProduction Image100.png

UsingLedWallsForVirtualProduction Image101.png

Fill rendering

For each LED wall a separate rendering is performed that provides the content for the parts the camera does not currently see. Since the main purpose of these image parts is to provide an ambient lighting and reflection environment, they can be rendered at reduced quality.

Fixed position

By default the Fill rendering is performed from the perspective of the current camera position. In many cases this is unwanted, because you want a still reflection environment independent of the motion of the camera.

For that select the FILL panel and set the Use Fixed Position option.

UsingLedWallsForVirtualProduction Image102.png  UsingLedWallsForVirtualProduction Image103.png

Now the rendering is performed from the perspective of a constant "camera" position. The position itself can either be specified manually or you can use the Capture Fixed Position trigger to store the current position of the real camera.

UsingLedWallsForVirtualProduction Image104.png


Another way of creating a fixed fill image is to literally freeze the last rendered frames. Its huge advantage is that it frees the GPU from the continuous rendering of all the Fill content, thus allowing a single machine to be used for any number of LED walls.

Screenshot 2021-03-24 171927.png

Of course, this method is only suitable if you do not need any change in the ambient / reflection content (e.g. the fluctuating light of a fire or any moving object).

Per-LED-wall options

Select a LED Wall X panel.

UsingLedWallsForVirtualProduction Image105.png

As we discussed, the Fill rendering does not require the same quality as the Frustum rendering. In order to spare GPU processing power you can specify a reduced resolution for the Fill:

UsingLedWallsForVirtualProduction Image106.png

If this parameter is 1 then the rendering is performed in the pixel resolution of the LED wall itself (specified in Resolution property). Any other value will work as a multiplier for this base resolution. This way you can either increase or decrease the rendering quality.
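The multiplier behavior can be sketched as follows (illustrative only, not Aximmetry's internal code):

```python
# Sketch: the Fill resolution multiplier. A value of 1 keeps the LED wall's
# native resolution; other values scale it up or down.
def fill_resolution(native_w: int, native_h: int,
                    multiplier: float) -> tuple[int, int]:
    return round(native_w * multiplier), round(native_h * multiplier)

print(fill_resolution(2816, 1232, 1.0))   # (2816, 1232) - native resolution
print(fill_resolution(2816, 1232, 0.5))   # (1408, 616)  - a quarter of the pixels
```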

You can apply a Blur on the Fill image in order to make the ambient lighting more diffuse.

UsingLedWallsForVirtualProduction Image107.png

UsingLedWallsForVirtualProduction Image108.png

This function is also useful when you choose to render at a low resolution and want to smooth out the jagged edges in the picture.

You can make color adjustments of the Fill image for each LED wall independently using the FILL ADJUST X panels.

UsingLedWallsForVirtualProduction Image109.png

Note that the Fill adjustments can be specified not only independently for each LED wall, but independently from the Frustum adjustments as well. This way you can boost or alter the color of the ambient lighting effect or the reflection while the part the camera actually sees remains unaltered.

UsingLedWallsForVirtualProduction Image110.png

Green frustum

In some cases you might choose to render the final background in post-production and only use the LED walls for ambient lighting / reflection. In this case you need a green background to be able to key out the actors in post.

For that activate the GREEN panel.

UsingLedWallsForVirtualProduction Image111.png

This will replace the Frustum rendering with a solid green color:

UsingLedWallsForVirtualProduction Image112.png

The color itself can be any of your choice:

UsingLedWallsForVirtualProduction Image113.png

You can also choose to only work with a part of the image around the actor in post. For that, use the Crop properties to specify a smaller green area within the Frustum image.

UsingLedWallsForVirtualProduction Image114.png

UsingLedWallsForVirtualProduction Image115.png

When GREEN is activated you'll see green on your preview monitor as well, since it shows the incoming camera signal.

UsingLedWallsForVirtualProduction Image116.png

It's great while you are setting up the green area, but during the recording you will naturally want to see a previz of the final production instead. For that, activate the PREVIZ KEYER panel.

UsingLedWallsForVirtualProduction Image117.png

It will key out the green part and replace it with the image rendered for the Frustum.

UsingLedWallsForVirtualProduction Image118.png

The panel controls an Advanced B keyer that you can operate as usual.

UsingLedWallsForVirtualProduction Image119.png
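The actual matte is produced by the Advanced B keyer; as a rough illustration of the principle only, here is a minimal chroma-key sketch (the threshold value and the pixel representation are hypothetical):

```python
def previz_key(camera_px, frustum_px, threshold=0.3):
    """Replace strongly green camera pixels with the frustum render.
    Pixels are (r, g, b) tuples in 0..1. A pixel is keyed out when its
    green channel exceeds the other two by `threshold` -- a crude
    stand-in for the keyer's matte generation."""
    r, g, b = camera_px
    greenness = g - max(r, b)
    return frustum_px if greenness > threshold else camera_px

# The green backdrop is replaced, skin tones pass through:
print(previz_key((0.1, 0.9, 0.1), (0.5, 0.4, 0.3)))  # (0.5, 0.4, 0.3)
print(previz_key((0.8, 0.6, 0.5), (0.5, 0.4, 0.3)))  # (0.8, 0.6, 0.5)
```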

Tracking markers

For post-production work you'll need the camera tracking information as well.

One way to obtain it is by recording the tracking data into an FBX file. For details, please check the "Final composite recording" sections of this documentation.

But you can also display tracking markers in the green region of the LED wall and use them for traditional post-production tracking.

For that turn on Tracking Markers on the GREEN panel and set up the marker properties according to your needs.

UsingLedWallsForVirtualProduction Image120.png    UsingLedWallsForVirtualProduction Image121.png

UsingLedWallsForVirtualProduction Image122.png

Multi-machine settings

In a multi-machine configuration you have to specify which LED wall is rendered by which machine. This can be done by selecting the LED Wall X panels and setting their Engine property.

In our example it looks like this:

LED Wall 1

UsingLedWallsForVirtualProduction Image123.png

LED Wall 2 and 3

UsingLedWallsForVirtualProduction Image124.png

LED Wall 4

UsingLedWallsForVirtualProduction Image125.png

This will result in each machine rendering only its own LED walls, thus dividing the total GPU load among the machines.

However, it is important to note that the Frustum pass has to be rendered for all LED walls that can appear in the field of view of the camera. This results in a high extra GPU load on all the machines. But usually the side, rear, and ceiling LED walls only serve as ambient lighting / reflection sources and do not participate in the camera frustum.

For these LED walls you can turn off Frustum rendering. In our example:

LED Wall 2 and 3

UsingLedWallsForVirtualProduction Image126.png

If you turn off Frustum for all the LED walls handled by a specific machine, the Frustum rendering will be omitted on that machine. In our example the Remote #2 machine renders LED Wall 2 and 3, and we turned off Frustum for both of them, therefore Remote #2 won't render the Frustum pass.

This way you need less GPU power (fewer machines or a smaller GPU) for the side LED walls.
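The engine assignment and Frustum toggles can be modeled as a small lookup. In this sketch the wall-to-engine mapping is hypothetical (except that Remote #2 renders walls 2 and 3, as in the example); it shows how a machine whose walls all have Frustum turned off can skip the Frustum pass entirely:

```python
# Hypothetical wall -> engine assignment and per-wall Frustum toggles.
walls = {
    "LED Wall 1": {"engine": "Remote #1", "frustum": True},
    "LED Wall 2": {"engine": "Remote #2", "frustum": False},
    "LED Wall 3": {"engine": "Remote #2", "frustum": False},
    "LED Wall 4": {"engine": "Remote #3", "frustum": True},
}

def machines_rendering_frustum(walls):
    """A machine renders the Frustum pass only if at least one of
    its assigned walls still has Frustum enabled."""
    machines = {}
    for wall in walls.values():
        machines.setdefault(wall["engine"], False)
        machines[wall["engine"]] |= wall["frustum"]
    return {name for name, needed in machines.items() if needed}

print(machines_rendering_frustum(walls))  # Remote #2 skips the pass
```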

Switching between cameras

Go to the CAMERAS control board.

If you use multiple physical cameras, you can select between them here using the CAM 1, CAM 2, CAM 3 buttons.

UsingLedWallsForVirtualProduction Image127.png

UsingLedWallsForVirtualProduction Image128.png

When switching cameras, both the video input and the perspective from which the LED wall images are projected are switched together.

However, there is always a certain delay before the change in perspective reaches the LED walls and the resulting camera image then reaches the Aximmetry system. To ensure that the switch of the LED wall picture and the camera change happen at the same time on Aximmetry's final output, you have to specify a Picture Delay value. Adjust this value until you get a satisfying result.

UsingLedWallsForVirtualProduction Image129.png
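Conceptually, the Picture Delay behaves like a fixed-length frame buffer that holds the switch back until the round trip has completed. A minimal sketch (the class name and frame counts are illustrative):

```python
from collections import deque

class PictureDelay:
    """Delay a signal by a fixed number of frames so that a camera
    switch on the output lines up with the moment the LED wall's new
    perspective comes back through that camera's video feed."""
    def __init__(self, frames):
        self.buffer = deque([None] * frames)

    def push(self, value):
        """Enqueue the newest value and emit the one delayed by
        `frames` frames."""
        self.buffer.append(value)
        return self.buffer.popleft()

# Each pushed value re-emerges three frames later:
delay = PictureDelay(3)
outputs = [delay.push(cam) for cam in ["CAM1", "CAM1", "CAM1", "CAM2", "CAM2"]]
print(outputs)  # [None, None, None, 'CAM1', 'CAM1']
```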

Additions to the final image

As we discussed before, the incoming camera image already contains the final composition, therefore it is simply passed through to the program output.

However, you may want to make some final adjustments to the image. For that go to the INPUTS control board and use the ADJUSTER X panels.

UsingLedWallsForVirtualProduction Image130.png

UsingLedWallsForVirtualProduction Image131.png

Also, similar to other camera handling compounds, LEDWallCam has input pins that allow using the standard Overlays functionality to add lower thirds, a channel logo, etc.

UsingLedWallsForVirtualProduction Image132.png

Digital extension


Aximmetry is capable of extending the virtual graphics beyond the boundaries of the physical LED walls. The system generates a mask based on the camera position and on the location and size of the LED walls. Within the masked area the real camera image is shown (with the LED wall in the background), while outside of it the extended virtual space is rendered.
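A highly simplified sketch of the masking idea: project the wall's corners through an ideal pinhole camera (no lens distortion; the focal length, image center, and wall size are all hypothetical) and test whether a given pixel falls inside the projected quad:

```python
def project(point, f=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a 3-D point (camera at the origin,
    looking down +Z) into pixel coordinates -- a simplification of
    what a tracked camera with lens data provides."""
    x, y, z = point
    return (f * x / z + cx, f * y / z + cy)

def inside_convex(poly, pt):
    """True if pt lies inside the convex polygon, detected via a
    consistent sign of the edge cross products."""
    signs = []
    for i in range(len(poly)):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % len(poly)]
        cross = (x2 - x1) * (pt[1] - y1) - (y2 - y1) * (pt[0] - x1)
        signs.append(cross >= 0)
    return all(signs) or not any(signs)

# Hypothetical 2 m x 1 m wall, 3 m in front of the camera:
corners = [(-1, -0.5, 3), (1, -0.5, 3), (1, 0.5, 3), (-1, 0.5, 3)]
mask_poly = [project(c) for c in corners]

# Inside the projected wall -> keep the real camera image;
# outside -> composite the rendered virtual extension.
print(inside_convex(mask_poly, (640, 360)))  # True: wall area
print(inside_convex(mask_poly, (20, 20)))    # False: extended area
```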

Please note that a usable digital extension requires absolutely precise camera tracking and lens distortion data. Therefore this feature is usually only applied as a previz during the recording, and the final digital extension is done in post-production.

Also note that this method is only usable if the actors are always in front of the LED walls. If they are allowed to step outside of them, you need a green screen behind the LED walls.

To use the feature, simply activate the DIGITAL EXT panel.

UsingLedWallsForVirtualProduction Image133.png

UsingLedWallsForVirtualProduction Image134.png

By adjusting the Edge Softness property you can get a harder or softer transition between the real and the extended image.

UsingLedWallsForVirtualProduction Image135.png

UsingLedWallsForVirtualProduction Image136.png

Color matching using a LUT

There will always be a color difference between the real and the extended image: the real image is first shown on the LED wall and then seen through a camera, and both steps introduce some degree of color transformation.

As a first measure (especially if you only use the digital extension option for previz) you can use the DIGIT EXT ADJ panel's usual Brightness / Contrast etc. controls to bring the virtual part's colors closer to those of the LED walls.

However, for a better match you will need a LUT. In the panel's LUT property you can specify any standard Cube LUT file.

UsingLedWallsForVirtualProduction Image137.png  UsingLedWallsForVirtualProduction Image138.png

Aximmetry also provides the means to create such a LUT file. For that activate the LUT measure panel.

UsingLedWallsForVirtualProduction Image139.png

At first you will see a checker pattern projected on the LED wall in the frustum area.

UsingLedWallsForVirtualProduction Image140.png

On your preview monitor you will see the 8 x 8 pattern coming through the camera, and also 8 x 8 rectangular markers that designate the areas from which Aximmetry will sample (and average) the colors.

UsingLedWallsForVirtualProduction Image141.png

As you can see, due to the camera's lens distortion the sampling areas might be a bit off. For correct color sampling it is important that they are placed inside the pattern cells. To compensate for the distortion, adjust the following properties until all the sampling rectangles are inside their corresponding pattern cells.

UsingLedWallsForVirtualProduction Image142.png  UsingLedWallsForVirtualProduction Image143.png

UsingLedWallsForVirtualProduction Image144.png
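As an illustration only: a common way to model such a compensation is a single radial term that pushes sample centers outward (or pulls them inward) more strongly toward the image edges. The coefficient `k1` below is hypothetical and does not correspond to a named Aximmetry property:

```python
def adjust_sample_center(u, v, k1=0.08, cx=0.5, cy=0.5):
    """Shift a normalized sampling-rectangle center (u, v in 0..1)
    with a single radial distortion term so that each sampler lands
    inside its pattern cell. k1 is a hypothetical coefficient; the
    actual panel exposes its own distortion properties."""
    dx, dy = u - cx, v - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2
    return cx + dx * scale, cy + dy * scale

# Centers near the image edge move more than those near the middle:
print(adjust_sample_center(0.5, 0.5))  # unchanged: (0.5, 0.5)
print(adjust_sample_center(0.9, 0.9))
```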

Specify the target folder and filename parameters.

UsingLedWallsForVirtualProduction Image145.png

Press the play button.

UsingLedWallsForVirtualProduction Image146.png

The system will go through the 32 x 32 x 32 different colors, displaying and sampling an 8 x 8 portion at a time. The whole process takes several minutes.

UsingLedWallsForVirtualProduction Image147.png

At the end a .cube file is saved containing the 32-sized cube LUT. This is the file you can then specify for DIGIT EXT ADJ.
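For reference, .cube is a plain text format. This sketch writes a neutral (identity) 32-sized LUT in that layout (header plus one `r g b` row per entry, red index varying fastest); Aximmetry's measured file will of course contain the sampled colors instead of identity values:

```python
def write_identity_cube(path, size=32):
    """Write a neutral (identity) 3-D LUT in the standard .cube text
    format: a LUT_3D_SIZE header followed by size^3 'r g b' rows,
    with the red index varying fastest."""
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    f.write(f"{r/(size-1):.6f} {g/(size-1):.6f} "
                            f"{b/(size-1):.6f}\n")

write_identity_cube("identity.cube")
with open("identity.cube") as f:
    lines = f.read().splitlines()
print(lines[0], len(lines) - 1)  # LUT_3D_SIZE 32 32768
```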

The Shot Period specifies the interval between two 8 x 8 measurements. Due to the total delay of the LED walls plus the camera input, it is necessary to wait between changing the colors and sampling the result. If you have a low delay in your system, you can try lowering the Shot Period to shorten the total process time.

UsingLedWallsForVirtualProduction Image148.png
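The "several minutes" follows directly from these numbers. A back-of-the-envelope calculation, assuming a hypothetical 0.5-second Shot Period:

```python
# The measurement covers 32^3 = 32768 colors, sampled 8 x 8 = 64 at a
# time, so the shot count and total duration are:
colors = 32 ** 3
colors_per_shot = 8 * 8
shots = colors // colors_per_shot   # 512 shots
shot_period = 0.5                   # seconds -- hypothetical value
total_minutes = shots * shot_period / 60
print(shots, total_minutes)         # 512 shots, about 4.3 minutes
```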

It is important to note that the displayed colors go through the frustum ADJUSTER X of the given LED wall. Therefore a given LUT file is only usable with the exact same physical settings of both the LED wall and the camera, and the exact same ADJUSTER settings.