
Aximmetry DE Scene Setup (AR)

NOTE: This document details the setup process for augmented reality within Aximmetry DE. It assumes a basic knowledge of Aximmetry DE and how to work with tracked cameras. If you are not familiar with these, please read Introduction to AR Production and Preparing the Unreal Project documentation before proceeding.

Unreal Project Settings

IMPORTANT: If you are not beginning with the Aximmetry Blank project, you must configure the settings as outlined in the Open or Create the Unreal Project section of the preceding documentation. In addition, you also need to set the following Unreal Project Settings:

  • Allow Through Tonemapper:
    • Navigate to Edit > Project Settings: Engine - Rendering / Postprocessing.
    • Set Enable alpha channel support in post-processing to Allow Through Tonemapper.
  • Disable Motion Blur (recommended):
    • Navigate to Edit > Project Settings: Engine - Rendering / Default Settings.
    • Deselect Motion Blur.
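
If you prefer editing config files, both settings map to console variables in the project's Config/DefaultEngine.ini. This is a sketch, not an official Aximmetry recipe: the variable names below match recent UE4/UE5 releases, but verify them against your engine version (in newer UE5 versions `r.PostProcessing.PropagateAlpha` is exposed as a bool rather than an enum):

```ini
[/Script/Engine.RendererSettings]
; "Enable alpha channel support in post-processing" = Allow Through Tonemapper
r.PostProcessing.PropagateAlpha=2
; Disable the project-default Motion Blur
r.DefaultFeature.MotionBlur=False
```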

Otherwise, if you started with the Aximmetry Blank project, the only setting you need to adjust is as follows:

  • FloatRGBA Pixel Format:
    • Go to Edit > Project Settings: Engine - Rendering / Postprocessing.
    • Set Frame Buffer Pixel Format to FloatRGBA.
    • Restart the Unreal Editor.
    • After making this change, it is normal for the Unreal Editor to have a gray, washed-out appearance:

      NOTE: The Editor color also changes in the standard Unreal Editor when the Frame Buffer Pixel Format is changed to FloatRGBA.
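
The same change can be made directly in Config/DefaultEngine.ini. The enum value 3 for FloatRGBA is an assumption based on recent engine versions; the safest way to confirm it is to toggle the setting in the editor and diff the file:

```ini
[/Script/Engine.RendererSettings]
; "Frame Buffer Pixel Format" = FloatRGBA (restart the editor afterwards)
r.DefaultBackBufferPixelFormat=3
```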

Placing AR Objects

IMPORTANT: Place virtual objects in the scene and tag them with the “AximmetryAR” tag. Only tagged objects will show up in the final image.
Ensure the tag is added to the Actor / Tags array, not the Tags / Component tags array:

NOTE: Dynamically spawned objects must call the Camera blueprint for the tag to take effect; more on that later in this document.

Optional: Place a Shadow Catcher or Reflection Catcher.

Shadow and Reflection Catcher

The AR camera in Aximmetry is capable of capturing shadows and reflections on a fully transparent plane (zero alpha), providing a more immersive integration of virtual elements with the real world.

These solutions can be added by dragging and dropping them into your scene from the Content > Aximmetry_TrackedCam_AR > Blueprints folder in the Content Drawer panel:

Shadow Catcher

There is no limit on the number of Shadow Catchers you can place in a scene.

After adding the Shadow Catcher to your scene, you're free to adjust its transformation as needed:

In Aximmetry, the Shadow Catcher’s physical plane won’t be visible; only the captured shadows will be displayed:

To fine-tune the shadows to match the real-world environment more naturally, adjust the Shadow Strength parameter from the CATCHERS panel in the Aximmetry INPUTS control board.

NOTE: Avoid adding the "AximmetryAR" tag to the Shadow Catcher; otherwise, its white material will become visible in Aximmetry.

Reflection Catcher

The maximum number of Reflection Catchers is 3. Keep in mind that reflections carry a very high performance cost.

Once you have added the Reflection Catcher to your Unreal scene, you are free to modify its transformation to your liking:

In Aximmetry, you'll observe the reflections cast by objects on the Reflection Catcher's invisible plane:

To ensure these reflections blend seamlessly with the real environment, change the Reflection Blur and Reflection Strength parameters within the CATCHERS panel of the INPUTS control board.

NOTE: While ambient lights may illuminate the Reflection Catcher's plane in the Unreal Editor, they will not influence the Reflection Catcher in Aximmetry as long as it doesn't have the "AximmetryAR" tag.

Fog and Sky

AR cameras replace the camera background with a fully transparent color (zero alpha). As a result, certain Unreal Engine features, including Volumetric Cloud, SkyAtmosphere, Atmospheric Fog, Exponential Height Fog, and some sky sphere methods, may not be compatible with AR cameras and can cause artifacts. It is therefore recommended to delete these objects when working with AR cameras.

However, a sky can still be represented using a sky sphere, and fog effects can be achieved with Local Height Fog, found in the Window > Place Actors panel:

NOTE: Similarly, materials using additive blending do not blend correctly over areas displaying only the real camera's feed. Instead of additive blending, we recommend using AlphaComposite blending.

Tone Map and Glares

Adjust video post-process effects via the TONE MAP and GLARES panel in the INPUTS control board in Aximmetry:

You can find information about the Tone Mapper and Glares' Bloom, Streaks, and Ghosts in the Post Process Effects documentation. For an in-depth exploration of the Tone Mapper, refer to the Tone Mapping Methods documentation.

These Glares options are exclusive to AR camera compounds due to the composite nature of reflections, shadows, backgrounds, and virtual graphics, which necessitates tone mapping in Aximmetry rather than in Unreal.
NOTE: With Unreal's tone mapping bypassed, Unreal settings such as bloom, exposure, chromatic aberration, dirt mask, color grading, and film post-process options will not influence Aximmetry’s AR camera output.

AR in Green and LED Wall Projects

The Green + AR camera (tracked, 1-3 billboards) and LED wall + AR camera (tracked, 1-9 walls) compounds support rendering AR elements either in front of or behind the standard functionalities.
In Green (tracked) camera projects, this AR solution is useful for composite productions where part of the real-world studio is visible alongside the virtual graphics.
In LED Wall projects, this AR solution allows you to render graphics in front of the talent in the final image.

To mark an object for AR rendering, simply assign the “AximmetryAR” tag to it.

This functionality differentiates between AR content and normal camera output through the dual rendering process, where two separate cameras render the AR and standard graphics respectively. These separate renders can be observed distinctly in Live Sync mode in the Unreal Editor:

In Aximmetry, the AR layer is blended over the composite of the virtual and real-world studios:

To control whether the AR layer blends in front of or behind the standard output:

  • Green + AR camera (tracked, 1-3 billboards): When AR Overlay Behind is turned on in the SCENE panel, the billboard appears in front of the AR-rendered graphics. When it is turned off, the AR graphics always stay in front of the billboard.
  • LED wall + AR camera (tracked, 1-9 walls): When Put On LED is turned on in the AR OVERLAY panel, the AR-rendered graphics are displayed on the LED wall, placing the AR elements behind the talent. Conversely, when this setting is deactivated, the AR content is displayed in front of the talent. This occurs because the AR content is blended over the camera's live feed rather than being integrated with the imagery displayed on the LED wall.

Located beneath the option for overlay configuration, the (AR) Linear Blend setting adjusts the blending mode between AR and the standard camera, toggling between sRGB and Linear. By default, this setting is enabled and should not be modified. Unreal Engine processes alpha in Linear, and altering this setting can negatively impact the visualization of transparent areas. This is especially noticeable around the edges of objects, which may exhibit partial transparency due to anti-aliasing effects.

NOTE: With the + AR camera types, the Frame Buffer Pixel Format only needs to be set to FloatRGBA when HDR is used in the camera compound.

IMPORTANT: Shadow and Reflection Catchers are not yet supported in the + AR variants of the Green and LED Wall projects. That is why the Unreal project module has only an AR Out pin, and the reflection and shadow video pins are missing:

NOTE: These combinations with AR can also be achieved using multiple machines, distributing the rendering load across them. For those interested in implementing such a setup, the Combine Different Productions in Separate Machines documentation details the process; note that in this case the normal camera compounds have to be used instead of the above "+ AR" cameras.

Dynamically Spawned Models and Actors

For objects spawned dynamically after the Unreal scene has initialized, specific blueprint logic is required to assign the “AximmetryAR” tag and ensure their visibility in Aximmetry.

After spawning the object, the Aximmetry_AR_Tracked_Camera's Update ARObjects action must be called:

NOTE: You can reference the AR Camera in the Level Blueprint. In other blueprints, you must search for the Aximmetry AR Camera.
NOTE: Update ARObjects is a resource-intensive action; avoid calling it every frame.

When using the Update ARObjects action, your actor must have the "AximmetryAR" tag on it.
Tags can be added directly within the blueprint using the Make Array node connected to a Set Tags action.
NOTE: The Set Tags action, by default, is created without a Target pin. To create it with the Target pin, drag a connection from an Actor pin (Return Value) into an empty space to place a new node, and then select Set Tags.
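
For projects using C++ rather than Blueprints, the same spawn, tag, and update flow can be sketched as below. This is an illustrative sketch only: the `UpdateARObjects` function name and the camera reference are assumed stand-ins for whatever the Aximmetry camera Blueprint actually exposes in your project, so locate the real camera actor and node names in the editor before adapting it.

```cpp
// Sketch: spawn an actor at runtime, tag it for Aximmetry's AR pass,
// then refresh the camera's AR object list. Requires Unreal Engine headers.
void SpawnTaggedARProp(UWorld* World, TSubclassOf<AActor> PropClass,
                       AActor* AximmetryCamera /* hypothetical reference */)
{
    AActor* Prop = World->SpawnActor<AActor>(PropClass, FTransform::Identity);
    if (Prop == nullptr)
    {
        return;
    }

    // Equivalent of the Make Array -> Set Tags Blueprint nodes:
    // only actors carrying this tag appear in the AR output.
    Prop->Tags.Add(FName(TEXT("AximmetryAR")));

    // Equivalent of calling Update ARObjects on the Aximmetry camera.
    // Call it once after spawning; it is resource-intensive, so avoid
    // invoking it every frame.
    // AximmetryCamera->UpdateARObjects();  // hypothetical function name
}
```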
