UE5: How to Install and Work with the Unreal Engine-Based DE edition

Introduction

This document describes what Aximmetry Dual Engine (Aximmetry DE) is, how to install it, how to get started with the included example projects, and how to create your own projects.
Please note that this document describes the usage of Unreal Engine 5. If you are working with Unreal Engine 4, you will find the corresponding document here.

Aximmetry DE is a combination of Aximmetry and our customized version of the Unreal Engine: Unreal Engine for Aximmetry. Aximmetry DE lets you create your virtual studios in the Unreal Editor and use them in Aximmetry, so you benefit from both the world-building and rendering features of Unreal and the advanced virtual studio capabilities of Aximmetry.

To get started, read through this document, install Aximmetry DE, take a look at the examples, and start experimenting with your own projects.

How to install Aximmetry DE

  • Run the Aximmetry DE installer
    • In the "Select Projects Folder" step, you can select which content packages should be downloaded and installed.
    • Install the "Common Library" package.
    • Install the “Tutorials and Examples” package. This contains some example projects.
    • The "Inventory" and "Studio: Demo Sets" packages are not required but we recommend you install these too.
    • The rest of this document will refer to the folder selected in this step as Projects.
  • The Aximmetry DE installer will download and launch the Unreal Engine for Aximmetry installer
    • We suggest you install Unreal Engine for Aximmetry to the same folder as Aximmetry (which is the default) but you can install it anywhere you want.

How to load the example projects in Aximmetry

We included 2 example projects to help you get started:

Basic ([Tutorials]:Unreal\Basic.xcomp) shows a minimal setup: a very simple Unreal scene with a virtual camera and a single billboard controlled by Aximmetry. A billboard can be described as the keyed camera image of a talent displayed on a plane in 3D space.

Complex ([Tutorials]:Unreal\Complex.xcomp) shows some more advanced features:

  • transparent objects in front of the billboard
  • high-quality reflections
  • 2 billboards with some advanced settings (one of them is affected by lights in the scene, the other is not)
  • virtual camera movement
  • controlling the position of a light from Aximmetry

To use the example project:

  1. Open the Unreal project in Unreal Editor for Aximmetry. The Unreal project file can be found at [Tutorials]:Unreal\Basic\Basic.uproject.
  2. Click Aximmetry / Cook Content for Aximmetry DE. This converts the project into a format that can be loaded by Aximmetry.
  3. After cooking is complete, the Unreal Editor is not needed anymore. You can close it if you want.
  4. Open the Aximmetry compound ([Tutorials]:Unreal\Basic.xcomp) in Aximmetry.
  5. After the project is loaded you should see the rendered scene in your output window.

The same steps apply to the Complex project.

How to create your own projects

The basic workflow is the following (each step is explained in detail in the sections below):

  1. Create a new Unreal project. Some project settings have to be modified as detailed below.
  2. Add an Aximmetry Virtual Camera to the scene. This is a custom Unreal package that facilitates the connection between Aximmetry and Unreal, provides virtual camera functionality, and renders the billboards.
  3. Edit the scene and build your virtual environment.
  4. After your scene is ready, cook it so it can be used by Aximmetry.
  5. Open Aximmetry, create a new compound, and drag the Unreal project file into it.
  6. After connecting the modules as detailed below, you should have a working Aximmetry DE project.

In Unreal

Create a new Unreal project. We recommend the Aximmetry Blank project with no starter content.
The Aximmetry Blank project has modified Project Settings that are needed to work together with Aximmetry. If you are opening an already existing project, you have to apply the project settings detailed here before continuing.

Add an Aximmetry camera to the scene:

  • Go to Aximmetry / Add Camera and select the camera that you want to use.
  • This will copy the assets from the Aximmetry Common Library to your Unreal project and add a camera actor to the scene. For more details, see the Adding and updating cameras section.
  • Do not move or rename the asset directory. (Unreal Editor for Aximmetry expects the assets to be there, and moving them might also break asset references.)
  • The position and orientation of the camera actor in the scene don't affect the output because they will be controlled by Aximmetry during runtime.

Edit the scene, build your virtual environment

Save the project

Cook the project by clicking Aximmetry / Cook Content for Aximmetry DE

After cooking is complete, the Unreal Editor is not needed anymore. You can close it if you want.

Converting an Unreal 5 project into an Aximmetry Unreal 5 project

You can also open an existing Unreal project with Unreal for Aximmetry. In this case, we recommend choosing the Open a Copy option when the project conversion window pops up.

After that, you have to change the following Project Settings (a sketch of the resulting DefaultEngine.ini entries is shown at the end of this section):

Aximmetry Virtual Camera has to be able to write to a custom stencil. To enable this:

  • Go to Edit / Project Settings / Engine / Rendering / Postprocessing
  • Set Custom Depth-Stencil Pass to Enabled with Stencil

Enable clip planes:

  • Go to Edit / Project Settings / Engine / Rendering / Lighting
  • Enable Support global clip plane for Planar Reflections

By default, the player pawn will show up in the scene as a gray sphere. This can be resolved in 2 ways:

  • A) Set the player model to an invisible one:
    • Go to Edit / Project Settings / Project / Maps & Modes / Default Modes
    • Create a new game mode by pressing the + button (Give it a name and press OK)
    • The new game mode should be selected as the Default GameMode
    • Open the Selected GameMode section
    • Change the Default Pawn Class to None
  • B) Move the PlayerStart outside of the scene. In this case, the gray sphere will spawn in an invisible area. (Its position does not matter to Aximmetry.)

Set up the startup maps:

  • Save the map (File / Save Current)
  • Go to Edit / Project Settings / Project / Maps & Modes / Default Maps
  • Set both Editor Startup Map and Game Default Map to your map
    Note: If you rename or move the map asset, you have to clear and select the startup maps again because Unreal does not update these project settings automatically.
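
For reference, the settings above end up in your project's Config/DefaultEngine.ini. Below is a sketch of the resulting entries, assuming a map asset at /Game/Maps/Studio and a game mode blueprint named BP_EmptyGameMode (both paths are placeholders for your own assets). Editing the ini by hand is optional; the Project Settings UI writes the same values.

    [/Script/Engine.RendererSettings]
    ; Custom Depth-Stencil Pass: Enabled with Stencil
    r.CustomDepth=3
    ; Support global clip plane for Planar Reflections
    r.AllowGlobalClipPlane=True

    [/Script/EngineSettings.GameMapsSettings]
    ; Hypothetical asset paths - replace with your own map
    EditorStartupMap=/Game/Maps/Studio.Studio
    GameDefaultMap=/Game/Maps/Studio.Studio
    ; Only if you chose option A above - hypothetical game mode with no default pawn
    GlobalDefaultGameMode=/Game/BP_EmptyGameMode.BP_EmptyGameMode_C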

Then you can continue with adding the Aximmetry camera as said at the beginning of this section.

Setting up objects in front of the billboard

If you place objects in front of the billboard, you have to take some steps to avoid visual artifacts. The reason is that Unreal uses temporal antialiasing techniques to make the edges of objects look smooth.
Both Temporal Anti-Aliasing (TAA) and Temporal Super-Resolution (TSR) make the billboard very blurry, so we modified them to fix this problem. Our modified version needs to know about objects in front of the billboard to correctly handle edges and translucency.
To provide this information, you have to mark the objects that can occlude the billboard with a special stencil value (the editor steps follow, and a C++ alternative is sketched after them):

If the object is opaque and you are using TAA:

  • Select the object.
  • In the Details window, go to the Rendering section.
  • Enable Render CustomDepth Pass.
  • Set CustomDepth Stencil Value to 16.

Note: Not doing this results in aliasing and jittering at the edge of the object.

If the object is translucent and you are using TAA or TSR:

  • Select the object.
  • In the Details window, go to the Rendering section.
  • Enable Render CustomDepth Pass.
  • Set CustomDepth Stencil Value to 32.
  • Go to the material editor by double-clicking the translucent material in the Content Browser.
  • In the Details window, enable Translucency / Allow Custom Depth Writes.
  • Set Material / Opacity Mask Clip Value to 0.

Note: Not doing this results in aliasing and jittering at the edge of the object and in front of the billboard.
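
These flags can also be set from code, for example on actors you spawn procedurally, because the same properties are exposed on UPrimitiveComponent. Below is a minimal sketch, assuming a hypothetical AMyOccluder actor class (the class name and setup are placeholders, not part of the Aximmetry package):

    // MyOccluder.h - a hypothetical actor whose mesh can occlude the billboard
    #pragma once
    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Components/StaticMeshComponent.h"
    #include "MyOccluder.generated.h"

    UCLASS()
    class AMyOccluder : public AActor
    {
        GENERATED_BODY()
    public:
        AMyOccluder()
        {
            Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
            RootComponent = Mesh;

            // Equivalent of the Details panel settings above:
            Mesh->SetRenderCustomDepth(true);        // "Render CustomDepth Pass"
            Mesh->SetCustomDepthStencilValue(16);    // 16 = opaque occluder, 32 = translucent occluder
        }

    private:
        UPROPERTY(VisibleAnywhere)
        UStaticMeshComponent* Mesh;
    };

For translucent occluders, use the value 32 and remember that the material-side settings (Allow Custom Depth Writes and Opacity Mask Clip Value) still have to be configured in the material editor as described above.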

In Unreal Editor for Aximmetry, the default antialiasing method is TAA because we found that TSR can cause rendering errors on the billboard if it is occluded by an opaque object.
If this is not an issue in your use case, you can switch to TSR.

Nanite virtualized geometry

Objects with Nanite meshes cannot be placed in front of billboards when Allow Virtuals is turned Off or Light Wrap is turned On; otherwise, the billboard will be rendered with artifacts.

Adding and updating cameras

Aximmetry cameras can be automatically added to the project by selecting one from the Aximmetry / Add Camera menu.
This will copy the assets from the Aximmetry Common Library to your Unreal project and add a camera actor to the scene.

If there is a camera in the scene already, this menu can be used to replace the existing camera with another one.
A confirmation window will pop up with details about the operation.
If you continue, the existing camera actor and assets will be deleted and new ones will be added.

New versions of the Unreal assets and the corresponding Aximmetry compounds are released regularly.
When a new version is available, a notification will pop up in the lower left corner when you open the project, prompting you to update. Updating the camera assets is necessary to maintain compatibility with Aximmetry, so it is highly recommended.
If you choose to update, a confirmation window will pop up with details about the operation.
If you continue, the existing camera actor and assets will be deleted and new ones will be added.

If you are using an older version of the Aximmetry Common Library but your Unreal project contains a newer version of a camera, you will be prompted to downgrade to the camera version that is compatible with Aximmetry. This works the same way as the update.

In Aximmetry

Create a new compound

Drag in the .uproject file from your Unreal project root directory. This will create an Unreal Project module node.

Select the Unreal Project module and set the Connection parameter to Cooked

Connect the Out pin to the output of the compound. Now you should see an image rendered by Unreal appear in the output window.

By default, the camera is at the origin, so it's probably intersecting the geometry of the environment. You can use either a Camera Mover module or a Virtual Camera module to move the camera. We recommend you use the virtual camera because it automatically handles the billboards as well. To set up the camera movement:

  • Drag the virtual camera compound ([Common_Studio]:Camera\VirtualCam_Unreal\VirtualCam_Unreal_3-Cam.xcomp) into the Flow graph.
  • Connect the following pins (see screenshot below):
    • Unreal Project module: Out → VirtualCam: Rendered
    • Unreal Project module: B Mask → VirtualCam: B Mask
    • VirtualCam: Cam Transform → Unreal Project module: Cam Transform
    • VirtualCam: Cam Horizontal FOV → Unreal Project module: Cam Horizontal FOV
    • VirtualCam: Cam Focus Distance → Unreal Project module: Cam Focus Distance
    • VirtualCam: Preview → Compound output
    • VirtualCam: Out → Compound output

  • Now you can move the camera around by holding down the mouse buttons and dragging the mouse in the preview window.

If you want to use a billboard:

  • Add an input. This can be a video (for example [Common]:Videos\Green\Green_Karim_Notes.mp4) or a camera input.
  • Connect the following pins (see screenshot below):
    • Source output (e.g. Video Player: Out) → VirtualCam: Test Input 1
    • VirtualCam: B1 Data → Unreal Project module: B1 Data
    • VirtualCam: B1 Texture → Unreal Project module: B1 Texture
    • VirtualCam: B1 AO Texture → Unreal Project module: B1 AO Texture
    • VirtualCam: B1 Refl Texture → Unreal Project module: B1 Refl Texture
    • VirtualCam: B1 Shadow Texture → Unreal Project module: B1 Shadow Texture

  • Go to the Billboards control board window and make sure the cropping, keying, etc. is set up correctly.
  • You can add a second billboard by connecting the B2 ... pins.

Connection with Aximmetry

Passing data from Aximmetry to Unreal

You can pass arbitrary data from Aximmetry to Unreal. This can be used to move objects, set light intensity, etc.

To pass a value, place a Get Aximmetry TYPE node in a blueprint. TYPE can be any of the supported types: Color, Integer, Logical, Scalar, Text, Transformation, Trigger, Vector, and Video. The node will show up in Aximmetry as a pin on the Unreal Project module after you update the pin list. The pin list can be updated by clicking the red chain link button on the Unreal Project module.

The pin name and the default value are determined by the Name and Default Value parameters of the Get Aximmetry node. The node also has an Order Index parameter which determines the pin order in Aximmetry. The pins are ordered by increasing Order Index, but the value 0 is an exception: pins with this index go to the end of the list. Name, Default Value, and Order Index must all be static (i.e. you cannot use the outputs of other nodes as their input).

The output of these nodes is updated per frame.

The Aximmetry_Camera blueprint uses this mechanism to get values from Aximmetry, so you can find many usage examples in that blueprint.

Most nodes are straightforward to use but there are some trickier ones.

You can use the Level Blueprint to add the Aximmetry blueprint nodes. The Level Blueprint is a specialized type of blueprint that acts as a level-wide global event graph.

Get Aximmetry Trigger

  • Create the following nodes: Get Aximmetry Trigger, Event BeginPlay, Bind Event to Trigger, and Custom Event. (If you can't find the Bind Event to Trigger node, turn off Context Sensitive search in the right-click menu.)
  • Make the following connections (see screenshot below):
    • Event BeginPlay: Exec (white triangle) → Bind Event to Trigger: Exec
    • Get Aximmetry Trigger: Return Value → Bind Event to Trigger: Target
    • Custom Event (red square) → Bind Event to Trigger: Event

  • Use the Exec output of the Custom Event node to execute the desired action.

Get Aximmetry Video

  • The following example applies a video image received from Aximmetry onto a plane in Unreal. (A C++ sketch of the same setup follows the blueprint steps below.)
  • Create a base material. (This material will serve as a base for the dynamic material we will apply to the plane. For more information on dynamic materials, refer to the Unreal Engine documentation.)
    • Create a new material (Content Browser / Add New / Material)
    • Open this material in the material editor by double-clicking it in the content browser.
    • Add a Texture / TextureSampleParameter2D parameter from the right-click menu. The name of this parameter can be anything you want, but it must exactly match the “Texture Parameter Name” input of the “Get Aximmetry Video” blueprint node we will add later.
    • Connect the “RGB” output of the texture parameter to the “Base Color” input of the material.
    • You can customize this material further to achieve the effect you want, but this topic is outside the scope of this document.
    • Save the material

  • Create a reference to your target object (e.g. a TV screen) by dragging it into the blueprint from the World Outliner (Actor Reference)
  • Add the following nodes to a blueprint:
    • Event BeginPlay
    • Create Dynamic Material Instance
    • Set Material
    • Event Tick
    • Get Aximmetry Video
    • Set Texture Parameter Value
  • Connect the following pins (see screenshot below):
    • Event BeginPlay: Exec (out) → Create Dynamic Material Instance: Exec (in)
    • Create Dynamic Material Instance: Exec (out) → Set Material: Exec (in)
    • Create Dynamic Material Instance: Return Value → Set Material: Material
    • Actor Reference (out) → Set Material: Target (This will automatically add a Get Static Mesh Component node.)
    • Event Tick: Exec (out) → Get Aximmetry Video: Exec (in)
    • Get Aximmetry Video: Exec (out) → Set Texture Parameter Value: Exec (in)
    • Get Aximmetry Video: Return Value → Set Texture Parameter Value: Value
    • Create Dynamic Material Instance: Return Value → Set Texture Parameter Value: Target
  • Set up the node inputs:
    • Create Dynamic Material Instance
      • Parent: Select the parent material you created in the first step
    • Get Aximmetry Video
      • Name: This will be the name of the video input pin in Aximmetry
    • Set Texture Parameter Value:
      • Parameter Name: This must exactly match the name of the TextureSampleParameter2D parameter of the parent material you created in the first step.

(Screenshot: blueprint setup for the Get Aximmetry Video example)
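
For reference, the generic Unreal part of the graph above maps onto well-known C++ calls. Below is a minimal sketch, assuming a hypothetical AVideoScreen actor: ScreenMesh and BaseMaterial stand in for the target mesh and the parent material created above, and GetAximmetryVideoTexture() is only a placeholder for the texture delivered by the Get Aximmetry Video node (this document does not cover a C++ interface for it, so the blueprint node remains the supported route).

    // VideoScreen.h - hypothetical actor that shows a video texture on a plane
    #pragma once
    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Components/StaticMeshComponent.h"
    #include "Materials/MaterialInterface.h"
    #include "Materials/MaterialInstanceDynamic.h"
    #include "VideoScreen.generated.h"

    UCLASS()
    class AVideoScreen : public AActor
    {
        GENERATED_BODY()
    public:
        AVideoScreen() { PrimaryActorTick.bCanEverTick = true; }

        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            // Equivalent of Create Dynamic Material Instance + Set Material
            if (ScreenMesh && BaseMaterial)
            {
                ScreenMID = UMaterialInstanceDynamic::Create(BaseMaterial, this);
                ScreenMesh->SetMaterial(0, ScreenMID);
            }
        }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);
            // Equivalent of Get Aximmetry Video + Set Texture Parameter Value
            UTexture* VideoTexture = GetAximmetryVideoTexture();
            if (ScreenMID && VideoTexture)
            {
                // Must match the TextureSampleParameter2D name in the parent material
                ScreenMID->SetTextureParameterValue(TEXT("VideoTexture"), VideoTexture);
            }
        }

    private:
        // Placeholder only - in practice the texture comes from the Get Aximmetry Video blueprint node
        UTexture* GetAximmetryVideoTexture() const { return nullptr; }

        UPROPERTY(EditAnywhere) UStaticMeshComponent* ScreenMesh = nullptr;
        UPROPERTY(EditAnywhere) UMaterialInterface* BaseMaterial = nullptr;
        UPROPERTY() UMaterialInstanceDynamic* ScreenMID = nullptr;
    };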

Get Aximmetry Transformation

This node will show up as a transformation input pin on the Unreal Project module in Aximmetry. You can connect any transformation you want. However, if you want to control this transformation from Aximmetry in Edit mode, you can use a Scene Node:

  • Insert a Scene Node in your Aximmetry compound
  • Connect its World Transf output to the Unreal Project module (it is hidden by default - click the button in the upper right corner of the node to reveal it)
  • Enable Edit mode by choosing an output in the Edit / Edit Scene On menu (make sure the Preview output of the virtual cam module is connected to this output)
  • Select the Scene Node in the compound
  • You can move/rotate/scale the transform by using the handles in the output viewport

Passing data from Unreal to Aximmetry

This feature is currently limited to video outputs. It will be expanded in the future to cover all data types and an arbitrary number of outputs.

The Set Aximmetry Video blueprint node can be used to send a texture to Aximmetry. You can use multiple Set Aximmetry Video outputs using different names.

If there are no Set Aximmetry Video nodes in the scene, the default output of the Unreal Project module is the scene as it would be rendered to the screen by Unreal. It works like a hidden Set Aximmetry Video node with the name “out”. This default output has 3 possible formats that can be configured in Edit / Project Settings / Rendering / Default Settings / Frame Buffer Pixel Format.

Aximmetry Virtual Camera details

Setting up a virtual camera and billboard rendering in Unreal is a very complex process, so we built the Aximmetry_VirtualCam_3-Cam package to do most of the work.

It provides the following features:

  • virtual camera: position, orientation, field of view (FOV) controlled by Aximmetry
  • Billboard rendering
    • Maximum number of billboards: 3
    • Optional shadow
    • Optionally affected by lights in the scene
    • Optionally affected by the tonemapper in Unreal
    • Enhanced image quality and sharpness
  • Based on Cine Camera: you can add post-process effects etc.

The default settings should look good for most use cases, but there are some parameters you can tweak if they don't meet your needs:

  • Lit: Whether the lights in the Unreal scene affect the billboard. Disabled by default.
  • Cast Shadow: Whether the billboard casts a shadow. Enabled by default.
  • Inverse Tonemap: The tonemapper in Unreal can distort the colors of the billboard significantly. This option is used to counter the effect of the tonemapper on the billboard. Enabled by default.
    NOTE: Using this feature with heavily processed billboard textures may lead to unexpected results (even if the texture looks fine in Aximmetry). For example: increasing the brightness significantly with an Adjuster module. This is because this feature expects the pixel values to be in a specific range – values outside this range are clipped and the texture starts to lose detail.
  • Alpha Correction for Bright Background: Very bright backgrounds can make the billboard too transparent. This option can be used to fix this issue. Disabled by default.
    NOTE: The alpha correction cannot handle certain cases correctly. Things like DOF blur and translucent objects behind the billboard may cause rendering errors.

Gamma correction for video inputs

The Get Aximmetry Video blueprint node has a pin called S RGB that controls whether the received video is treated as sRGB or linear.

sRGB should be enabled for textures that contain color information (camera footage etc. that will be rendered into the scene), and it should be disabled for textures used for numerical calculations in shaders (normal maps, masks, etc.). NOTE: This option only affects 8-bit textures. Unreal treats all non-8-bit and floating-point textures as linear.

The Set Aximmetry Video node itself doesn't do any gamma correction. Its output depends on the settings and pixel format of the texture or render target you connect to it. For more information on this, please refer to the Unreal Engine documentation.

The default output is always treated as sRGB regardless of pixel format.
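
To illustrate what this distinction means: when an 8-bit texture is flagged as sRGB, Unreal decodes each channel from sRGB to linear before shading, roughly following the standard sRGB transfer function sketched below. Applying that curve to a normal map or a mask would distort its numerical values, which is why such textures should stay linear. (The conversion happens inside Unreal; the snippet is only for illustration.)

    // Standard sRGB-to-linear decode of a normalized channel value in [0, 1]
    #include <cmath>

    float SrgbToLinear(float C)
    {
        return (C <= 0.04045f) ? C / 12.92f
                               : std::pow((C + 0.055f) / 1.055f, 2.4f);
    }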

Interactive editing

Aximmetry can connect to Unreal in two different ways; each one is useful in a different scenario:

  • Cooked mode: Aximmetry directly loads the content generated by Unreal, so running a separate Unreal Editor is not needed. This mode should be used in production because it provides the highest performance. This is what you have used so far.
  • Live Sync mode: Aximmetry connects to a standalone Unreal Editor instance. This is useful while you are building your scene in Unreal, because it lets you interactively edit and test the scene, without having to cook it after every change.

In Cooked mode, Aximmetry uses the cooked version of the project, which is generated when you click Aximmetry / Cook Content for Aximmetry DE in Unreal. This means that if you edit the scene in Unreal, you have to cook the project again. You can leave Aximmetry running with the Unreal Project module in Cooked mode - it will pick up the changes when cooking is finished and reload the project automatically. If you added, removed, or changed any Aximmetry nodes in the Unreal blueprint as described in the Passing data from Aximmetry to Unreal section, Aximmetry will detect the change after cooking and a red chain link button will show up in the upper right corner of the module. Click this button to update the pin list.

In Live Sync mode, Aximmetry connects to the running Unreal Editor when you press the Play button. This is when the Unreal Project module will pick up the changes in the pin list as described above. Click on the red chain link button to update the pin list.

To use Live Sync:

  1. Open the project in Unreal and press Play
  2. Drag the .uproject file into an Aximmetry compound and set the Connection pin to Live Sync
  3. Update the pin list by clicking the red chain link button
  4. Add a virtual camera (or a camera mover) and connect the pins as described above
  5. Now you should see the same image both in Aximmetry and in the Unreal Editor

You can stop and start both the Unreal scene and the Aximmetry compound, or even the Unreal Editor and Aximmetry itself; the connection will be re-established automatically.

The resolution of the viewport is set automatically based on the value of the Out Size pin just like in Cooked mode. If the game is running in a separate window, the window is also resized.

Other considerations

Depth of field (DOF)

When changing focus in Aximmetry, the Aximmetry camera changes the Cinematic Depth of Field method's manual focus distance.
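
For context, this corresponds to the manual focus settings of Unreal's Cine Camera component. The sketch below shows the equivalent property assignment in C++ (the function name is hypothetical, and the Aximmetry camera already drives this for you, so you normally don't have to set it yourself):

    #include "CineCameraComponent.h"

    // Sketch: the manual focus properties that the Cinematic DOF focus distance maps to
    void SetManualFocus(UCineCameraComponent* Camera, float FocusDistanceCm)
    {
        Camera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
        Camera->FocusSettings.ManualFocusDistance = FocusDistanceCm; // Unreal units (cm)
    }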

The billboard is not affected by the depth of field effect due to limitations in the Unreal Engine.

Translucent objects - like the billboard - are rendered after the DOF blur is applied to the scene, so these objects are not blurred. There is an option to render translucent objects before the DOF is applied. This blurs the billboard, however, it leads to severe rendering artifacts (especially at the edges) because the depth information written by the billboard is not accurate enough for a good quality DOF calculation. For this reason, applying DOF to the billboard is not supported.

Ray Tracing in Unreal Engine

The Unreal Engine supports ray tracing. It consists of multiple features that can be turned on or off individually. See more information.

Please be aware that all ray tracing features have a high performance cost, even on modern hardware. We suggest that you disable all ray tracing features and then enable the ones that benefit your scene the most. How to configure these settings is described here.

The camera supports most ray tracing features but there are some caveats due to limitations in the Unreal Engine:

  • Ray Tracing Global Illumination is fully supported.
  • Ray Tracing Reflections and Ray Tracing Shadows are supported with some limitations. These features are described below in more detail alongside other reflection and shadow rendering methods.
  • Ray Tracing Ambient Occlusion and Ray Tracing Translucency are not supported.

Reflections In Unreal Engine

The Virtual and Tracked cameras support all reflection methods that are available in the Unreal Engine for rendering billboard reflections. However, there are some limitations due to how reflection rendering and ray-tracing work in the engine.
This section describes the properties, requirements, and limitations of each available reflection method.

To understand how various features interact with reflection rendering, the following variables need to be considered:

  • Unreal Editor for Aximmetry: Project Settings / Engine / Rendering / Global Illumination / Dynamic Global Illumination Method
  • Unreal Editor for Aximmetry: Project Settings / Engine / Rendering / Reflections / Reflection Method
  • Unreal Editor for Aximmetry: Project Settings / Engine / Rendering / Hardware Ray Tracing / Support Hardware Ray Tracing, Ray Traced Shadows
  • Unreal Editor for Aximmetry: Project Settings / Engine / Rendering / Lumen / Use Hardware Ray Tracing when available, Ray Lighting Mode
  • Aximmetry: BILLBOARDS control board (when using Virtual Camera) / Auto Mirror Offset, Mirror Offset, Mirror Feet Blur, Feet Blur Offset, Cast Shadows, Shadow Depth, Shadow Offset, Shadow Rotation, Lit, Render To Depth, Ray Traced Reflection Intensity
  • Aximmetry: TRK INPUTS control board (when using Tracked Camera) / Mirror Feet Blur, Blur Depth, Blur Amount, Mirror Offset Z, Cast Shadows, Shadow Depth, Shadow Offset, Shadow Rotation, Lit, Render To Depth, Ray Traced Reflection Intensity

Planar Reflections

High-quality reflections. Semi-transparent areas are reflected correctly.
Reflection Method must be None or Screen Space. Render To Depth must be Off. A Planar Reflection actor must be added to the scene.

Lumen Reflections

Average quality reflections. Semi-transparent areas are not reflected.
Reflection Method must be Lumen. Support Hardware Ray Tracing must be enabled. Use Hardware Ray Tracing when available must be enabled. Ray Lighting Mode must be Hit Lighting for Reflections. Render To Depth must be On.
When a Virtual Camera is used, the Auto Mirror Offset, Mirror Offset, Mirror Feet Blur, and Feet Blur Offset billboard properties have no effect.
When a Tracked Camera is used, the Mirror Feet Blur, Blur Depth, Blur Amount, and Mirror Offset Z billboard properties have no effect.
When Lit is Off, the brightness of the reflection can be different from the brightness of the billboard. This applies only to the reflection of offscreen parts of the billboard.
When Ray Traced Shadows are enabled and Cast Shadows is On, the Shadow Depth, Shadow Offset, and Shadow Rotation billboard properties affect the reflection. This applies only to the reflection of offscreen parts of the billboard.
When Ray Traced Shadows are enabled, Cast Shadows is On, Shadow Depth is greater than 0, and Lit is On, lighting can be incorrect on the reflection. This applies only to the reflection of offscreen parts of the billboard.
Increase Ray Traced Reflection Intensity until the brightness of the reflection matches the brightness of the billboard.

Screen Space Reflections

Low-quality reflections. Semi-transparent areas are not reflected.
Reflection Method must be Screen Space. A Planar Reflection actor must not be added to the scene. Render To Depth must be On.
When a Virtual Camera is used, the Auto Mirror Offset, Mirror Offset, Mirror Feet Blur, and Feet Blur Offset billboard properties have no effect.
When a Tracked Camera is used, the Mirror Feet Blur, Blur Depth, Blur Amount, and Mirror Offset Z billboard properties have no effect.

Standalone Ray Traced Reflections

Average quality reflections. Semi-transparent areas are not reflected.
Reflection Method must be Standalone Ray Traced. Support Hardware Ray Tracing must be enabled. Render To Depth must be On.
When a Virtual Camera is used, the Auto Mirror Offset, Mirror Offset, Mirror Feet Blur, and Feet Blur Offset billboard properties have no effect.
When a Tracked Camera is used, the Mirror Feet Blur, Blur Depth, Blur Amount, and Mirror Offset Z billboard properties have no effect.
When Lit is On, the brightness of the reflection can be different from the brightness of the billboard.
When Ray Traced Shadows are enabled and Cast Shadows is On, the Shadow Depth, Shadow Offset, and Shadow Rotation billboard properties affect the reflection.
When Ray Traced Shadows are enabled, Cast Shadows is On, Shadow Depth is greater than 0, and Lit is On, lighting can be incorrect on the reflection.
Increase Ray Traced Reflection Intensity until the brightness of the reflection matches the brightness of the billboard.

Limitations

Overriding the Reflection Method via a Post Process Volume or camera post-process settings is not supported.
Changing the Reflection Method during runtime is not supported.
Increasing Ray Traced Reflection Intensity can cause a noticeable glow around the billboard if Dynamic Global Illumination Method is set to Lumen or Standalone Ray Traced. If this happens, try to decrease the Ray Traced Reflection Intensity until the glow is not noticeable anymore and the reflection is not too dark either.

Shadows In Unreal Engine

The Virtual and Tracked cameras support both rasterized and ray-traced shadows for rendering billboard shadows.
Ray-traced shadows can be enabled in Project Settings / Engine / Rendering / Hardware Ray Tracing / Ray Traced Shadows.
Project Settings / Engine / Rendering / Hardware Ray Tracing / Support Hardware Ray Tracing must be enabled.
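
For reference, these two settings are typically stored in Config/DefaultEngine.ini as the entries sketched below. Editing the ini by hand is optional; the Project Settings UI writes the same values, and enabling Support Hardware Ray Tracing requires an editor restart.

    [/Script/Engine.RendererSettings]
    ; Support Hardware Ray Tracing (requires an editor restart)
    r.RayTracing=True
    ; Ray Traced Shadows
    r.RayTracing.Shadows=True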

Limitations

Cast Ray Tracing Shadows must be set to Use Project Setting for all light sources.
Changing Ray Traced Shadows during runtime is not supported.
