
Using Tracked Cameras and Augmented Reality with an Unreal Scene

Introduction

This document describes how to set up camera tracking and augmented reality in Aximmetry Dual Engine (Aximmetry DE). The material assumes a basic knowledge of Aximmetry DE and of working with tracked cameras. If you are not familiar with these, please read How to install and work with the Unreal Engine based DE edition and Setting Up Virtual Sets with Tracked Cameras first.

Camera Tracking

Setting up a project with a tracked camera is essentially the same process as setting one up with a virtual camera, with a few minor differences described here. Setting up a project with a virtual camera is detailed in How to install and work with the Unreal Engine based DE edition.

Unreal setup

  • Enable global clip plane. This is required for the tracked camera to work properly.
    • Go to Edit / Project Settings / Engine / Rendering / Lighting
    • Tick the Support global clip plane for Planar Reflections checkbox
  • Disable Motion Blur
    • Go to Edit / Project Settings / Engine / Rendering / Default Settings
    • Untick Motion Blur
  • Copy the Projects\Common_Studio\Unreal_Assets\Aximmetry_TrackedCam_3-Cam directory into the Content directory of your Unreal project
  • If you want to use the tracked camera in Live Link mode, you have to set the rendering resolution so that it matches what Aximmetry expects:
    • Go to Edit / Editor Preferences / Level Editor / Play / Game Viewport Settings
    • Set New Viewport Resolution to the value of the Out Size pin of the TrackedCam module in Aximmetry. (You can view the value of a pin in Aximmetry by hovering over it and holding the Ctrl key)
    • Start the game by opening the dropdown menu to the right of the Play button and selecting New Editor Window (PIE). This will start the game in a new window with the resolution you set in the previous step. From now on you can just hit the Play button, the game will start in a new window.
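If you prefer editing configuration files, the two rendering settings above can also be applied directly in your Unreal project's Config/DefaultEngine.ini. The console variable names below are standard Unreal rendering settings, but verify them against your engine version; this is a sketch of the equivalent config, not part of the Aximmetry instructions:

```ini
[/Script/Engine.RendererSettings]
; "Support global clip plane for Planar Reflections" (required for tracked cameras)
r.AllowGlobalClipPlane=True
; Disable Motion Blur (Default Settings section)
r.DefaultFeature.MotionBlur=False
```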

Aximmetry setup

  • Drag a tracked camera module into the flow graph from [Common_Studio]:Compounds\TrackedCam_Unreal\TrackedCam_Unreal_Prev_3-Cam_3-Billboard.xcomp
  • Connect the following pins (see screenshot below):
    • Unreal Project module: BA Mask → TrackedCam: BA Mask
    • Unreal Project module: BB Mask → TrackedCam: BB Mask
    • Unreal Project module: BC Mask → TrackedCam: BC Mask
    • Unreal Project module: Out → TrackedCam: Rendered
    • TrackedCam: Out Size → Unreal Project module: Out Size
    • TrackedCam: Control Data → Unreal Project module: Control Data
    • TrackedCam: B texture → Unreal Project module: B texture
    • TrackedCam: Preview → Compound output
  • Set up the tracked camera input. Please refer to Setting Up Virtual Sets with Tracked Cameras on how to do this.

Annotation 2020-08-28 123855.png

Aximmetry DE supports three methods for compositing the camera image with the virtual set. The compositing method is selected via the Use Billboards and Allow Virtuals switches on the SCENE node in the INPUTS panel.

Use billboards: OFF, Allow Virtuals: not used

Aximmetry renders the image of the talent on top of the Unreal scene. This method is very easy to set up and has the smallest performance requirements; however, it doesn't support virtual objects occluding the talent, nor shadows and reflections of the talent in the Unreal scene.

Use billboards: ON, Allow Virtuals: OFF

Aximmetry renders the image of the talent on top of the Unreal scene. Virtual objects can occlude the talent, reflections and shadows are supported. The camera image is accurately reproduced without adding noise or distortions.

Use billboards: ON, Allow Virtuals: ON

Compositing is done entirely in Unreal. Virtual objects can occlude the talent, reflections and shadows are supported. With this method, the talent can be lit by the lights in the Unreal scene. Virtual camera movement is also supported. The quality of the talent’s image may be slightly degraded.

Augmented Reality

Aximmetry DE is capable of rendering virtual objects in the Unreal Engine and compositing them onto a tracked camera image.

Unreal setup

  • Create a new blank Unreal project
  • Enable custom stencil
    • Go to Edit / Project Settings / Engine / Rendering / Postprocessing
    • Set Custom Depth-Stencil Pass to Enabled with Stencil
  • Enable global clip plane
    • Go to Edit / Project Settings / Engine / Rendering / Lighting
    • Tick the Support global clip plane for Planar Reflections checkbox
  • Disable Motion Blur
    • Go to Edit / Project Settings / Engine / Rendering / Default Settings
    • Untick Motion Blur
  • Copy the Projects\Common_Studio\Unreal_Assets\Aximmetry_TrackedCam_AR directory into the Content directory of your Unreal project
  • Drag the Aximmetry_TrackedCam_AR\Aximmetry_Camera_AR blueprint into the scene
  • Optional: Place shadow catchers (Aximmetry_Shadow_Catcher). There is no restriction on the number of shadow catchers.
  • Optional: Place reflection catchers (Aximmetry_Reflection_Catcher). The maximum number of reflection catchers is 3. Keep in mind that reflections have a very high performance cost.
  • Place virtual objects in the scene and tag them with the “AximmetryAR” tag. Only tagged objects will show up in the final image.

Please make sure the tag is added to the Actor / Tags array and not the Tags / Component Tags array.

Annotation 2020-08-28 113008.png

  • If you want to use the AR camera in Live Link mode, set the resolution as detailed in the tracked camera section.
  • Set up the startup maps, save and cook the project as described in the How to install and work with the Unreal Engine based DE edition document.
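As with the tracked camera setup, the three AR project settings above can also be set in Config/DefaultEngine.ini. Here r.CustomDepth=3 corresponds to “Enabled with Stencil”; these are standard Unreal console variable names, offered as a sketch to double-check against your engine version rather than as part of the official Aximmetry steps:

```ini
[/Script/Engine.RendererSettings]
; "Custom Depth-Stencil Pass" set to "Enabled with Stencil"
r.CustomDepth=3
; "Support global clip plane for Planar Reflections"
r.AllowGlobalClipPlane=True
; Disable Motion Blur
r.DefaultFeature.MotionBlur=False
```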

Aximmetry setup

  • Create a new compound
  • Drag in the .uproject file from your Unreal project root directory
  • Select the Unreal Project module and set the Connection parameter to Cooked.
  • Drag an AR camera module into the flow graph from [Common_Studio]:Compounds\TrackedCam_Unreal\TrackedCam_AR_Unreal_Prev_3-Cam.xcomp

  • Connect the following pins (see screenshot below):
    • Unreal Project module: Out → TrackedCam_AR: Rendered
    • Unreal Project module: Alpha → TrackedCam_AR: Alpha
    • Unreal Project module: Reflection 1, 2, 3 → TrackedCam_AR: Reflection 1, 2, 3
    • Unreal Project module: Shadow → TrackedCam_AR: Shadow
    • Unreal Project module: Shadow Clean Plate → TrackedCam_AR: Shadow Clean Plate
    • TrackedCam_AR: Out Size → Unreal Project module: Out Size
    • TrackedCam_AR: Control Data → Unreal Project module: Control Data
    • TrackedCam_AR: Preview → Compound output
  • Add a tracked camera input. Please refer to the Setting Up Virtual Sets with Tracked Cameras documentation on how to do this.

Annotation 2020-08-28 123816.png

  • In the INPUTS panel, select the SCENE module and use the Base Cam Transf parameter to position the virtual objects correctly on the tracked camera image.
  • In the INPUTS panel, select the CATCHERS module and adjust the Reflection Blur, Reflection Strength and Shadow Strength parameters to match the properties of the real surface for a realistic result.