
AR/XR - Augmented Reality

Please note that this is a BETA version of the document. All information presented is correct, but we are working on improving the details.

Introduction

With Augmented Reality (AR, sometimes also referred to as XR), we can enhance a production by adding virtual elements to the talent's surroundings. These elements can be static or mobile virtual objects, lights, virtual screens, charts and bars for statistics, and even other talents located in remote studios. Usually, these elements look realistic, and their purpose is to enhance the viewer experience. With the help of AR, we can create the illusion that digital objects exist in the real world and interact with it.

AR elements can be added to recorded footage in post-production, but since Aximmetry was made with broadcast productions in mind, we will discuss the case where AR elements are added during the broadcast, that is, in real time.

In a sense, this is the opposite of the previously discussed Green Screen and LED Wall productions: in those, we placed the real-life talent into a virtual environment, while with AR, virtual elements look as if they were placed in the real one.

In the example below, you can see that a 3D virtual model of a racecar is added to a real office space.

In this article, we do not consider overlays and lower thirds, such as crawls and channel logos, to be AR. More on these in this document.

Why should we use AR in virtual production?

AR is commonly used in broadcast production environments mainly because it is one of the most versatile ways to add information to a production.

It is an incredibly useful tool for showcasing sports, gaming, entertainment, medicine, education, business, and architecture. You can display infographics, data visualization, meteorological information, etc.

In the example below, you can see a pie chart showing the distribution of means of transportation,

or, in sports broadcasting, the virtual display of players’ positions on a real football field.

AR workflow

The steps of an AR production:

1. The studio camera sends its image to the PC running Aximmetry. At the same time, the camera tracking system forwards data on the physical camera's movements to Aximmetry.
2. Aximmetry places the AR elements onto the studio camera's image, using the camera tracking system's data to match the perspective of the rendering to the physical camera's.
3. Aximmetry outputs the final image.
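The perspective matching in step 2 can be sketched in a few lines. The function below is a minimal illustration, not Aximmetry's internal code: the pose matrix and intrinsics are hypothetical stand-ins for the data a real tracking system and lens calibration would provide.

```python
import numpy as np

def project_point(world_point, camera_pose, intrinsics):
    """Project a 3D point in studio coordinates onto the camera image.

    camera_pose: 4x4 world-to-camera transform, as a tracking system would supply.
    intrinsics:  3x3 camera matrix (focal length, principal point) from lens calibration.
    """
    p = camera_pose @ np.append(world_point, 1.0)  # world -> camera space
    uv = intrinsics @ p[:3]                        # camera space -> image plane
    return uv[:2] / uv[2]                          # perspective divide -> pixel coords

# Hypothetical example values: identity pose, 800 px focal length, 1280x720 image
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
pose = np.eye(4)
px = project_point(np.array([0.0, 0.0, 2.0]), pose, K)
# A point 2 m straight ahead of the camera lands at the principal point (640, 360).
```

When the tracked pose changes every frame, re-running this projection with the fresh pose is what keeps the rendered AR element locked to its spot in the studio.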

What does an AR studio look like?

AR can be added to any camera's image, but for broadcast production, it is generally used in a real studio environment, for example, a newsroom or a weather forecast studio.

AR can be used with tracked or stationary cameras, but in this article, we focus on tracked-camera productions, since these give a more immersive viewer experience.

Besides the obvious necessities of a broadcast studio, like lights, cameras, props, and talents, for AR productions you will need the following:

  • Camera Tracking System
  • Virtual Production Software

The camera tracking system is used to track the studio camera's movements and send this data to the virtual production software. The virtual production software will change the perspective of the AR graphics to match the perspective of the studio camera, making the AR elements look more like part of the physical studio.
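Once the AR graphics are rendered in the matching perspective, producing the final output is conceptually an alpha blend: the rendered layer is laid over the live camera frame wherever its matte is opaque. The sketch below is illustrative only, with made-up frame sizes, and does not reflect how Aximmetry composites internally.

```python
import numpy as np

def composite_ar(camera_frame, ar_layer, ar_alpha):
    """Lay a rendered AR layer over the live camera frame.

    camera_frame: HxWx3 image from the studio camera (floats in 0..1)
    ar_layer:     HxWx3 AR graphics rendered in the matching perspective
    ar_alpha:     HxW matte, 1.0 where the AR element is fully opaque
    """
    a = ar_alpha[..., None]  # broadcast the matte across the colour channels
    return ar_layer * a + camera_frame * (1.0 - a)

# A fully transparent matte leaves the camera frame untouched.
frame = np.full((2, 2, 3), 0.5)   # hypothetical tiny camera frame
ar = np.ones((2, 2, 3))           # hypothetical rendered AR layer
out = composite_ar(frame, ar, np.zeros((2, 2)))
```

Running this per frame, with the AR layer re-rendered from the latest tracked camera pose, yields the real-time output described in the workflow above.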
