
Basic Lens Calibration (without lens distortion)

Please note that this is an ARCHIVED document. All information presented here is considered obsolete, even if parts of it remain relevant in the newest version of Aximmetry.


In this tutorial, we will learn how to calibrate a real-world tracked camera so that it is in full sync with the virtual studio scene, that is, how to synchronize their movement completely.

Starting Aximmetry Basic Calibrator

For this purpose, Aximmetry has a separate application named Basic Calibrator.

Its Startup Configuration window is nearly identical to the Composer's.

You can define any number of outputs, and each of them will show the same picture. For example, if you want to watch the picture both on your PC monitor and on an SDI display, set it up like this:

basiclens image1.png

To have tracking perfectly synchronized with the camera input image, it's advisable to define an SDI output with the same frame rate as the camera input.

Also, map the camera input as usual. You can map multiple inputs as well if you want to calibrate multiple cameras connected to the same PC.

basiclens image26.png

Naturally, the added cameras must be equipped with a tracking system.

If you haven't defined the tracking system(s) used, you can do so in the Manage Devices section as described in Setting Up Virtual Sets with Tracked Cameras.

We suggest you do not map the tracking systems in Device Mapper; instead, switch between them later in the application.

Setting up the tracking device

After starting the application you’ll see a property editor near the left side of the window. Video Input is set to Mapped #1 by default, so if you’ve set it up in Startup Configuration it will work right away.

Select the tracking device installed on the camera:

basiclens image5.png

If you use standalone zoom/focus encoders that are independent of the camera tracking system, specify the device representing them in the Zoom Device property.

Screenshot 2021-03-24 072505.png

IMPORTANT: do not specify the same tracking device as both Tracking Device and Zoom Device, because you'll get doubled rotation values. If one tracking system sends both kinds of data, specify it in Tracking Device only.

Adding a new Calibration Profile

In the Calibration Profiles list, each entry corresponds to a lens data file belonging to a specific combination of a camera and a lens. This is important: the data is tied not only to a specific lens, but also to the camera the lens is mounted on. We'll describe later what to do if you want to transfer an already calibrated lens to another camera.

First, add a new entry to the list and give it a proper name indicating the camera-lens pair. In this example, we use a PTZ camera, so we simply specify the model name:

basiclens image30.png

 basiclens image46.png

basiclens image37.png

basiclens image29.png

Setting up camera properties

In the top right corner, you can find the Properties button, which allows you to set the base properties of the camera.

basiclens image6.png       

First, you have to specify the sensor width of the camera, which can usually be found in the device's technical specification. This value is not always identical to the actual effective sensor area used by the camera, but that doesn't really matter: the value doesn't have to be absolutely precise, since you'll rely on your own eyes during the calibration process anyway.

basiclens image28.png

If you use a tracking system that provides full position information as well, like Stype or Mo-sys StarTracker, etc., you’re done here.

But if you use a PTZ camera or a PTZ head that only provides rotation information, you have to specify the spatial position manually.

First, we have to define two terms:

  • PTZ rotation center is the point the PTZ mechanism rotates around (both pan and tilt).

basiclens image17.png

  • sensor center is the location of the center of the image sensor within the camera.

basiclens image18.png

Again, you don't have to be absolutely precise here (that's impossible); just measure them as best you can.

First, specify the distance of the rotation center from the floor:

basiclens image24.png

If you use a PTZ head with an arbitrary camera mounted on it, usually there is a significant distance between the rotation center and the sensor center. You have to specify this difference in the camera’s own coordinate system:

basiclens image21.png

basiclens image4.png

If you have a PTZ camera, the two points can usually be considered nearly identical, so leave the values at zero.

basiclens image34.png

basiclens image8.png
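To see why this offset matters, consider the geometry: as the head pans and tilts, a sensor that sits away from the rotation center sweeps through space instead of staying put. The sketch below illustrates this; all function names, the coordinate convention (right, up, forward), and the values are hypothetical and only for illustration, not part of Aximmetry.

```python
import math

def sensor_world_position(pan_deg, tilt_deg, center_height, offset):
    """Approximate world position of the sensor center for a PTZ head.

    offset is (right, up, forward) of the sensor center relative to the
    PTZ rotation center, in the camera's own coordinate system (meters).
    Purely illustrative; not Aximmetry's actual implementation.
    """
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    ox, oy, oz = offset
    # Tilting rotates the up/forward components around the horizontal axis.
    up = oy * math.cos(tilt) - oz * math.sin(tilt)
    forward = oy * math.sin(tilt) + oz * math.cos(tilt)
    # Panning rotates the right/forward components around the vertical axis.
    x = ox * math.cos(pan) + forward * math.sin(pan)
    z = -ox * math.sin(pan) + forward * math.cos(pan)
    return (x, center_height + up, z)

# A PTZ camera (zero offset): the sensor never leaves the rotation center.
print(sensor_world_position(35.0, -10.0, 1.5, (0.0, 0.0, 0.0)))
# A PTZ head with the camera mounted 10 cm above and 20 cm in front:
print(sensor_world_position(0.0, 0.0, 1.5, (0.0, 0.1, 0.2)))
```

With a non-zero offset the sensor center moves along an arc during every pan or tilt, which is why the virtual camera needs these distances to stay matched with the real image.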

Helper virtual graphics

On the bottom preview panel and also on all defined outputs, you'll see the camera input superimposed with a schematic helper image of the virtual space.

basiclens image33.png

There are two kinds of sync we need to pay attention to. The first is temporal: it is about when the tracked camera and the virtual grid start and stop moving. We need them to move at exactly the same time; for example, when we suddenly pan the tracked camera, the virtual grid should move at exactly the same time as the real-world image from our tracked camera. The second is spatial: it is about how far the tracked camera travels and how far our virtual grid travels. We need them to travel exactly the same distance. Syncing the beginning and end of a movement will help us sync the distance they travel, so let's start with time.

Of course, if you move the camera at first, the real-world studio and the virtual grid will glide relative to each other, since you haven't yet done the proper calibration. In other words, they are not in spatial sync. But you'll also notice that they don't move together temporally: one moves later than the other. In other words, they are not in temporal sync. In this case, adjust Tracking Delay until the starting and stopping of their movement are perfectly synchronized.

NOTE: adjusting Video Delay is meant to be used only as a last resort.

NOTE: if the camera is not genlocked, the tracking data can fall out of sync from time to time, meaning that you'll need to readjust the Tracking Delay repeatedly.

basiclens image48.png

If you use a standalone device for zoom/focus encoders, you can also define an independent delay for it. However, since the virtual grid only follows zooming movements once at least two calibration points have been added, you might need to set this value later.

Screenshot 2021-03-24 073045.png

It is advisable to align the virtual grid with the real walls of your studio; it will help you position the virtual marker later. Adjust the Pan Virtual property until the virtual grid is parallel with the walls:

basiclens image13.png

basiclens image39.png

Encoder positions

You have to perform several calibrations for several zoom encoder positions, and if necessary for several focus encoder positions.

The system stores a set of parameters for each zoom + focus combination. For in-between positions, the system provides interpolated parameters.
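The idea of serving in-between positions from stored points can be sketched as simple linear interpolation along the zoom encoder. The calibration values below are made up for illustration, and Aximmetry's actual interpolation modes are internal to the application:

```python
# Hypothetical calibration points: (zoom encoder position, focal length in mm).
calibration = [
    (0.00, 4.3),
    (0.50, 18.0),
    (1.00, 38.5),
]

def focal_length_at(zoom):
    """Linearly interpolate the focal length for a zoom encoder position,
    clamping to the first/last calibration point outside the covered range."""
    pts = sorted(calibration)
    if zoom <= pts[0][0]:
        return pts[0][1]
    if zoom >= pts[-1][0]:
        return pts[-1][1]
    for (z0, f0), (z1, f1) in zip(pts, pts[1:]):
        if z0 <= zoom <= z1:
            t = (zoom - z0) / (z1 - z0)
            return f0 + t * (f1 - f0)

print(focal_length_at(0.25))  # halfway between the first two points
```

The denser the calibration points, the less any interpolation error between them can show up as drift while zooming.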

The necessary number of calibration points depends on the specific lens; usually 5 or 7 points along each encoder is enough.

In the fortunate case where your lens does not change its properties noticeably when adjusting focus, you only have to calibrate along the zoom values, meaning 5 or 7 measurements in total.

Otherwise, you have to go through the focus positions for each and every zoom position, meaning the necessary number of points is multiplied, e.g. 5 × 5 = 25.

You can always watch the current position of these encoders both in a pair of value fields and rendered over the camera picture. They normally fall in the 0–1 range, though some manufacturers use an expanded range, e.g. 0–4 or 0–12.

basiclens image10.png

basiclens image40.png
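If you ever need to compare values across such devices, mapping a manufacturer's expanded range back to 0–1 is a simple linear rescale. The function name and the range bounds below are hypothetical examples; check your device's specification for its actual range:

```python
def normalized_encoder(raw, range_min=0.0, range_max=4.0):
    """Map a raw encoder value from the manufacturer's range to 0-1.
    The default 0-4 range is just an example, not a standard."""
    return (raw - range_min) / (range_max - range_min)

print(normalized_encoder(2.0))             # mid position on a 0-4 device
print(normalized_encoder(6.0, 0.0, 12.0))  # mid position on a 0-12 device
```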

Adding the first calibration point

In this example, we only go along the zoom positions and ignore focus. First, we are going to synchronize the panning/tilting movements of the tracked camera and the virtual elements.

Pull the zoom into the widest angle position.

Put any object with vertical legs into the studio, for example, a chair, a table, or a stand.

Place one of the virtual markers near a leg on the screen using the Marker 1 Pos property.

NOTES:

- All values of Marker Pos and Marker Height are expressed in meters, and they should correspond to the actual size and position of our reference point relative to the camera.

- It is recommended not to change the value of the y axis (the middle one) if the point of reference is on the same plane as our camera (specified by the full position tracking data, or by the PTZ rotation center's height from the floor).

- The recommended order of calibration points is as follows: min zoom, max zoom, mid zoom, 1/4 zoom, 3/4 zoom, then additional calibration points if needed. The second and third calibration points can be swapped.

basiclens image47.png

basiclens image11.png

If you rotate the camera, you'll see that the virtual marker and the real leg still drift relative to each other. In other words, they are not in spatial sync.

basiclens image25.png

Add a new calibration point.

basiclens image14.png basiclens image2.png

In the window that appears, you can adjust the focal length for the current zoom + focus encoder position. You can adjust either the Focal length or the Field Of View value, whichever you find more intuitive; they're linked together.

basiclens image19.png
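The reason the two fields are linked is the standard pinhole relation between sensor width, focal length, and horizontal field of view. A sketch of that relation, assuming both lengths are in millimeters (function names and example values are illustrative, not Aximmetry's API):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view implied by a sensor width and focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def focal_length_for_fov(sensor_width_mm, fov_deg):
    """Inverse relation: focal length implied by a sensor width and FOV."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# A full-frame 36 mm wide sensor with an 18 mm lens gives a 90 degree FOV:
print(horizontal_fov_deg(36.0, 18.0))
```

Given the sensor width you entered in the camera Properties, each Focal length corresponds to exactly one Field Of View, so editing either field determines the other.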

At the same time, pan your camera back and forth. The goal is to find a focal length at which the drifting of the virtual marker relative to the leg disappears or becomes minimal. When the virtual marker moves faster than our real-world point of reference, the Focal length value is too low. When it moves slower than our real-world point of reference, the Focal length is too high.

basiclens image35.png
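The adjustment rule above can be written down as a tiny decision helper. The function name and the notion of "speed" inputs are hypothetical; this just encodes the heuristic from the text:

```python
def focal_length_adjustment(marker_speed, reference_speed):
    """Which way to adjust Focal length while panning, per the rule above:
    marker drifting faster than the real reference -> focal length too low,
    marker drifting slower -> focal length too high."""
    if marker_speed > reference_speed:
        return "increase focal length"
    if marker_speed < reference_speed:
        return "decrease focal length"
    return "keep focal length"
```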

When you've found a suitable value, place the virtual marker precisely on the leg. If your test object has two legs, use Marker 2 as well.

basiclens image27.png

basiclens image22.png

Now pan/tilt your camera again for a final check, and refine the Focal length if necessary.

basiclens image44.png

Adding further calibration points

Adjust the zoom on your camera to reach the next position where you want to do a calibration.

basiclens image45.png

You'll see that the virtual and real spaces drift apart again, since the single focal length you specified previously remains in effect.

basiclens image43.png

Add a new calibration point.

basiclens image2.png

Adjust the Focal length until you have an acceptable match.

basiclens image41.png

basiclens image3.png

And again, pan the camera and refine the Focal length if necessary.

Now test the interpolation. Adjust the zoom on the camera back and forth from the 1st to the 2nd calibration point.

If you find that the interpolation glides between two calibrated points, you can try another interpolation mode in the camera Properties window.

basiclens image6.png

basiclens image7.png

Repeat these steps to add more calibration points, until you have 5 or 7 equally spaced ones.

basiclens image31.png

If you’re still not satisfied you can always add more calibration points between the existing ones.

Focus Encoder

In the calibration point list above, you can see a different focus encoder value for each zoom position. The reason is that we used the camera's autofocus in this example: we only took the zoom encoder into account during the process and ignored focus.

Unfortunately, with many lenses this method is not appropriate, because adjusting the focus on them also affects their focal length to a certain degree.

In these cases, we have to switch off autofocus and do the measurement for different focus positions for each zoom position.

For example, you can calibrate at 5 different zoom positions and, for each zoom, at 5 different focus positions. In this case, you will have to calibrate 5 × 5 = 25 different positions.
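The growth in the number of measurements is simply the product of the two encoder axes. With five hypothetical positions per axis (the values below are illustrative, not prescribed by Aximmetry):

```python
from itertools import product

# Hypothetical, evenly spaced encoder positions; real choices depend on the lens.
zoom_positions = [0.0, 0.25, 0.5, 0.75, 1.0]
focus_positions = [0.0, 0.25, 0.5, 0.75, 1.0]

# Every (zoom, focus) combination needs its own calibration point.
calibration_grid = list(product(zoom_positions, focus_positions))
print(len(calibration_grid))  # 5 x 5 = 25 positions to calibrate
```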

Of course, you’ll have positions where the image is heavily defocused. Try to set up an approximate match for these cases as well.

basiclens image36.png

Transferring lens to another camera

When you have a lens that you've already calibrated and you want to put it on another camera, you can reuse the existing data; you only have to set up the sensor width of the new camera.

Select your previous camera definition and click Duplicate.

basiclens image29.png basiclens image38.png

basiclens image15.png

basiclens image23.png

Open camera properties and adjust Sensor width until you get a match in any selected zoom position.

basiclens image20.png

basiclens image16.png

Using the created lens data

When you start the Composer and map your tracking device in Device Mapper you can select which lens data matches the camera the tracking device is mounted on. This selection is made in the Mode field.

basiclens image12.png

basiclens image32.png

When you start the Composer and load your tracking capable virtual set, on each INPUT panel you’ll find a property named External Lens Data.

basiclens image9.png basiclens image49.png

If it's On, it means that if the tracking system provides its own lens data, the system will use that. If the property is Off, or no lens data is coming from the tracking system, Aximmetry's own lens data will be used.
