Starting Aximmetry Camera Calibrator
For this purpose Aximmetry has a separate application named Camera Calibrator.
Its Startup Configuration window is nearly identical to the Composer's.
You can define any number of outputs; each of them will show the same picture. For example, if you want to watch the picture both on your PC monitor and on an SDI display, set it up like this:
To have the tracking perfectly synchronized with the camera input image, it is advisable to define an SDI output with the same frame rate as the camera input.
Also map the camera input as usual. You can map multiple inputs as well if you want to calibrate multiple cameras connected to the same PC.
The added cameras naturally must be equipped with a tracking system.
If you haven't yet defined the tracking system(s) used, you can do so in the Manage Devices section as described in Setting Up Virtual Sets with Tracked Cameras.
We suggest you do not map the tracking systems in Device Mapper, instead switch between them later in the application.
Setting up the tracking device
After starting the application you’ll see a property editor near the left side of the window. Video Input is set to Mapped #1 by default, so if you’ve set it up in Startup Configuration it will work right away.
Select the tracking device installed on the camera:
If you use standalone zoom/focus encoders that are independent from the camera tracking system, specify the device representing them in the Zoom Device property.
IMPORTANT: do not specify the same tracking device as both Tracking Device and Zoom Device, because you'll get doubled rotation values. If you have one tracking system that sends both kinds of data, specify it in Tracking Device only.
Helper virtual graphics
On the bottom preview panel, and also on all defined outputs, you'll see the camera input superimposed with a schematic helper image of the virtual space.
Of course, if you move the camera, the real and virtual parts will drift relative to each other at first, since you haven't yet performed the proper calibration.
But you will see whether the starting / stopping of their motion is in sync. If it is not, adjust Tracking Delay until the motion starts and stops in perfect sync.
If you use a standalone device for zoom/focus encoders you can also define an independent delay for this device.
It is advisable to set up the virtual grid to be aligned with the real walls of your studio. This will help with positioning the virtual marker later. Adjust the Pan Virtual property until the virtual grid is parallel with the walls:
Adding a new camera definition
In the Camera definitions list, each entry corresponds to a lens data file belonging to a specific combination of a camera and a lens. This is important: the data is tied not only to a specific lens, but also to the camera the lens is mounted on. We'll describe later what to do if you want to transfer an already calibrated lens to another camera.
First, add a new entry to the list and give it a proper name indicating the camera + lens pair. In this example we use a PTZ camera, so we simply specify the model name:
Setting up camera properties
In the top right corner you can find the Properties button which allows setting the base properties of the camera.
First you have to specify the sensor width of the camera. It can usually be found in the technical specification of the device. This value is not always identical to the actual effective sensor area used by the camera, but that doesn't really matter: the value doesn't have to be absolutely precise, since you'll rely on your own eyes during the calibration process anyway.
If you use a tracking system that provides full position information as well, like Stype or Mo-sys StarTracker etc., you’re done here.
But if you use a PTZ camera, or a PTZ head that only provides rotation information, you have to specify the spatial position manually.
Firstly we have to define two terms here:
- PTZ rotation center is the point the PTZ mechanism rotates around (both pan and tilt).
- sensor center is the location of the center of the image sensor within the camera.
Again, you don't have to be absolutely precise here (that's impossible); just measure them as best you can.
First specify the distance of the rotation center from the floor:
If you use a PTZ head with an arbitrary camera mounted on it, there is usually a significant distance between the rotation center and the sensor center. You have to specify this difference in the camera's own coordinate system:
If you have a PTZ camera, the two points can usually be considered nearly identical, so leave the values at zero.
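To see why this offset matters, here is a minimal sketch of how a sensor offset expressed in the camera's own coordinate system moves with pan and tilt. This is not Aximmetry's internal code, and the axis convention (x = right, y = up, z = forward) is an assumption for illustration only:

```python
import math

def sensor_world_position(pan_deg, tilt_deg, rot_center_height, offset_cam):
    """Illustrative sketch: place the sensor center in world space given
    the height of the PTZ rotation center above the floor and the sensor
    offset in the camera's own coordinates (x = right, y = up, z = forward)."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    ox, oy, oz = offset_cam

    # Tilt rotates the offset around the camera's horizontal x axis...
    y1 = oy * math.cos(tilt) - oz * math.sin(tilt)
    z1 = oy * math.sin(tilt) + oz * math.cos(tilt)
    # ...then pan rotates the result around the world's vertical axis.
    x2 = ox * math.cos(pan) + z1 * math.sin(pan)
    z2 = -ox * math.sin(pan) + z1 * math.cos(pan)

    return (x2, rot_center_height + y1, z2)

# With a zero offset the sensor simply stays at the rotation center:
print(sensor_world_position(30.0, -10.0, 1.5, (0.0, 0.0, 0.0)))  # (0.0, 1.5, 0.0)
```

A non-zero offset makes the sensor sweep around the rotation center as you pan or tilt, which is exactly why the calibrator needs these values when the two points don't coincide.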
Calibration points
You have to perform several calibrations at several zoom encoder positions and, if necessary, at several focus encoder positions.
The system stores a set of parameters for each calibrated zoom + focus combination. For the in-between positions, interpolated parameters are provided by the system.
The necessary number of calibration points depends on the specific lens; usually 5 or 7 is enough along each encoder.
In the fortunate case when your lens does not change its properties noticeably when the focus is adjusted, you only have to calibrate along the zoom values, meaning 5 or 7 measurements in total.
Otherwise you have to go through the focus positions for every zoom position, meaning that the necessary number of points multiplies, e.g. 5 × 5 = 25.
You can always see the current position of these encoders, both in a pair of value fields and rendered over the camera picture. They normally fall in the 0–1 range.
Adding the first calibration point
In this example we only go along the zoom positions and will ignore focus.
Pull the zoom into the widest angle position.
Put any object with vertical legs into the studio, for example a chair, a table or a stand.
Place one of the virtual markers near a leg on the screen using the Marker 1 Pos property.
If you rotate the camera, you'll see that the virtual marker and the real leg still drift relative to each other.
Add a new calibration point.
In the appearing window you can adjust the focal length for the current zoom + focus encoder position. You can adjust either the Focal length or the Field Of View value, whichever you find more informative; the two are linked together.
At the same time, pan your camera back and forth. The goal is to find a focal length at which the drifting of the virtual marker relative to the leg disappears or becomes minimal.
When you’ve found a suitable value place the virtual marker precisely on the leg. If your test object has two legs, use Marker 2 as well.
Now pan/tilt your camera again for a final check, and refine the Focal length if necessary.
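The link between the two values is the standard pinhole relation between focal length, sensor width, and horizontal field of view. A quick sketch with illustrative numbers (the 5.76 mm sensor width is an assumption, not a value from this document):

```python
import math

def hfov_deg(focal_mm, sensor_width_mm):
    """Horizontal field of view (degrees) for a given focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

def focal_length_mm(hfov, sensor_width_mm):
    """Inverse: focal length (mm) for a given horizontal field of view."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(hfov) / 2.0))

# Illustrative: a ~5.76 mm wide sensor at a 4.3 mm focal length
fov = hfov_deg(4.3, 5.76)
print(round(fov, 1))                           # a wide angle, ~67-68 degrees
print(round(focal_length_mm(fov, 5.76), 2))    # round-trips back to 4.3
```

This is why editing one field updates the other: given the sensor width you entered earlier, each focal length corresponds to exactly one field of view.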
Adding further calibration points
Adjust the zoom on your camera to reach another position where you want to do a calibration.
You'll see that the virtual and real space drift apart again, since the single focal length you specified previously remains in effect.
Add a new calibration point.
Adjust the Focal length until you have an acceptable match.
And again, pan the camera and refine Focal length if necessary.
Repeat these steps to add more calibration points, until you reach 5 or 7 equally spaced ones.
Now test the interpolation. Adjust the zoom on the camera back and forth from the widest to the narrowest angle.
If you find that the picture drifts between two calibrated points, you can try another interpolation mode in the camera Properties window.
If you’re still not satisfied you can always add more calibration points between the existing ones.
In the calibration point list above you can see a different focus encoder value for each zoom position. The reason is that we used the camera's autofocus for this example; we only took the zoom encoder into account during the process and ignored focus.
Unfortunately with many lenses this method is not appropriate, because adjusting the focus on them also affects their focal length to a certain degree.
In these cases we have to switch off autofocus and do the measurement at several focus positions for each zoom position.
For example, you can calibrate at 5 different zoom positions and 5 different focus positions for each zoom. In this case you will have to calibrate at 5 × 5 = 25 different positions.
Of course, you'll have positions where the image is heavily defocused. Try to set up an approximate match for these cases as well.
Transferring lens to another camera
When you have a lens that you've already calibrated and you want to put it on another camera, you can reuse the existing data; you only have to set up the sensor width of the new camera.
Select your previous camera definition and click Duplicate.
Open camera properties and adjust Sensor width until you get a match in any selected zoom position.
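A match at a single zoom position is enough because the on-screen magnification of the image is proportional to the ratio of focal length to sensor width. A quick check with hypothetical numbers (both the focal lengths and the sensor widths below are assumed, not taken from this document):

```python
# Illustrative check: changing the sensor width rescales the image by the
# same factor at every stored focal length, so matching one zoom position
# matches them all.
stored_focals = [4.3, 15.5, 60.0]   # focal lengths from the duplicated definition (assumed)
old_width, new_width = 5.76, 7.18   # hypothetical old/new sensor widths in mm

# Magnification is proportional to focal_length / sensor_width, so the
# change caused by the new width is the same ratio for every focal length:
scale_change = [(f / new_width) / (f / old_width) for f in stored_focals]
print(scale_change)  # identical value repeated for every focal length
```

In other words, one sensor-width adjustment compensates the whole lens data file at once.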
Using the created lens data
When you start the Composer and map your tracking device in Device Mapper you can select which lens data matches the camera the tracking device is mounted on. This selection is made in the Mode field.
When you start the Composer and load your tracking capable virtual set, on each INPUT panel you’ll find a property named External Lens Data.
If it is ON and the tracking system provides its own lens data, the system will use that. If the property is OFF, or no lens data is coming from the tracking system, Aximmetry's own lens data will be used.