XR - Geometry Calibration

Lets Smode know the position of cameras and LED screens in the real world.

GeometryCalibration

XR geometry calibration requires a specific XR/AR license. If you would like to try out these features before deciding whether to purchase an XR/AR license, please contact us.

Video Tutorial

A video tutorial that uses a simulator so you can learn the calibration process without a real Stage

You can download the project file here: CalibrationTutoStart.zip

You can also learn the calibration process for FreeD with Zoom

Download the project file here: CalibrationTutoFreeD.project.zip


1) Theory

BeforeAfterGeoCalib

How do you let Smode know the positions of cameras and screens?

That is the job of geometry calibration. This step enables Smode to detect the positions of the screens on the real stage and adjust the virtual stage to match.

The calculation works thanks to the AprilTagTextureGenerator Icon April Tag and the AprilTagTextureModifier Icon April Tag detector modifier , called Locators.

The geometry calibration consists of taking a sufficient number of shots, called Single Frames, from different points of view of your real stage, with AprilTagTextureGenerator Icon April Tag broadcast on the screens.

In short, during this step you will have to:

  • Take a shot (= store a single frame)
  • Move the camera
  • Wait for the tracker Position & Orientation deviation to be at 0 (or a very low value)
  • Take another shot (= store a single frame)
  • Repeat (move the camera, wait for stability, store a frame) until you have enough frames

GeometryCalculation

You do not have to store frames that show all the screens at once, but make sure enough AprilTagTextureGenerator Icon April Tag are detected. Modify the focus of your camera if needed.
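The capture cycle described above (move, wait for the tracker to settle, store a frame) can be sketched as a simple loop. This is an illustrative sketch only: `read_deviation`, `store_single_frame`, and `move_camera` are hypothetical callbacks standing in for the values and triggers Smode exposes in its UI, and the threshold is an arbitrary "very low value".

```python
import time

# Hypothetical "very low value" for the position/orientation deviation;
# Smode displays the actual deviation in the Tracker information panel.
DEVIATION_THRESHOLD = 0.001

def capture_frames(read_deviation, store_single_frame, move_camera, n_frames):
    """Repeat the calibration cycle: move the camera, wait until the
    tracker's deviation is near 0 (camera no longer moving), store a frame."""
    stored = 0
    while stored < n_frames:
        move_camera()
        # Wait for the tracker to report a stable position/orientation.
        while read_deviation() > DEVIATION_THRESHOLD:
            time.sleep(0.05)
        store_single_frame()
        stored += 1
    return stored
```

The point of the inner loop is that a frame stored while the deviation is still positive pairs a sharp tag detection with a stale tracker pose, which degrades the calibration.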


2) Before Starting a Geometry Calibration

Before launching the geometry calibration, check that the camera position and orientation in the RootStageElement Icon Stage is approximately the same as in the real stage.

For this you can:

  • Ask the people who set up the tracking device where the origin (the 0 point) is on the real stage, and place the TrackingSystem Icon Tracking System at this point.
  • Move the camera along the up-down, front-back, and left-right axes, and rotate the TrackingSystem Icon Tracking System if needed, according to your observations.
  • Pan and tilt the camera to verify that it looks in the same direction; if not, offset the orientation of the Tracker Icon Tracker .

TrackerCorrectionPlacement

Once you have verified the previous points, you can start a geometry calibration.


2.1) For Stype tracking system

  • [On Stype world]: Also ensure that the camera position is correctly covered by Stype (seen by all its cameras). The example below is not good:

StypeNotGood

  • [On Stype world]: Calibrate the min and max zoom of the Stype Computer.

2.2) For FreeD tracking system

You need to report the minimum and maximum zoom values into your Camera model. Zoom in and out until the min and max values stop changing. Then verify that the values set in the Custom Zoom Interval are correct.

CustomZoomInterval

If not, you can set them manually.
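Conceptually, finding the Custom Zoom Interval just means tracking the running minimum and maximum of the raw zoom value the tracker reports while the operator sweeps the lens. A minimal sketch, assuming the FreeD packets have already been parsed into plain zoom samples (packet parsing is not shown):

```python
class ZoomIntervalScanner:
    """Track the min/max raw zoom values reported by a tracking system
    while the operator zooms fully in and out. The resulting interval can
    then be compared with the Custom Zoom Interval set on the camera."""

    def __init__(self):
        self.min_zoom = None
        self.max_zoom = None

    def feed(self, raw_zoom):
        """Update the interval with one raw zoom sample from the tracker."""
        if self.min_zoom is None or raw_zoom < self.min_zoom:
            self.min_zoom = raw_zoom
        if self.max_zoom is None or raw_zoom > self.max_zoom:
            self.max_zoom = raw_zoom

    def interval(self):
        return (self.min_zoom, self.max_zoom)
```

This is also why you must zoom all the way in and out: any part of the range you never reach simply cannot appear in the interval.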


3) UI Check-out

Go to the Geometry Tab of the CameraAnalyzerWorkspaceComponent

  • Viewport : Display the stream of the XRCalibrator Icon XR Calibrator VideoInputTextureGenerator Icon Video Input
  • Enable Detector : Enable the detection of AprilTagTextureGenerator Icon April Tag and display helpers in the viewport.
  • Detection count : Number of AprilTagTextureGenerator Icon April Tag detected. (result of the AprilTagTextureModifier Icon April Tag detector modifier )
  • Tracker information : Display the current position and orientation of the Tracker Icon Tracker of the PhysicalCamera Icon Physical Camera , as well as their deviation. A positive value of Position Deviation and Orientation Deviation means that your tracker is currently moving.
  • Send Locators : Display an AprilTagGridTextureGenerator Icon April Tag Grid on each LED screen.
  • Store Single Frame : Trigger to store a frame for calibration
  • List of single frames : every stored single frame appears in this list
  • Single frame information: display the number of AprilTagTextureGenerator Icon April Tag detected for each screen and the pixel gap between their position in the video input stream and in the stage simulation.
  • Evaluate : Make an average evaluation of the pixel gap
  • Calibrate : Start a calibration.
  • Console output
  • Save as Calibration State : Save the calibration results as a calibration state.
  • Calibration States list : Every calibration result can be called back as a state; they appear in this list.

Learn more about XR - Geometry Calibration


4) Calibration process

CalibrationEditor-Geometry-Steps

Enable “Send Locators” (1).

Wait until the “Standard deviation” parameter for position and orientation reaches 0 (2). This value represents jitter in the signal output by your tracking system. If it never reaches 0, either the camera is not yet stable or there is a problem; verify with the people who set up the tracking system.

Verify that a sufficient number of tags are detected (3). If necessary, adjust the focus, and use the Enable Detector function (4) without being on air to view the detected tags in the viewport.

StandardDeviation

Then you can store a single frame (5).

Move the camera for the next frame, wait until it is stable (position/orientation deviation), then store another single frame (5).

These last steps need to be repeated several times.

GeometryCalculation

When you have multiple frames, you can press Evaluate (6) and delete or mute frames whose error is way above the average.
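Conceptually, Evaluate averages the per-frame pixel gap, and the frames worth muting are the outliers far above that average. A minimal sketch of that reasoning (the 2x factor is an illustrative choice for "way above the average", not a Smode default):

```python
from statistics import mean

def evaluate(frame_errors):
    """Average pixel gap across frames, given a dict mapping each single
    frame's name to its pixel gap (conceptually what Evaluate reports)."""
    return mean(frame_errors.values())

def frames_to_mute(frame_errors, factor=2.0):
    """Frames whose pixel gap exceeds `factor` times the average; candidates
    for muting or deletion before pressing Calibrate."""
    avg = evaluate(frame_errors)
    return [name for name, err in frame_errors.items() if err > factor * avg]
```

Muting outlier frames before calibrating matters because a single frame with a grossly wrong tag/pose pairing can pull the whole solution away from the good frames.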

GeometryCalibrationEvaluate

Once you have verified your frames, press Calibrate (7).

Wait until the toggle is automatically unset at the end of the calibration.


4.1) Stype Calibration process

When using Stype, you need to choose, with the Mode parameter of your PhysicalCamera Icon Physical Camera , whether to use the optic data from Stype or to calibrate a prime lens optic in Smode.


4.2) FreeD Calibration process

FreeD calibration takes time (30 min to 1 hour).

  1. Roughly place the position and orientation of the tracking system in the stage

  2. Scan the zoom to get the min & max zoom => cf. the Physical Camera's “Custom Zoom Interval”

3a) Go to a wide zoom level and capture multiple frames with different orientations, so that the April Tags cover the 4 corners of the camera image. Do this for 3 different positions.

3b) Verify that the polynomials FOV, K1, K2, ShiftX, ShiftY are at degree 0 and press Calibrate

4a) Staying in the current position, engage different new zoom levels, covering the whole matrix of the detection camera, especially in the wide shots (=> 8 frames in total)

4b) Calibrate with degree 2

4c) Calibrate with degree 4

5a) Look for problematic areas and capture more frames there

5b) Calibrate with increased degree

5c) Go back to 5a if there are still problematic areas
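The degree-0/2/4 progression above amounts to fitting each lens parameter (FOV, K1, K2, ShiftX, ShiftY) as a polynomial of the zoom position: degree 0 is a constant (prime-lens-like), higher degrees let the parameter vary across the zoom range. A sketch of one such fit, using NumPy as an illustrative stand-in for Smode's internal solver:

```python
import numpy as np

def fit_lens_parameter(zoom_positions, measured_values, degree):
    """Fit one lens parameter (e.g. FOV or K1) as a polynomial of the
    normalized zoom position. Returns coefficients, highest degree first
    (numpy.polyfit convention)."""
    return np.polyfit(zoom_positions, measured_values, degree)

def eval_lens_parameter(coeffs, zoom):
    """Evaluate the fitted parameter at a given zoom position."""
    return float(np.polyval(coeffs, zoom))
```

This also explains step 5: raising the degree only helps where you have frames, which is why problematic zoom areas need more frames before recalibrating at a higher degree.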

To save your Camera model, check: PhysicalCamera Icon Physical Camera


5) Export Calibration

If you are satisfied with the calibration, export it. This will create a .geocal file directly in the Smode project.

SaveGeoCal

GeoCalFile


6) Troubleshooting

  • Try calibrating with only one frame enabled
  • Verify that the screens are connected to the right outputs
  • Verify your UVs if you are using an FBX file
  • Verify that the orientation of the screens is correct (with a test pattern, for example)

Next step: XR - Color Calibration