XR - Color Calibration

Blend the colors of the walls seamlessly with their virtual surroundings.

ColorCalibration

XR - Color Calibration requires a specific XR/AR license. If you would like to try out these features to decide whether you want to purchase an XR/AR license, please contact us.

BeforeAfterColorCalib

1) Theory

Objectives:

For XR - Extended and augmented reality, the colors of the video input must be as close as possible to the colors of the overlay image. However, between the screens and the camera, the image passes through several colorimetric profiles.

The pictures below were taken before and after applying a color calibration to the extension:

BeforeColorCalibrationExample

How it works:

Just like the Geometric Calibration, Smode needs one or several viewpoints of the setup to determine the color correction to apply to the extension.

These viewpoints are not single images but sequences of colored grids cast on the screens. Smode compares each color sent in each square with the color received by the camera, and from these pairs determines the color model of your setup.

ShemaXRColorGrid

Color model: the colorimetric profile of your setup.
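
Conceptually, determining the color model is a regression problem: from pairs of emitted and received colors, estimate the transform your setup applies. The sketch below illustrates this with a simple affine RGB model fitted by least squares; it is a minimal illustration only, not Smode's actual solver.

```python
# Conceptual sketch of color-model estimation (not Smode's actual solver):
# fit an affine RGB transform that maps emitted colors to received colors.
import numpy as np

def fit_color_model(emitted, received):
    """emitted, received: (N, 3) arrays of RGB samples in [0, 1]."""
    # Augment emitted colors with a constant term to also estimate an offset.
    ones = np.ones((emitted.shape[0], 1))
    A = np.hstack([emitted, ones])               # (N, 4)
    # Solve A @ M ~= received in the least-squares sense.
    M, *_ = np.linalg.lstsq(A, received, rcond=None)
    return M                                     # (4, 3): 3x3 matrix + offset row

def apply_color_model(M, colors):
    ones = np.ones((colors.shape[0], 1))
    return np.hstack([colors, ones]) @ M

# Example: 200 random emitted colors observed through a made-up setup response.
rng = np.random.default_rng(0)
emitted = rng.random((200, 3))
received = emitted @ np.diag([0.9, 0.8, 1.1]) + 0.02   # hypothetical response
M = fit_color_model(emitted, received)
print(np.abs(apply_color_model(M, emitted) - received).mean())  # ~0: model fits
```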


2) Before Starting a Color Calibration

  • Make sure Smode is fully optimized for graphical performance. Use the Profiling feature if needed.
  • Check that the frame rate is stable. If it is not, your setup may not be fully genlocked.
  • Film every impacted screen.
  • Keep the camera still.
  • Make sure you have done an XR - Latency Calibration.
  • Turn off every light on the stage.
  • Alert people not to pass in front of the camera.

3) UI Check-out

CameraAnalyzerWorkspaceComponent-Color

  • Viewport: Displays the stream of the VideoInputTextureGenerator Icon Video Input of the XRCalibrator Icon XR Calibrator.
  • Enable Detector: Enables the detection of AprilTagTextureGenerator Icon April Tag and displays helpers in the viewport.
  • Detection count: Number of AprilTagTextureGenerator Icon April Tag detected (result of the AprilTagTextureModifier Icon April Tag detector modifier).
  • Tracker information: Displays the current position and orientation of the Tracker Icon Tracker of the PhysicalCamera Icon Physical Camera, as well as their deviation. A positive Position Deviation or Orientation Deviation means that your tracker is currently moving.
  • Send Locators: Displays an AprilTagGridTextureGenerator Icon April Tag Grid on each LED screen.
  • Shoot viewpoint: Shoots a viewpoint for the calibration.
  • List of viewpoints: Every shot viewpoint appears in this list.
  • Viewpoints information: Displays the number of AprilTagTextureGenerator Icon April Tag detected for each screen and the pixel gap between their position in the video input stream and in the stage simulation.
  • Evaluate: Computes the average difference between the emitted and received colors (see the sketch after this list).
  • Calibrate: Starts a calibration. The calculation time depends on the number of viewpoints shot.
  • Console output
  • Save as Calibration State: Saves the calibration results as a calibration state.
  • Calibration States list: Every calibration result can be recalled as a state; they appear in this list.
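
As a rough picture of what Evaluate does, the sketch below computes an average emitted-versus-received color difference over the recorded samples. The exact metric Smode uses is not documented here; this one is an illustrative assumption.

```python
# Illustrative "Evaluate"-style metric: the average distance between emitted
# and received colors over all recorded tag samples.
import numpy as np

def average_color_difference(emitted, received):
    """emitted, received: (N, 3) RGB arrays in [0, 1]."""
    return float(np.linalg.norm(emitted - received, axis=1).mean())

emitted = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
received = np.array([[0.9, 0.05, 0.0], [0.05, 0.8, 0.0]])
print(average_color_difference(emitted, received))  # ~0.16
```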

4) Calibration process

In the XR Calibrator's Color tab, enable the locators (Send Locators).

Try to detect as many April Tags as possible, especially at the junctions of the walls and in the corners, as these are the places where color correction is most needed. Play with the camera focus to detect more of them.

StandardDeviation

You can also change the "Quad Decimate" parameter (In -> Detector: April Tag) to increase the number of tags detected in the screens.

AprilTagDetector

QuadDecimate

Lower the Decimate value to detect more tags (but Smode can slow down if your machine is not powerful enough).
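
To see this trade-off outside of Smode, here is a sketch using the standalone pupil-apriltags Python library, which exposes the same quad_decimate parameter (the frame file name is hypothetical):

```python
# Trade-off demo with the standalone pupil-apriltags library
# (pip install pupil-apriltags opencv-python).
import cv2
from pupil_apriltags import Detector

# Hypothetical camera frame; the detector needs a grayscale image.
gray = cv2.cvtColor(cv2.imread("camera_frame.png"), cv2.COLOR_BGR2GRAY)

# quad_decimate downsamples the image before the quad search:
# 2.0 is fast but misses small/distant tags, 1.0 is slower but finds more.
for decimate in (2.0, 1.0):
    detector = Detector(families="tag36h11", quad_decimate=decimate)
    tags = detector.detect(gray)
    print(f"quad_decimate={decimate}: {len(tags)} tags detected")
```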

Shoot a viewpoint. Remove the locators before each viewpoint shoot to optimize performance.

ShootViewPoint

When a viewpoint is shot, several different colors are sent to the screens at the location of each tag. For each detected tag, Smode records both colors: the one emitted at that place on the screen and the one received by the camera.

Smode can then deduce, for a given place on the screen, the difference between the color sent and the color received. You can visualize the data of a viewpoint by unfolding "Viewpoints" at the bottom of the Color parameters of the XR Calibrator:

EmittedReceivedPerTag
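
For illustration only, the data recorded by a viewpoint shoot can be pictured as per-tag lists of emitted/received pairs, as in the sketch below (all names are hypothetical; Smode's internal representation differs):

```python
# Hypothetical sketch of what a viewpoint shoot records: for every detected
# tag, a list of (emitted, received) RGB pairs, one per color frame sent.
from dataclasses import dataclass, field

@dataclass
class TagSample:
    tag_id: int
    screen: str                                    # which LED screen the tag sits on
    emitted: list = field(default_factory=list)    # RGB triplets sent to the wall
    received: list = field(default_factory=list)   # RGB triplets read by the camera

@dataclass
class Viewpoint:
    samples: dict = field(default_factory=dict)    # tag_id -> TagSample

    def record(self, tag_id, screen, emitted_rgb, received_rgb):
        s = self.samples.setdefault(tag_id, TagSample(tag_id, screen))
        s.emitted.append(emitted_rgb)
        s.received.append(received_rgb)
```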

Once your viewpoint has been shot, move the camera to get another point of view, wait for the camera to be stable, and shoot another one. You don't have to see every screen in a single viewpoint shoot: frame the parts of the stage that need to be calibrated.

Try as much as possible to take viewpoints facing the screens. Feel free to shoot twice from the same position: since the emitted colors are randomized, this can improve the quality of the measurements.

If you have a fixed-position camera, take one or several shots from its position.

In some cases, it can be interesting to “merge” the color models of the screens that make up the walls of your setup.

Select the corresponding XRDisplayInformation Icon XR Display information, then go to Color Model -> General -> Type and switch to Merge.

MergeColorModel-01

A panel warns you that several target parameters will be deleted. Press “YES”.

MergeColorModel-02

In the XRDisplayInformation Icon XR Display information, merge the color model of one screen with the other one.

MergeColorModel-03
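
Conceptually, merging means fitting one color model to the pooled samples of several screens instead of one model per screen. A minimal sketch under the same affine-model assumption as in the Theory section (illustrative only):

```python
# Conceptual sketch of "merging": fit a single affine color model to the pooled
# (emitted, received) samples of several screens, instead of one model each.
import numpy as np

def merged_color_model(samples_per_screen):
    """samples_per_screen: {screen_name: (emitted (N,3), received (N,3))}."""
    emitted = np.vstack([e for e, _ in samples_per_screen.values()])
    received = np.vstack([r for _, r in samples_per_screen.values()])
    A = np.hstack([emitted, np.ones((len(emitted), 1))])  # affine term, as before
    M, *_ = np.linalg.lstsq(A, received, rcond=None)
    return M  # one shared (4, 3) model applied to every merged screen
```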

You are ready to start a calibration. Press Calibrate.

By default, only the color model is calibrated. The inverse color model is much more complex to set up and to get good results with (and it takes much more time to calibrate).

The calculation generates a collection of LUTs stored in the XRDisplayInformation Icon XR Display information, which can be applied to your setup using a SmartLutTextureModifier Icon Smart Lut.
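
To picture what applying such a LUT does, here is a minimal CPU sketch of a 3D LUT lookup (nearest-neighbor for simplicity; a real-time implementation would interpolate and run on the GPU):

```python
# Minimal sketch of applying a 3D LUT to an image (nearest-neighbor lookup).
import numpy as np

def apply_lut3d(image, lut):
    """image: (H, W, 3) floats in [0, 1]; lut: (S, S, S, 3) corrected colors."""
    size = lut.shape[0]
    idx = np.clip((image * (size - 1)).round().astype(int), 0, size - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT example: output equals input (up to grid quantization).
S = 17
g = np.linspace(0.0, 1.0, S)
lut = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)  # (S, S, S, 3)
img = np.random.default_rng(1).random((4, 4, 3))
assert np.allclose(apply_lut3d(img, lut), img, atol=1 / (2 * (S - 1)))
```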


5) Export the Calibration data

Just like the XR - Geometry Calibration, you can save your color calibration into a .smartlut file. These files can be re-imported later.

SaveSmartLutFile


6) Apply the Calibration

At this step, you have to modify the colors of your AR and Extension compositions to make them match the colorimetric profile of your setup.

You can drag and drop the .smartlut file you just exported; this creates a SmartLutTextureModifier Icon Smart Lut that you can apply directly onto the Extended and AR compositions in your VFX processor:

ApplyColorSmartLut

Alternatively, you can use a PushModifierStackChannelLayer Icon Modifier Stack Layer so that your color controls are centralized in your show and can be adjusted by hand:

ApplyColorModifierStack1

Once you have imported a SmartLutTextureModifier Icon Smart Lut, you need to select the Display (for the optional angle data):

ApplyColorModifierStack2

If you want to connect the LUT directly to the display without exporting it, create a SmartLutTextureModifier Icon Smart Lut, change its mode to "Internal Direct", and then select the LUT display.

ApplyColorSmartLut1


7) Correcting the LED color depending on the angle

The Display Angle Mask can also be helpful: its role is to mask any TextureModifier Icon 2D Modifier or TextureGenerator Icon 2D Generator according to the viewing angle of the selected Stage Elements.
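
As a conceptual model of such a mask, the sketch below weights a correction by the angle between the camera's view direction and a screen's normal; the formula and falloff are illustrative assumptions, not Smode's implementation:

```python
# Sketch of an angle-based mask: weight a correction by the angle between the
# camera's view direction and a screen's normal (illustrative only).
import numpy as np

def angle_mask(view_dir, screen_normal, max_angle_deg=80.0):
    """Returns 1.0 when the screen is seen head-on, fading to 0.0 at max_angle."""
    v = view_dir / np.linalg.norm(view_dir)
    n = screen_normal / np.linalg.norm(screen_normal)
    angle = np.degrees(np.arccos(np.clip(abs(v @ n), -1.0, 1.0)))
    return float(np.clip(1.0 - angle / max_angle_deg, 0.0, 1.0))

# Camera looking straight at a screen facing it: full weight.
print(angle_mask(np.array([0.0, 0.0, -1.0]), np.array([0.0, 0.0, 1.0])))  # 1.0
```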