PR: Implement support for the *sUCS* colourspace and the *sCAM* colour appearance model #1349
Conversation
- … transformations (`XYZ_to_sUCS`, `sUCS_to_XYZ`) and associated tests.
- Add sCAM colour appearance model (`XYZ_to_sCAM`, `sCAM_to_XYZ`) and associated tests.
Thanks a lot @UltraMo114! I will take a look in the coming days.
Thanks a ton @UltraMo114!
I will merge this directly in a staging branch.
Merged 57d9376 into colour-science:feature/sucs-scam-staging
@UltraMo114: As I'm polishing this on a separate branch, I noted that the Hue Composition data is different from that of the paper:

`"h_i": np.array([16.5987, 80.2763, 157.779, 219.7174, 376.5987])` vs …

Also, for the computation of M_scam:

`M_scam = (C_S_ucs * spow(FL, 0.1) / spow(J_scam_safe, 0.27) * et) * Fm_s_param` vs …
@KelSolaar Thanks for the careful review and for spotting this. Please use the values and formulas as they are presented in the paper. Regarding the formula for M, you're right to point it out: the grouping I used there isn't standard. However, it doesn't affect the calculation, because Python evaluates `*` and `/` left to right, so the result is correct.
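To make that point concrete, here is a minimal sketch (with placeholder values, not the PR's actual variables) showing that Python's left-to-right evaluation of `*` and `/` makes the PR's grouping equal to the fully parenthesised form:

```python
# Python evaluates `*` and `/` with equal precedence, left to right, so
# a * b / c * d == ((a * b) / c) * d. The names below are placeholders
# standing in for C_S_ucs, spow(FL, 0.1), spow(J_scam_safe, 0.27), et
# and Fm_s_param respectively.
a, b, c, d, e = 2.0, 3.0, 4.0, 5.0, 7.0

paper_style = ((a * b) / c) * d * e  # fully parenthesised grouping
as_written = (a * b / c * d) * e     # grouping as in the PR

assert paper_style == as_written
print(paper_style)  # both evaluate to 52.5
```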
Thanks! What about the chromatic adaptation transform? It seems to be quite different from the one recommended in the "Comprehensive color solutions: CAM16, CAT16, and CAM16-UCS" paper, but it is not described in the sUCS / sCAM one.
Thanks for the question. You are correct that the sUCS/sCAM paper doesn't detail a new chromatic adaptation transform: it relies on the standard CAT16 transform. The key reason for the implementation you see is that sCAM is designed to work with tristimulus values (XYZ) under a D65 illuminant. To handle colours from any illuminant, we use CAT16 as a pre-processing step. The workflow is:

1. **Transform to cone space**: the input XYZ colour is converted to the M16 cone space using the standard CAT16 matrix.
2. **Adapt**: chromatic adaptation is performed in this cone space to the D65 white point.
3. **Transform back**: the adapted values are converted back to the XYZ space.

This adapted, D65-equivalent XYZ is then used as the input for sCAM. This differs from the full CAM16 model, which performs its subsequent calculations directly within the adapted M16 cone space. My implementation is the standard method for applying CAT16 to a model that assumes a specific input illuminant like D65. Hope this clears things up, and feel free to ask if you have more questions.
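For readers following along, the three steps above can be sketched as a self-contained, von Kries-style adaptation in the CAT16 cone space with a degree of adaptation `D` (the function name and signature are illustrative, not the PR's API; the matrix values are those published for CAT16):

```python
import numpy as np

# CAT16 matrix from Li et al. (2017), "Comprehensive color solutions:
# CAM16, CAT16, and CAM16-UCS".
MATRIX_CAT16 = np.array([
    [0.401288, 0.650173, -0.051461],
    [-0.250268, 1.204414, 0.045854],
    [-0.002079, 0.048952, 0.953127],
])

def one_step_cat16(XYZ, XYZ_w, XYZ_wr, D=1.0):
    """Adapt *XYZ* from source white *XYZ_w* to target white *XYZ_wr*.

    1. Transform to the M16 cone space with the CAT16 matrix.
    2. Apply von Kries scaling, blended with the degree of adaptation *D*.
    3. Transform back to XYZ.
    """
    RGB = MATRIX_CAT16 @ XYZ
    RGB_w = MATRIX_CAT16 @ XYZ_w
    RGB_wr = MATRIX_CAT16 @ XYZ_wr
    RGB_a = (D * RGB_wr / RGB_w + 1 - D) * RGB
    return np.linalg.inv(MATRIX_CAT16) @ RGB_a

# Sanity check: with full adaptation, the source white maps to the target white.
XYZ_D50 = np.array([96.422, 100.0, 82.521])
XYZ_D65 = np.array([95.047, 100.0, 108.883])
assert np.allclose(one_step_cat16(XYZ_D50, XYZ_D50, XYZ_D65), XYZ_D65)
```

With `D=0` the transform degenerates to the identity, matching the intuition that no adaptation leaves the stimulus unchanged.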
It looks very similar to Zhai and Luo (2018) with the addition of the degree of adaptation aspect of CIECAM02. I was thinking it should / could be part of its own module in the …
You're right, the implementation is conceptually very similar to Zhai and Luo (2018) but incorporates the degree of adaptation from CIECAM02. I agree that it should be its own module within the `colour.adaptation` package to keep the code organised.

Regarding the name, I would propose `one_step_cat16`. My reasoning is that academically, the term "CAT" (Chromatic Adaptation Transform) refers to the general process of adapting from a source illuminant to a target illuminant, without specifying whether the transform is from XYZ to XYZ or from XYZ to a cone space. The work by Zhai and Luo (2018) is essentially an extension of this concept, which could be described as a "two-step CAT". Therefore, naming our implementation `one_step_cat16` provides a clear distinction from their work and accurately describes its function within the broader context of chromatic adaptation.
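To illustrate the one-step vs. two-step distinction, here is a hedged sketch (illustrative names, not the proposed module's API): a two-step CAT in the spirit of Zhai and Luo (2018) chains two one-step CAT16 adaptations through a baseline illuminant, and with full adaptation at both steps it reduces to the direct one-step transform:

```python
import numpy as np

# CAT16 matrix from Li et al. (2017); values as published.
MATRIX_CAT16 = np.array([
    [0.401288, 0.650173, -0.051461],
    [-0.250268, 1.204414, 0.045854],
    [-0.002079, 0.048952, 0.953127],
])

def one_step_cat16(XYZ, XYZ_w, XYZ_wr, D=1.0):
    """One-step CAT16: XYZ -> cone space -> von Kries scaling -> XYZ."""
    RGB = MATRIX_CAT16 @ XYZ
    RGB_w = MATRIX_CAT16 @ XYZ_w
    RGB_wr = MATRIX_CAT16 @ XYZ_wr
    return np.linalg.inv(MATRIX_CAT16) @ ((D * RGB_wr / RGB_w + 1 - D) * RGB)

XYZ_E = np.array([100.0, 100.0, 100.0])  # equal-energy baseline illuminant

def two_step_cat16(XYZ, XYZ_w, XYZ_wr, D_source=1.0, D_target=1.0):
    """Two-step CAT in the spirit of Zhai and Luo (2018): adapt to a
    baseline illuminant first, then from the baseline to the target,
    each step with its own degree of adaptation."""
    step_1 = one_step_cat16(XYZ, XYZ_w, XYZ_E, D_source)
    return one_step_cat16(step_1, XYZ_E, XYZ_wr, D_target)

# With full adaptation at both steps, the two formulations coincide;
# they differ only under partial adaptation (D < 1).
XYZ = np.array([19.01, 20.00, 21.78])
XYZ_A = np.array([109.85, 100.0, 35.585])    # illuminant A white
XYZ_D65 = np.array([95.047, 100.0, 108.883])
assert np.allclose(
    two_step_cat16(XYZ, XYZ_A, XYZ_D65),
    one_step_cat16(XYZ, XYZ_A, XYZ_D65),
)
```

The design trade-off this surfaces: a two-step formulation lets each leg carry its own degree of adaptation, which is exactly where it stops being equivalent to a single von Kries scaling.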


This Pull Request introduces the Simple Uniform Colour Space (sUCS) and the Simple Colour Appearance Model (sCAM), as described in Li & Luo (2024).
This work directly addresses and implements the features requested in issues:
Closes #1335
Closes #1238
Key contributions of this PR:

**sUCS Implementation (`colour.models.sucs`)**:
- `XYZ_to_sUCS`: transformation from CIE XYZ to sUCS.
- `sUCS_to_XYZ`: transformation from sUCS back to CIE XYZ.
- Associated matrices (`MATRIX_SUCS_XYZ_TO_LMS`, etc.).
- Comprehensive unit tests for transformations, n-dimensional support, domain/range scaling, and NaN handling.
- Docstrings with examples and references.

**sCAM Implementation (`colour.appearance.scam`)**:
- `XYZ_to_sCAM`: forward transformation from CIE XYZ to sCAM correlates (J, C, h, Q, M, H).
- `sCAM_to_XYZ`: inverse transformation from sCAM correlates back to CIE XYZ.
- Chromatic adaptation using CAT16.
- Defines `InductionFactors_sCAM`, `VIEWING_CONDITIONS_sCAM`, and `CAM_Specification_sCAM`.
- Hue computation and composition logic specific to sCAM.
- Comprehensive unit tests covering transformations, n-dimensional support, domain/range scaling, NaN handling, and exception raising for invalid inputs.
- Docstrings with examples and references.
Author:
The sUCS and sCAM models, along with this initial Python implementation, were developed by Li, Molin (UltraMo114).
Compliance:
- The code adheres to the Colour Science contributing guidelines.
- All pre-commit hooks pass.
- All local tests (`pytest`) pass.
Reference:
Li, M., & Luo, M. R. (2024). Simple color appearance model (sCAM) based on simple uniform color space (sUCS). Optics Express, 32(3), 3100-3122. doi:10.1364/OE.510196