Gamut mapping algorithm seems to be incorrect #91
I'm now convinced that there is a flaw in the gamut mapping algorithm. I spent some time actually going over the algorithm, and the issue seems to me to be related to the second condition in this loop:

```js
while ((high - low > ε) && (error < base_error)) {
```

When I removed that condition, I found that I get a smooth transition. I think the algorithm is sometimes kicking out too soon.
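For clarity, here is a minimal Python sketch of the bisection being described, with the gamut test and error measure passed in as callables (hypothetical names and signatures, not color.js's actual code); the point is how the `error < base_error` clause in the loop condition can cut the search short:

```py
# Hypothetical sketch of the chroma bisection under discussion. Removing the
# `and error < base_error` clause from the while condition (as described
# above) lets the search run to convergence instead of bailing out early.
def reduce_chroma(chroma, in_gamut, error_to_clipped, base_error, eps=1e-5):
    low, high = 0.0, chroma
    while high - low > eps:  # original loop also required `error < base_error`
        mid = (low + high) / 2
        if in_gamut(mid):
            low = mid                      # still displayable: allow more chroma
        else:
            error = error_to_clipped(mid)  # deltaE between candidate and its clip
            if error < base_error:
                break                      # within tolerance of the boundary
            high = mid                     # too far out: reduce chroma further
    return (low + high) / 2
```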
@danburzo also noticed this in "Gamut mapping: clarification of chroma reduction algorithm" (#63).

I agree, and need to overhaul this. The idea is to find, on each chroma-reduction iteration, whether one is […]
If it helps, I refactored the algorithm and removed some noticeably dead code: https://github.com/facelessuser/coloraide/blob/master/coloraide/color/gamut/lch_chroma.py. It performs much better for me. I have a number of examples here: https://facelessuser.github.io/coloraide/interpolation/. Obviously, some grayscale interpolations could use a higher resolution of stops, as grayscale is more sensitive and requires a lower Delta E between the stops, but keeping the resolution a little lower provides a better response when people are goofing around and editing live, as Python in the browser isn't going to be super fast 🙃.
I did notice in the linked issue (#63) the argument about this check and how it doesn't actually trigger. That is also what I found, which is why I removed it:

```py
if abs(delta - 2) < EPSILON:
    # We've found the boundary
    break
```

TBH, it only triggers if moved up to be under […]. I can see maybe playing with thresholds to smooth some things further (a lower Delta E target), but generally, things seem much smoother than they did before. I have not compared results with the […].
I am starting to question whether the enhancement of adding all the Delta E 2000 comparisons is worth the cost. Here, I lowered the Delta E threshold to a target of 1, as I wanted a kind of best-case scenario (even if it is expensive). Whether it is easily seen in the picture below or not is debatable, but the results do seem a bit smoother. But then, when just checking whether the result is displayable/in gamut, the results are just as good or maybe even a little smoother.

Even if this is just a trick of the eye, I'm wondering if the cost of all the Delta E checks is worth it, as those checks are expensive. I also acknowledge that there may be very specific cases where the Delta E checks give much better results; this is just a first, quick evaluation. Anyways, maybe some of this information is useful. It is making me question whether the Delta E checks give a big enough improvement to warrant their cost, at least when used while bisecting; using one as a shortcut doesn't seem too bad. I would definitely be interested in knowing if there are certain known worst-case scenarios.
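To make the comparison concrete, here is a hypothetical sketch of the two acceptance strategies being weighed (names and signatures are illustrative, not coloraide's or color.js's API): one bounds the perceptual error with a deltaE check per rejected candidate, the other only asks whether the candidate is displayable:

```py
# Hypothetical sketch of the trade-off: the deltaE variant pays a full
# deltaE 2000 evaluation per rejected candidate, while the plain variant
# costs only a handful of channel-range checks.
def map_chroma(chroma, in_gamut, clipped_error, use_delta_e, target=1.0, eps=1e-5):
    low, high = 0.0, chroma
    while high - low > eps:
        mid = (low + high) / 2
        if in_gamut(mid):
            low = mid
        elif use_delta_e and clipped_error(mid) < target:
            return mid  # shortcut: perceptually close enough to the clipped color
        else:
            high = mid
    return low
```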
Is this for the case where, say in LCH, L is > 100 and the color is therefore always out of gamut regardless of chroma?

```py
# If flooring chroma doesn't work, just clip the floored color
# because there is no optimal compression.
floor = color.clone().set('lch.chroma', 0)
if not floor.in_gamut():
    return floor.fit(method="clip").coords()
```
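For illustration, a hypothetical example of that case (assuming the same coloraide API as the snippet above; exact method signatures and output will vary by version): with L > 100, the color stays outside sRGB even at zero chroma, so clipping is the only move left.

```py
# Hypothetical illustration of the L > 100 case: even with chroma floored
# to 0 the color is outside sRGB, so there is nothing left to compress and
# the code above falls back to clipping.
from coloraide import Color

color = Color('lch(105% 30 50)')
floor = color.clone().set('lch.chroma', 0)
print(floor.in_gamut('srgb'))                    # False: lightness alone is out of range
print(floor.fit('srgb', method='clip').coords())
```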
DeltaE 2000 is indeed expensive; it is optimized to give comparable, consistent values for small differences regardless of where you are in the color space. So basically it is compensating for known deficiencies of Lab, like the blue hue nonlinearity, the exaggeration of deltaE 76 for high-chroma colors, etc. Using a better color space, such as OKLab, also gives you a less expensive deltaE, because it is just Euclidean distance. I'm examining that now.
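For reference, deltaE in OKLab really is just the straight-line distance between coordinates; a minimal sketch (coordinates assumed already converted to OKLab):

```py
import math

# DeltaE in OKLab: plain Euclidean distance over (L, a, b), which is why it
# is so much cheaper than deltaE 2000 with its piecewise corrections.
def delta_e_ok(lab1, lab2):
    dL = lab1[0] - lab2[0]
    da = lab1[1] - lab2[1]
    db = lab1[2] - lab2[2]
    return math.sqrt(dL * dL + da * da + db * db)
```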
I guess so, if we are strictly talking about colors that are already kind of in the acceptable CSS range, though any color past that range would qualify. I think I've expressed this before, but generally, fitting a color in one space for use in another space isn't always perfect, especially if the conversion isn't "perfect". Even though this color is clipped perfectly in sRGB, the trip back through LCH (even with the much better calculated matrices I'm using now) can still give you something a little off, which can be exaggerated in other spaces:

```py
>>> Color('lch(100% 0 0)').fit('srgb').convert('srgb').coords()
[0.9999999999999994, 1.0000000000000002, 0.9999999999999997]
>>> Color('lch(100% 0 0)').fit('srgb').convert('hsl').coords()
[137.14285714285714, 200.0, 99.99999999999997]
>>> Color('lch(100% 0 0)').fit('hsl').convert('hsl').coords()
[140.0, 75.0, 99.99999999999996]
```

Anyways, that's another topic, and you can probably only really solve it by rounding things off. My real concern was more about not wasting time going through the bisecting if we already know the color isn't worth optimizing.
Very cool. For usability, if the choice is DeltaE 2000 or just checking if in gamut, then unless there are specific cases where things are just way off, I'm thinking just checking if in gamut may be "good enough". But if there are scenarios where Delta E does much, much better, and there were a far better, less expensive deltaE available, that would definitely be really good.
I see that color.js has been updated. Gamut mapping looks much better now 🙂. I also did more testing, and I did find some cases where using simple gamut checks was not ideal, so I think I will stick with Delta E as well 🙂. It may be more expensive, but I agree it gives the overall best results.
I just updated it again: if the gamut-mapping space is OKLCH, it uses the (better, also cheaper) deltaEOK rather than deltaE2000. (It also adjusts the JND value in that case, from 2 to 0.02.) The default space for gamut mapping is also OKLCH now.
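Sketched as a config table (a hypothetical structure, not color.js's actual internals), the space-dependent settings described above would look something like this; the JND shrinks because OKLCH lightness runs 0–1 rather than 0–100:

```py
# Hypothetical summary of the per-space settings described above: OKLCH
# coordinates live on a 0-1 lightness scale, so the JND drops from 2
# (deltaE 2000 in CIE LCH) to 0.02 (deltaEOK).
GAMUT_MAP_SETTINGS = {
    'lch':   {'delta_e': 'deltaE2000', 'jnd': 2.0},
    'oklch': {'delta_e': 'deltaEOK',   'jnd': 0.02},  # the new default space
}
```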
This is really more of a question.

I know color.js defaults to gamut mapping instead of naive clipping, which is cool, and I understand that this helps give more accurate colors when converting from a large color space to a small one. color.js also generally recommends against simple clipping, but from what I've observed, gamut mapping seems to give choppy results in interpolation.
You can kind of see the solid bands and such instead of a smooth transition.
We can see here that clipping gives much smoother results (it was easier to show both in this fashion). The second gradient is the clipped one:
It seems that in some cases gamut mapping isn't the best choice. For instance, if you are in a smaller gamut and want to interpolate in a larger gamut, clipping may be better. In the above scenario, you ramp up to the gamut limit, so clipping simply holds the color at that limit until it drops back into the gamut. If you gamut map in these situations, it can create a discontinuity, as the chroma compression can give a noticeably different color than the ones you were transitioning through.
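As a sketch of the two behaviors (hypothetical `interpolate`, `clip`, and `gamut_map` helpers, not either library's API): clipping pins out-of-gamut stops at the gamut edge, so the ramp saturates and then smoothly re-enters the gamut, while per-stop chroma compression can land on noticeably different colors, producing the bands shown above:

```py
# Hypothetical sketch of the comparison: build gradient stops by either
# clipping or gamut-mapping each interpolated color. Clipping holds
# out-of-gamut stops at the gamut edge; mapping compresses chroma per stop,
# which is where the visible discontinuity can come from.
def gradient_stops(start, end, steps, interpolate, clip, gamut_map, use_clip=True):
    fit = clip if use_clip else gamut_map
    return [fit(interpolate(start, end, i / (steps - 1))) for i in range(steps)]
```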
I guess my question is: am I missing something? Is there some kind of flaw in my thinking?