 

Handle "Out-of-Gamut" Color in RGB to CIEL*a*b* to RGB Conversions

Tags: c++, image, colors

I've got functions (C++) that convert a game image (an SDL2 SDL_Surface) from RGB through CIEXYZ to CIEL*a*b*, so that adjustments to hue, brightness, and saturation look more visually natural than they would in HSV space. That works, except for those pixels that get pushed out of the RGB gamut in the process.
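
For reference, the forward leg of that pipeline looks roughly like this. It is only a sketch using the standard sRGB/D65 constants; the names are illustrative rather than taken from the actual code:

```cpp
#include <cmath>

struct Lab { float L, a, b; };

// One 8-bit sRGB channel -> linear-light value in [0, 1]
static float srgbToLinear(unsigned char c8)
{
    float c = c8 / 255.0f;
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// The piecewise cube-root function from the CIELAB definition
static float labF(float t)
{
    const float eps   = 216.0f / 24389.0f;   // (6/29)^3
    const float kappa = 24389.0f / 27.0f;    // (29/3)^3
    return (t > eps) ? std::cbrt(t) : (kappa * t + 16.0f) / 116.0f;
}

// 8-bit sRGB -> CIEL*a*b*, D65 reference white
Lab rgbToLab(unsigned char r8, unsigned char g8, unsigned char b8)
{
    float r = srgbToLinear(r8), g = srgbToLinear(g8), b = srgbToLinear(b8);

    // linear sRGB -> CIEXYZ (standard sRGB/D65 matrix)
    float X = 0.4124f * r + 0.3576f * g + 0.1805f * b;
    float Y = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    float Z = 0.0193f * r + 0.1192f * g + 0.9505f * b;

    // normalise by the D65 white point and apply the Lab transfer function
    float fx = labF(X / 0.95047f);
    float fy = labF(Y / 1.00000f);
    float fz = labF(Z / 1.08883f);

    return { 116.0f * fy - 16.0f,
             500.0f * (fx - fy),
             200.0f * (fy - fz) };
}
```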

While it is easy enough to force a value back into gamut by:

  • individually clamping subpixel values below 0 up to 0 and above 255 down to 255, or

  • compressing and shifting the whole pixel or whole image into the 0-255 range by dividing by (max - min) and subtracting min/(max - min);

these options lead to gross artifacts when performing multiple operations on the same image (both are sketched below). I am looking for the least destructive method of handling out-of-gamut subpixels in code. Digging through many pages of Google results leads to hundreds of Photoshop links, a few design-oriented links, and references to color-management systems (CMSs) like LittleCMS.
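
Sketches of those two options, with subpixels kept as floats in 0-255 (function names are only illustrative):

```cpp
#include <algorithm>
#include <array>

// Option 1: clamp each channel independently. Simple, but it changes the hue
// and saturation of any pixel that was out of gamut.
std::array<float, 3> clampPerChannel(std::array<float, 3> rgb)
{
    for (float &c : rgb)
        c = std::clamp(c, 0.0f, 255.0f);
    return rgb;
}

// Option 2: linearly rescale the whole pixel back into 0-255 (with image-wide
// min/max this becomes the whole-image version). In-gamut pixels pass through
// unchanged, but every time it fires it flattens the pixel a little, which
// accumulates over repeated edits.
std::array<float, 3> rescaleIntoGamut(std::array<float, 3> rgb)
{
    float lo = std::min({rgb[0], rgb[1], rgb[2], 0.0f});
    float hi = std::max({rgb[0], rgb[1], rgb[2], 255.0f});
    for (float &c : rgb)
        c = (c - lo) / (hi - lo) * 255.0f;
    return rgb;
}
```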

I need an algorithmic solution that I can put into C++ code.

Note: From some basic experimentation, using linear compression on the entire image leads to a massive loss of brightness over hundreds of iterations, even with the calculations done as floats. Further insight into the sigmoid compression comment below is most welcome.
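
One possible reading of that suggestion, offered only as an illustration (the curve and range here are assumptions, not the commenter's actual proposal): squeeze each channel through a tanh curve centred on mid-grey, so far out-of-gamut values approach 0/255 smoothly instead of being clipped. The trade-off is that in-gamut extremes get compressed slightly too, so repeated application is still lossy, just less abruptly.

```cpp
#include <cmath>

// Sigmoid ("soft clip") compression of one channel: the whole real line is
// mapped into (lo, hi) with a tanh curve centred on mid-grey. Values near the
// centre pass through almost unchanged; large excursions are squeezed toward
// the gamut edges instead of being hard-clipped.
float sigmoidCompress(float x, float lo = 0.0f, float hi = 255.0f)
{
    const float mid  = 0.5f * (lo + hi);   // 127.5 for an 8-bit range
    const float half = 0.5f * (hi - lo);
    return mid + half * std::tanh((x - mid) / half);
}
```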

asked Feb 03 '26 by justinzane


1 Answer

The fundamental issue you face is multiple conversions between color spaces. If the conversion isn't lossless, then you will get cumulative artifacts.

The better solution is to maintain all of your imagery in one color space and do all of your manipulation within that color space. Treat conversion as a one-way street, converting a copy to RGB for display. Do not convert back and forth.
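
If it helps, here is a minimal sketch of that workflow, assuming the master image is kept as float L*a*b* pixels; the struct and function names are illustrative, and the constants are the standard sRGB/D65 ones:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct Lab { float L, a, b; };

// Master image: float Lab pixels, edited in place and never round-tripped.
struct LabImage {
    int width = 0, height = 0;
    std::vector<Lab> pixels;          // size = width * height
};

// Linear-light value -> one 8-bit sRGB channel; clamping happens only here,
// on the display copy, so the master data is never damaged.
static uint8_t toSrgb8(float lin)
{
    float c = (lin <= 0.0031308f) ? 12.92f * lin
                                  : 1.055f * std::pow(lin, 1.0f / 2.4f) - 0.055f;
    return static_cast<uint8_t>(std::clamp(c, 0.0f, 1.0f) * 255.0f + 0.5f);
}

// Inverse of the piecewise cube-root function from the CIELAB definition
static float labFInv(float t)
{
    const float d = 6.0f / 29.0f;
    return (t > d) ? t * t * t : 3.0f * d * d * (t - 4.0f / 29.0f);
}

// One Lab pixel -> 8-bit RGB for display (D65 reference white)
static void labToRgb8(const Lab &p, uint8_t &r, uint8_t &g, uint8_t &b)
{
    float fy = (p.L + 16.0f) / 116.0f;
    float fx = fy + p.a / 500.0f;
    float fz = fy - p.b / 200.0f;

    float X = 0.95047f * labFInv(fx);
    float Y = 1.00000f * labFInv(fy);
    float Z = 1.08883f * labFInv(fz);

    // CIEXYZ -> linear sRGB (inverse of the standard sRGB/D65 matrix)
    float rl =  3.2406f * X - 1.5372f * Y - 0.4986f * Z;
    float gl = -0.9689f * X + 1.8758f * Y + 0.0415f * Z;
    float bl =  0.0557f * X - 0.2040f * Y + 1.0570f * Z;

    r = toSrgb8(rl); g = toSrgb8(gl); b = toSrgb8(bl);
}

// All edits happen in Lab; only the throwaway display buffer is converted.
std::vector<uint8_t> renderForDisplay(const LabImage &img)
{
    std::vector<uint8_t> rgb(static_cast<size_t>(img.width) * img.height * 3);
    for (size_t i = 0; i < img.pixels.size(); ++i)
        labToRgb8(img.pixels[i], rgb[3 * i], rgb[3 * i + 1], rgb[3 * i + 2]);
    return rgb;
}
```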

answered Feb 05 '26 by Joe Z


