
Color Quantization Algorithm

I'm working on a color quantization algorithm.

This is the basic process:

  • Convert the image to a set of three dimensional vectors (RGB space for instance).
  • Put that set in a list of sets.
  • While the number of sets in the list is less than the number of colors you want:
    • Remove the worst set from the list.
    • Split it in two.
    • Add the two new sets in the list.
  • Done.
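To make the loop above concrete, here is a rough sketch in Python (the helper names `mean`, `cost`, and `split` are mine, and `split` here is just a placeholder that cuts along the widest axis at the mean, not the plane-normal split described below):

```python
def mean(vecs):
    # component-wise mean of a list of RGB tuples
    n = len(vecs)
    return tuple(sum(v[i] for v in vecs) / n for i in range(3))

def cost(vecs):
    # "worst set" score: accumulated Euclidean distance to the mean
    m = mean(vecs)
    return sum(sum((v[i] - m[i]) ** 2 for i in range(3)) ** 0.5 for v in vecs)

def split(vecs):
    # stand-in split: cut along the axis with the largest spread, at the mean
    m = mean(vecs)
    axis = max(range(3), key=lambda i: max(v[i] for v in vecs) - min(v[i] for v in vecs))
    lo = [v for v in vecs if v[axis] <= m[axis]]
    hi = [v for v in vecs if v[axis] > m[axis]]
    return lo, hi

def quantize(pixels, n_colors):
    clusters = [pixels]
    while len(clusters) < n_colors:
        # remove the worst cluster, split it, put both halves back
        worst = max(range(len(clusters)), key=lambda i: cost(clusters[i]))
        lo, hi = split(clusters.pop(worst))
        clusters += [lo, hi]
    return [mean(c) for c in clusters]
```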

What I mean by "worst set" is the set where the accumulated distance between each vector and the mean vector is the largest.

And this is how I "split a set":

  • Compute the mean vector by adding all vectors and dividing by vector count.
  • Compute a vector composed of the accumulated absolute differences between each vector and the mean vector. Normalize it and we get the normal of a plane that divides our vector set in two equal halves.
  • Use this normal to split the set into two sets, depending on which side of the plane each vector lies.
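This is how I read my own splitting step in Python (I'm interpreting "absolute differences" as per-channel sums; the function name `plane_split` is just for illustration, and it assumes the set isn't made of identical vectors, otherwise the normal would be zero):

```python
def plane_split(vecs):
    n = len(vecs)
    mean = tuple(sum(v[i] for v in vecs) / n for i in range(3))
    # accumulate the absolute difference to the mean, per channel
    diff = [sum(abs(v[i] - mean[i]) for v in vecs) for i in range(3)]
    # normalize it to get the plane normal
    norm = sum(d * d for d in diff) ** 0.5
    normal = [d / norm for d in diff]
    # sign of the dot product with (v - mean) tells which side of the plane v is on
    def side(v):
        return sum((v[i] - mean[i]) * normal[i] for i in range(3))
    a = [v for v in vecs if side(v) <= 0]
    b = [v for v in vecs if side(v) > 0]
    return a, b
```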

This basically works, but the resulting palette looks odd, as if picked from a linear gradient...

Is my algorithm plain wrong? Can someone help?

Nicolas Repiquet asked Dec 12 '25 06:12



1 Answer

The problem is that your algorithm depends a LOT on the initial sets. Because you only ever split sets, if two points that are very close to each other happen to land in different sets early on, they will stay in different sets forever. This is not good.

So yes - this is worse than the k-means algorithm.
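For comparison, here is a minimal k-means sketch in plain Python (naive initialization from the first `k` points and a fixed iteration count; real implementations use k-means++ init and a convergence test). The key difference from your split-only scheme is that points can move between clusters on every iteration, so an early bad grouping gets undone:

```python
def kmeans(points, k, iters=20):
    centers = points[:k]  # naive init; k-means++ would be better
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        buckets = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((p[d] - centers[j][d]) ** 2 for d in range(3)))
            buckets[nearest].append(p)
        # update step: each center moves to the mean of its bucket
        centers = [
            tuple(sum(p[d] for p in b) / len(b) for d in range(3)) if b else centers[i]
            for i, b in enumerate(buckets)
        ]
    return centers
```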

Petar Ivanov answered Dec 14 '25 18:12



