Plotting RGB spectrum as 2-d color matrix?

Any suggestions on how I might go about plotting the RGB color space as a 2-D matrix? I need a theoretical description of what's going on; a code sample or pseudocode would be helpful but is not required. Thanks!

Joseph Weissman asked Aug 31 '25

2 Answers

If you want to represent every color in RGB space in a 2D grid, it may be impossible to avoid discontinuities / sharp borders in the result. But some mapping techniques will look better than others.

Examples from the possiblywrong.wordpress.com post "allRGB: Hilbert curves and random spanning trees":

  • Traverse the pixels of the image via a 2-dimensional (order 12) Hilbert curve, while at the same time traversing the RGB color cube via a 3-dimensional (order 8) Hilbert curve, assigning each pixel in turn the corresponding color (see the sketch after this list).

  • "Breadth-first traversal of random spanning tree of pixels, assigning colors in Hilbert curve order." Breadth-first traversal of random spanning tree of pixels, assigning colors in Hilbert curve order.

Also check out allrgb.com, "The objective of allRGB is simple: To create images with one pixel for every RGB color (16777216); not one color missing, and not one color twice."

Mac Cowell answered Sep 03 '25


If you don't want to lose any information, you will need to use three dimensions. If you can afford to lose some of the dimensional structure, then it's easy. Just do something like this:

// Flatten the RGB cube (or HSV, if you prefer) into a 2-D matrix:
// the row index packs (r, g) together, the column index is b.
int colorMatrix[256 * 256][256];
for (int r = 0; r < 256; r++) {
    for (int g = 0; g < 256; g++) {
        for (int b = 0; b < 256; b++) {
            // color() stands for whatever packs the three channels into one value
            colorMatrix[256 * r + g][b] = color(r, g, b);
        }
    }
}
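To actually look at that layout, here is a minimal rendering sketch (an assumption on top of the answer: Python with NumPy and Pillow). It builds the same 65,536 x 256 arrangement, with the row index packing (r, g) and the column index being b, and writes it out as a tall, thin PNG.

import numpy as np
from PIL import Image

# r varies along axis 0, g along axis 1, b along axis 2
axis = np.arange(256, dtype=np.uint8)
r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")

# stack into (256, 256, 256, 3), then flatten (r, g) into one row axis:
# the pixel at [256*r + g, b] has color (r, g, b), matching colorMatrix above
flat = np.stack([r, g, b], axis=-1).reshape(256 * 256, 256, 3)

Image.fromarray(flat).save("rgb_flat.png")

The resulting image contains every color exactly once, but with visible sharp borders every 256 rows wherever the red channel steps by one.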
Lie Ryan answered Sep 03 '25