1-channel bitmap to an SDL_Texture

Tags:

c++

sdl-2

Given a single-channel bitmap (alpha from 0 to 255) represented by unsigned char[], what are the ways of making an SDL_Texture out of it? The bitmap has no extra data.

Answer

// Wrap the raw 8-bit alpha data in an indexed surface; the RGBA masks are
// ignored for palettized formats, so they can all be zero.
auto* surf = SDL_CreateRGBSurfaceFrom(bitmap, width, height, 8, width, 0, 0, 0, 0);
// Build a palette where every entry is white and alpha equals the index,
// so each source byte maps directly to its own alpha value.
SDL_Color colors[256];
{
    Uint8 v = 0;
    for(SDL_Color& color: colors) {
        color.r = color.g = color.b = 0xFF;
        color.a = v++;
    }
}
SDL_SetPaletteColors(surf->format->palette, colors, 0, 256);
auto* texture = SDL_CreateTextureFromSurface(sdl.renderer, surf);
SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
SDL_FreeSurface(surf); // the surface is no longer needed once the texture exists
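
Since the palette's RGB channels are all white, the resulting texture can also serve as a tintable mask. A minimal usage sketch; the x and y placement values are assumptions, not part of the original code:

// Optional: tint the white mask at draw time instead of baking a color in.
SDL_SetTextureColorMod(texture, 0xFF, 0x00, 0x00); // e.g. draw it in red
SDL_Rect dst{x, y, width, height};                 // x, y: assumed placement
SDL_RenderCopy(sdl.renderer, texture, nullptr, &dst);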

Original question

From what I understand, it is done by first calling SDL_CreateRGBSurfaceFrom(bitmap, width, height, depth, pitch, rmask, gmask, bmask, amask).

1) One (dirty) way I tried was creating a copy of the original bitmap as an unsigned[], where every int defaults to 0xFFFFFF00 (opaque white, zero alpha), then a call to:
SDL_CreateRGBSurfaceFrom(bitmap, width, height, 32, 4 * width, 0xff000000, 0xff0000, 0xff00, 0xff) did it for me (opaque white down to fully transparent, since 0xFF is the alpha mask).

But I wanted another way, because this way required an additional memory allocation and a weird for-loop.

auto pixelsNum = width * height;
auto bitmap = new unsigned[pixelsNum]; // extra allocation just to widen the data
for(auto i = 0; i < pixelsNum; ++i) {
    // alpha in the low byte, opaque white (0xFFFFFF) in the upper three bytes
    bitmap[i] = tmpBitmap[i] | 0xffffff00;
}
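
For completeness, here is how approach 1 would continue; a sketch assuming the sdl.renderer from the accepted answer:

// Wrap the widened RGBA copy (alpha in the low byte) in a 32-bit surface.
SDL_Surface* surf = SDL_CreateRGBSurfaceFrom(bitmap, width, height, 32, 4 * width,
                                             0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
SDL_Texture* texture = SDL_CreateTextureFromSurface(sdl.renderer, surf);
SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
SDL_FreeSurface(surf);
delete[] bitmap; // the copy can go as soon as the texture exists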


2) The second way I tried was using depth = 8; that way I didn't need to allocate a new bitmap, but instead (needed?) an SDL_Palette.

// Masks are ignored for 8-bit indexed surfaces, so the trailing 0xff is harmless.
auto surf = SDL_CreateRGBSurfaceFrom(bitmap, width, height, 8, width, 0, 0, 0, 0xff);
SDL_Color colors[256];
for(int i = 0; i < 256; ++i) {
    colors[i].r = colors[i].g = colors[i].b = 0xff; // always white
    colors[i].a = i;                                // alpha equals the palette index
}
SDL_SetPaletteColors(surf->format->palette, colors, 0, 256);

This is far cheaper in memory, and the extra cost (a 256-entry palette) is constant.
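
As a side note, if you build against SDL 2.0.5 or newer, the mask parameters can be avoided entirely by naming the format directly; a sketch of the equivalent call:

// Equivalent indexed surface without magic mask values (SDL >= 2.0.5).
SDL_Surface* surf = SDL_CreateRGBSurfaceWithFormatFrom(
    bitmap, width, height, 8, width, SDL_PIXELFORMAT_INDEX8);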

So I'm interested in whether there is a better way.
Thanks.

iwat0qs asked Dec 08 '25

1 Answer

SDL_CreateRGBSurface(0, w, h, 8, 0, 0, 0, 0xff); is enough to get an alpha-only surface. However, when converting to a texture, the renderer will try to choose the closest format the graphics hardware supports, and hardware hasn't had palette support for quite some time. The resulting texture will most likely have the SDL_PIXELFORMAT_ARGB8888 format regardless of your approach. I see no support in SDL for 1-channel textures, but it might still be possible using OpenGL textures directly (e.g. SDL_GL_BindTexture). Of course, that will force you to stick with one specific SDL_Renderer implementation.
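
The format claim is easy to verify; a small sketch, assuming texture was created as in the accepted answer:

Uint32 format = 0;
int access = 0, w = 0, h = 0;
SDL_QueryTexture(texture, &format, &access, &w, &h);
// Typically prints SDL_PIXELFORMAT_ARGB8888 (or similar), not an indexed format.
SDL_Log("texture format: %s", SDL_GetPixelFormatName(format));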

keltar answered Dec 10 '25

