 

Enabling OpenGL extensions

Tags: opengl, glsl, lwjgl

I'm trying to perform some integer operations (division and modulo) in my GLSL shader, but they don't seem to work. I read that I need to enable EXT_gpu_shader4 to get integer operations, but what I can't find is how to do that. Is the line:

#version 330 core
#extension GL_EXT_gpu_shader4 : require

enough? Or do I also need to enable it through the C API somehow? Currently I get a compilation error saying the extension is not supported. I'm on a GeForce 670, a fairly recent card.

asked Oct 25 '25 by atanamir

1 Answer

If you want to enable EXT_gpu_shader4, then yes, that line will do it.
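For reference, here is a minimal sketch of a shader where the extension is genuinely needed: a fragment shader written against GLSL 1.20, where the integer modulo operator is not part of the core language. The uniform tileIndex and the arithmetic are invented purely for illustration.

#version 120
#extension GL_EXT_gpu_shader4 : require   // adds full integer support (%, bitwise ops, etc.) to GLSL 1.20

uniform int tileIndex;   // hypothetical uniform, just for the example

void main()
{
    int row = tileIndex / 4;   // integer division
    int col = tileIndex % 4;   // integer modulo, provided by the extension in pre-1.30 GLSL
    gl_FragColor = vec4(float(row) * 0.25, float(col) * 0.25, 0.0, 1.0);
}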

However, you shouldn't be enabling EXT_gpu_shader4 at all. OpenGL 3.0 already incorporates all of this extension into core functionality. There's no reason to enable an extension to access stuff you already have access to thanks to your #version declaration.
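As a concrete illustration, here is a minimal GLSL 3.30 fragment shader sketch that uses integer division and modulo with no #extension line at all; again, the uniform name and the arithmetic are made up for the example.

#version 330 core
// No #extension directive needed: integer operations are core in GLSL 3.30.

out vec4 fragColor;

uniform int tileIndex;   // hypothetical uniform, just for the example

void main()
{
    int row = tileIndex / 4;   // integer division is core
    int col = tileIndex % 4;   // so is the modulo operator
    fragColor = vec4(float(row) * 0.25, float(col) * 0.25, 0.0, 1.0);
}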

answered Oct 27 '25 by Nicol Bolas


