A tutorial I read specifically advocated using glVertexAttribIPointer for integer data, uploading the data into an ivec4 attribute in the shader.
I was wondering why I can't use glVertexAttribPointer with the GL_INT data type instead. Why would it corrupt my data? Reading the API reference gave me no clue.
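For concreteness, here is a minimal sketch of the two calls I'm comparing. The attribute location 3, the vbo handle, and VERTEX_COUNT are placeholders for this example, not anything from the tutorial:

    /* Sketch: integer vertex data feeding an "in ivec4" shader input.
     * Location, buffer handle, and vertex count are made up.           */
    #include <GL/glew.h>

    #define VERTEX_COUNT 128

    void setup_integer_attribute(GLuint vbo)
    {
        GLint data[4 * VERTEX_COUNT] = {0};   /* one ivec4 per vertex */

        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof data, data, GL_STATIC_DRAW);

        /* What the tutorial advocates: the data stays integral and
         * matches an "in ivec4" input in the vertex shader.            */
        glVertexAttribIPointer(3, 4, GL_INT, 0, (const void *)0);
        glEnableVertexAttribArray(3);

        /* What I was asking about: why not this instead?
         * glVertexAttribPointer(3, 4, GL_INT, GL_FALSE, 0, (const void *)0);
         */
    }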
I looked at the OpenGL 3.3 (Core Profile) spec, and all I could find was this wording on page 29:
"Data for an array specified by VertexAttribPointer will be converted to floating-point by normalizing if normalized is TRUE, and converted directly to floating-point otherwise. Data for an array specified by VertexAttribIPointer will always be left as integer values; such data are referred to as pure integers."
I couldn't find any explicit mention of error checking, unspecified values, or undefined behavior, but presumably, if your attribute array is converted to floating-point while your shader expects pure integers, the shader ends up reading something other than the values you stored, and your attribute data is effectively corrupted.
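To illustrate what that conversion can mean, here is a small standalone sketch of one plausible outcome: the converted float's bit pattern gets read back as an integer. This is only an assumption for illustration; the spec simply doesn't define what an integer shader input sees when fed converted floating-point data.

    /* Illustration of why converting GL_INT to float corrupts an integer
     * attribute if the shader later treats the 32 bits as an integer.    */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int32_t original = 42;              /* the value stored in the VBO        */
        float converted = (float)original;  /* what glVertexAttribPointer does    */

        /* If an ivec4 shader input then reads those 32 bits as an integer,
         * it sees the IEEE-754 bit pattern of 42.0f, not the number 42.    */
        int32_t seen_by_shader;
        memcpy(&seen_by_shader, &converted, sizeof seen_by_shader);

        printf("stored %d, shader might see %d\n", original, seen_by_shader);
        /* Typical output: stored 42, shader might see 1109917696 */
        return 0;
    }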
I know that very old hardware used floating point everywhere, so ivec4 would just be vec4 in disguise. Therefore, even if this happens to work on your hardware, you shouldn't do it.
Summary: Don't do it.