 

WebGL differs from OpenGL preprocessor on same graphics stack

I just came upon an interesting effect in Chrome's GLSL compiler. The statement

#define addf(index) if(weights[i+index]>0.) r+=weights[i+index]*f##index(p);

fails to compile with the error

preprocessor command must not be preceded by any other statement in that line

It seems that the ## syntax is unsupported. However, on the same platform (e.g. Linux 64-bit, NVIDIA GPU) the same shader compiles and runs fine under desktop OpenGL. Why is this? I thought the shader compiler is part of the GPU's driver stack and would be used in both cases. So why the different behavior?

asked Feb 19 '26 by dronus

1 Answer

Actually, WebGL is also described as "OpenGL ES 2.0 for the Web", so there are some differences from desktop OpenGL.

The WebGL spec ( https://www.khronos.org/registry/webgl/specs/1.0/ ) tells us: "A WebGL implementation must only accept shaders which conform to The OpenGL ES Shading Language, Version 1.00."

Looking into the GLSL ES 1.0 spec ( https://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf ) I found:

Section 3.4 defines the preprocessor and states: "There are no number sign based operators (no #, #@, ##, etc.), nor is there a sizeof operator."

So whatever the browser's implementation does internally, it follows the standard; the desktop driver's compiler is simply more lenient than the desktop GLSL spec requires and accepts the ## operator anyway. :)
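As a workaround under GLSL ES 1.00, the pasted names can simply be written out per index instead of being built with ##. A minimal sketch, reusing the question's weights, r, i and p, and assuming the hypothetical per-index functions f0, f1, … that the original macro was pasting together:

```glsl
// GLSL ES 1.00 has no ## token-pasting operator, so each macro must
// spell out its target function by hand. f0/f1 are the question's
// hypothetical per-index functions.
#define addf0 if (weights[i + 0] > 0.) r += weights[i + 0] * f0(p);
#define addf1 if (weights[i + 1] > 0.) r += weights[i + 1] * f1(p);

// Then invoke addf0; addf1; ... where the original code used
// addf(0); addf(1); ...
```

This trades the generic macro for a little repetition, but it stays within what the WebGL shader validator accepts.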

answered Feb 22 '26 by Tobias Schlegel


