meindnoch 5 days ago

WebGL/OpenGL doesn't use sRGB inside shaders. If you sample an sRGB texture, the hardware decodes sRGB to linear on read; if you render to an sRGB surface, it encodes linear back to sRGB on write. The shader only ever sees linear values.
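A minimal WebGL2 sketch of the texture side, assuming an existing `gl` context and a decoded `image` element (both names are placeholders):

    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // SRGB8_ALPHA8 marks the storage as sRGB-encoded, so texture
    // fetches in the shader return decoded (linear) values.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.SRGB8_ALPHA8,
                  gl.RGBA, gl.UNSIGNED_BYTE, image);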

kookamamie 5 days ago

Correct. To the shader they're just numbers, with no color-space information attached; sRGB decoding and encoding happen at the texture-read and framebuffer-write stages.
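For reference, these are the piecewise sRGB transfer functions that those fixed-function read/write stages apply, sketched here in JavaScript (per channel, on values in [0, 1]):

    // sRGB decode (applied on texture read): encoded -> linear
    const srgbToLinear = c =>
      c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);

    // sRGB encode (applied on framebuffer write): linear -> encoded
    const linearToSrgb = c =>
      c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;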