I found an odd bug (?) when adding texturing support to glChess.
If you use a texture whose dimensions are not 2^N, performance is complete crap. OpenGL traditionally requires power-of-two texture dimensions, and my guess is that if you don't supply them in that form the driver falls back to a software path (maybe Mesa's software renderer). Seems like good behaviour at a glance.
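The practical workaround is to pad or rescale images up to power-of-two dimensions before uploading them. A rough sketch of what I mean, in C (upload_texture is a made-up name, and I'm assuming a tightly packed RGBA image in client memory; gluScaleImage does the resampling):

    #include <stdlib.h>
    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Round n up to the next power of two. */
    static int next_pow2(int n)
    {
        int p = 1;
        while (p < n)
            p <<= 1;
        return p;
    }

    /* Rescale to power-of-two dimensions before uploading, so the
     * driver stays on the fast path. Assumes tightly packed RGBA. */
    void upload_texture(const unsigned char *pixels, int width, int height)
    {
        int pw = next_pow2(width);
        int ph = next_pow2(height);

        if (pw == width && ph == height) {
            /* Already 2^N: upload directly. */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        } else {
            unsigned char *scaled = malloc((size_t) pw * ph * 4);
            gluScaleImage(GL_RGBA, width, height, GL_UNSIGNED_BYTE, pixels,
                          pw, ph, GL_UNSIGNED_BYTE, scaled);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pw, ph, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, scaled);
            free(scaled);
        }
    }

Rescaling distorts the image slightly, so padding with border pixels and adjusting the texture coordinates instead is the nicer option if you can be bothered.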
But it got me thinking: is it always good to fall back like this? Yes, it gives you maximum feature support (and hopefully the fallback functions get replaced with optimised ones over time), but it makes things confusing for the user/developer. I could not work out why things were so bad (I was trying all sorts of tricks to find the cause) and it was only on a hunch that I found it. (I think I have hit this issue previously when developing.)
What would have been nice is some sort of warning that this was occurring (not that I can think of a clean way to fit a warning like this into OpenGL).
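In lieu of that, the best you can do is detect the situation yourself: NPOT support was eventually exposed via the GL_ARB_texture_non_power_of_two extension, so you can at least print your own warning when it's absent. A sketch (warn_if_slow_texture is my own name, and note this only tells you NPOT textures are legal, not that a given driver is fast at them):

    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    /* Warn when a texture is likely to hit a slow path: dimensions
     * aren't 2^N and the driver doesn't advertise NPOT support. */
    void warn_if_slow_texture(int width, int height)
    {
        const char *exts = (const char *) glGetString(GL_EXTENSIONS);
        /* Naive substring check; fine for this particular name. */
        int npot_ok = exts && strstr(exts, "GL_ARB_texture_non_power_of_two");
        int is_pow2 = width > 0 && (width & (width - 1)) == 0 &&
                      height > 0 && (height & (height - 1)) == 0;

        if (!is_pow2 && !npot_ok)
            fprintf(stderr, "warning: %dx%d texture is not 2^N; "
                    "expect a software fallback\n", width, height);
    }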
1 comment:
Perhaps they need to return error codes from the GL function that initialises the texture.
"This texture's dimensions are not a value 2^N, you suck and we are now going to render things slowly."