Michael Vance at Loki found this one:
The GL_ARB_texture_compression spec says that the internalformat
parameter to glCompressedTexImage[123]DARB() should be a GLint.
However, the glext.h file (generated from gl.spec) declares internalformat
as a GLenum.
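
If I'm reading things right, the generated prototype differs from the
spec only in that one parameter. The other parameter types below are
the usual ones for this entry point, shown just for context:

    /* What the ARB_texture_compression spec calls for: */
    void glCompressedTexImage2DARB(GLenum target, GLint level,
                                   GLint internalformat,
                                   GLsizei width, GLsizei height,
                                   GLint border, GLsizei imageSize,
                                   const GLvoid *data);

    /* What the current glext.h actually declares: */
    void glCompressedTexImage2DARB(GLenum target, GLint level,
                                   GLenum internalformat,
                                   GLsizei width, GLsizei height,
                                   GLint border, GLsizei imageSize,
                                   const GLvoid *data);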
I think the problem stems from the fact that gl.spec uses the
PixelInternalFormat token for internalformat:
CompressedTexImage2DARB(target, level, internalformat, width, height, border, imageSize, data)
    return          void
    param           target          TextureTarget in value
    param           level           CheckedInt32 in value
    param           internalformat  PixelInternalFormat in value
    ...
PixelInternalFormat maps to GLenum for every other function that
uses it in gl.spec.
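
(For context, that mapping lives in the gl.tm type-map file that the
header generator reads. From memory, so the exact field layout may be
off, the entry looks something like:

    PixelInternalFormat,*,*,    GLenum,*,*,

and the generator simply substitutes the C type from that table, which
is why whatever spec token internalformat uses directly determines the
type that ends up in glext.h.)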
Right before the entries for glCompressedTexImage[123]DARB() is the
comment:
# Arguably TexelInternalFormat, not PixelInternalFormat
So it appears that someone thought about this but it wasn't resolved.
I suggest that TextureComponentCount be used instead of PixelInternalFormat.
TextureComponentCount is used by glTexImage[123]D for internalformat and
it maps to GLint.
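The fix would presumably be a one-token change to each of the three
entries in gl.spec, e.g. for the 2D case:

    -param           internalformat  PixelInternalFormat in value
    +param           internalformat  TextureComponentCount in value

After regenerating, glext.h would then declare internalformat as a
GLint, matching the extension spec (and matching glTexImage[123]D).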
Can someone fix this, then generate a new glext.h file?
Thanks.
-Brian