There hasn't been a reply to this problem yet. Can someone
at SGI fix this?
Brian Paul wrote:
> Michael Vance at Loki found this one:
> The GL_ARB_texture_compression spec says that the internalformat
> parameter to glCompressedTexImage[123]DARB() should be a GLint.
> However, the glext.h file (generated from gl.spec) has internalformat
> as a GLenum.
> I think the problem stems from the fact that gl.spec uses the
> PixelInternalFormat token for internalformat:
> CompressedTexImage2DARB(target, level, internalformat, width, height, border, imageSize, data)
>     return       void
>     param        target          TextureTarget in value
>     param        level           CheckedInt32 in value
>     param        internalformat  PixelInternalFormat in value
>     ...
> PixelInternalFormat maps to GLenum for every other function that
> uses it in gl.spec.
> Right before the entries for glCompressedTexImage[123]DARB() is the comment:
>     # Arguably TexelInternalFormat, not PixelInternalFormat
> So it appears that someone thought about this but it wasn't resolved.
> I suggest that TextureComponentCount be used instead of PixelInternalFormat.
> TextureComponentCount is used by glTexImage[123]D for internalformat and
> it maps to GLint.
> Can someone fix this, then generate a new glext.h file?
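To spell out the mismatch Brian describes, here's roughly how the two
prototypes compare (a sketch; the real glext.h declaration also carries
GLAPI/APIENTRY decoration):

    /* What the GL_ARB_texture_compression spec calls for:
       internalformat is a GLint */
    void glCompressedTexImage2DARB(GLenum target, GLint level,
                                   GLint internalformat, GLsizei width,
                                   GLsizei height, GLint border,
                                   GLsizei imageSize, const GLvoid *data);

    /* What the generated glext.h currently declares:
       internalformat is a GLenum */
    void glCompressedTexImage2DARB(GLenum target, GLint level,
                                   GLenum internalformat, GLsizei width,
                                   GLsizei height, GLint border,
                                   GLsizei imageSize, const GLvoid *data);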
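The GLenum comes from the typemap file (gl.tm) that the header generator
reads alongside gl.spec. If I remember its format correctly, the two
relevant entries look something like this:

    PixelInternalFormat,*,*,    GLenum,*,*
    TextureComponentCount,*,*,  GLint,*,*

so the token gl.spec names for a parameter directly determines the C type
emitted into glext.h.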
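Brian's suggestion would then amount to a one-line change in each of the
three CompressedTexImage*DARB entries in gl.spec, along these lines:

    -    param        internalformat  PixelInternalFormat in value
    +    param        internalformat  TextureComponentCount in value

Regenerating the header after that would make internalformat a GLint,
matching the extension spec. Note that changing the typemap entry for
PixelInternalFormat itself would be the wrong fix, since every other
function that uses that token really does take a GLenum.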