[ogl-sample] internalformat param inconsistency

To: ogl-sample@xxxxxxxxxxx
Subject: [ogl-sample] internalformat param inconsistency
From: Brian Paul <brianp@xxxxxxxxxxx>
Date: Sat, 09 Sep 2000 17:59:10 -0600
Organization: VA Linux Systems, Inc.
Reply-to: ogl-sample@xxxxxxxxxxx
Sender: owner-ogl-sample@xxxxxxxxxxx

Michael Vance at Loki found this one:

The GL_ARB_texture_compression spec says that the internalformat
parameter to glCompressedTexImage[123]DARB() should be a GLint.

However, the glext.h file (generated from gl.spec) has internalformat
as a GLenum.

I think the problem stems from the fact that gl.spec uses the
PixelInternalFormat token for internalformat:

CompressedTexImage2DARB(target, level, internalformat, width, height, border, imageSize, data)
        return          void
        param           target          TextureTarget in value
        param           level           CheckedInt32 in value
        param           internalformat  PixelInternalFormat in value

PixelInternalFormat maps to GLenum for every other function that
uses it in gl.spec.

Right before the entries for glCompressedTexImage[123]DARB() is this comment:

# Arguably TexelInternalFormat, not PixelInternalFormat

So it appears that someone thought about this, but it was never resolved.

I suggest that TextureComponentCount be used instead of PixelInternalFormat.
TextureComponentCount is used by glTexImage[123]D for internalformat and
it maps to GLint.

Can someone fix this, then generate a new glext.h file?
