Sorry for abusing this list a bit, but I think here are some of the best
people for answering the following question (if not, please give me a hint
where to ask instead):
What is the rationale for "GLsizei" being an "int" on all platforms I know of
and not an "unsigned int"? Reading the description in the OpenGL spec
("non-negative integer size"), one might expect the latter rather than the
former. Is it because of some legacy reason with IRIS GL? Is it simply because
of the name (which might be GLsizeui otherwise)? Or is it simply by
accident? :-)
This is not an academic question: we are discussing how to standardise similar
types for OpenAL (http://www.openal.org/), see e.g. the mail at:
http://opensource.creative.com/pipermail/openal-devel/2005-July/000974.html
Cheers,
S.