RE: glBlendColorEXT problem

From: Luc Renambot (renambot++at++cs.vu.nl)
Date: 03/13/2002 02:25:54


It works for me in OpenGL. You could try something like:
        // Init: check the extension string for blend-equation support
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);

        if (strstr(ext, "EXT_blend_minmax"))
                printf("EXT_blend_minmax supported.\n");

        if (strstr(ext, "EXT_blend_subtract"))
                printf("EXT_blend_subtract supported.\n");

        // Max setting: keep the per-channel maximum when blending
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE);
        glBlendEquationEXT(GL_MAX_EXT);
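
For glBlendColorEXT itself, one thing worth checking on Linux is that
the entry point is resolved at run time rather than called as a direct
link-time symbol. A minimal sketch, assuming a current GLX context,
<GL/glext.h> for the PFNGLBLENDCOLOREXTPROC typedef, and that
glXGetProcAddressARB is available:

        #include <stdio.h>
        #include <string.h>
        #include <GL/glx.h>
        #include <GL/glext.h>

        static PFNGLBLENDCOLOREXTPROC pglBlendColorEXT = NULL;

        /* Resolve glBlendColorEXT at run time; returns 1 on success. */
        int initBlendColor(void)
        {
                const char *ext = (const char *)glGetString(GL_EXTENSIONS);

                if (!ext || !strstr(ext, "GL_EXT_blend_color")) {
                        printf("GL_EXT_blend_color not supported.\n");
                        return 0;
                }

                /* On Linux, fetch the extension function pointer
                   instead of relying on a link-time symbol. */
                pglBlendColorEXT = (PFNGLBLENDCOLOREXTPROC)
                        glXGetProcAddressARB((const GLubyte *)"glBlendColorEXT");

                return pglBlendColorEXT != NULL;
        }

        /* Example use: blend against a constant half-grey color. */
        void drawWithConstantColor(void)
        {
                glEnable(GL_BLEND);
                glBlendFunc(GL_CONSTANT_COLOR_EXT,
                            GL_ONE_MINUS_CONSTANT_COLOR_EXT);
                pglBlendColorEXT(0.5f, 0.5f, 0.5f, 0.5f);
        }

If the lookup succeeds and the call still has no visible effect, the
driver itself is the more likely suspect.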

Luc.

> -----Original Message-----
> From: Gabriel Nicula [mailto:gabnicu++at++yahoo.com]
> Sent: Tuesday, March 12, 2002 19:14
> To: info-performer++at++sgi.com
> Subject: glBlendColorEXT problem
>
>
> I have a Performer app that needs to use
> glBlendColorEXT, and it doesn't work. I'm running
> RedHat Linux 7.2 on a P4 with a GeForce 3 Ti 200,
> with the latest drivers from nVidia, and noticed
> that glBlendColorEXT does NOT work.
> To test further, I made a simple GLUT OpenGL app
> that exercises glBlendColorEXT, and it didn't work
> either. The same simple program runs perfectly on
> an SGI O2 workstation. Performer itself runs very
> well.
> Has anyone else encountered this problem?
>
>


This message has been cleansed for anti-spam protection. Replace '++at++' in any mail addresses with the '@' symbol.