running out of performer shared memory


Anita Kishore (kishore++at++triavest.com)
Wed, 25 Jun 1997 16:36:14 -0700


Hi:

        I am having trouble with Performer shared memory after adding new code
that pfMalloc()s some structures (in the shared arena) and puts them in a
pfList. The message I get at that line is:

pfMemory::new() Unable to allocate 48 bytes from arena 0x18040000.
        Try using pfSharedArenaSize() to increase the arena size
        (currently 262144.00 KBytes) and check for adequate setrlimit()
        values and available space on swap (or pfTmpDir()).

and the program dumps core after this. The core file also points to the same
problem.

I increased the shared arena size to 454288000 bytes (I can't increase it
beyond this), but I still get the same message, even though it now reports the
current size as the new value. The file I load is very simple and there should
be enough memory for everything, because much bigger scene graph files (minus
the new code) have so far been loading all right. I am sure the new allocation
does not take more memory than what is used when loading big scene graphs
(unless Performer uses a different arena for its nodes?). I also made sure the
ipcs output is clean.

Is there a call to check the shared memory usage from the program? How can I
further increase the arena size? Any help on this problem is greatly
appreciated.

thanks a lot
-anita

kishore++at++triavest.com

-- 
Anita Kishore
=======================================================================
List Archives, FAQ, FTP:  http://www.sgi.com/Technology/Performer/
            Submissions:  info-performer++at++sgi.com
        Admin. requests:  info-performer-request++at++sgi.com


This archive was generated by hypermail 2.0b2 on Mon Aug 10 1998 - 17:55:30 PDT

This message has been cleansed for anti-spam protection. Replace '++at++' in any mail addresses with the '@' symbol.