Re: Pipes


Angus Dorbie (dorbie++at++sgi.com)
Tue, 28 Jul 1998 14:05:05 -0700


William Sherman -Visualization wrote:
>
> Angus Dorbie wrote:
>
> > Steve Baker wrote:
> > >
> > > No - you can't have a 'window' (an X-window specifically) that straddles
> > > more than one PIPE (or more than one pfPipe for that matter). There have
> > > been cases where multiple PIPES have been used to drive a single video -
> > > IIRC, the Disney Aladdin ride did that at one time. It needs some special
> > > hardware though.
> >
> > We have this for ONYX2 now, it was announced at Siggraph.
> >
> > Reality Monster uses image readback to scale graphics for a single
> > channel across multiple pipes but you had to have an application which
> [...]
>
> There are two ways I can envision multiple PIPEs being used to
> feed a single video output -- temporal interlacing and spatial
> division of the frame buffer.
>
> I would guess that temporal interlacing would be easier to
> handle, and I understand the old Aladdin ride used temporal
> interlacing. Is this the method that the new Onyx2/IR2 does?

The term "temporal interlacing" is an awful one with some
unfortunate and potentially confusing connotations, particularly
since conventional interlacing is also temporal in nature. I know
what you mean by it, but at least use "interleaving" or some such.

I'll just describe what it does to avoid confusion; I don't feel
it's correct to call this interlacing.

Say you have 3 pipes and need 60 Hz output: each pipe would render
at 20 Hz, but the pipes would be genlocked and swap-readied to run
120 degrees out of phase with one another. At every 60 Hz vertical
retrace, therefore, one of the pipes has just completed rendering
the channel, and that pipe sends LVDS digital video to the display
generators, from which you take either digital or analog video to
the display. All display generators are daisy-chained on the LVDS
bus, so you need only tap one DG board for the final output.
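The arithmetic is simple enough to spell out. Here's a minimal
sketch in plain C of the round-robin schedule just described; it
touches no SGI API, and NUM_PIPES / OUTPUT_HZ are simply the figures
from the 3-pipe, 60 Hz example above:

#include <stdio.h>

#define NUM_PIPES 3
#define OUTPUT_HZ 60.0

int main(void)
{
    /* Each pipe renders at the output rate divided by the pipe count. */
    double pipe_hz   = OUTPUT_HZ / NUM_PIPES;   /* 20 Hz per pipe    */
    double phase_deg = 360.0 / NUM_PIPES;       /* 120 degrees apart */
    int retrace;

    printf("per-pipe rate: %.0f Hz, phase offset: %.0f degrees\n",
           pipe_hz, phase_deg);

    /* At every 60 Hz retrace exactly one pipe has just finished its
     * frame; the pipes take turns in round-robin order. */
    for (retrace = 0; retrace < 6; retrace++)
        printf("retrace %d: pipe %d sends its frame to video\n",
               retrace, retrace % NUM_PIPES);
    return 0;
}

With 3 pipes the pattern repeats 0, 1, 2, 0, 1, 2, ... so the video
stream sees a fresh image at every retrace even though no single
pipe renders faster than 20 Hz.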

The application runs at 60 Hz, and the other Performer processes run
out of phase at the lower rate, in sync with the out-of-phase swap
for each of the pipes. At each 60 Hz interval the frame data is
taken up by the pipe processes that have just swapped and sent their
image out to video, and rendering begins anew.
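For flavour, here is a rough skeleton of where the pieces sit in an
ordinary Performer multipipe program. To be clear, this is only an
illustrative sketch: the genlock, swap-ready wiring and the
120-degree phase stagger described above are done by the video
hardware and system software, not by these calls, and nothing below
by itself produces the staggered 20 Hz rendering:

/* Illustrative skeleton only: stock Performer multipipe setup. */
#include <Performer/pf.h>

#define NUM_PIPES 3

int main(void)
{
    int i;

    pfInit();
    pfMultipipe(NUM_PIPES);      /* one rendering process per pipe      */
    pfConfig();                  /* fork the per-pipe draw processes    */

    pfFrameRate(60.0f);          /* app process runs at the full 60 Hz  */
    pfPhase(PFPHASE_LOCK);       /* lock frame boundaries to retrace    */

    for (i = 0; i < NUM_PIPES; i++)
        pfNewChan(pfGetPipe(i)); /* one channel on each pipe            */

    while (1) {
        pfSync();                /* wait for the next frame boundary    */
        /* ... update the scene here ... */
        pfFrame();               /* kick off this frame's cull and draw */
    }
    /* not reached */
}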

Ultimately you get a single video stream composed of images from the
pipes in round-robin fashion. The video is limited to the bandwidth
of a single DAC and/or the bandwidth of the LVDS bus, but
high-resolution progressive scan is certainly within the scope of
what can be done.

Cheers, Angus.

-- 
"Only the mediocre are always at their best." -- Jean Giraudoux 

For advanced 3D graphics Performer + OpenGL based examples and tutors: http://www.dorbie.com/

