From: GAUVIN Yves (yves.gauvin++at++sncf.fr)
Date: 07/27/2000 00:27:04
Hi,
I want to use a Magellan, a Flock of Birds and a CyberGlove with Linux
Performer.
Does anybody have code for the drivers?
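A driver for the Flock of Birds would presumably start with something like the
untested sketch below; the serial device, baud rate and the 'Y'/'B' command
bytes are assumptions that need checking against the Ascension manual:

  /* Untested sketch: read one Flock of Birds record over RS-232 on Linux.
   * Device path, baud rate and the 'Y'/'B' command bytes are assumptions
   * taken from the Ascension documentation; a real driver would loop on
   * read() until the full record has arrived. */
  #include <fcntl.h>
  #include <stdio.h>
  #include <termios.h>
  #include <unistd.h>

  int main(void)
  {
      int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);  /* assumed port  */
      if (fd < 0) { perror("open"); return 1; }

      struct termios tio;
      tcgetattr(fd, &tio);
      cfmakeraw(&tio);
      cfsetispeed(&tio, B9600);                         /* assumed baud  */
      cfsetospeed(&tio, B9600);
      tcsetattr(fd, TCSANOW, &tio);

      write(fd, "Y", 1);     /* position/angles output mode (check manual) */
      write(fd, "B", 1);     /* point mode: ask for a single record        */

      unsigned char rec[12]; /* position/angles record should be 12 bytes  */
      ssize_t n = read(fd, rec, sizeof(rec));
      printf("got %zd bytes\n", n);

      close(fd);
      return 0;
  }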
PS: I have seen the work of DIVERSE, but it is not available under Linux...
Yves GAUVIN
olivier Billard wrote:
>
> I've used it in a VR Performer-based API:
> one or two hands with a Flock of Birds tracker and stereo in a helmet.
> I don't know about the CAVE, but with Performer it's fairly easy to deal with.
>
> You may just be annoyed by the thumb kinematics and by accuracy drift due to
> temperature variations if you are not using the full package for tracking:
> in my API a PC is connected to the devices and feeds the data to the Onyx,
> so I cannot use the full package (loader and animatics).
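>
> Roughly, all the PC side has to do is pack a sample and push it to the Onyx
> over a socket; an untested sketch of that idea is below (the port number and
> the record layout are only placeholders, and a real version would byte-swap,
> since the Onyx is big-endian):
>
>   /* Untested sketch of a PC-side sender: pack one tracker/glove sample
>    * and push it to the Onyx over UDP. Port and layout are placeholders;
>    * real code must byte-swap, since the Onyx (MIPS) is big-endian. */
>   #include <arpa/inet.h>
>   #include <netinet/in.h>
>   #include <string.h>
>   #include <sys/socket.h>
>   #include <sys/types.h>
>   #include <unistd.h>
>
>   struct hand_sample {      /* example layout, not the real wire format */
>       float pos[3];         /* tracker position                         */
>       float rot[3];         /* tracker Euler angles                     */
>       float joints[22];     /* CyberGlove joint angles                  */
>   };
>
>   int send_sample(const char *onyx_ip, const struct hand_sample *s)
>   {
>       int sock = socket(AF_INET, SOCK_DGRAM, 0);
>       if (sock < 0) return -1;
>
>       struct sockaddr_in dst;
>       memset(&dst, 0, sizeof(dst));
>       dst.sin_family = AF_INET;
>       dst.sin_port = htons(5000);              /* example port          */
>       inet_pton(AF_INET, onyx_ip, &dst.sin_addr);
>
>       ssize_t n = sendto(sock, s, sizeof(*s), 0,
>                          (struct sockaddr *)&dst, sizeof(dst));
>       close(sock);
>       return n == (ssize_t)sizeof(*s) ? 0 : -1;
>   }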
> ----- Original Message -----
> From: Raymond de Vries <raymond++at++sara.nl>
> To: <info-performer++at++sgi.com>
> Sent: Tuesday, July 25, 2000 3:59 PM
> Subject: CyberGlove, VirtualHand Suite & Performer
>
> > Hi all,
> >
> > I am looking for any information about using 2 CyberGloves, the VirtualHand
> > Suite and Performer in one (CAVE) app. While searching the Virtual
> > Technologies site (makers of the CyberGlove), the answer I found to "Does
> > VirtualHand support Performer?" was the shortest ever: "Yes".
> > No further information...
> >
> > So this brings me to you: do you have any experience with this
> > combination? The VirtualHand toolkit has shown that it can do hand
> > kinematics and rendering, provided that I use the 'standard' setup as
> > described in the manual: a combination of glove and tracking sensor with
> > the VirtualHand scene graph. However, since I want to use it in our CAVE,
> > I can't let VirtualHand control the tracking sensor directly (as it is
> > already controlled by the CAVElib (until you tell me otherwise :-)).
> >
> > Ideally, I would like to build a Performer node from the data provided
> > by the VirtualHand toolkit, and let the VirtualHand toolkit handle the
> > kinematics. As I see it now, this results in (roughly) two problems:
> > - How do I tell the VirtualHand toolkit to construct a human hand (as
> > they call it) without a tracking sensor, i.e. from only a glove?
> > - How do I use the VirtualHand data to build a Performer node? One
> > possibility seems to be to use the OpenGL display lists which are
> > generated by the VirtualHand toolkit (see the sketch below). Or not?
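> >
> > If the display lists pan out, one untested idea (sketched below) would be
> > to hang a draw callback on a dummy Performer node and call the list there;
> > the display-list id and everything VirtualHand-specific in the sketch is
> > only an assumption:
> >
> >   /* Untested sketch: issue an OpenGL display list from a Performer draw
> >    * callback. 'hand_dlist' stands in for whatever id the VirtualHand
> >    * toolkit would hand us; that part is an assumption. In a multiprocess
> >    * Performer app the list must have been built in the draw process. */
> >   #include <Performer/pf.h>
> >   #include <GL/gl.h>
> >
> >   static GLuint hand_dlist;         /* assumed: filled in elsewhere     */
> >
> >   static int drawHand(pfTraverser *trav, void *data)
> >   {
> >       glCallList(hand_dlist);       /* draw the toolkit's hand geometry */
> >       return PFTRAV_CONT;           /* let Performer continue traversal */
> >   }
> >
> >   pfGroup *makeHandNode(void)
> >   {
> >       pfGroup *grp = pfNewGroup();
> >       /* run drawHand during the draw traversal of this node */
> >       pfNodeTravFuncs((pfNode *)grp, PFTRAV_DRAW, drawHand, NULL);
> >       return grp;
> >   }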
> >
> > Thanks a lot
> > Raymond
> >
> >
> > BTW, I am using VirtualHand Suite v1.0.
> >
> >
> > --
> >
> > Raymond W. de Vries
> > VR Consultant/Programmer - CAVE / Highend Visualization
> >
> > SARA Computing Services Amsterdam
> > The Netherlands
> >
> > r a y m o n d ++at++ s a r a . n l
> > http://www.sara.nl/
> >
>
> -----------------------------------------------------------------------
> List Archives, FAQ, FTP: http://www.sgi.com/software/performer/
> Submissions: info-performer++at++sgi.com
> Admin. requests: info-performer-request++at++sgi.com