frerichs@ux1.cso.uiuc.edu (David J Frerichs) (03/11/91)
Next to our booth at EOH, a couple of BioEng students had rigged up a Power Glove to an IBM PC using only a parallel port (or so it was said; they weren't exactly hardware gurus, but I saw no evidence of any extra circuitry). They were using the sonic position sensors to read in the vector of your hand and translating it to keystroke input: move your arm in a certain sequence and it would run a program.

I kept asking them how they decoded the signals from the glove to the port. I suppose they never understood exactly what my question was, because each time they stammered back about there being a 6502 inside the glove and that being where the signals came from. Eventually I gave up, partly because I was getting frustrated, and partly because the resolution of the glove is piss poor anyway and not suited for useful interaction (IMHO). Come to think of it, I don't think they had conquered the decoding of gestures, since their demo didn't involve moving the fingers to accomplish anything. Oh, well...

ADDENDUM: There was a slight error in the posting from the person who came to EOH to see Future Vision Technologies' unit. The normal display device of their system IS a dual-screen, head-mounted headset. The "look in" display was only for the public exhibition. (You can't have 5000 people putting on and taking off the goggles all day.) The sensory-isolation effect is much better with the headset.

[dfRERICHS                           University of Illinois, Urbana
 Designing VR systems that work...   Dept. of Computer Engineering
 Networked VR.                       IEEE/SigGraph
    _   _   _                        frerichs@ux1.cso.uiuc.edu
 _/_\__/_\__/_\_                     frerichs@well.sf.ca.us
  \_/ \_/ \_/                                                    ]
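[A rough sketch of the scheme the students described, for the curious: quantize the coarse (x, y) hand position from the glove's sonic triangulation into grid zones, then match the sequence of zones against a table of gestures that map to commands. The zone size, gesture table, and function names are all illustrative assumptions on my part, not their actual code.]

```python
# Sketch of a "move your arm in a sequence -> run a program" decoder.
# ZONE, GESTURES, zone(), and decode() are hypothetical names; the real
# Power Glove wiring and readout are not shown here.

ZONE = 64  # coarse quantization step, since the glove's resolution is poor


def zone(x, y):
    """Collapse a raw position sample into a coarse grid cell."""
    return (x // ZONE, y // ZONE)


# Hypothetical gesture table: a sequence of grid cells -> a command string.
GESTURES = {
    ((0, 0), (1, 0), (2, 0)): "run demo.exe",  # sweep the hand to the right
    ((2, 2), (2, 1), (2, 0)): "quit",          # sweep the hand upward
}


def decode(samples):
    """Deduplicate consecutive zones, then look the sequence up."""
    zones = []
    for x, y in samples:
        z = zone(x, y)
        if not zones or zones[-1] != z:
            zones.append(z)
    return GESTURES.get(tuple(zones))
```

Deduplicating consecutive identical zones is what makes jittery, low-resolution samples tolerable: only a change of cell advances the gesture.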