garye@hpdsla.HP.COM (Gary Ericson) (12/09/88)
A few months ago, IBM announced it was working on something it called the Paper-Like Interface (PLI) project. To quote InfoWorld: "It uses an electronic pen to write on the transparent [digitizing] tablet, which lies directly on an LCD screen, combining the functions of a keyboard and a monitor, according to project manager Jim Rhyne. The computer was trained to recognize handwritten characters and symbols, allowing users to perform tasks faster than possible with either a mouse or keyboard, IBM said."

The idea of this kind of interface has been around for a long time. Have there been any human factors studies done on this type of technology? Specifically:

- would a user consider this kind of interface more intuitive than, say, a mouse and keyboard, as some have claimed? is it "faster"?
- what are the ergonomic effects of looking *down* at a tablet instead of *straight ahead* at a screen for an extended period of time?

Gary Ericson - Hewlett-Packard, Technical Systems Division
phone: (408)746-5098   mailstop: 101N   email: gary@hpdsla9.hp.com
bradb@ai.toronto.edu (Brad Brown) (12/10/88)
RE IBM's announcement of a "paper-like" interface: I don't know very much about it, but the latest issue of ComputerWorld (a trade rag) that crossed my desk carried a blurb about a new product from Wang Information Systems that has an LCD pad that you write on with a stylus. It was not clear what kind of software it ran or whether it was able to do full general character recognition, but it's a real product... A few months ago I heard of a similar product from another company, can't remember who, that tried to accomplish the same thing. Both systems seem to be very expensive, and it's not clear to me that the technology is there to make them work really well.

Q: How many people would really want an interface like this? I would love to draw on paper for things like drawings and equations, and it would probably be very nice for menu selections if the menus would change as you touched them. I don't really think, however, that a stylus would be better for text input -- in my case, I type *much* faster than I write, and my writing is not very good. My typing is so much better than my writing that I prefer to compose text directly into a word processor, where one of the advantages is that I can look at my notes or a paper while I type, letting the word processor deal with word wrap and stuff...

(-: Brad Brown :-)
bradb@ai.toronto.edu
nick@hp-sdd.hp.com (Nick Flor) (12/13/88)
In article <88Dec10.134912est.10521@ephemeral.ai.toronto.edu> bradb@ai.toronto.edu (Brad Brown) writes:
>Q: How many people would really want an interface like this? I would
>love to draw on paper for things like drawings and equations, and it
>would probably be very nice for menu selections if the menus would
>change as you touched them. I don't really think, however, that a
>stylus would be better for text input -- in my case, I type *much*
>faster than I write and my writing is not very good. My typing is
>so much better than my writing that I prefer to compose text directly
>into a word processor, where one of the advantages is that I can look
>at my notes or a paper while I type, letting the word processor deal
>with word wrap and stuff...
>
> (-: Brad Brown :-)
> bradb@ai.toronto.edu

Well, maybe not for text input, but you have to remember that typing is so much slower than thinking. If the machine that supports the stylus input is very good at recognizing patterns, then you could input your thoughts in shorthand, or some concept-driven alphabet, or more generally, whatever minimizes the action specification for representing a thought on the stylus device. (Read ch. 3 (Cog. Eng.) by D. Norman in the UCSD book for an excellent (and I'm not saying this just because he reads this group) model of the 7 stages of user activity.)

Of course, we can read faster than we can talk, so, at least in the case of shorthand, we don't have to display our data in the format we input it.

Nick
--
+ Disclaimer: The above opinions are my own, not necessarily my employer's. +
+ Oh, sure. Sure. I'm going. But I got  | Nick V. Flor         * * o * * +
+ your number, see? And one of these    | Hewlett Packard SDD  * */I\* * +
+ days, the joke's gonna be on you.     | ..hplabs!hp-sdd!nick * */ \* * +
karl@ficc.uu.net (karl lehenbauer #) (12/14/88)
In article <88Dec10.134912est.10521@ephemeral.ai.toronto.edu>, bradb@ai.toronto.edu (Brad Brown) writes:
> RE IBM's announcement of a "paper-like" interface:
> Q: How many people would really want an interface like this? I would
> love to draw on paper for things like drawings and equations, and it
> would probably be very nice for menu selections if the menus would
> change as you touched them. I don't really think, however, that a
> stylus would be better for text input -- in my case, I type *much*
> faster than I write and my writing is not very good. My typing is
> so much better than my writing ...

An obvious answer is to draw a typewriter keyboard on your LCD display and use its touch sensitivity to determine what "keys" you type. This approach could work with the VIVED helmet/data glove as well.

I would like to see an LCD/touchscreen setup the size of an office whiteboard -- totally interfaced to our development systems, of course, multiwindowed, with all the character recognition capabilities discussed, both to turn entered data into a nice high-content-proportional-to-number-of-bits form (ASCII text, rather than a bitmap or something), plus it would clean up my penmanship :-)
--
-- uunet!ficc!karl
bradb@ai.toronto.edu (Brad Brown) (12/15/88)
In article <2442@ficc.uu.net> karl@ficc.uu.net (karl lehenbauer #) writes:
>In article <88Dec10.134912est.10521@ephemeral.ai.toronto.edu>, bradb@ai.toronto.edu (Brad Brown) writes:
>> RE IBM's announcement of a "paper-like" interface:
>
>> Q: How many people would really want an interface like this? ...
>> I don't really think, however, that a stylus would be better for
>> text input ...
>
>An obvious answer is to draw a typewriter keyboard on your LCD display
>and use its touch sensitivity to determine what "keys" you type.

That's not a very good answer -- have you ever tried to type on a membrane keyboard (one with no tactile feedback)? It's not easy or accurate. Perhaps the keyboard will start to go away in machines like executive workstations and shop-floor computers, where it only gets in the way anyway, but for people who have to work with text, be it words or program code, I think the keyboard is the better way to go, perhaps *aided* by touch-sensitive input devices. Now if we could only make a _better_ keyboard so everyone would be happy...

(-: Brad Brown :-)
bradb@ai.toronto.edu
garye@hpdsla.HP.COM (Gary Ericson) (12/16/88)
> An obvious answer is to draw a typewriter keyboard on your LCD display
> and use its touch sensitivity to determine what "keys" you type.

I don't think that would work as well. Someone I know did an informal study on touch-typing and found that you couldn't vary the standard keyboard size and feel by very much at all before typing fell apart. I don't think the display would have the appropriate tactile feedback to make typing as efficient as on a standard keyboard. Maybe with practice, however.

> This approach could work with the VIVED helmet/data glove as well.

This I like...

> I would like to see an LCD/touchscreen setup the size of an office
> whiteboard ...

I like this too. What about making it swivel so you could also sit down at it as at a drafting table and lean on it? I wonder how long you could work at something like that (vertical whiteboard or drafting table) without tiring or getting a sore back/neck.

> -karl lehenbauer
---------
Gary Ericson - Hewlett-Packard, Technical Systems Division
phone: (408)746-5098   mailstop: 101N   email: gary@hpdsla9.hp.com
rwl@uvacs.cs.Virginia.EDU (Ray Lubinsky) (12/19/88)
In article <2442@ficc.uu.net>, karl@ficc.uu.net (karl lehenbauer #) writes:
> An obvious answer is to draw a typewriter keyboard on your LCD display
> and use its touch sensitivity to determine what "keys" you type.

Please, no! Have you ever used a membrane keyboard (a la Atari 400, long ago in a home computer market far away)? Yuck! Absolutely no tactile feedback -- drives a touch typist crazy.

> This approach could work with the VIVED helmet/data glove as well.

Now we're talking, if you can include some kind of force feedback within the glove. Then you can have a virtual keyboard that feels like whatever kind of keyboard you're comfortable with; with the helmet, I guess you could make the keyboard *look* like whatever you wanted, too. If I were feeling nostalgic, perhaps I'd make it look like my Dad's old c. 1940 Olivetti manual typewriter....
--
| Ray Lubinsky,             UUCP:   ...!uunet!virginia!uvacs!rwl        |
| Department of             BITNET: rwl8y@virginia                      |
| Computer Science,         CSNET:  rwl@cs.virginia.edu -OR-            |
| University of Virginia            rwl%uvacs@uvaarpa.virginia.edu      |
don@brillig.umd.edu (Don Hopkins) (12/19/88)
Membrane keyboards just don't cut it. What I've always wanted is a real mechanical keyboard with a little dynamically programmable LCD (or whatever) display on *each* keytop. I think it'll be a while till such a keyboard could be manufactured, let alone mass-produced reliably and cheaply. (I could be wrong, I hope!) Mechanical keyboards receive a *lot* of abuse. (I just cleaned out an old keyboard that practically had fur mats (maybe the cat's been sleeping on it...))

When you press the caps lock key, all the letters should shift to upper case. When you're typing into a numeric field, everything but the numbers and editing keys should be dimmed. Context-sensitive function keys with dynamic labels. You should be able to look at the keyboard to see what font you're typing in. It would be great for typing in funny symbols and big fonts like Kanji.

I think Bell Labs did some neat stuff using a half-silvered mirror above a keyboard to overlay dynamically changing graphics and text on the keys. The keyboard and your hands were beneath the mirror, which reflected an inverted video image at you, so the graphics appeared to float above your hands. That way your fingers didn't get in the way of seeing the key labels. I could dig up a reference if anybody's interested. But I still want something I can put in my lap.

-Don
bertrand@cui.UUCP (IBRAHIM Bertrand) (12/19/88)
>>> RE IBM's announcement of a "paper-like" interface:
>>
>>> Q: How many people would really want an interface like this? ...
>>> I don't really think, however, that a stylus would be better for
>>> text input ...
>>
>>An obvious answer is to draw a typewriter keyboard on your LCD display
>>and use its touch sensitivity to determine what "keys" you type.
>
>That's not a very good answer -- have you ever tried to type on a membrane
>keyboard (one that has no tactile feedback) before? It's not easy or
>accurate. Perhaps the keyboard will start to go away in machines like
>executive workstations and shop-floor computers, where it only gets in
>the way anyway, but in the case of people who have to work with text,
>be it words or program code, I think the keyboard is the better way to
>go, perhaps *aided* by touch-sensitive input devices. Now if we could
>only make a _better_ keyboard so everyone would be happy...

Here at the University of Geneva, we built such an interface more than 15 years ago, based on a plasma display and an infrared "touch panel". The display was square (about 8 inches per side) with 512x512 pixels, and the input device had a resolution of 160x160. Any opaque object could be used to point at the screen (including a finger), and the position could be sampled 50 times per second, allowing complete tracking of the movements of the finger.

The equipment also had a regular keyboard attached to it. So, for text-intensive applications like text editors, we used the keyboard for character input and the "touch panel" for cursor positioning and text selection with the finger in a very natural way. I think that having to use a stylus makes things much less comfortable, since you have to pick it up to make a selection and drop it to type on the keyboard.

For graphic-intensive applications like graphic editors, where you had very little text to enter but mainly freehand drawing or rubberbanding, we used a simulated keyboard drawn at the bottom of the screen.
Typing was not very fast (about two characters per second) but acceptable.

An interesting consequence of the tight coupling of the input and output devices was the disappearance of the notion of a cursor (for graphics). The finger WAS the cursor. Another interesting point was the use we made of the finger-up <-> finger-down transitions to build a more user-friendly interface.

For those who are interested in getting more information about it, I can send hardcopies of papers published on this.

Bertrand Ibrahim
email:       to BITNET                                 to EAN
from BITNET  IBRAHIM@CGEUGE51.BITNET                   bertrand%cui.unige.ch@CERNVAX
ARPA         IBRAHIM%CGEUGE51.BITNET@CUNYVM.CUNY.EDU   bertrand%cui.unige.ch@ubc.csnet
UUCP         cernvax!cui!bertrand.uucp
UUCP         mcvax!cernvax!cui!bertrand.uucp
UUCP         cui!bertrand@cernvax.UUCP
EAN          bertrand@cui.unige.ch
JANET        bertrand%cui.unige.ch@uk.ac.ean-relay
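The Geneva setup described above pairs a coarse 160x160 infrared sensing grid with a 512x512-pixel display, sampled 50 times per second, and derives events from finger-up/finger-down transitions. As a rough illustration of what that implies, here is a minimal Python sketch; the function names, the center-of-cell scaling, and the sample format are assumptions for illustration, not the original software.

```python
# Hypothetical sketch of the coordinate mapping and event detection the
# Geneva touch panel implies: a 160x160 IR grid over a 512x512 display,
# sampled at 50 Hz, with finger-down/finger-up transitions as events.

PANEL_RES = 160
DISPLAY_RES = 512

def panel_to_pixel(px, py):
    """Map a touch-panel grid cell to the display pixel at its center."""
    scale = DISPLAY_RES / PANEL_RES  # 3.2 display pixels per grid cell
    return (int(px * scale + scale / 2), int(py * scale + scale / 2))

def detect_transitions(samples):
    """Turn a 50 Hz stream of (touching, x, y) samples into
    finger-down / finger-up events, as the post describes."""
    events = []
    was_down = False
    for touching, x, y in samples:
        if touching and not was_down:
            events.append(("down", panel_to_pixel(x, y)))
        elif not touching and was_down:
            events.append(("up", None))
        was_down = touching
    return events
```

With roughly 3 display pixels per sensing cell, the finger "is" the cursor only down to cell resolution, which fits the post's note that typing on the simulated keyboard was slow but workable.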
karl@ficc.uu.net (karl lehenbauer #) (12/20/88)
> In article <2442@ficc.uu.net> I wrote:
> >An obvious answer is to draw a typewriter keyboard on your LCD display
> >and use its touch sensitivity to determine what "keys" you type.

In article <88Dec14.210656est.10862@ephemeral.ai.toronto.edu>, bradb@ai.toronto.edu (Brad Brown) writes:
> That's not a very good answer -- have you ever tried to type on a membrane
> keyboard (one that has no tactile feedback) before? It's not easy or
> accurate. ...

I think the problem with membrane keyboards is more one of accuracy than feedback; that is, you hit the keys and the keyboard doesn't register them. A paper-like interface, one that can record where a stylus is, should easily overcome that part of the problem. To provide some feedback, perhaps a click sound could be played, and of course the text would usually be shown as it was typed, somewhere on the "paper." As a simple test, move your hands off the keyboard and try typing on your desk as if it were a keyboard. It seems like it would work OK for me, again provided the paper-like interface were highly accurate and responsive.

Also, it makes custom keyboards and non-standard key arrangements, like Dvorak, really a snap to prototype and implement, and does so without locking anyone into anyone else's keyboard decisions. I understand your complaints, but I do not think they can be construed as an overwhelming indictment of the concept until more work has been done.
--
-- uunet!ficc!karl       "The greatest dangers to liberty lurk in insidious
-- karl@ficc.uu.net       encroachment by men of zeal, well-meaning but without
                          understanding." -- Justice Louis D. Brandeis
garye@hpdsla.HP.COM (Gary Ericson) (12/22/88)
> Here at the University of Geneva, we built such an interface more than
> 15 years ago, based on a plasma display and an infrared "touch panel".
> ... for text-intensive applications like text editors, we used the keyboard
> for character input and the "touch panel" for cursor positioning and text
> selection with the finger in a very natural way.
> For graphic-intensive applications ... we used a simulated keyboard drawn
> at the bottom of the screen.

I'm curious about using the touch panel in this manner. I assume the screen was vertical, meaning that the user would have to reach up and forward to touch the screen. Did users find it tiring or frustrating to suspend the hand in the air for extended periods of time, or does the benefit of direct contact with the graphics/data override this sensation? Have you thought of the possibility of a more horizontal display (with touch panel) instead?

> For those who are interested in getting more information about it, I can send
> hardcopies of papers published on this.

Just in case my email doesn't get there, I'll also request a copy here (my address is below). Thanks for providing it.

> Bertrand Ibrahim

Gary Ericson
Hewlett-Packard, Workstation Technology Division
1266 Kifer Road   Sunnyvale, California 94086   USA
phone: (408)746-5098   mailstop: 101N   email: gary@hpdsla9.hp.com
bradb@ai.toronto.edu (Brad Brown) (12/22/88)
In article <314@cui.UUCP> bertrand@cui.UUCP (IBRAHIM Bertrand) writes:
[... stuff about how drawing text with a stylus is not fun, how touch-sensitive input systems should have a keyboard as well, and how evil it would be to simulate a keyboard on the display ...]
>Here at the University of Geneva, we built such an interface more than 15
>years ago, based on a plasma display and an infrared "touch panel". The
>display was square (about 8 inches per side) with 512x512 pixels, and the
>input device had a resolution of 160x160. Any opaque object could be used
>to point at the screen (including a finger), and the position could be
>sampled 50 times per second, allowing complete tracking of the movements
>of the finger.
>
>The equipment also had a regular keyboard attached to it....
>
>For graphic-intensive applications like graphic editors, where you had very
>little text to enter but mainly freehand drawing or rubberbanding, we used a
>simulated keyboard drawn at the bottom of the screen. Typing was not very
>fast (about two characters per second) but acceptable.
>
>An interesting consequence of the tight coupling of the input and output
>devices was the disappearance of the notion of a cursor (for graphics). The
>finger WAS the cursor. Another interesting point was the use we made of
>the finger-up <-> finger-down transitions to build a more user-friendly
>interface.

This is the kind of thing I was thinking of when I started the complaints about keyboardless systems. My concern was that machines like the new Wang thing or IBM's new offering would be 'sexy' but not fun to use after the first hour or so for people who need to do a lot of work with text. Adding a keyboard and integrating it into the system is the best idea.

Further ideas: Several years ago HP had a version of one of their IBM PC compatible micros (HP-150?) that had a sort-of-touch-sensitive display.
In the moulding around the display were two rows, one vertical and one horizontal, of IR LEDs with receptors. Software could read the state of some circuits that told it whether anything was blocking any of the diodes, and if so, where on the screen the object was. That meant that cursor movement in, say, a word processor could be handled by touching the text location on the screen with a pencil. (HP supplied a version of WordStar that implemented this.) Function keys could be drawn anywhere on the screen, with any contents, and activated by touching. The device could resolve what line you were pointing at (25-line display) and could resolve down to two characters in the horizontal direction. (I think...) The machine was very nice, but innovative hardware dies fast when it tries to compete in the lowest-common-denominator IBM PC world, and this machine never caught on.

Along the same lines, Steve Ciarcia did a column in Byte a few years ago describing a home-brew version of the same thing. I think he was the consulting engineer who designed the HP part, but I can't prove it.

Lastly, major shopping malls have touch-sensitive information kiosks that show little commercials on a display and let you pick from menus of store information. I have seen similar ideas at trade shows.

What *I* would like to see is a flat-panel display that I could sit on my desk at a slight angle behind a detached keyboard. I would like my computer to run a major windowing system (SunOS, X, whatever) and be able to perform "mouse" operations by running my finger over the display. That makes my machine compatible with existing software and still lets me take advantage of the touch-sensitive input. Then I can enjoy all these new features while I wait for the idea to catch on and spawn lots of nice software that really takes advantage of it. ;-)

(-: Brad Brown :-)
bradb@ai.toronto.edu   bradb@ai.utoronto.ca
ralphw@ius3.ius.cs.cmu.edu (Ralph Hyre) (12/23/88)
On keeping input bandwidth high with a paper-like interface:
>> [I type] faster than I write and my writing is not very good. My typing is
>> so much better than my writing ...
>An obvious answer is to draw a typewriter keyboard on your LCD display
>and use its touch sensitivity to determine what "keys" you type.

But then some people will claim they need tactile feedback to type effectively. I think you'll need a 'real' keyboard for a while. I believe that people will adapt to whatever input modality they use the most, so handwriting speed will improve with practice, eventually surpassing typing. [Don't count on anyone except the computer being able to recognize the chicken-scratch, though. I'll bet that even the user won't be able to read it back the next day.]
--
- Ralph W. Hyre, Jr.
Internet: ralphw@ius3.cs.cmu.edu   Phone: (412) CMU-BUGS
Amateur Packet Radio: N3FGW@W2XO, or c/o W3VC, CMU Radio Club, Pittsburgh, PA
"You can do what you want with my computer, but leave me alone!8-)"
bertrand@cui.UUCP (IBRAHIM Bertrand) (01/04/89)
>> Here at the University of Geneva, we built such an interface more than
>> 15 years ago, based on a plasma display and an infrared "touch panel".
>
>I'm curious about using the touch panel in this manner. I assume the screen
>was vertical, meaning that the user would have to reach up and forward to
>touch the screen. Did users find it tiring or frustrating to suspend the
>hand in the air for extended periods of time, or does the benefit of direct
>contact with the graphics/data override this sensation? Have you thought of
>the possibility of a more horizontal display (with touch panel) instead?
>
>Gary Ericson

In fact, the first prototype, built in 1973, was vertical and was mainly used for text processing. The finger input was used for text selection and menu entry. I asked our secretary, who was the main user of that equipment, and she told me that she didn't find it tiring, since it was only from time to time that she had to raise her arm to the screen.

Another prototype, built in 1975, was horizontal and integrated into a flat table. This one was mainly used for graphic applications. The user could easily rest his arm on the table while pointing at the screen with a finger. The keyboard could sit on the side of the screen or on the user's lap. There was enough room on the table for any paper, drawing, or listing the user could need. The equipment was used for freehand drawing as well as for technical drawing. Having used this equipment quite a lot myself, I felt that it might have been a little more comfortable if the surface of the screen was not completely horizontal but rather at a slight angle.

Bertrand Ibrahim
email:       to BITNET                                 to EAN
from BITNET  IBRAHIM@CGEUGE51.BITNET                   bertrand%cui.unige.ch@CERNVAX
ARPA         IBRAHIM%CGEUGE51.BITNET@CUNYVM.CUNY.EDU   bertrand%cui.unige.ch@ubc.csnet
UUCP         cernvax!cui!bertrand.uucp
UUCP         mcvax!cernvax!cui!bertrand.uucp
UUCP         cui!bertrand@cernvax.UUCP
EAN          bertrand@cui.unige.ch
JANET        bertrand%cui.unige.ch@uk.ac.ean-relay
rfarris@serene.UUCP (Rick Farris) (01/06/89)
In an article, Gary Ericson asks:
> I'm curious about using the touch panel in this manner. I assume the
> screen was vertical, meaning that the user would have to reach up and
> forward to touch the screen. Did users feel it is tiring or
> frustrating to suspend the hand in the air for extended periods of
> time, or does the benefit of direct contact with the graphics/data
> override this sensation?

Yes, yes, YES. Both Tektronix and Hewlett-Packard have burdened users of their test instruments with touch screens. And yes, your arm gets darn tired. Also, after a couple of hours of use, the screen is so dirty and greasy from all those finger presses that you can barely read it anymore. Touch screens that are meant to be both touched and read are a big lose. Even the parts of the screen that don't have touch buttons get greasy from bracing your hands.

Touch might be interesting as a replacement for a digitizing tablet, though.

Rick Farris   RF Engineering   POB M   Del Mar, CA 92014   voice (619) 259-6793
rfarris@serene.cts.com   ...!uunet!serene!rfarris   serene.UUCP 259-7757
kmont@hpindda.HP.COM (Kevin Montgomery) (01/10/89)
/ hpindda:comp.cog-eng / rfarris@serene.UUCP (Rick Farris) / 7:39 pm Jan 5, 1989 /
> Yes, yes, YES. Both Tektronix and Hewlett Packard have burdened
> users of their test instruments with touch screens. And yes, your
> arm gets darn tired. Also, after a couple of hours of use, the
> screen is so dirty and greasy from all those finger presses ...

Rick -- how about using a capped pen (or some other such device) to point instead of your fingers, if they're dirty? Agreed, having to keep one's arm raised, Tai-Chi style, is no fun, but I still think color flat touch screens are the way to go. Color, because of the obvious advantages of multiple highlighting and the like. Flat, because with the current environments (read: windows) no matter how big your screen is, it ain't big enough, and CRT displays have an exponential cost-to-size ratio, whereas flat (LCD) screens have a more linear one (not to mention that flat screens take up MUCH less space). Touch- (or stylus-) sensitive for the reasons discussed -- namely, a direct mapping from the tactile work area to the abstract work area (whereas with mice/keyboards the tactile area is different from the abstract area (window) and can cause confusion). And definitely adjustable for comfort.

Such projects are in the research stage now, with development probably in the early 90s. My guess would be that some startup will hit the PC market with a color LCD touch display sometime in the next couple of years, with the larger companies following. After some time, workstation interfaces will also be available (probably with the same speed with which the windowing/big-screen environment took over in the last 5 years or so). (generalized prediction/opinion)

What does everyone think of having a touchscreen keyboard if the application isn't very typing-intense? Seems that a color, flat, horizontal touch display could quite easily pop up a keyboard window for some short typing when necessary, then make it go away when not needed. Surely there have been discussions of the agonies of touch-typing in this group in the past -- could someone post the issues and verdicts?

kevin
garye@hpdsla.HP.COM (Gary Ericson) (01/11/89)
> Agreed, having one keep
> one's arm raised, Tai-Chi style, is no fun, but I still think color
> flat touch screens are the way to go.

Assuming you use a technology that doesn't require you to keep your arm out of the way (i.e., infra-red beams would be interrupted by your arm if you set it down on the screen), I was talking with someone about how it might be possible to distinguish between a small touched area (pen/finger point) and a large touched area (side of hand or forearm resting on the screen). If you can identify the pen/finger point, maybe you can let the user rest his arm or hand or elbow on the screen for support. This would make it much easier to use, at least for a semi-horizontal screen.

> What does everyone think of having a touchscreen keyboard if the
> application isn't very typing-intense? Seems that a color, flat, horizontal,
> touch display could quite easily pop up a keyboard window for some short
> typing when necessary, then make it go away when not needed.

For me, if it were a small amount of text and I already had a pen in hand for pointing, I think I'd rather write out the text in longhand on the screen with the pen. Typing on a non-tactile-feedback screen would be more tedious to me than writing the words out by hand.

> Surely there's
> been discussions of the agonies of touch-typing in this group in the past --
> could someone post the issues and verdicts?

I think the conclusion has been that the lack of tactile feedback would make it difficult to be really efficient on a keyboard window. Some people say they would abhor the thing, while others wouldn't mind using it (at least for small amounts of text).

> kevin

Gary Ericson - Hewlett-Packard, Workstation Technology Division
phone: (408)746-5098   mailstop: 101N   email: gary@hpdsla9.hp.com
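The small-area vs. large-area distinction suggested above (pen/finger point vs. resting palm or forearm) can be sketched very simply: classify each sensed contact by its touched area and keep only the small ones as pointer input. This is an illustrative Python sketch under assumed names; the threshold value and the (x, y, area) contact representation are my own assumptions, not from any of the systems discussed.

```python
# Minimal sketch of contact-size discrimination ("palm rejection"):
# small touched regions count as intentional pen/finger points,
# large regions (side of hand, forearm) are ignored.

AREA_THRESHOLD_CELLS = 12  # assumed cutoff, in sensor grid cells

def classify_contacts(contacts):
    """contacts: list of (x, y, area_in_cells) blobs from the sensor.
    Returns only the small contacts, i.e. the ones to treat as
    pointer input; larger blobs are dropped as resting limbs."""
    pointers = []
    for x, y, area in contacts:
        if area <= AREA_THRESHOLD_CELLS:
            pointers.append((x, y))
    return pointers
```

In practice the cutoff would have to be tuned to the sensor's cell size, and a real system might also use blob shape or dwell time, but the area test alone already captures the idea in the post.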
reggie@pdn.UUCP (George W. Leach) (01/11/89)
In article <3500002@hpindda.HP.COM> kmont@hpindda.HP.COM (Kevin Montgomery) writes:
> What does everyone think of having a touchscreen keyboard if the
>application isn't very typing-intense? Seems that a color, flat, horizontal,
>touch display could quite easily pop up a keyboard window for some short
>typing when necessary, then make it go away when not needed. Surely there's
>been discussions of the agonies of touch-typing in this group in the past-
>could someone post the issues and verdicts?

Rather than a horizontal display, why not allow for angling the display as with a drafting table? Hunching over a flat surface can be fatiguing as well.
--
George W. Leach                    Paradyne Corporation
..!uunet!pdn!reggie                Mail stop LG-129
Phone: (813) 530-2376              P.O. Box 2826
                                   Largo, FL USA 34649-2826
gln@arizona.edu (Gary L. Newell) (01/14/89)
In article <2690009@hpdsla.HP.COM>, garye@hpdsla.HP.COM (Gary Ericson) writes:
> Assuming that you use a technology that doesn't require you to keep your arm
> out of the way (i.e., infra-red lines would be interrupted by your arm if
> you set it down on the screen), I was talking with someone about how it
> [...]
> For me, if it was a small amount of text and I already had a pen in hand for
> pointing, I think I'd rather write out the text in longhand on the screen
> with the pen. Typing on a non-tactile-feedback screen would be more tedious
> to me than writing the words out by hand.
> Gary Ericson - Hewlett-Packard, Workstation Technology Division

There has been quite a bit of work at IBM T.J. Watson Research in the last two years on gestural interfaces. The use of a transparent tablet over a flat display seems to be optimal for such a paper/pen-like interface. It eliminates the problems of determining the precise points being indicated on the screen. Applications in editing, spreadsheets, and information-processing problems generally are being looked at. I have a number of references for anyone interested.

Also, note that a keyboard is only preferable to a stylus for entering text if you are dealing with a small-alphabet language (English, proofreading symbols, etc.) -- not for languages like Japanese or Chinese. For these languages a good gestural interface would be superior, and many foreign companies are doing work in this area.

Much more work is needed in this area, especially in hardware design (tablet technology is not quite up to the task yet) and recognition algorithms (gestural recognition has subtle differences from 'normal' character recognition problems, and new algorithms are likely to be needed if reasonable recognition results are to be gained).

gary newell
warner@s3snorkel.ARPA (Ken Warner) (01/16/89)
In article <8710@megaron.arizona.edu> gln@arizona.edu (Gary L. Newell) writes:
>In article <2690009@hpdsla.HP.COM>, garye@hpdsla.HP.COM (Gary Ericson) writes:
>> Assuming that you use a technology that doesn't require you to keep your arm
>> out of the way (i.e., infra-red lines would be interrupted by your arm if

This is my main gripe with the current set of pointing devices. You have to use your whole arm to move an essentially weightless cursor. After a day of mousing around, I have a knot in my shoulder that can really hurt. Also, my back is tired from the asymmetrical position necessary to balance one extended arm.

>> For me, if it was a small amount of text and I already had a pen in hand for
>> pointing, I think I'd rather write out the text in longhand on the screen
>> with the pen. Typing on a non-tactile-feedback screen would be more tedious
>
>There has been quite a bit of work at IBM T.J. Watson Research in the last
>two years on Gestural Interfaces. The use of a transparent tablet over a
>flat display seems to be optimal for such a paper/pen like interface. It
>eliminates the problems of determining precise points being indicated on
>the screen. ....

What about the problem of parallax? The separation between the actual working surface and the pen is a small but finite amount, and the medium of separation is refractive. Unless one's eyes are within a certain cone (I dunno what to call it... perhaps of workability), it would be similar to trying to poke a fish in water. This is not much of an error to compensate for, but over time it could add up to a lot of extra work.

Ken Warner
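The size of the parallax worry above is easy to estimate: with a gap d between the writing surface and the display plane, a viewer whose line of sight is tilted by angle theta from the screen normal sees the pen tip displaced by roughly d * tan(theta). The sketch below uses this simple air-gap model and ignores refraction in the glass (which would shrink the figure somewhat); the function name and example numbers are illustrative.

```python
# Rough estimate of apparent pen-tip displacement due to parallax:
# a gap d between stylus tip and display plane, viewed at an angle
# theta off the screen normal, shifts the apparent position by
# about d * tan(theta). Simple air-gap model, refraction ignored.

import math

def parallax_offset(gap, view_angle_deg):
    """Apparent lateral displacement, in the same units as gap."""
    return gap * math.tan(math.radians(view_angle_deg))
```

For a gap on the order of a fifth of an inch viewed 30 degrees off-normal, the model gives an offset of roughly a tenth of an inch -- small, but a few character widths on a dense display, which is why it matters for pointing accuracy.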
gln@arizona.edu (Gary L. Newell) (01/16/89)
In article <911@scubed.UUCP>, warner@s3snorkel.ARPA (Ken Warner) writes:
> >There has been quite a bit of work at IBM T.J. Watson Research in the last
> >two years on Gestural Interfaces. The use of a transparent tablet over a
> >flat display seems to be optimal for such a paper/pen like interface. It
>
> What about the problem of parallax?

Good question - it is a major problem, but one that might be overcome by new
hardware advances in the not-too-distant future. In the short term, and in
experimental work, a reasonable alternative is to have the user indicate the
viewing angle (by touching three or more displayed points, for example) and
use a linear transform for the electronic inking. Of course, this
compensation goes out the window if the user moves...

I'm not too familiar with state-of-the-art transparent tablets, but I know
that some in use have a distance of 0.177 of an inch between the display
plane and the stylus tip - if this distance can be decreased further, it
would seem a viable input method.

gary newell
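The calibration Newell describes can be sketched as solving for an affine transform: three touched points and their true on-screen targets give six equations in the six coefficients, which then correct every subsequent pen position. A minimal sketch in pure Python (the point values are illustrative, not from the thread):

```python
def solve_affine(touched, actual):
    """Given three (x, y) points as reported by the tablet and the three
    true display points the user was asked to touch, return coefficients
    (a, b, c, d, e, f) of the affine map
        x' = a*x + b*y + c,   y' = d*x + e*y + f.
    Three non-collinear correspondences determine the six unknowns."""
    def solve3(rows, rhs):
        # Gaussian elimination with partial pivoting on a 3x3 system.
        m = [row[:] + [r] for row, r in zip(rows, rhs)]
        for col in range(3):
            pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
            m[col], m[pivot] = m[pivot], m[col]
            for r in range(col + 1, 3):
                f = m[r][col] / m[col][col]
                for c in range(col, 4):
                    m[r][c] -= f * m[col][c]
        x = [0.0] * 3
        for r in range(2, -1, -1):
            x[r] = (m[r][3] - sum(m[r][c] * x[c]
                                  for c in range(r + 1, 3))) / m[r][r]
        return x
    rows = [[x, y, 1.0] for x, y in touched]
    a, b, c = solve3(rows, [p[0] for p in actual])
    d, e, f = solve3(rows, [p[1] for p in actual])
    return a, b, c, d, e, f

def apply_affine(coeffs, pt):
    """Correct a raw pen position using the calibrated transform."""
    a, b, c, d, e, f = coeffs
    x, y = pt
    return a * x + b * y + c, d * x + e * y + f

# Suppose parallax shifts every reported point by (+3, -2) tablet units:
coeffs = solve_affine([(3, -2), (103, -2), (3, 98)],
                      [(0, 0), (100, 0), (0, 100)])
print(tuple(round(v, 6) for v in apply_affine(coeffs, (53, 48))))  # -> (50.0, 50.0)
```

As Newell notes, this only compensates for one fixed head position; if the user moves, the three-point calibration must be repeated.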
kent@lloyd.camex.uucp (Kent Borg) (01/17/89)
In article <911@scubed.UUCP> warner@s3snorkel.UUCP (Ken Warner) writes:
>This is my main gripe with the current set of pointing devices. You have to
>use your whole arm to move an essentially weightless cursor. After a day of
>mousing around, I have a knot in my shoulder that can really hurt. Also my
>back is tired from the asymmetrical position necessary to balance one
>extended arm.

A mouse is a mouse is a mouse, right? No. How your computer system
interprets the pulses from the mouse can make a BIG difference. On a
Macintosh I have the mouse tracking set to the fastest speed, and I can move
from corner to opposite corner of a 19" screen without ever moving the heel
of my hand. I'm far too lazy to move my arm, but it's all in the fingers if
your computer is well enough designed.

Kent Borg
kent@lloyd.uucp or hscfvax!lloyd!kent
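The tracking setting Borg describes is typically implemented as a velocity-dependent gain: slow hand motion maps nearly 1:1 for precision, while fast motion is multiplied up so a small finger flick crosses the whole screen. A minimal sketch (the threshold and gain values are illustrative assumptions, not the Macintosh's actual curve):

```python
def accelerated_delta(counts, slow_gain=1.0, fast_gain=4.0, threshold=8):
    """Map raw mouse counts from one polling interval to cursor pixels.
    Movements below `threshold` counts use slow_gain (precise placement);
    faster movements use fast_gain, so the same small patch of desk can
    cover the whole screen when the hand moves quickly."""
    gain = fast_gain if abs(counts) >= threshold else slow_gain
    return int(counts * gain)

print(accelerated_delta(3))   # slow, precise motion: 3 pixels
print(accelerated_delta(20))  # fast flick: 80 pixels
```

Real drivers usually smooth the transition with a continuous curve rather than a single threshold, but the step version shows why "fastest tracking" lets the fingers do all the work.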
kmont@hpindda.HP.COM (Kevin Montgomery) (01/17/89)
/ hpindda:comp.cog-eng / gln@arizona.edu (Gary L. Newell) / 2:10 pm Jan 15, 1989 /
> I'm not too familiar with state of the art transparent tablets, but I know
> that there are some in use that have a distance of 0.177 of an inch
> between the display plane and the stylus tip - if this distance is
> decreased even more, it would seem a viable input method.

Hmmmm. "What if...."

Stay tuned, kids...
kmont@hpindda.HP.COM (Kevin Montgomery) (01/18/89)
In article <911@scubed.UUCP> warner@s3snorkel.UUCP (Ken Warner) writes:
>This is my main gripe with the current set of pointing devices. You have to
>use your whole arm to move an essentially weightless cursor. After a day of
>mousing around, I have a knot in my shoulder that can really hurt. Also my
>back is tired from the asymmetrical position necessary to balance one
>extended arm.

I think Niels Mayer mentioned a while back that he uses a trackball instead
of a mouse (it has the same interface - HP-HIL). That way he can position
with his fingers, and he finds the granularity (sensitivity) of motion to be
much better. (Let's face it - the shoulder muscles weren't meant for precise
movement!)

kevbop