carm@tove.umd.edu (Richard Chimera) (02/24/90)
In article <889@lclark.UUCP> miller@lclark.UUCP (John Miller) writes:
>Concerning a system that gives the speaker automatic feedback on the mood
>of his/her audience:
>
>I once saw such a system demonstrated for rhetorician Ivan Illich.
>He was horrified and criticized the inventor for creating such a device.
>He felt that this is essentially what Hitler did.  If the speaker can use
>the feedback immediately to modify what is being said, is there any limit
>to the delusions that could be created by the modern politician?

Are you (or Ivan Illich) saying a politician presently doesn't listen to
and watch crowd reactions, then say more about one topic and less about
another?  My political comment is that an uneducated population can be
persuaded/fooled into anything.

Perhaps the issue at hand is that members (I was going to say "persons",
but computerized agents can also be "members" of a group) in a distributed
synchronous group could send mood indications randomly, or different from
what they are really feeling, whereas it is more difficult for a true
crowd at a rally to fake a feeling.  Of course, this "anonymity factor"
has been researched a fair amount and is a factor in groupware systems.

This raises the question: can groupware systems handle the uncooperative
members of a group?  Is CSUW (Computer-Supported Uncooperative Work) a
separate topic?  Here at the Univ. of Md., colleagues of mine are creating
a teaching theater which will have groupware elements in it (given enough
programmer hours :-), and one of the main theoretical issues (in my mind)
is how to deal with uncooperative freshmen whose only goal is to disrupt
the group environment.  Comments?

Rick Chimera
Human-Computer Interaction Laboratory, U of Md