[net.cog-eng] Intro. to net.cog-eng

peterr@utcsrgv.UUCP (Peter Rowley) (08/15/83)

This newsgroup, net.cog-eng (Cognitive Engineering), has been created to
discuss psychology and particularly cognitive science *applied* to the
design of computer systems.  The group is not for discussions of social
or interpersonal psychology, except as they relate to the design of
computer systems, nor for pure cognitive science, which should appear
in net.ai.

A rationale for the field (and this newsgroup) is contained in Donald
Norman's "Learning and Memory", W. H. Freeman, 1982.  The book, a fine
introduction to modern cognitive science, contains the following:

"If you get the chance, examine the control room of an electricity-
 generating nuclear power plant.  Impressive, isn't it?  All those dials
 and knobs, switches and lights.  Did you ever wonder how the plant operators
 manage?  Do wonder. [There's evidence that part of the troubles at Three
 Mile Island was induced by this complexity.]  The design of the control panel
 is a combination of chance, luck, incompatible components, and almost
 complete indifference to the problems of human functioning ... A typical
 control room has more than 100 feet of panels, as many as 3000 meters and
 controls (with some meters requiring ladders or footstools to be read),
 and other meters or alarms located tens of feet away from the relevant 
 controls, all seemingly designed as if to maximize memory load and to
 minimize an operator's ability to match actual plant functioning to an
 internal, mental model of proper functioning."

"There are people who know how to do better, [the human-factors engineers].
 But [equipment designers] tend to dismiss their craft as "mere common sense."
 ... Yet there is more to design than common sense.  It is not a matter of how
 to best design a switch or a meter; it is a matter of trying to understand the
 functioning of the human, of deciding whether a switch or meter ought to be
 present at all.  A design should center on the functions and intentions of
 the human.  It shouldn't force the human to match the arbitrary needs of the
 machine.  System designers should start by considering the user.  All too
 often they start with the machine, and the human is not thought of until
 the end, when it's too late..."

Norman goes on to call ed, the Unix editor, an example of bad cognitive
engineering that creates unnecessary memory loads.  And there are worse
software analogues of the nuclear power plant control-room example.
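
Norman's complaint is easy to see in a short session.  As a rough
illustration (the exact output differs a little from one version of ed to
another), ed announces only a character count when it starts, prints nothing
unless explicitly asked, and answers any mistake with a bare question mark,
leaving the user to remember what state the editor is in:

	$ ed memo
	120			(just the size of the file, in characters)
	p
	meeting moved to 3:00	(the current line -- printed only on request)
	999
	?			(the entire diagnostic, whatever the mistake)
	q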

Cognitive engineering, done well, would correct this function-before-user
design methodology and produce systems that better fit the abilities of
the user and are less prone to operator error.

So, this group is to discuss this new discipline, and its mother fields of
psychology and cognitive science, as they apply to the design of computer
systems.  It will also facilitate communication among the rather few
(and geographically dispersed) researchers in this area.

laura@utcsstat.UUCP (08/16/83)

I want to get this note in on the ground floor.  One of the
problems I have with certain "human factors engineers" is
that they think that they can use existing science to
provide an adequate model of a user.  I do not think that
the science is that good yet.  As a result, while
the "human factors" folk often talk about "user-
specifiable software", I often find that this very idea
is lacking in their own thinking processes.  Some person
discovers that (for example) menu systems are good for
naive users doing accounting, and goes on to generalise
that menu systems are good for everybody doing everything.
When I point out that the menu systems I have used have
all driven me up the wall, I get told that I am being
hostile to the user population and that I want to
impose restrictions on them -- in effect, turn them into
hackers.  That was not my intention -- I wanted to
get rid of the restrictions imposed on ME.

laura creighton
utzoo!utcsstat!laura

sts@ssc-vax.UUCP (Stanley T Shebs) (08/16/83)

Not to pick nits (well, maybe a few), but ed is hardly a bad design.
Vi and emacs don't work very well on a hardcopy terminal, which is
what ed was originally designed for (I like it much better than
any of the other dozen-odd line editors I have used).  It has a
very orthogonal interface and is graduated in complexity; that is,
one can start work with a small handful of commands and later go on to
more sophisticated editing.  Also, while learning ed, I was able to guess
at the things I didn't know, and my guesses worked.  If you
want programs to berate, try OS/360 and all of its unholy spawn,
or take cracks at PL/I, Algol68, Ada, or Interlisp.
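
As a rough sketch of that graduated complexity (the wording of ed's
startup complaint varies between versions), a beginner can create and save
a file knowing nothing beyond a, ., w, and q; addresses, substitution, and
global commands can all wait until later:

	$ ed note
	?note			(no such file yet)
	a			(append text)
	Lunch is at noon Friday.
	.			(a lone dot ends input)
	w			(write it out; ed reports the character count)
	25
	q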

Ed is like a hammer; you run the risk of losing credibility by
claiming it's poorly designed and needs improvement.

					stan the leprechaun hacker
					ssc-vax!sts (soon utah-cs)

ps I am of course open to persuasion on these points (just hit me
on the head with a hammer!)

ka@spanky.UUCP (08/17/83)

I have to agree with Laura.  Human factors people tend to design for the
naive user.  Being a hacker, when given the choice between a user interface
designed by a hacker and one designed by a human factors engineer, I will
take the hacker's version any day.

I think that cognitive engineering is mostly common sense.  It is just
about as much of a discipline as systems engineering.  (Actually, that's
not a criticism; common sense is a very valuable commodity.)
				Waiting for the flames and
				Hoping that a cognitive engineer
				Doesn't ever get his hands on UNIX,
				Kenneth Almquist

steffen@ihu1f.UUCP (08/18/83)

I consider myself a UNIX expert, but I still have to dig into the manual to
check the options on commands I use infrequently.  Since nobody uses all
the commands all the time, we are all novices some of the time.  I look to
human factors to help me design a system that

	1) does not invite the common mistakes that even experts make, and

	2) provides help for the novice only when asked, so as not to get in
	the way of the expert.
-- 


				Joe Steffen
				ihu1f!steffen
				IH 2C-331 x5381