[comp.arch] Raw power and user interfaces

rang@cs.wisc.edu (Anton Rang) (12/09/90)

In article <KHB.90Dec7150745@chiba.Eng.Sun.COM> khb@chiba.Eng.Sun.COM (Keith Bierman fpgroup) writes:
>The way to make computers "simple" and "easy" to use, and accessible
>to the bulk of "nintendo junkies" (read: folks who have better ways to
>spend their time than parsing ALGOL68 docsets), is by building vastly
>more powerful systems.  28 MIPS (whatever they are) is several hundred
>orders of magnitude too small for a really user-friendly system.

  Hmm.  I'm not sure you really need trillions and trillions more
operations per second than there are estimated to be particles in the
universe, but yes, more "horsepower" can be useful.
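
  To put rough numbers on that claim (taking "several hundred" to mean,
say, 300, which is my reading, not necessarily Keith's):

	28 MIPS                              ~ 2.8 x 10^7 ops/second
	300 orders of magnitude more         ~ 10^307 ops/second
	particles in the observable universe ~ 10^80

So the arithmetic runs out of universe long before it runs out of
orders of magnitude.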

  However, that doesn't mean people use the power that way.  I currently
spend most of my time using two computers, a Mac SE and a DECstation
3100.  The Mac has an 8MHz 68000; the DEC has a 16MHz R2000, or
something like that.  If I have raw number crunching or megabytes of
data to process, I use the DEC.  If I've got word processing or
database work, I use the Mac--it's got a slower processor, but the
user interface is much better, so it's faster in terms of my time.

  Obligatory derogatory comment: Besides, graphics on my Mac runs at
about the same speed as X11R4 on my DECstation.  (Grrr....)

  Now, if only we could get some people improving user interfaces (or
any software) as quickly as the hardware improves....

	Anton, waiting to see some new ideas in software
   
+---------------------------+------------------+-------------+
| Anton Rang (grad student) | rang@cs.wisc.edu | UW--Madison |
+---------------------------+------------------+-------------+

jgk@osc.COM (Joe Keane) (12/14/90)

Don't be fooled: it doesn't take a lot of CPU speed or memory to have a good
user interface.  The problem is that since more power is available, developers
will always trade it off for easier development and a longer feature list.
Some of the things that are done today would make PDP-11 programmers quite
sick.  Whether this is good or bad depends on your point of view.  One view is
that since we have so much CPU power and memory, we shouldn't worry about
using a lot of it, and we shouldn't be applying values from ten years ago.
An alternate view, which I tend to agree with, is that this carelessness
leads to overly complicated designs and systems which are harder, not easier,
to debug and maintain.  There are lots of new software paradigms, but what
many of them translate to is adding more and more to our software in order
to get the same thing done more easily.