[comp.sys.misc] User Interfaces

nick@lfcs.ed.ac.uk (Nick Rothwell) (04/14/89)

In article <3845@ficc.uu.net> peter@ficc.uu.net (Peter da Silva) writes:
>... When you can run a compile concurrently with Photon
>Paint then you can make the claim that the Mac supports background processing.

I think the Mac's almost there (is "almost" good enough? :-)). Certainly
I can run Kermit transfers behind the scenes. I'm typing this in from a
Mac running NCSA Telnet, which can happily scroll windows in the background.
We can also do background backups from Mac to Sun invisibly (apart from
activity on the Mac's hard disk).
   One of the problems is that the Mac world is inhabited by programs which
think that it's quite alright to throw up a modal dialogue and then go away
and think. Lightspeed C does this, although it could conceivably allow itself
to be backgrounded, giving the behaviour you want.
   Note my words: "allow itself". You're right: there's a burden on the
programmer with GetNextEvent() and all that stuff. This is sad. I think the
reason that the Mac doesn't timeslice is that it has to define exactly
when objects in memory might move around under the application. So: it only
happens at event-time. How does the Amiga handle this? Or doesn't it have
the same kind of heap management?
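
For concreteness, the "burden" looks something like this (a sketch only, in
Lightspeed C style; DoSliceOfWork() is a made-up stand-in for whatever the
program is really computing):

    #include <EventMgr.h>   /* <Events.h> under MPW */

    extern void DoSliceOfWork(void);   /* hypothetical */

    void MainLoop(void)
    {
        EventRecord ev;

        for (;;) {
            SystemTask();                  /* give desk accessories time */
            if (GetNextEvent(everyEvent, &ev)) {
                switch (ev.what) {
                case mouseDown: /* hit-test menus, windows, ... */  break;
                case keyDown:   /* handle typing */                 break;
                case updateEvt: /* redraw the damaged window */     break;
                }
            } else {
                DoSliceOfWork();   /* null event: one small chunk of
                                      work, then back for more events */
            }
        }
    }

Forget to come round the loop often enough and the whole machine feels
wedged - that's the burden.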

>Because on the Amiga, as on UNIX, the operating system handles time-slicing.
>Your program can completely ignore the user interface, sit on the CPU as long
>as it wants, and the system will happily chug along.

That isn't always what you want... If the application is "chugging along",
then it can't respond to asynchronous events generated by the user
interface, such as window-uncovering or highlighting - or should the
system do this, by hanging onto bitmap images? UNIX does it by virtue
of having another layer: my compilation runs fine in an X window, because
there's another process emulating the terminal and responding to user
events. I don't think this is feasible for the hardware we're talking about.

>If you guys had written a decent conventional O/S for the Mac and stuck the
>user interface on top of it I'd be using it now instead.

There are benefits to the fact that the Mac's insides are tied together
in the way that they are: the event and dialog managers present a very
well-crafted interface to the user (dialogs and menus come up before
refresh events take place, dialogs can be dismissed before they're fully
drawn, etc.). As a "power user", I really appreciate this attention to
detail. I don't know if a "conventional OS" can ever do this. X, SunTools
and the like don't. Of course, the downside is all the things you've
mentioned...

I'm sure we've had this conversation before... :-) I love my Mac. You love
your Amiga. But I'm all for healthy discussion of these kinds of issues...

		Nick.
--
Nick Rothwell,	Laboratory for Foundations of Computer Science, Edinburgh.
		nick@lfcs.ed.ac.uk    <Atlantic Ocean>!mcvax!ukc!lfcs!nick
~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~
...while the builders of the cages sleep with bullets, bars and stone,
they do not see your road to freedom that you build with flesh and bone.

peter@ficc.uu.net (Peter da Silva) (04/15/89)

In article <1773@etive.ed.ac.uk>, nick@lfcs.ed.ac.uk (Nick Rothwell) writes:
>    Note my words: "allow itself". You're right: there's a burden on the
> programmer with GetNextEvent() and all that stuff. This is sad. I think the
> reason that the Mac doesn't timeslice is that it has to define exactly
> when objects in memory might move around under the application. So: it only
> happens at event-time. How does the Amiga handle this? Or doesn't it have
> the same kind of heap management?

The Amiga doesn't have the same kind of heap management. All memory is
fixed once allocated. Fragmentation is not so bad because most of the
short-lived objects are similar in size (mostly message packets).
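
In code terms it's a flat allocator (a minimal sketch; the size and the
attribute flags here are arbitrary):

    #include <exec/types.h>
    #include <exec/memory.h>

    void demo(void)
    {
        /* Exec hands back a fixed address; nothing ever moves the block.
           The caller must remember the size to hand back to FreeMem(). */
        UBYTE *buf = (UBYTE *)AllocMem(1024L, MEMF_PUBLIC | MEMF_CLEAR);
        if (buf != NULL) {
            /* ... buf's address is stable for its whole lifetime ... */
            FreeMem((APTR)buf, 1024L);
        }
    }

So there's no analogue of the Mac's relocatable handles: the address you
get is the address you keep.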

> That isn't always what you want... If the application is "chugging along",
> then it can't respond to asynchronous events generated by the user
> interface, such as window-uncovering or highlighting - or should the
> system do this, by hanging onto bitmap images?

The Amiga supports three kinds of windows: SIMPLE_REFRESH, which work like
Mac windows and leave the refreshing to the application; SMART_REFRESH, which
work like X windows with backing store; and SUPER_BITMAP, which have a
complete off-screen image maintained by the O/S. In practice the overhead of
SUPER_BITMAP is considered too high, so most programs use SIMPLE_REFRESH or
SMART_REFRESH. There are also console windows (like xterm windows), which are
actually SMART_REFRESH windows with a handler process attached. So you get
your choice of how you want your program to work.
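
The choice is just a flag in the NewWindow structure you hand to Intuition
(a sketch; opening intuition.library and all error handling omitted):

    #include <intuition/intuition.h>

    static struct NewWindow nw = {
        20, 20, 300, 100,             /* LeftEdge, TopEdge, Width, Height */
        0, 1,                         /* DetailPen, BlockPen */
        CLOSEWINDOW | REFRESHWINDOW,  /* IDCMP events we want to see */
        SMART_REFRESH | WINDOWCLOSE | WINDOWDRAG | ACTIVATE,
        NULL, NULL,                   /* no gadgets, default checkmark */
        (UBYTE *)"Demo",              /* title */
        NULL, NULL,                   /* screen and bitmap: defaults */
        0, 0, 0, 0,                   /* min/max sizes (not used here) */
        WBENCHSCREEN
    };

    struct Window *OpenDemoWindow(void)
    {
        /* Swap in SIMPLE_REFRESH for the Mac-like style; SUPER_BITMAP
           additionally wants a BitMap pointer in the field above. */
        return OpenWindow(&nw);
    }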

> UNIX does it by virtue
> of having another layer: my compilation runs fine in an X-window, because
> there's another process emulating the terminal and responding to user
> events. I don't think this is feasible for the hardware we're talking about.

It's not only feasible, it's done. I can open up a CLI window, like an
xterm window with a shell in it, any time I want. The cost is very low,
because Amiga tasks themselves are very cheap and the massive use of shared
libraries cuts down on memory usage.
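
From a program, that's one call (a sketch; the CON: window spec is made up,
and it assumes the standard NewCLI command is available):

    #include <libraries/dos.h>

    void NewShellWindow(void)
    {
        /* Ask AmigaDOS to run NewCLI: a fresh CLI window with its own
           shell appears, running as a separate process. */
        if (Execute("NewCLI CON:0/11/640/100/Shell", 0L, 0L) == 0)
            ;   /* couldn't launch it */
    }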
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.

Business: uunet.uu.net!ficc!peter, peter@ficc.uu.net, +1 713 274 5180.
Personal: ...!texbell!sugar!peter, peter@sugar.hackercorp.com.