[comp.society.futures] bumblebee cursors

nickyt@Apple.COM (Nick Turner) (04/20/88)

Peter Scott claims that buzzing bumblebee cursors take only a few instrux to
implement.  That may be true on his machine (what machine are you using, Peter?)
with its hardware-supported sprite animation, but on most bit-mapped screens
the cursor must be drawn one word at a time.  That means that to change the
cursor you do something like this:

   Replace current cursor block with (saved in heap) background data.
   Draw new cursor graphics.

This involves two block copies, one of them across an arbitrary word boundary.  That
is not a huge load if your cursor is small, but it adds up when you're doing
it 30 or so times per second.

Also keep in mind that on machines like the Mac and the Sun, the same sequence
of operations takes place every single time the cursor moves.  Talk about
squandering cycles!  No wonder the cursor becomes so jerky whenever you do
a disk I/O on the Mac.  No time for cursor updates, got to service the disk!

The answer is not to add more Mips... it's to farm out the disk and screen to
their own CPUs.

nickyt@apple.com

bzs@BU-CS.BU.EDU (Barry Shein) (04/20/88)

>The answer is not to add more Mips... it's to farm out the disk and screen to
>their own CPUs.
>
>nickyt@apple.com

What was that comment about the reincarnation cycle of graphics
engines?  You farm it out to an on-board CPU, then an off-board CPU,
then a separate intelligent "terminal", then once you have the
protocol straight you turn the front-end into a full-blown computer to
help with local storage needs, then you farm out the graphics to an
on-board CPU... :-)

	-B