[comp.sys.amiga.advocacy] OS Graphic Card Support: Part II!

2fmlhiccup@kuhub.cc.ukans.edu (02/12/91)

I want to clarify some of my opinions on the OS Graphic Card Support post I
sent recently...

A)  The graphics that come with the Amiga are wonderful for the many users
who would not want to pay extra for 24-bit graphics as standard equipment.
B)  There are users who want and need 24-bit graphics and shy away from the
Amiga because the OS does not support higher graphic modes which, while not
standard, could be added on via a card.

The problem is that because there is no real option to improve the graphics on
the Amiga, people consider it inferior to computers that do offer one.  If the
Amiga OS is not written to support better graphic modes through standardized
libraries, then it is really pointless to buy a graphics card.

Why isn't the OS being improved to handle higher resolutions, more bitplanes,
etc?!?!

Solution (IMHO)
   Have all graphic card developers include a STANDARDIZED method of
communicating to the OS what resolutions the card supports, how the image is
stored in memory (hopefully in a RastPort-compatible format), and how to
modify the palette, etc.
   Have the Amiga OS call these routines to determine what modes are available.
Add routines that a program may call to determine these modes.  Have the Amiga
OS be able to work with non-standard modes if they exist...

NOTE:  The program does not directly communicate with the card to find the
modes available, etc.  This is done by THE OS...  Therefore, one can put any
kind of graphic card in the machine, the OS can determine what it is...and any
program can use it because the program does not have to know how it works...
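
To make this concrete, here's a rough C sketch of what such an OS-mediated
query interface might look like.  Every name in it is invented purely for
illustration -- nothing like this exists in the OS today:

#include <exec/types.h>

/* Hypothetical descriptor the card's ROM would hand to the OS. */
struct CardModeInfo {
    UWORD Width, Height;   /* pixel resolution               */
    UWORD Depth;           /* bitplanes / bits per pixel     */
    UWORD Flags;           /* e.g. planar vs. chunky layout  */
};

/* Hypothetical OS calls layered on top.  A program never touches
 * the card itself -- it only asks the OS: */
LONG CountCardModes(VOID);
BOOL GetCardModeInfo(LONG modeNumber, struct CardModeInfo *info);
LONG SetCardPalette(LONG modeNumber, UBYTE *rgbTable, LONG numEntries);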

Any comments?

Jason Wilson - An Aspiring Amiga Programmer...

lron@uunet.uu.net (Dwight Hubbard) (02/13/91)

In article <28532.27b759c9@kuhub.cc.ukans.edu>, 2fmlhiccup@kuhub.cc.ukans.edu writes:

> I want to clarify some of my opinions on the OS Graphic Card Support post I
> sent recently...
>
> Why isn't the OS being improved to handle higher resolutions, more bitplanes,
> etc?!?!

It isn't?  You mean all the neat new functions for handling things like the
A2024 monitor, PAL, NTSC, Superhires and Productivity formats, etc. don't exist
under 2.0?  If you have SAS 5.10 take a look at the include file
graphics/displayinfo.h in the 2.0 includes.
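
For instance, with the V36 graphics.library display database you can walk
every mode the system knows about.  A minimal sketch (untested, typed from
memory of the autodocs), assuming graphics.library V36+ is already open:

#include <exec/types.h>
#include <graphics/displayinfo.h>
#include <proto/graphics.h>
#include <stdio.h>

/* List each display mode's nominal size and maximum depth. */
void ListModes(void)
{
    ULONG id = INVALID_ID;
    struct DimensionInfo dims;

    while ((id = NextDisplayInfo(id)) != INVALID_ID) {
        if (GetDisplayInfoData(NULL, (UBYTE *)&dims, sizeof(dims),
                               DTAG_DIMS, id)) {
            printf("ModeID 0x%08lx: %d x %d, depth %d\n", id,
                   dims.Nominal.MaxX - dims.Nominal.MinX + 1,
                   dims.Nominal.MaxY - dims.Nominal.MinY + 1,
                   dims.MaxDepth);
        }
    }
}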

> Solution (IMHO)
>    Have all graphic card developers include a STANDARDIZED method of
> communicating to the OS what resolutions the card supports, how the image is
> stored in memory (hopefully in a RastPort-compatible format), and how to
> modify the palette, etc.

Why not just have them write a version of the graphics library for the card?
Of course the routines for handling the Copper wouldn't need to be supported,
since the card may not have that capability, and the blitter and sprite routines
might need to be emulated depending on the card.  But it wouldn't require new
software to make use of the board, either.

> NOTE:  The program does not directly communicate with the card to find the
> modes available, etc.  This is done by THE OS...  Therefore, one can put any
> kind of graphic card in the machine, the OS can determine what it is...and any
> program can use it because the program does not have to know how it works...

Sounds a lot like how the AddMonitor command under 2.0 works.  It would just
need to be extended to allow specifying the graphics.library name.

-------------------------------------------------------------------------------
-Dwight Hubbard                   | USENET  : easy!lron@uunet.uu.net          -
-Kaneohe, Hawaii                  | CFR     : 31:910/101 (Dwight Hubbard)     -
-------------------------------------------------------------------------------

greg@ccwf.cc.utexas.edu (Greg Harp) (02/13/91)

In article <28532.27b759c9@kuhub.cc.ukans.edu> 2fmlhiccup@kuhub.cc.ukans.edu writes:
[Jason's post about standardizing video cards so the computer can figure out and
 use any of them.]

>Any comments?

Yes.  Why not just DIG (Device-Independent Graphics)?

You see, with DIG, any Joe off the street can build a video card for the Amiga,
write a driver for it, and *BAM* new graphics features.  If Commodore just
standardized the interface hardware then they'd have to set a limit somewhere.

With DIG, you wouldn't really be limited since later revisions of the OS could
be written to take advantage of new features.  If a hardware standard was set,
then the hardware would have to be eventually updated to add features.

BTW, Commodore was looking for some DIG experts a while back.  Makes ya' 
wonder, eh?

>Jason Wilson - An Aspiring Amiga Programmer...

Greg
-- 
------------------------------------------------------------------------------
"I don't know what it is I like about you, but I like it a lot." --
                                         Led Zeppelin, Communication Breakdown
-------Greg-Harp-------greg@ccwf.cc.utexas.edu-------s609@cs.utexas.edu-------

cs326ag@ux1.cso.uiuc.edu (Loren J. Rittle) (02/13/91)

Jason Wilson writes:
> Have all graphic card developers include a STANDARDIZED method of
> communicating to the OS what resolutions the card supports, how the image is
> stored in memory (hopefully in a RastPort-compatible format), and how
> to modify the palette, etc.
<etc...>

Ahh, what you want is AmigaQu*ckDr*w... :-) :-)

Note: although I hate many aspects of the Mac, I will gladly concede
that I believe Apple's QuickDraw(tm) to be one of the best-implemented
features of the programmer's view of the Mac system.  I.e., it does not
drive a Mac programmer crazy...

Yes, it is what is needed to provide RTG, but it must come from
Commodore.  *And* it might have to be incompatible with the current
standard we now enjoy (or hate :-).

Loren J. Rittle
Well, maybe NewTek could set the standard... :-0
-- 
``NewTek stated that the Toaster *would not* be made to directly 
  support the Mac, at this point Sculley stormed out of the booth...''
-A scene recently seen in an Amiga vendor's booth at the last MacExpo.
Gee, you wouldn't think that an Apple Exec would be so worried about

gblock@csd4.csd.uwm.edu (Gregory R Block) (02/13/91)

From article <44104@ut-emx.uucp>, by greg@ccwf.cc.utexas.edu (Greg Harp):
> In article <28532.27b759c9@kuhub.cc.ukans.edu> 2fmlhiccup@kuhub.cc.ukans.edu writes:
> [Jason's post about standardizing video cards so the computer can figure out and
>  use any of them.]
> 
>>Any comments?
> 
> Yes.  Why not just DIG (Device-Independent Graphics)?
> 
> You see, with DIG, any Joe off the street can build a video card for the Amiga,
> write a driver for it, and *BAM* new graphics features.  If Commodore just
> standardized the interface hardware then they'd have to set a limit somewhere.
> 
> With DIG, you wouldn't really be limited since later revisions of the OS could
> be written to take advantage of new features.  If a hardware standard was set,
> then the hardware would have to be eventually updated to add features.
> 
> BTW, Commodore was looking for some DIG experts a while back.  Makes ya' 
> wonder, eh?
> 

So, let's say they do their OS rewrite.  They could remove ALL code that uses
the custom chips.  Let's just say we want to apply DIG to the sound chips, too.
And maybe the blitter....  Anyways...  Apply it to anything we can.

Once removed, they could replace it with code that calls external drivers.  The
driver's name would default to the one for the custom chipset, which would
exist in ROM.  Perhaps.  Then, one would just have to set it to use other
drivers...  Who knows.  Perhaps set it up so that it reads the name of the
drivers to use from the devs:system-configuration file, so that the monitor
gets the output from the VERY beginning.....

Okay.  Now, let's see.  What's left?  How to handle the blitter moving data
to and from the card...  Maybe you could set it up so that there is a DMA
chip on the card itself?  Or something.  As you can see, my forte is not
hardware... :)
						Greg
-------------------------------------------------------------------------------
gblock@csd4.csd.uwm.edu | Amigas, amigas everywhere, but not a one can think.
-------------------------------------------------------------------------------

kdarling@hobbes.ncsu.edu (Kevin Darling) (02/13/91)

2fmlhiccup@kuhub.cc.ukans.edu asks:

> Why isn't the OS being improved to handle higher resolutions,
> more bitplanes, etc?!?!

I rather suspect that this is being done.  You can see evidence of it
in some of the 2.0 defs.  The CBM folk don't seem to be dummies...
Just overworked :-).

> Any comments?

Sure.  As with any computer, much of the speed of Amiga applications comes
from programmers directly diddling the gfx memory.  Obviously, either
this must stop, or else you'll still need separate versions for each and
every new piece of hardware... no matter what CBM does with DIG.
For instance, if a new display card used chunky pixels, instead of
separate bitplanes, direct diddling methods would differ.

People may make fun of the slowness of Atari VDI, NeXT Display Postscript,
and other DIG systems, but those guys can improve their hardware over time
without losing their apps.  On the Amiga, you can probably count the major
programs which use only the indirect gfx calls, on one badly mangled hand.

Unfortunately, for superfast realtime gfx work, I can't see any way to ask
programmers to use only DIG on _any_ machine right now <sigh>.  Still,
you might want to push at the app writers, instead of at CBM.  OTOH, if
you're happy with two monitors (one for "normal" apps, one for new DIG
apps on another board), then some of the above doesn't apply.

My second comment is this: CBM gains zippo by keeping their current DIG
work quiet... with the proliferation of new gfx boards right now, they
should release at least some guidelines and ideas.  Now, perhaps they
have already done this with gfx board developers... we just don't know.
But we can hope so!  best - kevin <kdarling@catt.ncsu.edu>

svante@kberg.se (Svante Gellerstam) (02/14/91)

In article <28532.27b759c9@kuhub.cc.ukans.edu> 2fmlhiccup@kuhub.cc.ukans.edu writes:
>A)  The graphics that come with the Amiga are wonderful for the many users
>who would not want to pay extra for 24-bit graphics as standard equipment.

The standard in itself is not expensive - the actual implementation
is. I.e., the machine can be prepared to use 24-bit gfx, but might not
be equipped with it.

>B)  There are users who want and need 24-bit graphics and shy away from the
>Amiga because the OS does not support higher graphic modes which, while not
>standard, could be added on via a card.
>
>The problem is that because there is no real option to improve the graphics on
>the Amiga, people consider it inferior to computers that do offer one.  If the
>Amiga OS is not written to support better graphic modes through standardized
>libraries, then it is really pointless to buy a graphics card.

All current implementations (bar a rare few) of gfx cards for the
Amiga are niche implementations, solving specific problems. They
provide what people need in a few branches of the market.

Presently there is nothing for the rest of us - better gfx, or rather
the possibility of better gfx, for the majority of the software.

>Why isn't the OS being improved to handle higher resolutions, more bitplanes,
>etc?!?!

It takes resources to design a standard and keep it up to date. The
only party in the market who can devise a standard for this is
Commodore, and they have a lot to do right now. Maybe there are not
enough resources to research the subject?

Whatever the answer might be, the lack of one cements the current
approach to handling gfx on the Amiga. This is sad, because a lot of
good cards exist but cannot reach the masses (and the low, Amiga-friendly
prices) due to the lack of a generalized interface.

The importance of a standard software interface to the system can
be studied in the hard disk arena... In the beginning all HDs were slow
and expensive; with FFS, HDs became expensive(ish) and fast; now hard
drives are approaching the same prices as for other platforms, but are
generally a lot faster.

A well-defined software interface lets the HW developers concentrate
on HW design, without the worries of small production runs, special
solutions, and software designed from scratch.

>Solution (IMHO)
>   Have all graphic card developers include a STANDARDIZED method of
>communicating to the OS what resolutions the card supports, how the image is
>stored in memory (hopefully in a RastPort-compatible format), and how to
>modify the palette, etc.
>   Have the Amiga OS call these routines to determine what modes are available.
>Add routines that a program may call to determine these modes.  Have the Amiga
>OS be able to work with non-standard modes if they exist...
>
>NOTE:  The program does not directly communicate with the card to find the
>modes available, etc.  This is done by THE OS...  Therefore, one can put any
>kind of graphic card in the machine, the OS can determine what it is...and any
>program can use it because the program does not have to know how it works...

I am leaning towards the device-driver type of interface. The card may
contain some AutoBoot ROMs which tell the system what bits-per-pixel
and gfx memory layout to expect.

This driver should contain a basic set of graphics IO commands. The
main idea is to allow accelerated gfx boards to use their gfx processor
for vanilla operations. How about:

 * Read / Write pixels
 * Draw lines
 * Draw areas
 * Set Drawing Mode - JAM1 | COMPLEMENT etc
 * Draw Text

The basic Draw(), *Pixel(), Area*() and so on commands would simply map
to calls to the device driver with a very modest overhead. Maybe
operations could be queued up for a driver by the system and then
executed in batch, with very low overhead, by some kind of GfxSync()
call a la X Windows.
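
To sketch the shape of it (all names here are my own invention, just
for illustration), the driver could export a jump table of exactly
those primitives:

#include <exec/types.h>

/* Hypothetical jump table a gfx card driver would export.  The OS
 * (or a new graphics.library) calls through these instead of poking
 * the custom chips directly. */
struct GfxDriver {
    ULONG (*ReadPixel)(struct GfxDriver *gd, WORD x, WORD y);
    VOID  (*WritePixel)(struct GfxDriver *gd, WORD x, WORD y, ULONG pen);
    VOID  (*DrawLine)(struct GfxDriver *gd, WORD x1, WORD y1,
                      WORD x2, WORD y2, ULONG pen);
    VOID  (*FillRect)(struct GfxDriver *gd, WORD minx, WORD miny,
                      WORD maxx, WORD maxy, ULONG pen);
    VOID  (*SetDrawMode)(struct GfxDriver *gd, UWORD mode); /* JAM1 etc */
    VOID  (*DrawText)(struct GfxDriver *gd, WORD x, WORD y,
                      const UBYTE *str, UWORD len);
};

An accelerated board fills these in with routines that drive its own
gfx processor; a dumb framebuffer falls back on CPU loops.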

If one abstracts the Amiga gfx subsystem of today, it consists of a
graphics processor and a lot of shared memory. It would be possible to
run the Amiga windowing system without the CPU touching CHIP memory at
all, except to provide the gfx data the gfx processor needs; the CPU
doesn't need to write pixels and such itself.

Refine this model, and all gfx functions can be issued through a single
IO port, leaving the gfx subsystem with all the bandwidth it needs,
and the processor going full tilt in its own memory. High CPU
performance and high gfx throughput.

The idea does not exclude the use of shared memory and BitMap in CPU
addressable space. The driver takes care of those details.

If the OS then would allow for multi-WB-screen setups, we would have
the base for all kinds of graphics processing / compositing systems.

Here's my first draft of a working idea. Let's take it apart and
improve it. No comment is a bad comment - each is merely a statement of
interest in the issue...

>Jason Wilson - An Aspiring Amiga Programmer...


-- 
Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se

svante@kberg.se (Svante Gellerstam) (02/14/91)

In article <1991Feb13.064714.9347@ncsuvx.ncsu.edu> kdarling@hobbes.ncsu.edu (Kevin Darling) writes:
>Sure.  As with any computer, much of the speed of Amiga applications comes
>from programmers directly diddling the gfx memory.  Obviously, either
>this must stop, or else you'll still need separate versions for each and
>every new piece of hardware... no matter what CBM does with DIG.
>For instance, if a new display card used chunky pixels, instead of
>separate bitplanes, direct diddling methods would differ.

Basically, the speed (the big difference, that is) comes from the
Blitter and Copper. Even most games tend to be heavily
blitter-oriented.

Most utility programs use system calls for display updates, for
compatibility across OS revisions (with differing degrees of success
:-). Those calls could be Blitter- or CPU-dependent - the application
doesn't know or care.

Unfortunately, almost the entire gfx subsystem of today revolves around
the BitPlane concept (naturally). A few calls work independently (bar
the byte color definition): (Write|Read)Pixel(), Move() and Draw() and
so on. A number of new calls duplicating the functionality of Blit* and
other plane-related calls would open the doors for DIG, or RTG, or
whatever one chooses to call it.

>Unfortunately, for superfast realtime gfx work, I can't see any way to ask
>programmers to use only DIG on _any_ machine right now <sigh>. Still, 
>you might want to push at the app writers, instead of at CBM. OTOH, if 
>you're happy with two monitors (one for "normal" apps, one for new DIG 
>apps on another board), then some of the above doesn't apply.  

Every Amiga is equipped with the PAD that made the Amiga's reputation
around the world. Since it is in there, the 1084 / 1950 or whatever
still has a function. A funny thing is that even the extreme video pros
are not prepared to look away from the games market...

>My second comment is this: CBM gains zippo by keeping their current DIG 
>work quiet... with the proliferation of new gfx boards right now, they 
>should release at least some guidelines and ideas. Now, perhaps they 
>have already done this with gfx board developers... we just don't know.  
>But we can hope so!

I also think that if Commodore told the world where they are in the gfx
race, many more would-be gfx card manufacturers would see the point in
making their gfx technology available to the Amiga market.

I also suspect that Commodore isn't actively trying to inform 3rd
party gfx HW developers. Does anyone know differently?

<kdarling@catt.ncsu.edu>


-- 
Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se

daveh@cbmvax.commodore.com (Dave Haynie) (02/15/91)

In article <1991Feb13.064714.9347@ncsuvx.ncsu.edu> kdarling@hobbes.ncsu.edu (Kevin Darling) writes:

>People may make fun of the slowness of Atari VDI, NeXT Display Postscript,
>and other DIG systems, but those guys can improve their hardware over time
>without losing their apps.  

Only if their apps use the DIG interface.  On the NeXT, you'd expect them all
to do so, since such a protected memory system makes it virtually impossible
to naturally do otherwise.  On the Atari, as I gather, the VDI is unused by
lots of programs.

>Unfortunately, for superfast realtime gfx work, I can't see any way to ask
>programmers to use only DIG on _any_ machine right now <sigh>.  

You can always ask programmers to do lots of things, but they won't always
do any of them.  So more than anything, you have to point out to these guys
how evil it is to do The Wrong Thing, and make them understand, without any
doubt, way in advance, that doing The Wrong Thing will eventually make their
program fail, and they'll have only themselves to blame.  If they're willing
to accept that, then so be it.



-- 
Dave Haynie Commodore-Amiga (Amiga 3000) "The Crew That Never Rests"
   {uunet|pyramid|rutgers}!cbmvax!daveh      PLINK: hazy     BIX: hazy
	"What works for me might work for you"	-Jimmy Buffett

farren@sat.com (Michael J. Farren) (02/19/91)

daveh@cbmvax.commodore.com (Dave Haynie) writes:

>On the Atari, as I gather, the VDI is unused by lots of programs.

For the Atari version of SSI's STORM ACROSS EUROPE, *I* certainly didn't.
Using VDI would have meant an inordinate penalty in both code and data
overhead.  Not only that, the speed penalty was
significant, as were the limitations of the VDI interface.  Things
which the Amiga's OS made easy were impossible with the Atari VDI - and
I'm talking about simple things like multicolored cursors, or text
modes other than JAM2 (with no control of the background color, yet!).
Perhaps it's just that I am not familiar enough with the intricacies of
the Atari VDI to get it to do what I wanted - but when it took only a
couple of hours to write a custom routine which did *EXACTLY* what I
needed, as opposed to many hours or even days trying to coax the VDI
into giving me something *CLOSE* to that, my choice was clear.

In short: if anyone builds a VDI equivalent for the Amiga, you'd be
doing us all a disservice if it were not reasonably close to the "bare"
system interface, both in speed and in utility.

daveh@cbmvax.commodore.com (Dave Haynie) (02/22/91)

In article <1991Feb20.090728.2384@kberg.se> svante@kberg.se (Svante Gellerstam) writes:
>In article <1991Feb18.210422.5446@sat.com> farren@sat.com (Michael J. Farren) writes:

>>In short: if anyone builds a VDI equivalent for the Amiga, you'd be
>>doing us all a disservice if it were not reasonably close to the "bare"
>>system interface, both in speed and in utility.

>How would this be done? I mean one has to account for most types of
>major hardware in the area. Advantage has to be taken automatically of
>accelerators etc. Please elaborate.

Essentially, what you want here is a graphics model that exists on several
levels.  At the lowest level, you get real simple stuff, like "draw a pixel",
"draw several pixels", "read a pixel", whatever.  On top of that, you get a
commands like "draw a line", "draw a box", "move a region", "draw a character",
etc.  You might even go higher.  The details aren't really that important.  The 
basic idea is this.  On a dumb display, nothing more than visible memory, the 
graphics interface sends such commands to a program that's running on the host
CPU itself, which will be responsible for implementing all the commands.  It
could do them all via the very low level functions.  Move to something like
the Amiga, where you have a programmable graphics engine, and these commands
get sent to a program on the CPU which converts the commands to blitter
operations, at least for things like block operations, line draw, etc. that
Agnus does well.  Move to something like the A2410, with a graphics oriented
processor, and the commands are sent directly to that processor for 
interpretation, leaving the system's CPU totally free of the process.
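
A toy illustration of that dispatch (invented names, nothing real):

#include <exec/types.h>

typedef enum { GFX_PIXEL, GFX_LINE, GFX_BOX } GfxOp;

struct GfxCmd {
    GfxOp op;
    WORD  x1, y1, x2, y2;
    ULONG pen;
};

/* Hypothetical per-display target.  For a dumb framebuffer, post is
 * NULL and the host CPU interprets the command via low-level pixel
 * functions; for a board like the A2410, post hands the command to
 * the graphics processor and the CPU walks away. */
struct GfxTarget {
    void (*post)(struct GfxCmd *cmd);
};

extern void InterpretOnCPU(struct GfxCmd *cmd); /* host-side fallback */

void Dispatch(struct GfxTarget *t, struct GfxCmd *cmd)
{
    if (t->post)
        t->post(cmd);         /* smart display: offload */
    else
        InterpretOnCPU(cmd);  /* dumb display: render on the CPU */
}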

Much of graphics.library could work this way.  Other pieces, like blitter and
copper specific things, would be somewhere between real ugly and impossible
to implement on other display devices.  Although I understand the general
problem, I don't pretend to know how our software folks may be planning to
deal with device independence.

>Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se


-- 
Dave Haynie Commodore-Amiga (Amiga 3000) "The Crew That Never Rests"
   {uunet|pyramid|rutgers}!cbmvax!daveh      PLINK: hazy     BIX: hazy
	"What works for me might work for you"	-Jimmy Buffett

farren@sat.com (Michael J. Farren) (02/24/91)

svante@kberg.se writes, in reply to my posting:

>>In short: if anyone builds a VDI equivalent for the Amiga, you'd be
>>doing us all a disservice if it were not reasonably close to the "bare"
>>system interface, both in speed and in utility.
>
>How would this be done? I mean one has to account for most types of
>major hardware in the area. Advantage has to be taken automatically of
>accelerators etc. Please elaborate.

All I was saying was that any VDI scheme that anyone comes up with for
the Amiga should, at the very least, give you all of the capability that
the native ROM calls do.  A VDI with less functionality just won't be
used, or will at least cause much griping and moaning amongst the poor
programmers.  As it is, there are too many times you have to code your own
routines to provide capabilities that AmigaDOS (well, Exec, actually)
doesn't give you.  If a hypothetical VDI does not, for example, provide
the ability to specify the APen and BPen, along with at least the four
basic text modes JAM1, JAM2, COMPLEMENT, and INVERSVID, it won't be used.
Not by *me*, at any rate, unless I absolutely have to, and even then I'll
bitch about it.
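
For reference, here's the kind of control I mean, using the native
graphics.library calls (a minimal snippet, off the top of my head and
untested):

#include <graphics/rastport.h>
#include <proto/graphics.h>

/* Render a label with full control of foreground pen, background
 * pen, and drawing mode -- the things a lesser VDI would take away. */
void RenderLabel(struct RastPort *rp)
{
    SetAPen(rp, 1);        /* foreground pen  */
    SetBPen(rp, 0);        /* background pen  */
    SetDrMd(rp, JAM2);     /* draw fg and bg  */
    Move(rp, 10, 20);
    Text(rp, (UBYTE *)"Hello", 5);
}
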
-- 
+-----------------------------------------------------------------------+
| Michael J. Farren                                      farren@sat.com |
|                        He's moody, but he's cute.                     |
+-----------------------------------------------------------------------+

svante@kberg.se (Svante Gellerstam) (02/25/91)

In article <1991Feb23.203900.27912@sat.com> farren@sat.com (Michael J. Farren) writes:
>svante@kberg.se writes, in reply to my posting:
>
>>>In short: if anyone builds a VDI equivalent for the Amiga, you'd be
>>>doing us all a disservice if it were not reasonably close to the "bare"
>>>system interface, both in speed and in utility.
>>
>>How would this be done? I mean one has to account for most types of
>>major hardware in the area. Advantage has to be taken automatically of
>>accelerators etc. Please elaborate.
>
>All I was saying was that any VDI scheme that anyone comes up with for
>the Amiga should, at the very least, give you all of the capability that
>the native ROM calls do.  A VDI with less functionality just won't be
>used, or will at least cause much griping and moaning amongst the poor
>programmers.  As it is, there are too many times you have to code your own
>routines to provide capabilities that AmigaDOS (well, Exec, actually)
>doesn't give you.  If a hypothetical VDI does not, for example, provide
>the ability to specify the APen and BPen, along with at least the four
>basic text modes JAM1, JAM2, COMPLEMENT, and INVERSVID, it won't be used.
>Not by *me*, at any rate, unless I absolutely have to, and even then I'll
>bitch about it.

Ok, I just wanted you to spell it out for me. The thing I have been
trying to get acceptance for (breaking down open doors?) is some
proposal towards a DIG standard for Gfx cards.

Of course any graphics card should be usable through the ROM
routines, and behave as intended with standard graphics calls. All
flags and emulatable modes should be taken care of. Anything else
glues that hardware into a niche solution.

Now, while I have your attention, I would like to hear (see :-) if you
have any suggestions to put forward on how a device driver for ANY gfx
card should be put together.

>| Michael J. Farren                                      farren@sat.com |

/svante

-- 
Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se

daveh@cbmvax.commodore.com (Dave Haynie) (02/27/91)

In article <1991Feb25.090714.3701@kberg.se> svante@kberg.se (Svante Gellerstam) writes:

>Ok, I just wanted you to spell it out for me. The thing I have been
>trying to get acceptance for (breaking down open doors?) is some
>proposal towards a DIG standard for Gfx cards.

Nothing wrong with trying.  Thing is, lots of people will tell you to wait
until C= comes out with one.  Seems to me that's not absolutely necessary;
if a standard is truly device independent, then you can build a soft device
driver for one standard to support all devices in another standard.  I would
personally rather see programs even support two standards, rather than 100
different low level hardware register conventions, on a per-program basis,
like they do on the PC.

>Now, while I have your attention, I would like to hear (see :-) if you
>have any suggestions to put forward on how a device driver for ANY gfx
>card should be put together.

There are two basic approaches.  The simplest is the way Microsoft Windows,
or in a sense the Bridge Card, does it.  Basically, it goes something like 
this.  The graphics.library agent is responsible for building the bitmap in 
its internal format, which doesn't necessarily correspond to any physical 
bitmap format.  All this library actually needs to know about the target 
graphics card is the resolution desired.  When it completes an operation, the 
graphics.library sends an update message to the display's device driver.  This
driver then proceeds to translate the internal bitmap form to something it 
can live with, and the image appears.  On the Amiga, an extended graphics 
library could still use the blitter to do all rendering, etc. so this has the 
advantage of being a rather small leap from where we are now; the library 
basically needs the ability to communicate with video display device drivers.
Another advantage is that device drivers are relatively simple.  The main 
disadvantage of this approach is that it's slow, and very difficult to 
improve with more advanced hardware.
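
The translation step is the expensive part.  For instance, a driver for
a chunky-pixel board would have to do something like this for every
updated row (a sketch; PlanarRowToChunky is not a real call):

#include <exec/types.h>

/* Convert one row of planar (Amiga-style) bitmap data into chunky
 * (one byte per pixel) form, as a chunky-pixel card would need. */
void PlanarRowToChunky(UBYTE **planes, UWORD depth,
                       UWORD width, UBYTE *chunky)
{
    UWORD x, p;

    for (x = 0; x < width; x++) {
        UBYTE pixel = 0;
        for (p = 0; p < depth; p++) {
            /* test bit (7 - x mod 8) of byte x/8 in plane p */
            if (planes[p][x >> 3] & (0x80 >> (x & 7)))
                pixel |= (1 << p);
        }
        chunky[x] = pixel;
    }
}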

The other approach is to treat the graphic library like an object description
language, more along the lines of the Mac's QuickDraw, HP's laser printer
command set, or Postscript (though Postscript is far more sophisticated).
In this case, most of the drawing commands get sent to the device driver for 
the alternate display.  That device driver interprets each command for its
particular display's characteristics.  While some bitmap translation will
occur, the display driver will also be handling higher level commands.  This
has the effect of making the device driver much more complex, but allows it
to easily use an alternate processor to manage the bitmap rendering, as most
Postscript devices, and a few high-end Mac display cards, do today.

>Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se


-- 
Dave Haynie Commodore-Amiga (Amiga 3000) "The Crew That Never Rests"
   {uunet|pyramid|rutgers}!cbmvax!daveh      PLINK: hazy     BIX: hazy
	"What works for me might work for you"	-Jimmy Buffett

svante@kberg.se (Svante Gellerstam) (03/01/91)

In reply to Dave Haynie -- somehow this article got lost in space
between my feed and our machine. That's why the references are wrong.
Here it is anyway.

>In article <1991Feb25.090714.3701@kberg.se> svante@kberg.se (Svante Gellerstam) writes:

>>Ok, I just wanted you to spell it out for me. The thing I have been
>>trying to get acceptance for (breaking down open doors?) is some
>>proposal towards a DIG standard for Gfx cards.

>Nothing wrong with trying.  Thing is, lots of people will tell you to wait
>until C= comes out with one.  Seems to me that's not absolutely necessary;
>if a standard is truly device independent, then you can build a soft device
>driver for one standard to support all devices in another standard.  I would
>personally rather see programs even support two standards, rather than 100
>different low level hardware register conventions, on a per-program basis,
>like they do on the PC.

I, and some others I guess, are ready to do this if it weren't for the
time involved in doing the job right. Our company works with importing
professional products for the Amiga. We also do some development on
important products, something which may or may not grow in the future.
The work I do is mainly to provide support and develop the
possibilities of our new products. 

What I am trying to say is that I could do this for work if I had the
time - I would gladly share it. A little sacrifice here usually gives
a large return there. What I am trying to do now is to use the open
minds of all netters to find out what the task really would amount to.
I am the first to admit that I know too little now to be able to do
anything important in the field. But I am aching to learn - as I can
see that this field, so important for the professional user, is totally
unattended (bar niche solutions).

There are simply NO gfx cards running WB applications, presently.
There should be.

>>Now, while I have your attention, I would like to hear (see :-) if you
>>have any suggestions to put forward on how a device driver for ANY gfx
>>card should be put together.

>There are two basic approaches.  The simplest is the way Microsoft Windows,
>or in a sense the Bridge Card, does it.  Basically, it goes something like 
>this.  The graphics.library agent is responsible for building the bitmap in 
>its internal format, which doesn't necessarily correspond to any physical 
 [ stuff deleted...]

>The other approach is to treat the graphic library like an object description
>language, more along the lines of the Mac's QuickDraw, HP's laser printer
>command set, or Postscript (though Postscript is far more sophisticated).
 [ stuff deleted...]

As I see it, we need a graphics.library that is basically device
independent. This would mean that all present (nice) applications
would run right away - probably without taking advantage of extra
color or resolution, but nevertheless running.

Gfxlib would then talk to a device.driver of some sort. We agree this
far (no?). I propose that the form of this driver is the core of the
problem. I may be wrong.

With that out of the way, I merrily proceed to open fire (in
current terms :-|), possibly in the wrong direction.

There is a set of basic drawing commands:

	* Set / Get pixel
	* draw line
	* fill area (may be put into the area BLIT group of functions)
	* other drawing primitives (circles, arcs, polygons etc)

These should be easy to implement on any gfx card utilizing all
resources of that card.

Then there is a set of functions that may clash with abilities on
board any given gfx card:

	* BLIT functions
	* Text rendering
	* Image rendering (pixel based gfx)

They clash in the sense that the data the driver is presented with may
have to be translated into another form, taking up memory and time. I
guess this is why you say it's the slow way.

The behaviour of the gfx.device could be of many types. I see two basic
classes:

	* Synchronous (like today: calls to graphics.library map
	  directly to addresses in gfx.device, via e.g. SetFunction())
	* Asynchronous (like X Windows does it: queue up a bunch of
	  calls and send a Gfx_Sync command to make them appear)

I vote for the synchronous approach in this case, because it will
require no mods or additions to the current gfx rendering calls in
graphics.library. The driver may do some queueing itself if that's
appropriate for that specific card.
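
The patching itself could use the existing exec.library SetFunction()
call.  A minimal sketch -- the LVO offset and the replacement routine
are placeholders (take the real offset from your includes, and mind the
register-passing conventions, which I omit here):

#include <exec/types.h>
#include <exec/libraries.h>
#include <proto/exec.h>

#define LVO_WritePixel (-324)     /* assumed offset -- check includes */

extern struct Library *GfxBase;
extern void CardWritePixel(void); /* the card driver's replacement;
                                   * real code needs a register-savvy
                                   * assembly stub here */

APTR oldWritePixel;

void PatchGfx(void)
{
    oldWritePixel = SetFunction(GfxBase, LVO_WritePixel,
                                (APTR)CardWritePixel);
}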

That is the OS -> card high level part.

The new thing is that the OS has to be able to find out about, and
preferably take advantage of, the card's abilities:

	* raw pixel resolution
	* Display DPI (for scale sensitive apps. DTP, CAD etc)
	* number of colors
	* possibly other esoteric functions of the card (like
	  genlocking in an external video signal in a window etc.)

These have to be taken care of through a new set of calls (I assume,
knowing little about these things under WB 2.0).

The main difference here is that any given user's Amiga may have any
number of available resolutions. We cannot rely on things being
static, with Commodore handing out new ScreenMode flags for accepted
resolutions. Instead the user should select by menu, from the available
resolutions, the one WB should run on, etc. Apps have to have the
ability to tell this selector what kinds of resolutions they will accept
and what kinds they cannot use. E.g., a paint or video program may have
some use for HAM, but a word processor won't like it, and so on.

This seems to me to be a good conceptual baseplate (it's my own idea
:-) for discussing the pros and cons of software for this or that card.
Ok, jokes aside, this is a proposition for the gfx.device concept. This
concept has to be well defined and thought out. I have neither the
knowledge nor the time to hammer the details out, but I trust some of
you out there have. Please think this through and compare it
(conceptually) to other systems you know about. I will gladly engage in
any discussion.

By being able to open multiple screens on multiple monitors, apps
could provide the base for visual interaction in a network of Amigas.
E.g., artists could put up their art on a clipboard (hmm, if the
clipboard.device had its own screen...) to be used by the page-makeup
people, and so on. Dream on...

> Dave Haynie Commodore-Amiga (Amiga 3000) "The Crew That Never Rests"

/svante
-- 
Svante Gellerstam		svante@kberg.se, d87sg@efd.lth.se