dca@toylnd.UUCP (David C. Albrecht) (04/06/91)
My problem with Commodore's non-improvement of the custom chips is not with
their limitations in the A500 and A2000 class machines, but rather that they
are still the same in the A3000 class machines.  Realistically, most of the
operations the blitter can do are better done by the 68030 in the 3000.  As
Commodore has not demonstrated any ability to vastly improve their custom
chips, I would say it's time to move to solutions which lessen the machines'
dependence on said chips.  As processors get faster the situation will only
get worse.

While sticking all these functions in custom chips has economic reasons, I
think, given the way they seem frozen in stone, it might make better sense to
use a few more off-the-shelf external chips.  Use the external RAMDAC with
built-in anti-aliasing which is being incorporated in some PC VGA boards,
rather than waste chip real estate on the weenie RAMDAC we currently have.
My impression was that this RAMDAC chip can give an apparent resolution
increase without a corresponding video memory size penalty.  Eliminate the
blitter functions entirely; instead, have the memory write which would invoke
the blitter interrupt the main processor and let it do the blitter functions
(see the sketch below).  Use the increased real estate and RAMDAC performance
this gives you to support more and better video modes.  Keep the copper, but
hopefully use the space from the eliminated blitter/RAMDAC (I don't remember
if these are even in the same chip with the copper) to make it support faster
bus speeds.  Maybe even go with external 16-bit D/As for sound (use the same
ones as in the CDTV?).

Admittedly these will increase the complexity and cost of the 3000
motherboard, but frankly, that the 3000 runs a chip set of essentially the
same speed and capability as the 1000's is a crime.

David Albrecht
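As a rough illustration of the software-blit suggestion above, here is a
hypothetical sketch of the kind of rectangle copy the 68030 would run in
place of the blitter.  The name soft_blit and the byte-per-pixel buffer are
assumptions made purely for clarity; the real chip set works on word-aligned,
one-bit-deep bitplanes, and nothing like this is implied to ship from
Commodore.

    #include <stddef.h>

    /* Hypothetical software "blit": copy a w-by-h rectangle from src to
     * dst.  A plain byte-per-pixel buffer keeps the example simple; real
     * Amiga displays are built from word-aligned, one-bit-deep bitplanes. */
    static void soft_blit(const unsigned char *src, size_t src_stride,
                          unsigned char *dst, size_t dst_stride,
                          size_t w, size_t h)
    {
        size_t x, y;

        for (y = 0; y < h; y++) {
            const unsigned char *s = src + y * src_stride;
            unsigned char *d = dst + y * dst_stride;

            for (x = 0; x < w; x++)
                d[x] = s[x];        /* the blitter's plain "D = A" copy */
        }
    }

In practice such a routine would move a long word at a time and handle the
blitter's shifting and masking as well; the point is only that the copy
itself is a simple memory loop a fast CPU can run.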
daveh@cbmvax.commodore.com (Dave Haynie) (04/16/91)
In article <336@toylnd.UUCP> dca@toylnd.UUCP (David C. Albrecht) writes:

>...While sticking all these functions in custom chips has economic reasons,
>I think, given the way they seem frozen in stone, it might make better sense
>to use a few more off-the-shelf external chips.

Nothing wrong with off-the-shelf parts, of course.  But as long as you're
comparing what's built into the Amiga with what's added on to a PC or a Mac,
you're not speaking the same language.  If your video is optional, you can
provide cheap and expensive solutions, but if your video is built-in, you can
generally offer only one built-in setup.  Since a substantial amount of Amiga
revenues come from low cost systems, you can expect that any future advances
must be something that can ultimately be used in a low cost machine.

Of course, there are already lots of higher performance display boards for
the Zorro II bus; it's just that programmers don't support them.  When you're
programming the built-in display chips, you have a nice big graphics toolbox,
just like on the Mac.  To support any other display board, at present, you
have to resort to programming the hardware, like they do on the PClones.  In
other words, this is mainly a software problem.

>Use the external RAMDAC with built-in anti-aliasing which is being
>incorporated in some PC VGA boards

The Edsun chip, I presume.

>rather than waste chip real estate on the weenie RAMDAC we currently have.

We currently don't have a RAMDAC.  We have a CLUT, within the Denise chip,
and an external DAC.  We will always have a similar configuration in an Amiga
chip set, since the CLUT needs to support special magic like HAM and Copper
cycles.  And high end Amigas have a video slot, which makes available the
digital video output from the CLUT, something you never have access to with a
conventional PC/Mac style RAMDAC.

>My impression was that this RAMDAC chip can give an apparent resolution
>increase without a corresponding video memory size penalty.

That's true, as it is with HAM.  These chips have several tricks they can
perform, though in order to be compatible with PC RAMDACs, they have special
modes that trade some of the 256 color CLUT entries for neat tricks.  You
kick the chip into a special mode, and all of a sudden some of the pixel
numbers no longer reference the CLUT, but instead select special modes and
functions.  The simplest is a CLUT entry reload, kind of a static version of
what the Amiga's copper can do with color table entries.  You need, of
course, about 6-8 consecutive pixels of the same color to sneak one of these
changes in.  The hardware dithering works similarly: a magic pixel value
kicks it into dither mode, the next pixel value tells it the ending color,
and then more magic values tell it the mix from the current value to the
ending value.  Also like HAM, such pictures take some amount of preprocessing
from a traditional representation (perhaps 16 or 24 bit color) before they
look good.

>Eliminate the blitter functions entirely; instead, have the memory write
>which would invoke the blitter interrupt the main processor and let it do
>the blitter functions.

You could do that; in fact, there was a toy on BIX that did this with some
success on the A3000.  However, while it's true the 68030 on an A3000 can do
some of the graphics functions faster than the blitter, keep in mind that
unless you're totally graphics display limited, it still makes sense to have
both, since they operate in parallel.
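To make the parallelism point concrete, here is a schematic sketch.  The
names start_blit, wait_blit, and compute_next_frame are hypothetical
stand-ins (printing stubs, not real custom-chip pokes or graphics.library
calls), but the shape of the code is why keeping both units wins whenever
there is CPU work to overlap with the copy.

    #include <stdio.h>

    /* Hypothetical stand-ins: on real hardware these would program the
     * blitter and wait on blitter-done, instead of just printing. */
    static void start_blit(void) { puts("blitter: copy started"); }
    static void wait_blit(void)  { puts("blitter: copy finished"); }

    /* Work the 68030 can do while the copy is in flight. */
    static void compute_next_frame(void) { puts("68030: computing next frame"); }

    int main(void)
    {
        start_blit();           /* blitter begins moving data on its own */
        compute_next_frame();   /* CPU keeps working in parallel */
        wait_blit();            /* synchronize only when the result is needed */
        return 0;
    }

If the program is totally display limited there is no other work to overlap,
which is exactly the case conceded above to the CPU-only approach.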
>Use the increased real estate and RAMDAC performance this gives you to
>support more and better video modes.

Blitter and CLUT are in separate chips.  The complexity of one does not
affect the complexity of the other.

-- 
Dave Haynie  Commodore-Amiga (Amiga 3000)  "The Crew That Never Rests"
     {uunet|pyramid|rutgers}!cbmvax!daveh      PLINK: hazy      BIX: hazy
     "That's me in the corner, that's me in the spotlight" -R.E.M.