bill@pro-gateway.cts.com (Bill Long, SysOp) (12/05/90)
I'm an Apple II user and try to somewhat keep up with the Mac world, and I
have a couple of questions about 24-bit graphics.

1) Why is 24-bit better than 32-bit?
2) How many of the 16 million colors can be on-screen at once?

Thanx...

| ProLine:  bill@pro-gateway
| Internet: bill@pro-gateway.cts.com
| UUCP:     crash!pro-gateway!bill
| ARPA:     crash!pro-gateway!bill@nosc.mil
| BITNET:   bill%pro-gateway.cts.com@nosc.mil
+----------"Maturity is overrated" - Garfield-------->Pro-Gateway 214/644-5113
peirce@outpost.UUCP (Michael Peirce) (12/06/90)
In article <6110@crash.cts.com>, bill@pro-gateway.cts.com (Bill Long, SysOp)
writes:
>
> I'm an Apple II user and try to somewhat keep up with the Mac world, and I
> have a couple of questions about 24-bit graphics.
>
> 1) Why is 24-bit better than 32-bit?

24-bit isn't better than 32-bit. 32-bit color is 24-bit color plus 8 bits
for an alpha channel. Since Apple hasn't defined what the alpha channel is
supposed to do (though some third parties have), these extra 8 bits of data
aren't used by video cards. The two terms are often used interchangeably.

> 2) How many of the 16 million colors can be on-screen at once?

With 24-bit color, each pixel can fully specify a color (8 bits each of
red, green, and blue), so you can display as many colors on the screen
as you have pixels.

-- michael

-- Michael Peirce            -- {apple,decwrl}!claris!outpost!peirce
-- Peirce Software           -- Suite 301, 719 Hibiscus Place
-- Macintosh Programming     -- San Jose, California 95117
-- and Consulting            -- (408) 244-6554
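[A quick back-of-the-envelope sketch (in Python, obviously anachronistic for
this thread and purely illustrative) of the two numbers in play here: the
size of the 24-bit palette versus the number of pixels available to show
colors simultaneously:]

```python
# 24-bit color: 8 bits each of red, green, and blue per pixel.
BITS_PER_CHANNEL = 8
palette_size = 2 ** (3 * BITS_PER_CHANNEL)  # distinct representable colors
print(palette_size)                         # 16777216 (~16.7 million)

# A screen shows one color per pixel, so the number of *simultaneously
# visible* colors is capped by the pixel count, not the palette size.
width, height = 640, 480
max_onscreen_colors = width * height
print(max_onscreen_colors)                  # 307200
```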
dcw@cunixa.cc.columbia.edu (David C. Worenklein) (12/06/90)
In article <6110@crash.cts.com> bill@pro-gateway.cts.com (Bill Long, SysOp)
writes:
>1) Why is 24-bit better than 32-bit?

On the flip side... if the human eye can only discern ~10 million shades of
color, when is 32-bit color ever needed?

-David
marmoset@mondo.engin.umich.edu (Dave Walker) (12/06/90)
In article <6110@crash.cts.com> bill@pro-gateway.cts.com (Bill Long, SysOp)
writes:
>I'm an Apple II user and try to somewhat keep up with the Mac world, and I
>have a couple of questions about 24-bit graphics.
>
>1) Why is 24-bit better than 32-bit?

Actually, it's not. The confusion arises because people sometimes use 24-bit
and 32-bit to describe the same products. Apple's 32-bit QuickDraw system
software supports pixels with as many as 32 bits of information each. These
pixels are broken down as follows:

  - 8 bits of red
  - 8 bits of green
  - 8 bits of blue
  - 8 undefined bits (known as the "alpha channel") which can be used for
    video compositing, transparency information, etc.

Some video overlaying hardware and image manipulation programs use the alpha
channel for their own purposes. Most video boards currently only deal with
the red, green, and blue channels.

>2) How many of the 16 million colors can be on-screen at once?

It depends on your monitor's size. A 640x480 monitor, like Apple's 13"
High-Resolution Color Monitor, can display 640*480 = 307,200 colors at once,
while a 1024x768 19" monitor could display 1024*768 = 786,432.

+------------------------------------------------------------------------+
| Dave Walker, Detroit Art Services (DAS)                                |
| marmoset@ub.cc.umich.edu           "I don't read, I just guess"        |
| marmoset@mondo.engin.umich.edu      -Happy Mondays, "Wrote For Luck"   |
+------------------------------------------------------------------------+
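[The 32-bit pixel layout described above can be sketched as bit fields. This
is a hypothetical helper in Python, not Apple's actual QuickDraw API; it just
illustrates where the four channels sit in a 32-bit value:]

```python
def unpack_argb(pixel):
    """Split a 32-bit pixel into (alpha, red, green, blue) channels.

    Layout per the post above: 8 alpha bits in the high byte (undefined
    by Apple), then 8 bits each of red, green, and blue.
    """
    alpha = (pixel >> 24) & 0xFF
    red   = (pixel >> 16) & 0xFF
    green = (pixel >> 8)  & 0xFF
    blue  =  pixel        & 0xFF
    return alpha, red, green, blue

# Most video boards of the era simply ignore the high (alpha) byte:
print(unpack_argb(0x00FF8040))  # (0, 255, 128, 64)
```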
awessels@ccwf.cc.utexas.edu (Allen Wessels) (12/06/90)
In article <1990Dec5.163911.8000@cunixf.cc.columbia.edu>
dcw@cunixa.cc.columbia.edu (David C. Worenklein) writes:
>If the human eye can only discern ~10 million shades of color, when is 32-bit
>color ever needed?

That depends on how that 10 million was figured. It may be that someone
determined that the least noticeable difference in color the human eye can
detect is about 1/10,000,000th of the visible spectrum. That doesn't
necessarily mean the eye can't detect differences between two 24-bit pictures
whose pixels' colors vary within that range. Also, it's a bad idea to use a
general figure to model the extremes of sensitivity. While _everyone_ might
not need or even appreciate the richness of 24-bit color, there may be quite
a few people who can sense the difference.

- Allen
russotto@eng.umd.edu (Matthew T. Russotto) (12/06/90)
In article <6110@crash.cts.com> bill@pro-gateway.cts.com (Bill Long, SysOp)
writes:
>I'm an Apple II user and try to somewhat keep up with the Mac world, and I
>have a couple of questions about 24-bit graphics.
>
>1) Why is 24-bit better than 32-bit?

It isn't. 32-bit graphics means one of two things: 24-bit graphics with
8 bits of padding, or 24 bits of RGB plus 8 bits used as something else
(transparency, etc.).

>2) How many of the 16 million colors can be on-screen at once?

Using an Apple 8*24 or 8*24GC: 307,200 (one for each pixel of the screen).

--
Matthew T. Russotto     russotto@eng.umd.edu     russotto@wam.umd.edu
.sig under construction, like the rest of this campus.
hood@cbmvax.commodore.com (Scott Hood) (12/07/90)
In article <b.APBIWW@outpost.UUCP> peirce@outpost.UUCP writes:
>24-bit isn't better than 32-bit. 32-bit color is 24-bit color plus 8-bits
>for an alpha channel.
[...]
>With 24-bit color each pixel can fully specify a color (8 bits of
>red, blue, & green) so you can display as many colors on the screen
>as you have pixels.

Are you assuming that the system in question has had its video memory
expanded to several megabytes? I thought that the Apple 8*24 card only had
enough standard memory to display 256 colors out of the 24-bit palette on
the screen at once. Am I missing something? It is very nice indeed that once
you have spent more money upgrading the video memory of the display device,
you can display and seriously use as many colors as you have pixels out of
the full 24-bit palette.

Scott Hood
--
-- Scott Hood, Hardware Design Engineer (A3000 Crew), Commodore-Amiga, Inc.
   {uunet|pyramid|rutgers}!cbmvax!hood     hood@cbmvax.cbm.commodore.com
   "The views expressed here are not necessarily those of my employer!"
awessels@ccwf.cc.utexas.edu (Allen Wessels) (12/07/90)
In article <16343@cbmvax.commodore.com> hood@cbmvax.commodore.com
(Scott Hood) writes:
>Are you assuming that the system in question has had its video memory
>expanded to several megabytes? I thought that the Apple 8*24 card only
>had enough standard memory to display 256 colors out of the 24-bit
>palette on the screen at once. Am I missing something?

Just do the math. On a 640*480 screen, 3 bytes per pixel to get 24 bits
will require around 900K of memory. The 8*24 comes equipped to handle 256
colors and is expandable to handle 16.7 million. I don't know where you get
your "several megabytes" number. 1024*768*3 requires somewhere around 2.35
million bytes, and 32-bit would take about 3 million. (I don't know whether
the alpha channel would be manipulated in the video memory - it probably
would be.)

24-bit color is very nice though, as you point out.
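[The framebuffer arithmetic in the post above, written out as a small Python
sketch. Illustrative only; real cards round video memory up to fixed bank
sizes rather than the exact minimum computed here:]

```python
def vram_bytes(width, height, bytes_per_pixel):
    """Minimum framebuffer size for a given resolution and pixel depth."""
    return width * height * bytes_per_pixel

print(vram_bytes(640, 480, 3))   # 921600  (~900K for 24-bit)
print(vram_bytes(1024, 768, 3))  # 2359296 (~2.35 MB for 24-bit)
print(vram_bytes(1024, 768, 4))  # 3145728 (~3 MB for 32-bit)
```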
torrie@Neon.Stanford.EDU (Evan James Torrie) (12/07/90)
hood@cbmvax.commodore.com (Scott Hood) writes:
>Are you assuming that the system in question has had its video memory
>expanded to several megabytes? I thought that the Apple 8*24 card only
>had enough standard memory to display 256 colors out of the 24-bit
>palette on the screen at once. Am I missing something?

Yes. This seems to be a common misconception, though (I've seen many users
in other newsgroups [c.s.a in particular] get this wrong). Apple's 8*24
card comes ready with enough memory to display true 24-bit color. You're
thinking of Apple's 4*8 board, which can be expanded (by plugging in video
memory) into an 8*24. Both boards have a 24-bit palette, but the 4*8 can
only display 256 of those colors at a time. Apple should have just named
these the Apple 8 and Apple 24 boards, in my opinion...

--
------------------------------------------------------------------------------
Evan Torrie.  Stanford University, Class of 199?      torrie@cs.stanford.edu
"And remember, whatever you do, DON'T MENTION THE WAR!"