[alt.next] NeXT

joel@peora.ccur.com (Joel Upchurch) (10/15/88)

In article <Oct.13.09.01.56.1988.8822@porthos.rutgers.edu>, gaynor@porthos.rutgers.edu (Silver) writes:
> rokicki@polya.stanford.edu writes:
> >17" monochrome monitor, providing 1120x832 pixels at two bits per pixel.
> >(Two bits/pixel implies four gray levels, rather than two, and allows for
> >richer graphics.) Monitor includes 8 bit codec for voice input and 16 bit
> 
> Would one of you more studly graphics folks tell me the advantage to doing
> things thus, as opposed to providing one-bit pixels at somewhat finer
> resolution?  Sufficient response will be answered in summarization.
> 

One thing you can do with grey scale is anti-aliasing. With the proper
software this can improve the effective resolution quite a bit without
increasing the cost of the monitor. I remember reading a column by Steve
Gibson in InfoWorld a while back where he talked about greatly improving the
on-screen appearance of character sets by using anti-aliasing with
only 4 levels of grey. This makes me wonder whether Display PostScript
supports anti-aliasing of character sets. If it does, it could make
the NeXT machine a more effective DTP machine.
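
To make the idea concrete, here is a rough sketch of one way to anti-alias
down to 4 grey levels: render the glyph at double resolution in one bit,
then box-filter each 2x2 block into a 2-bit pixel. This is only an
illustration of the technique, not how Display PostScript actually
rasterizes fonts, and the sizes and names are made up.

/* Sketch: box-filter a 2x-oversampled 1-bit glyph down to 4 grey levels.
 * Hypothetical example only. */
#define HI_W 16            /* width/height of the oversampled 1-bit glyph */
#define HI_H 16
#define LO_W (HI_W / 2)    /* output is half the size, 2 bits per pixel   */
#define LO_H (HI_H / 2)

void antialias(const unsigned char hi[HI_H][HI_W],
               unsigned char lo[LO_H][LO_W])
{
    int y, x;
    for (y = 0; y < LO_H; y++) {
        for (x = 0; x < LO_W; x++) {
            /* count covered subpixels in the 2x2 block: 0..4 */
            int cov = hi[2*y][2*x]   + hi[2*y][2*x+1]
                    + hi[2*y+1][2*x] + hi[2*y+1][2*x+1];
            /* map coverage 0..4 onto the four grey levels 0..3 */
            lo[y][x] = (unsigned char)((cov * 3 + 2) / 4);
        }
    }
}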
-- 
Joel Upchurch/Concurrent Computer Corp/2486 Sand Lake Rd/Orlando, FL 32809
joel@peora.ccur.com {uiucuxc,hoptoad,petsd,ucf-cs}!peora!joel (407)850-1040

edmoy@violet.berkeley.edu (10/16/88)

In article <Oct.13.09.01.56.1988.8822@porthos.rutgers.edu>, gaynor@porthos.rutgers.edu (Silver) writes:
> rokicki@polya.stanford.edu writes:
> >17" monochrome monitor, providing 1120x832 pixels at two bits per pixel.
> >(Two bits/pixel implies four gray levels, rather than two, and allows for
> >richer graphics.) Monitor includes 8 bit codec for voice input and 16 bit
> 
> Would one of you more studly graphics folks tell me the advantage to doing
> things thus, as opposed to providing one-bit pixels at somewhat finer
> resolution?  Sufficient response will be answered in summarization.

I think the main reason for having 2 bit planes is to be able to do
highlighting of objects on the display, such as buttons and text in menus.  
Since PostScript uses an imaging model, it can't do bit manipulation of the
image, such as using XOR for highlighting.  The Display PostScript in
the NeXT machine has enhancements to regular PostScript, but to handle
highlighting they have implemented something called the "alpha channel",
which I don't understand very well.  Anyway, when they want to highlight
black text on a white background, they change the white to light or dark
gray, and then back to white when they are done.
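
To illustrate the gray-swap trick (this is just a sketch of the idea, not
NeXT's actual alpha-channel mechanism, and the pixel values 0 = black,
3 = white, 2 = light gray are my own assumptions): on a packed 2-bit
framebuffer you can highlight by rewriting every white pixel in a region
to light gray, and un-highlight by rewriting it back.

#include <stddef.h>

#define WHITE  3u
#define LTGRAY 2u

/* Replace every `from` pixel with `to` in a run of packed 2-bit pixels,
 * 4 pixels per byte, assuming MSB-first packing. */
void remap_pixels(unsigned char *fb, size_t npixels,
                  unsigned from, unsigned to)
{
    size_t i;
    for (i = 0; i < npixels; i++) {
        size_t byte = i / 4;
        unsigned shift = (unsigned)((3 - i % 4) * 2);
        unsigned pix = (fb[byte] >> shift) & 3u;
        if (pix == from) {
            fb[byte] &= (unsigned char)~(3u << shift);
            fb[byte] |= (unsigned char)(to << shift);
        }
    }
}

/* Highlight: turn the white background gray; un-highlight: turn it back. */
void highlight(unsigned char *fb, size_t n)   { remap_pixels(fb, n, WHITE, LTGRAY); }
void unhighlight(unsigned char *fb, size_t n) { remap_pixels(fb, n, LTGRAY, WHITE); }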

Edward Moy				Principal Programmer - Macintosh & Unix
Workstation Support Services		Workstation Software Support Group
University of California
Berkeley, CA  94720

edmoy@violet.Berkeley.EDU
ucbvax!violet!edmoy

brad@cayman.COM (Brad Parker) (10/17/88)

From article <15522@agate.BERKELEY.EDU>, by edmoy@violet.berkeley.edu:
>> Would one of you more studly graphics folks tell me the advantage to doing
>> things thus, as opposed to providing one-bit pixels at somewhat finer
>> resolution?  Sufficient response will be answered in summarization.

Hurumph... I sez to myself - "I've done painters!" ;-)
I suspect the 2 bits of gray scale allow them to use "gray scale fonts", which
give the machine more "apparent" resolution. Go look at a Jupiter graphics
terminal (it uses the same concept). The fonts look *much* better with just
a tiny bit (har har) more depth.
-- 
"What will you do when you wake up one morning to find that God's made you
blind in a beautiful person's world and all those great recepies have let you
down, and you're twenty and a half and you're not getting age where you go look
for the boys 'says I love you lets get married and have kids." -Billy Bragg.

bob@allosaur.cis.ohio-state.edu (Bob Sutterfield) (10/18/88)

In article <Oct.13.09.01.56.1988.8822@porthos.rutgers.edu> gaynor@porthos.rutgers.edu (Silver) writes:
>rokicki@polya.stanford.edu writes:
>> - 17" monochrome monitor, providing 1120x832 pixels at two bits per
>>   pixel.  (Two bits/pixel implies four gray levels, rather than
>>   two, and allows for richer graphics.) Monitor includes 8 bit
>>   codec for voice input and 16 bit
>
>Would one of you more studly graphics folks tell me the advantage to
>doing things thus, as opposed to providing one-bit pixels at somewhat
>finer resolution?  Sufficient response will be answered in
>summarization.

They didn't do color on the first cut because there were more
fundamental architectural things to do, and they had to get the first
machine out the door *sometime* this decade.

Their early development was on 1-bit monochrome systems.  The head of
the UI group noticed that he kept having to tell the hacquers
"remember, this will be multi-bits someday!" to keep their code from
unconsciously assuming that all the world is one bit deep.  Look how
hard the Mac folks had to twist things to make a color Toolbox - he
asserted that if they thought in multi-bits per pixel from the start
they could avoid the worst of the architectural flaws that would make
color tough.  But he had a hard time keeping tabs on every bit of
display code to make sure it was easily expandable.

So he went to the hardware group and persuaded them to make it 2 bits
per pixel.  Then he came back to the software group and said "guess
what?  We're two bits deep now!"

It turns out that once you've generalized to any N (where N>1) bits
per pixel, it's much easier to generalize to any other N (including
1).

So the 2-bit screen is not only much more beautiful (snazzy
antialiasing) but also an architectural exercise and a proof of concept
that the machine can someday be extended to more bits per pixel with
no great agony in the display software.
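
For what it's worth, that generalization is easy to see in code: if the
pixel routines take the depth as a parameter instead of hard-coding it,
the same routines handle 1, 2, 4, or 8 bits per pixel. The sketch below
is purely illustrative (my own, assuming MSB-first packing and a depth
that divides 8), not anything from NeXT's display software.

#include <stddef.h>

unsigned get_pixel(const unsigned char *fb, size_t x, unsigned depth)
{
    unsigned per_byte = 8u / depth;
    unsigned shift = (unsigned)((per_byte - 1u - x % per_byte) * depth);
    unsigned mask = (1u << depth) - 1u;
    return (fb[x / per_byte] >> shift) & mask;
}

void set_pixel(unsigned char *fb, size_t x, unsigned depth, unsigned value)
{
    unsigned per_byte = 8u / depth;
    unsigned shift = (unsigned)((per_byte - 1u - x % per_byte) * depth);
    unsigned mask = (1u << depth) - 1u;
    fb[x / per_byte] = (unsigned char)
        ((fb[x / per_byte] & ~(mask << shift)) | ((value & mask) << shift));
}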
-=-
Zippy sez,								--Bob
Yow!  It's some people inside the wall!  This is better than mopping!