[comp.sys.mac.programmer] what I want to see in future Apple computers

sho@gibbs.physics.purdue.edu (Sho Kuwamoto) (06/16/91)

I'd like to resurrect this topic.  It seems a month doesn't go by
without some discussion of what we want in future systems.  However, I
haven't had a chance to talk about this since the advent of System 7,
so I'm going to try to start another round.

This discussion usually comes in two flavors.  Either you talk about
reasonable improvements, or you talk about pie-in-the-sky, whacked-out
ideas.  I'd like to talk about some of both.

Multitasking:

I used to think that cooperative multitasking was ok for the mac.
Now, I find it a pain in the ass.  

Recently, I've been writing small programs that communicate with each
other using AppleEvents and the PPC Toolbox.  I've found that it's one
thing to write one program, and another thing entirely to write four
programs which send signals back and forth to each other.  The way
things stand, I find myself pulling my hair out trying to keep
everything straight.  Preemptive multitasking would simplify things.
On a similar note, I'd love to see protected memory.  When a program
crashes, I want an alert that says, "core dump."  I don't want the
entire machine to die.


Toolbox:

Oh, god, the Toolbox has gotten complicated.  I want to see a complete
rewrite.  (but then again, don't we all...)  Using a class library is
helpful, but it still doesn't feel right.  I want a new toolbox
written for a real object oriented language.  Instead of installing a
wdef proc, we should be able to subclass the window class and override
the draw method.  
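
Just to make that concrete, here's roughly what I mean, in plain C++
with made-up class names (nothing like this exists in today's Toolbox):

    // Hypothetical sketch: instead of registering a WDEF code resource,
    // subclass the window class and override its draw method.
    #include <iostream>

    struct Rect { int top, left, bottom, right; };

    class Window {
    public:
        virtual ~Window() {}
        virtual void Draw(const Rect &frame) {      // standard window frame
            std::cout << "standard document window" << std::endl;
        }
    };

    class RoundWindow : public Window {
    public:
        virtual void Draw(const Rect &frame) {      // custom look, no WDEF
            std::cout << "round-cornered window" << std::endl;
        }
    };

    void Redraw(Window &w, const Rect &frame) {
        w.Draw(frame);                              // resolved through the vtable
    }

    int main() {
        Rect r = { 0, 0, 100, 100 };
        RoundWindow w;
        Redraw(w, r);                               // calls RoundWindow::Draw
        return 0;
    }

The point is that the system would call my Draw method the same way it
calls the standard one.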

Instead of sending an AppleEvent to another program, I should be able
to send it messages using this object oriented language.  Suppose C++
had a ->> operator, which behaved more like a Smalltalk message.  Then I
could do something like this:

	Application	*otherProg;
	OSErr		err;

	otherProg = new Application('Spel', true);
	err = otherProg->err;

	if (!err)  err = otherProg->>OpenFile(file);
	if (!err)  err = otherProg->>SpellCheck();

The constructor would either connect to the existing program or launch
it if the process doesn't exist.  If the program had no SpellCheck()
method, the exception handler would deal with it.
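
Of course, C++ doesn't have a ->> operator.  The closest you can get
today is an ordinary proxy class that funnels every call through one
"send a named message" routine.  Something like this (all names made
up; the real thing would be talking AppleEvents and the PPC Toolbox
underneath):

    #include <iostream>
    #include <string>

    typedef short OSErr;
    enum { noErr = 0 };

    // Hypothetical stand-in for a remote application.  A real version
    // would connect to (or launch) the target and send Apple events.
    class Application {
    public:
        Application(const char *creator, bool launchIfNeeded)
            : fCreator(creator), fErr(noErr) { (void)launchIfNeeded; }

        OSErr err() const { return fErr; }

        // One choke point for every outgoing "message"; with a real ->>
        // operator the compiler would generate this call for us.
        OSErr Send(const std::string &message, const std::string &param = "") {
            std::cout << "to '" << fCreator << "': " << message
                      << "(" << param << ")" << std::endl;
            return noErr;   // a real version would wait for the reply event
        }

    private:
        std::string fCreator;
        OSErr       fErr;
    };

    int main() {
        Application speller("Spel", true);
        OSErr err = speller.err();

        if (!err) err = speller.Send("OpenFile", "MyDocument");
        if (!err) err = speller.Send("SpellCheck");
        return err;
    }

It's still just a function call dressed up, of course, which is part of
why I want a new language and not just another class library.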

There's been a little discussion of document based interfaces.
Instead of copying and pasting between applications, you might choose
"New Document" from the Finder.  You run applications to place
sections of text, graphics, what have you, all over the document.  All
sections are live, and opening the document opens all the applications
needed to edit the document.  I like this idea, and I think it fits in
well with the above.  Not *all* programs would run this way (imagine
your C compiler), but there is some sort of interface here which I'm
just not smart enough to figure out.
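
If I had to guess at the shape of it, it might be something like a
document that is just a list of "parts," each of which knows which
application owns it.  This is entirely made up, obviously:

    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical sketch of a compound document: each live section is a
    // Part that knows which application owns it.  All names are made up.
    class Part {
    public:
        virtual ~Part() {}
        virtual std::string Owner() const = 0;   // which app edits this part
        virtual void Draw() const = 0;           // render the part in place
    };

    class TextPart : public Part {
    public:
        std::string Owner() const { return "WordProcessor"; }
        void Draw() const { std::cout << "[text section]" << std::endl; }
    };

    class ChartPart : public Part {
    public:
        std::string Owner() const { return "ChartingApp"; }
        void Draw() const { std::cout << "[live chart]" << std::endl; }
    };

    class CompoundDocument {
    public:
        void Add(Part *p) { fParts.push_back(p); }
        void Open() const {                      // opening launches every owner
            for (size_t i = 0; i < fParts.size(); ++i) {
                std::cout << "launch " << fParts[i]->Owner() << std::endl;
                fParts[i]->Draw();
            }
        }
    private:
        std::vector<Part*> fParts;
    };

    int main() {
        TextPart text;
        ChartPart chart;
        CompoundDocument doc;
        doc.Add(&text);
        doc.Add(&chart);
        doc.Open();
        return 0;
    }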


Hardware:

I think it's about time that the mac had more graphics horsepower.
I'd like to see a system where you didn't have to work around
QuickDraw to write a video game.  I also want eighty gazillion megs of
RAM with eleventy-four RISC chips running at 100 Mega-GigaHertz.
Really, though, I'd like to see Apple come out with a new line by 1994
with a completely new interface, new, easy-to-program Toolbox, and
enough horsepower to make people drool.  It's not just a matter of
calculating spreadsheets faster.  Faster hardware would allow us much
more luxury in designing a slick interface.  And I think that a
machine like the mac which depends so much on graphics has no business
being slower than a Nintendo or what have you.

By 1994, the mac will be 10 years old.  The Classic is only marginally
faster than the original mac.  People will point out that the Classic
sells because there is a market for it.  I have no argument there.  It
is a great machine for some people.  I'm just saying that Apple can't
expect to stay alive by adding kluge upon kluge to the Mac Toolbox for
the rest of time.  I'd like to see a new product introduced in the
$5000 price range which blows your mind in the same way that the
original mac did.  And as long as they're going to do that, I want
them to do it in such a way that it allows us programmers to write
nifty programs with the least amount of effort.

I may not have said anything that hasn't been said before.  However,
I'd like to hear other people's views on the future of the mac, and
I'd also like to hear any gossip that's floating around.

-Sho
-- 
sho@physics.purdue.edu <<-- we all have our pipe dreams...

gourdol@imag.imag.fr (Gourdol Arnaud) (06/16/91)

In article <5282@dirac.physics.purdue.edu> sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
[Lots of cool stuff about future Apple computers]

Totally agree with you on all points.
1994? I'd say one year earlier than you think.
Be patient.

(BTW, should it still be called a Macintosh?
 Should Apple start examining other apple names?)

Arno.

-- 
    /=============================//===================================/
   / Arno Gourdol.               // On the Netland:  Gourdol@imag.fr  /
  / "A keyboard ! How quaint !"   -- Scott, Star Trek                /
 /=============================//===================================/

peterc@sugar.hackercorp.com (Peter Creath) (06/17/91)

| (BTW, should it still be called a Macintosh?
|  Should Apple start examining other apple names?)
 
Well, they sure can't use Granny Smith (they already used it for the Apple
IIgs).  Maybe they'll do the Apple GD, RD, W, etc.?
 
I think they'll have to move toward other fruits though.

-- 

dorner@pequod.cso.uiuc.edu (Steve Dorner) (06/17/91)

In article <5282@dirac.physics.purdue.edu> sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
>everything straight.  Preemptive multitasking would simplify things.
>On a similar note, I'd love to see protected memory.  When a program
>crashes, I want an alert that says, "core dump."  I don't want the
>entire machine to die.
...
>helpful, but it still doesn't feel right.  I want a new toolbox
>written for a real object oriented language.  Instead of installing a
>wdef proc, we should be able to subclass the window class and override
>the draw method.  
...
>I think it's about time that the mac had more graphics horsepower.

For really good news, write:

	NeXT Computer, Inc.
	900 Chesapeake Drive
	Redwood City, CA 94063

They have exactly what you want.  An entry-level model will cost about the
same as a Mac IIsi.
--
Steve Dorner, U of Illinois Computing Services Office
Internet: s-dorner@uiuc.edu  UUCP: uunet!uiucuxc!uiuc.edu!s-dorner

sho@gibbs.physics.purdue.edu (Sho Kuwamoto) (06/17/91)

In article <grape> dorner@pequod.cso.uiuc.edu (Steve Dorner) writes:
>In article <nuts> sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
>>
>>I want a new toolbox
>>written for a real object oriented language.  Instead of installing a
>>wdef proc, we should be able to subclass the window class and override
>>the draw method.  
>
> [try NeXT]

Can you really do the above (subclassing the window class and
overriding the draw method to change the shape and look of a window)
on the NeXT?  It's a minor point, but I think it says something about
the design philosophy of the library.

A couple people have sent me email telling me that what I'm describing
is a NeXT.  I think it's a neat machine.  However, I'm afraid that
buying a NeXT would not make me disgustingly happy either.  A little
happier, maybe....

What I'm getting at is that the NeXT machine is *not* the kind of
machine that will make us drool in four years.  None of today's
machines are.  I'm hoping that in four years, even the lowly mac will
be able to do some tricks that today's NeXTs can't do.  I also expect
that tomorrow's NeXTs will be improved as well.  If my description
sounded like a NeXT (and upon rereading, it kind of does) it's only
because I lacked the imagination and typing stamina to paint a better
picture.   It means that I'd love to have a NeXT now, but I really
want something even better.

Language:

I'd like to see the toolbox tailored around a nice high-level object
oriented language.  By high-level I mean this: C is low-level.
Smalltalk is (from what I've heard) high-level.  C++ is medium-level.

No matter how many times we call C++ methods messages, they just
aren't messages.  They're functions which have been cleverly
disguised.  That's why these class libraries feel to me as if they
were shell applications written in an object oriented style.  

It seems to me that I should be able to write code that said, "If this
error condition is true, send all incoming messages except clear() to
this other object."

What I really want is a real object oriented library written for this
fictional language.  All method calls would be resolved at runtime,
and all data would be hidden.  This would allow you to do things like
run ResEditII, swap in a replacement for some object, and run the
program without recompiling.  We would be able to swap code like this
since the behavior of an object would be defined strictly by the
behavior of its methods.

The toolbox would be a set of these objects in ROM.  
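
To show what I mean by "resolved at runtime," here's a crude way to fake
it in today's C++: every message is looked up by name at the moment it
is sent, so an object's behavior could be swapped without recompiling
the sender, and an object can forward messages it doesn't want.  The
names here are all made up; this is only an illustration of the idea,
not how Objective C or my fictional language actually works:

    #include <iostream>
    #include <map>
    #include <string>

    // Crude runtime dispatch: messages are strings looked up at send time.
    class Object {
    public:
        typedef void (Object::*Handler)();

        Object() : fDelegate(0), fError(false) {
            AddMethod("clear", &Object::Clear);
        }
        virtual ~Object() {}

        void AddMethod(const std::string &name, Handler h) { fMethods[name] = h; }
        void SetDelegate(Object *o) { fDelegate = o; }
        void SetError(bool e)       { fError = e; }

        // "If this error condition is true, send all incoming messages
        //  except clear() to this other object."
        void Send(const std::string &message) {
            if (fError && fDelegate && message != "clear") {
                fDelegate->Send(message);          // forward it
                return;
            }
            std::map<std::string, Handler>::iterator it = fMethods.find(message);
            if (it != fMethods.end())
                (this->*(it->second))();           // resolved at runtime
            else
                std::cout << "doesNotUnderstand: " << message << std::endl;
        }

    private:
        void Clear() { std::cout << "cleared" << std::endl; }

        std::map<std::string, Handler> fMethods;
        Object *fDelegate;
        bool    fError;
    };

    int main() {
        Object a, b;
        a.SetDelegate(&b);
        a.Send("clear");        // handled by a
        a.SetError(true);
        a.Send("redraw");       // forwarded to b, which doesn't understand it
        a.Send("clear");        // still handled by a
        return 0;
    }

It's ugly, and it's exactly the kind of thing the language and the
Toolbox should be doing for me.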

The NeXT uses an object oriented library, but I've never had the
pleasure of writing for it.  However, knowing what little I know about
Objective C, I feel that it won't be as flexible and elegant as what
I'm envisioning in my head.  

Remember what I said (in my original article) about an object oriented
language in which sending messages to other applications was easy as
sending messages to objects in your own program?  Well, that comes
from some place in the back of my head which tells me that the
programming environment should be integrated with the system in such a
cool way that I couldn't possibly think of it no matter how much fish
I'd eaten.


Interface:

The NeXT machine seems evolutionary to me instead of revolutionary.  I
see it as a sleeker version of the mac.  It's faster, it has a neater
class library, it runs on top of UN*X.  Somehow, this is not the
vision that I have for the next generation of Macs.  If I wanted a
sleeker mac, I could buy a NeXT, I could buy a Sparc with OpenLook
(except for the fact that I hate OpenLook), I could buy an SGI
machine, I could do a lot of things.  When I say, "rewrite the
Toolbox" I mean rewrite it from the metaphor on up, not "get rid of
the bugs and make it nicer."

I don't remember exact dates, but the first Apples came out in the
mid-'70s.  The first IBM PCs came out in the early '80s.  As much as
we like to poke fun at IBM, I think it's fair to say that the IBM PC
was an improvement over the Apple II.

The mac came out in 1984.  It was among the first low-priced
machines with windows.  And it's not just the windows.  The mac seemed
to have a fundamentally different philosophy than the IBMs or the
Apple IIs.  

I think this is lost on many people.  Look at Microsoft Windows.  Some
of the things are done right, some other things are just ridiculous.
For that matter, look at some of the window managers and widget sets
for X windows.  OpenLook is proof positive that Sun doesn't have
anyone in their art department.  Either that or they are kept
incommunicado from their user interface group.  I like the NeXT's
3Dish interface.  Besides looking nice, it makes the objects which you
manipulate with the pointer stand out from the rest of the screen.  In
OpenLook, random things are given 3D shadows.  When you make a menu
selection, an ovalish depression follows your mouse down the menu
indicating which item is currently selected.  

The reason I bring all this up is that we have systems now which have
user interfaces built by copying things from other systems for all the
wrong reasons.  Before this tirade, I was talking about how IBM was
like a better Apple II.  In my view, the NeXT is a better Mac.  Before
you tell me that I'm being blasphemous, remember how much difference
there was between an Apple II and an IBM.  The Apple II had grown to
become a bunch of anachronisms from the days when 4K was a lot of
memory and hard disks were legendary devices known as "Winchesters."
To find out what was on your disk, you typed CATALOG.  There were no
directories, no nothing.  If you wanted a professional OS for the
Apple II, you bought a CP/M card.  If you wanted upper and lower case
or 80 columns, you bought another card.  All real programming was done
in assembly.  The one program I remember as having been written in a
high-level language was Wizardry, which was written in Pascal.

Of course, there are things the Mac does *better* than the NeXT, but
that doesn't really affect my argument.  

I think it's about time that someone shook up the house a little.  I'm
not at all sure that Apple is the company to do it, either.  Maybe
NeXT will provide us with the user interface for the 90s.  Hell, maybe
it'll be Xerox or SRI.  My long term wish for the future of the mac is
an ineffable vision of things being done "the right way."  I don't
know exactly what I mean, but I have some faint notions.  We want to
look at the mouse and decide if we want to keep that.  Then, we want
to look at windows and decide if we want to keep them.  We go on down
this list until we know what we like.  We look at hundreds of programs
and try to pick out the good from the bad.  We try to find new ideas
which would make the interface better.  We wrap this all up in a
beautiful programming environment and top it with a creamy cheese sauce.

-Sho
-- 
sho@physics.purdue.edu

russotto@eng.umd.edu (Matthew T. Russotto) (06/18/91)

In article <1991Jun17.145312.20601@ux1.cso.uiuc.edu> dorner@pequod.cso.uiuc.edu (Steve Dorner) writes:

>>I think it's about time that the mac had more graphics horsepower.
>
>For really good news, write:
>
>	NeXT Computer, Inc.
>	900 Chesapeake Drive
>	Redwood City, CA 94063
>
>They have exactly what you want.  An entry-level model will cost about the
>same as a Mac IIsi.

Right, and it might have a lot of horsepower, but no real ability to see it,
considering you don't have color.

--
Matthew T. Russotto	russotto@eng.umd.edu	russotto@wam.umd.edu
     .sig under construction, like the rest of this campus.

ken@dali.cc.gatech.edu (Ken Seefried iii) (06/18/91)

In article <1991Jun17.183500.10303@eng.umd.edu> russotto@eng.umd.edu (Matthew T. Russotto) writes:
>In article <1991Jun17.145312.20601@ux1.cso.uiuc.edu> dorner@pequod.cso.uiuc.edu (Steve Dorner) writes:
>
>>>I think it's about time that the mac had more graphics horsepower.
>>
>>For really good news, write:
>>
>>	NeXT Computer, Inc.
>
>Right, and it might have a lot of horsepower, but no real ability to see it,
>considering you don't have color.
>

Except, of course, on the colour NeXTs introduced, oh, 4-6 months
ago.  16-bit colour in the base model (which, as I recall, is somewhat
more expensive than the Mac IIsi).  You might want to keep up
on what's going on outside the Mac arena once in a while, especially
if you're going to be posting to the net.

N.B. - Before anyone starts the inevitable, can the Mac-v-NeXT flames
       go to alt.computers.religion?  It's getting to be a monthly
       thing...
--

	ken seefried iii	"I'll have what the gentleman 
	ken@dali.cc.gatech.edu	 on the floor is having..."

jimc@isc-br.ISC-BR.COM (Jim Cathey) (06/18/91)

In article <22654@imag.imag.fr> gourdol@imag.imag.fr (Gourdol Arnaud) writes:
>(BTW, should it still be called a Macintosh?
> Should Apple start examining other apple names?)

Sure!  If they manage to do it right and remove all the pesky archaisms
that are left over from making a stripped-down Lisa, as well as making
it blindingly fast, affordable, and perfect in every way...

My vote for its name would be

"Seek-no-further!"

Fits right in.  It's a name (or at least nickname) for a variety of apple,
plus it's an advertisement.  Remember, you saw it here first!

+----------------+
! II      CCCCCC !  Jim Cathey
! II  SSSSCC     !  ISC-Bunker Ramo
! II      CC     !  TAF-C8;  Spokane, WA  99220
! IISSSS  CC     !  UUCP: uunet!isc-br!jimc (jimc@isc-br.isc-br.com)
! II      CCCCCC !  (509) 927-5757
+----------------+
			"With excitement like this, who is needing enemas?"

ksand@apple.com (Kent Sandvik) (06/18/91)

In article <1991Jun17.145312.20601@ux1.cso.uiuc.edu>, dorner@pequod.cso.uiuc.edu (Steve Dorner) writes:
> For really good news, write:
> 
> 	NeXT Computer, Inc.
> 	900 Chesapeake Drive
> 	Redwood City, CA 94063
> 
> They have exactly what you want.  An entry-level model will cost about the
> same as a Mac IIsi.

For the moment it seems like that's the only place where you could buy one :-).

Kent
..just joking...

dorner@pequod.cso.uiuc.edu (Steve Dorner) (06/18/91)

In article <5294@dirac.physics.purdue.edu> sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
>fictional language.  All method calls would be resolved at runtime,
>and all data would be hidden.  This would allow you to do things like
>run ResEditII, swap in a replacement for some object, and run the
>program without recompiling.  We would be able to swap code like this
>since the behavior of an object would be defined strictly by the
>behavior of its methods.

Stop it.  Stop describing NeXT machines.  Just buy one :-).

>Of course, there are things the Mac does *better* than the NeXT, but
>that doesn't really affect my argument.  

I won't do it.  I just won't bite on this one.  I use both systems all day
long, but I just won't bite.

I also don't want to start a NeXT vs Mac war.  Sho just keeps talking about
his "ideal" computer, and it sounds so strikingly like the cube that I just
had to point it out.
--
Steve Dorner, U of Illinois Computing Services Office
Internet: s-dorner@uiuc.edu  UUCP: uunet!uiucuxc!uiuc.edu!s-dorner

jtn@potomac.ads.com (John T. Nelson) (06/18/91)

>Except, of course, on the colour NeXTs introduced, oh, 4-6 months
>ago.  16-bit colour in the base model (which, as I recall, is somewhat
>more expensive than the Mac IIsi).  You might want to keep up
>on what's going on outside the Mac arena once in a while, especially
>if you're going to be posting to the net.

Worse... the NeXT color doesn't have a color lookup table!  That means
any animation on the machine has to be painstakingly drawn.  No
animation tricks by modifying color lookup table slots.

I can't imagine anyone doing color without a CLUT.  It's just too
useful.
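
For anyone who hasn't played with it, the trick looks like this in
generic terms (no real Toolbox calls, just the idea): the pixels store
small indices, the hardware maps them through the CLUT on every refresh,
so rewriting a handful of table entries animates the whole screen
without touching a single pixel.

    #include <cstdio>

    // Generic illustration of color-table animation (not real Toolbox code).
    struct RGB { unsigned char r, g, b; };

    const int kEntries = 256;
    RGB gClut[kEntries];            // stands in for the hardware lookup table

    // Rotate a range of palette entries by one slot: classic color cycling.
    void CycleClut(int first, int last) {
        RGB saved = gClut[last];
        for (int i = last; i > first; --i)
            gClut[i] = gClut[i - 1];
        gClut[first] = saved;
    }

    int main() {
        for (int i = 0; i < kEntries; ++i)           // simple gray ramp
            gClut[i].r = gClut[i].g = gClut[i].b = (unsigned char)i;

        for (int frame = 0; frame < 3; ++frame) {
            CycleClut(16, 31);                       // 16 writes per frame
            std::printf("frame %d: entry 16 is now %d\n", frame, gClut[16].r);
        }
        return 0;
    }

On a true-color framebuffer there's no table to rewrite, so you're stuck
redrawing pixels instead.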

Lawson.English@p88.f15.n300.z1.fidonet.org (Lawson English) (06/18/91)

Peter Creath writes in a message to All

PC> | (BTW, should it still be called a Macintosh?
PC> |  Should Apple start examining other apple names?)
PC>
PC> Well, they sure can't use Granny Smith (they already used it for the
PC> Apple IIgs).  Maybe they'll do the Apple GD, RD, W, etc.?  I think
PC> they'll have to move toward other fruits though.

How about "The Banana Jr. 9000?"


Lawson
 

--  
Uucp: ...{gatech,ames,rutgers}!ncar!asuvax!stjhmc!300!15.88!Lawson.English
Internet: Lawson.English@p88.f15.n300.z1.fidonet.org

peterc@sugar.hackercorp.com (Peter Creath) (06/19/91)

| >>I think it's about time that the mac had more graphics horsepower.
| >
| >For really good news, write:
| >
| >        NeXT Computer, Inc.
| >        900 Chesapeake Drive
| >        Redwood City, CA 94063
| >
| >They have exactly what you want.  An entry-level model will cost about
| >the same as a Mac IIsi.
|
| Right, and it might have a lot of horsepower, but no real ability to
| see it, considering you don't have color.
 
Where have you been for the past year?  Tibet?  There are currently two
color models of the NeXT out, and a card for the cube.  They all have
32-bit color.


-- 

duggie@pengyou (Doug Felt) (06/19/91)

In article <5294@dirac.physics.purdue.edu> sho@gibbs.physics.purdue.edu (Sho  
Kuwamoto) writes:
>What I'm getting at is that the NeXT machine is *not* the kind of
>machine that will make us drool in four years.  None of today's
>machines are.

I agree.  You can't discount horsepower completely though.  100 Mips with lots  
of RAM, 24-bit color, real-time audio/video compression and decompression, and  
so on will make a lot of people drool.  But I take your main point to be that  
the programming model for today's machines, and their user interfaces, are  
starting to look a lot alike, and that these are not satisfactory.  Amen.

>The NeXT machine seems evolutionary to me instead of revolutionary.  I
>see it as a sleeker version of the mac.  It's faster, it has a neater
>class library, it runs on top of UN*X.

Interface Builder is a small step in the right direction.  Mach, Display
PostScript, and Objective-C are real plusses.  Evolution is a good thing, and I
think NeXTs are very cool.  But they're not all that innovative, especially as  
far as user interface goes.

>The reason I bring all this up is that we have systems now which have
>user interfaces built by copying things from other systems for all the
>wrong reasons.  Before this tirade, I was talking about how IBM was
>like a better Apple II.  In my view, the NeXT is a better Mac.

Copying things because doing otherwise is too great a risk.  Too bad.

>I think it's about time that someone shook up the house a little.  I'm
>not at all sure that Apple is the company to do it, either.  Maybe
>NeXT will provide us with the user interface for the 90s.  Hell, maybe
>it'll be Xerox or SRI.  My long term wish for the future of the mac is
>an ineffable vision of things being done "the right way."  I don't
>know exactly what I mean, but I have some faint notions.  We want to
>look at the mouse and decide if we want to keep that.  Then, we want
>to look at windows and decide if we want to keep them.  We go on down
>this list until we know what we like.

You should check out GO Corporation's pen-based operating system, PenPoint.
They were featured in BYTE earlier this year, February, I think.  There is also  
a book called "The Power of PenPoint" which gives a quick overview of  
PenPoint's main features.

PenPoint has plenty of innovations in terms of the user interface and the  
underlying object model.  The most obvious user interface innovation to my mind  
is the notebook metaphor, presenting all documents to the user as pages in a  
notebook.  The main innovation of the underlying model is treating portions of  
documents as instances of applications.  Thus a document (from the user's point  
of view) might contain both text and graphics, allowing for easy editing of  
each, and supporting seamless 'copy and paste' between the text and graphics  
portions of the document.  Yet the text editing code and the graphics editing  
code may be completely separate, purchased independently by the user in  
response to her needs for different functionality.

PenPoint also, of course, supports the pen as an input device.  It has  
universal object ids to uniquely identify documents across all machines running  
PenPoint.  It has communications protocols to support networking when the 'net  
connection' might be made or broken at random -- i.e. when you carry your
portable machine out of range of the local radio-based network in your  
building.  It has a flat memory model in which you treat files as though  
they're in RAM, not on disk (because on most machines this will be the case).   
Lots of interesting ideas.  The risk, of course, is that the implementation may  
be deficient in ways that are not clear because not enough people have tried  
doing stuff with it yet.  I have not read real docs on the system, only the  
above-mentioned article and book, which are sketchy, and there are some details  
that are worrisome.  What is clear, however, is that GO is trying to be really  
innovative, and not just copy existing models of user interfaces and  
object-oriented toolboxes.  Bravo!

I doubt Apple would be willing to abandon the Finder and the  
application/document structure we're all so used to.  I think they should, but  
then Apple is not the company it was ten years ago.  I think we're going to  
have to look to smaller companies to take the risks the bigger players won't.   
Let's hope they get the breaks Apple got that helped the Mac get established.

>-Sho
>-- 
>sho@physics.purdue.edu

Doug Felt
Courseware Authoring Tools Project
Stanford University

peterc@sugar.hackercorp.com (Peter Creath) (06/19/91)

Well, I just got through with playing with a NeXTstation Color for a few
hours, and, well, I was stunned at the utter lack of software.  Maybe it's
just that all the good stuff is really expensive.  I'll admit it's a damn
powerful computer, but compared to the Mac's software base at a comparable
age, it's pretty weak.  Of course it would help if someone high up at NeXT
got their thumb out and started ADVERTISING.  Do you realize how many
non-computer type people out there have even HEARD of NeXT?  ("Next what?"
will more than likely be their answer)  You'd be hard pressed to find one.
If you can find two or more, you're a phenomenon.


-- 

d88-jwa@cyklop.nada.kth.se (Jon W{tte) (06/19/91)

> duggie@pengyou (Doug Felt) writes:

   I doubt Apple would be willing to abandon the Finder and the  
   application/document structure we're all so used to.  I think
   they should, but then Apple is not the company it was ten years
   ago.  I think we're going to have to look to smaller companies


You obviously weren't at the WWDC, where the ATG showed a little of
how they look at the notebook user interface. They seem to lean more
on a HyperCard style interface, but with pencil additions. Of course,
it was a working prototype they had, and god knows what, when, and if
anything will come of it.

There was also some talk on the document metaphor at the WWDC.

--
						Jon W{tte
						h+@nada.kth.se
						- Speed !

evensen@husc9.harvard.edu (Erik Evensen) (06/20/91)

In article <1991Jun18.135838.3444@potomac.ads.com> jtn@potomac.ads.com (John T. Nelson) writes:

   Worse... the NeXT color doesn't have a color lookup table!  That means
   any animation on the machine has to be painstakingly drawn.  No
   animation tricks by modifying color lookup table slots.

   I can't imagine anyone doing color without a CLUT.  It's just too
   useful.

Well, I hate to be a stick in the mud but to set the record straight,
it is my understanding, after reading the Palette Manager section of
Inside Mac VI that direct devices (i.e., those whiz-bang 24 bit
colorcards) do not have CLUT's either.  To quote: "Color table
animation doesn't work on a direct device -- it has no color table."
(p. 20-11)  Now maybe there's a trick to getting a direct device to
act like one which has a CLUT but I haven't seen anything about it;
correct me if I'm wrong.  My opinion is that if you want to write the
most compatible software, it would be wise not to use CLUT animation.

Don't get me wrong, I love the Mac and am strongly resisting the urge
to buy a NeXT...

--Erik (evensen@husc.harvard.edu)

freek@fwi.uva.nl (Freek Wiedijk) (06/20/91)

sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
>On a similar note, I'd love to see protected memory.  When a program
>crashes, I want an alert that says, "core dump."  I don't want the
>entire machine to die.

No.  You want an alert that says: "your program crashed."

Freek "the Pistol Major" Wiedijk                      E-mail: freek@fwi.uva.nl
#P:+/ = #+/P?*+/ = i<<*+/P?*+/ = +/i<<**P?*+/ = +/(i<<*P?)*+/ = +/+/(i<<*P?)**

jtn@potomac.ads.com (John T. Nelson) (06/20/91)

>Don't get me wrong, I love the Mac and am strongly resisting the urge
>to buy a NeXT...
>
>--Erik (evensen@husc.harvard.edu)


I'm not.  Me want!



=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
ORGANIZATION:  BLETCH (Bletcherous League of Evil Twisted Computer Hackers)
UUCP:          kzin!speaker@mimsy.umd.edu  INTERNET:   jtn@potomac.ads.com
SPOKEN:        Dark Hacker                 PHONE:      (703) 243-1611
FAVORITE MAGAZINE:   Midnight Engineer
TWISTED DIABOLICAL LAUGH:	Mwahh ah ha ha hah ha ha ha!

The Mythos of Dark Hacker:

"Controlled by the sinister and shadowy "suits" Dark Hacker now employs
the tools of computer science to free himself from the suit's will.
By day he is a lackey... but at night when the city sleeps he
becomes.... DARK HACKER!"
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

zben@ni.umd.edu (Ben Cranston) (06/21/91)

In article <EVENSEN.91Jun20083252@husc9.harvard.edu>
evensen@husc9.harvard.edu (Erik Evensen) writes:

> Well, I hate to be a stick in the mud but to set the record straight,
> it is my understanding, after reading the Palette Manager section of
> Inside Mac VI that direct devices (i.e., those whiz-bang 24 bit
> colorcards) do not have CLUT's either.  To quote: "Color table
> animation doesn't work on a direct device -- it has no color table."
> (p. 20-11)  Now maybe there's a trick to getting a direct device to
> act like one which has a CLUT but I haven't seen anything about it;
> correct me if I'm wrong.  

OK!   :-)

I understand all mainline Apple video cards have a Gamma correction table,
and that it is at least theoretically possible to do color table animation
of sorts by modifying this table on the fly.  I've never done so and hope
to never *have* to do so.

BTW "Gamma" refers to the non-linear relationship between electron beam
current and perceived light intensity.  A "Gamma table" is a lookup table
that (somehow) provides a compensation function.  Supposedly you want a
linear Gamma function if you're taking a picture of the screen, because
film's response is linear with number of incident photons, but for viewing
you want a different function.  Try pushing the "options" button in the
Monitors control panel and see what kind of gamma choices your particular
monitor supports.
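
In case it helps, the usual working model is a power law: what the tube
puts out goes roughly as (input value)^gamma, so a compensation table
applies the inverse power before the value reaches the DAC.  A purely
illustrative sketch (made-up code, not what Apple's cards actually do):

    #include <cmath>
    #include <cstdio>

    // Illustrative gamma-compensation table, assuming a power-law display.
    void BuildGammaTable(double gamma, unsigned char table[256]) {
        for (int i = 0; i < 256; ++i) {
            double wanted    = i / 255.0;                     // intended intensity
            double corrected = std::pow(wanted, 1.0 / gamma); // pre-distort it
            table[i] = (unsigned char)(corrected * 255.0 + 0.5);
        }
    }

    int main() {
        unsigned char table[256];
        BuildGammaTable(2.2, table);   // 2.2 is just a commonly quoted CRT figure
        std::printf("wanted 128/255 -> send %d/255 to the DAC\n", table[128]);
        return 0;
    }

With gamma set to 1.0 the table degenerates to an identity ramp, which
is what a "linear" choice in the Monitors options amounts to.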

nagle@well.sf.ca.us (John Nagle) (06/21/91)

     The Mac needs a protected-mode operating system desperately.  The
existing systems have too low an MTBF.  Attempting to do multitasking
on a machine without memory just doesn't work.  MTBF must be brought
up to at least 30 days running the Monkey with multiple applications
with every DA and init in widespread use loaded.

     Bringing out System 7 without protected mode so that it would work
on that boat-anchor luggable was a terrible mistake.  Apple may have blown
its chance to make its machines that cost like workstations work like
workstations.  

					John Nagle

(and no, A/UX, with its 92-step installation process, unprotected Mac 
environment, and slow operation, isn't a solution.)

sho@gibbs.physics.purdue.edu (Sho Kuwamoto) (06/21/91)

In article <pear> freek@fwi.uva.nl (Freek Wiedijk) writes:
>sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:
>>On a similar note, I'd love to see protected memory.  When a program
>>crashes, I want an alert that says, "core dump."  I don't want the
>>entire machine to die.
>
>No.  You want an alert that says: "your program crashed."

Touche.  Actually, I want an alert that says:

  "The application Foo has unexpectedly crashed.  Would you like to
   save information about this crash?  (It will only be useful to
   expert programmers.)"

"Don't Save" would be the default button.

-Sho
-- 
sho@physics.purdue.edu

gibson@lespaul.Princeton.EDU (John Gibson) (06/22/91)

In article <25555@well.sf.ca.us> nagle@well.sf.ca.us (John Nagle) writes:
>
>Attempting to do multitasking
>on a machine without memory just doesn't work.

I'll say! ;-)

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
John Gibson                           Princeton Univ. Dept. of Music
gibson@silvertone.Princeton.EDU                                    

speck@gorm.ruc.dk (Peter Speck) (06/25/91)

sho@gibbs.physics.purdue.edu (Sho Kuwamoto) writes:

>Touche.  Actually, I want an alert that says:
>
>  "The application Foo has unexpectedly crashed.  Would you like to
>   save information about this crash?  (It will only be useful to
>   expert programmers.)"

>"Don't Save" would be the default button.

I would like to add another button (being the default button):
  "Try to save changes"
This could be done by installing a "crash save" proc (e.g. the parameter
to InitDialogs) that would be called when the user pressed this button.
I think that in many APPLs, one could save most of the stuff (in a
new file) without crashing again.
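
Roughly what I have in mind (the registration call here is made up; the
real hook could be something like the resume proc you already pass to
InitDialogs):

    #include <cstdio>

    typedef void (*CrashSaveProc)();

    static CrashSaveProc gCrashSaveProc = 0;

    // The application installs its emergency-save routine once, at startup.
    void InstallCrashSaveProc(CrashSaveProc proc) { gCrashSaveProc = proc; }

    // What the system would do when the user hits "Try to save changes".
    void SystemHandleCrash() {
        if (gCrashSaveProc)
            gCrashSaveProc();   // should write to a NEW file and touch as
                                // little of the damaged state as possible
    }

    // --- application side ---
    static void MyEmergencySave() {
        std::printf("writing recovery copy: MyDocument.recovered\n");
    }

    int main() {
        InstallCrashSaveProc(MyEmergencySave);
        SystemHandleCrash();    // pretend the user pressed the button
        return 0;
    }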

                       {{{{{{{{{{ speck@dat.ruc.dk }}}}}}}}}}}
                       {{{{{{{{{{{{ Peter  Speck }}}}}}}}}}}}}
                       {{{{{ Roskilde UniversitetsCenter }}}}}
                       {{{{{{{{{{{{{{{ Denmark }}}}}}}}}}}}}}}

zben@ni.umd.edu (Ben Cranston) (06/27/91)

In article <1991Jun20.172609.9795@ni.umd.edu> I wrote:

> ...  Supposedly you want a
> linear Gamma function if you're taking a picture of the screen, because
> film's response is linear with number of incident photons, ...

I received the following correction via email.  When I wrote back suggesting
it would make an excellent network followup, he gave me permission to repost
the item.  Posted by permission of author:

Date: Fri, 21 Jun 91 12:58:31 PDT
From: Theron Trowbridge <trowbrid@girtab.usc.edu>

Uh, is that really what gamma is defined as in the computer world?
Gamma for a film stock is MOSTLY linear.  The gamma describes, essentally,
the contrast of an image.  The number assigned to a Gamma describes its
slope as drawn on a sensitometry curve for a film stock.  The graph describes
the relationship (as you say) to the amount of photons hitting the film
it takes to cause a certain change in the desnity of exposed silver salts
on the film.  The higher the gamma, the steeper the slope, and the more
exposure it takes to cause a small change in density; this means there is
a higher contrast.  Lower gamma means, conversly, lower contrast.

However, there are curved parts of this graph at the top and bottom of the
slope.  They are kinda outside the "dynamic range" of the film.  But they,
too, are affected by the film's gamma.  Minor thing, though, and probably
parenthetical to most computer applications.
Video gamma is essentially the same thing: exposure vs. intensity.

The only thing is, it takes a two-fold change in intensity for the
human eye to notice a significant change.  Each doubling is an apparently
equal increment.  A film's gamma is an exponential curve, but its main
purpose is not to describe this exponential relationship; rather, it has this
correction built into it so the relationship between exposure and intensity
can be better demonstrated...

Ugh.  It always steams me to find out that the engineers have taken another
ages-old term and re-defined it for their own purposes...

(I hope this makes some sense to you.  I was probably a teacher in a former
life and now I feel the urge to explain things to people.  This is not a
flame)

BTW - what photographers consider a "perfect" gamma -- the one that most
accurately reproduces its subject as the human eye sees it -- is 0.5.
Gamma theoretically ranges from 0.0 to 1.0, and 0.5 is dead center:
a perfect 45-degree slope on the E-vs-D curve.

-Theron Trowbridge
trowbrid@girtab.usc.edu
AppleLink: D7029