[comp.sys.apple] Why Keep the //....

jm7e+@andrew.cmu.edu (Jeremy G. Mereness) (03/11/89)

I wish some people at Apple Co. could read this.

I have heard arguments from across the board about the Apple // being a
"dead architecture" (from MacWeek) and too old and so forth. However, I
have had a chance to talk to fellow students in Carnegie-Mellon's CS and
Computer Engineering Departments. Almost ALL of those I talked to got their
start on an Apple //. The exceptions were a few Pets, Ataris, and
Commodores, but ALL of them were familiar with the // and liked it when they
worked with it.

They agree that a 12-year-old, if given a Mac or a PS/2, will NEVER learn
programming or computer science.

The Apple // may not be built like a workstation the way the Mac is, designed
to handle raw power with little overhead, but the // has more to offer the
young programmer than any other machine. The logical progression from BASIC to
machine language to a first assembler was crucial for many CS and CE
majors today. Those I have asked say this is NOT possible now that the
// is disappearing and the Mac has taken over. The Apple // generation will
be the last generation of its kind in CS. If this sounds rash, consider trying
to interest a 14-year-old in UNIX. The next generation will look at computers
as appliances: things to be used without a clue about, much less an interest
in, how they work.

* This, if for NO OTHER REASON, is why the Apple // should survive *

...and is one of the reasons that so many people remain loyal to the
machine. This, in my opinion, is the equation that the marketers in Apple's
high offices have neglected.

So, Apple, let's see what your labs, your Cray, and your engineers can do.
Let's see a fast, capable OS, speed that doesn't discourage, and some Pride.

jeremy mereness
jm7e+@andrew.cmu.edu (arpa)

ALBRO@NIEHS.BITNET (03/12/89)

jm7e@andrew.cmu.edu (arpa) { Jeremy Mereness } writes:

>The next generation will look at computers as appliances, things to be used
>without a clue about, let alone an interest in, how they work.

When I was 12 to 16 years old, I made film by spreading silver chloride/bromide
in gelatin on glass plates, to use in my pin-hole camera, and developed it in
solutions made from the raw chemicals.  I made crystal (galena) radios and my
own test equipment.  And I learned to write BASIC and assembly programs on a II.
Now in my 40s to 50s, I still write programs thanks to the IIe and IIgs, but
I buy my camera, rarely if ever listen to the radio (but would buy one if I
did), and send my film to Eckerts to be developed.

This sort of thing is inevitable.  It happened with photography and radio, and
it's happening with computers.  There are still a few of us who subscribe
to Radio-Electronics, and if I had any use for black-and-white photographs I
would still develop them myself, but in general only a small minority is
going to delve into the nitty-gritty of things once those things are fully
reliable and, in the case of personal computers, supplied with excellent
programs that took a team of professional programmers two years to write.
That's why most junior high and high schools let kids use calculators.

krazy@claris.com (Jeff Erickson) (03/14/89)

From article <AY6=r5y00W0dQ7=mpR@andrew.cmu.edu>, by jm7e+@andrew.cmu.edu (Jeremy G. Mereness):
> 
> I wish some people at Apple Co. could read this.

Me too!
> 
> I have heard arguments from across the board about the Apple // being a
> "dead architecture" (from MacWeek) and too old and so forth. However, I
> have had a chance to talk to fellow students in Carnegie-Mellon's CS and
> Computer Engineering Departments. Almost ALL of those I talked to got their
> start on an Apple //. The exceptions were a few Pets, Ataris, and
> Commodores, but ALL of them were familiar with the // and liked it when they
> worked with it.
> 
> They agree that a 12-year-old, if given a Mac or a PS/2, will NEVER learn
> programming or computer science.
> 
> The Apple // may not be built like a workstation the way the Mac is, designed
> to handle raw power with little overhead, but the // has more to offer the
> young programmer than any other machine. The logical progression from BASIC to
> machine language to a first assembler was crucial for many CS and CE
> majors today. Those I have asked say this is NOT possible now that the
> // is disappearing and the Mac has taken over. The Apple // generation will
> be the last generation of its kind in CS. If this sounds rash, consider trying
> to interest a 14-year-old in UNIX. The next generation will look at computers
> as appliances: things to be used without a clue about, much less an interest
> in, how they work.

The "logical progression" from BASIC to machine language?  Pardon?  I guess
this is all a matter of opinion.  More and more CS and CE majors these days
are going through life without ever having to deal with assembly language
much at all, and NEVER having to deal with machine language directly.  I don't
think it's necessary to learn exactly what the computer's doing with your
code (machine language) before learning how to make your code WORK.  I do
think it's important to have some exposure to assembly language, but I
don't think it has to be that soon, unless of course that's what the kid
wants.

I went from Applesoft BASIC to Pascal instead of to assembly.  So I'm weird.
> 
> * This, if for NO OTHER REASON, is why the Apple // should survive *
> 
> ...and is one of the reasons that so many people remain loyal to the
> machine. This, in my opinion, is the equation that the marketers in Apple's
> high offices have neglected.
> 
> So, Apple, let's see what your labs, your Cray, and your engineers can do.
> Let's see a fast, capable OS, speed that doesn't discourage, and some Pride.

If the Mac were simpler, or a simple environment could be introduced onto
it, I'd prefer a programmable Mac, simply because it does more.  I don't
mean HyperCard, either.  (What kind of programming language doesn't have
arrays?)  But of all the computers on the market right now, the one I'd buy
for my kids to play on is an Apple //.

Not a GS, but still an Apple //.

You've raised a good point.  Is Apple going to make its rumored K-12 Mac
programmable by 12-year-olds?  Or is HyperTalk as good as it gets???  Anyone
from Apple out there?

-- 
Jeff Erickson     \  Internet: krazy@claris.com          AppleLink: Erickson4
Claris Corporation \      UUCP: {ames,apple,portal,sun,voder}!claris!krazy
415/960-2693        \________________________________________________________
____________________/              "I'm so heppy I'm mizzabil!"

REWING@TRINCC.BITNET (03/15/89)

We at Apple do read our mail.  And thanks.

--Rick Ewing
  Apple Computer, Atlanta

dseah@wpi.wpi.edu (David I Seah) (03/15/89)

>From article <AY6=r5y00W0dQ7=mpR@andrew.cmu.edu>, by jm7e+@andrew.cmu.edu (Jeremy G. Mereness):
>
>> The Apple // may not be built like a workstation the way the Mac is, designed
>> to handle raw power with little overhead, but the // has more to offer the
>> young programmer than any other machine. The logical progression from BASIC to
>> machine language to a first assembler was crucial for many CS and CE
>> majors today. Those I have asked say this is NOT possible now that the
>> // is disappearing and the Mac has taken over. The Apple // generation will
>> be the last generation of its kind in CS. If this sounds rash, consider trying
>> to interest a 14-year-old in UNIX. The next generation will look at computers
>> as appliances: things to be used without a clue about, much less an interest
>> in, how they work. [STUFF DELETED]

I wish I had applied to Carnegie now!  This school is narced up with AT&T
PC6300s! Yuk.

Krazy@Claris.com (Jeff Erickson) responds
>
>The "logical progression" from BASIC to machine language?  Pardon?  I guess
>this is all a matter of opinion.  More and more CS and CE majors these days
>are going through life without ever having to deal with assembly language
>much at all, and NEVER having to deal with machine language directly.  I don't
>think it's necessary to learn exactly what the computer's doing with your
>code (machine language) before learning how to make your code WORK.  I do
>think it's important to have some exposure to assembly language, but I
>don't think it has to be that soon, unless of course that's what the kid
>wants.
>
>I went from Applesoft BASIC to Pascal instead of to assembly.  So I'm weird.

I too started with Applesoft, then went to machine language.  In those years,
a strong emphasis on the Apple II was game development.  After diddling with
slow Applesoft shapes, fast animation was my goal.  So the step to machine
language was "logical" for me.

If I had had a Commodore 64, I would most likely be a rabid Amiga game player
today, majoring in English Lit or Physics.  The accessibility of the Apple
II's internals (easily popped case, machine language monitor) made me familiar
and comfortable with the physical machine itself, and coaxed me into Computer
Engineering to learn more.  My Apple II background has given me an incredible
advantage over my classmates in CS and Comp Eng courses.  I think that the
machine language monitor is probably the best single feature of the Apple,
because it taught me how memory is organized, how data is represented in
memory, the concept of memory-mapped I/O, the indistinguishability of code
and data, etc.  This stuff is not as important to the Pascal programmer, I
imagine, because the applications they write can depend on libraries that
handle machine-specific features.  But who's gonna write those libraries if
CSs and CEs (not Civil) can squeeze through college with minimal exposure to
assemblers?  What kind of computer engineer would someone be with little
knowledge of machine language?  You would get an engineer who designs a
computer that really, really _sucks_ to program at the ML level!  Then you get
nasty firmware and system tools, fraught with bugs and bombs...
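
For anyone who never played with it, here is roughly what that looks like.
This is a sketch from memory, not a verbatim session -- the monitor prompt is
"*", and CALL -151 gets you there from Applesoft:

  ]CALL -151                enter the monitor from Applesoft BASIC
  *C000                     examine the keyboard register: memory-mapped I/O,
  C000- A0                  the "register" is just an address (value varies)
  *300:A9 C1 8D 00 04 60    type six bytes of *data* into memory at $300
  *300L                     list those same bytes as *code*:
                              LDA #$C1 / STA $0400 / RTS
  *300G                     run them: an "A" appears at the top-left of the
                            text screen, because the screen itself is memory
                            starting at $0400

Memory layout, data representation, memory-mapped I/O, and code-is-data are
all right there in half a dozen keystrokes.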

| <<<<<(((((( DAVE SEAH ))))))>>>>> |	Internet:  dseah@wpi.wpi.edu
| Worcester  Polytechnic  Institute |	Bitnet:	   dseah@wpi.bitnet
| Computer Engineering Class of '90 |	ALink PE:  Omnitreant

jm7e+@ANDREW.CMU.EDU ("Jeremy G. Mereness") (03/15/89)

>> The Apple // may not be built like a workstation the way the Mac is, designed
>> to handle raw power with little overhead, but the // has more to offer the
>> young programmer than any other machine. The logical progression from BASIC to
>> machine language to a first assembler was crucial for many CS and CE
>> majors today.

> More and more CS and CE majors these days
>are going through life without ever having to deal with assembly language
>much at all, and NEVER having to deal with machine language directly.

Sad, but true. I am discussing the Golden Days, when hacking was for real and
code was optimized.

>I went from Applesoft BASIC to Pascal instead of to assembly.  So I'm weird.

Actually, that's what I meant, although I spent some time with the Apple //
Reference Manual making sense of the monitor until I got Merlin. The incentive,
of course, was that BASIC was too slow for the games we wanted to write. It was
a great platform to try things out in, but its sluggishness made assembly a must
if the games were ever going to work. Nibble, Softalk, and InCider helped, too.
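
To put rough numbers on "too slow" (ballpark figures, not measurements): filling
the 8K hi-res screen from Applesoft with a POKE loop takes on the order of half
a minute at 1 MHz, while a short machine language loop does the same job in a
small fraction of a second. Something like:

  10 HGR : FOR I = 8192 TO 16383 : POKE I,255 : NEXT

versus a sketch of the 6502 equivalent (assumes zero-page $06/$07 are free):

        LDA #$00
        STA $06          ; pointer = $2000, start of hi-res page 1
        LDA #$20
        STA $07
        LDA #$FF         ; fill byte
        LDY #$00
  LOOP  STA ($06),Y      ; store one byte
        INY
        BNE LOOP         ; 256 bytes per page
        INC $07          ; next page
        LDX $07
        CPX #$40         ; stop after $3FFF
        BNE LOOP
        RTS

No amount of cleverness in the BASIC version closes that gap, which is why the
monitor and Merlin were worth the effort.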

>If the Mac were simpler, or a simple environment could be introduced onto
>it, I'd prefer a programmable Mac, simply because it does more.  I don't
>mean HyperCard, either.  (What kind of programming language doesn't have
>arrays?)  But of all the computers on the market right now, the one I'd buy
>for my kids to play on is an Apple //.

*sigh* I could go for a Mac //cx with a fast Apple //GS in one of the three
NuBus slots.... the GS still has an old // in it....



jeremy mereness
=============
jm7e+@andrew.cmu.edu (Arpanet)
r746jm7e@CMCCVB (vax.... Bitnet)

DANFUZZ@BROWNVM.BITNET (Dan Bornstein) (03/16/89)

I am another case of "If I didn't have a ][ I wouldn't be a programmer."
As far as I'm concerned the non-windows //-series is still the best environment
for a "budding" programmer: You have BASIC to start with (actually, I had
INTEGER...), and when you outgrow it, you can start ADDING machine language
with the MiniAssembler, and finally, if you're so inclined, switch completely
to machine.  My first real machine language program (i.e. not a 10-line dealie)
was an outgrowth of an Integer program whose routines I converted to machine
one by one with the MiniAssembler alone.  I could not have done that on any
other computer I know.
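
For anyone who never tried it, the mechanics are worth spelling out. Dan had
Integer BASIC; this sketch is in Applesoft terms and from memory, so treat the
details loosely. You keep the BASIC program in charge and replace one subroutine
at a time with a CALL to machine code sitting in spare memory at $300:

  Before:  100 FOR I = 1 TO 10 : PRINT CHR$(7); : NEXT : RETURN
  After:   100 CALL 768 : RETURN

where 768 ($300) holds a routine typed in through the monitor or MiniAssembler:

  0300-  LDX #$0A       ; ten beeps
  0302-  LDA #$87       ; bell character, high bit set
  0304-  JSR $FDED      ; COUT, the ROM's character-output routine
  0307-  DEX
  0308-  BNE $0302
  030A-  RTS

The rest of the program never knows the difference, which is exactly what makes
the one-routine-at-a-time conversion described above possible.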

Sure, now I want a Mac, or a NeXT, but that's not the point; without something
like the //, programming would probably be a lot more foreign to me.


-dan

BitNet:   DanFuzz@BrownVM.BitNet
Internet: DanFuzz@BrownVM.Brown.Edu
EtherNet: Find me a long enough cable and I'll see what we can do.

kamath@reed.UUCP (Sean Kamath) (03/23/89)

OK, folks, this will probably be my last set of blathering for a
while.  This is the last half of my second senior semester, and at
Reed they make you write a senior thesis as an *undergraduate* to get
out.  I'm doing color digital image enhancement, and when I'm done, I
will send the best images I have to NCSA.  In any case, don't expect to
see hide nor hair of me till mid-May at the earliest. . . :-(

In article <AY6=r5y00W0dQ7=mpR@andrew.cmu.edu> jm7e+@andrew.cmu.edu (Jeremy G. Mereness) writes:
>
>I wish some people at Apple Co. could read this.

Some do.

>The Apple // may not be built like a workstation the way the Mac is, designed
>to handle raw power with little overhead, [omitted]
>jeremy mereness

Unfortunately, the Mac has just a tad too much overhead to have any
real fun with.  I decided *not* to do my thesis on a Mac ][ (5 megs of
RAM, 80-meg internal drive (28 ms), 68881, and Lightspeed C) simply
because of the immense overhead I had to deal with.  Look, the thing
couldn't keep up with the digitizer running at 38.4 kbaud doing
*nothing* else.  I got hardware overruns *always*.  In any case, where
the little application I wrote would have worked just fine as about 400
lines of C code, I had well over 3000 lines, i.e. 2600 lines of "user
interface" which did little else than show me what was happening.
And no, it's not just sloppy writing; rather, I did *all* the things
the guidelines tell you to do, except use the Palette Manager (I
stuffed the palette myself to show the images).

I wouldn't even really call a Mac ][ a "workstation" any more than I'd
call an IBM PC a "workstation".  A Sun is a workstation.  The Tek4317
I'm typing at is a workstation.  Not the Mac.  Least, not till it
gets VM :-)

And at $4000 *MINIMUM!*  It's priced like a workstation, anyway.

Sean Kamath

BTW:  I finally got Dhrystones to run on the Mac.  Results?  About the
same as a VAX with a load average in the 3-5 range (1500).  Intel 386
running Sys V: 2500.  NeXT machine (optical drive): 5000.  Of course,
Dhrystones are pretty meaningless, so think about this:  a fellow
slave--er, thesis student--is running time slices of fluid dynamics
problems (the Magnus effect).  Forget programming this behemoth on a
Mac (it really needs gobs o' memory and processing power), but it takes
about 6 hours on the VAX, 3 on the 386, and a little over an hour on the
NeXT.  Same code.  Identical.
-- 
UUCP:  {decvax allegra ucbcad ucbvax hplabs}!tektronix!reed!kamath
CSNET: reed!kamath@Tektronix.CSNET  ||  BITNET: kamath@reed.BITNET
ARPA: kamath%reed.bitnet@cunyvm.cuny.edu
US Snail: 3934 SE Boise, Portland, OR  97202-3126 (I hate 4 line .sigs!)