[comp.sys.amiga.tech] Modern computer uses

cedman@golem.ps.uci.edu (Carl Edman) (11/18/90)

In article <ggk.658820424@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:
   cedman@golem.ps.uci.edu (Carl Edman) writes:
   >In article <ggk.658699652@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:

   >   cedman@golem.ps.uci.edu (Carl Edman) writes:
   >   >In article <90318.162021DXB132@psuvm.psu.edu> DXB132@psuvm.psu.edu writes:

   >   So, whats your point?  It probably took 2 minutes to do the cr-lf
   >   program, and 5 for the clock.  And if they don't work, it'll take
   >   another 2 minutes to find out why not.

   >What I am complaining about ?
   >[raving-prophet-of-the-doom-of-computerdom-and-the-decadence-of-
   > -the-young-programmers-mode on ]

   Nice attitude.  Unfortunately, I'm 18 years old, and just got a little
   offended...  I hate being stereotyped.

3 reasons for you not to be offended:

        1. If you don't fit into the class of programmers I've described:
        Fine. I'm happy about every good programmer.

        2. This mode message (like all the mode messages people use) is at
        least semi-ironic (and mostly self-irony).

        3. I'm only 20 myself (although I've been programming
        real computers for more than a decade).

   >But what happens ? Do new, imaginative mind-blowing programs appear at
   >every turn ?

   If you look hard enough, you'll notice some mind-blowing stuff out
   there.  I remember two years ago talking to someone about DTP, and
   watching their sudden realization that DTP didn't exist when they bought
   their A2000.

   Imagine what the ms-dos dudes are thinking at this moment of multi-media
   (Amigatroids have had it for a while, so we're less excited).

   Remember a program called Hypercard from Apple a few years ago?  Forget
   that, remember the idea of a window even more years ago?

   I know, a few small examples, but still, it is happening.  Computers
   today are used for a lot of things that were unimagined when I got into
   computers (er, I guess it all began 7 years ago with a TRS-80 CoCo). 
   You expect something that will boggle your mind, whereas almost every
   software developer out there is trying to make their software as simple
   and unintimidating to use as possible.

I've not denied that there is still some good and imaginative
programming going on somewhere. Most of the examples you gave were
good signs of this. What I said is that almost all software written
today, e.g. for workstations, is no better than the software written
5 years ago. It just uses an order of magnitude more memory, disk
space, and CPU power.

   Think sometime about the complexity of a file requester, compared to the
   list of files you get under WP on ms-dos.

MS-DOS ? Only at gun-point. WP ? Not even then.

A file requester is a fine thing. Writing a 150 KByte file requester in
Smalltalk (only to continue to bash this language ... :-) with digitized
graphics and sound and linking it into every 20-line C utility is not.
File requesters belong in shared libraries (as ARP does, to return to the
original topic).
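
To make that concrete, here's roughly what the 20-line utility ends up
looking like when the requester lives behind a single shared-library
call.  The FileRequest() name and its arguments are made up for
illustration -- this is the shape of the idea, not the actual
arp.library interface:

    #include <stdio.h>
    #include <string.h>

    /* Stand-in stub so the sketch links and runs; on a real system this
       one function (and its 150 KByte of implementation) would live in
       the shared requester library, not in every utility.              */
    int FileRequest(const char *hail, char *path, int pathlen)
    {
        printf("[requester: %s]\n", hail);
        strcpy(path, "ram:example.txt");  /* pretend the user picked this */
        (void)pathlen;
        return 1;                         /* 1 = user selected a file     */
    }

    int main(void)
    {
        char path[256];

        /* The utility itself stays at a couple of dozen lines; none of
           the requester's code, graphics or sound is linked in here.    */
        if (FileRequest("Select a file to convert", path, sizeof(path)))
            printf("You picked: %s\n", path);
        else
            printf("Requester cancelled.\n");
        return 0;
    }

The per-program cost is then a few dozen bytes of calling code, and the
fancy requester is paid for exactly once, on disk and in memory.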

   >44 kHz and 1024x640 graphics in 16 Mcolors.  Of course,not drawn
   >graphics or composed sounds. No, digitized sounds and graphics. Maybe
   >with a few hours of programming you could write a program which
   >generates the same sounds in a few kBytes. But , who cares ? Put it in
   >the digitizer and generate a Mbyte sample, it is so much easier.

   I dunno, I always imagined that theres a neato half way point.  Ie to do
   sound, you digitize instruments and then have code to vary the volume,
   pitch, and so on.  Of course, it gets stupidly complicated when you try
   for more than four voices.

There are many points against digitizing graphics and sounds for programs.
One of the most important (apart from space considerations) is that 1 Mbyte
of digitized graphics is just so much dead information. You can look
at it, but nothing more. You can't even scale it nicely. That kind of
information simply does not take advantage of the interactive nature of
computers; digitized information can be accessed just as well from a book.
1 Mbyte of PostScript description of graphics or 1 Mbyte of scores
in SMUS format is something entirely different. You can interact with it,
manipulate it at the level of abstraction which humans usually occupy, and
a program can do far more interesting things with that data. In addition,
it is about 10-100,000 times more information.
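
A small made-up example of the difference (an invented struct, not the
actual SMUS file format -- just the idea): a score stored as note events
can be transposed by a few lines of code, while a megabyte of digitized
samples can only be played back exactly as it was recorded.

    #include <stdio.h>

    /* A score as structured data: something a program can reason about. */
    struct Note {
        int pitch;      /* semitones above middle C */
        int duration;   /* in ticks                 */
    };

    /* Transposing is trivial on note events; there is no comparably
       clean operation on a raw digitized sample of the same music.     */
    void transpose(struct Note *score, int n, int semitones)
    {
        int i;
        for (i = 0; i < n; i++)
            score[i].pitch += semitones;
    }

    int main(void)
    {
        struct Note tune[] = { {0, 24}, {4, 24}, {7, 48} };  /* C, E, G */
        transpose(tune, 3, 12);               /* up one octave          */
        printf("first note is now %d semitones above middle C\n",
               tune[0].pitch);
        return 0;
    }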

   >I am using large machines with lots of memory today, too. But I learned
   >to program on a computer with 1 kByte of memory and if you used more
   >than half of it, the screen was turned of as the screen memory was used
   >for the program. Maybe that shows. I hope it does.

   I learned on a machine with 16 kb, which was later upgraded to 32 kb.  I
   couldn't believe all the neat things I could do with 32 kb.  Now, I have
   512 kb, and I'm looking to upgrade because I don't have enough memory?

   Programs have become larger.  Fundamental reason #1: the machine op code
   doubled in size a few years ago (from 8 bits to 16).  Reason #2: The
   user has given up on cryptic commands.  Reason #3: The programmer
   started to realize there was an actual user, not just the program.

#1 doesn't really hold water. While the op-code sizes have indeed increased,
so has the power of the op-codes. Maybe a 6502 op-code is 8 bits and a 680x0
op-code is 16 bits long, but how many 6502 instructions do you need to do
a single 32-bit multiplication ?
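
For anyone who never had to do it by hand: the 6502 has no multiply
instruction at all, so a 32-bit product has to be built from 8-bit
shifts and adds, roughly like the loop below -- and every line of that
loop itself costs several 8-bit instructions.  A 68020 does the same
thing in a single MULU.L (even a plain 68000 only needs a few 16-bit
MULUs).

    #include <stdio.h>

    /* Shift-and-add multiply, the way an 8-bit CPU has to do it.
       'unsigned long' is 32 bits on the compilers in question.          */
    unsigned long mul32(unsigned long a, unsigned long b)
    {
        unsigned long product = 0;

        while (b != 0) {
            if (b & 1)       /* low bit set: add the shifted multiplicand */
                product += a;
            a <<= 1;         /* on a 6502: four byte-wide rotates         */
            b >>= 1;         /* on a 6502: four byte-wide shifts          */
        }
        return product;
    }

    int main(void)
    {
        printf("%lu\n", mul32(123456UL, 7890UL));    /* 974067840 */
        return 0;
    }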

#2: Cryptic commands ? That is how people who don't understand something
try to blame it on the subject. Real programmers don't care if a command
is "cryptic". Anyway, I don't see how that significantly affects your
code size.

#3: What do you mean by this ? That some programmers have given up hope
that their users can find the 'return' key without a little map
on the screen describing its location ?

   I hope this isn't too philosophical for this newsgroup...

I've adjusted the followup line.

        Carl Edman


Theoretical Physicist, N.: A physicist whose | Send mail
existence is postulated, to make the numbers |  to
balance but who is never actually observed   | cedman@golem.ps.uci.edu
in the laboratory.                           | edmanc@uciph0.ps.uci.edu

dailey@frith.uucp (Chris Dailey) (11/21/90)

In article <ggk.658820424@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:
>cedman@golem.ps.uci.edu (Carl Edman) writes:
>>In article <ggk.658699652@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:
>>   cedman@golem.ps.uci.edu (Carl Edman) writes:
>>   >In article <90318.162021DXB132@psuvm.psu.edu> DXB132@psuvm.psu.edu writes:
>>   So, whats your point?  It probably took 2 minutes to do the cr-lf
>>   program, and 5 for the clock.  And if they don't work, it'll take
>>   another 2 minutes to find out why not.

A simple trade-off.  Is taking 2 minutes to do a cr/lf program with an
object size of 500,000 bytes better than taking 1/2 hour with a size of
5,000 bytes?
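
For reference, the half-hour version isn't much code at all.  A minimal
CR/LF-to-LF filter in C, reading stdin and writing stdout, looks about
like this, and linked against a shared C library it comes out at a few
KB of object code, nowhere near half a megabyte:

    #include <stdio.h>

    /* Strip carriage returns so CR/LF line endings become plain LF. */
    int main(void)
    {
        int c;

        while ((c = getchar()) != EOF) {
            if (c != '\r')        /* drop the CR, keep everything else */
                putchar(c);
        }
        return 0;
    }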

>>What I am complaining about ?
>>[raving-prophet-of-the-doom-of-computerdom-and-the-decadence-of-
>> -the-young-programmers-mode on ]
>Nice attitude.  Unfortunately, I'm 18 years old, and just got a little
>offended...  I hate being stereotyped.

I'm only 21, and I'm not offended.  I think he was referring more to
people who learned to program recently on larger systems that allow
their projects to expand to fill (or exceed) all available space.

>>graphics or composed sounds. No, digitized sounds and graphics. Maybe
>>with a few hours of programming you could write a program which
>>generates the same sounds in a few kBytes. But , who cares ? Put it in
>>the digitizer and generate a Mbyte sample, it is so much easier.
>
>I dunno, I always imagined that theres a neato half way point.  Ie to do
>sound, you digitize instruments and then have code to vary the volume,
[...]

But digitizing in any form takes up gobs of memory.  That was the
point.  People use digitized sounds as a norm instead of trying to
create a comparable sound pattern that takes up a lot less memory.
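
As a toy example of creating the sound pattern instead of storing it: a
second of a 440 Hz tone can be generated on the fly by a dozen lines of
C, where the digitized equivalent at 44 kHz, 16-bit mono would be
roughly 88 KB of dead data per second.  (The 8 kHz, 8-bit format below
is just an assumption to keep the sketch small.)

    #include <stdio.h>
    #include <math.h>

    #define RATE 8000                /* assumed: 8 kHz, 8-bit mono     */
    #define PI   3.14159265358979

    /* Write one second of a 440 Hz sine tone as raw signed bytes.
       The "sound data" here is this loop, not a stored sample.        */
    int main(void)
    {
        long i;

        for (i = 0; i < RATE; i++) {
            double s = sin(2.0 * PI * 440.0 * (double)i / (double)RATE);
            putchar((int)(s * 127.0) & 0xff);
        }
        return 0;
    }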

>>[...]But I learned
>>to program on a computer with 1 kByte of memory and if you used more
>>than half of it, the screen was turned of as the screen memory was used
>>for the program. Maybe that shows. I hope it does.
>I learned on a machine with 16 kb, which was later upgraded to 32 kb.  I
>couldn't believe all the neat things I could do with 32 kb.  Now, I have
>512 kb, and I'm looking to upgrade because I don't have enough memory?

I learned on a 3.5K machine in about 1982.  Later I had a 64K machine.
(You can probably guess which ones.)  There were many programs that
were much faster and just as complex as those available on the Amiga,
but the equivalent Amiga ones are often a factor of 10 (or more)
larger.

>Programs have become larger.  Fundamental reason #1: the machine op code
>doubled in size a few years ago (from 8 bits to 16).  Reason #2: The
>user has given up on cryptic commands.  Reason #3: The programmer
>started to realize there was an actual user, not just the program.

And, the one that I and I believe Carl Edman object to, #4:  People
have more space, and feel freer to waste it.

>I hope this isn't too philosophical for this newsgroup...

No complaints by me.

>  Gregory Kritsch                          | University of Waterloo
--
Chris Dailey   dailey@(frith.egr|cpsin.cps).msu.edu
BRD += DDR;
DDR = NULL;
num_countries --;

sck@arnor.uucp (11/21/90)

In article <1990Nov20.161846.24152@msuinfo.cl.msu.edu>, dailey@frith.uucp (Chris Dailey) writes:
|> In article <ggk.658820424@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:
|> >cedman@golem.ps.uci.edu (Carl Edman) writes:
|> >>In article <ggk.658699652@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:
|> >>   cedman@golem.ps.uci.edu (Carl Edman) writes:
|> >>   >In article <90318.162021DXB132@psuvm.psu.edu> DXB132@psuvm.psu.edu writes:
|> >>   So, whats your point?  It probably took 2 minutes to do the cr-lf
|> >>   program, and 5 for the clock.  And if they don't work, it'll take
|> >>   another 2 minutes to find out why not.
|> 
|> A simple trade-off.  Is taking 2 minutes to do a cr/lf program with an
|> object size of 500,000 bytes better than taking 1/2 hour with a size of
|> a 5,000 bytes?

   That all depends on the architecture you are working on. If you have a
machine with 12 terabytes of memory, it is much more cost-effective to
write code that is large but easier to write, because the cost of keeping
that machine up and running is much greater than the cost of a 16K machine.

 (Note: I'm not saying that this is the case for workstations, but for
larger systems this is sometimes the rule.)

|> 
|> >>What I am complaining about ?
|> >[...] I'm 18 years old,
|> [...] I'm only 21, 

   Age really doesn't matter; it only depends on when you started to learn
computing. I am only 23, yet I have been in computers for 14+ years. The
question is not how old you are, but what language and architecture you
learned on.

|> 
|> >>graphics or composed sounds. No, digitized sounds and graphics. Maybe
|> >>with a few hours of programming you could write a program which
|> >>generates the same sounds in a few kBytes. But , who cares ? Put it in
|> >>the digitizer and generate a Mbyte sample, it is so much easier.
|> >
|> >I dunno, I always imagined that theres a neato half way point.  Ie to do
|> >sound, you digitize instruments and then have code to vary the volume,
|> [...]
|> 
|> But digitizing in any form takes up gobs of memory.  That was the
|> point.  People use digitized sounds as a norm instead of trying to
|> create a comparable sound pattern that takes up a lot less memory.
|> 

    Once again, you must answer the question: is it more economical to
spend many man-years of coding to save a few Kb? On a machine with limited
resources, such as our beloved C=64, yes. But on a machine that is more
robust, like the Amiga, no.

(Note: I'm not advocating sloppy coding. But I am saying that one should
think before writing code.)

|> 
|> >Programs have become larger.  Fundamental reason 
|> > #1: machine op code doubled in size a few years ago (from 8 bits to 16).  
|> > #2: The user has given up on cryptic commands.
|> > #3: The programmer started to realize there was an actual user, not just the program.
|> #4:  People have more space, and feel freer to waste it.

   #5: The overhead of having to deal with a graphical environment.
   #6: The evolution of computers from simple adding machines to the
   workstations & super-computers we use today.

(As the level of technology increases, so does the required knowledge, to
the point where no one knows it all. The auto industry shows this: in 1910
one person could design an entire car; now one person can't design the
entire car, because their scope of knowledge has decreased while their
depth has proportionally increased. Similarly, the vast diversity of the
computer field does not allow one person to know everything about an
architecture. When I was a C=64 coder I knew almost every inch of that
machine; I knew the values of every register, the location of each bit.
Now, on my Amiga and RIOS, I have trouble remembering what my own code is
doing, much less what the operating system is doing.)

|> >I hope this isn't too philosophical for this newsgroup...
|> No complaints by me.

  A little diversion is good for the soul.

|> >  Gregory Kritsch                          | University of Waterloo
|> Chris Dailey   dailey@(frith.egr|cpsin.cps).msu.edu

------------------------------------------------------------------------
Scott C. Kennedy (sck@shed.watson.ibm.com) | (QP06 @ PACE.bitnet.com)
High Performance Computing Environment     |  Computer Science Dept.
I.B.M. Thomas J. Watson Research Facility  |     Pace University
------------------------------------------------------------------------

ggk@tirith.UUCP (Gregory Kritsch) (11/17/90)

cedman@golem.ps.uci.edu (Carl Edman) writes:
>In article <ggk.658699652@tirith.UUCP> ggk@tirith.UUCP (Gregory Kritsch) writes:

>   cedman@golem.ps.uci.edu (Carl Edman) writes:
>   >In article <90318.162021DXB132@psuvm.psu.edu> DXB132@psuvm.psu.edu writes:

>   So, whats your point?  It probably took 2 minutes to do the cr-lf
>   program, and 5 for the clock.  And if they don't work, it'll take
>   another 2 minutes to find out why not.

>What I am complaining about ?
>[raving-prophet-of-the-doom-of-computerdom-and-the-decadence-of-
> -the-young-programmers-mode on ]

Nice attitude.  Unfortunately, I'm 18 years old, and just got a little
offended...  I hate being stereotyped.

>But what happens ? Do new, imaginative mind-blowing programs appear at
>every turn ?

If you look hard enough, you'll notice some mind-blowing stuff out
there.  I remember two years ago talking to someone about DTP, and
watching their sudden realization that DTP didn't exist when they bought
their A2000.

Imagine what the ms-dos dudes are thinking at this moment of multi-media
(Amigatroids have had it for a while, so we're less excited).

Remember a program called Hypercard from Apple a few years ago?  Forget
that, remember the idea of a window even more years ago?

I know, a few small examples, but still, it is happening.  Computers
today are used for a lot of things that were unimagined when I got into
computers (er, I guess it all began 7 years ago with a TRS-80 CoCo). 
You expect something that will boggle your mind, whereas almost every
software developer out there is trying to make their software as simple
and unintimidating to use as possible.

Think sometime about the complexity of a file requester, compared to the
list of files you get under WP on ms-dos.

>44 kHz and 1024x640 graphics in 16 Mcolors.  Of course,not drawn
>graphics or composed sounds. No, digitized sounds and graphics. Maybe
>with a few hours of programming you could write a program which
>generates the same sounds in a few kBytes. But , who cares ? Put it in
>the digitizer and generate a Mbyte sample, it is so much easier.

I dunno, I always imagined that there's a neato halfway point.  I.e. to do
sound, you digitize instruments and then have code to vary the volume,
pitch, and so on.  Of course, it gets stupidly complicated when you try
for more than four voices.
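
That half-way point is basically what the SoundTracker-style players do:
keep one short digitized note per instrument and reuse it at any volume
and pitch by scaling the samples and stepping through them at a
different rate.  A rough sketch of the idea (output_sample() is a
stand-in stub, not a real OS call):

    #include <stdio.h>

    /* Stand-in output routine -- a real player would feed the audio
       hardware here; this one just dumps raw bytes to stdout.          */
    static void output_sample(int s)
    {
        putchar(s & 0xff);
    }

    /* Play one digitized instrument sample at a given volume and pitch.
       'step' is 16.16 fixed point: 0x10000 = original pitch,
       0x20000 = one octave up, 0x8000 = one octave down.               */
    void play_note(const signed char *sample, long length,
                   int volume,             /* 0..64                     */
                   unsigned long step)     /* 16.16 fixed point         */
    {
        unsigned long pos = 0;             /* 16.16 fixed-point cursor  */

        while ((long)(pos >> 16) < length) {
            int s = (sample[pos >> 16] * volume) / 64;
            output_sample(s);
            pos += step;
        }
    }

    int main(void)
    {
        static const signed char blip[] = { 0, 60, 120, 60, 0, -60, -120, -60 };
        play_note(blip, 8, 64, 0x20000);   /* full volume, one octave up */
        return 0;
    }

Mixing more than the four hardware voices means summing several of these
loops into one buffer in software, which is exactly where it gets
complicated.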

>I am using large machines with lots of memory today, too. But I learned
>to program on a computer with 1 kByte of memory and if you used more
>than half of it, the screen was turned of as the screen memory was used
>for the program. Maybe that shows. I hope it does.

I learned on a machine with 16 kb, which was later upgraded to 32 kb.  I
couldn't believe all the neat things I could do with 32 kb.  Now, I have
512 kb, and I'm looking to upgrade because I don't have enough memory?

Programs have become larger.  Fundamental reason #1: the machine op code
doubled in size a few years ago (from 8 bits to 16).  Reason #2: The
user has given up on cryptic commands.  Reason #3: The programmer
started to realize there was an actual user, not just the program.

I hope this isn't too philosophical for this newsgroup...

>        Carl "When you have 'cat', who needs compilers ?" Edman
--
  Gregory Kritsch                          | University of Waterloo
    Fido:  1:221/208.11110  [1:163/109.30] | 1A Computer Engineering
    UUCP:  ggk@tirith.UUCP                 |--------------------------
           ...!watmath!xenitec!tirith!ggk  | Amiga Fanatic