[net.lang] levelheight

alexc@dartvax.UUCP (Alex Colvin) (01/06/85)

Suggest the debate be retitled "levelheight".

laura@utzoo.UUCP (Laura Creighton) (01/09/85)

What a fun discussion! I get back from vacation and look what I get!
Thanks folks. Okay, what do we have here?

First of all, we are agreed that there are certain things which are
``low level''. Programs written in Turing machine ``move the tape X
spaces down, read tape'' instructions would qualify. They are
often provable (at least that's what the CS courses said: ``write
a program in Turing machine code that does X. Now prove X'').
Machine language is also low level.

Then there are higher level things like assemblers.

Then there are all the languages which are portable to more than one
piece of hardware. (Actually, I know a lot of machines that have
OS 360 assembler interpreters - this definition could stand some
work.)

Now what one wants to do is rank these languages. Naturally, one
wants one's favourite languages to score well and the ones that
one hates to score badly. (Why not? One must have a reason to
hate the languages that one hates.)

There may be an ``apples and oranges'' problem here. The way I think
when I think in Lisp is very different from the way that I think
when I think in C. (Someday I am going to have to get my hands on
APL and raise my consciousness more.) Comparing C and Lisp may be
a mistake, though comparing Pascal and C seems appropriate.

Okay, barring the ``apples and oranges'' types of questions, I think
I can come up with a working definition of ``high level''. The
``highness of level'' is a measure of how little you have to think
about something other than the problem you want to solve. 

Machine code scores very poorly on this test: you have to do your
own character encoding and mostly that is what you are thinking of.
Assembler does somewhat better, but you still think about hardware
a lot.

The interesting thing is that C does not score poorly on this. Arguments
such as ``well the ++ instruction was clearly designed for the pdp11,
therefore C is low level'' do not wash -- once you have used C a fair
bit, typing i++ is as natural (and unthought) as typing i + 1.

Which brings up another point: only people who have used the languages
being compared are actual experts on how ``high level'' a language is.
The Basic user who has been programming in C for a week is going to
find C a very low level language for a while: most of his effort
will be in learning a new syntax and new constructs. The Pascal
aficionado who has overdosed on Wirth for a week may be unable to
see the value of variable length strings in his passion. Neither
of these sorts is going to be a very useful judge of how ``high level''
C is.

So how about some heuristics for the design of (yet another! gasp,
just what computer science really needs!) a high level language.

1.	Line numbers and GOTOs make for a low level language. If I
	have to think about ``is it goto 100 or goto 110'' then I am
	wasting valuable thinking time on crud.

2.	all high level languages have a variety of types, including
	a string type.

3.	all high level languages have a way of defining types and
	let you use them conveniently.

4.	all high level languages have a way of ``hiding'' information.
	Here I am thinking of procedures and functions, but this can
	include modules as well.

Can anybody come up with any other heuristics? So far, my favourite
of the ``block structured'' types of languages is Algol 68 -- though
I am told that Simula is rather nice as well.

For sheer joy of writing code I would rather be writing in Lisp, but
I don't think that Lisp is ``higher level'' than Algol: I just have
a severe case of Hofstadter's disease (all the world is recursive,
and hence wonderful) which I am entirely uninterested in curing...

Laura Creighton
utzoo!laura

g-frank@gumby.UUCP (01/09/85)

> Okay, barring the ``apples and oranges'' types of questions, I think
> I can come up with a working definition of ``high level''. The
> ``highness of level'' is a measure of how little you have to think
> about something other than the problem you want to solve. 
> 

I like this definition a lot.

> 
> The interesting thing is that C does not score poorly on this. Arguments
> such as ``well the ++ instruction was clearly designed for the pdp11,
> therefore C is low level'' do not wash -- once you have used C a fair
> bit, typing i++ is as natural (and unthought) as typing i + 1.
> 

I don't think that is a common argument against the high-levelness of C.
A better one is that the lack of implicit pass by reference causes pointers
to be used in a way which doesn't have anything to do with the problem you
want to solve.  You want to say, "this variable is changed by the routine
I'm calling," and instead your program reads like R2D2's diary.
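
[For concreteness, an invented C fragment of the kind of thing meant
here -- the names are made up, not taken from any real program:]

    #include <stdio.h>

    /* What you want to say is "swap changes both of its arguments";
       what you actually write is addresses and stars. */
    void swap(int *a, int *b)
    {
        int tmp = *a;
        *a = *b;
        *b = tmp;
    }

    int main(void)
    {
        int x = 1, y = 2;
        swap(&x, &y);             /* the & and * are bookkeeping, not the problem */
        printf("%d %d\n", x, y);  /* prints "2 1" */
        return 0;
    }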

Another argument is that, in C, you must remember that things you
declare sometimes aren't what you think.  An array isn't an array, it's
a pointer to allocated storage, so you don't have to put a & in front of
it when passing it to a function.  But remember, the function doesn't
know its dimensions or its size, no matter how you declare it to the function,
so you have to define a manifest, or something.  And remember, if you cast
it, you have to treat its original type as a pointer type, and . . . oh,
forget it.
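
[Again purely as illustration -- the array, its size, and the function
are invented:]

    #include <stdio.h>

    #define NELEM 10               /* the "manifest" you end up defining */

    /* The parameter is written like an array but is really a pointer,
       so the function cannot discover the array's length on its own. */
    double average(int a[], int n)
    {
        double sum = 0.0;
        int i;

        for (i = 0; i < n; i++)
            sum += a[i];
        return sum / n;
    }

    int main(void)
    {
        int data[NELEM] = {0};

        /* note: no & on data; and sizeof data is the whole array here,
           while inside average() sizeof a is only the size of a pointer */
        printf("%g\n", average(data, NELEM));
        return 0;
    }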

> 
> 1.	Line numbers and GOTOs make for a low level language. If I
> 	have to think about ``is it goto 100 or goto 110'' then I am
> 	wasting valuable thinking time on crud.
> 

Line numbers make for a low level language.  Go to's, when used in place
of looping constructs, and when used irrationally and in excess, make for
a low level program, even when written in a high level language.  On the
other hand, a lack of some sort of escape to a semantically meaningful
label (e.g., error_exit, try_again, etc.), can make programs less like
what we intended to do, as we desperately seek expedients to escape multiple
nested loops.
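
[A tiny invented example of the ``escape from nested loops'' case:]

    #include <stdio.h>

    #define N 4

    int main(void)
    {
        int table[N][N] = {{0}};
        int key = 7, i, j;

        table[2][3] = key;

        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++)
                if (table[i][j] == key)
                    goto found;        /* the semantically meaningful label */
        printf("not found\n");
        return 1;

    found:
        printf("found at %d,%d\n", i, j);
        return 0;
    }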

Mind you, I think "exceptions," as in Ada, are the highest level of all.
One may somewhat sophistically argue that gotos, when used only to implement
some kind of exception by the programmer, make a language higher level than
one that does not have them, all things being the same otherwise.

> 2.	all high level languages have a variety of types, including
> 	a string type.
> 
> 3.	all high level languages have a way of defining types and
> 	let you use them conveniently.
> 

Yup.  But does Lisp qualify for number 3?  How about Prolog?  (This is
not a rhetorical denial.  I'm curious how you feel about this.)

> 4.	all high level languages have a way of ``hiding'' information.
> 	Here I am thinking of procedures and functions, but this can
> 	include modules as well.
> 

And not just hiding, but making things less susceptible to meddling, and
making sure that invocations or uses of the encapsulation mechanism match its
definition.  In other words, unchecked and sloppy use of pointers, a lack
of array bounds checking, and no ability to check the number and type of
parameters to function calls all violate this one, in spirit at least.
(Don't tell me about lint.  YOU try to find it for anything but an
orthodox Unix system).


> Can anybody come up with any other heuristics? So far, my favourite
> of the ``block structured'' types of languages is Algol 68 -- though
> I am  told that Simula is rather nice as well.
> 

Modula-2, if you don't mind doing without gotos, and the complete lack
of a standard for library modules isn't too galling for words.  Ada, if
you don't mind doing without a compiler.

> For sheer joy of writing code I would rather be writing in Lisp, but
> I don't think that Lisp is ``higher level'' than Algol: I just have
> a severe case of Hofstadter's disease (all the world is recursive,
> and hence wonderful) which I am entirely uninterested in curing...
> 
> Laura Creighton
> utzoo!laura

I think you have the right idea.  What's really needed is a debate on
two opposing viewpoints:

  "A high level language is fun."

  "A high level language is beautiful, but not fun, because while
   giving greater expressivity, it also imposes discipline, which
   humans find inconvenient."  (The European view, I think).

But do you really enjoy writing code?


-- 
      Dan Frank

	"good news is just life's way of keeping you off balance."

franka@hercules.UUCP (Frank Adrian) (01/11/85)

In article <247@gumby.UUCP> g-frank@gumby.UUCP writes:
>> 2.	all high level languages have a variety of types, including
>> 	a string type.
>> 
>> 3.	all high level languages have a way of defining types and
>> 	let you use them conveniently.
>> 
>
>Yup.  But does Lisp qualify for number 3?  How about Prolog?  (This is
>not a rhetorical denial.  I'm curious how you feel about this.)
>
Common LISP qualifies for 3. Interlisp does. In any case, the use of
lists for data structures (although inefficient for the most part) doesn't
cause much confusion. The main problem is that you do have to define
access macros (something that Common LISP gives you automatically), but
how long does it take to write a couple of macros that take the n'th item
down a list? Or else store structure values on the property list. That way
you just do a (get subItem atomRepresentingTheStruct). You can't get much
more high level than that.
Prolog, on the other hand, is a toy...
>> 4.	all high level languages have a way of ``hiding'' information.
>> 	Here I am thinking of procedures and functions, but this can
>> 	include modules as well.
>> 
>
>And not just hiding, but making less susceptible of meddling, and of making
>sure that invocations or uses of the encapsulation mechanism match its
>definition.  In other words, unchecked and sloppy use of pointers, a lack
>of array bounds checking, and no ability to check the number and type of
>parameters to function calls all violate this one, in spirit at least.
>(Don't tell me about lint.  YOU try to find it for anything but an ortho-
>dox Unix system).
>
>
I'm sorry, but I really, really, really dislike strong type checking.
Unless your language allows operator overloading, there is no way to
write generic packages without defining everything as a huge, ugly
union (which sorta defeats the whole notion of the thing, eh?).
Array bounds checking usually just gets in the way.
Optional arguments and variable-length argument lists are so useful as to
make the idea of a language not having them completely ludicrous.
The only way to get secure code is to get a secure programmer...
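
[To make the ``huge, ugly union'' point concrete, a sketch in C -- the
type and its members are invented for illustration:]

    #include <stdio.h>

    /* Without overloading or generics, one "generic" value type ends up
       as a tagged union, and every routine must switch on the tag. */
    enum kind { K_INT, K_REAL, K_STRING };

    struct value {
        enum kind kind;
        union {
            int i;
            double r;
            const char *s;
        } u;
    };

    void print_value(const struct value *v)
    {
        switch (v->kind) {
        case K_INT:    printf("%d\n", v->u.i); break;
        case K_REAL:   printf("%g\n", v->u.r); break;
        case K_STRING: printf("%s\n", v->u.s); break;
        }
    }

    int main(void)
    {
        struct value v;

        v.kind = K_STRING;
        v.u.s = "hello";
        print_value(&v);
        return 0;
    }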

>> Can anybody come up with any other heuristics? So far, my favourite
>> of the ``block structured'' types of languages is Algol 68 -- though
>> I am  told that Simula is rather nice as well.
>> 
>
>Modula-2, if you don't mind doing without gotos, and the complete lack
>of a standard for library modules isn't too galling for words.  Ada, if
>you don't mind doing without a compiler.
>
I have to agree with Laura on this one. Algol 68 had one feature that all
languages since then have forgotten... Orthogonality!!! The only language
that comes close in this respect is LISP. Any language that does not have
orthogonality (e.g., allowing me to assign pointers to procedures, etc.)
is brain damaged.

>> For sheer joy of writing code I would rather be writing in Lisp, but
>> I don't think that Lisp is ``higher level'' than Algol: I just have
>> a severe case of Hofstadter's disease (all the world is recursive,
>> and hence wonderful) which I am entirely uninterested in curing...
>> 
>> Laura Creighton
>> utzoo!laura
>
>I think you have the right idea.  What's really needed is a debate on
>two opposing viewpoints:
>
>  "A high level language is fun."
>
>  "A high level language is beautiful, but not fun, because while
>   giving greater expressivity, it also imposes discipline, which
>   humans find inconvenient."  (The European view, I think).
>
>But do you really enjoy writing code?
>
>
>-- 
>      Dan Frank
>
Yes, I do enjoy writing code. I don't like harebrained notions of some
Godlike imperative called "security" getting in the way. For example,
you tell me how to get a generic routine written in a vanilla PASCAL.
You're right. The second view is a European view. Maybe the programmers
over there are so bad as to need this sort of thing (at least most of
the code I've seen, written by Europeans, is). The first seems to be
an American view. Give me #1 any day and let me get the job done...

				"Same as it ever was..."
					Frank Adrian

marti@hplabsc.UUCP (Robert Marti) (01/15/85)

In article <369@hercules.UUCP> Frank Adrian, in discussing the drawbacks
of strongly typed programming languages and of Pascal in particular, writes:

-----
>                       ...   I don't like harebrained notions of some
> Godlike imperative called "security" getting in the way. For example,
> you tell me how to get a generic routine written in a vanilla PASCAL.
> You're right. The second view is a European view. Maybe the programmers
> over there are so bad as to need this sort of thing (at least most of
> the code I've seen, written by Europeans, is). The first seems to be
> an American view. Give me #1 any day and let me get the job done...
-----

Well, I happen to be one of those dumb Europeans myself, so feel free
to hit the 'n' key right away ... :-)

However, more to the point, I feel that the issue of type-checking is
not really dependent on the "intelligence" of the programmers. We all
make mistakes once in a while, some of us more, and some of us less.
(You obviously believe that you belong in the latter category.)

The real issue is this:
Assuming you make a certain number of mistakes per 1000 lines of code,
are you willing to introduce some redundancy into your program, namely,
type, variable, and procedure declarations, in the hope that the compiler
will catch most of the more trivial errors and hopefully some of the more
subtle bugs there and then? Or do you prefer to take the chance that most
of those errors surface one at a time, depending on whether the control
flow in your program happens to reach one of your erroneous statements?
In the latter case you probably will have to resort to some debugging tool
to track down the error, before being able to recompile your program.
Chances are that considerably more time is spent to fix this single error
than the time required to fix multiple errors caught by a good compiler ...

As an aside, I too feel that Pascal's type checking sometimes is more
of a nuisance than help. This is why I prefer to use Pascal's successor,
Modula-2, which among other improvements over Pascal includes so-called
type transfer functions for each predefined or user-declared data type,
which correspond to type casts in C.

                                     Bob Marti
                                     marti.hplabs@csnet-relay
                                     ...!hplabs!marti

malcolm@kcl-cs.UUCP (Malcolm Shute.) (01/16/85)

First of all, may I apologise if this item does not actually say
anything which has not already been said. However, I thought it
was worth noting that Lisp really is just a low level language.
There are many higher level functional programming languages
(SASL, Hope, KRC, Sugar etc.) which look more like high level
imperative languages (Pascal, Ada etc.). Lisp really is the
assembler language of functional programming, with combinator
code (Schoenfinkel, Turner etc.) forming a very good object code
for multi-processor functional processors. Here at King's College
London, we are designing (or attempting to design) just such a
machine for wafer-scale integration. [Spot the plug for our own
work there.]

However, having said all of that, I remember someone noting in
this group that a 32-bit machine's assembler would look like a
relatively high level language to an 8080 programmer, by
comparison with the 8080's own assembler. Similarly, Lisp is
higher in level than most of the current imperative (e.g. von
Neumann) machine assemblers. In effect, running a Lisp interpreter
on a conventional machine really amounts to running a simulator
for a functional machine on the conventional one, and then writing
programs in the functional one's assembler language.

laura@utzoo.UUCP (Laura Creighton) (01/17/85)

I was once talking to someone and got the distinct impression that
he thought that a compiler that did a lot of checking should
enable him to write totally bug-free programs. This bothered me.
Perhaps it was bravado, but I feared he would never learn to check
for logic errors, on the assumption that ``if it compiled, it would be
fine.'' He was working as a short-term consultant where one can get
away with such trash. (I was working as a consultant where one often
fixes such trash).

I wonder whether this is widespread. The problem with safety devices
is that people rely on them to do their thinking for them in situations
where they shouldn't.

On the other hand, after picking logic and lint errors out of a net.sources
submission, I wonder if having a strongly typed C compiler might just
halve the work I do anyway.

Anybody done a ``Pascal programmers make X times as many logic errors
as C programmers'' type of survey?

Hmm. Maybe if they don't have to worry about type checking they can think
more about logic errors. But I worry -- if you don't worry about type casting
then are you really thinking about writing code at all?

Confusedly yours,
Laura Creighton
utzoo!laura

laura@utzoo.UUCP (Laura Creighton) (01/18/85)

Reply to Dan Frank:

I don't know how many people use the ``well, the ++ instruction was
clearly designed for the pdp11, therefore C is low level'' argument.
I have heard it from at least 10 independent sources, though. Maybe
it is only a Toronto-area argument against using C.
The lack of built-in pass by reference is interesting. It is
funny, but I don't see it as any more difficult to say ``*c'' than
``var c''. The problem I get to see a lot with languages that
do have pass by reference is that *everything* is done by
pass by reference! Of course, no language can ever protect one from
stupidity...

I find that I almost never use arrays when writing C. I always use
pointers instead. It gets me wondering whether thinking in terms of
arrays is a very good conceptual model at all. With double and
triple pointers I can have very non-regular shapes of data, which
has often proven useful. I can still remember how not being able
to build arrays dynamically the way I was used to drove me nuts
when I was first using C, though.
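
[An invented illustration of the ``non-regular shapes'' point -- a
ragged table in which each row has its own length, something a true
two-dimensional array cannot express:]

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int nrows = 4;
        int **row = malloc(nrows * sizeof *row);    /* error checks omitted */
        int i, j;

        for (i = 0; i < nrows; i++) {
            int len = i + 1;                        /* row i holds i+1 numbers */
            row[i] = malloc(len * sizeof **row);
            for (j = 0; j < len; j++)
                row[i][j] = j;
        }

        for (i = 0; i < nrows; i++) {
            for (j = 0; j <= i; j++)
                printf("%d ", row[i][j]);
            printf("\n");
        }

        for (i = 0; i < nrows; i++)
            free(row[i]);
        free(row);
        return 0;
    }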

Of course, the concept of a ``for loop'' took a while to get used to
as well...

I have sometimes used ``if (disaster) goto error_exit'' type constructs,
but not all that frequently. I have found that in C you can almost always
use perror to get what you want. THESE DAYS THE *error* ROUTINE IN
KERNIGHAN AND PIKE'S BOOK THE UNIX PROGRAMMING ENVIRONMENT IS A MUCH
BETTER SOLUTION THAN *perror*. COULD EVERYBODY PLEASE USE IT? [Thanks,
this has been a free announcement by the ad-hoc committee to improve
the quality of life for C programmers everywhere...]
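
[A rough sketch of the kind of routine meant here, in the spirit of
the book's error() but not its exact code; the program name handling
is simplified:]

    #include <stdarg.h>
    #include <stdio.h>
    #include <stdlib.h>

    static const char *progname = "myprog";   /* normally set from argv[0] */

    /* error: print a diagnostic prefixed with the program name, then exit */
    void error(const char *fmt, ...)
    {
        va_list args;

        fprintf(stderr, "%s: ", progname);
        va_start(args, fmt);
        vfprintf(stderr, fmt, args);
        va_end(args);
        fprintf(stderr, "\n");
        exit(2);
    }

    int main(void)
    {
        error("cannot open %s", "somefile");   /* typical use */
        return 0;                              /* not reached */
    }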

Lisp qualifies for ``all high level languages have a way of defining
types and letting you use them conveniently''. Or, at least, COMMON
Lisp does. Right now there are too many Lisps... but any language which
can write things like ``give me the third noun on this list which is
abstract, plural and starts with the letter ``g'' '' is rather good
at defining complicated types. . . . just different at it than the
normal block-structured type languages.

I don't know about Prolog. The last time I looked at Prolog I said
``shudder! Case is important in this language! My eyes will never
hack this one -- I will go away and only come back if there is so
much interest in this one that I figure it is worth my while, even if
I already know I will never program well in the language.'' I just can't
tell the difference between `c' and `C' or `s' and `S' reliably enough,
for instance, to ever be very comfortable with the language.

Lint should be rather easy to write for any non-Unix system. I keep meaning
to write one and sell it to the PC world and make a fortune -- when I have
time. Somehow, I never have time...

I think that the ``A high level language is fun'' versus ``A high
level language is beautiful, but not fun, because while giving
greater expressiveness it also imposes discipline, which human beings
find inconvenient'' debate is unnecessarily divisive.

Of course a high level language is fun! Watching yourself think in
efficient and different ways is always fun. I am not kidding about
learning a new language as a consciousness-raising experience. If
it were not fun, I would not do it. For the same reason it is
beautiful. But discipline is not the bitter pill so many people
think it is. If you want to do something well, you impose your own
sort of discipline and definition of ``what is well'' on the subject.
Haven't you ever written a particularly neat piece of code and looked
at it full of elation and thought ``wow! is that ever gorgeous!''?

If not, I question why you are doing whatever you are doing in this
field. Wonderful opportunities for sheer joy like this should not
be passed up. Of course, there are some people who write abysmal
code (the kind that it pains me to read) and probably feel good about
writing it. I figure that these people are the ones in need of more
discipline. [I remember the horrid days when I wrote code that I would
retch at now...anybody out there who has code written before 1978 by
me, kindly do the world a favour and burn it...]. They are busy with
self-deception when they try to claim that their trash is ``perfectly
good code'' and really haven't ``grokked in fullness'' how truly
beautiful well written code can be... (thanks to Robert Heinlein)

Once they have, the effort that goes into discipline gets repaid a
hundredfold in the joy of writing really good code, so it is no big
effort.

I can't say that I have found code written by Europeans to be in any
way more disciplined (read: less trashy) than code written by North
Americans.  There is a better chance that it is written in
Algol 68 if it was written by a European, which I find very pleasing to
read, but that is not the same thing...

Frank Adrian is right. Orthogonality is beautiful to contemplate and
beautiful to use.

Laura Creighton
utzoo!laura

goetter@yale.ARPA (Ben Goetter) (01/18/85)

/* flame ON */

Do remove your blinders.  Is C a toy compared to Assembler?

.ben.

g-frank@gumby.UUCP (01/19/85)

> 
> /* flame ON */
> 
> Do remove your blinders.  Is C a toy compared to Assembler?
> 
> .ben.

Did someone say C is a toy?  I must have missed the message.  I don't think
it's a toy.  It's too DANGEROUS.


-- 
      Dan Frank

	"good news is just life's way of keeping you off balance."

ee163acp@sdcc13.UUCP (DARIN JOHNSON) (01/20/85)

> 
> /* flame ON */
> 
> Do remove your blinders.  Is C a toy compared to Assembler?
> 
> .ben.


  Yes it is, but it comes with batteries.

steven@boring.UUCP (01/21/85)

In article <4948@utzoo.UUCP> laura@utzoo.UUCP (Laura Creighton) writes:
> I was once talking to someone and got the distinct impression that he
> thought that a compiler that did a lot of checking should enable him to
> write totally bug-free programs. This bothered me. Perhaps it was bravado,
> but I feared he would never learn to check logic errors, on the assumption
> that ``if it compiled, it would be fine.''
At a place I once worked at, they introduced a new Fortran compiler that did
much more run-time checking. All of a sudden the manager was besieged by
people complaining about the new compiler because what used
to be working programs no longer worked, and demanding that the old compiler
be brought back. So it seems to be a property of people in general, rather
than of any particular language/compiler.

> I wonder whether this is widespread. The problem with safety devices
> is that people rely on them to do their thinking for them in situations
> where they shouldn't.
I don't think this is a property of safety devices. I don't get the
impression that people drive worse if they're wearing a safety belt, for
instance. I like to have safety devices in programs to protect me when my
thinking fails, rather than to save me thinking.

> Anybody done a ``Pascal programmers make X times as many logic errors
> as C programmers'' type of survey?
Someone did such a comparison between Pascal and Fortran a few years back in
The Computer Bulletin, and found that Pascal programmers produced far fewer
errors.

> Hmm. Maybe if they don't have to worry about type checking they can think
> more about logic errors. But I worry -- if you don't worry about type casting
> then are you really thinking about writing code at all?
It goes without saying that the earlier errors are spotted, the better. If
it is possible to identify errors at compile time, so much the better,
because you are going to spend so much less time debugging.

I believe that programming languages can go much further in the help they
give programmers than they do at present. For instance, it is possible to
statically guarantee that nil pointers are never dereferenced (imagine the
bugs that that would prevent!) but I know of no language that supports it.

Steven Pemberton, CWI, Amsterdam; steven@mcvax.

mckeeman@wanginst.UUCP (William McKeeman) (01/21/85)

...nothing is more incomprehensible than a symbolism we do not understand..
...they have invariably been introduced to make things easy...

   A. N. Whitehead

(for your pleasure)  W.M. McKeeman  ...decvax!wanginst!mckeeman

wildbill@ucbvax.ARPA (William J. Laubenheimer) (01/22/85)

It seems that a lot of the proposed definitions for determining how "high-level"
a language is are focusing on things like "how many keystrokes do I need to
do thus-and-so". Not only does this lead to the silliness about "APL only
needs epsilon keystrokes to do this operation while C needs omega", but it
doesn't take into account languages which don't use keystrokes at all. How
would you apply this metric to a "language" which you might find on a Mac,
where you define variables by pointing to an area on the screen, which then
becomes a "bin" from which you can get a value or leave one for later; a
"function" is some sort of machine icon with connectors which you can
attach to bins (parameters) or other machines (expressions) or whatevers,
and a procedure or control structure looks like an assembly line, which
you can reduce to a building icon with the same kind of connectors as
the appropriate machine icons? How many characters is a mouse-click worth?
What's the value of wheeling the mouse around the display area? Is pointing
to a building-icon worth the same as pointing to a machine-icon?

These definitions seem to be pointing at an information-theoretic measurement
of the level of a language. Although I haven't studied information theory
beyond a passing familiarity with some of the concepts, these squabbles
seem to be precisely why the discipline was put together in the first place.
So let's condense all the stuff which tries to figure out how bulky the
source code is down to the following:

INFORMATION-THEORETIC MEASURE OF LANGUAGE COMPLEXITY: Language "X" is
"higher-level" than language "Y" iff, given any problem, there exists
a coding of a solution to the problem in "X" which, \\when represented as the
sequence of symbols in "X" implementing that solution//, requires fewer
bits than a coding of any solution to the problem in "Y" when represented
as the sequence of symbols in "Y" implementing that solution.
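
[A rough formalization, not part of the original definition: write
S_X(p) for the set of source texts in language X that solve problem p,
and |s| for the bit-length of a source text s. In LaTeX notation the
proposed ordering is then:]

    X \succ Y \;\iff\; \forall p \;\exists\, s_X \in S_X(p)
        \;\forall s_Y \in S_Y(p) :\; |s_X| < |s_Y|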

I have tried to construct a definition that will allow such tricks as
representing a keyword as a single token and an identifier as "just a name"
regardless of length, while still allowing for the fact that if a large
number of keywords are present, there is more information contained in
each keyword, and if many identifiers are declared, more information is
necessary to distinguish between them. This also allows other neat things
like the hieroglyphic language I briefly described above, or using various
compression techniques on the output. However, I hope I have prohibited
little tricks like the famous "smallest integer not sayable in fewer than
twenty syllables" paradox (translated to this environment, it comes out
as defining "matrix inversion" and all the terms relevant thereto, specifying
that you are interested in the shortest program written in that language
which solves "matrix inversion", and encoding that into some representation
which might require fewer bits than the solution to the problem).

Stated in less precise terms, what I have tried to construct is a metric
which will compare information content of the source representations of
the best solution in the two languages under discussion. This would
seem to be what all the "count-the-source-bits" people want. Am I a
"count-the-source-bits" person myself? I don't think so...

                                        Bill Laubenheimer
----------------------------------------UC-Berkeley Computer Science
     ...Killjoy went that-a-way--->     ucbvax!wildbill

josh@topaz.ARPA (J Storrs Hall) (01/22/85)

> 
> > I wonder whether this is widespread. The problem with safety devices
> > is that people rely on them to do their thinking for them in situations
> > where they shouldn't.
> I don't think this is a property of safety devices. I don't get the
> impression that people drive worse if they're wearing a safety belt, for
> instance.

I sure as hell do.  And I know many people who do, but wouldn't admit it.

Let me opine that there are *three* different approaches to this
high-levelness/typing morass (at least...).  Let me try to show how
they apply to, say, numbers:

1) Assembly style:  The "language" doesn't know about objects, just
operations.  You must keep track of the "type" of each location in
your head.  (For this reason, good assy. language programmers tend
to have very large heads :^)

2) Pascal style:  The language knows about all the types and keeps
strict track of them to let you know when you make a mistake.
Essentially the compiler works as an automated cribsheet to help you
keep track of the mass of detail you had to keep in your head before.

3) APL style:  The language presents the single concept "number"
to the programmer, who never has to worry about the implementation
thereof.  The programmer never even knows whether that number was
actually represented as a bit, integer, float, whatever.  The 
machinery of the system does all the worrying.

Disclaimer:  I do not claim to defend all the odd customs and mores
of APL, but advance this concept as an example.

Styles 2 and 3 have the common problem that the implementation
is restrictive; that is, the implementor (designer) must have had
in mind the kind of thing you are trying to do, or you can't
do it.

g-frank@gumby.UUCP (01/22/85)

> 2) Pascal style:  The language knows about all the types and keeps
> strict track of them to let you know when you make a mistake.
> Essentially the compiler works as an automated cribsheet to help you
> keep track of the mass of detail you had to keep in your head before.
> 
> 3) APL style:  The language presents the single concept "number"
> to the programmer, who never has to worry about the implementation
> thereof.  The programmer never even knows whether that number was
> actually represented as a bit, integer, float, whatever.  The 
> machinery of the system does all the worrying.
> 
> Styles 2 and 3 have the common problem that the implementation
> is restrictive; that is, the implementor (designer) must have had
> in mind the kind of thing you are trying to do, or you can't
> do it.

It depends what you mean by restrictive.  If a language has rich enough
tools for creating new types and defining operations on those types (Ada,
for example), you can do almost anything, type-wise.  The whole point of
derived types is that the implementor of the language admits that he/she
DIDN'T know what you were going to do, and provides a suitable outlet for
your creativity.

Can you give examples of things Pascal prevents you from doing, type-wise
(leave out decent i/o, casts and type conversions, stuff like that - we all
know that these are silly omissions from the language, and they ARE in
Modula-2)?


-- 
      Dan Frank

	"good news is just life's way of keeping you off balance."

ndiamond@watdaisy.UUCP (Norman Diamond) (01/24/85)

> Can you give examples of things Pascal prevents you from doing, type-wise
> (leave out decent i/o, casts and type conversions, stuff like that - we all
> know that these are silly omissions from the language, and they ARE in
> Modula-2)?
>       Dan Frank

Theoretically, one could declare a record with a gigantic variant part,
which sets the size of an array component for every integer in some
sub-range.  Theoretically, one could write a procedure that could do
matrix operations on different-sized matrices when called by different
calling statements (or by two executions of the same calling statement).
However, PRACTICALLY speaking ....   This is answer #1 to the above
question.

2.  I predict that Ada's feature will be grossly abused, where the
programmer can specify the underlying values to be used in an enumerated
type.  Nonetheless, there are occasional valid needs for such a feature.

3.  Passing portions of arrays as procedure parameters.  That's a big
reason that there is still no substitute for Fortran, even for some new
code.

4.  Arrays of procedure names / function names.  C can simulate these
with function pointers (see the sketch below).  (Ada has it for tasks,
but that's not the same thing and can only provide an expensive
simulation for it.)

I'm sure there's more.
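
[For point 4, an invented C fragment showing the kind of table meant --
a dispatch table of function pointers indexed by an opcode:]

    #include <stdio.h>

    static int op_add(int a, int b) { return a + b; }
    static int op_sub(int a, int b) { return a - b; }
    static int op_mul(int a, int b) { return a * b; }

    /* an array of "procedure names": each entry is a pointer to a function */
    static int (*ops[])(int, int) = { op_add, op_sub, op_mul };

    int main(void)
    {
        int i;

        for (i = 0; i < 3; i++)
            printf("%d\n", ops[i](6, 3));   /* prints 9, 3, 18 */
        return 0;
    }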

However, if one needs to do such things, there are still benefits to
debugging a Pascal program that does most of what's needed, before
turning to other languages.

-- Norman Diamond

UUCP:  {decvax|utzoo|ihnp4|allegra|clyde}!watmath!watdaisy!ndiamond
CSNET: ndiamond%watdaisy@waterloo.csnet
ARPA:  ndiamond%watdaisy%waterloo.csnet@csnet-relay.arpa

"Opinions are those of the keyboard, and do not reflect on me or higher-ups."