[comp.lang.misc] Sigh. Doesn't anyone bother to read the literature?

cs450a03@uc780.umd.edu (03/30/91)

Dave Chase writes:
>All this drivel about what language xyzzy can or can't do is pretty
>tiresome, since nobody is saying anything new.  The arguments are also
>pretty pointless, since most of the languages discussed (all but ML,
>Russell, and Quest), whether "statically" or "dynamically" typed, have
>pitifully inadequate type systems (Eiffel too -- nice touch, leaving
>that contravariance hole in there for procedure types), if they claim
>to have type systems at all.

Nothing new under the Sun, eh?

It may not be true for you, but I find that reviewing basic concepts
and assumptions is an extremely fruitful line of thought.  

>(2) does any "expert" out there care to explain how "data types are
>    values"?  (Hans Boehm is not allowed to answer this question)

"What operations are allowed on a datum" is representable as data.

The point is not that data types "are" values.  The point is that
viewing data types as values is a useful approach to problem solving.

Of course, "data type" means different things to different people.  To
some, it is the "word size" used to hold the bit patterns that
represent values.  To some, it is what kind of register holds these
values.  To some, it is a syntactic feature of a programming language.

If you are dealing with large data objects (much larger than the word
size of the machine you are working on), it becomes convenient to
"tag" these structures so that functions may reject values which are
outside their domain.  This is analogous to the sort of thing that you
need to do to ensure that machine operations are doing meaningful
work, but it is also different.

Disclaimer: this technique is not monopolized by any language, though
some languages specialize in certain forms of this technique.

Raul Rockwell

chased@rbbb.Eng.Sun.COM (03/30/91)

All this drivel about what language xyzzy can or can't do is pretty
tiresome, since nobody is saying anything new.  The arguments are also
pretty pointless, since most of the languages discussed (all but ML,
Russell, and Quest), whether "statically" or "dynamically" typed, have
pitifully inadequate type systems (Eiffel too -- nice touch, leaving
that contravariance hole in there for procedure types), if they claim
to have type systems at all.

People are also attributing the qualities of current language
implementations to the languages themselves; in many cases, either
"customization" (Chambers & Ungar) or constant propagation over the
types themselves ("Data Types are Values", Demers and Donahue) permits
dandy optimization both at compile time and at run time.  Does anyone
care to explain why truly polymorphic languages *must* make certain
speed compromises, regardless of implementation technology?  (I
suspect it is true, but I suspect that none of the usual suspects can
explain why it is true.)

Of course, there was no need to mention these papers, because I'm sure
that all you experts have read them and not only understand them, but
could do a much better job yourselves overnight in C++.

By the way, how efficient is a buggy program?

To translate:  here are some new challenges:

(1) why must a truly polymorphic (undefined term) language be less
    efficient than a non-polymorphic language?

(2) does any "expert" out there care to explain how "data types are
    values"?  (Hans Boehm is not allowed to answer this question)

(3) if you had to use "void *" to program generically in C/C++, what
    does this say about the "type system"?

(4) what's the contravariance rule?

It seems to me that one ought to have thought about these questions in
some detail before spouting off about type systems and expressiveness.

David Chase
Sun

brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (04/01/91)

In article <10733@exodus.Eng.Sun.COM> chased@rbbb.Eng.Sun.COM () writes:
> People are also attributing the qualities of current language
> implementations to the languages themselves;

It's very difficult *not* to do that, especially for languages without
formal standards. Of course, one should give greater weight to more
modern implementations based on current research.

> (1) why must a truly polymorphic (undefined term) language be less
>     efficient than a non-polymorphic language?

Easy: you can implement an arbitrarily complex approximation to a Turing
machine through the type tags. Hence any optimizer of bounded complexity
will fail to figure out the data flow. (This is the same argument as why
an optimizer cannot always convert an optimal pointer program into an
optimal array program.)

> (2) does any "expert" out there care to explain how "data types are
>     values"?  (Hans Boehm is not allowed to answer this question)

I'm not a programming language expert, but I'll repeat my usual mundane
argument for why it's useful to have data types as values: viz., with
such values, you can implement dynamic typing, and without such values,
you cannot (at least not obviously).

> (3) if you had to use "void *" to program generically in C/C++, what
>     does this say about the "type system"?

I don't think you need to do this in C++ for any of the applications
proposed here. Yes, you do have to implement some amount of typechecking
yourself if you want properly checked callback functions in C. So what?

> (4) what's the contravariance rule?
> It seems to me that one ought to have thought about these questions in
> some detail before spouting off about type systems and expressiveness.

Question (4) for you, David: What's Tarski's Q-system? Shall I be as
snobbish as you and claim that anyone who hasn't thought deeply about
Q-systems is not qualified to discuss programming languages?

---Dan

oz@yunexus.yorku.ca (Ozan Yigit) (04/01/91)

In article <335:Mar3121:55:3891@kramden.acf.nyu.edu> 
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:
>In article <10733@exodus.Eng.Sun.COM> chased@rbbb.Eng.Sun.COM () writes:
>> People are also attributing the qualities of current language
>> implementations to the languages themselves;
>
>It's very difficult *not* to do that, especially for languages without
>formal standards.

or, for those languages which have formal (or near-formal) standards that
you have not bothered to read ...

brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (04/02/91)

In article <22188@yunexus.YorkU.CA> oz@yunexus.yorku.ca (Ozan Yigit) writes:
> In article <335:Mar3121:55:3891@kramden.acf.nyu.edu> 
> brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:
> >In article <10733@exodus.Eng.Sun.COM> chased@rbbb.Eng.Sun.COM () writes:
> >> People are also attributing the qualities of current language
> >> implementations to the languages themselves;
> >It's very difficult *not* to do that, especially for languages without
> >formal standards.
> or, for those languages which have formal (or near-formal) standards that
> you have not bothered to read ...

Ah, yes, you're the type who thinks that ANSI C equals C, and that
Fortran 8X, oops I mean 9X, equals Fortran, and that Common Lisp equals
Lisp... Too bad. C in the real world is *not* the same as ANSI C. Sure,
I find ANSI C and Fortran 21XX and Common Lisp quite interesting. But
anyone who refuses to ``attribute the qualities of current language
implementations to the languages themselves''---except for Algol---is
deluding himself.

---Dan