[comp.unix.cray] why has Cray dropped CPP support from cf77?

bernhold@qtp.ufl.edu (David E. Bernholdt) (02/18/91)

The latest release of cf77 no longer supports automatically running .F
files through CPP before compiling them.  This move seems to be a step
backward from what I think is a _very_ useful feature common to
_many_ unix-based fortran implementations.  Does anyone know why they
did it?

patrick@convex.COM (Patrick F. McGehearty) (02/18/91)

In article <1298@red8.qtp.ufl.edu> bernhold@qtp.ufl.edu (David E. Bernholdt) writes:
>
>The latest release of cf77 no longer supports automatically running .F
>files through CPP before compiling them.  This move seems to be a step
>backward from what I think is a _very_ useful feature common to
>_many_ unix-based fortran implementations.  Does anyone know why they
>did it?

I know nothing about Cray's support decisions, but I do know that
while cpp works as a preprocessor for many programs, it also has
weaknesses, since the tokenizing rules for Fortran and C are so
different.  For example, any token (string, identifier, whatever) that is
spread over a continuation line is not recognized by CPP.  Similarly, any
token with embedded spaces is not recognized by CPP.  Comment lines
in the middle of continuation lines can cause strange behavior.
A blanket decision to run CPP over large "dusty-deck" unstructured Fortran
programs can reach up and bite you without giving any indication of what
happened.  Use with caution!
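
To make those failure modes concrete, here is a contrived fixed-form
fragment (all of the names are made up, and exact behavior varies a
little from one cpp to another):

#define NMAX 100
      PROGRAM DEMO
C     The reference below is expanded; the macro name is a single token.
      INTEGER A(NMAX)
C     The next reference is NOT expanded: the name is split across a
C     continuation line, so cpp sees only the fragments NM and AX.
      INTEGER B(NM
     &AX)
C     Nor is this one: Fortran ignores the embedded blank, cpp does not.
      INTEGER C(N MAX)
      END

After cpp, A is dimensioned 100 as intended, but the declarations of B
and C still say NMAX, a name the Fortran compiler has never heard of; the
diagnostic you eventually get points nowhere near the real cause.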

hirchert@ncsa.uiuc.edu (Kurt Hirchert) (02/20/91)

In article <1298@red8.qtp.ufl.edu> bernhold@qtp.ufl.edu (David E. Bernholdt) writes:
>
>The latest release of cf77 no longer supports automatically running .F
>files through CPP before compiling them.  This move seems to be a step
>backward from what I think is a _very_ useful feature common to
>_many_ unix-based fortran implementations.  Does anyone know why they
>did it?

I don't know, but I could guess.  As I understand it, the standard C compiler
(scc) doesn't use cpp, but does preprocessing internally, instead.  Under
Unicos 6.0, scc becomes the default cc, and the previous cc becomes pcc,
presumably with the intention of eventually eliminating support for pcc.
If cpp is seen as being part of a product that is no longer going to be
supported, then I can see why they would not want to continue using it in
one of their other supported products.

Now, scc has switches so that it can be used as a replacement for cpp, so
I suppose you could lobby with CRI to run .F files through that, but I can
imagine the confusion when the documentation for cf77 says that under some
circumstances it runs your Fortran source through cc!
-- 
Kurt W. Hirchert     hirchert@ncsa.uiuc.edu
National Center for Supercomputing Applications

pmk@craycos.com (Peter Klausler) (02/20/91)

In article <1991Feb19.162007.28774@ncsa.uiuc.edu> hirchert@ncsa.uiuc.edu (Kurt Hirchert) writes:
>In article <1298@red8.qtp.ufl.edu> bernhold@qtp.ufl.edu (David E. Bernholdt) writes:
>>
>>The latest release of cf77 no longer supports automatically running .F
>>files through CPP before compiling them.  This move seems to be a step
>>backward from what I think is a _very_ useful feature common to
>>_many_ unix-based fortran implementations.  Does anyone know why they
>>did it?
>
>I don't know, but I could guess.  As I understand it, the standard C compiler
>(scc) doesn't use cpp, but does preprocessing internally, instead.  Under
>Unicos 6.0, scc becomes the default cc, and the previous cc becomes pcc,
>presumably with the intention of eventually eliminating support for pcc.
>If cpp is seen as being part of a product that is no longer going to be
>supported, then I can see why they would not want to continue using it in
>one of their other supported products.
>
>Now, scc has switches so that it can be used as a replacement for cpp, so
>I suppose you could lobby with CRI to run .F files through that, but I can
>imagine the confusion when the documentation for cf77 says that under some
>circumstances it runs your Fortran source through cc!

It would appear infeasible to use an ANSI Standard C preprocessor or
preprocessing phase as a general cpp-like macro processor for Fortran or CAL.
Why? ANS C preprocessing is token-based, not character-based, and applies ANS C
tokenization to its input. This works fine for C, of course, but has some
trouble with Hollerith data, CAL's odd O' and X' syntax, apostrophes in Fortran
comments, etc.
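
For instance (a made-up fragment; the CAL case is analogous, since its
O'...' and X'...' constants open with an apostrophe that a C tokenizer
takes as the start of a character constant):

      PROGRAM HOLLER
C     This comment's apostrophe is fine in Fortran, but an ANS C
C     tokenizer sees an unterminated character constant.
      INTEGER MSG(2)
C     Hollerith data: the apostrophe below is just another character,
C     not the start of a quoted string.
      DATA MSG /8HDON'T DO, 8HTHAT NOW/
      PRINT 10, MSG
   10 FORMAT (1X, 2A8)
      END

A character-oriented preprocessor passes all of that through untouched;
a tokenizing one typically complains about unterminated character
constants.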

Beats me how we'll ever get rid of /lib/cpp, or if we should even want to do so.

khb@chiba.Eng.Sun.COM (Keith Bierman fpgroup) (02/20/91)

In article <1991Feb19.162007.28774@ncsa.uiuc.edu> hirchert@ncsa.uiuc.edu (Kurt Hirchert) writes:


   >did it?

   I don't know, but I could guess.  As I understand it, the standard C compiler
   (scc) doesn't use cpp, but does preprocessing internally, instead.  Under
   Unicos 6.0, scc becomes the default cc, and the previous cc becomes pcc,
....

Inasmuch as unix f77 has traditionally handled .F files in a particular
fashion (admittedly there are sometimes a few implementation-specific
details ;>), when wearing _my_ consumer hat, I'd very much complain if a
vendor made it stop working.

There are several ways for the vendor to ensure that f77 continue
working; as a consumer I don't care how.

Of course, as a vendor I have ideas ..... ;>

But seriously, if you folks like the functionality of #ifdef and
friends, you will have to complain whenever someone leaves it out
(from this discussion, Cray; from other discussions, IBM's xlf on the
RS/6000; those are the only exceptions in UnixLand that come to mind).
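
The functionality in question is trivial to describe but valuable in
practice.  A typical .F fragment looks something like the following
(SECOND and ETIME are only illustrative stand-ins here, and the fragment
assumes CRAY is predefined by cpp or supplied with -D):

      PROGRAM PORT
      REAL T1, TARR(2)
#ifdef CRAY
C     use the Cray library timer
      T1 = SECOND()
#else
C     use the BSD f77 library timer
      T1 = ETIME(TARR)
#endif
      PRINT *, 'TIMER READS ', T1
      END

Take away the automatic preprocessing pass and every such file needs an
extra, hand-rolled build step.
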
--
----------------------------------------------------------------
Keith H. Bierman    kbierman@Eng.Sun.COM | khb@chiba.Eng.Sun.COM
SMI 2550 Garcia 12-33			 | (415 336 2648)   
    Mountain View, CA 94043

tchrist@convex.COM (Tom Christiansen) (02/20/91)

From the keyboard of pmk@craycos.com (Peter Klausler):
:It would appear infeasible to use an ANSI Standard C preprocessor or
:preprocessing phase as a general cpp-like macro processor for Fortran or CAL.
:Why? ANS C preprocessing is token-based, not character-based, and applies ANS C
:tokenization to its input. This works fine for C, of course, but has some
:trouble with Hollerith data, CAL's odd O' and X' syntax, apostrophes in Fortran
:comments, etc.
:
:Beats me how we'll ever get rid of /lib/cpp, or if we should even want to do so.

The ANSI cpp broke a lot of existing applications.  Wearing my sysadmin
and toolsmith hat, I get the feeling that the committee either didn't
recognize or else didn't care that cpp was a *general tool* used by many
utilities and scripts to do macro processing, and that by tying it this
much more closely to C, they broke these existing applications.  I can no
longer use cpp in its ANSI form for a lot of things I used to be able to
use it for, like xrdb, perl, and various Makefiles and sysadmin scripts.
Thank God ANSI hasn't gotten to m4 yet.

--tom
--
Tom Christiansen		tchrist@convex.com	convex!tchrist
 "All things are possible, but not all expedient."  (in life, UNIX, and perl)

dik@cwi.nl (Dik T. Winter) (02/20/91)

 > >In article <1298@red8.qtp.ufl.edu> bernhold@qtp.ufl.edu (David E. Bernholdt) writes:
About Cray dropping support for .F

 > In article <1991Feb19.162007.28774@ncsa.uiuc.edu> hirchert@ncsa.uiuc.edu (Kurt Hirchert) writes:
About scc taking over from cc.  (I would say good riddance, cc is very bug-
ridden.)

In article <1991Feb19.203555.8262@craycos.com> pmk@craycos.com (Peter Klausler) writes:
 > It would appear infeasible to use an ANSI Standard C preprocessor or
 > preprocessing phase as a general cpp-like macro processor for Fortran or CAL.
 > Why? ANS C preprocessing is token-based, not character-based, and applies ANS C
 > tokenization to its input. This works fine for C, of course, but has some
 > trouble with Hollerith data, CAL's odd O' and X' syntax, apostrophes in Fortran
 > comments, etc.
(Note that not all pre-ANSI cpp's were character-based.)
But there is more to come.  scc has the -E flag to only run the preprocessor.
It will include #line directives (those are not a problem; you can filter
them out), but it does not always put them between source lines; sometimes
one lands in the middle of a line!  For example, when the first line is:
      SUBROUTINE A
that becomes:
      
#line 1 ....
SUBROUTINE A
(note the line with 6 spaces on the line before #line :-)).
Now tell the Fortran compiler about that!  Of course you could correct that,
but that is one more tool that has to be used differently on the Cray.

Another comment about the RS6000: yes, xlf does not use cpp, but you can
use the preprocessor on Fortran sources without ill effects.  Commented
assembler is a completely different story (because # also starts a comment
in the assembler).  Pre-ANSI cpp's would have no problem with that as long
as you did not put your comment # in column one; with ANSI cpp's this is
different.
--
dik t. winter, cwi, amsterdam, nederland
dik@cwi.nl

allison@convex.com (Brian Allison) (02/21/91)

In article <1991Feb20.101450.18745@robobar.co.uk> ronald@robobar.co.uk (Ronald S H Khoo) writes:
>Is there any reason why vendors like Cray and Convex shouldn't make
>use of /usr/local/lib/gcc-cpp -traditional for these purposes ?
>(in other words, why not retain the availability of a reasonably
>traditional cpp ?)

I don't know about Cray, but Convex's cpp has a -pcc switch to force it to
(quoting the man page) "behave compatibly with earlier preprocessors that
were not ANSI C conforming."  The C compiler has the same switch.
-- 
Brian Allison			"If pro and con are opposites, is
Convex Computer Corp.		 Congress the opposite of progress?"
Richardson, TX					- Richard Lederer

dik@cwi.nl (Dik T. Winter) (02/21/91)

In article <1991Feb19.230305.22563@convex.com> tchrist@convex.COM (Tom Christiansen) writes:
 > The ANSI cpp broke a lot of existing applications.  Wearing my sysadmin
 > and toolsmith hat, I get the feeling that the committee either didn't
 > recognize or else didn't care that cpp was a *general tool* used by many
 > utilities and scripts to do macro processing, and that by tying it this
 > much more closely to C, they broke these existing applications.

They didn't care.  At least, I have read so many comments to this effect that
it must be true.  The reasoning was (these are my words, and I represent
nobody):
1.	The C preprocessor is just what it says it is: a C preprocessor.
2.	There are enough implementations of C compilers that do not provide
	a freestanding preprocessor.
3.	There are a number of slightly incompatible implementations of the
	preprocessor.
Clearly 3 dictated that the preprocessor ought to be standardized (and there
were lots of people yelling that the committee's design broke existing
applications; the same would have been true had they made other choices).
In order to standardize it, it was necessary to tie it more closely to
the C language; hence they didn't care.

Of course, all users who used cpp to process Fortran, Pascal, perl, and
what have you are left in the dark.  (Is the utility calendar(1) now also broken?)
On the other hand, if you did use cpp to do only conditional compilation,
simple substitution and file inclusion, it would not be too difficult to
write a replacement.  One of these days I might even do that.
--
dik t. winter, cwi, amsterdam, nederland
dik@cwi.nl

ronald@robobar.co.uk (Ronald S H Khoo) (02/21/91)

[ no longer a cray/fortran issue, I think, so followups redirected ]

allison@convex.com (Brian Allison) writes:

> I don't know about Cray, but Convex's cpp has a -pcc switch to force it to
> (quoting the man page) "behave compatibly with earlier preprocessors that
> were not ANSI C conforming."

(Does Convex's .F.o default make rule call cpp -pcc ?)

Hmm, so why did Tom say that he could no longer use cpp for his perl
scripts ?  Surely all he's got to do then is to get perl -P to call cpp -pcc ?
Tom ?

Hmm..  Thinking about it, perl really does need to be told about cpp flags
for -P, doesn't it?  Should Configure try -traditional and/or -pcc and
use them if accepted ?  Not that I've ever used -P, but if I ever do ...

-- 
Ronald Khoo <ronald@robobar.co.uk> +44 81 991 1142 (O) +44 71 229 7741 (H)