[comp.std.c] Line length

diamond@tkou02.enet.dec.com (diamond@tkovoa) (05/07/90)

In article <1374@sdrc.UUCP> scjones@sdrc.UUCP (Larry Jones) writes:
>In article <1626@tkou02.enet.dec.com>, diamond@tkou02.enet.dec.com (diamond@tkovoa) writes:
>>In article <1990Apr26.125851.20728@contact.uucp> ross@contact.UUCP (Ross Ridge) writes:
>>>The macro a expands to 5 517 094 707 031 349 characters.
>>Yeah, I thought that was greater than 509.
>If you check the phases of translation, however, logical lines
>are a well-defined concept that occur BEFORE macro expansion.
>Since the preprocessor grammar is token based, the result of
>preprocessing is a stream of tokens, not logical lines, so the
>size of the result is not subject to the logical line length
>limit!

In Phase 3, "... decomposed into preprocessing tokens ... New-line
characters are retained."  So logical lines exist even after
tokenizing.  When do new-line characters go away?  In Phase 7,
"White-space characters separating tokens are no longer significant."

It seems that Tom Plum's letter to me agreed with the rules of the
standard, and the limit applies after macro expansion (as well as
before).  These rules were not overridden by an example (half-sarcasm
here), and no one has quoted an official interpretation.

-- 
Norman Diamond, Nihon DEC     diamond@tkou02.enet.dec.com
This_blank_intentionally_left_underlined________________________________________

gwyn@smoke.BRL.MIL (Doug Gwyn) (05/07/90)

In article <1645@tkou02.enet.dec.com> diamond@tkou02.enet.dec.com (diamond@tkovoa) writes:
>In Phase 3, "... decomposed into preprocessing tokens ... New-line
>characters are retained."  So logical lines exist even after
>tokenizing.

But there is no meaningful measure of line length (# of characters in
a source line) at that point, because one has tokens, not characters
in the original sense.

>These rules were not overridden by an example (half-sarcasm
>here), and no one has quoted an official interpretation.

Nobody had properly requested an official interpretation of this,
as of the last X3J11 meeting in March 1990.  If you care about this,
you should send in your request to CBEMA X3.

scjones@sdrc.UUCP (Larry Jones) (05/08/90)

In article <1645@tkou02.enet.dec.com>, diamond@tkou02.enet.dec.com (diamond@tkovoa) writes:
> In article <1374@sdrc.UUCP> scjones@sdrc.UUCP (Larry Jones) writes:
> >Since the preprocessor grammar is token based, the result of
> >preprocessing is a stream of tokens, not logical lines, so the
> >size of the result is not subject to the logical line length
> >limit!
> 
> In Phase 3, "... decomposed into preprocessing tokens ... New-line
> characters are retained."  So logical lines exist even after
> tokenizing.  When do new-line characters go away?  In Phase 7,
> "White-space characters separating tokens are no longer significant."

I'm not convinced that logical lines still exist just because the
new-line characters are preserved as new-line tokens.  I guess
the question is whether "logical source line" is a well-defined
term that refers explicitly to the result of Phase 2, or an
ill-defined term that just refers to the general concept of
lines.

> It seems that Tom Plum's letter to me agreed with the rules of the
> standard, and the limit applies after macro expansion (as well as
> before).  These rules were not overridden by an example (half-sarcasm
> here), and no one has quoted an official interpretation.

Sounds to me like there should probably be an official request for
interpretation on this issue.
----
Larry Jones                         UUCP: uunet!sdrc!scjones
SDRC                                      scjones@SDRC.UU.NET
2000 Eastman Dr.                    BIX:  ltl
Milford, OH  45150-2789             AT&T: (513) 576-2070
"You know how Einstein got bad grades as a kid?  Well MINE are even WORSE!"
-Calvin