[comp.text.tex] Why use TeX if ...

inm501@csc.anu.edu.au (05/04/91)

	I am using OzTeX on the Macintosh.  I've almost come to the conclusion 
that it is not worth learning TeX or LaTeX if:

(1)  you already have the program mentioned below;
(2)  you don't mind paying for the program; 
(3)  you don't want to transfer your documents to other platforms; or,
(4)  you are just an average user who doesn't require much sophisticated 
     typesetting.

Things that TeX is supposed to be good for:

(1)  Cross-referencing:
	I can do what BibTeX can do with things like EndNote and ProCite.
I believe that with a little bit of fiddling, I can get EndNote to
cross-reference my diagrams, and probably equations as well.

(2)  Typesetting equations:
	I think MathType is doing a pretty good job and it is fool-proof.
Incidentally, MathType comes with a TeX interface which converts equations 
into TeX commands.

(3)  Macros:
	If you like to automate things, Nisus doesn't look bad at all. I've
played with the demo and it seems to be pretty powerful.



Things that I don't like about TeX and OzTeX (no flame intended!!):

(1)  It doesn't support background printing in OzTeX so that the Mac is 
held up when the document is waiting to be printed.

(2)  It is a *PAIN* to incorporate PS or EPS diagrams in an OzTeX document if 
you want them centered automatically.  Larry Siebenmann wrote the BoxedEPSF 
macros for this purpose. They are pretty good, but I have yet to get them to 
work for PS files.

(3)  A few auxiliary files are generated along the way.  They usually add up 
to be bigger than the corresponding Word file.

(4)  Most WP programs on the Mac do come with a spelling checker these days.
To spell-check a TeX file, you have to detex it first or find a 
spelling checker that will ignore TeX commands.  I don't think there is
such a spelling checker on the Mac yet.

	I am looking for someone to convince me otherwise.  Could someone give
me some concrete examples showing that TeX is superior to a combination of the
Mac programs mentioned above?

Ida

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/05/91)

inm501@csc.anu.edu.au writes:

>	I am using OzTeX on the Macintosh.  I've almost come to a conclusion 
>that is not worth learning about TeX, LaTeX if:

>(1)  you already have the program mentioned below;

>(2)  you don't mind paying for the program; 

but then below you say:

>Things that I don't like about TeX and OzTeX (no flame intended!!):

>(1)  It doesn't support background printing in OzTeX so that the Mac is 
>held up when the document is waiting to be printed.

>(2)  It is a *PAIN* to incorporate PS, EPS  diagrams in OzTeX document if you 
>want to center it automatically. 

If you don't mind spending a few bucks, why don't you buy
Textures, which uses most features of the Mac to make things
go as smoothly as possible? In particular, your objections
above then no longer hold.

>Things that TeX is supposed to be good for:

Waddayamean 'supposed'?

>(1)  Cross-referencing:
>	I can do what BibTeX can do with things like EndNote and ProCite.
>I believe that with a little bit of fiddling, I can get EndNote to cross
>reference my diagrams, or probably equations as well.

And without changing more than a single line, can you alter
your bibliography style?

>(2)  Typesetting equations:
>	I think MathType is doing a pretty good job and it is fool-proof.
>Incidentally, MathType comes with a TeX interface which converts equations 
>into TeX commands.

MathType's translation is not always correct, if I remember a review
in TeXline rightly. Furthermore, TeX's output is the best looking,
no competition.

>(3)  Macros:
>	If you like to automate things, Nisus doesn't look bad at all. I've
>played with the demo and it seems to be pretty powerful.

Oh, innocence! Demos always look good. Use both packages for
a while, and then tell me you can do with Nisus what you can
do with TeX.

>(4)  Most WP programs on the Mac do come with spelling checker these days.
>To spell check a TeX files, you will have to detex it first or to find a 
>spelling checker that will ignore TeX commands.  I don't think there is
>such a spelling checker on the Mac yet.

I'll grant you this one.

>	I am looking for someone to convince me otherwise.  Could someone give
>me some concrete examples that TeX is superior than a combination of the Mac
>programs mentioned above?

Take a reasonably long paragraph (15 lines or so).
Now (all without changing the text):
- make it one line longer
- make it one line shorter
- set it without white space at the end of the last line
- let the first line indent 10pt into the left margin and the
  last line 10pt into the right margin.

No other program but TeX (as far as I know) has the mechanisms
to do such things. I can come up with other examples, but
this is one I like.
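For the record, here is roughly how plain TeX answers each of these
requests. The parameter values are illustrative, and the last two
settings are one plausible recipe rather than the only one; all of
them work because TeX breaks the whole paragraph at once, so a single
parameter can influence every line:

```tex
\looseness=1       % ask the line breaker to set the paragraph one line longer
\looseness=-1      % ... or, with -1, one line shorter
\parfillskip=0pt   % no white space at the end of the last line
\parindent=-10pt   % first line reaches 10pt into the left margin
\parfillskip=-10pt % last line forced to reach 10pt into the right margin
```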

Victor.

dhosek@euler.claremont.edu (Don Hosek) (05/07/91)

In article <1991May4.165602.1@csc.anu.edu.au>, inm501@csc.anu.edu.au writes:
 
> 	I am using OzTeX on the Macintosh.  I've almost come to a conclusion 
> that is not worth learning about TeX, LaTeX if:
 
> (1)  you already have the program mentioned below;
> (2)  you don't mind paying for the program; 
> (3)  you don't want to transfer you documents to other platforms; or,
> (4)  you are just an average user who doesn't require much sophisticated 
>      typesetting.
 
> Things that TeX is supposed to be good for:
 
> (1)  Cross-referencing:
> 	I can do what BibTeX can do with things like EndNote and ProCite.
> I believe that with a little bit of fiddling, I can get EndNote to cross
> reference my diagrams, or probably equations as well.

LaTeX is useful for more than just cross-referencing. The ability
to describe the document in terms of logical structures rather
than physical appearance is, IMHO, invaluable. The "style sheet"
concept of Microfluff Word comes close, but I have found it
tedious and frequently inadequate (try to come up with a section
heading style which suppresses paragraph indentation on the first
paragraph of a section).
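For what it's worth, LaTeX gets this right out of the box: its
sectioning commands set \@afterindentfalse, so the first paragraph
after a heading is not indented. A hand-rolled heading (the macro
name here is made up) can use the same internals:

```latex
\makeatletter
\newcommand{\myheading}[1]{%
  \par\bigskip\noindent
  {\large\bf #1\par}\nobreak\medskip
  \@afterindentfalse\@afterheading % suppress the next paragraph's indent
}
\makeatother
```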
 
> (2)  Typesetting equations:
> 	I think MathType is doing a pretty good job and it is fool-proof.
> Incidentally, MathType comes with a TeX interface which converts equations 
> into TeX commands.

Could be. I have seen some equation editors which required one to
position superscripts etc. manually. It still surprises me that
there are people who think that's a good idea.
 
> Things that I don't like about TeX and OzTeX (no flame intended!!):
 
> (1)  It doesn't support background printing in OzTeX so that the Mac is 
> held up when the document is waiting to be printed.
 
> (2)  It is a *PAIN* to incorporate PS, EPS  diagrams in OzTeX document if you 
> want to center it automatically.  Larry Siebenmann wrote a macro for BoxedEPSF 
> for this purpose. It is pretty good but I had yet to get it to work for PS 
> files.

You might want to look at Textures, a commercial version of TeX
for the mac. I'm not sure if they've finished 3.14 yet (Barry, if
you're looking in, perhaps you could comment). It does provide a
nice integrated environment for TeX; when I teach LaTeX courses,
my preferred environment is a lab with macs running Textures
since we can get right down to doing TeX without having to muck
about learning Unix or DOS or vi (yes, I once taught a class
where the students had to use vi; by the fifth day some of the
students were ready to drop kick the computer). On the other
hand, there are some slight awkwardnesses occasioned by the Mac
environment which has some assumptions a little _too_ different
from the original TeX environment for things to always work
perfectly (the most noticeable is the spaces-in-filenames
problem). OzTeX is an attempt at making a Mac run TeX like a
mainframe so the interface is not quite as nice.
 
> (3)  A few files are generated along the way.  They usually add up to be bigger
> than the corresponding Word file.

True, but they have their uses. For example, it would not be
difficult to use AUX files to allow inter-document
cross-references with LaTeX (anyone up to the challenge? I say
fifteen minutes tops).
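A sketch of the idea (the macro name \externalrefs is hypothetical):
a LaTeX .aux file is just a series of \newlabel commands, so reading
another document's .aux file makes its labels available to \ref and
\pageref here:

```latex
\makeatletter
\newcommand{\externalrefs}[1]{\@input{#1.aux}} % read chapter1.aux etc.
\makeatother
% usage, assuming chapter1.tex contains \label{eq:main}:
%   \externalrefs{chapter1}  ...  see equation~\ref{eq:main}
```

Duplicate label names across documents would clash, which is
presumably where the remaining minutes of the fifteen go.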
 
> (4)  Most WP programs on the Mac do come with spelling checker these days.
> To spell check a TeX files, you will have to detex it first or to find a 
> spelling checker that will ignore TeX commands.  I don't think there is
> such a spelling checker on the Mac yet.

Don't you want to spell your TeX commands right? Put 'em in the
dictionary. At first you will spend a while adding stuff in but
afterwards you'll find it quite handy.
 
> 	I am looking for someone to convince me otherwise.  Could someone give
> me some concrete examples that TeX is superior than a combination of the Mac
> programs mentioned above?

-dh

Don Hosek                  
dhosek@ymir.claremont.edu  
Quixote Digital Typography 
714-625-0147               

price@uclapp.physics.ucla.edu (John Price) (05/07/91)

In article <1991May6.180900.1@euler.claremont.edu>, dhosek@euler.claremont.edu (Don Hosek) writes:
>OzTeX is an attempt at making a Mac run TeX like a
>mainframe so the interface is not quite as nice.

	Personally, I think that's good.  I use TeX on our VAXen, but I'm 
using OzTeX on my Mac at home to write my dissertation.  I find that only 
having to deal with one interface is a good thing.  Also, Textures isn't 
(or at least *wasn't*) able to produce DVI files that I could transfer 
directly to my VAX - OzTeX is.

	The fact that OzTeX is free played no part in this decision - 
Textures is available to me at work.  I prefer OzTeX.  Your mileage will 
vary.

           John Price * * * * price@uclapp.physics.ucla.edu
           Where there is no solution, there is no problem.

dhosek@freke.claremont.edu (Don Hosek) (05/07/91)

In article <0094835B.BEE4B7E0@uclapp.physics.ucla.edu>, price@uclapp.physics.ucla.edu (John Price) writes:
> In article <1991May6.180900.1@euler.claremont.edu>, dhosek@euler.claremont.edu (Don Hosek) writes:
>>OzTeX is an attempt at making a Mac run TeX like a
>>mainframe so the interface is not quite as nice.
 
> 	Personally, I think that's good.  I use TeX on our VAXen, but I'm 
> using OzTeX on my Mac at home to write my dissertation.  I find that only 
> having to deal with one interface is a good thing.  Also, Textures isn't 
> (or at least *wasn't*) able to produce DVI files that I could transfer 
> directly to my VAX - OzTeX is.

Actually, if I remember correctly (it's been three years since I
did the in-depth evaluation), there was a utility program shipped
as part of Textures which does produce DVI files. The package I
evaluated was a fairly early one (it was still being sold by
Addison-Wesley at that time) so I'm pretty sure this utility is
shipped in _all_ versions of Textures.

-dh

Damian.Cugley@prg.ox.ac.uk (Damian Cugley) (05/09/91)

> From:	Victor Eijkhout <eijkhout@s41.csrd.uiuc.edu>
> Message-Id:	<1991May4.191951.26699@csrd.uiuc.edu>

> Take a reasonably long paragraph (15 lines or so)
> Now (all without changing the text):
> - make it one line longer
> - make it one line shorter
> - set it without white space at the end of the last line
> - let the first line indent 10pt in the left and the
>  last line 10pt into the right margin.

Great, now do it in TeX and make it "flow" around a PostScript picture
-- in less than 15 minutes.  Indent the left edge of the text of the
page to make room for a 50mmx50mm illustration.  Produce a title page
with the lettering set around a circle -- in less than half an hour.

There is a lot that TeX cannot do.  There is a lot that TeX can only do
with unreasonable difficulty -- very simple magazine layouts (with
*non*-floating illustrations) require a lot of TeX GrandMastery.  There
are very few fonts currently supported on most TeX installations -- even
when PostScript fonts are used, they aren't portable [and we, for
example, have two laser printers, one of which isn't PostScript].

What makes TeX good?  TeX is portable, has the most sophisticated
line-breaking and maths-setting systems, and has one semi-standard
markup style (LaTeX 2.x).  It is the only system where a pauper such as
myself has a chance of making and using their own typefaces.  On the
other hand people who don't have the time to waste becoming an
arch-hacker and who don't care about the rather poor typesetting
produced by DTP systems will be better off with MacWhatever.

---- Damian Cugley -------------------------------- pdc@prg.ox.ac.uk ---
    Computing Laboratory, 11 Keble Rd, Oxford  OX1 3QD  Great Britain   
------------------------------------------------------------------------
     malvern-request@prg.ox.ac.uk		   "share and enjoy"

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/09/91)

Damian.Cugley@prg.ox.ac.uk (Damian Cugley) writes:

>Great, now do it in TeX and make it "flow" around a PostScript picture

You mean around the figure, not just around the bounding box?
The latter is easy; I've always wondered how you'd do the former.
It would mean having a PostScript interpreter on board.

> Indent the left edge of the text of the
>page to make room for a 50mmx50mm illustration.  Produce a title page
>with the lettering set around a circle -- in less than half an hour.

Both no sweat.

>There is a lot that TeX cannot do.  There is a lot that TeX can only do
>with unreasonable difficulty -- very simple magazine layouts (with
>*non*-floating illustrations) require a lot of TeX GrandMastery.

Agreed. But TeX was never meant for that. It is basically
a book typesetter, not a page formatter.

>What makes TeX good?  TeX is portable, has the most sophisticated
>line-breaking and maths-setting systems, and has one semi-standard
>markup style (LaTeX 2.x).  It is the only system where a pauper such as
>myself has a chance of making and using their own typefaces.  On the
>other hand people who don't have the time to waste becoming an
>arch-hacker and who don't care about the rather poor typesetting
>produced by DTP systems will be better off with MacWhatever.

I don't agree. First of all, it's not a choice between
poor typesetting (and personally, I can live more easily
without text flowing around figures than without decent
typesetting) *and* becoming an arch-hacker. If LaTeX
suits your purposes (and it does for many people)
using TeX is just fine.
Only if you want to do sophisticated page
design do you need some fancy dtp package.
But then, what's the use of doing that? I may do it
in a report that I'm writing, but when I submit that
to a journal they tell me 'great, just give us the figure
on glossy paper'.

Victor.

pauld@stowe.cs.washington.edu (Paul Barton-Davis) (05/10/91)

In article <1991May9.164341.14084@csrd.uiuc.edu> eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) writes:
>Damian.Cugley@prg.ox.ac.uk (Damian Cugley) writes:
>
>>Great, now do it in TeX and make it "flow" around a PostScript picture
>
>You mean around the figure, not just around the bounding box?
>The latter is easy. I always wondered how you'd do the former.
>It means you'd have to have a postscript interpreter aboard.
>

Tim Bradshaw (tim@cstr.ed.ac.uk) and I talked a lot about this a
couple of years back. We came to the conclusion that TeX's internal
model of typesetting operations is not powerful enough to do this. You
cannot even specify the exact shape of the picture and wrap around it,
at least not in any general manner. We began work on a program called
TinT (TinT Is Not TeX) that used Lisp as its extension language, was
TeX compatible (i.e. all your macros would work), and retained, amongst
other things, TeX's hyphenation and math setting features. It was
intended that TinT would pass the trip test. The primary difference
was the internal model - we were going to change it from "boxes+glue"
to something a little more PostScript-like, and rather more reflective
of digital typesetting than the model Knuth picked. We liked to think
that we were writing the TeX Knuth would have written if PageMaker had
been around when he began.

Unfortunately, I moved to the US, and Tim moved to Edinburgh, and that
was that. Who knows, perhaps we or someone else will pick up the
challenge again.

>> Indent the left edge of the text of the
>>page to make room for a 50mmx50mm illustration.  Produce a title page
>>with the lettering set around a circle -- in less than half an hour.
>
>Both no sweat.
>
>>There is a lot that TeX cannot do.  There is a lot that TeX can only do
>>with unreasonable difficulty -- very simple magazine layouts (with
>>*non*-floating illustrations) require a lot of TeX GrandMastery.
>
>Agreed. But TeX was never meant for that. It is basically
>a book typesetter, not a page formatter.
>

I would have said that TeX is a formatter using a boxes+glue model. If
you can do it with boxes+glue (or rules and leading) then you can do
it with TeX. Printers used to charge a lot more for the type of thing
that Mac software has made easy, but TeX works like a typesetter, not
like a computer.

>>What makes TeX good?  TeX is portable, has the most sophisticated
>>line-breaking and maths-setting systems, and has one semi-standard
>>markup style (LaTeX 2.x).  It is the only system where a pauper such as
>>myself has a chance of making and using their own typefaces.  On the
>>other hand people who don't have the time to waste becoming an
>>arch-hacker and who don't care about the rather poor typesetting
>>produced by DTP systems will be better off with MacWhatever.
>

The hacking problem is primarily caused by the fact that the TeX
extension language was designed (apparently) as a macro replacement
language. If TeX had a "normal" or "proper" programming language, it
would be a lot easier to extend, without one's having to get used to
its arcane grammar.

>I don't agree. First of all it's not a choice of having
>poor typesetting (and personally, I can live easier
>without text flowing around figures than without decent
>typesetting) *or* becoming an arch-hacker. If LaTeX
>suits your purposes (and it does for many people)
>using TeX is just fine.
>Only if you want to do sophisticated page
>design do you need some fancy dtp package.

Here, I don't agree with Victor. There are a *lot* of things that are
very difficult to do with TeX by way of a general solution, and very
slow and tricky to do in any specific instance. I've come across some
very tough things even for very normal pages. Yes, I solved them, but
then I've been willing to put hours into learning. I've seen Mac
people produce stuff that is way more complex than anything I've done
with TeX - the only thing is that it's normally a per-page process.
That's where TeX really wins - once you master how to do something
properly, it works again, and again, and again.

>But then, what's the use of doing that? I may do it
>in a report that I'm writing, but when I submit that
>to a journal they tell me 'great, just give us the figure
>on glossy paper'.
>

Buy a 1200dpi printer, get some good halftoning software, and do
it in PostScript. This is not an ad for the company I used to work
for, of course .... :-)))

>Victor.

-- paul

-- 
Paul Barton-Davis <pauld@cs.washington.edu> UW Computer Science Lab	 

"People cannot cooperate towards common goals if they are forced to
 compete with each other in order to guarantee their own survival."

edward@priam.Berkeley.EDU (Edward Wang) (05/10/91)

In article <1991May9.204113.17636@beaver.cs.washington.edu> pauld@stowe.cs.washington.edu (Paul Barton-Davis) writes:

>...

>The primary difference
>was the internal model - we were going to change it from "boxes+glue"
>to something a little more PostScript like, and rather more reflective
>of digital typesetting than the model Knuth picked.  We liked to think
>that were writing the TeX Knuth would have written if PageMaker had

Surely the box-and-glue model is a superset of the PostScript
(positioning by coordinate) model.  I agree that most of the glue
features wouldn't be necessary if TeX had a more complete programming
language, but I don't think it really gets in the way.

The parshape problem is not an effect of using boxes and glue.
In fact, I think it's possible (within the TeX line-breaking
model) to specify paragraph shapes by the actual shapes,
rather than by a list of line lengths (\parshape).
Some suitable shape-description language has to be designed, however.
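For comparison, the existing mechanism really is a list of line
lengths. Cutting a rectangular notch for a 50mm-wide illustration
looks like this in plain TeX (line counts illustrative):

```tex
\newdimen\notch  \notch=\hsize  \advance\notch by -50mm
\parshape=5
  0pt \notch   % lines 1-4: indented 0pt, shortened to leave the notch
  0pt \notch
  0pt \notch
  0pt \notch
  0pt \hsize   % line 5 and all later lines: full measure again
```

Anything non-rectangular means computing such a list, pair by pair,
yourself.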

>The hacking problem is primarily caused by the fact that the TeX
>extension language was designed (apparently) as a macro replacement
>language. If TeX has a "normal" or "proper" programming language, then
>it would a lot easier to extend, without getting used to its
>arcane grammar.

I agree.  I can think of three things wrong with the TeX language:
textual substitution macros (rather than functions or even
Lisp-like macros), dynamic scoping (lexical is better, both is best),
and lack of a programmer-visible representation of text (boxes are
not enough). On the last point, it would be sufficient to have a
side-effect-free way to map text (strings as in the input document)
into typeset text. As it is, the result is inconsistencies like
LaTeX's fragile commands.

pauld@stowe.cs.washington.edu (Paul Barton-Davis) (05/10/91)

In article <1991May10.065219.23433@agate.berkeley.edu> edward@priam.Berkeley.EDU (Edward Wang) writes:
>In article <1991May9.204113.17636@beaver.cs.washington.edu> pauld@stowe.cs.washington.edu (Paul Barton-Davis) writes:
>
>>...
>
>>The primary difference
>>was the internal model - we were going to change it from "boxes+glue"
>>to something a little more PostScript like, and rather more reflective
>>of digital typesetting than the model Knuth picked.  We liked to think
>>that were writing the TeX Knuth would have written if PageMaker had
>
>Surely the box-and-glue model is a superset of the Postscript
>(positioning by coordinate) model.  

In principle, I would agree with this. However, in practice, it
doesn't function as a superset. TeX's model is based on
ignoring the contents of each box, which works for general text, where
each set character can be accurately described by a box along with a
few extra details to handle areas where it is not contained by the
box. However, this doesn't work for images, and is very difficult to
work with when curves are heavily in use. You can make boxes+glue have
the same flexibility as a completely generic imaging model like
PostScript, but to do so you have to use boxes that are point
equivalents. This makes TeX run like a dog ...

>I agree that most of the glue
>features wouldn't be necessary if Tex had a more complete programming
>language, but I don't think it really gets in the way.
>

It's not the glue that's the problem - it's the idea that you image a
page by building up a list of boxes, arranged in some coordinate
system.  There's no easy way to get TeX to add something that isn't
adequately described by a box to its vertical list - to do so, you
have to use so many sub-boxes that, as I said above, it kills
performance.

>The parshape problem is not an effect of using boxes and glue.
>In fact, I think it's possible (within the Tex line-breaking
>model) to specify paragraph shapes by the actual shapes,
>rather than by list of line lengths (\parshape).
>Some suitable shape-description language has to be designed, however.
>

Exactly. And TeX has no primitives that are not based on a box
model. Suppose you wanted to specify something that looked like
this:

			XXXXXXXXX
	             XXXXXXXXXXXXXXXXXXX
				XXXXXXXXXXX
	XXXXXXXXXXXXXXXXXXXXXXXXXXX
		XXXXXXXXXXXXXXX
			XXXXXXXXXXXXXXXXXXXX
	XXXXXXXXXXXXXXXXXXXXXX
			XXXXXXXXXXXXXXXXXXXXx

where the resolution of the above cartoon is about 1 pixel (it's a
small shape :-). TeX gives you no constructs to do this with, and even
if you had them, how would you tell TeX "now construct a parshape that
flows around this"?  The data you've somehow given TeX has to be
processed to give "per-line" widths for the line-breaking algorithm to
work. If TeX had this as an internal feature, it might work in
reasonable time (the Mac stuff shows that it can), but trying to do
this with macros, whilst possible, is not my idea of a "good thing".

>>The hacking problem is primarily caused by the fact that the TeX
>>extension language was designed (apparently) as a macro replacement
>>language. If TeX has a "normal" or "proper" programming language, then
>>it would a lot easier to extend, without getting used to its
>>arcane grammar.
>
>I agree.  I can think of three things wrong with the Tex language:
>textual substitution macros (rather than functions or even
>Lisp-like macros), dynamic scoping (lexical is better, both is best),
>lack of a programmer-visible representation of text (boxes are not enough).
>On the last point, it would be sufficient to have a side-effectless
>way to map text (strings as in the input document) into typeset text.
>As it is, the result is inconsistencies like Latex's fragile commands.

Add to that - too many primitives. Even Common Lisp, perhaps the most
burdened common language when it comes to primitives, doesn't come
close to TeX (this is a top-of-the-head assertion that I should really
check). Things like \vbox and \hbox should, in my opinion, be some way
above the primitive level. Knuth didn't think so, it appears, but I'm
guessing that this is largely because of when TeX was written. He
didn't have PageMaker to look at, only traditional typesetting
systems. TeX's language is a pretty good model of that process, but
doesn't reflect the much more powerful models that PostScript and
other PDLs have given us.

-- 
Paul Barton-Davis <pauld@cs.washington.edu> UW Computer Science Lab	 

"People cannot cooperate towards common goals if they are forced to
 compete with each other in order to guarantee their own survival."

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/11/91)

edward@priam.Berkeley.EDU (Edward Wang) writes:

>In article <1991May9.204113.17636@beaver.cs.washington.edu> pauld@stowe.cs.washington.edu (Paul Barton-Davis) writes:

>>The hacking problem is primarily caused by the fact that the TeX
>>extension language was designed (apparently) as a macro replacement
>>language. If TeX has a "normal" or "proper" programming language, then
>>it would a lot easier to extend, without getting used to its
>>arcane grammar.

What do you mean, 'arcane grammar'? TeX, like that other
symbolic language Lisp, has no syntax to speak of.
Control sequences are trivially recognised, macro argument
absorption uses only the simplest type of pattern matching.
Add some trivial facts about braces, and there you have
the whole of TeX.

But I do agree that the language is arcane. This is a consequence
of the fact that TeX is not really processed by a single
interpreter, but rather by three or four processors:
1/ The input processor maps from OS characters to tokens
(with confusion number one: OS characters are not TeX
character tokens);
2/ The expansion processor (roughly the programming-language
level) tackles certain control sequence tokens (confusion
number two: exactly what is expandable); and
3/ The execution processor (roughly the typesetter) accepts
executable control sequences and characters and
makes lists (horizontal/vertical/math) out of them.

These three terms, by the way, are coined by me in
an attempt to explain TeX's workings, both in an
article in the next issue of TUGboat, and in my
upcoming book on TeX.
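A minimal sketch of confusion number two: \uppercase belongs to the
execution processor, so it acts on tokens before they expand.

```tex
\def\greet{hello}
\uppercase{\greet}   % typesets "hello": \uppercase changes only character
                     % tokens; the control sequence \greet passes through
                     % untouched and expands to lowercase text afterwards
\uppercase{hello}    % typesets "HELLO": these are real character tokens
\edef\shout{\greet}  % the expansion processor, by contrast, does expand:
                     % \shout's body is now the five tokens "hello"
```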

>I agree.  I can think of three things wrong with the Tex language:
>textual substitution macros (rather than functions or even
>Lisp-like macros), 

One. Lisp, as I said above, shares with TeX the 'program is data'
principle; its macros accept arbitrary pieces of list
and do textual substitution.

>dynamic scoping (lexical is better, both is best),

Sorry, am not that kind of computer scientist. I don't
know what you mean by this. Examples would be appreciated.

>lack of a programmer-visible representation of text (boxes are not enough).

This would be fun, yes.

>On the last point, it would be sufficient to have a side-effectless
>way to map text (strings as in the input document) into typeset text.
>As it is, the result is inconsistencies like Latex's fragile commands.

You're subject to confusion number one here: TeX's
characters have a category code, so there is no single
mapping from input text to typeset text. The mapping
depends on the catcode assignments, which can be
changed by that very text. This is not quite the
trouble with LaTeX's fragile commands, but it is
related.
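A two-line illustration of how the text can change its own mapping:
with its default category code (14) a percent sign starts a comment
and never reaches the typesetter, but one assignment later the same
input character is typeset.

```tex
here the rest of the line vanishes % this is a comment
\catcode`\%=12
now the sign itself survives: 100%
```

(Note that after the \catcode assignment even this example could no
longer use % for comments.)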

Victor.

pauld@stowe.cs.washington.edu (Paul Barton-Davis) (05/11/91)

In article <1991May10.211802.4344@csrd.uiuc.edu> eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) writes:
>edward@priam.Berkeley.EDU (Edward Wang) writes:
>
>>In article <1991May9.204113.17636@beaver.cs.washington.edu> pauld@stowe.cs.washington.edu (Paul Barton-Davis) writes:
>
>>>The hacking problem is primarily caused by the fact that the TeX
>>>extension language was designed (apparently) as a macro replacement
>>>language. If TeX has a "normal" or "proper" programming language, then
>>>it would a lot easier to extend, without getting used to its
>>>arcane grammar.
>
>What do you mean 'arcane grammar'? TeX, like that other
>symbolic language Lisp, has no syntax to speak of.
>Control sequences are trivially recognised, macro argument
>absorbtion uses only the simplest type of pattern matching.
>Some trivial facts about braces, and there you have
>the whole of TeX.
>

Well, it's not just a question of syntax. TeX, unlike Lisp and most
other languages, does not have what I would call a "proper control
flow" facility. It has no scope. Operators, like braces, are
overloaded. The primitives in TeX seem fundamentally rooted in
typesetting, not programming. Hence, they are great for doing
typesetting effects, and not so good for doing programming.

>
>>I agree.  I can think of three things wrong with the Tex language:
>>textual substitution macros (rather than functions or even
>>Lisp-like macros), 
>
>One. Lisp has (as I said above) like TeX the 'program is data'
>principle; macros accept arbitrary pieces of list
>and does textual substitution.
>

This might be true for Lisp macros, but not for Lisp functions.
Although people often talk of Lisp's treatment of data and program as
equivalent, my experience has been that you have to work quite hard to
use that in any meaningful way.

Anyway, I wouldn't argue that Lisp is the ideal language. When Tim and
I started TinT, we chose Lisp because we both use and hack Emacs a lot,
xlisp was available as a base (as were Scheme and T), and it seemed
like a good model (AutoCAD seems to agree). If there were a free C
interpreter available, we might have looked at that.



-- 
Paul Barton-Davis <pauld@cs.washington.edu> UW Computer Science Lab	 

"People cannot cooperate towards common goals if they are forced to
 compete with each other in order to guarantee their own survival."

marcel@cs.caltech.edu (Marcel van der Goot) (05/11/91)

[ NOTE: This is quite a long posting --- sorry for that. There is a
  summary at the end.
]

We've recently had a number of postings about ``the TeX language,'' its
perceived shortcomings, and comparisons with other typesetting systems.

From pauld@stowe.cs.washington.edu (Paul Barton-Davis):
> The hacking problem is primarily caused by the fact that the TeX
> extension language was designed (apparently) as a macro replacement
> language. If TeX has a "normal" or "proper" programming language, then
> it would a lot easier to extend, without getting used to its
> arcane grammar.

As Victor already pointed out, TeX doesn't have much syntax; what syntax
there is, such as what a number looks like, is fairly straightforward.
The only hard part is the lexical scanning (i.e., the conversion of
characters into tokens).

Also, the history of programming proves you wrong: ``hacking'' (used
in the negative sense) occurs in all programming languages. The hacking
problem is caused by programmers who lack knowledge about programming
techniques.

From edward@priam.Berkeley.EDU (Edward Wang):
> I agree.  I can think of three things wrong with the Tex language:
> textual substitution macros (rather than functions or even
> Lisp-like macros),

TeX is indeed based on macro substitution, rather than on the constructs
found in conventional programming languages. But then, TeX is not a
general-purpose programming language; it is specifically designed for
typesetting text. It is much more of an application program than a
programming language. An important difference from a ``normal'' programming
language is that the typical user wants to present TeX with the text
to be typeset. In other words, the typical user does not want to be
concerned with programming at all (most readers of this newsgroup are, in
that respect, probably not typical users).

As far as I can tell, the only way in which you can have the freedom of
presenting your input as almost plain text, with very little knowledge
about the details of the underlying programming language, is with a
language based on macro substitution. A conventional programming language
with strict syntax means that your input has to obey very strict rules.
It is true, TeX's macros take some getting used to, because of the complexity
of the system. What you get back for that is an unparalleled freedom in
the form your (normal text) input can take. You can make TeX look like
LaTeX (which is only somewhat different), but you could for instance also
make TeX look like troff. With a normal programming language that
is impossible: a C program always looks like a C program, a Pascal program
always looks like a Pascal program, and a Lisp program always looks like
a Lisp program. You cannot decide, ``let's give an end-of-line the meaning
of a semi-colon,'' so that you don't have to write semi-colons in C; you
cannot decide to use square brackets instead of parentheses in Lisp. And
these are only very minor variations; many TeX macros do much more.
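That claim is literal: TeX's category codes let you reshape the surface
syntax on the fly. A small plain TeX sketch (the \emph macro here is
made up for illustration; it is not LaTeX's command):

```tex
% Give square brackets the roles of begin-group (catcode 1) and
% end-group (catcode 2), alongside the usual braces.
\catcode`\[=1
\catcode`\]=2
\def\emph#1{{\it #1\/}}    % defined with ordinary braces
\emph[some stressed text]  % brackets now delimit the argument group
```

Since an undelimited macro argument absorbs any begin-group/end-group
pair, the bracketed text is passed to \emph exactly as a braced group
would be.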

Is it important to have so much control over the form of your input? Yes,
it is. Not so much for you programmer types. Programmers are used to
obeying strict syntax rules. But it is important for people who do the
actual writing of the books, papers, letters, etc. For those people the
ease of *usage* of the macros is important; TeX can make that very easy.
Of course, that means that the *construction* of macros becomes more
complicated.

In short: TeX's macro language makes input of text much easier than
any conventional type programming language could; the price is the
increased burden for the macro designer.


(Edward Wang)
> dynamic scoping (lexical is better, both is best),
> lack of a programmer-visible representation of text (boxes are not enough).
> On the last point, it would be sufficient to have a side-effectless
> way to map text (strings as in the input document) into typeset text.
> As it is, the result is inconsistencies like Latex's fragile commands.

Dynamic scoping is pretty much inherent in macro languages. Lexical scoping
requires strict syntax rules. I have no idea what you mean by your last
remark: what side-effects are you talking about? You can represent strings
as strings (in macros or token registers). Once you *typeset* something,
a lot more than just the text is involved: the font it is set in, for
instance. I think *that* is much more the problem behind LaTeX's
fragile commands.
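For what it's worth, TeX's dynamic scoping is easy to see: a macro picks
up whatever the *current* meaning of a name is at the moment it is
expanded, and a group restores the old meaning on exit. A plain TeX
sketch (macro names invented for the example):

```tex
\def\greeting{Hello}
\def\saygreeting{\greeting}  % refers to \greeting's meaning at use time
{\def\greeting{Goodbye}%     % local redefinition inside a group
 \saygreeting}               % typesets ``Goodbye'' (dynamic lookup)
\saygreeting                 % typesets ``Hello''; the group has ended
```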

Paul Barton-Davis adds to that:
> Add to that - too many primitives. Even Common Lisp, perhaps the most
> burdened common language when it comes to primitives, doesn't come
> close to TeX (this is a top-of-the-head assertion that I should really
> check). Things like vbox and hbox should, in my opinion, be some way
> above the primitive level. [...]

As I said, TeX is more an application program than a programming language.
The main purpose is to typeset text, not to program macros. Therefore,
the comparison with Lisp doesn't make too much sense, I think. Why should
vbox and hbox be at a different level? (I assume you mean, expressed in
terms of other primitives.)

and:
> only traditional typesetting
> systems. TeX's language is a pretty good model of that process, but
> doesn't reflect the much more powerful models that PostScript and
> other PDL's have given us.

Well, TeX is a typesetting system, whereas (as you say) PostScript is
a page-description language. If you take a standard book on PostScript
(say the reference manual or so) then it will almost certainly tell you
the difference. Very few people want to write their text directly in
PostScript. Instead, PostScript is usually produced as output from some
other typesetting system (for instance, TeX). TeX's PDL is the dvi format,
which is indeed less general (but more efficient for the printer) than
PostScript format.

Also, aren't you shooting yourself in the foot a bit here? I think that
PostScript has more primitives than TeX.


(Paul Barton-Davis)
> TeX's model is based on
> ignoring the contents of each box, which works for general text, where
> each set character can be accurately described by a box along with a
> few extra details to handle areas where it is not contained by the
> box. However, this doesn't work for images, and is very difficult to
> work with when curves are heavily in use.

Well, who claims that TeX is good for all kinds of typography? TeX is
intended for typesetting things like books, papers, and letters. In
such environments even pictures are typically indeed just boxes. How
often do you see a picture in the newspaper that has the text flowing
around George Bush's head (say)? Never: everything is cut at right angles.
Such things, as well as typesetting along fancy curves, individual spacing
between characters, etc., *do* occur in, for instance, advertisements
and on posters. Sure, TeX is not very suitable for such things.


			********************

Whew! That was long. A summary:

When TeX is compared with other programs/systems, one should keep in
mind what TeX is intended for. Since TeX is intended for typesetting
books and the like, it should not be compared as if the primary use
of TeX is as a programming language. The emphasis in the design of TeX
is on the possibility to write easy-to-use macro packages, at the
cost of hard-to-program macros. (That, of course, does not imply that
all macros written are easy to use.)

Since TeX is not a page description language, and since PostScript is,
their comparison has very little basis. PostScript is best used with
other programs that generate page descriptions. When it comes to typesetting
books etc., TeX is one of the best such programs available. When it
comes to different forms of typesetting, other programs may be more
helpful.

Finally: No perfect program has ever been written. TeX isn't perfect,
and PostScript isn't perfect either.


                                          Marcel van der Goot
 .----------------------------------------------------------------
 | Blue are the violets,                  marcel@vlsi.cs.caltech.edu
 |    Red are the roses;
 | A rhyme can be set
 |    With paste and boxes.
 |

edward@priam.Berkeley.EDU (Edward Wang) (05/12/91)

In article <1991May11.013248.16286@nntp-server.caltech.edu> marcel@cs.caltech.edu (Marcel van der Goot) writes:

>As far as I can tell, the only way in which you can have the freedom of
>presenting your input as almost plain text, with very little knowledge
>about the details of the underlying programming language, is with a
>language based on macro substitution. A conventional programming language
>with strict syntax means that your input has to obey very strict rules.

I don't think it's desirable or even possible for the input to be completely
unstructured text.  For example, SGML is nothing but structure.
Given that, it's possible to design a syntax based on the natural
text structure.  Syntax then becomes a part of structural correctness.

For example, on a trivial level, here's Tex's \centerline macro
as a Lisp function in an imaginary embedding of Tex in Lisp:

	(defun center-line (x)
	  (line (hss) x (hss)))

The function body (line (hss) x (hss)) has the same nesting as that
of the box-and-glue typesetting model, which is a structure any writer
of such Tex macros has to understand.
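For comparison, plain TeX's own definitions (Appendix B of The TeXbook)
have essentially the same nesting, expressed as macros:

```tex
% \line is an \hbox stretched to the line width \hsize;
% \centerline pads its argument with stretchable glue (\hss) on
% both sides, so the text floats to the center.
\def\line{\hbox to\hsize}
\def\centerline#1{\line{\hss#1\hss}}
```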

Control constructs necessarily have some syntax of their own.
Even Tex can't get away from that.

>Is it important to have so much control over the form of your input? Yes,
>it is. Not so much for you programmer types. Programmers are used to
>obeying strict syntax rules. But it is important for people who do the
>actual writing of the books, papers, letters, etc. For those people the
>ease of *usage* of the macros is important; TeX can make that very easy.
>Of course, that means that the *construction* of macros becomes more
>complicated.

This level of control gives us lots of cute tricks and some truly
useful features, but I don't think the results are easy to use.
My favorite example is that \verb in Latex can't always be used
in macro arguments.  In fact, it's possible to make the opposite
argument that the power is good for the expert, but the inconsistency
is bad for the novice.

>In short: TeX's macro language makes input of text much easier than
>any conventional type programming language could; the price is the
>increased burden for the macro designer.

A conventional language, yes.  However, there's no reason
that a conventional language can't be augmented with constructs for
inputting text.

>Dynamic scoping is pretty much inherent in macro languages. Lexical scoping
>requires strict syntax rules.

This isn't true.  Tex can be changed to be lexically scoped without
even changing the syntax (though the rest of the system is structured
to take advantage of dynamic scoping).  Indeed, macro arguments
in Tex are lexically scoped.
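That last point can be illustrated: a #1 in a macro body always refers
to the textually enclosing \def, never to whichever macro happens to be
expanding at the time. A plain TeX sketch with made-up names:

```tex
\def\wrap#1{\paren{X}}   % this #1 belongs lexically to \wrap
\def\paren#1{(#1)}       % this #1 is \paren's own, not \wrap's
\wrap{Y}                 % typesets (X); \wrap's argument Y is discarded
```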

>When TeX is compared with other programs/systems, one should keep in
>mind what TeX is intended for. Since TeX is intended for typesetting
>books and the like, it should not be compared as if the primary use
>of TeX is as a programming language. The emphasis in the design of TeX
>is on the possibility to write easy-to-use macro packages, at the
>cost of hard-to-program macros. (That, of course, does not imply that
>all macros written are easy to use.)

Nevertheless, Tex is used to write large programs, and these programs
are necessary.  In any case, I believe that both ease of programming and
ease of use can come from the same well-designed language.

Scribe is another story.

Damian.Cugley@prg.ox.ac.uk (Damian Cugley) (05/13/91)

> From:		Victor Eijkhout <eijkhout@s41.csrd.uiuc.edu>
> Message-Id:	<1991May10.211802.4344@csrd.uiuc.edu>

> >In article <1991May9.204113.17636@beaver.cs.washington.edu> 
> > pauld@stowe.cs.washington.edu (Paul Barton-Davis) writes:

> >>The hacking problem is primarily caused by the fact that the TeX
> >>extension language was designed (apparently) as a macro replacement
> >>language. If TeX has a "normal" or "proper" programming language, then
> >>it would be a lot easier to extend, without getting used to its
> >>arcane grammar.

> What do you mean 'arcane grammar'? TeX, like that other
> symbolic language Lisp, has no syntax to speak of.

He really meant arcane *semantics* -- what is expanded when, why, and
how to change that sequence.  One of the problems with TeX is
precisely that it does lack syntax -- and hence structure.

Lisp has syntax which has been reduced to the bare minimum: structure
with no "frou-frou".  Unlike TeX, which has some "frou-frou" but no
structure!

> Control sequences are trivially recognised, macro argument
> absorbtion uses only the simplest type of pattern matching.
> Some trivial facts about braces, and there you have
> the whole of TeX.

The other 475 pages of the TeXbook are just padding?!?  :-)


> But I do agree that the language is arcane. This is a consequence
> of the fact that TeX is not really processed by a single
> interpreter, but rather by three or four processor[s].

> 1/ The input processor;
	-- which Knuth calls the eyes;
> 2/ The expansion processor, and
	-- the mouth;
> 3/ The execution processor.
	-- the stomach.

In other words: TeX is a macro language -- all macro languages are like
this.  (All computer languages are conceptually translated/interpreted
in several distinct phases, if it comes to that.)  The eyes and stomach
are not what makes TeX arcane; it is the fact that TeX's only
abstraction mechanism is textual substitution (macro replacement).

Macro languages are a very easy way to implement a moderately powerful
extension language.  You just supply primitive operations on whatever
datatype the problem involves (e.g., boxes and glue in TeX's case; paths
and pictures in METAFONT's case; Pascal instructions in the case of WEB)
and slap a macro-replacement system on top.  Voila -- a programmable
typesetter or font creator or whatever.

The problem is that macro languages are just about the worst thing to
have when trying to program anything non-trivial.  The most frustrating
bugs to track down in TeX code are caused by macros that expand at the
wrong time, or by trivial errors in what they expand to that are only
spotted when primitive commands go wrong.
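A minimal instance of the expands-at-the-wrong-time trap, in plain TeX
(macro names invented for the example):

```tex
\def\name{Alice}
\def\copyA{\name}    % \name is not expanded here; looked up at use time
\edef\copyB{\name}   % \name is expanded *now*; ``Alice'' is frozen in
\def\name{Bob}
\copyA               % typesets ``Bob''
\copyB               % typesets ``Alice''
```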

Making TeX a non-macro-based language would not imply that it must
resemble C or Pascal or Lisp (although if TeX *programs* resembled (say)
Lisp or Pascal rather than just a mess of backslashes it would be no bad
thing).

---- Damian Cugley -------------------------------- pdc@prg.ox.ac.uk ---
    Computing Laboratory, 11 Keble Rd, Oxford  OX1 3QD  Great Britain   
------------------------------------------------------------------------
     malvern-request@prg.ox.ac.uk		   "share and enjoy"

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/14/91)

marcel@cs.caltech.edu (Marcel van der Goot) writes:


>[ NOTE: This is quite a long posting --- sorry for that. There is a
>  summary at the end.
>]

No need to apologise, your observations are interesting enough.

>Also, the history of programming shows you wrong: ``hacking'' (used
>in the negative sense) occurs in all programming languages. The hacking
>problem is caused by programmers who lack knowledge about programming
>techniques.

But writing inscrutable code (which is what I think the original
posters really meant to say) is easier in TeX (as it is in APL)
than in Pascal or other more run-of-the-mill programming languages.

>TeX is indeed based on macro substitution, rather than on the constructs
>found in conventional programming languages. But then, TeX is not a
>general-purpose programming language, it is specifically designed for
>typesetting text. 

There is one observation that so far no one has made on this
forum, maybe because it's so trivial, maybe not. That is:
TeX is based on a model of simple, incremental, one-pass
scanning of a piece of text. There is no 'goto' statement
in TeX. A macro is not a piece of program code: it is a
side-tracked piece of input. Therefore recursive incorporation
of a macro replacement text in the incrementally absorbed
input is the only form of iteration imaginable. Hence no loop
constructs. Large parts of the design of TeX follow immediately
from this basic principle.

>                                          Marcel van der Goot

Victor Eijkhout

> .----------------------------------------------------------------
> | Blue are the violets,                  marcel@vlsi.cs.caltech.edu
> |    Red are the roses;
> | A rhyme can be set
> |    With paste and boxes.
> |

Spoken like a true son of the land of Father Cats.

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/14/91)

Damian.Cugley@prg.ox.ac.uk (Damian Cugley) writes:

>> From:		Victor Eijkhout <eijkhout@s41.csrd.uiuc.edu>
>> Message-Id:	<1991May10.211802.4344@csrd.uiuc.edu>

>He really meant arcane *semantics* -- what is expanded when and why and
>how to change this sequence and why.  One of the problems with TeX is
>precisely that it does lack syntax -- and hence structure.

It lacks syntax? You mean every input is legal?

>> But I do agree that the language is arcane. This is a consequence
>> of the fact that TeX is not really processed by a single
>> interpreter, but rather by three or four processor[s].

>> 1/ The input processor;
>	-- which Knuth calls the eyes;
>> 2/ The expansion processor, and
>	-- the mouth;
>> 3/ The execution processor.
>	-- the stomach.

I try not to use those anthropomorphic terms, because they
are farfetched, slightly revolting, and in any case nowhere
near accurate.

>In other words: TeX is a macro language -- all macro languages are like
>this. 

Sorry, TeX is the only macro language that I know, so I can't
judge that. In any case, while the fact that TeX can redefine
its syntax dynamically may be common to all macro languages,
it is not common to the programming languages that most people
use, and it is downright disturbing to people who haven't
programmed at all.

>The eyes and stomach
>are not what makes TeX arcane; it is the fact that TeX's only
>abstraction mechanism is textual substitution (macro replacement).

I guess we will keep disagreeing on this. Although I fail to see
how a paucity of means would make a language arcane. If anything,
it would make it simpler.

>Having TeX as a non-macro-based language does not imply that it must
>resemble C or Pascal or Lisp (although if TeX *programs* resembled (say)
>Lisp or Pascal rather than just a mess of backslashes it would be no bad
>thing).

As I pointed out in another msg five minutes ago, TeX processes
its input sequentially. There is no 'source code' that accepts
the 'input text' and processes it: there is only the
input text. Therefore macros are the only principle for
such a model of typesetting. I don't see immediately
how you would implement any typesetting system that
is based on a markup principle and that is not essentially
a macro language.

Victor.

edward@priam.Berkeley.EDU (Edward Wang) (05/14/91)

In article <1991May13.220314.17535@csrd.uiuc.edu> eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) writes:
>As I pointed out in another msg five minutes ago, TeX processes
>its input sequentially. There is no 'source code' that accepts
>the 'input text' and processes it: there is only the
>input text. Therefore macros are the only principle for
>such a model of typesetting. I don't see immediately
>how you would implement any system for typesetting that
>is of a markup principle and that is not essentially
>a macro language.

I don't think so.  Let me try this roundabout argument on you:

On the surface, Tex's execution model is to transform input text
into typeset output, and embedded in the text are macros that are
expanded.  However, some macros expand to nothing, but produce
some side effect.  Some of these are used to define other macros
(\def).  In the extreme cases (like latex.tex), the input is nothing
but definitions.  Here, the input is no longer data, but is clearly
an executable program.

So Tex can be thought of as an interpreter for a programming
language.  It executes the commands and treats the text
as arguments to implicit commands.  (In fact, Tex allows these
commands to be explicit: \indent, \noindent, \par.)  In this model,
most commands have side effects.  Some commands define other commands,
some change global parameters, and some produce DVI output.

I believe it's possible to redefine Tex this way and still allow
it to accept exactly the same input.

Given this, it's clearly possible to add true functions to Tex.
Functions can't replace macros in Tex because they can't exactly
duplicate the functionality, but I think it's possible to build
an equally powerful system (with a different look) using only functions.

Let me add a bit of concreteness to this discussion.
Luigi Semenzato and I have designed an experimental typesetting
language based on Lisp.  It has a sublanguage for entering text,
with an extensible syntax.  The typesetting model is from boxes
and glue (the output is Tex).  The input, however, is based on Lisp
functions that build trees that represent text.

The syntax extension mechanism in the sublanguage is macro-like
(character sequences are mapped into Lisp expressions), but
the sublanguage is used only for text entry.  Most of the programming
is done in Lisp.

The two languages coexist.  Switching between them is simple and natural.

We have a paper on our system at the July TUG meeting.

To make the input look like a descriptive markup, the commands
have to be designed carefully.  For one thing, they must be mostly
side-effect free.  It's the same discipline that's needed to make
Tex input a descriptive markup.

janl@ifi.uio.no (Nicolai Langfeldt) (05/14/91)

In article <1991May13.214453.17318@csrd.uiuc.edu> eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) writes:
...
>input is the only form of iteration imaginable. Hence no loop
>constructs. Large parts of the design of TeX follow immediately

No loop constructs? I seem to remember that both \loop and
\for are available. I've even used one of them once!

Or do I hallucinate?

Nicolai

eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) (05/15/91)

janl@ifi.uio.no (Nicolai Langfeldt) writes:

>In article <1991May13.214453.17318@csrd.uiuc.edu> eijkhout@s41.csrd.uiuc.edu (Victor Eijkhout) writes:
>...
>>input is the only form of iteration imaginable. Hence no loop
>>constructs. Large parts of the design of TeX follow immediately

>No loop constructs? I seem to remember that there is both \loop and
>\for available. I've even used one of them once!

In ordinary (algorithmic, that is) programming languages
a loop is based on a GOTO statement jumping back to
some earlier instruction. TeX's \loop macro doesn't
do that, it inserts the loop body again. Looping five times
is not jumping back four times, it is in effect equal
to having the loop body laid out five times in a row.

Therefore no loop constructs. Merely emulations of them.
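Plain TeX's own definition of \loop (Appendix B of The TeXbook) makes
the point: the ``loop'' is a macro that keeps re-inserting its body
into the input stream, not a jump:

```tex
% Usage: \loop <body> \if... <more> \repeat
% \iterate re-reads \body; when the \if... test fails, \next
% becomes \relax and the recursive re-insertion (not a GOTO) stops.
\def\loop#1\repeat{\def\body{#1}\iterate}
\def\iterate{\body \let\next\iterate \else\let\next\relax\fi \next}
\let\repeat=\fi   % \repeat closes the \if... inside the body
```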

Victor.

Damian.Cugley@prg.ox.ac.uk (Damian Cugley) (05/15/91)

> From:		Edward Wang <edward@priam.Berkeley.EDU>
> Message-Id:	<1991May10.065219.23433@agate.berkeley.edu>

> Surely the box-and-glue model is a superset of the Postscript
> (positioning by coordinate) model.  I agree that most of the glue
> features wouldn't be necessary if Tex had a more complete programming
> language, but I don't think it really gets in the way.

PostScript's imaging model (arbitrary filled or "stroked" Bezier curves
arbitrarily transformed) is a superset of TeX's (untransformed bitmaps
placed in arbitrary positions).

The model TeX uses internally for deciding where to put the character
bitmaps -- the boxes and glue stuff -- corresponds to whatever internal
data structures are used by the WP/DTP system that generates the PostScript
page description.  There is nothing to stop someone from using a
cleverer line-breaking system than the usual "greedy" algorithm.  You
could just about write a boxes-and-glue typesetter in PostScript if you
wanted to.

---- Damian Cugley -------------------------------- pdc@prg.ox.ac.uk ---
    Computing Laboratory, 11 Keble Rd, Oxford  OX1 3QD  Great Britain   
------------------------------------------------------------------------
     malvern-request@prg.ox.ac.uk		   "share and enjoy"