[comp.lang.c] Optimizing Floating Point & Tartan C Compiler

ah4h#@andrew.cmu.EDU (Andrew Hudson) (04/19/87)

Henry Spencer writes that it is possible to optimize C source in such a way
as to make it non-portable.  We are talking about floating point evaluation
and not dependencies like device drivers, byte ordering, or math libraries;
the only non-portable aspect I can think of is float size.  Can you be more
specific about floating point optimizations?  An example would be nice.
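Here is a toy fragment of my own (not from Henry's posting) of the sort of
thing I wonder about.  Floating point addition isn't associative, so a
compiler that reorders a sum "for speed", or that keeps intermediates in a
wider register on one machine than on another, can change the printed answer
for the same source.  Is that the kind of optimization meant?

    #include <stdio.h>

    int main()
    {
        float a = 1.0e20;
        float b = -1.0e20;
        float c = 1.0;

        float left  = (a + b) + c;   /* 0.0  +  1.0   ==  1.0 */
        float right = a + (b + c);   /* 1e20 + -1e20  ==  0.0 */

        printf("(a+b)+c = %g   a+(b+c) = %g\n", left, right);
        return 0;
    }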

A few years ago Tartan Laboratories of Pittsburgh developed the Tartan C
Compiler (tcc). It was alleged to be a highly optimizing C compiler
completely compatible with the portable C compiler. I was surprised that
efficiency-minded programmers didn't snap it up and laud it as the speed
messiah of system programming.  Has anyone had or heard of experiences with
tcc? I am most curious to know why it never caught on.

Andrew Hudson @ CMU Psychology Dept.
ah4h@andrew.cmu.edu.arpa

mike@hcr.UUCP (Mike Tilson) (04/21/87)

ah4h#@andrew.cmu.EDU (Andrew Hudson) writes:

> A few years ago Tartan Laboratories of Pittsburgh developed the Tartan C
> Compiler (tcc). It was alleged to be a highly optimizing C compiler
> completely compatible with the portable C compiler. I was surprised that
> efficiency-minded programmers didn't snap it up and laud it as the speed
> messiah of system programming.  Has anyone had or heard of experiences with
> tcc? I am most curious to know why it never caught on.

Actually, there is a big demand for high performance compilers.  However,
performance is not the only issue.  Tartan C didn't catch on for a number
of reasons.  This is a lengthy response, but I think it's very instructive
to look at the causes in order to understand the issues better.

First, some potential customers thought "tcc" would make their UNIX
utilities double in speed.  I believe some people may have misinterpreted
the Tartan sales literature in this regard. In fact, in C an optimizer can't
do much with a highly-tuned program.  The C programmer can allocate registers
at source level, replace indexing with strength-reduced pointer increments,
etc.  Most of what an optimizer can do can be expressed in C source.  Most
of the heavily used UNIX routines have been around for years, and the
performance hot spots have been "tuned out" already.  A *good* optimizer
may buy something on this code, but not a factor of two.  The biggest payoff
is in certain large applications programs and in newly written code.  (Why
pay a programmer to optimize when the compiler can do it?)  Also, highly
optimized code is *ugly*, hard to understand and maintain.  (We have a
program we call the "reconstructor" which we use with our Portable Code
Optimizer. It turns our optimized code back into C source code -- it doesn't
look like anything a sane person would write.)
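To make that concrete, here is a made-up fragment (not anyone's real code).
The first loop is the natural way to write it; the second is the kind of
hand-tuned source a performance-minded C programmer produces, and it leaves
little for an optimizer to find:

    /* Natural version: the subscript is recomputed on every trip. */
    void scale(double *a, int n, double k)
    {
        int i;

        for (i = 0; i < n; i++)
            a[i] = a[i] * k;
    }

    /* Hand-tuned version: register variables and a strength-reduced
     * pointer increment in place of the array subscript. */
    void scale_tuned(double *a, int n, double k)
    {
        register double *p = a;
        register double *limit = a + n;

        while (p < limit)
            *p++ *= k;
    }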

An optimizing compiler is well worthwhile, as long as you understand what
it can and can't do for you.  Optimization is even more important in
languages like Fortran, where the program doesn't have the low-level freedom
given to the C programmer.

Second, Tartan couldn't guarantee absolute compatibility with "UNIX C", namely
PCC.  In some cases, PCC is arguably "wrong" and Tartan "right", but if all
you want is to re-compile for higher speed, then what you want is "the same
answers, but faster."  At HCR we've had some success with our optimization
technology, because it can be fit into the PCC structure, so that the
language semantics don't change.  There are some costs to this approach,
but it works surprisingly well.  We also spent a lot of time worrying about
whether the optimizer would break ugly things, like the kernel.  Once
a program works, people just don't want to think about it again.
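A made-up illustration of the kind of "ugly thing" I mean (the device address
and bit are hypothetical): kernel code that polls a device status register.
PCC reloads the location on every trip around the loop, so the code works; a
smarter optimizer can notice that nothing in the loop stores to that location,
hoist the load, and spin forever.  The ANSI draft's "volatile" qualifier is
the announced cure, but existing kernel source doesn't use it.

    #define DEV_STATUS ((unsigned short *) 0177560)  /* hypothetical address */
    #define DEV_READY  01

    void wait_for_device(void)
    {
        /* Works under PCC; an optimizer that hoists the load out of
         * the loop turns this into an infinite spin.  Declaring the
         * pointer volatile would forbid that. */
        while ((*DEV_STATUS & DEV_READY) == 0)
            ;
    }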

Third, I think Tartan bit off quite a bit of work.  They were doing Ada, C,
Fortran, and Pascal, I think, for many machines, all with spiffy new
optimization techniques, in a start-up company.  It was said that the
development and marketing costs were giving them trouble.   Judging by their
sales literature, I believe they are now focusing mostly on Ada for government
contracts.

Finally, some people like to get their compilers for "fundamental" languages
like C and Fortran direct from the manufacturer (or at least from the OS
vendor).  For many people, languages like C are really "part of the machine"
and it feels dangerous to depend on an outside vendor.  Again, in our
own experience we've found our best success in offering optimization
technology to the hardware manufacturers, who then re-sell the compilers
to the end users.  Of course, people will buy many things from third
parties, but for certain critical items they want their system seller
to stand behind them.  For example, Tartan released "tcc" for the IBM
RT PC.  Shortly after that, IBM announced that every AIX C and Fortran
compiler would include HCR's PCO optimizer.  I'm not sure anyone's
even bothered to benchmark the compilers against each other, at this point.

Although "tcc" didn't set the world on fire, I think there is a very strong
demand for high performance compilers.  As far as I can tell, the Tartan
C product was good, and the company has some very talented people. The
failure to sell in large volume has other causes.

Optimization is a relatively cheap way to make machines faster, so long as
your expectations are realistic.  If done right, programmers can spend
more time writing new programs, rather than squeezing microseconds out
of old ones.  Most leading hardware manufacturers are actively looking
for ways to improve their compilers.

/Michael Tilson		{utzoo,ihnp4,...}!hcr!mike	416-922-1937
/HCR Corporation
/130 Bloor St. W., 10th Floor, Toronto, Ont. M5S 1N5, Canada