[comp.lang.c] "C" vrs ADA

dsill@NSWC-OAS.arpa (Dave Sill) (08/21/87)

>From: John Unekis <etn-rad!jru>
>	  The Ada language is far more than just a language. Ada includes
>	  standards for editors, compilers, and run-time symbolic debuggers.

The DoD standards define both the Ada language and the environment
under which it runs.  It *is* possible to have the language without
the full environment.

>			:
>	  This standard was 
>	  named ADA, (the name of the mistress of Charles Babbage, who

...the *benefactor* of Charles Babbage, who...

>	  invented a punched card driven loom, considered to be the first

invented a mechanical computer called the Difference Engine...
(Jacquard invented the punched-card driven loom.)

>	  computer, she was rumored to be the first person to ever write
>	  a program on punched cards- why her name is appropriate for a
>	  real-time language is a mystery).

Why not?

>	  ... As long as
>	  computers remain basically Von Neumann processors, no language is
>	  going to offer any advantages in the real world to a language
>	  like COBOL.

Wrong.

>	  No business is going to go through the 3 to 5 years
>	  effort of retraining and converting of existing code just to
>	  satisfy the dogmatic prejudices of computer-science weenies.

Right, but there are *real* disadvantages to languages like COBOL
that can very definitely make retraining and converting worthwhile,
especially when migrating to new hardware.  We computer science
weenies are in a position to evaluate the relative strengths and
weaknesses of various programming environments, and we try to promote
those that we feel have the most to offer in a particular situation.

I don't use Ada, but I know enough about it to know that I probably
wouldn't like programming in it.  And as a computer scientist I
question whether it will ever be a viable real-time programming
language.  It seems like its goals are self-contradictory: to provide
every imaginable feature with high-performance real-time execution.
At least they aren't asking for quick compilation. :-)
   

-David Sill		     "Faith is believing what you know ain't true."
 dsill@nswc-oas.arpa				-Anonymous schoolboy

The opinions expressed above are those of the author and do not
necessarily reflect those of the Department of Defense or the U.S.
Navy.

greg@bass.nosc.MIL (08/22/87)

Normally when someone writes an article with errors I prefer to send
them a private message and ask them to post a followup correction.
However, the recent article from John Unekis, purporting to give
advice, is so full of inaccuracies that I've decided to risk his
terminal embarrassment and post this followup with corrections and
criticism.  John's article is indented, my comments are not.
   
   From: John Unekis <etn-rad!jru>
   Newsgroups: comp.lang.ada,comp.lang.c,sci.space,sci.space.shuttle
   Subject: Re: "C" vrs ADA
   Message-Id: <253@etn-rad.UUCP>
   Date: 18 Aug 87 18:09:26 GMT
   Keywords: Any suggestions?
   To: info-c@brl-smoke.arpa
   	
   In article <1065@vu-vlsi.UUCP> harman@vu-vlsi.UUCP (Glen Harman)
   writes:
   		
      Hello, sorry to interrupt your regularly scheduled
      news articles, but I didn't know where else to turn...
   		
Glen asks various questions, to which John replies:
   		
   	
   The Ada language is far more than just a language.  Ada includes
   standards for editors, compilers, and run-time symbolic debuggers.

Ada is the name for a language, only.  There are other standards
which were developed to go with it, each with its own name.  Those
other standards have not been very successful so far.
   
   The C language evolved at AT&T in the process of developing the UNIX
   operating system. There were, believe it or not, an A language and a B
   language that preceded it.

C is based upon B is based upon BCPL is based upon CPL.
A is also based upon BCPL, but is not an ancestor of C.
   
   Finally with the C language the original developer of the UNIX
   operating system (which was done on a PDP-7 microcomputer) felt that
   he had what he wanted.
   
The C language was developed by Dennis Ritchie, who with Ken Thompson,
the inventor of B, jointly developed Unix.  Unix was originally
developed on the DEC PDP-7 and PDP-9 minicomputers, but C was
developed for the port to the PDP-11.
   
   It was a moderately structured language, with a syntax that was
   similar to the UNIX c shell (or vice versa).
   
This is a wild notion!  The C Shell was developed by Bill Joy
at UC Berkeley about ten years after Unix and C were invented.
   
   As UNIX gained wide acceptance the C language became more popular. It
   has the advantage over FORTRAN of having structured variable types,
   but without the overly elaborate type checking done by a language like
   PASCAL.

Pascal's type checking is not elaborate, and is actually simpler than
C's.  However, it's more restrictive.  Pascal checks for type
compatibility by name, C by structure.  C also does more automatic
conversions - but not for function arguments, which in K&R C are
neither checked nor converted, making function calls fragile and
dangerous (ANSI is fixing this with prototypes).  Each style of type
checking has its advantages.  Some languages provide both.
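To see why the old rules are dangerous, here is a minimal sketch in
pre-ANSI (K&R) C; the file and function names are invented for
illustration:

    /* kandr.c - a K&R compiler accepts this without complaint */
    #include <stdio.h>

    double half(x)
    double x;
    {
        return x / 2.0;
    }

    main()
    {
        /* An int is passed where a double is expected.  A K&R
           compiler neither checks nor converts the argument, so
           half() reads garbage bits.  With an ANSI prototype in
           scope - double half(double); - the 10 would be quietly
           converted instead. */
        printf("%f\n", half(10));
        return 0;
    }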

   It does share with Pascal the fact that each was developed by
   a single individual and thus represents that individual's prejudices in
   computer languages.

Ada was also developed by a single individual, Jean Ichbiah.  However,
its requirements and specifications were developed by many people, so
this distinction has some truth.

   C is now widely available outside the UNIX community and is a defacto
   standard with many companies.  It is often the case in
   military/aerospace procurements that in order to generalize the
   request for computer hardware so as not to sole source a single vendor
   the government will ask for UNIX because it is the only operating
   system that can be had on a non-proprietary basis on a variety of
   computer hardware. UNIX of course brings C right along with it.
   
C's popularity with the DOD and their vendors is no longer dependent
on the popularity of Unix.  In my experience with these people, most
are using C with PC-DOS.
   
   Because UNIX does not handle real-time applications (such as
   interrupt handling) very well, and because there was no non-
   proprietary standard for real-time operating systems, the 
   government(DOD) wanted to develop such a standard.
   
Here, operating systems are confused with languages.  Ada is often
used under Unix; Unix has not been changed.  Also, there are many
realtime versions of Unix, from AT&T and others.  Ada is suitable
for programming realtime applications, provided that the underlying
O/S, Unix or other, can support them.
   
   Also, the DOD had a problem with each branch of the service having
   its own non-compatible languages for software
   development(COBOL,JOVIAL, CMS-II,FORTRAN,C,etc.). It has been decided
   that the DOD will develop a standard computer language that will
   include standards for editors, compilers, run-time debuggers, and even
   operating system functions for real-time processing. This standard was
   named ADA, (the name of the mistress of Charles Babbage, who 
   invented a punched card driven loom, considered to be the first
   computer, she was rumored to be the first person to ever write
   a program on punched cards- why her name is appropriate for a
   real-time language is a mystery).
   
Again, the name Ada refers only to the language.  This history of Ada
and Charles Babbage is so off the mark it sounds like intentional
satire.  Here's the real story:
   
Ada Augusta (Countess of Lovelace and daughter of Lord Byron, the poet)
and Charles Babbage were close friends and collaborators.  Babbage did
not invent the punch card controlled loom - Jacquard did that.  Babbage
used the idea in his design for the Analytical Engine, an early design
(perhaps the first) for a mechanical computer.  The construction of the
Analytical Engine was never completed.  It was too ambitious for the
machining capabilities of the time.
   
Ada was credited with a number of insights about the programming of
such a machine, including the notion of using it for symbolic
computation and reasoning, instead of just for arithmetic.  (A recent
biography of her suggests these ideas were actually Babbage's.)

   If you are interested in the military/aerospace field then ADA is
   definitely the language to learn. Be aware that it is a very complex
   language (Carnegie Mellon University is rumored to have required it
   for all sophomores- which resulted in flunking out half their
   sophomore class) and that to learn it properly you must find a DOD
   certified implementation which includes the editors, compilers, and
   debuggers as well.

Ada is indeed complex relative to C or Pascal (although much simpler
than PL/1).  My opinion: Ada is not a good language for teaching
beginners.  Its constructs require lots of knowledge and experience to
use correctly.  I've never heard this rumor about CMU, and intend to
disregard it unless I hear it from a more reliable source.  If you are
an experienced programmer and want to learn Ada, find a book that is
well recommended, and/or a good teacher (shop carefully for these).
You will want to have access to a validated Ada compiler, but special
editors and debuggers are not needed.
   
   The DOD plans eventually to require ALL software to be done in ADA,
   but they realize that there is enormous inertia against it.  Thousands
   of programmers must be retrained, and millions of lines of code
   converted. Don't expect to see ADA used very widely outside of the DOD
   environment. It will fail for the same reason that Pascal, Modula2, C,
   PL1, and others have failed - IBM is the dominant force in the
   commercial market(~75 percent of all commercial installations) and
   COBOL dominates the IBM installed base (~90 percent of IBM
   applications are in COBOL).
   
Ada was designed for writing systems programs, primarily for embedded
systems.  It is not in competition with COBOL, nor is C.  Far from
failing, C has become the most popular systems programming language
ever invented, and its popularity is still rising fast.  Pascal was
designed for teaching programming, and is widely used for this purpose
at Universities.  It is being pushed out by C for many programming
tasks, but it has not failed in its primary purpose.  My personal
opinion: Pascal is much better suited for learning programming than
either C or Ada.  Time spent learning Pascal well is time well spent.
My students who know Pascal do far better with C than the others.
   
   As long as computers remain basically Von Neumann processors, no
   language is going to offer any advantages in the real world to a
   language like COBOL. No business is going to go through the 3 to 5
   years effort of retraining and converting of existing code just to
   satisfy the dogmatic prejudices of computer-science weenies.

Note the strong prejudice expressed here.  This kind of demeaning
characterisation has no place on the network.
   
   The DOD is perfectly capable, however, of making contractors like
   Boeing, Lockheed, TRW, Eaton,etc. jump through the ADA hoop just
   by refusing to accept bids which do not include ADA. Therefore if
   you want a career in military/aerospace, go for ADA.
   
   ---------------------------------------------------------------
   ihnp4!wlbr!etn-rad!jru   - The opinions above were mine when I
   thought of them, by tomorrow they may belong to someone else.
   
No service is provided to the community by this kind of article.  The
style of the article suggests little concern for accuracy, an
impression well supported by the remarkable number of errors.  The
mixture of facts with opinions hurts both, and the virulent prejudices
serve no one.  We all make errors like these from time to time, to
some extent.  This particularly bad example may inspire us all to be
more careful!
   
_Greg
   
   
J. Greg Davidson			  Virtual Infinity Systems
+1 (619) 452-8059        6231 Branting St; San Diego, CA 92122 USA
    
greg@vis.uucp				ucbvax--| telesoft--|
greg%vis.uucp@nosc.mil			decvax--+--sdcsvax--+--vis
greg%vis.uucp@sdcsvax.ucsd.edu		 ihnp4--|  noscvax--|
   

henry@utzoo.UUCP (Henry Spencer) (08/24/87)

> ...  The construction of the Analytical
> Engine was never completed.  It was too ambitious for the machining
> capabilities of the time.

To correct this common misconception:  Babbage understood the machining
capabilities of his time thoroughly; he allowed for their limitations,
and his Analytical Engine designs would have worked had they been built.
The Analytical Engine's real problem was that it was not really fast
enough to justify its high cost, especially in view of the limited demand
for computing at the time.  Babbage kept fiddling with the design, trying
to make it faster -- he appears to have invented pipelining, among other
things -- and this instability also made it difficult for anyone to try
building it.

By the way, Babbage not only understood the machine technology of his time,
he may have made an indirect major contribution to it.  His early Difference
Engine used interchangeable parts many years before Eli Whitney "invented"
this concept.  One of the people involved in that project was Whitworth [?]
[this is from memory of a seminar some years ago], who later devised the
world's first standardized screw thread.
-- 
Apollo was the doorway to the stars. |  Henry Spencer @ U of Toronto Zoology
Next time, we should open it.        | {allegra,ihnp4,decvax,utai}!utzoo!henry

dsill@NSWC-OAS.arpa (Dave Sill) (08/25/87)

>From: Eric Beser sys admin <sarin!eric>
>... Ada is a verbose language (if rich can be stated in any other
>words). But it is that richness that makes the language so useful.

I'm afraid I don't see the connection between richness and verbosity
implied here.

>... Ada is not just a computer language,
>it is a software engineering methodology.

I don't agree.  Ada *is* just a language.  Software engineering
*requires* things that Ada simply can't provide.  The most important
of these, I think, is design.  You know, all that work that's
supposed to be done *before* coding begins.

>Many customers who originally requested C are now asking for Ada.

The customer's always right, huh?  :-)

>		:
>This design change proliferated itself through the entire system. The coding
>for this change took about a man-month, but the debug and reintegration
>phase took only two days. The system ran as before, much more efficient
>due to the design change. Had the code been written in C, this would not
>have happened. Many of the errors that were interjected by the engineers
>were due to spelling, wrong variable selection, or unhandled boundary
>conditions. These errors would not have come out during C compilation.

No, but they would probably have shown up during linting, with the
exception of unhandled boundary conditions.  I don't think there's
anything in the C language that specifically precludes the type of
strictness Ada enforces.  However, the approach usually taken by C
implementors is to let compilers compile and syntax checkers check
syntax.  Granted, run-time error checking for C is usually poor or
non-existent, but there have been implementations with quite good
run-time support (Safe-C comes to mind, but I can't remember the name
of the vendor).
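
To make that concrete, here is a hedged sketch of the kind of thing a
typical C compiler of today accepts silently but lint reports (the
names are invented, and the lint remarks are paraphrased, not
verbatim):

    /* oops.c - cc compiles this without a murmur */
    #include <stdio.h>

    extern double sqrt();   /* no argument types in pre-ANSI C */

    main()
    {
        int count, cuont;   /* cuont is a misspelling of count */

        cuont = 1;          /* cc is happy; lint: set but never used */
        count = sqrt(2);    /* lint: int argument where sqrt() wants
                               a double - a classic silent
                               wrong-answer bug */
        printf("%d\n", count);
        return 0;
    }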

>		:
>My suggestion is that you learn software engineering, object oriented
>design, and the use of software tools. Then learn Ada.

Good advice, but I think the "learn Ada" part is not as important as
one might think.  Software engineering principles are valid across
language barriers; there are object-oriented languages based on C, and
there are, of course, quite a few tools available to C programmers.

If you want to learn Ada, by all means, learn it.  Just don't expect
it to magically solve all your problems.

>You will find
>that the C will come in handy when you have to try to understand what
>some engineer did so to do an Ada redesign.

I can't understand this sentence at all.


-Dave Sill		       "It is the tragic fate of most individuals
 dsill@nswc-oas.arpa		to die before they are born."      -Erich Fromm

The opinions expressed above are those of the author and do not
necessarily reflect those of the Department of Defense or the U.S.
Navy.

ram@elmgate.UUCP (Randy Martens) (08/25/87)

In article <8948@brl-adm.ARPA> vis!greg@bass.nosc.MIL writes:
>Normally when someone writes an article with errors I prefer to send
>them a private message and ask them to post a followup correction.
>However, the recent article from John Unekis, purporting to give
>advice, is so full of inaccuracies that I've decided to risk his
>terminal embarrassment and post this followup with corrections and
>criticism.  John's article is indented, my comments are not.
>   
(lots of stuff deleted, much of it derogatory)
>   
>No service is provided to the community by this kind of article.  The
>style of the article suggests little concern for accuracy, an
>impression well supported by the remarkable number of errors.  The
>mixture of facts with opinions hurts both, and the virulent prejudices
>serve no one.  We all make errors like these from time to time, to
>some extent.  This particularly bad example may inspire us all to be
>more careful!
>   
>_Greg
>   
>J. Greg Davidson			  Virtual Infinity Systems
>+1 (619) 452-8059        6231 Branting St; San Diego, CA 92122 USA
>    
----

I am sorry Greg, but I could not disagree with you more.

The right to hold and express an opinion is VITAL to our basic freedom, both
as engineers/programmers/users, and as citizens.  Mr. Unekis's posting gave
the world a chance to hear what he thought, which is the whole purpose of this
net - the free and open exchange of opinions, ideas, answers, questions, facts,
rumours, gibberish, and just plain information.  I DO NOT think that John's 
posting was "virulent", and I think that he did indeed provide a service to
the community by providing a topic for discussion.  This makes people think, and
thinking is a good thing.

I am now going to make a bold statement, which I will stress is my opinion.

ADA SUCKS !!!!

I am of the opinion that there are only three good high-level computer
languages for most situations.  I have seen and programmed in all the
major languages (I have probably not seen some esoteric ones that are
quite nice, I am sure, but limited in scope or usage).

For doing serious programming, the ONLY language is "C".  I have NEVER seen
an application written in another language that would not have been better/
faster/more flexible/more useful if re-written in "C".  This includes ADA and
COBOL (retch!!).  I proved this to a previous employer by redoing a DBMS
update program in "C", where before it had been in COBOL.

For hacking device drivers, writing critical code segments, or doing very
time-critical applications, write it in "C", and then hand-optimize the
critical segments in Assembly Language.

For quick & dirty problem solving type programs, do it in FORTH.  FORTH is
also wonderful to play around in.  I have written some great games in FORTH.

For teaching programming to new CS'ers, use PASCAL.  PASCAL is a brute because
of its strong limitations, but it teaches good programming etiquette and style.
I also like PASCAL, particularly TURBO PASCAL for the home hacker.

The above are my opinions, and reflect only my face in the mirror.

Randy Martens - !rochester!kodak!elmgate!ram
"Reality - What a Concept !!" - R.Williams

edw@ius1.cs.cmu.edu (Eddie Wyatt) (08/26/87)

  It's nice to be back from vacation.....now back to some bashing :-)
  

  This stuff really doesn't belong in comp.lang.c; it should be in
comp.lang.misc or something like that, but ......

In article <713@elmgate.UUCP>, ram@elmgate.UUCP (Randy Martens) writes:
> 
> For doing serious programming, the ONLY language is "C".  I have NEVER seen
> an application written in another language that would not have been better/
> faster/more flexible/more useful if re-written in "C".  This includes ADA and
> COBOL (retch!!).  I proved this to a previous employer by redoing a DBMS
> update program in "C", where before it had been in COBOL.

   I'd like to see you write a symbolic differentiation program in C.  I
can write that program in 20 lines or less in Lisp.
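
(To make the contrast concrete, here is a minimal sketch of such a
program in C: expression trees over constants, a single variable, sums,
and products.  All the names are invented for illustration, and memory
is never freed.  Even this stripped-down version runs to several times
those 20 lines.)

    #include <stdio.h>
    #include <stdlib.h>

    typedef enum { CONST, VAR, ADD, MUL } Kind;

    typedef struct Expr {
        Kind kind;
        double val;             /* CONST only */
        struct Expr *l, *r;     /* ADD and MUL only */
    } Expr;

    static Expr *mk(Kind k, double v, Expr *l, Expr *r)
    {
        Expr *e = (Expr *) malloc(sizeof(Expr));
        e->kind = k; e->val = v; e->l = l; e->r = r;
        return e;
    }

    /* d/dx by the sum and product rules; subtrees are shared */
    static Expr *diff(Expr *e)
    {
        switch (e->kind) {
        case CONST: return mk(CONST, 0.0, NULL, NULL);
        case VAR:   return mk(CONST, 1.0, NULL, NULL);
        case ADD:   return mk(ADD, 0.0, diff(e->l), diff(e->r));
        default:    /* MUL: (uv)' = u'v + uv' */
            return mk(ADD, 0.0,
                      mk(MUL, 0.0, diff(e->l), e->r),
                      mk(MUL, 0.0, e->l, diff(e->r)));
        }
    }

    static void show(Expr *e)
    {
        switch (e->kind) {
        case CONST: printf("%g", e->val); return;
        case VAR:   printf("x"); return;
        default:
            putchar('(');
            show(e->l);
            printf(e->kind == ADD ? " + " : " * ");
            show(e->r);
            putchar(')');
        }
    }

    int main(void)
    {
        /* x*x + 3 */
        Expr *x = mk(VAR, 0.0, NULL, NULL);
        Expr *e = mk(ADD, 0.0, mk(MUL, 0.0, x, x),
                               mk(CONST, 3.0, NULL, NULL));
        show(diff(e));          /* prints (((1 * x) + (x * 1)) + 0) */
        putchar('\n');
        return 0;
    }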

   I think you've missed a very important aspect of programming in
different languages.  The languages we choose to program in shape the
way we think about a solution to a problem.  They also affect the style
and methodology we use.

  An example...  Many people abuse pointers in C.  The classic example
is the fast four-byte (sizeof(int)) copy:

		* (int *) x = * (int *) y;

This is not portable (x may not be int-aligned, and sizeof(int) is not
4 everywhere). It should be coded as:

		x[0] = y[0];
		x[1] = y[1];
		x[2] = y[2];
		x[3] = y[3];

IF portability is an issue - it might not be.
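
A third option, hedged since not every library has it yet: let the
library do the copying.  The draft ANSI standard and System V provide
memcpy(); 4BSD spells it bcopy(), with source and destination reversed.

    #include <string.h>  /* draft ANSI; older System V uses <memory.h> */

    void copy4(x, y)
    char *x, *y;
    {
        memcpy(x, y, 4);        /* portable, and usually just as fast */
        /* bcopy(y, x, 4);         4BSD equivalent, arguments reversed */
    }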

  A language like Pascal will not allow the first construct.  There are many
other examples where C allows one to write machine-specific code that one
could not write in a language like Pascal.  Does this make Pascal a better
language than C?  Some people would say yes; I would say no, but then again
that's another story. ;-)

  Is this a good enough example of how a language can affect the way we
think about a solution to a problem?

-- 

					Eddie Wyatt

e-mail: edw@ius1.cs.cmu.edu

eric@snark.UUCP (08/26/87)

Wow. One of Henry Spencer's sage pronunciamentos turns out to be misleading.
They'll probably report hell freezing over next...;-)

In article <8473@utzoo.UUCP>, henry@utzoo.UUCP (Henry Spencer) writes:
> By the way, Babbage not only understood the machine technology of his time,
> he may have made an indirect major contribution to it.

The connection is much more direct than Henry implies. Babbage and his
associates effectively invented precision machining in the modern sense (i.e.
with repeatable tolerances smaller than the unaided eye can distinguish) during
his two-and-a-half attempts to build the 'Analytical Engine'.

Babbage, Whitworth and two of the other team members co-authored *the*
foundation text in the field of precision machining, a weighty tome with a
title that escapes me at the moment. Ask any historian of technology about it.

The article I got all this from claimed that the book is *still* occasionally
cited by the older generation of instrument makers and tool-and-die men, who
remember and revere Babbage's name without having retained as part of the craft
lore the original purpose of his work.

Sorry, I can't give hard sources for this.  It might have been one of James
Burke's coffee-table books on the history of technology.
-- 
      Eric S. Raymond
      UUCP:  {{seismo,ihnp4,rutgers}!cbmvax,sdcrdcf!burdvax,vu-vlsi}!snark!eric
      Post:  22 South Warren Avenue, Malvern, PA 19355    Phone: (215)-296-5718

drw@cullvax.UUCP (Dale Worley) (08/26/87)

henry@utzoo.UUCP (Henry Spencer) writes:
> [Babbage's] early Difference
> Engine used interchangeable parts many years before Eli Whitney "invented"
> this concept.

I question this.  Whitney worked with interchangeable parts around
1800, Babbage was late 1800's.  But during most of this period, the
only people who used interchangeable parts were rifle manufacturers for
the military.  The price was insanely high, but being able to
interchange parts in the field was worth it to the military.  Rifle
manufacture probably drove measuring technology and machining
technology for much of the century.

Dale
-- 
Dale Worley    Cullinet Software      ARPA: cullvax!drw@eddie.mit.edu
UUCP: ...!seismo!harvard!mit-eddie!cullvax!drw
Apollo was the doorway to the stars - next time we should open it.
Disclaimer: Don't sue me, sue my company - they have more money.

schung@cory.Berkeley.EDU (Stephen the Greatest) (08/26/87)

In article <713@elmgate.UUCP> ram@elmgate.UUCP (Randy Martens) writes:
>
>ADA SUCKS !!!!
>

Alright!

>
>For doing serious programming, the ONLY language is "C".  
>

Yeah!!!!

>
>The above are my opinions, and reflect only my face in the mirror.
>

Good opinions.


- Stephen

dsill@NSWC-OAS.arpa (Dave Sill) (08/28/87)

>From: John Unekis <etn-rad!jru>
>How could I have been so blind. I take back everything I 
>said about her. Ada is the perfect name for a computer 
>language that will be the standard for the U.S. military and
>aerospace contracting industries.

Ha ha, I never thought of it that way :-)
On the appropriateness of names: why did Wirth name his Modula-2
machine after a female demon, Lilith, who supposedly ate children?

Oh, excuse me, there is another subject I wanted to bring up.
What has all this Ada/COBOL/Babbage/370 stuff got to do with C,
anyway?  Not much.  Let's try to keep our postings a little more
germane to the discussion of the C language.  Before I get a string of
"pot calling the kettle black" messages,  I'll admit I'm guilty of
this myself.  Sometimes it's awfully hard to resist the urge :-)


-Dave

henry@utzoo.UUCP (Henry Spencer) (08/29/87)

> > [Babbage's] early Difference
> > Engine used interchangeable parts many years before Eli Whitney "invented"
> > this concept.
> 
> I question this.  Whitney worked with interchangeable parts around
> 1800, Babbage was late 1800's...

Please check your references; unless I am much mistaken, Whitney was circa
the US Civil War (1860s) while Babbage was *early* 1800s.

> But during most of this period, the
> only people who used interchangeable parts were rifle manufacturers for
> the military...

Again, I suggest some checking of facts; I *think* Whitney was making muskets,
not rifles.  And they were justified on grounds of lower cost (always a
significant issue in wartime, believe it or not), not field interchange of
parts.

The most amusing part of this is that in fact gun parts are not fully
interchangeable to this day.  Neither are automobile parts, by the way.
It is more economical to accept some degree of hand-fitting or variance
in sizes (car-engine pistons effectively come in three sizes) than to
achieve the precision needed to make everything fully interchangeable.
(For example, making all pistons the same size requires changing or
adjusting cutting bits much more often, because wear on the bits slowly
changes the size of the parts they produce.  And those things are costly.)
-- 
"There's a lot more to do in space   |  Henry Spencer @ U of Toronto Zoology
than sending people to Mars." --Bova | {allegra,ihnp4,decvax,utai}!utzoo!henry

dant@tekla.UUCP (08/30/87)

>> >, > = Henry Spencer
>> = someone else

>> > [Babbage's] early Difference
>> > Engine used interchangeable parts many years before Eli Whitney "invented"
>> > this concept.
>> 
>> I question this.  Whitney worked with interchangeable parts around
>> 1800, Babbage was late 1800's...
>
>Please check your references; unless I am much mistaken, Whitney was circa
>the US Civil War (1860s) while Babbage was *early* 1800s.
>

You're both partly right and partly wrong.  Whitney actually came up with
his idea of interchangeable parts in the late 18th century.  He had a
moderately famous demonstration of this in 1801.  This demonstration was
with muskets, not rifles.

Babbage first got his ideas for mechanical calculators in 1812 or 1813.

One thing they both had in common was having to develop new machining
techniques and tools to put their ideas into practice.


---
Dan Tilque
dant@tekla.tek.com  or dant@tekla.UUCP