[comp.software-eng] CS education: engineering, mathematics, and computer science

mjl@cs.rit.edu (11/14/89)

In article <34754@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <8911092042.AA21382@ctc.contel.com>, by fitz@CTC.CONTEL.COM (Joe Fitzgerald):
>> I learned a helluva lot more about software engineering in one quarter of
>> operating systems than I did in four quarters of calculus
>
>If the four quarters of calculus you took were at all typical, they were
>all cookbook and no theory.  No-one learns to think from memorizing that 
>trash. It wasn't until I got into the second and third year stuff that 
>they started teaching material of real value.
>
>First year calculus is not taught for any reason having anything to do
>with the education of mathematicians.  It is taught because the teaching
>of physics and most forms of engineering is crippled without it.  

Which leads to another interesting point: most engineers are not
particularly good mathematicians.  This is *NOT* a slam on traditional
engineering disciplines -- there are enough other issues that must be
addressed, so it is not at all unusual that engineers are intelligent
*appliers* of the results of mathematics, but are not well-versed
in the underlying theory.  When was the last time that an engineer
actually *proved* (in the rigorous mathematical sense) that the
application of a differential equation or a linear transform was
correct?

This was brought home to me in graduate school: one of my colleagues
had an undergraduate degree in mechanical engineering, while mine is in
mathematics.  We were talking one day about the courses we took,
and the discussion turned to complex variables.  It soon became
apparent that we had taken two entirely different courses.  His
stressed the application of results to engineering problems, while mine
introduced the underlying mathematical structure, with little (that is,
no) regard for applications.

This has given me insight into one of the raging issues in the software
development community today, namely the push by some for greater
formalism (in the sense of mathematical rigor).  I'm all for a
disciplined approach to software development, and my background
probably makes me more sympathetic than most to rigorous mathematical
systems of program specification, design, and construction.  Yet
somehow this is all unsatisfying, primarily because application of
these techniques seems to require a mathematical sophistication well
beyond that seen in other engineering disciplines.

To the extent that such formalism can be shown to be inherent to
quality software development, I support it.  But I sometimes suspect
that it is partially a reaction by members of a young profession trying
to garner respect from other engineers and scientists.  In this regard,
I recommend Neil Rickert's delightful letter in ACM Forum (CACM,
November 1989).  I also think that it can obscure other important
issues such as designing for evolution, the role of insight in the
design process, and identifying user wants.  To quote John von
Neumann:

	There's no sense being exact about something if you don't even
	know what you're talking about.

(in "Exploring Requirements: Quality BEFORE Design", D. Gause and G.
Weinberg, Dorset House, 1989).

My own view is that formal methods and rigorous development techniques
are important, but that they will have little effect on the state of
the practice until their essence can be distilled and presented in the
same way as traditional engineering mathematics.

Mike Lutz

P.S. Lest I be accused of rampant philistinism by adherents of the formal
methods school, let me recommend two excellent books: David Gries's
"The Science of Programming" (Springer-Verlag, 1981) and Cliff Jones's
"Systematic Program Development Using VDM" (Prentice-Hall, 1986).
Mike Lutz	Rochester Institute of Technology, Rochester NY
UUCP:		{rutgers,cornell}!rochester!rit!mjl
INTERNET:	mjlics@ultb.isc.rit.edu

chrisp@regenmeister.uucp (Chris Prael) (11/16/89)

From article <1398@cs.rit.edu>, by mjl@cs.rit.edu:
 
> Which leads to another interesting point: most engineers are not
> particularly good mathematicians.

Perhaps it would be well to take your point a little further: engineers
are not supposed to be mathematicians at all.  An engineer does need to
have a fair amount of mathematics in his/her toolbox, but that is a far
cry from any pretence of needing to be a mathematician.

> When was the last time that an engineer
> actually *proved* (in the rigorous mathematical sense) that the
> application of a differential equation or a linear transform was
> correct?

And why should the engineer have to waste time doing so?  "Proving" an
equation or its application is completely beside the point.
 
> [...great anecdote...]
 
> This has given me insight into one of the raging issues in the software
> development community today, namely the push by some for greater
> formalism (in the sense of mathematical rigor).  ...
> To the extent that such formalism can be shown to be inherent to
> quality software development, I support it.  

You were very polite about this, perhaps too polite.  My background in math,
before I got into computing, early led me to the conclusion that 
computing and math have nothing fundamental in common.  Most of the 
failings of Computer Science stem from the attempt to make computing a 
branch of mathematics.  The attempt displays profound misunderstandings 
of the fundamentals of mathematics and of the fundamentals of computing.

This confusion seems to stem from the fact that arithmetical and
mathematical applications were the earliest to which computing was 
applied.  Those applications are not and never were fundamental to 
computing.  They are fundamental to the implementation of FORTRAN
libraries, but FORTRAN is a perishingly small corner of computing.  It 
would be as appropriate to assert that drafting is now the foundation of 
computing because there are so many CAD applications.

One of the postings yesterday (my apologies for not being more
specific) pointed to the report of "The Task Force on the Core of
Computer Science" published in the Jan 89 issue of CACM as an example of
how the field is growing.  I agree, but not in the same sense.  The
report is so rooted in the idea of computing as imitation mathematics
that it is meaningless to the great majority of the field.

> My own view is that formal methods and rigorous development techniques
> are important, but that they will have little effect on the state of
> the practice until their essence can be distilled and presented in the
> same way as traditional engineering mathematics.

Here, I find the need to point back to what I said before about freshman
calculus, which perfectly exemplifies engineering mathematics.  It is the 
memorization of a body of recipes drawn from mathematics.  There is no 
structure, there is no theory.  It is not mathematics at all.  But, the
recipes are mandatory elements of the engineer's tool box.  

Computing does not have that excuse.  Only two modest segments of
mathematics contain elements that can even remotely be called
fundamental contents of our toolkit: elementary arithmetic and boolean
algebra.  Both of these are pretty elementary when compared to the math
recipes needed by civil, mechanical, and electrical engineers.  Is that
why we pretend to mathematical content that is not there?

Formal methods, in the mathematical sense, are as meaningful to
computing as they are to civil engineering: not at all.  I will not say
that formal methods can never be of use to us.  But I will say that if a
formal method is ever concocted that is actually functional in
computing, it will be found to be equally useful in designing and
implementing pencils, coffee cups, aqueducts, connecting rods, and light
bulbs.

Chris Prael

djones@megatest.UUCP (Dave Jones) (11/16/89)

> In article <34754@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>...
>
>First year calculus is not taught for any reason having anything to do
>with the education of mathematicians.  It is taught because the teaching
>of physics and most forms of engineering is crippled without it.  
> 

I must differ. You can now buy a cheap symbolic calculus calculator
which will do everything taught in those entry level courses, and more.
And quicker, and more accurately than you can do it yourself.

But physics does have something to do with it. They teach calculus
that way because of inertia.

UH2@PSUVM.BITNET (Lee Sailer) (11/17/89)

In article <10072@goofy.megatest.UUCP>, djones@megatest.UUCP (Dave Jones) says:
>
>> In article <34754@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael)
>writes:
>>...
>>
>>First year calculus is not taught for any reason having anything to do
>>with the education of mathematicians.  It is taught because the teaching
>>of physics and most forms of engineering is crippled without it.
>>
>
>I must differ. You can now buy a cheap symbolic calculus calculator
>which will do everything taught in those entry level courses, and more.
>And quicker, and more accurately than you can do it yourself.
>

But, as Gerry Weinberg says, it is not know-how that counts, it is
know-when.  You can give your symbolic calculus calculator (Maple?
Mathematica?) to a chimp, if you want, but she won't be able to use it
for anything useful.

crm@romeo.cs.duke.edu (Charlie Martin) (11/17/89)

I think Mike Lutz was onto something here: there is a difference between
the USE of mathematics by engineers and the discovery/invention of new
mathematics by mathematicians.  But in terms of the specific question
of greater formalism in software engineering, we still have the problem
that many of the mathematical tools we need are not yet
discovered/invented.

Charlie Martin (crm@cs.duke.edu,mcnc!duke!crm) 

madd@world.std.com (jim frost) (11/17/89)

djones@megatest.UUCP (Dave Jones) writes:
>> In article <34754@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>>First year calculus is not taught for any reason having anything to do
>>with the education of mathematicians.  It is taught because the teaching
>>of physics and most forms of engineering is crippled without it.  

>I must differ. You can now buy a cheap symbolic calculus calculator
>which will do everything taught in those entry level courses, and more.
>And quicker, and more accurately than you can do it yourself.

And you'll miss out on all the relationships that are obvious if you
know calculus.  Most of my high-school physics finally started making
sense when I took my first calculus course and saw how many of those
formulae were derived.

I would feel better if engineers knew WHY things worked, and I believe
that understanding the mathematics behind the scenes is crucial to
understanding the behavior of real-life systems.

jim frost
software tool & die
madd@std.com

dar@nucleus.UUCP (Dario Alcocer) (11/17/89)

In article <34770@regenmeister.uucp>, chrisp@regenmeister.uucp (Chris Prael) writes:
> 
> You were very polite about this, perhaps too polite.  My background in math,
> before I got into computing, early led me to the conclusion that 
> computing and math have nothing fundamental in common.  Most of the 
> failings of Computer Science stem from the attempt to make computing a 
> branch of mathematics.  The attempt displays profound misunderstandings 
> of the fundamentals of mathematics and of the fundamentals of computing.
> 
> [text deleted...]
> 
> Computing does not have that excuse.  Only two modest segments of
> mathematics contain elements that can even remotely be called
> fundamental contents of our toolkit: elementary arithmetic and boolean
> algebra.  Both of these are pretty elementary when compared to the math
> recipes needed by civil, mechanical, and electrical engineers.  Is that
> why we pretend to mathematical content that is not there?
> 
> Chris Prael

I'll have to disagree with you Chris.  I can think of two branches of
mathematics that have _everything_ to do with computing...

Graph theory - Has contributed many of the data structures we use:
trees, directed and undirected graphs; just read DDJ, Sept 89, regarding
A* autorouting and simulated annealing.

Combinatorics - the study of _discrete_ mathematics, useful in algorithm design
and analysis.

Many concepts in computing can be studied and learned without any regard to
the mathematics of them.  However, I think that the relevance of mathematics
is clear when it comes to developing new techniques and algorithms in the
computing field.
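
To make the graph connection concrete, here is a toy sketch of my own
(from no particular text): a binary search tree is a rooted directed
graph with out-degree at most two, plus an ordering invariant on the
edges.

    /* Toy binary search tree -- a constrained directed graph. */
    #include <stdio.h>
    #include <stdlib.h>

    struct node {
        int key;
        struct node *left, *right;      /* the two outgoing edges */
    };

    struct node *insert(struct node *t, int key)
    {
        if (t == NULL) {                /* add a new leaf vertex */
            t = malloc(sizeof *t);
            t->key = key;
            t->left = t->right = NULL;
        } else if (key < t->key)
            t->left = insert(t->left, key);
        else if (key > t->key)
            t->right = insert(t->right, key);
        return t;
    }

    void inorder(struct node *t)        /* visits keys in sorted order */
    {
        if (t != NULL) {
            inorder(t->left);
            printf("%d\n", t->key);
            inorder(t->right);
        }
    }

    int main(void)
    {
        static int keys[] = { 5, 2, 8, 1, 9 };
        struct node *t = NULL;
        int i;

        for (i = 0; i < 5; i++)
            t = insert(t, keys[i]);
        inorder(t);                     /* prints 1 2 5 8 9 */
        return 0;
    }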

Dario Alcocer
dar@nucleus.mi.org

reino@cs.eur.nl (Reino de Boer) (11/17/89)

dar@nucleus.UUCP (Dario Alcocer) writes:

>In article <34770@regenmeister.uucp>, chrisp@regenmeister.uucp (Chris Prael) writes:
>> .....
>> computing and math have nothing fundamental in common.  Most of the 
>I'll have to disagree with you Chris.  I can think of two branches of
>mathematics that have _everything_ to do with computing...

>Graph theory - Has contributed many of the data structures we use:
>Combinatorics - the study of _discrete_ mathematics, useful in algorithm design

And how about computational complexity theory, logic, and numerical
mathematics?

-- 
Reino R. A. de Boer
Erasmus University Rotterdam ( Informatica )
e-mail: reino@cs.eur.nl

chrisp@regenmeister.uucp (Chris Prael) (11/18/89)

From article <5481@nucleus.UUCP>, by dar@nucleus.UUCP (Dario Alcocer):
> In article <34770@regenmeister.uucp>, chrisp@regenmeister.uucp (Chris Prael) writes:
>> The attempt displays profound misunderstandings 
>> of the fundamentals of mathematics and of the fundamentals of computing.

 
> I'll have to disagree with you Chris.  I can think of two branches of
> mathematics that have _everything_ to do with computing...
 
> Graph theory - Has contributed many of the data structures we use:
> trees, directed and undirected graphs; just read DDJ, Sept 89, regarding
> A* autorouting and simulated annealing.

You have mixed two issues, Dario.  

First, while graph theory may have been used to explain them afterwards, 
the data structures we use were generally cooked up without reference to 
graph theory.  Nothing fundamental to computing about graph theory here.

Second, graph theory is obviously fundamental to the solution of a small
but useful class of applications (such as trace routers for PC board
design).  That does not make it fundamental to computing.  In the same 
wise, numerical analysis is fundamental to the solution of another small 
class of applications (such as trig functions).  That does not make
numerical analysis fundamental to computing.
 
> Combinatorics - the study of _discrete_ mathematics, useful in algorithm design
> and analysis.

Perhaps.  The design of algorithms is primarily deduced from the
information structure(s) to be operated on.
 
> Many concepts in computing can be studied and learned without any regard to
> the mathematics of them.  However, I think that the relevance of mathematics
> is clear when it comes to developing new techniques and algorithms in the
> computing field.

Which circles us back to my original assertion: your claim implies
that your understanding of either or both fields is insufficient.  I
wish there were a politer way to say that. :-)

Chris Prael
 
> Dario Alcocer
> dar@nucleus.mi.org

mjl@cs.rit.edu (11/19/89)

Since I started this thread, let me toss out a few more thoughts.  I
think the most important contribution of mathematics to computer
science/software engineering is its mode of inquiry.  I won't discuss
here the particular topics that have been bandied about as being more
or less "related".  Instead, I'd say that it is the approach to problem
solving -- defining a specification or reasoning about a program in the
context of a rigorous axiomatic system -- that most closely ties
mathematics and computing.  In this respect, a good dose of discrete
mathematics or basic modern algebra is appropriate, not so much because
of the topics per se, but rather because the form of the proofs in
these domains maps most closely onto the formal models of computation.
(Of course, the topics themselves contain gems of abstraction and
generalization, for those willing to do some mining.)

What bothers me most is that this implies a significantly deeper
understanding of mathematical principles than is required by other
engineering disciplines.  Or, to phrase it slightly differently:

	Formal methods of software engineering require the engineer
	to think like a mathematician.  This is untrue of any other
	engineering discipline I am aware of.  Why the discrepancy?
	Can we address this by creating "boilerplate" mathematics
	for software engineering?

Mike Lutz
Mike Lutz	Rochester Institute of Technology, Rochester NY
UUCP:		{rutgers,cornell}!rochester!rit!mjl
INTERNET:	mjlics@ultb.isc.rit.edu

jhan@cbnewsd.ATT.COM (james.e.hanlon) (11/20/89)

In article <1408@cs.rit.edu>, mjl@cs.rit.edu writes:
> science/software engineering is its mode of inquiry.  I won't discuss

> mathematics and computing.  In this respect, a good dose of discrete
> mathematics or basic modern algebra is appropriate, not so much because
 
> 	Formal methods of software engineering require the engineer
> 	to think like a mathematician.  This is untrue of any other
> 	engineering discipline I am aware of.  Why the discrepancy?
> 	Can we address this by creating "boilerplate" mathematics
> 	for software engineering?
> 
> Mike Lutz
 
Mike Lutz has stated eloquently what has been on my mind for a long time now,
as I have struggled to get my arms around a very complex, abstract subject:
just what constitutes Software Engineering, and how should we best prepare
ourselves to go about doing it?  I've concluded that S.E. should be
subdivided into three areas, all concerned with Discrete Logic Systems: their
Description, their Analysis, and their Construction. Conventional educational
practice fails us because it
	
	1. is not systematic, being component-oriented rather than
	   systems-oriented;

	2. offers little or no guidance regarding systems description tools,
	   techniques, goals, or philosophies;

	3. concentrates on analysis of only the historically numerical aspects
	   of software, namely algorithms; and

	4. persists in casting most solutions to problems in terms of linear,
	   sequentially oriented, languages, which have obscure syntax and
	   nonobvious side-effects.

For its part, Industry administers the second of the one-two punch
combination, by making formal policy statements that it will only hire
individuals who have thrived under such a restricted intellectual regimen.

It seems to me that most problems with software are Big Picture problems, and
that they could be sidestepped by workers with a Formal Systems Architecture
approach. With such talent at hand, problems like "How should I best track
field problem reports, and map them into my SCCS?" are replaced by "Aha, I see
that the multi-site architecture risks some update anomalies. I suggest
timestamping our transactions with a periodically resynchronized local
time-of-day clock to avoid them." 

I'm not sure that all mathematicians adopt the perspective implied above
when going about their daily work; but several mathematical "modes of inquiry"
definitely occur in software engineering. 

Learning these can be difficult in itself, but the first hurdle, finding out
just what modes are appropriate, presents an even greater immediate problem.
Apart from a grounding in the basics of sets and graphs that one gets in
Discrete Structures courses, I have personally found very helpful: Queueing
Theory, Operations Research, classical Formal Logic, Type Theory, the
philosophical origins of Set Theory, and certain areas of Formal Semantics
(note: some of these topics are not on any CS/SE curriculum; they're easy
enough to find books on in the library, though). An odd subject, but a
common one in abstract areas like Software, is the Epistemology of Personal
Belief Systems; it is tough to find courses on, but accessible through
"popular" logicians like Smullyan. After becoming aware of--and later
familiar with--the array
of supporting material available to the software engineer from other fields,
you will occasionally be gratified by moments of real communication with those
in other areas of your organization. They frequently have studied classic
problems that have SE implications, but under different names.


    Supporting Personal Anecdote

    I was trying to convince a VP Engineering that some subset of field
    problems derived from the software engineers' failure to make the program's
    task scheduling algorithm explicit. As I went on about processes and
    ports, and memory allocation, and time slices, all being finite, etc., he
    interrupted with "It's just Bin Packing". Which is how he had learned 
    scheduling theory--I had learned Operating Systems.
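
For the curious, the VP's abstraction fits in a few lines of C.  The
task sizes below are invented and the heuristic (first fit) is the
simplest one in the book, but it shows how his vocabulary and mine
named the same problem:

    /* First-fit bin packing: put each task in the first bin
       (time slice, memory region, ...) that still has room. */
    #include <stdio.h>

    #define NBINS  10
    #define BINCAP 100

    int main(void)
    {
        static int task[] = { 70, 40, 30, 30, 20, 10 };
        int used[NBINS];
        int i, b;

        for (b = 0; b < NBINS; b++)
            used[b] = 0;
        for (i = 0; i < 6; i++)
            for (b = 0; b < NBINS; b++)
                if (used[b] + task[i] <= BINCAP) {
                    used[b] += task[i];         /* first fit wins */
                    printf("task of size %d -> bin %d\n", task[i], b);
                    break;
                }
        return 0;
    }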

To return to the original poster's point, I think that, in some ways, there is
a "boilerplate" solution to the Software Engineer's problems--it's just
that the necessary patterns of thought and expression involve what are
presently considered impossibly esoteric subject areas. Does the Engineer have
to be Philosopher, Epistemologist, Logician, Semanticist? Perhaps not
literally; but such fields do provide intellectual support for a more distant
systems perspective. I have seen too many projects, and too many engineers,
wallow in minutiae, to dismiss such notions out of hand.

Comments, anyone?

Jim Hanlon				AT&T-BL

bradlee@cg-atla.UUCP (Rob Bradlee) (11/20/89)

In article <89320.110032UH2@PSUVM.BITNET> UH2@PSUVM.BITNET (Lee Sailer) writes:
>  You can give your symbolic calculus calculator (Maple?
>Mathematica?) to a chimp, if you want, but she won't be able to use it
>for anything useful.

Not True!  Here at Agfa we have employed dozens of chimps to perform our
development work.  We give them the latest in hardware and software tools
and they diligently type away all day.  As far as we can tell, they produce
more lines of code with fewer bugs than our human developers.  Well, of
course they do take some time out each day for mutual grooming to remove
those nasty little fleas and lice from each other's fur, but we bury the
time on our time-sheets under "bug-fixing".  
    Finally, I think you all need to open your minds to new techniques 
like lower-species development methods.  I'm tired of the bigotry and 
intolerance shown towards our furry cousins.

				Rob

P. S.  Wanna bet someone takes this seriously?

-- 
Rob Bradlee  w:(508)-658-5600 X5153  h:(617)-944-5595
AGFA Compugraphic Division.    ...!{decvax,samsung}!cg-atla!bradlee
200 Ballardvale St.
Wilmington, Mass. 01887           The Nordic Way: Ski till it hurts!

madd@world.std.com (jim frost) (11/21/89)

chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <5481@nucleus.UUCP>, by dar@nucleus.UUCP (Dario Alcocer):
>> I'll have to disagree with you Chris.  I can think of two branches of
>> mathematics that have _everything_ to do with computing...
> 
>> Graph theory

>First, while graph theory may have been used to explain them afterwards, 
>the data structures we use were generally cooked up without reference to 
>graph theory.

This is true, but this is not the way it should be.  It would have
been easier to cook up the data structures with some knowledge of
graph theory to begin with.  Dario's point still stands.

Consider another field.  Thomas Edison tried (how many?  1500 rings a
bell) compounds before finding that carbonized thread could be used as
a filament.  If he had spoken with a competent chemist at the time he
wouldn't even have had to look.  Thus Edison, if he had had a
background in chemistry, could have been more effective.  He made the
light bulb, but he did it the hard way.

I've found mathematics to be priceless in both practical and
theoretical computing.  It certainly can help us avoid doing things the hard
way, and we can avoid the mistakes of our predecessors who had to do
so because of a poor understanding of the problems at hand.

jim frost
software tool & die
madd@std.com

chrisp@regenmeister.uucp (Chris Prael) (11/21/89)

From article <1408@cs.rit.edu>, by mjl@cs.rit.edu:
> Since I started this thread, let me toss out a few more thoughts.  I
> think the most important contribution of mathematics to computer
> science/software engineering is its mode of inquiry.  I'd say that it 
> is the approach to problem solving -- defining a specification or 
> reasoning about a program in the context of a rigorous axiomatic 
> system -- that most closely ties mathematics and computing.

My observation and experience is that this relationship is historical in
origin.  It traces to the fact that most computer science departments 
were spun off from mathematics departments.  I watched the process while 
I earned my BS in math.  From what I saw, it appeared that many of the
people who moved from math to computers did it not because they were 
interested in computers but because they were not particularly successful at
math.  This resulted in "computer science" departments that were really
bad imitations of mathematics departments.  The paragraph quoted above
implies that this is still true.

There are two other sources of the mathematical misapprehension.  The
first is that the earliest applications performed with computers were
mainly numerical analysis problems.  As a result, many thought that the
application was the field.  The other is the fact that programs are
completely abstract.  So, it is not surprising that a lay person would
assume that because computing and mathematics share the attribute of
having no concrete representation, they are the same.  This confusion is
hard to excuse in a professional.

> In this respect, a good dose of discrete
> mathematics or basic modern algebra is appropriate, not so much because
> of the topics per se, but rather because the form of the proofs in
> these domains maps most closely onto the formal models of computation.

Perhaps this is another way of saying that the formal models of
computation have been copied from that of modern algebra.

> (Of course, the topics themselves contain gems of abstraction and
> generalization, for those willing to do some mining).

I found modern algebra to be a little too long on recipes and a lot too
short on theoretical structure.  Far more useful would be time spent on
naive set theory, properties of the real line, logic (not the primitive
stuff!), and metatheory (theory of theories).  These are topics that you
can (and must) really chew on!

> What bothers me most is that this implies a significantly deeper
> understanding of mathematical principles than is required by other
> engineering disciplines.  

Which could be reasonably taken to imply that mathematical principles
have as much to do with software engineering as they do with the other
engineering disciplines.

> Or, to phrase it slightly differently:
> 	Formal methods of software engineering require the engineer
> 	to think like a mathematician.  This is untrue of any other
> 	engineering discipline I am aware of.  Why the discrepancy?

The discrepancy is clearly in the formal methods.  They have nothing 
functional to do with software and nothing functional to do with
engineering.  An engineer DOES NOT think like a mathematician.  He/she
thinks like an engineer!  Trying to ape mathematicians will not make one
an engineer.  Nor does it make one a mathematician.  It makes one an 
imitation mathematician!

> 	Can we address this by creating "boilerplate" mathematics
> 	for software engineering?

No!  This problem can only be addressed by ending the attempt to be
imitation mathematicians. 

The fact that you are questioning a quarter century of dogma is a major
step forward.  Take the next step.

Chris Prael

kurtl@fai.UUCP (Kurt Luoto) (11/22/89)

In article <5481@nucleus.UUCP> dar@nucleus.UUCP (Dario Alcocer) writes:
>In article <34770@regenmeister.uucp>, chrisp@regenmeister.uucp (Chris Prael) writes:
>> 
>> Computing does not have that excuse.  Only two modest segments of
>> mathematics contain elements that can even remotely be called
>> fundamental contents of our toolkit: elementary arithmetic and boolean
>> algebra.  Both of these are pretty elementary when compared to the math
>> recipes needed by civil, mechanical, and electrical engineers.  Is that
>> why we pretend to mathematical content that is not there?
>> 
>> Chris Prael
>
>I'll have to disagree with you Chris.  I can think of two branches of
>mathematics that have _everything_ to do with computing...
>
>Graph theory - [...]
>
>Combinatorics - [...]
>
>Dario Alcocer

I agree with Dario.  I would also mention computability and complexity
theory and numerical analysis as branches of mathematics which are
fundamental to computing in general.

I would guess that the majority of computing practitioners,
which includes programmers, software engineers, etc,
use relatively little mathematics on a daily basis.
By the same token, most of the daily routine of other professions
probably includes little of the practitioner's specific training
in that profession.  (For example, I'd like to know how much of
a senior physicist's time is devoted to actually "doing physics"
rather than writing grant proposals, supervising grad students
or junior physicists, filling out purchase orders, fighting
political battles within and without his department, etc.)
But that is not to say that there is no content to these fields
of study.  Nor does it mean that the practitioner can safely
ignore studying the fundamentals of his field.

Pet peeve: I hate the term "computer science", but it seems that
we will be stuck with it for some time to come.
-- 

----------------
Kurt W. Luoto     kurtl@fai.com   or   ...!sun!fai!kurtl

chrisp@regenmeister.uucp (Chris Prael) (11/22/89)

From article <1989Nov20.160513.19313@world.std.com>, by madd@world.std.com (jim frost):
> chrisp@regenmeister.uucp (Chris Prael) writes:
>>From article <5481@nucleus.UUCP>, by dar@nucleus.UUCP (Dario Alcocer):
>>> I'll have to disagree with you Chris.  I can think of two branches of
>>> mathematics that have _everything_ to do with computing...
 
>>> Graph theory
 
>>First, while graph theory may have been used to explain them afterwards, 
>>the data structures we use were generally cooked up without reference to 
>>graph theory.
 
> This is true, but this is not the way it should be.  It would have
> been easier to cook up the data structures with some knowledge of
> graph theory to begin with.  Dario's point still stands.

One of the things that I learned some years ago was that it was very
rare for a person to use the word "should" and have a useful connection
with reality at the same time.  Dario's point stands as well as the next
fantasy.
 
The tale of Edison and supposition about a chemist would be quite good
if it did not assume that the knowledge that resulted from Edison's work
had been present before he started.  There were trained chemists and trained
physicists on Edison's staff!

> I've found mathematics to be priceless in both practical and
> theoretical computing.  

Some examples would be useful.  

> It certainly can help us avoid doing things the hard
> way, and we can avoid the mistakes of our predecessors who had to do
> so because of a poor understanding of the problems at hand.

More mistakes and misunderstandings in computing and its development 
can be traced to pseudo-mathematical pretences than can be traced to 
mathematical ignorance.  Computing is being increasingly retarded by 
the uncomprehending aping of mathematics and mathematicians.  Bad
imitations of mathematics do more to obscure the problems at hand than
they do to clarify issues.  Like Dario's, your argument raises the 
question of how deep or shallow your comprehension of either mathematics 
or computing is.

One thing that becomes particularly clear if one examines the development 
of the field over the last forty years is that the pace of innovation has 
decreased drastically in the last 20 to 25 years.  Kids come out of school 
with 25 year old techniques and ideas, believing that this stuff is brand 
new.  One can blame the slowing partly on the "maturing" of computing as a 
field.  More serious causes are the increasing reliance on rote
"learning" of techniques and playing at mathematics.

madd@world.std.com (jim frost) (11/23/89)

chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <1989Nov20.160513.19313@world.std.com>, by madd@world.std.com (jim frost):
>> I've found mathematics to be priceless in both practical and
>> theoretical computing.  

>Some examples would be useful.

Most of the networks you're using use high-powered math, particularly
graph theory, to get reasonable performance.  Go pick up a networking
book worth its name and you'll find tons of those little squiggles all
through it.

Statistics are in use every day.  Ever used a paging kernel?  They use
statistics to predict future reference patterns.  Which page should I get
rid of?  Which are likely to be used soon?  There's math behind all
that, since the pager is doing statistical analysis (often tempered
with a few generalizations, like "programs run linearly", which may
or may not hold water).
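
For flavor, here's the simplest such statistical bet -- least-recently-
used eviction, "assume the recent past predicts the near future" -- as
a toy sketch of mine, not any real kernel's code:

    #include <stdio.h>

    #define NFRAMES 4

    static long last_use[NFRAMES];  /* pseudo-time of last reference */
    static long now = 0;

    void touch(int frame) { last_use[frame] = ++now; }

    int choose_victim(void)         /* frame with the oldest reference */
    {
        int f, victim = 0;
        for (f = 1; f < NFRAMES; f++)
            if (last_use[f] < last_use[victim])
                victim = f;
        return victim;
    }

    int main(void)
    {
        touch(2); touch(0); touch(3); touch(1); touch(0);
        printf("evict frame %d\n", choose_victim());    /* frame 2 */
        return 0;
    }

The mathematics is in knowing when that bet is a good one.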

Much of graphics is modelling real life, which rests on physics, which
rests on mathematics.  Ever done ray-tracing?

These are off the top of my head.  Go get an algorithms book and go
through it and you'll find HUNDREDS of examples, most of which will be
better than mine because most of those guys are real theoreticians.

>> It certainly can help us avoid doing things the hard
>> way, and we can avoid the mistakes of our predecessors who had to do
>> so because of a poor understanding of the problems at hand.

>More mistakes and misunderstandings in computing and its development 
>can be traced to pseudo-mathematical pretences than can be traced to 
>mathematical ignorance.

Huh?  Most of the mistakes I've seen were the result of people not
understanding complexity, something which you learn early in
mathematics.  Bubble-sorts don't work on million-record databases.  2n
programmers don't create twice the code in the same amount of time.
Summing a series explicitly is not as effective as calculating a general
case and using it instead.  These are the kinds of mistakes you see
all over the place, most often with people who spend all of their time
worrying about having the most efficient code.
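
A two-minute illustration of that last point, a toy of mine: the loop
is O(n), the closed form is O(1), and they agree.

    #include <stdio.h>

    long sum_loop(long n)           /* add 1..n the hard way */
    {
        long i, s = 0;
        for (i = 1; i <= n; i++)
            s += i;
        return s;
    }

    long sum_closed(long n)         /* Gauss: n(n+1)/2, no loop */
    {
        return n * (n + 1) / 2;
    }

    int main(void)
    {
        printf("%ld %ld\n", sum_loop(1000L), sum_closed(1000L));
        return 0;
    }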

I'd like to see counterexamples.

>As did Dario, your argument raises the 
>question of how deep or shallow your comprehension of either mathematics 
>or computing are.

At least I acknowledge a relationship.  Do you?

>One thing that becomes particularly clear if one examines the development 
>of the field over the last forty years is that the pace of innovation has 
>decreased drastically in the last 20 to 25 years.  Kids come out of school 
>with 25 year old techniques and ideas, believing that this stuff is brand 
>new.  One can blame the slowing partly on the "maturing" of computing as a 
>field.

I'm confused here.  I mean, really confused.  I was shocked when
studying algorithms to find that most of the people involved are still
alive and working on things.  Development hasn't slowed, it's
accellerated to the point where it's impossible to keep up with it
all.  People building on previous theories or proving they degrade in
this-or-that circumstance or showing that some other technique is
optimal, or nearly so, or as optimal as possible given that you can't
know the future.  I read the technical reports and every month there's
something new.  If you don't see it, you're not looking for it, or
you're missing it because it's a much wider field than it used to be.
It certainly doesn't look slow to me.

jim frost
software tool & die
madd@std.com

chrisp@regenmeister.uucp (Chris Prael) (11/23/89)

From article <2608@fai.UUCP>, by kurtl@fai.UUCP (Kurt Luoto):
> I would also mention computability and complexity
> theory and numerical analysis as branches of mathematics which are
> fundamental to computing in general.

1. Numerical analysis is as fundamental to computing as is tax
computation or paycheck printing.  No more and no less.  Numerical
analysis is an application.

2. Computability is a topic of numerical analysis.  A part of an
application.

3. Complexity theory is math?  Perhaps.

> I would guess that the majority of computing practitioners,
> which includes programmers, software engineers, etc,
> use relatively little mathematics on a daily basis.

Quite true, and for good reason.

> By the same token, most of the daily routine of other professions
> probably includes little of the practitioner's specific training
> in that profession.  

You might want to question a structural engineer (such as my wife), a
lawyer (related to a number of those), or an M.D. (various friends).
You would find that your assertion is false in those cases.

>(For example, I'd like to know how much of
> a senior physicist's time is devoted to actually "doing physics"
> rather than writing grant proposals, supervising grad students
> or junior physicists, filling out purchase orders, fighting
> political battles within and without his department, etc.)

Not a functional example.  Perhaps even a dysfunctional example.
Physics is not a profession, it is a field of study.  Interestingly, the
same is true of mathematics.  It is not a profession, it is a field of
study.  In your example, the "senior physicist"'s profession appears to
be that of teaching physics.  

> But that is not to say that there is no content to these fields
> of study.  Nor does it mean that the practitioner can safely
> ignore studying the fundamentals of his field.

Very true.  But first, one must accurately identify the fundamentals of
the field.  The most basic problem in computing for 25 years is and has
been a consistent misidentification of the fundamentals of computing.
 
> Pet peeve: I hate the term "computer science", but it seems that
> we will be stuck with it for some time to come.

I can enthusiastically second you in this one.  Note the other fields
that are labeled ... science: political science, health science,
military science.

Chris Prael

peterd@cs.washington.edu (Peter C. Damron) (11/25/89)

In article <34818@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <2608@fai.UUCP>, by kurtl@fai.UUCP (Kurt Luoto):
>> I would also mention computability and complexity
>> theory and numerical analysis as branches of mathematics which are
>> fundamental to computing in general.

>2. Computability is a topic of numerical analysis.  A part of an
>application.

Computability and complexity theory is unrelated to numerical analysis.
Complexity theory is the study of what is computable and how difficult
things are to compute.  Complexity theory is fundamental to computer science.
Numerical analysis is the study of the computation of numerical quantities
using finite precision arithmetic.

Anyone who is not aware of the applications of complexity theory
can hardly claim to be a software engineer.

Peter.

---------------
Peter C. Damron
Dept. of Computer Science, FR-35
University of Washington
Seattle, WA  98195

peterd@cs.washington.edu
{ucbvax,decvax,etc.}!uw-beaver!uw-june!peterd

mike@cs.arizona.edu (Mike Coffin) (11/25/89)

From article <34788@regenmeister.uucp>, by chrisp@regenmeister.uucp (Chris Prael):
> Second, graph theory is obviously fundamental to the solution of a small
> but useful class of applications (such as trace routers for PC board
> design).  That does not make it fundamental to computing.

I think graph theory would have to be considered fundamental to
algorithm design.  A huge number of computing problems are naturally
modeled by graphs because graphs are the natural model for collections
of objects that are related in some way.  Problems as different
looking as process scheduling, register allocation, and DNA sequence
reconstruction lead (very naturally) to solving problems over graphs.
Any good designer of algorithms must be familiar with basic graph
algorithms such as topological sorting, shortest-path algorithms,
network-flow algorithms, and depth- and breadth-first search.
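
As one illustration -- a back-of-the-envelope sketch, not production
code -- here is topological sorting of a small dependency graph, the
kind of ordering a make-like tool or a process scheduler needs:

    #include <stdio.h>

    #define N 4

    /* edge[i][j] != 0 means task i must precede task j */
    static int edge[N][N] = {
        { 0, 1, 1, 0 },     /* 0 -> 1, 0 -> 2 */
        { 0, 0, 0, 1 },     /* 1 -> 3 */
        { 0, 0, 0, 1 },     /* 2 -> 3 */
        { 0, 0, 0, 0 }
    };

    int main(void)
    {
        int indeg[N], done[N], i, j, k;

        for (i = 0; i < N; i++)
            indeg[i] = done[i] = 0;
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++)
                if (edge[i][j])
                    indeg[j]++;

        for (k = 0; k < N; k++) {
            for (i = 0; i < N && (done[i] || indeg[i] > 0); i++)
                ;               /* find a task with no pending deps */
            if (i == N) {
                printf("cycle in the graph!\n");
                return 1;
            }
            printf("task %d\n", i);
            done[i] = 1;
            for (j = 0; j < N; j++)
                if (edge[i][j])
                    indeg[j]--;
        }
        return 0;
    }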



-- 
Mike Coffin				mike@arizona.edu
Univ. of Ariz. Dept. of Comp. Sci.	{allegra,cmcl2}!arizona!mike
Tucson, AZ  85721			(602)621-2858

mjl@cs.rit.edu (11/27/89)

In article <34795@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <1408@cs.rit.edu>, by mjl@cs.rit.edu:
>> Since I started this thread, let me toss out a few more thoughts.  I
>> think the most important contribution of mathematics to computer
>> science/software engineering is its mode of inquiry.  I'd say that it 
>> is the approach to problem solving -- defining a specification or 
>> reasoning about a program in the context of a rigorous axiomatic 
>> system -- that most closely ties mathematics and computing.
>
>My observation and experience is that this relationship is historical in
>origin.  It traces to the fact that most computer science departments 
>were spun off from mathematics departments.
>
>...From what I saw, it appeared that many of the
>people who moved from math to computers did it not because they were 
>interested in computers but because they were not particularly successful at
>math.

Well, if this is the case, then it's one of history's serendipitous
accidents.  And you slander those who did "move to computer science
from mathematics."  Certainly there is a range of abilities amongst
those interested in computer science theory, but that's no different
from mathematics itself.  And I think we can safely say that Dana
Scott, Michael Rabin, Richard Karp, and others like them are first-rate
mathematicians in their own right.

Unlike you, I don't denigrate my mathematical background as preparation
for computer science and software engineering.  I just wonder whether
software engineers can make do with a variant of this background that
emphasizes its application rather than the underlying formalism.

>There are two other sources of the mathematical misapprehension.  The
>first is that the earliest applications performed with computers were
>mainly numerical analysis problems.

This is a red herring.  While numerical analysis is a particular
application of computers closely tied to a branch of traditional
mathematics, numerical analysis itself has had little influence on the
direction of formal methods.  You might as well say that optics had a
huge impact on computer science because the ability to compute FFTs led
to the design of more sophisticated lenses.

>> In this respect, a good dose of discrete
>> mathematics or basic modern algebra is appropriate, not so much because
>> of the topics per se, but rather because the form of the proofs in
>> these domains maps most closely onto the formal models of computation.
>
>Perhaps this is another way of saying that the formal models of
>computation have been copied from that of modern algebra.

Hey, if you've got a good model, why not use it?  Group theory has been
put to use in the design of error correcting codes, so why shouldn't
we use abstract algebra to model computation if it fits?  (I'd argue that
it's a pretty good fit for functional specification and design.)
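
To pick one tiny instance (a sketch of my own; any coding theory text
does it properly): the three check bits of a Hamming (7,4) code are
linear algebra over the two-element field, computed with nothing but
XOR:

    #include <stdio.h>

    /* Hamming (7,4): three parity bits, each an XOR (addition in
       GF(2)) over three of the four data bits.  Any single-bit error
       in the resulting 7-bit word can be located and corrected. */
    int encode(int d)                   /* d holds 4 data bits */
    {
        int d0 =  d       & 1;
        int d1 = (d >> 1) & 1;
        int d2 = (d >> 2) & 1;
        int d3 = (d >> 3) & 1;
        int p0 = d0 ^ d1 ^ d3;
        int p1 = d0 ^ d2 ^ d3;
        int p2 = d1 ^ d2 ^ d3;
        return d | (p0 << 4) | (p1 << 5) | (p2 << 6);
    }

    int main(void)
    {
        int d;
        for (d = 0; d < 16; d++)
            printf("%2d -> 0x%02x\n", d, encode(d));
        return 0;
    }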

>> (Of course, the topics themselves contain gems of abstraction and
>> generalization, for those willing to do some mining).
>
>I found modern algebra to be a little too long on recipes and a lot too
>short on theoretical structure.

Then I feel sorry for you.  My course stressed the logical progression
of ideas, with rigorous proofs all along the way.  My instructors
focused on the deeper intrinsic relationships between concepts with
little surface similarity.  Did you use Birkhoff and MacLane's "A
Survey of Modern Algebra"?  All in all, I took 4 semesters of standard
algebra courses as well as a seminar in my senior year on group
theory.  Sounds esoteric, but it was superb preparation for my graduate
work in CS (both *theory* and *practice*).

>> What bothers me most is that this implies a significantly deeper
>> understanding of mathematical principles than is required by other
>> engineering disciplines.  
>
>Which could be reasonably taken to imply that mathematical principles
>have as much to do with software engineering as they do with the other
>engineering disciplines.

Or it could imply that software engineering is significantly different
from other engineering disciplines in the *type* of mathematical foundations
it requires.  David Parnas' curriculum, mentioned by David Lamb in a
previous posting, implicitly disagrees with this assessment, but then
I disagree with his curriculum :-).

>> Or, to phrase it slightly differently:
>> 	Formal methods of software engineering require the engineer
>> 	to think like a mathematician.  This is untrue of any other
>> 	engineering discipline I am aware of.  Why the discrepancy?
>
>The discrepancy is clearly in the formal methods.  They have nothing 
>functional to do with software and nothing functional to do with
>engineering.

I STRONGLY disagree here.  I think the formal methods have a lot to do
with software engineering, I just don't think they are yet in a form
that is approachable by most practitioners.  It is not at all unusual
for mathematical theory to precede its engineering applicability by
decades.  Or centuries -- look at the time lapse between Fourier's work
and the development of computers and the FFT that made this work
applicable to realistic problems.  If I have a problem with formal
methods, it is that the proponents seem to believe they're at the "FFT"
level already.

Mike Lutz
Mike Lutz	Rochester Institute of Technology, Rochester NY
UUCP:		{rutgers,cornell}!rochester!rit!mjl
INTERNET:	mjlics@ultb.isc.rit.edu

brnstnd@stealth.acf.nyu.edu (Dan Bernstein) (11/28/89)

In article <9924@june.cs.washington.edu> peterd@june.cs.washington.edu (Peter C. Damron) writes:
> Computability and complexity theory is unrelated to numerical analysis.

There are two sides to complexity theory. Theoretical complexity theory
(better called computability theory) studies issues like P = NP.
Practical complexity theory (what ``computational complexity theory''
used to mean) studies fast algorithms for practical problems.

Particular problems in practical complexity theory (e.g., computing pi
to many digits; or computing the sine of a floating-point number) have
quite a lot to do with numerical analysis.
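
To see why the sine example is numerical analysis rather than mere
coding, try the naive route.  The sketch below (mine) truncates the
Taylor series after ten terms: fine near zero, garbage for large
arguments.  Choosing the range reduction and the truncation point is
exactly the analysis.

    #include <stdio.h>

    double naive_sin(double x)
    {
        double term = x, sum = x;
        int n;
        /* term_n = term_(n-1) * -x^2 / (2n (2n+1)) */
        for (n = 1; n <= 10; n++) {
            term *= -x * x / ((2.0 * n) * (2.0 * n + 1.0));
            sum += term;
        }
        return sum;
    }

    int main(void)
    {
        printf("%f\n", naive_sin(1.0));     /* about 0.841471 */
        printf("%f\n", naive_sin(50.0));    /* nonsense: series truncated */
        return 0;
    }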

(I'm in math, not computer science. Computability theory is definitely
CS; numerical analysis is definitely math; computational complexity
theory is somewhere in between.)

---Dan

kurtl@fai.UUCP (Kurt Luoto) (11/29/89)

In article <34804@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>From article <1989Nov20.160513.19313@world.std.com>, by madd@world.std.com (jim frost):
>> I've found mathematics to be priceless in both practical and
>> theoretical computing.  
>
>Some examples would be useful.  
>
>> It certainly can help us avoid doing things the hard
>> way, and we can avoid the mistakes of our predecessors who had to do
>> so because of a poor understanding of the problems at hand.
>
>More mistakes and misunderstandings in computing and its development 
>can be traced to pseudo-mathematical pretences than can be traced to 
>mathematical ignorance.  Computing is being increasingly retarded by 
>the uncomprehending aping of mathematics and mathematicians.  Bad
>imitations of mathematics do more to obscure the problems at hand than
>they do to clarify issues.

As you say, Chris, some examples would be useful.  Such vague broadsides
do not clarify your arguments.

Certainly incomprehension and bad imitation of mathematics (or of any
other technique or field of study for that matter) will generate
misunderstandings and mistakes and will obscure issues.  But that speaks
only of those who are incompetent.  It says nothing one way or the other
of the usefulness of mathematics in relation to computing.

>Like Dario's, your argument raises the 
>question of how deep or shallow your comprehension of either mathematics 
>or computing is.

Nor do ad hominem attacks make your arguments more convincing.

>One thing that becomes particularly clear if one examines the development 
>of the field over the last forty years is that the pace of innovation has 
>decreased drastically in the last 20 to 25 years.  Kids come out of school 
>with 25 year old techniques and ideas, believing that this stuff is brand 
>new.  One can blame the slowing partly on the "maturing" of computing as a 
>field.  More serious causes are the increasing reliance on rote
>"learning" of techniques and playing at mathematics.

Rote learning is precisely the thing that Dijkstra seems to rant against
in his writings.  I am not about to take Dijkstra's position (there does
seem to be a little of the Ivory Tower flavor to his writings).  However,
I have upon occasion seen the results of programmers who had apparently
too little mathematical competence.

On one project, while doing some maintenance, I ran across an existing
module which was supposed to select a set of memory block sizes for
a memory allocator.  The problem description sounded interesting,
so I studied the algorithm in hopes of learning something.  As I studied
it, I became suspicious.  I saw that the algorithm could not do what
the problem requirements demanded.  I proved it by providing a
counterexample (a pathological input case) and running it through the code.
Note that there were no coding errors, it was the algorithm that was at fault.
I also designed, proved, wrote and tested an algorithm that did work.  

Now the module in question had gone through all the production phases
(spec, design, code, test), complete with formal reviews.  In fact, it
was apparent that the first cut of the original implementor did not
work either, and what I was studying was the implementor's second pass
at the design.  It was apparent that nowhere along the process did the
implementor or any of the reviewers have the background that would have
led them to spot the inadequacy of the design.  Now you could chalk it up
to incompetent workers, but I can't shake the feeling that these people
never got the proper training in the first place.  It also shows that
a formal review procedure can fail (i.e. let a bad design go through)
if it does not include basics like demonstrating the soundness of an
algorithm, which is basically mathematics in application.
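
The shape of that demonstration is simple enough to sketch.  The
sizes, bound, and requirement below are invented (the real module's
spec was more involved), but this is the check the reviews never asked
for:

    #include <stdio.h>

    /* Hypothetical requirement: every request of 1..MAX bytes must be
       satisfiable by some block size with at most WASTE bytes of
       slack.  Checking it exhaustively either passes or produces a
       concrete counterexample. */
    #define MAX   1000
    #define WASTE 100

    static int sizes[] = { 16, 64, 256, 1024 };
    #define NSIZES (sizeof sizes / sizeof sizes[0])

    int main(void)
    {
        int req, best;
        unsigned i;

        for (req = 1; req <= MAX; req++) {
            best = -1;                  /* smallest block that fits */
            for (i = 0; i < NSIZES; i++)
                if (sizes[i] >= req && (best < 0 || sizes[i] < best))
                    best = sizes[i];
            if (best < 0 || best - req > WASTE) {
                printf("counterexample: request of %d bytes\n", req);
                return 1;
            }
        }
        printf("all request sizes covered\n");
        return 0;
    }

(This toy version reports a counterexample at 65 bytes: the jump from
64 to 256 wastes more than the bound allows.)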

More than gathering any specific set of techniques or accumulation of
facts (theorems, etc), proper mathematical training teaches a way of
thinking, a way of approaching problems.  It also encourages a certain
kind of skepticism (in a narrow sense), a "show-me" attitude in
regard to technical statements.

Your comments on the decreasing rate of innovation could be viewed
as evidence that we need better grounding in mathematics, i.e. real
mathematical training for our CS students, as opposed to the "aping"
or "playing at" mathematics that you mention.
-- 

----------------
Kurt W. Luoto     kurtl@fai.com   or   ...!sun!fai!kurtl

peterd@cs.washington.edu (Peter C. Damron) (11/29/89)

Sorry for posting this stuff to this group, but I hate to see misconceptions
go unnoticed.

Obviously, Dan Bernstein did not read what I wrote or did not understand it.

In article <4059@sbcs.sunysb.edu> brnstnd@stealth.acf.nyu.edu (Dan Bernstein) writes:
>In article <9924@june.cs.washington.edu> peterd@june.cs.washington.edu (Peter C. Damron) writes:
>> Computability and complexity theory is unrelated to numerical analysis.

>There are two sides to complexity theory. Theoretical complexity theory
>(better called computability theory) studies issues like P = NP.

Wrong.  Complexity and computability are two different things.
Computability is the study of what can be computed.  There are
intractable problems that cannot be computed (e.g. the Post correspondence
problem).  Complexity theory is the study of how hard things are to
compute.  Complexity theory divides problems into classes like P and NP
and studies the relationships between the problem classes.

>Practical complexity theory (what ``computational complexity theory''
>used to mean) studies fast algorithms for practical problems.

Wrong.  There is complexity theory, and there are applications of complexity
theory.  Your "practical complexity theory" is simply applications.
People studying complexity theory often develop new algorithms to solve
problems, but that is a side issue to showing the complexity class of
the problem.

>Particular problems in practical complexity theory (e.g., computing pi
>to many digits; or computing the sine of a floating-point number) have
>quite a lot to do with numerical analysis.

Wrong.  Computing digits of Pi or computing the sine function has nothing
to do with complexity theory (though these problems may be classified
into a complexity class).  How to do the computation of Pi or sine is
numerical analysis.  How difficult it is to compute those things is an
application of complexity theory.

>(I'm in math, not computer science. Computability theory is definitely
>CS; numerical analysis is definitely math; computational complexity
>theory is somewhere in between.)

Wrong.  Complexity theory is definitely computer science.

Also, there are two kinds of numerical analysis, the kind done by
mathematicians and the kind done by computer scientists (or engineers).
Mathematicians deal with abstract numbers like integers and reals.
Computer scientists deal with limited precision numbers.

Perhaps mathematicians don't understand computer science.

I'm not a computer science theory person, but at least I know what the
theory is all about.

Peter.

---------------
Peter C. Damron
Dept. of Computer Science, FR-35
University of Washington
Seattle, WA  98195

peterd@cs.washington.edu
{ucbvax,decvax,etc.}!uw-beaver!uw-june!peterd

chrisp@regenmeister.uucp (Chris Prael) (11/29/89)

From article <1417@cs.rit.edu>, by mjl@cs.rit.edu:
> In article <34795@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>>...From what I saw, it appeared that many of the
>>people who moved from math to computers did it not because they were 
>>interested in computers but because they were not particularly successful at
>>math.
 
> Well, if this is the case, then it's one of history's serendipitous
> accidents.  And you slander those who did "move to computer science
> from mathematics."

Not the people I experienced.  The people from whom I received the most
valuable formative experience (i.e., who taught me) were all in industry.

> Unlike you, I don't denigrate my mathematical background as preparation
> for computer science and software engineering.

I don't denigrate my mathematical background.  I recognize, on the basis
of knowledge and experience, that the fields are disjoint.  Pretending
that the math had an application that it does not would be denigrative.

>>> What bothers me most is that this implies a significantly deeper
>>> understanding of mathematical principles than is required by other
>>> engineering disciplines.  
>>
>>Which could be reasonably taken to imply that mathematical principles
>>have as much to do with software engineering as they do with the other
>>engineering disciplines.
> 
> Or it could imply that software engineering is significantly different
> from other engineering disciplines in the *type* of mathematical foundations
> it requires.  

You use an excuse that has been abused for a quarter of a century.  It
may be appropriate in computer science.  In software engineering it is a
refuge of the incompetent.  

I suspect that what we have really been saying here is, to paraphrase
Richard Botting, that it is past time for computer science and software
engineering to go their separate ways.  The Weltanschauungen differ too
greatly for most communication between the two fields to be
constructive.

Chris Prael

chrisp@regenmeister.uucp (Chris Prael) (11/30/89)

From article <2625@fai.UUCP>, by kurtl@fai.UUCP (Kurt Luoto):

Kurt, your tale is an excellent example of an engineering design
sequence.

> It also shows that
> a formal review procedure can fail (i.e. let a bad design go through)

The issue with reviews is not whether one can or cannot fail.  Of course
it can, as you experienced.  The issue is one of probabilities.  Design
reviews reduce the probability of failed designs, massively.

> if it does not include basics like demonstrating the soundness of an
> algorithm, which is basically mathematics in application.

As the prof used to say about an overly sparse proof (of mine), you have
not demonstrated the connection.  A review is an application of
mathematics to about the same extent as an inspection of the columns on
a double-decker freeway after a quake.

> More than gathering any specific set of techniques or accumulation of
> facts (theorems, etc), proper mathematical training teaches a way of
> thinking, a way of approaching problems.  It also encourages a certain
> kind of skepticism (in a narrow sense), a "show-me" attitude in
> regard to technical statements.

I can agree with this statement, but I would replace "proper" with
"excellent".  However, this is just as true of excellent teaching in
history, physics, or any other discipline you can name.  The major
problem is that the desired degree of excellence (or propriety if you
prefer) is found relatively rarely.

There is a wide variety of experiences available outside the academic
environment that are every bit as effective, some more so, in fostering
an appropriate degree of scepticism and an adequate level of
thoroughness.  I found racing sports cars to be at least as instructive
as the process of earning a math degree.
 
Chris Prael

mjl@cs.rit.edu (11/30/89)

In article <34851@regenmeister.uucp> chrisp@regenmeister.uucp (Chris Prael) writes:
>
>You use an excuse that has been abused for a quarter of a century.  It
>may be appropriate in computer science.  In software engineering it is a
>refuge of the incompetent.

Incompetence is in the eye of the beholder.  Perhaps your math
preparation was inadequate and flawed, in that it did not make you
competent enough to see a relationship that is apparent to me and many
others (see -- I can make an ad hominem attack too -- now can we get
back to the issue at hand?).

>I suspect that what we have really been saying here is, to paraphrase
>Richard Botting, that it is past time for computer science and software
>engineering to go their separate ways.

Assuming for the moment that Richard is right, you should realize that
many -- if not most -- of the formalists would end up on the software
engineering side of the fence.  They believe they have a lot to say
concerning the engineering of software artifacts.  I believe this too,
but I think the message isn't being framed very well.

>The Weltanschauungen differ too
>greatly for most communication between the two fields to be
>constructive.

I hope this is never true -- it would be like severing the link between
physics and electrical or mechanical engineering.  However strained the
relationship, the engineering disciplines still depend on the more
scientific/theoretical disciplines.  The same should hold true between
CS and SWE.

>Chris Prael

Mike Lutz
Mike Lutz	Rochester Institute of Technology, Rochester NY
UUCP:		{rutgers,cornell}!rochester!rit!mjl
INTERNET:	mjlics@ultb.isc.rit.edu

chrisp@regenmeister.uucp (Chris Prael) (11/30/89)

From article <9963@june.cs.washington.edu>, by peterd@cs.washington.edu (Peter C. Damron):
 
> Perhaps mathematicians don't understand computer science.

What if mathematicians do understand computer science?  

brnstnd@stealth.acf.nyu.edu (Dan Bernstein) (11/30/89)

In article <9963@june.cs.washington.edu> peterd@june.cs.washington.edu (Peter C. Damron) writes:
> Obviously, Dan Bernstein did not read what I wrote or did not understand it.

Peter, one of us is involved in computing a billion digits of e and knows
what he's talking about.

I'll go through these ``wrong''s one at a time. You say it's obvious
that I didn't read/understand your previous posting, but none of your
arguments are based on information in your previous posting; what do
you believe I missed?

> In article <4059@sbcs.sunysb.edu> brnstnd@stealth.acf.nyu.edu (Dan Bernstein) writes:
> >In article <9924@june.cs.washington.edu> peterd@june.cs.washington.edu (Peter C. Damron) writes:
> >> Computability and complexity theory is unrelated to numerical analysis.
> >There are two sides to complexity theory. Theoretical complexity theory
> >(better called computability theory) studies issues like P = NP.
> Wrong.  Complexity and computability are two different things.

There was once a useful semantic distinction between computability
theory and (computational) complexity theory. I was pointing out that
when people say ``theoretical complexity theory,'' they usually mean
what ``computability'' used to mean. And you say that's wrong?

> Computability is the study of what can be computed.

> Complexity theory is the study of how hard things are to
> compute.

What's the difference? If Joe Shmoe thinks computable means P, then by
your definition, computability is the study of what's in P. If Sally
Shmoe thinks computable means NP, then by your definition, computability
is the study of what's in NP. If John Shmoe thinks computable means
Post-Turing computable, then by your definition, computability is the
study of what Post-Turing machines can do.

Do you distinguish between the study of measurable sets and integrable
functions and the study of different types of measures and integrals?

> Complexity theory divides problems into classes like P and NP
> and studies the relationships between the problem classes.

The fact is that computability theorists sit around talking about P
versus NP and all the other classes; the more theoretical ones think
about Church's thesis and Gödel's theorem.  Measure theorists sit around
talking about Lebesgue integrals versus Daniell integrals; the more
theoretical ones think about more general classes. Your distinction is
a play upon words and does not match how people think.

(Show me a so-called ``computability'' theorist who doesn't think about
P and NP, and I'll show you a logician.)

> >Practical complexity theory (what ``computational complexity theory''
> >used to mean) studies fast algorithms for practical problems.
> 
> Wrong.  There is complexity theory, and there are applications of complexity
> theory.  Your "practical complexity theory" is simply applications.

Which, as I said, is what ``computational complexity theory'' used to
mean. What's wrong with my statement?

Perhaps I should have used the word ``applied'' instead of ``practical.''

> People studying complexity theory often develop new algorithms to solve
> problems, but that is a side issue to showing the complexity class of
> the problem.

Oh, give me a break. I recently proved that if n-digit multiplication
and addition take time O(M(n)) where n^a = O(M(n)) for some a > 1, then
n digits of e can be computed in time O(M(n)). (The best previous result
is O(M(n) log n), which also applies for a = 1, if you care.) I had to
develop a new algorithm to achieve this result; do you really think I
consider the algorithm a ``side issue''? Harrrumph.
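
For the curious, the classic route to n digits of e is binary splitting
on the series e = 1 + sum_{k>=1} 1/k!.  What follows is a minimal
sketch of that classic method -- the one with the O(M(n) log n) bound I
cited as the previous best -- not the improved algorithm; Python is used
only because it has built-in big integers, and the function names are
mine:

    import math

    def split(a, b):
        # Returns integers (p, q) with
        #   p/q = sum_{k=a+1}^{b} 1/((a+1)(a+2)...k),
        # so for (p, q) = split(0, N), 1 + p/q approximates e
        # with error below 2/(N+1)!.
        if b - a == 1:
            return 1, a + 1
        m = (a + b) // 2
        p1, q1 = split(a, m)
        p2, q2 = split(m, b)
        # Merge: R(a,b) = R(a,m) + R(m,b) / Q(a,m)
        return p1 * q2 + p2, q1 * q2

    def e_digits(n):
        # Choose N so that N! comfortably exceeds 10^(n+5).
        N = 2
        while math.lgamma(N + 1) < (n + 5) * math.log(10):
            N += 1
        p, q = split(0, N)
        # Truncated, not rounded; rare boundary cases may misround.
        return 10 ** n + (p * 10 ** n) // q

    print(e_digits(20))    # prints: 271828182845904523536

The log n factor is the depth of the merge tree; shaving it off is
precisely the kind of result claimed above.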

The travelling salesman problem is computability theory.  Computing e
is computational complexity theory (``applied complexity theory'' if
you like).

> >Particular problems in practical complexity theory (e.g., computing pi
> >to many digits; or computing the sine of a floating-point number) have
> >quite a lot to do with numerical analysis.
> 
> Wrong.  Computing digits of Pi or computing the sine function have nothing
> to do with complexity theory (though these problems may be classified
> into a complexity class).  How to do the computation of Pi or sine is
> numerical analysis.  How difficult it is to compute those things is an
> application of complexity theory.

I stand by my statement that those particular problems have a lot to do
with numerical analysis. Your ``how to do it'' versus ``how hard it is''
is silly.

Your paragraph (past ``wrong'') doesn't contradict my statement, but
instead says that those problems ``have nothing to do with complexity
theory.'' Again you're playing with words; would you agree that they are
``applied complexity theory''?

> >(I'm in math, not computer science. Computability theory is definitely
> >CS; numerical analysis is definitely math; computational complexity
> >theory is somewhere in between.)
> 
> Wrong.  Complexity theory is definitely computer science.

Read my lips. Computability theory (what you would call theoretical
complexity theory) is definitely computer science. Computational
complexity theory (what you would call applied complexity theory)
is floating somewhere in between. As I have done a bit of work in the
subject and keep up with the computational complexity journals, I am
confident in making these assertions.

> Also, there are two kinds of numerical analysis, the kind done by
> mathematicians and the kind done by computer scientists (or engineers).
> Mathematicians deal with abstract numbers like integers and reals.
> Computer scientists deal with limited-precision numbers.

Please tell me what separates those two ``kinds.'' Can you tell the
articles apart in SIAM JNA or Math. Comp.? Can you divide the journals
into parts, saying, ``Now this has a real number in it... but this deals
with limited precision numbers... but this one has an integer in it...''

> Perhaps mathematicians don't understand computer science.

Oh, really. I suppose you insist that Knuth is not a mathematician.

---Dan

mike@cs.arizona.edu (Mike Coffin) (12/01/89)

[Regarding the relationship between mathematics and computer science]

From article <34795@regenmeister.uucp> (Chris Prael):
> My observation and experience is that this relationship is
> historical in origin.  It traces to the fact that most computer
> science departments were spun off from mathematics departments. ...
> There are two other sources of the mathematical misapprehension.
> The first is that the earliest applications performed with computers
> were mainly numerical analysis problems. ... The other is the fact
> that programs are completely abstract.

I think the alleged ``misapprehension'' has more to do with the fact
that computer programs are algorithms than the reasons suggested
above.  Algorithms have been, at least since Euclid, the realm of
mathematicians.  A large part (but not all) of mathematics has been
the development of algorithms and proofs of their correctness.  Since
a very large portion of computer science is the design and analysis of
algorithms, it is natural that mathematicians would find the field
interesting and that computer scientists would find mathematical
reasoning useful.
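
Euclid's gcd is the canonical case: the procedure and its correctness
argument arrive together, two millennia before any machine could run it.
(A minimal sketch, in Python purely for concreteness:)

    def gcd(a, b):
        # Invariant: gcd(a, b) is unchanged by each step, because any
        # common divisor of a and b also divides a mod b
        # (Elements, Book VII, Propositions 1-2).
        while b:
            a, b = b, a % b
        return a

    assert gcd(1071, 462) == 21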

-- 
Mike Coffin				mike@arizona.edu
Univ. of Ariz. Dept. of Comp. Sci.	{allegra,cmcl2}!arizona!mike
Tucson, AZ  85721			(602)621-2858

chrisp@regenmeister.uucp (Chris Prael) (12/01/89)

From article <15812@megaron.cs.arizona.edu>, by mike@cs.arizona.edu (Mike Coffin):
> [Regarding the relationship between mathematics and computer science]
> 
> From article <34795@regenmeister.uucp> (Chris Prael):
>> My observation and experience is that this relationship is
>> historical in origin.  It traces to the fact that most computer
>> science departments were spun off from mathematics departments. ...
>> There are two other sources of the mathematical misapprehension.
>> The first is that the earliest applications performed with computers
>> were mainly numerical analysis problems. ... The other is the fact
>> that programs are completely abstract.
> 
> I think the alleged ``misapprehension'' has more to do with the fact
> that computer programs are algorithms than the reasons suggested
> above.  Algorithms have been, at least since Euclid, the realm of
> mathematicians.

Except when they were the realm of physicists, chemists, engineers, ...
Algorithm is nothing but a pompous synonym for the word "process".  All
forms of engineering deal equally in designing and implementing
processes.

>  A large part (but not all) of mathematics has been
> the development of algorithms and proofs of their correctness.  

Not for a very long time (with certain minor exceptions).  Mathematics
deals primarily in discovering the implications of sets of postulates.
A paraphrase is: mathematics deals in determining the content of sets
defined by a small number of identifying elements.  A proof is a formal
demonstration that a non-identifying element is a member of the set.
The term "correctness" is content free (ie. meaningless) in mathematics.

Algorithms have historically been the province of people who got their 
hands dirty, unless they were bookkeepers.

Chris Prael

mike@cs.arizona.edu (Mike Coffin) (12/07/89)

From article <34878@regenmeister.uucp>, (Chris Prael):
> Except when they were the realm of physicists, chemists, engineers, ...
> Algorithm is nothing but a pompous synonym for the word "process".  All
> forms of engineering deal equally in designing and implementing
> processes.

"Algorithm" is not a synonym for "process."  In the usual sense of the
term an "algorithm" is a procedure for solving a *mathematical* problem.
If you want to call the procedure for building a bridge or making
coffee an algorithm I won't argue with you, but in doing so you make
the term almost vacuous.  Since "process" is a perfectly good term for
such activity, why not use "algorithm" in the more precise sense?
(By the way, I'm curious: why do you think "algorithm" is pompous?)

>>  A large part (but not all) of mathematics has been
>> the development of algorithms and proofs of their correctness.  
> Not for a very long time (with certain minor exceptions).  Mathematics
> deals primarily in discovering the implications of sets of postulates.
> A paraphrase is: mathematics deals in determining the content of sets
> defined by a small number of identifying elements.  A proof is a formal
> demonstration that a non-identifying element is a member of the set.
> The term "correctness" is content-free (i.e., meaningless) in mathematics.

This is sometimes the way mathematics is presented, but it's hardly
ever the way it is invented.  Newton invented the calculus in order to
derive an algorithm for predicting the motion of the planets.  He
didn't start with a collection of postulates, start deriving
inferences, and then suddenly realize that he had predicted elliptical
orbits.  Even seemingly abstract branches of mathematics were usually
invented for a purpose, although some textbooks seem bent on
concealing it.  If you take a good look at the work of great (and, for
that matter, lesser) mathematicians you will find that they almost
uniformly had solid practical reasons for the mathematics they
invented.
-- 
Mike Coffin				mike@arizona.edu
Univ. of Ariz. Dept. of Comp. Sci.	{allegra,cmcl2}!arizona!mike
Tucson, AZ  85721			(602)621-2858

chrisp@regenmeister.uucp (Chris Prael) (12/08/89)

From article <15966@megaron.cs.arizona.edu>, by mike@cs.arizona.edu (Mike Coffin):
> From article <34878@regenmeister.uucp>, (Chris Prael):
>> Except when they were the realm of physicists, chemists, engineers, ...
>> Algorithm is nothing but a pompous synonym for the word "process".  All
>> forms of engineering deal equally in designing and implementing
>> processes.

A modest correction: "algorithm" as used (or abused) within the
computing community.
 
> "Algorithm" is not a synonym for "process."  In the usual sense of the
> term an "algorithm" is a procedure for solving a *mathematical* problem.
> If you want to call the procedure for building a bridge or making
> coffee an algorithm I won't argue with you, but in doing so you make
> the term almost vacuous.

I reckon that you have understood my meaning well.  Just apply it to
"computing" in the current sense of the word.

> (By the way, I'm curious: why do you think "algorithm" is pompous?)

Pompous adj. 1. Self-important; pretentious;
                                ^^^^^^^^^^^
>> Mathematics
>> deals primarily in discovering the implications of sets of postulates.
>> A paraphrase is: mathematics deals in determining the content of sets
>> defined by a small number of identifying elements.  A proof is a formal
>> demonstration that a non-identifying element is a member of the set.
>> The term "correctness" is content-free (i.e., meaningless) in mathematics.
 
> This is sometimes the way mathematics is presented, but it's hardly
> ever the way it is invented.

I described the way mathematicians understand the field.  It is not the
way non-mathematicians (mis)understand the field.  I was once enough of a
mathematician to know that.  With a small handful of exceptions, one of
which Mike mentions below, my description is, indeed, how mathematics
grows.

> Newton invented the calculus in order to
> derive an algorithm for predicting the motion of the planets.  

Quite true.  But Newton was a physicist who invented some mathematics.
That does not make him a mathematician.  It did, and does, make him a
remarkable physicist.

> If you take a good look at the work of great (and, for
> that matter, lesser) mathematicians you will find that they almost
> uniformly had solid practical reasons for the mathematics they
> invented.

I did.  An extensive look.  With very rare exceptions, the solidly 
practical reason for the invention of most new mathematics has been the
pursuit of mathematics.

Chris Prael