sam@castle.ed.ac.uk (S Manoharan) (03/07/91)
In article <1440@fang.dsto.oz> dch@aeg.dsto.oz.au (Dave Hanslip) writes: >5. do you use any CASE tools for documentation? I use TUGboat's trees.sty to draw the class hierarchies. It is really handy. I too would be interested in hearing about the tools available for documenting OO systems.
dch@aeg.dsto.oz.au (Dave Hanslip) (03/07/91)
I've read every OOA/D/P text I can lay my hands on including Booch, Coad and Yourdon, Wirfs-Brock et al and Rumbaugh et al. I'd like to know what standards software developers are using for documenting OO systems. 1. are you relying totally on diagrams and if so whose? 2. are you using diagrams supplemented by text? 3. how are you documenting system architecture? 4. how are you structuring your system documentation? 5. do you use any CASE tools for documentation? If I receive any useful responses I'll post a summary. David C. Hanslip E-mail: dch@aeg.dsto.oz.au Aeronautical Research Laboratory Phone: +61 8 259 5792 DSTO Salisbury, South Australia Fax: +61 8 259 5507
root@NEXTSERVER.CS.STTHOMAS.EDU (Max Tardiveau) (03/08/91)
In article <1440@fang.dsto.oz> dch@aeg.dsto.oz.au (Dave Hanslip) writes: >5. do you use any CASE tools for documentation? I use a program called makedoc to create parts of the documentation of my Objective-C code. It extracts comments and creates TeX files. Pretty nice. It also draws the inheritance hierarchy. Best tool of its kind that I've seen. Of course, some people will say that it's really the opposite of what you're supposed to do. But I design my objects beforehand. This tool is nice because it allows me to keep the details of the implementation of the objects with the code, not in some other document that I have to keep in sync with the code. -- Max Tardiveau (m9tardiv@cs.stthomas.edu)
cote@subsys.enet.dec.com (Michael P.A. Cote) (03/08/91)
My group started an OO based project a couple of months ago. We spent some time trying to decide on a process for doing the development since OO was completely new to the group. After looking at the available CASE tools (Software Through Pictures, Cadre Teamwork, and DECDesign) we decided that they did not provide us with any real benefit directly related to our OO process. (NOTE: Since that time these products have added some OO support.)

We decided that we would use a text based approach to documenting the analysis and design of our system. The diagrams that we do draw are done using a free form drawing tool. The diagrams are compatible with Booch's Class and Object diagrams (Booch, OOD with Examples, 1990). The diagrams are further explained in the accompanying text where needed.

In addition, our analysis has identified classes that exist in the system. These classes are documented using a documentation template which includes:

    Class Name
    Overview
    SuperClasses
    Attributes
    Relations
    Detailed Description

Since we are still in the analysis phase of the project, this documentation provides only general words about how the object operates. To document the design, we will baseline the analysis phase documentation and then begin transforming the class analysis to class design, including defining parameters and writing PDL. In addition, the design phase will document the overall architecture of the system as well as some other miscellaneous topics such as startup and shutdown issues. In parallel with the design of the system we will be making a prototype as a proof of design.

Our documentation is being written using VAX/Document.

MPAC

Disclaimer: Opinions are mine and do not reflect any Digital Equipment Corp. policy.
cdurrett@cup.portal.com (chuck m durrett) (03/13/91)
Wouldn't it be nice if somehow the source were so simple it documented itself ? 8-) Chuck Durrett 313 524 8743
rick@tetrauk.UUCP (Rick Jones) (03/14/91)
In article <40084@cup.portal.com> cdurrett@cup.portal.com (chuck m durrett) writes: |>Wouldn't it be nice |> if somehow |> the source were so simple |> it documented |> itself |>? Ever tried Eiffel ? -- Rick Jones, Tetra Ltd. Maidenhead, Berks, UK rick@tetrauk.uucp The dynamics of a freeway approximates to a shoal of fish in 1.5 dimensions
barriost@gtephx.UUCP (Tim Barrios) (03/20/91)
In article <1114@tetrauk.UUCP>, rick@tetrauk.UUCP (Rick Jones) writes: > In article <40084@cup.portal.com> cdurrett@cup.portal.com (chuck m durrett) writes: > |>Wouldn't it be nice > |> if somehow > |> the source were so simple > |> it documented > |> itself > Ever tried Eiffel ? _or_ Ada (as opposed to "|| C++" :-) -- Tim Barrios, AG Communication Systems, Phoenix, AZ UUCP: ...!{ncar!noao!asuvax | uunet!zardoz!hrc | att}!gtephx!barriost Internet: gtephx!barriost@asuvax.eas.asu.edu voice: (602) 582-7101 fax: (602) 581-4022
glenn@huxley.huxley.bitstream.com (Glenn P. Parker) (03/22/91)
In article <271@orbit.gtephx.UUCP> barriost@gtephx.UUCP (Tim Barrios) writes: > _or_ Ada > > (as opposed to "|| C++" :-) You'll eat those words (bytes? :-), since it looks like X3J16 will adopt "or" as a pseudonym for "||" as part of its internationalization effort. -- Glenn P. Parker glenn@bitstream.com Bitstream, Inc. uunet!huxley!glenn 215 First Street BIX: parker Cambridge, MA 02142-1270
jls@rutabaga.Rational.COM (Jim Showalter) (03/22/91)
>> Ever tried Eiffel ? >_or_ Ada >(as opposed to "|| C++" :-) No shit! Most C++ is indistinguishable from line noise. Ada, properly written, looks like an English language description of the processing being performed, with the added advantage that it also executes. (There is no need for pseudo-code, or even much need for comments.) As I continually remind my students, programs are written by people for other PEOPLE, not for computers. It is the job of the COMPILER to translate a human-readable program into a machine-executable program. C hackers for some perverse reason want to eliminate the middleman and write programs for the computer directly, resulting in unreadable, incomprehensible, unmaintainable code. I used to work in a C shop and many times observed programmers floundering around trying to understand code they THEMSELVES had written just a few weeks earlier. Techno-weenies may see nothing wrong with this, but any manager with the ability to do simple sums should be able to conclude that he's pissing money down a rathole letting a process like this exist. -- ***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd ever be able to find a company (or, for that matter, very many people) with opinions like mine. -- "When I want your opinion, I'll read it in your entrails."
hsrender@happy.colorado.edu (03/23/91)
In article <jls.669588928@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes:
> Ada, properly written, looks like an English language
> description of the processing being performed, with the
> added advantage that it also executes. (There is no need
> for pseudo-code, or even much need for comments.)

Funny, that's what COBOL coders keep telling me. After all, which is more like English:

    ADD 1 TO A GIVING B

or

    B := A + 1

Just a thought.

hal.
ark@alice.att.com (Andrew Koenig) (03/23/91)
In article <jls.669588928@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: > No shit! Most C++ is indistinguishable from line noise. > Ada, properly written, looks like an English language > description of the processing being performed, with the > added advantage that it also executes. (There is no need > for pseudo-code, or even much need for comments.) That was the rationale for Cobol too, in the 1950s. It didn't work then either. -- --Andrew Koenig ark@europa.att.com
jls@rutabaga.Rational.COM (Jim Showalter) (03/24/91)
>In article <271@orbit.gtephx.UUCP> barriost@gtephx.UUCP (Tim Barrios) writes: >> _or_ Ada >> >> (as opposed to "|| C++" :-) >You'll eat those words (bytes? :-), since it looks like X3J16 will adopt >"or" as a pseudonym for "||" as part of its internationalization effort. I'll be damned: it only took the C community 25 years to realize something that has always been intuitively obvious to language designers/trainers--that English language words are superior to cryptic symbols. NEXT they'll realize that "begin" and "end" are preferable to '{' and '}' and, with just a few more similar realizations like this, they'll turn C into something closely resembling Pascal. Sheesh. By the way, I always encourage my students to use a standard include file that #define's "or" to mean "||", "and" to mean "&&", etc etc etc. It's amazing how readable one can make C if one tries. -- ***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd ever be able to find a company (or, for that matter, very many people) with opinions like mine. -- "When I want your opinion, I'll read it in your entrails."
hsrender@happy.colorado.edu (03/25/91)
In article <299@orbit.gtephx.UUCP>, barriost@gtephx.UUCP (Tim Barrios) writes:
> In article <1991Mar22.120946.1@happy.colorado.edu>, hsrender@happy.colorado.edu writes:
>> After all, which is more like English:
>>
>>     ADD 1 TO A GIVING B
>>
>> or
>>
>>     B := A + 1
>
> right answer, wrong question. we can all think of many
> reasons why English isn't what's desirable here. a language to most
> accurately represent the problem domain (ie, software design or
> requirements) is what is desired. then, for most software engineering
> applications, the real question becomes which of the following is more
> like engineering:
>
>     add 1 to a.     english, business
>     a := a + 1;     -- math, engineering, science
>     a++;            // made-up; worrying too much about the underlying target

Your answer raises another question: why is the second option more readable than the third? (I'll ignore the question of which is more like engineering, because I have no idea how one measures the "engineeringness" of an expression.) I would postulate that it is because of experience. Most engineers learn to program in some form of block-structured language, so anything that looks like a block-structured language is immediately recognizable to them. But is this a better means of expression? I recall that at Syracuse they were teaching some of their intro students to program in Prolog, and they reported no ensuing problems in later CS courses. At Indiana and MIT I believe they teach Scheme to intro people, and both of them seem to produce good CS people. So, is it simply convention that makes us prefer "a := a + 1" to "a++"? After all, I never saw anything even vaguely like "a := a + 1" in any math or science course, so I can't believe I had any prior familiarity with it.

hal.
barriost@gtephx.UUCP (Tim Barrios) (03/26/91)
In article <1991Mar22.120946.1@happy.colorado.edu>, hsrender@happy.colorado.edu writes:
> After all, which is more like English:
>
>     ADD 1 TO A GIVING B
>
> or
>
>     B := A + 1

right answer, wrong question. we can all think of many reasons why English isn't what's desirable here. a language to most accurately represent the problem domain (ie, software design or requirements) is what is desired. then, for most software engineering applications, the real question becomes which of the following is more like engineering:

    add 1 to a.     english, business
    a := a + 1;     -- math, engineering, science
    a++;            // made-up; worrying too much about the underlying target

-- Tim Barrios, AG Communication Systems, Phoenix, AZ UUCP: ...!{ncar!noao!asuvax | uunet!zardoz!hrc | att}!gtephx!barriost Internet: gtephx!barriost@asuvax.eas.asu.edu voice: (602) 582-7101 fax: (602) 581-4022
lance@motcsd.csd.mot.com (lance.norskog) (03/26/91)
jls@rutabaga.Rational.COM (Jim Showalter) writes: >I'll be damned: it only took the C community 25 years to realize something >that has always been intuitively obvious to language designers/trainers--that >English language words are superior to cryptic symbols. NEXT they'll realize >that "begin" and "end" are preferable to '{' and '}' and, with just a few >more similar realizations like this, they'll turn C into something closely >resembling Pascal. Memory and symbol processing operate largely by association. What you're advocating leads to operator overloading. If you use the same symbol in different contexts, you generally don't get confused. But, differences are gradual. Where two different contexts are almost exactly the same, using the same operator can cause severe confusion. If you want someone to learn a new totality of discourse, it's often better to come up with a completely new symbol system to avoid confusing your use of this or that symbol with a previously learned one. In other words, it's better to use 'cp' to copy a file because 'copy' means many things in many places. As an example, take the DOS and UNIX command line prompts. DOS "sampled" UNIX egregiously, but changed everything slightly. When I was showing Dad how to copy files, I showed him how to do wildcards. UNIX wildcards. The contexts were similar enough that I attempted to apply my vast and amazing UNIX knowledge to DOS, and of course it didn't work. Now, if DOS had said 'frip file1.^^^ file2.^^^' makes a copy of all files starting with file1 called file2 with the same extensions, I would have had no problems. This is sufficiently different that I would have looked it up in the book first. This points up a big problem with GUI's (Grand Unified Interfaces :-) wherein two different programs implement the exact same command set (the common user interface) but do completely different things. If two different programs really implemented the same UI, they would be the same program, right? 
To wander back to the subject of objects, making C into C++ left C behind and created an even bigger mess, because it tried to stay within the mental context which understands C syntax. Lance
jls@rutabaga.Rational.COM (Jim Showalter) (03/26/91)
>Funny, that's what COBOL coders keep telling me. >After all, which is more like English: > ADD 1 to A GIVING B >or > B := A + 1 >Just a thought. No argument. COBOL does have its heart in the right place when it comes to readability issues, far more than most languages. Sadly, that is about ALL it has going for it. -- ***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd ever be able to find a company (or, for that matter, very many people) with opinions like mine. -- "When I want your opinion, I'll read it in your entrails."
dwwx@cbnewsk.ATT.COM (david.w.wood) (03/26/91)
>In article <jls.669588928@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
>
>> No shit! Most C++ is indistinguishable from line noise.
>> Ada, properly written, looks like an English language
>> description of the processing being performed, with the
>> added advantage that it also executes. (There is no need
>> for pseudo-code, or even much need for comments.)
>
>That was the rationale for Cobol too, in the 1950s.
>
>It didn't work then either.

As much as I would like to agree with Andrew about this, I'm afraid the evidence is against him. The business world (and DoD) has too many millions of lines of COBOL to claim that it didn't work (and more are being written every year).

As to Jim's statement that Ada looks like English, he must be using a different version of English than I am (probably a DoD approved/validated version :-). I don't believe that anyone would say the same about C++ (but then again...). Anyway, I personally DON'T want to write code in English. I program in C++, C, Lisp, KSH, AWK, ICON on a day-to-day basis. They all suffer from the same problem: they are too far removed from the problem domain semantics in which I am trying to solve problems (of course with C++ and Lisp I can build up the semantics and syntactic sugars to write "natural" solutions, but I want to solve problems ASAP, not spend weeks/months building a vocabulary to solve the problem).

To get back to the root of all these postings: so as far as productivity is concerned, Ada and C++ are the same within a constant factor.

David Wood att!bubba!dww
ark@alice.att.com (Andrew Koenig) (03/26/91)
In article <299@orbit.gtephx.UUCP> barriost@gtephx.UUCP (Tim Barrios) writes: > add 1 to a. english, business > a := a + 1; -- math, engineering, science > a++; // made-up; worrying too much about the underlying target Why do you suppose programmers use the word "increment" so often? Could it be that it's a useful concept? -- --Andrew Koenig ark@europa.att.com
amanda@visix.com (Amanda Walker) (03/27/91)
In article <jls.669768656@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: I'll be damned: it only took the C community 25 years to realize something that has always been intuitively obvious to language designers/trainers--that English language words are superior to cryptic symbols. It all depends on the application, and how you define "superior." I'd rather do vector calculus with "cryptic" symbols than as word problems, for example :). Maybe we should make everything a word ... and we could do away with that pesky syntax stuff ... Hmm. We seem to have reinvented Lisp. Now, personally, I think Lisp is the right way to do programming, but I will admit to being weird :). -- Amanda Walker amanda@visix.com Visix Software Inc. ...!uunet!visix!amanda -- It is a vast and wonderful universe, but you wouldn't know it to live here.
jordanbo@i88.isc.com (Jordan Boucher) (03/27/91)
In article <20106@alice.att.com> ark@alice.UUCP () writes: >In article <jls.669588928@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: > >> No shit! Most C++ is indistinguishable from line noise. >> Ada, properly written, looks like an English language >> description of the processing being performed, with the >> added advantage that it also executes. (There is no need >> for pseudo-code, or even much need for comments.) > >That was the rationale for Cobol too, in the 1950s. > >It didn't work then either. I'm not going to take one side or the other (explicitly ;). BUT... 1) C++ is line noise? Ada, _properly written_, looks like what? The readability of code depends ENTIRELY on the person writing that code, not the language itself! Some languages are prone to (or at least allow) bad/ugly style. Any self-disciplined software engineer would not get caught up in the "neatness of tricks" a language may offer, which are the types of traps that cause unreadable code. I'm sure Ada, NOT properly written, would look almost as ugly as the majority of the C code of this world. But, I'm also very sure that quality software engineers are capable of creating readable C/C++ code (read 'properly written'). You do say "Most C++", which leads me to believe that you have been reading code from people of a Hacker background, and not SWE. I agree (in concept) with what was said, but not in how it was stated. The plain truth is Hackers write "line noise" type code, independent of the language, and SW Engineers write readable code given the same tools. 2) To say Ada doesn't work is pretty bold! Or are you just against someone understanding their code 2 days after they write the stuff? Or the poor slob that has to support it after it gets released? It's easy to take shots at a language of the 50s and say it doesn't work today (I agree that Cobol stinks BTW, but we don't do MIS either). To take a shot at a language of the 60s and 70s, let's say C, would be pretty easy too. 
I won't do that because you were defending C++ (I think), which is at least an 80s language. -- / Jordan Boucher | INTERACTIVE Systems Corporation \ | | email: jordanbo@i88.isc.com | | #include <std/quote.h> | phone: (708) 505-9100 x272 | \ #include <std/disclaimer.h> | fax : (708) 505-9133 /
lance@motcsd.csd.mot.com (lance.norskog) (03/27/91)
ark@alice.att.com (Andrew Koenig) writes: >In article <jls.669588928@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: >> Ada, properly written, looks like an English language >> description of the processing being performed, with the >> added advantage that it also executes. ... >That was the rationale for Cobol too, in the 1950s. >It didn't work then either. "With this new Cobol programming language, there will be no more need for programming as a separate job description. Everyone will write programs. The boss will dictate programs to his secretary, who will keypunch them for him." - paraphrase from memory of some particularly ripe Cobol hype from a more innocent time Lance Norskog America is innocent again. It's so pleasant, don't you think?
jls@rutabaga.Rational.COM (Jim Showalter) (03/27/91)
>> No shit! Most C++ is indistinguishable from line noise. >> Ada, properly written, looks like an English language >> description of the processing being performed, with the >> added advantage that it also executes. (There is no need >> for pseudo-code, or even much need for comments.) >That was the rationale for Cobol too, in the 1950s. >It didn't work then either. Define "work". There are more lines of COBOL out there than anything else, so it must have worked for somebody. Secondly, I think the goals of readability WERE satisfied by COBOL--the problem is that everything ELSE is screwed up in that language. P.S. For extra credit, pose a credible argument in FAVOR of code that is hard to read. -- ***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd ever be able to find a company (or, for that matter, very many people) with opinions like mine. -- "When I want your opinion, I'll read it in your entrails."
jls@rutabaga.Rational.COM (Jim Showalter) (03/27/91)
>So, is it simply convention that makes us prefer "a := a + 1" to "a++"? After
>all, I never saw anything even vaguely like "a := a + 1" in any math or
>science course, so I can't believe I had any prior familiarity with it.

Huh? Did you never see?:

    7 = 6 + 1
    X = Y * Z
    T = Sin (X)
    W2 = W1 + 1

I saw stuff like that all the way from elementary school onwards. What I NEVER saw was:

    P++

-- ***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd ever be able to find a company (or, for that matter, very many people) with opinions like mine. -- "When I want your opinion, I'll read it in your entrails."
diamond@jit345.swstokyo.dec.com (Norman Diamond) (03/27/91)
In article <jls.670044400@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: [attribution deleted by Mr. Showalter]: >>So, is it simply convention that makes us prefer "a := a + 1" to "a++"? After >>all, I never saw anything even vaguely like "a := a + 1" in any math or >>science course, so I can't believe I had any prior familiarity with it. > >Huh? Did you never see?: > 7 = 6 + 1 [etc.] >I saw stuff like that all the way from elementary school onwards. What I >NEVER saw was: > P++ Exactly. Lots of us have seen lots of equals signs, and learn to associate equals signs with equality. We didn't learn to associate equals signs with ASSIGNMENT (and possible inequality) until we learned Fortran, which might or might not have been in elementary school. And most of us didn't see either colons followed by equals signs, or double plus signs, in elementary school. Unless we learned Algol then; but we certainly didn't see them in math classes. -- Norman Diamond diamond@tkov50.enet.dec.com If this were the company's opinion, I wouldn't be allowed to post it.
spicer@tci.UUCP (Steve Spicer) (03/27/91)
In article <1991Mar26.170848.15936@visix.com> amanda@visix.com (Amanda Walker) writes: >In article <jls.669768656@rutabaga> jls@rutabaga.Rational.COM (Jim >Showalter) writes: > > I'll be damned: it only took the C community 25 years to realize > something that has always been intuitively obvious to language > designers/trainers--that English language words are superior to > cryptic symbols. > >It all depends on the application, and how you define "superior." I'd >rather do vector calculus with "cryptic" symbols than as word problems, >for example :). > Amanda is right, and no smiley is needed. If English language (what about German, French, ...) words are superior to "cryptic symbols" then just think about what might have been possible had only Newton, Gauss, Russell (or in our own field, Knuth and Dijkstra), etc. not been diverted from working in "natural language" terms. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Steven Spicer/spicer@tci.bell-atl.com Is your design so simple that there are obviously no deficiencies, or so complicated that there are no obvious deficiencies? -- suggested by a quote from C.A.R. Hoare
lance@motcsd.csd.mot.com (lance.norskog) (03/28/91)
And furthermore, graphics-based PC's have been shipping for 15 years. Why am I still coding in text? Why can't I just push pictures around? Lance Norskog
schwartz@groucho.cs.psu.edu (Scott Schwartz) (03/28/91)
Lots of us saw left-arrow's in high school algebra. (I forget what book I used... Dolciani?) A := is very reminiscent of that; most people clue-in instantly. This thread is very very silly.
jgk@osc.COM (Joe Keane) (03/28/91)
Personally, i think arguing over `=' versus `:=' is silly. When people get hot over this issue, you know they don't have anything important to worry about. Usually in this group we argue about more sophisticated stupid issues like operator overloading and polymorphism.

I often see people complain that C doesn't look enough like English. All i can say is, wake up. English sucks as a programming language. Humans tend to deal with ambiguity pretty well, but compilers don't. They take what you said literally, or else they don't take it at all.

Besides that, Americans tend to forget that not everyone in the world speaks English. Things like `+', `0', and `<' are pretty much the same no matter where you go. Are you going to translate `or' into `o', `ou', and any number of other things? That's why i don't like `improvements' like using `or' instead of `|'. I consider this simplifying so much that it's wrong. There's the obvious objection that there are two `or' operators in C. Sometimes you can interchange them, but in general you know which one you want. Clearly the English word isn't very precise.

Similarly i think using the word `real' for a floating-point number is a mistake. Floating-point numbers aren't real numbers, and by glossing over this distinction you can only make things worse. The set of floating-point numbers on a given machine is finite, while the set of real numbers is not only infinite but also non-denumerable. Floating-point addition isn't associative, floating-point numbers don't obey the distributive law, and they don't have exact inverses. Need i go on? My point is that making something simpler at the expense of making it wrong isn't something you want to do in a programming language.

One thing i like about C is that when you type something you know what it's going to do. There are some catches, like operator precedence isn't what it should be in some cases. But in general when you say `a = b', you're pretty sure what the compiler's going to do with this. 
I wish i could say the same thing about C++, but i'm afraid this is one of the big differences between the two languages. There are a number of causes, including inline functions, virtual inheritance, invisible copy constructors, and i'm sure there are more. The end result is that when you say `a = b', you have no idea what the compiler is actually going to put there. So you type a one-line C++ function, but when you look at the assembly version it's 100 lines. (Long ago i gave up reading the cfront output. Compiling it to assembler improves the readability.) This is a little distressing at first, but finally you just have to accept that C++ is far from a WYSIWYG language. -- Joe Keane, C++ hacker
ark@alice.att.com (Andrew Koenig) (03/28/91)
In article <jls.670044262@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes: > P.S. For extra credit, pose a credible argument in FAVOR of code > that is hard to read. Why on earth should I do that? Or are you making the implicit assumption that "easy to read" means "looks like English" ? If English were intrinsically unambiguous, there would be no poetry. But I don't want formal descriptions of algorithms to be poetic, thank you. -- --Andrew Koenig ark@europa.att.com
hsrender@happy.colorado.edu (03/28/91)
In article <jls.670044400@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes:
>>So, is it simply convention that makes us prefer "a := a + 1" to "a++"? After
>>all, I never saw anything even vaguely like "a := a + 1" in any math or
>>science course, so I can't believe I had any prior familiarity with it.
>
> Huh? Did you never see?:
>
>     7 = 6 + 1
>     X = Y * Z
>     T = Sin (X)
>     W2 = W1 + 1
>
> I saw stuff like that all the way from elementary school onwards. What I
> NEVER saw was:
>
>     P++

I should have qualified that. What I never saw was any statement remotely resembling ASSIGNMENT. There's a big difference between an equation and an assignment (unless you don't see any difference between imperative and declarative programming languages). I also didn't say that P++ was better (I rarely use it even in my C code). What I did say was that I think what most people like about certain programming languages is due to training, not some inherent property of the language. Yes, basic user interface design theory states that you should express yourself in a manner as close as possible to the manner of expression of your target subject, but when you are communicating novel objects/actions, this is problematic. When people learn about equations, they think in declarative terms ("A is equal to B + 1", not "A becomes equal to B + 1"). The concept of assignment, while not a big stretch, is different enough to warrant some additional explanation. Personally, that's why I prefer the assignment notation of languages like Smalltalk. As to whether P++ is immediately understandable by the non-programmer, I would say no, but then neither is P := P + 1. The concept of changing the state of a symbolic quantity is not something one learns in algebra (at least not algebra before the college level). You can legitimately argue that P++ is a lousy choice for an increment statement, but it is not that much worse than INC(P) (which several "nicer" languages support). 
Although I had earlier resolved to stay out of this debate, let me state this: I like Ada. I think it combined the features of several good languages, and I think that Stoneman was a good way to ensure that it has good integrated support for its users. BUT, I think that to sell it as the be-all, end-all, world's greatest SE language is a major shuck. There are languages out there that support many of the same features, and even those that don't can be used to engineer quality software systems. The constant harping by Ada supporters on the shortcomings of other languages sounds remarkably like the bleatings of religious zealots about the falseness of other faiths. In reality, Ada is the PL/I of the 80's, and it remains to be seen whether Ada will become a dominant force in software engineering or whether it will, like PL/I, be just an interesting footnote in the evolution of programming languages. If Ada supporters wish to persuade other people to use the language, then they had better do more than knock the languages that other people already use. hal render render@zeppo.colorado.edu
glenn@huxley.huxley.bitstream.com (Glenn P. Parker) (03/29/91)
In article <4693@osc.COM> jgk@osc.COM (Joe Keane) writes: > Besides that, Americans tend to forget that not everyone in the world speaks > English. Things like `+', `0', and `<' are pretty much the same no matter > where you go. Are you going to translate `or' into `o', `ou', and any number > of other things? > > That's why i don't like `improvements' like using `or' instead of `|'. Yes, indeed. Americans (like Joe) also tend to forget that not every language (or keyboard) in the world has `|', `{', `}', `[', and `]'. *That* is why using `or' instead of `||' makes sense. Granted, it's not the correct word in all the other languages, but at *least* they can type it in. What about all the other English-derived C++ keywords, anyway? > I consider this simplifying so much that it's wrong. As I hope I've made clear, it's not a question of simplification, but rather accessibility. > There's the obvious objection that there are two `or' operators in C. > Sometimes you can interchange them, but in general you know which one you > want. Clearly the English word isn't very precise. The x3j16 proposal suggests `bitor' for `|' and `or' for `||'. In the context of a C++ specification, the words seem quite precise. > There are a number of causes, including inline functions, virtual > inheritance, invisible copy constructors, and i'm sure there are more. > The end result is that when you say `a = b', you have no idea what the > [C++] compiler is actually going to put there. Honestly, it's not *that* bad. Most of the time, all of the examples you cite can be anticipated and understood without examining the compiler's output. Of course, it's not as easy as it was with C (but that's really an advantage, when you think about it :-). > ...finally you just have to accept that C++ is far from a WYSIWYG > language. Precisely. If you want a "real" WYSIWYG language, use assembler. -- Glenn P. Parker glenn@bitstream.com Bitstream, Inc. 
uunet!huxley!glenn 215 First Street BIX: parker Cambridge, MA 02142-1270
jls@rutabaga.Rational.COM (Jim Showalter) (03/29/91)
> The readability of code depends ENTIRELY on the person writing that
> code, not the language itself! Some languages are prone (or allow)
> bad/ugly style. Any self disciplined software engineer would not
> get caught up in the "neatness of tricks" a language may offer,
> which are the types of traps that cause unreadable code.

I only partially agree with this. I certainly agree that a language is only
as good as those who write it, but a language doesn't exist separate from a
CULTURE that grows up around it. The culture that grew up around C is
hackerish, undisciplined, and tends to produce code that is, indeed,
indistinguishable from line noise. Hell, they even have a special C PUZZLE
section in trade mags celebrating the cleverest hack of the month (this is
poor instinct run amok, in my opinion). The culture that grew up around Ada
from day 1 was oriented toward software engineering. That doesn't mean that
it is always the case that Ada is readable and easy to maintain, or that C
is always unreadable and hard to maintain...but that's certainly the smart
money bet.

As for C++, I am pleased by it because it offers a chance to reboot the
commercial sector. I am displeased by it because the hackers who populate
the C "culture" are the most likely candidates to create the C++
culture...and this bodes not well, not well at all. When teaching C++, I
draw two columns on the whiteboard, one for Pro and one for Con. In the Pro
column I write "Compatible with C". In the Con column I write "Compatible
with C".

> I'm sure Ada, NOT properly written, would look almost as ugly as
> the majority of the C code of this world. But, I'm also very sure
> that quality software engineers are capable of creating readable
> C/C++ code (read 'properly written').

No argument.

> You do say "Most C++", which leads me to believe that you have been
> reading code from people of a Hacker background, and not SWE.

Indeed I have. That's my concern. We need more SWE's, and SOON.

> I agree (in concept) with what was said, but not in how it was stated.
> The plain truth is Hackers write "line noise" type code, independent
> of the language, and SW Engineers write readable code given the same
> tools.

We're in violent agreement here.

> 2) To say Ada doesn't work is pretty bold!

I never said this. Far from it--someone else I was responding to said this.
Gotta love these ASCII newsreaders, yes? Where is hypertext when we need it?
(waiting for fiber, probably)
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
jls@rutabaga.Rational.COM (Jim Showalter) (03/29/91)
>>It all depends on the application, and how you define "superior." I'd
>>rather do vector calculus with "cryptic" symbols than as word problems,
>>for example :).
>>
>Amanda is right, and no smiley is needed.

The last time I looked, APL was a VERY symbolic language. It has also been
described as a WORN language (Write Once Read Never). Going down the path of
pure symbolic notation not only hasn't proven out, but it becomes absurd
when you consider the ramifications: am I to have N banks of keys on my
keyboard to have little symbols for employees, stacks, oranges, and any
other object that happens to pop up while programming? If we grant the
validity of extensible languages, in which user-defined types are
indistinguishable from pre-defined types, then there is essentially no limit
to the number of first-class citizens we would need symbols for. Clearly
this is unworkable, so perhaps the idea of having a symbol for everything
should be scrapped and replaced with the idea of having STRINGS for
everything, since strings ARE infinitely descriptive.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
jls@rutabaga.Rational.COM (Jim Showalter) (03/29/91)
>One thing i like about C is that when you type something you know what it's
>going to do.

Okay, what is this going to do?:

int j (int y, int m, int d) {
int m_d[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
if (m<1 || m>12) return 0;
int l = !(y%4) && y&400;
if (d<1 || d>(m_d[m-1] + (m==2 && y))) return 0;
int dd = 0;
for (int i=0; i<m-1; i++) dd += m_d[i];
return dd + d + (m>2 && y);
}

You should be able to tell me right off the bat--you think English sucks as
a programming language, and this is DEFINITELY not written in English.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
pallas@eng.sun.com (Joseph Pallas) (03/30/91)
In <20131@alice.att.com> ark@alice.att.com (Andrew Koenig) writes:
>If English were intrinsically unambiguous,
>there would be no poetry. But I don't want
>formal descriptions of algorithms to be poetic,
>thank you.

I can't resist dredging these up from "Mathematical Writing":

"I'm thinking about running a contest for the best Pascal program that is
also a sonnet." --Don Knuth

    This algorithm to count bits
    Rotates VALUE one left and sums
      its two's-comp negation
      in a zeroed location
    Repeats WORD LENGTH times, then exits.
        --anonymous

joe
pallas@eng.sun.com (Joseph Pallas) (03/30/91)
In <jls.670218725@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
>Okay, what is this going to do?:
> int j (int y, int m, int d) {
> int m_d[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
> if (m<1 || m>12) return 0;
> int l = !(y%4) && y&400;
> if (d<1 || d>(m_d[m-1] + (m==2 && y))) return 0;
> int dd = 0;
> for (int i=0; i<m-1; i++) dd += m_d[i];
> return dd + d + (m>2 && y);
> }
>You should be able to tell me right off the bat--you think English sucks
>as a programming language, and this is DEFINITELY not written in English.

I can't believe I'm about to waste my time on this, but here goes:

int j (int year, int month, int day) {
int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
if (month<1 || month>12) return 0;
int leap = !(year%4) && year&400;
if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
int dayofmonth = 0;
for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
return dayofmonth + day + (month>2 && year);
}

I've left the name of the function cryptic, but the rest is the result of a
simple global substitution. It becomes immediately apparent not only what
the function is supposed to do, but that it contains at least one error
(surely you meant to check (month==2 && leap)).

Funny thing is, the function is still written in C.

Amazing, isn't it?

joe
jls@rutabaga.Rational.COM (Jim Showalter) (03/30/91)
>BUT, I think that to sell [Ada] as the be-all, end-all,
>world's greatest SE language is a major shuck. There are languages out there
>that support many of the same features,

But not all at the same TIME. See, that's the difference: one language has
exceptions, but lacks genericity. Another language has strong typing, but
lacks representation clauses. Etc etc etc. People fault Ada for being a
"kitchen sink" language, but I prefer to describe it as the only language
that has the minimal spanning set of features necessary to attack problems
ranging from the very fast and tiny (e.g. device drivers) to the incredibly
large (e.g. the Space Station), and everywhere in between. About the only
area I know of where Ada is lame is AI, but I don't find that very
surprising, somehow.

>If Ada
>supporters wish to persuade other people to use the language, then they had
>better do more than knock the languages that other people already use.

I keep trying to make this point: I am one of those "other people"--I didn't
start out day 1 writing in Ada. I've worked in Pascal, C, Modula-2, FORTRAN,
BASIC, COBOL, and Lisp. My appreciation of Ada was forged in the fires of
frustration caused by those other languages.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
jls@rutabaga.Rational.COM (Jim Showalter) (03/30/91)
>> ...finally you just have to accept that C++ is far from a WYSIWYG
>> language.
>Precisely. If you want a "real" WYSIWYG language, use assembler.

Nah, too high-level. Work exclusively in microcode. :-)
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
bs@alice.att.com (Bjarne Stroustrup) (03/30/91)
> When teaching C++, I draw two columns on the whiteboard, one for
> Pro and one for Con. In the Pro column I write "Compatible with C".
> In the Con column I write "Compatible with C".

I happen to have used a slide like that in many (most?) of my C++ talks for
the last 6 years or so. :-)
foobar@dist.dist.unige.it (Maurizio Vitale) (03/30/91)
In article <jls.670218725@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes:
|> >One thing i like about C is that when you type something you know what it's
|> >going to do.
|>
|> Okay, what is this going to do?:
|>
|> int j (int y, int m, int d) {
|> int m_d[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
|> if (m<1 || m>12) return 0;
|> int l = !(y%4) && y&400;
|> if (d<1 || d>(m_d[m-1] + (m==2 && y))) return 0;
|> int dd = 0;
|> for (int i=0; i<m-1; i++) dd += m_d[i];
|> return dd + d + (m>2 && y);
|> }
|>
|> You should be able to tell me right off the bat--you think English sucks
|> as a programming language, and this is DEFINITELY not written in English.

I absolutely do not want to start a flame war, but please do consider:

1. It took only two readings of your code (please trust me) to understand
that your function returns the number of days elapsed between the 1st of
January of the year and the date it is given as argument (provided they are
in the same year; no check, limited usefulness, definitely bad SWE). The
second reading was needed because the first lines biased me into thinking
that the function returned the day of the week; the last lines cleared this
up, and the second reading confirmed the new interpretation.

2. Giving variables single-letter names (even if at least honest initials)
is definitely bad programming practice in *ANY* language.

3. English (as any other natural language, for that matter) is definitely
*NOT* a good programming language. Take some good writer (especially
philosophers) and try to understand at the first read what they are saying.
Then try to find more than two people and a computer who understand the same
thing from what they have read. Either you've got the point or you'll
definitely never get it.

4. The original poster meant that the *one* who writes the code is able to
understand what the computer will do with it, *NOT* that anyone is able to
tell at first glance what someone else wrote down.

5. I hope that the "incredibly smart people" have some more substantial
opinions in the "realm of software engineering" than those they have passed
on to you.

To end my argument: I hope that people stop arguing about which language is
the best to write bad code in, and start writing good code (the really brave
could carry on producing better languages). I will not answer more than one
follow-up to this, since I've more important things to do in my spare time.

Maurizio
--
----------
e-mail: Maurizio.Vitale@dist.dist.unige.it
s-mail: Maurizio Vitale
        via Monaco Simone 1/14
        16148 - GENOVA
        ITALIA
bs@alice.att.com (Bjarne Stroustrup) (03/30/91)
pallas@eng.sun.com (Joseph Pallas) writes

> In <jls.670218725@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter)
> writes:
>
> >Okay, what is this going to do?:
>
> > int j (int y, int m, int d) {
> > int m_d[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
> > if (m<1 || m>12) return 0;
> > int l = !(y%4) && y&400;
> > if (d<1 || d>(m_d[m-1] + (m==2 && y))) return 0;
> > int dd = 0;
> > for (int i=0; i<m-1; i++) dd += m_d[i];
> > return dd + d + (m>2 && y);
> > }
>
> >You should be able to tell me right off the bat--you think English sucks
> >as a programming language, and this is DEFINITELY not written in English.
>
> I can't believe I'm about to waste my time on this, but here goes:
>
> int j (int year, int month, int day) {
> int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
> if (month<1 || month>12) return 0;
> int leap = !(year%4) && year&400;
> if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
> int dayofmonth = 0;
> for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
> return dayofmonth + day + (month>2 && year);
> }
>
> I've left the name of the function cryptic, but the rest is the result
> of a simple global substitution. It becomes immediately apparent not
> only what the function is supposed to do, but that it contains at
> least one error (surely you meant to check (month==2 && leap)).
>
> Funny thing is, the function is still written in C.
>
> Amazing, isn't it?

Actually, we discover that the function was never written in C. Note the way
variables are declared exactly where they are needed. This is C++. Also,
total lack of indentation to make the code more obscure is not too common in
the C/C++ world.
int j (int year, int month, int day)
{
    int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};

    if (month<1 || month>12) return 0;
    int leap = !(year%4) && year&400;
    if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
    int dayofmonth = 0;
    for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
    return dayofmonth + day + (month>2 && year);
}

A comment or two would also help.
chip@tct.uucp (Chip Salzenberg) (03/31/91)
According to jls@rutabaga.Rational.COM (Jim Showalter):
>>One thing i like about C is that when you type something you know what it's
>>going to do.
>
>Okay, what is this going to do?:
>
> int j (int y, int m, int d) {
> int m_d[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
> if (m<1 || m>12) return 0;
> int l = !(y%4) && y&400;
> if (d<1 || d>(m_d[m-1] + (m==2 && y))) return 0;
> int dd = 0;
> for (int i=0; i<m-1; i++) dd += m_d[i];
> return dd + d + (m>2 && y);
> }

Actually, this code isn't legal C, what with the declarations in the middle
of the function. And it's badly formatted. And it has a bug or two -- I
think the two "&& y" phrases should be "&& l". And the rule for leap years
is "(y%4)==0 && ((y%100)!=0 || (y%400)==0)". Nevertheless, it is easily
recognized as the classic month/day/year to Julian day conversion, with zero
as the error return.

>You should be able to tell me right off the bat--you think English sucks
>as a programming language, and this is DEFINITELY not written in English.

Sigh. Jim, you fell victim to one of the classic blunders: "English = Bad"
does NOT necessarily imply "!English = Good".
--
Chip Salzenberg at Teltronics/TCT <chip@tct.uucp>, <uunet!pdn!tct!chip>
"All this is conjecture of course, since I *only* post in the nude.
 Nothing comes between me and my t.b.  Nothing." -- Bill Coderre
chip@tct.uucp (Chip Salzenberg) (03/31/91)
According to jls@rutabaga.Rational.COM (Jim Showalter):
>The culture that grew up around C is hackerish, undisciplined, and tends
>to produce code that is, indeed, indistinguishable from line noise.

Prejudice rears its ugly head. Jim, any tool *must* be evaluated without
reference to the milieu from which it sprang. To do otherwise is to discard
ideas because of the people espousing them, which is a truly close-minded
course.
--
Chip Salzenberg at Teltronics/TCT <chip@tct.uucp>, <uunet!pdn!tct!chip>
"All this is conjecture of course, since I *only* post in the nude.
 Nothing comes between me and my t.b.  Nothing." -- Bill Coderre
jwohl@eeserv1.ic.sunysb.edu (Jeremy Wohl) (03/31/91)
In article <1991Mar28.104619.1@happy.colorado.edu> hsrender@happy.colorado.edu writes:
>In article <jls.670044400@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes:
>>> [...]
>>
>> Huh? Did you never see?:
>> [...]
>>      T = Sin (X)
>>
>>      W  = W  + 1
>>       2    1
>>
>> I saw stuff like that all the way from elementary school onwards. What I
>> NEVER saw was:
>> [...]
>
> [...]

I can't wait for the soon-to-be-released-by-several-companies pen computers
to appear, and proper software dev. environments to indulge the software
engineer. Not only can a programmer apply common mathematical notation
(subscripts, sets, vector calculus, etc.), but notation can be expanded to
include concepts such as object private/public members, relationships to
other objects, etc.

English can't possibly be my dream notation. I deal in algorithms, which can
most succinctly be described in standard math symbols.
--
Jeremy Wohl / wohl@max.physics.sunysb.edu / jwohl@csserv1.ic.sunysb.edu
amanda@visix.com (Amanda Walker) (04/02/91)
I wrote:
>>It all depends on the application, and how you define "superior." I'd
>>rather do vector calculus with "cryptic" symbols than as word problems,
>>for example :).

jls@rutabaga.Rational.COM (Jim Showalter) writes:
> The last time I looked, APL was a VERY symbolic language. It has also been
> described as a WORN language (Write Once Read Never). Going down the path
> of pure symbolic notation not only hasn't proven out, [...]

Whoa. Read my first sentence again: "It all depends on the application."

Let's take your example of APL. There are a number of kinds of numerical
processing for which I think APL is very well suited. I sometimes say it's
the best desk calculator I have ever used: it's great for certain kinds of
work. For those kinds of work, it certainly has proven out, as can be shown
by its continued viability. Sometimes you don't want a single tool that does
everything--you want a single tool that does one thing (or a few things)
really well. APL, at least to me, falls into the second category.

> perhaps the idea of having a symbol for everything should be scrapped and
> replaced with the idea of having STRINGS for everything, since strings ARE
> infinitely descriptive.

This would obviously work, but it seems to be throwing the baby out with the
bathwater. Even "plain English" uses punctuation and other symbols in order
to improve clarity, after all.

Personally, I like the traditional Lisp approach of using strings, but
allowing symbols to be used as shorthand (macros, etc.). This seems to work
pretty well.
--
Amanda Walker                                         amanda@visix.com
Visix Software Inc.                             ...!uunet!visix!amanda
--
Courage is the willingness of a person to stand up for his beliefs in the
face of great odds. Chutzpah is doing the same thing wearing a Mickey Mouse
hat.
amanda@visix.com (Amanda Walker) (04/02/91)
In article <27F4D3F3.6CD@tct.uucp> chip@tct.uucp (Chip Salzenberg) writes:
> Actually, this code isn't legal C, what with the declarations in the
> middle of the function.
Well... it's not according to K&R, but it would be accepted by
many C compilers (I haven't checked my ANSI grammar). This doesn't
seem to materially affect Jim's point, though.
> Nevertheless, it is easily recognized as the classic month/day/year to
> Julian day conversion, with zero as the error return.
Indeed. I didn't find it at all opaque. Ugly, perhaps :), but formatting
gains a lot of its leverage after you pass a screenful or so.
"English = Bad" does NOT necessarily imply "!English = Good".
What? Bringing Freshman Logic 101 into a Usenet discussion? How dare
you? :) :)
Anyway... Jim's point would be a little better made with one of the
Obfuscated C Contest winners, but even there I'd say it's a pretty
marginal point. Showing that you can write garbage in C doesn't mean
you can't write poetry in it too.
--
Amanda Walker amanda@visix.com
Visix Software Inc. ...!uunet!visix!amanda
--
We find ourselves confronted by insurmountable opportunities.
jls@rutabaga.Rational.COM (Jim Showalter) (04/02/91)
> 4. The original poster did mean that the *one* who write the
> code is able to understand what the computer will do of it, *NOT* that
> anyone is able to tell at the first glance what someone else write
> down.

Precisely my point. 50% of all programming done in this country is
maintenance. Of that, how much do you suppose is done by the original
programmer? Code not deliberately written to be as readable as possible by
others is evidence of bad instincts run amok, and part of the reason
software maintenance costs so much (any managers out there want to show a
higher profit?--encourage better programming technique).

> 5. I hope that the "incredibly smart people" have some more
> substantial opinion in the "realm of software engineering" over those
> they have passed to you.

Parnas, Dijkstra, Humphrey, Denning, Booch, Goodenough, Turing, von Neumann,
Goedel, Devlin, etc etc etc. Who do YOU listen to?

>I will not answer to more than one follow up to this since I've more
>important things to do in my spare time.

Debugging, perhaps? ;-)
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
richieb@bony1.bony.com (Richard Bielak) (04/02/91)
In article <jls.670044400@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
[...]
>>all, I never saw anything even vaguely like "a := a + 1" in any math or
>>science course, so I can't believe I had any prior familiarity with it.
>
>Huh? Did you never see?:
>
>       7 = 6 + 1
[...]
>
>       W  = W  + 1
>        2    1
>
>I saw stuff like that all the way from elementary school onwards. What I
>NEVER saw was:
>
>       P++

Being a math major in college, I was rather confused the first time I saw
the expression:

        A = A + 1;  /* PL/I Syntax */

That could only be true if A were a transfinite number! I mean, 2 is not
equal to 2 + 1.

Anyway, properly written programs should PRECISELY state what they are doing
in some agreed upon notation. So, IMHO, a := a + 1 is as good as a++.

...richie
--
*-----------------------------------------------------------------------------*
| Richie Bielak  (212)-815-3072    | Programs are like baby squirrels. Once   |
| Internet: richieb@bony.com       | you pick one up and handle it, you can't |
| Bang: uunet!bony1!richieb        | put it back. The mother won't feed it.   |
horvath@motcid.UUCP (Bob Horvath) (04/02/91)
pallas@eng.sun.com (Joseph Pallas) writes:
-I can't believe I'm about to waste my time on this, but here goes:
-int j (int year, int month, int day) {
-int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
-if (month<1 || month>12) return 0;
-int leap = !(year%4) && year&400;
-if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
-int dayofmonth = 0;
-for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
-return dayofmonth + day + (month>2 && year);
-}
-I've left the name of the function cryptic, but the rest is the result
-of a simple global substitution. It becomes immediately apparent not
-only what the function is supposed to do, but that it contains at
-least one error (surely you meant to check (month==2 && leap)).
-Funny thing is, the function is still written in C.
Having read the rest of the posts I now know what it does. Reading it with the
variables changed gave me a clue that it did something with days, months, and
years and the amount of days in a month. But not being a regular C programmer,
I still wasn't able to *quickly* figure out exactly what it was doing.
-Amazing, isn't it?
I don't think so. Even during times when I was working in C I was never able to
read it as easily as other languages.
jls@rutabaga.Rational.COM (Jim Showalter) (04/02/91)
>Jim, any tool *must* be evaluated without reference to the milieu from
>which it sprang. To do otherwise is to discard ideas because of the
>people espousing them, which is a truly close-minded course.

Spurious in the extreme. If the only people using a tool are incompetent,
then what does it matter if the tool itself is value-neutral? After all,
shouldn't one judge a tool by the quality of the work produced through its
use? How else, in fact, WOULD you judge a tool?
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
jls@rutabaga.Rational.COM (Jim Showalter) (04/02/91)
%int j (int year, int month, int day) {
%int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
%if (month<1 || month>12) return 0;
%int leap = !(year%4) && year&400;
%if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
%int dayofmonth = 0;
%for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
%return dayofmonth + day + (month>2 && year);
%}

THIS is readable? I think the problem is more deeply rooted in the C culture
than I thought... (this would garner one of my Ada students a D minus)

%It becomes immediately apparent not
%only what the function is supposed to do, but that it contains at
%least one error (surely you meant to check (month==2 && leap)).

Yep, you found one of the planted bugs. Now find the other.

P.S. For all those who said what this function does is calculate the Julian
date: wrong. What it is SUPPOSED to do is calculate the Julian date. What it
actually does is produce erroneous results. This might have been easier to
see if it had been written better.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
spicer@tci.UUCP (Steve Spicer) (04/03/91)
In article <jls.670561073@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
>>Jim, any tool *must* be evaluated without reference to the milieu from
>>which it sprang. To do otherwise is to discard ideas because of the
>>people espousing them, which is a truly close-minded course.
>
>Spurious in the extreme. If the only people using a tool are incompetent,
>then what does it matter if the tool itself is value-neutral? After all,
>shouldn't one judge a tool by the quality of the work produced through
>its use? How else, in fact, WOULD you judge a tool?

Thanks, Jim. That's as fine an argument for gun control as I've ever seen.
After all, since most people using guns on our city streets seem to be
criminals, then we should definitely get rid of the guns, right?

DON'T START A GUN CONTROL THREAD. It was an exaggerated EXAMPLE to make a
point, OK?
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Steven Spicer/spicer@tci.bell-atl.com
Is your design so simple that there are obviously no deficiencies, or so
complicated that there are no obvious deficiencies?
   -- suggested by a quote from C.A.R. Hoare
pallas@eng.sun.com (Joseph Pallas) (04/03/91)
In <jls.670567351@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
>(this would garner one of my Ada students a D minus)

I'm glad to hear it. Because your tone of argument suggests that you believe
writing in some particular language will prevent one from writing unreadable
programs. It even suggests that you believe it is not possible to write
readable programs in certain other languages. I'm pleased to hear that those
aren't actually your views.

joe
amanda@visix.com (Amanda Walker) (04/03/91)
jls@rutabaga.Rational.COM (Jim Showalter) writes:
> Spurious in the extreme. If the only people using a tool are incompetent,
> then what does it matter if the tool itself is value-neutral?
So, are you implying that the only people using C are incompetent? If so,
I beg to differ...
--
Amanda Walker amanda@visix.com
Visix Software Inc. ...!uunet!visix!amanda
--
For every vision, there is an equal and opposite revision.
jls@rutabaga.Rational.COM (Jim Showalter) (04/03/91)
>Showing that you can write garbage in C doesn't mean
>you can't write poetry in it too.

Never said you couldn't. Please allow me to restate my thesis:

1) You can write terrible code in any language.
2) Some languages have an edge in the readability department even when one
   writes the same stuff in both and compares (more about this later).
3) A language does not exist separate from its culture.
4) The culture that grew up around C is hackerish, undisciplined, and
   darned proud of it (for reasons I've never understood).
5) The culture that grew up around Ada is software engineering oriented.
6) The smart money bet is that code written in Ada will be cleaner than
   code written in C. One can of course find numerous counterexamples, but
   I'm talking gross percentages here.
7) C++ provides an opportunity to reboot the C/UNIX culture with a more
   disciplined, software engineering approach to things. The jury is still
   out on whether this will actually occur or not.

P.S. Please submit some C poetry--I'm about to submit what I hope is some
Ada poetry (it may turn out to be Ada doggerel, but oh well), and this will
provide a comparison of good against good rather than good against bad. I'm
shooting for readability and a self-documenting result, so you might try to
also.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
Chris.Holt@newcastle.ac.uk (Chris Holt) (04/03/91)
horvath@motcid.UUCP (Bob Horvath) writes:
>pallas@eng.sun.com (Joseph Pallas) writes:
>-int j (int year, int month, int day) {
>-int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
>-if (month<1 || month>12) return 0;
>-int leap = !(year%4) && year&400;
>-if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
>-int dayofmonth = 0;
>-for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
>-return dayofmonth + day + (month>2 && year);
>-}
>Having read the rest of the posts I now know what it does. Reading it with
>the variables changed gave me a clue that it did something with days,
>months, and years and the amount of days in a month. But not being a
>regular C programmer, I still wasn't able to *quickly* figure out exactly
>what it was doing.

I couldn't be bothered with the original version, and again having read the
posts I know what it does. But *why* should it ever return 0, rather than
some bottom value or exception? It seems to me we've got a language with
pretty poor typing facilities :-).

Arr, Jim lad; I remember when we used less than 0 for the first branch, 0
for the second, and greater than 0 for the third. That were intuitively
obvious, that were.
-----------------------------------------------------------------------------
Chris.Holt@newcastle.ac.uk      Computing Lab, U of Newcastle upon Tyne, UK
-----------------------------------------------------------------------------
 "And when they die by thousands why, he laughs like anything." G Chesterton
chip@tct.com (Chip Salzenberg) (04/04/91)
Jim has well demonstrated his prejudice against C in the referenced article.
I couldn't have gotten a more completely close-minded reply had I asked for
it.

According to jls@rutabaga.Rational.COM (Jim Showalter):
>If the only people using a tool are incompetent, then what does it matter
>if the tool itself is value-neutral?

Isn't it obvious that "current users" and "future users" are not necessarily
the same groups?

>Shouldn't one judge a tool by the quality of the work produced through
>its use? How else, in fact, WOULD you judge a tool?

I judge a tool by how I would use it. ("How else", indeed!)
--
Chip Salzenberg <chip@tct.com>, <uunet!pdn!tct!chip>
Brand X Industries Custodial, Refurbishing and Containment Service
     When You Never, Ever Want To See It Again [tm]
jls@rutabaga.Rational.COM (Jim Showalter) (04/04/91)
>I'm glad to hear it. Because your tone of argument suggests that you
>believe writing in some particular language will prevent one from
>writing unreadable programs.

Not at all. I wish it WERE that simple, but obviously it is not.

> It even suggests that you believe it is
>not possible to write readable programs in certain other languages.

Nope. But I DO believe that it is easier to write readable programs in some
languages than others, even using the same hygienic approach to coding
style, naming issues, etc.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
      the realm of software engineering, in which case I've borrowed
      them from incredibly smart people.
jls@rutabaga.Rational.COM (Jim Showalter) (04/04/91)
>So, are you implying that the only people using C are incompetent?
Nope. What I AM saying is that a large percentage of those writing
in C are lousy software engineers, if the quality of the C code I've
seen over the years is representative of their output--and I'm also
saying that I attribute this to the culture that grew up around C,
not to the language itself (since the language is largely a value-neutral
tool). Furthermore, I'm saying that there are plenty of incompetent C++
and Ada programmers running around loose too, but that the percentage of
them relative to the percentage of competent software engineers working
in those languages is less than the same comparison in C, FORTRAN, or
other dinosaur languages--and I'm saying that this difference
is attributable to the software engineering oriented cultures that
have grown up around Ada and C++ (and that were, in fact, responsible for
their creation in the first place). Finally, I'm saying that, should one be
inclined to be a software engineer, one would in general find Ada or
C++ to be superior vehicles for expressing one's designs.
--
***** DISCLAIMER: The opinions expressed herein are my own, except in
the realm of software engineering, in which case I've borrowed
them from incredibly smart people.
sabbagh@acf5.NYU.EDU (sabbagh) (04/04/91)
>Nope. What I AM saying is that a large percentage of those writing
>in C are lousy software engineers, if the quality of the C code I've
>seen over the years is representative of their output--and I'm also
>saying that I attribute this to the culture that grew up around C,
>not to the language itself (since the language is largely a value-neutral
>tool). Furthermore, I'm saying that there are plenty of incompetent C++
>and Ada programmers running around loose too, but that the percentage of
>them relative to the percentage of competent software engineers working
>in those languages is less than the same comparison in C, FORTRAN, or
>other dinosaur languages--and I'm saying that this difference
>is attributable to the software engineering oriented cultures that
>have grown up around Ada and C++ (and that were, in fact, responsible for
>their creation in the first place). Finally, I'm saying that, should one be
>inclined to be a software engineer, one would in general find Ada or
>C++ to be superior vehicles for expressing one's designs.

How do you think the "software engineering oriented cultures" evolved?
HACKERS! That's right, hackers. Suppose we didn't call them hackers;
suppose we called them "researchers". Sure, their code is messy, ugly,
hard to maintain. But they explored the limits of hardware and software;
they have developed algorithms to solve real-world problems; they have
asked the right questions: what tools are needed, how to make
maintainable software, etc.

I take strong exception to the word *incompetent*. People write computer
programs for a vast number of reasons. In many areas reusability and
maintainability are non-issues and are to be sacrificed for GETTING AN
ANSWER. Fortran and C are both excellent languages for just these
problems. (I have personally replaced these two with C++ and Forth, but
I'm wearing an asbestos suit, nyah nyah :-)

Also, keep in mind that the first OS to embody ANY kind of software
engineering principle (in particular, "structured programming") was...
UNIX! C was originally invented for exactly this problem. Just goes to
show you: hindsight is the *only* perfect science. Ada and C++ could not
have been developed without first experiencing C and Algol.

Hadil G. Sabbagh
E-mail: sabbagh@cs.nyu.edu
Voice: (212) 998-3125
Snail: Courant Institute of Math. Sci.
       251 Mercer St.
       New York, NY 10012
"Injustice anywhere is a threat to justice everywhere."
       - Martin Luther King, Jr.
Disclaimer: This is not a disclaimer.
colin@toshiba.tic.oz (Colin Sutton) (04/05/91)
In <6108@lead2.UUCP> horvath@motcid.UUCP (Bob Horvath) writes:
>pallas@eng.sun.com (Joseph Pallas) writes:
>-I can't believe I'm about to waste my time on this, but here goes:
>-int j (int year, int month, int day) {
>-int month_days[12] = {31,28,31,30,31,30,31,31,30,31,30,31};
>-if (month<1 || month>12) return 0;
>-int leap = !(year%4) && year&400;
>-if (day<1 || day>(month_days[month-1] + (month==2 && year))) return 0;
>-int dayofmonth = 0;
>-for (int i=0; i<month-1; i++) dayofmonth += month_days[i];
>-return dayofmonth + day + (month>2 && year);
>-}
>-I've left the name of the function cryptic, but the rest is the result
>-of a simple global substitution. It becomes immediately apparent not
>-only what the function is supposed to do, but that it contains at
>-least one error (surely you meant to check (month==2 && leap)).

I can't read C either, but I think I see another bug (or a specification
error). Years that are a multiple of 100 are not leap years, except
multiples of 400. Most software that doesn't care about centuries will
work just fine until 2100!

What's C doing in this newsgroup anyway? ;-)
---
Colin Sutton   Project Manager, Computer Systems Development Centre
Toshiba International Corporation Pty. Ltd.   Tel. Sydney 428 2077
Australian Company Number 001 555 068   email: colin@toshiba.tic.oz.au
jls@rutabaga.Rational.COM (Jim Showalter) (04/05/91)
>Just goes to show you: hindsight is the *only* perfect science. Ada and C++
>could not have been developed without first experiencing C and Algol.

Agreed. The difference between software and other disciplines, however,
is that in other disciplines obsolete tools and techniques are readily
discarded when something better comes along. By your own admission,
C and Algol are the equivalent of slide rules and the abacus. All I'm
trying to get people to do is dump them in favor of something better.
Stubbornly clinging to languages and techniques with proven deficiencies
makes as much sense as a hardware engineer refusing to use those darned
newfangled VLSI chips, a mechanical engineer refusing to use those
scary new CAD systems, a construction firm refusing to use poured concrete
in place of bricks and plaster, etc.
--
* The opinions expressed herein are my own, except in the realm of software *
* engineering, in which case I borrowed them from incredibly smart people.  *
*                                                                           *
* Rational: cutting-edge software engineering technology and services.      *
sabbagh@acf5.NYU.EDU (sabbagh) (04/06/91)
jls@rutabaga.Rational.COM (Jim Showalter) writes:
>>Just goes to show you: hindsight is the *only* perfect science. Ada and C++
>>could not have been developed without first experiencing C and Algol.
>Agreed. The difference between software and other disciplines, however,
>is that in other disciplines obsolete tools and techniques are readily
>discarded when something better comes along. By your own admission,
>C and Algol are the equivalent of slide rules and the abacus. All I'm
>trying to get people to do is dump them in favor of something better.
>Stubbornly clinging to languages and techniques with proven deficiencies
>makes as much sense as a hardware engineer refusing to use those darned
>newfangled VLSI chips, a mechanical engineer refusing to use those
>scary new CAD systems, a construction firm refusing to use poured concrete
>in place of bricks and plaster, etc.

I'm not sure I buy this argument. No tool is ever completely obsolete,
not even the abacus. Millions of Chinese still use them in their daily
life. It has certain advantages over electronic calculators. Tools have
appropriate contexts. Fortran and C have an appropriate context.

In the C case, there is an emerging consensus that C++ is superior to C.
But there is really insufficient experience with C++ to say that this is
a certainty. The other disciplines you mention are the result of
*hundreds*, even *thousands*, of years of collective experience.

Finally, I want to point out that the invention of C++, Eiffel and a
number of other languages has pointed out the true value of C: as a
replacement for assembly language! I predict that most future languages
will compile to C instead of to machine language. This is a tremendous
achievement for the software community: true reusability of language and
concepts.

Hadil G. Sabbagh
E-mail: sabbagh@cs.nyu.edu
Voice: (212) 998-3125
Snail: Courant Institute of Math. Sci.
       251 Mercer St.
       New York, NY 10012
"Injustice anywhere is a threat to justice everywhere."
       - Martin Luther King, Jr.
Disclaimer: This is not a disclaimer.
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (04/07/91)
In article <jls.670709095@rutabaga> jls@rutabaga.Rational.COM (Jim
Showalter) writes:
> Nope. What I AM saying is that a large percentage of those writing
> in C are lousy software engineers,

Why, thank you.

Whenever I see that I'm writing something that's been written before, I
turn it into a library routine. Module. Structured program. Reusable
component. Whatever the same old idea will be called next decade. I
respect no other rule of software ``engineering.''

I would be insulted by any implication that I believed the mounds of
unjustified crap that are associated with software ``engineering.'' I am
proud not to be a software engineer.

---Dan
diamond@jit345.swstokyo.dec.com (Norman Diamond) (04/08/91)
In article <1576@acf5.NYU.EDU> sabbagh@acf5.NYU.EDU (sabbagh) writes:
>Finally, I want to point out that the invention of C++, Eiffel and a
>number of other languages has pointed out the true value
>of C: as a replacement for assembly language!

Uh, yeah, in fact, the original purpose of C has been rediscovered every
month for the last 21 years. Or longer, if we recall that C's
predecessors were also intended as replacements for assembly language.
--
Norman Diamond       diamond@tkov50.enet.dec.com
If this were the company's opinion, I wouldn't be allowed to post it.
hsrender@happy.colorado.edu (04/08/91)
In article <jls.670817462@rutabaga>, jls@rutabaga.Rational.COM (Jim
Showalter) writes:
> Stubbornly clinging to languages and techniques with proven deficiencies
> makes as much sense as a hardware engineer refusing to use those darned
> newfangled VLSI chips, a mechanical engineer refusing to use those
> scary new CAD systems, a construction firm refusing to use poured concrete
> in place of bricks and plaster, etc.

Just for my benefit, can you give me an idea of what the availability of
cheap, portable Ada compilers is? One reason I gave up on attempts to do
anything with it 6 years ago was that I couldn't find a good development
system. Sure, Rational had efficient compilers even then, but they were
priced out of the range of the average academic budget. Perhaps more
schools would teach Ada (and thus create more Ada supporters) if the
compilers were cheaper. BTW, I don't consider $3500/machine to be a
cheap compiler, even if it is cheaper than some things on the market. I
think such pricing strategies are what cause the failure of good
language systems, such as Eiffel and the latest release of ParcPlace
Smalltalk.

Since we've moved somewhat far afield from the original topic, I
recommend that any further discussions on this be moved to
comp.lang.misc.

hal.
marc@dumbcat.sf.ca.us (Marco S Hyman) (04/09/91)
In article <jls.670541848@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
Code not deliberately written to be as readable as possible
by others is evidence of bad instincts run amok, and part of the reason
software maintenance costs so much (any managers out there want to show
a higher profit?--encourage better programming technique).
Jim shouldn't be so narrow here. Programming technique is well and good but
the real key is encouraging the programming staff to write, in whatever human
language is appropriate, about the system being programmed. The programs
easiest changed are those where not only the "what" is captured (as lines of
code), but the "why".
For an excellent example of this (as well as an example of picking the right
tool for the job) see Jon Bentley's Programming Pearls column in the June '86
issue of Communications of the ACM. In it is a beautifully documented program
by Don Knuth written in WEB -- and the (some might say hard to read :-) 6 line
UNIX shell script that does the same thing.
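[Editorial note: the column in question is McIlroy's solution to the
"print the k most common words in a file" problem. Reconstructed from
memory, so details may differ from the published version, the six-line
pipeline was approximately:]

```shell
#!/bin/sh
# Print the $1 most common words of the text on standard input.
tr -cs A-Za-z '\n' |   # turn each run of non-letters into a newline: one word per line
tr A-Z a-z |           # fold upper case to lower
sort |                 # bring identical words together
uniq -c |              # count the occurrences of each word
sort -rn |             # sort numerically, most frequent first
sed "${1}q"            # print the first $1 lines, then quit
```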
Ada, C, C++, SmallTalk, Assembler... None of them are right. All of them are
right. Use the proper one for the given job.
--
// marc
// home: marc@dumbcat.sf.ca.us pacbell!dumbcat!marc
// work: marc@ascend.com uunet!aria!marc
marc@dumbcat.sf.ca.us (Marco S Hyman) (04/09/91)
In article <1574@acf5.NYU.EDU> sabbagh@acf5.NYU.EDU (sabbagh) writes:
> Suppose we didn't call them hackers; suppose we called them "researchers".
> Sure, their code is messy, ugly, hard to maintain. But they explored the
> limits of hardware and software; they have developed algorithms to solve
> real world problems; they have asked the right questions: what tools are
> needed, how to make maintainable software, etc.

Sorry, Hadil, but there's more to the difference between "hacker" and
"researcher" than a change of name. Researchers write up the results of
their experiments, sharing those results with the rest of the world.
What better documentation could you ask for? The goal is to add and
share knowledge. (Yes, the "hacker ethic" shares this goal -- but the
term "hacker" has gained negative connotations in the context of
"software engineering." The terms "hack" and "quick, throwaway, and
dirty" have become synonymous.)

Hackers (the bad kind) are worse than incompetent; they're selfish.
Their code is written to solve a specific problem today and to hell with
anyone who might have the same problem tomorrow. Hackers (the bad kind)
can be selfish in any language. Researchers can teach new concepts and
ideas using any language.
--
// marc
// home: marc@dumbcat.sf.ca.us pacbell!dumbcat!marc
// work: marc@ascend.com uunet!aria!marc
jls@rutabaga.Rational.COM (Jim Showalter) (04/10/91)
>I am proud not to be a software engineer.
See, this is the really interesting thing to me: I cannot think of any other
technical discipline in which people swagger around bragging about how they
are not engineers. Can you imagine somebody swaggering around bragging that
he is not a mechanical engineer? What sort of confidence would a person inspire
in potential clients by broadcasting her not being a civil engineer? Would
anybody even HIRE a person in any other discipline that not only had no
formal background or credentialization, but was actually PROUD of it? Would
you have a person with no formal training or proof of background to wire up
your house? Be honest... would you like to have your gallbladder taken out
by a "hacker" doctor (no training, no credentials, proud of it, CLAIMS he
knows what he's doing...)? Give me a break.
Sure, I acknowledge the existence once in a blue moon of a genius who needs
no such training or background--but it stretches the bounds of credibility
that so darned many of these geniuses just happen to be programmers. Or
could it--shocking thought--be that many of the self-proclaimed non software
engineer programmer/hackers aren't actually as good as they think they are?
Heavens--could this explain over-budget, over-schedule software shot through
with bugs? Hmmmmm...
--
* The opinions expressed herein are my own, except in the realm of software *
* engineering, in which case I borrowed them from incredibly smart people. *
* *
* Rational: cutting-edge software engineering technology and services. *
jls@rutabaga.Rational.COM (Jim Showalter) (04/10/91)
>Just for my benefit, can you give me an idea of what the availability of
>cheap, portable Ada compilers is? One reason I gave up on attempts to
>do anything with it 6 years ago was that I couldn't find a good development
>system.

Well, you can get a Meridian compiler for a PC for about $195, last I
heard. You can get Alsys and Telesoft compilers for a number of
platforms for a grand or so.

>Perhaps more schools
>would teach Ada (and thus create more Ada supporters) if the compilers were
>cheaper.

Agreed. So buy Meridian.

>I think such pricing strategies
>are what cause the failure of good language systems, such as Eiffel and the
>latest release of ParcPlace Smalltalk.

Agreed again. On the other hand, the adage "you get what you pay for" is
certainly true in the compiler world. You can get very inexpensive
compilers for some languages (C comes to mind) provided you don't much
care about bugs, idiosyncratic interpretations of the language
"standard", crappy support for other tools, etc. You can get superb
compilers if you're willing to lay out the money for them.

I realize that educational institutions work on a different scheme than
industrial ones, so for you guys a buggy but cheap compiler is probably
the way to go. On the other hand, for a commercial site $20k per seat
for a fully integrated lifecycle support environment is a BARGAIN.
--
* The opinions expressed herein are my own, except in the realm of software *
* engineering, in which case I borrowed them from incredibly smart people.  *
*                                                                           *
* Rational: cutting-edge software engineering technology and services.      *
amanda@visix.com (Amanda Walker) (04/10/91)
jls@rutabaga.Rational.COM (Jim Showalter) writes:
On the other hand, for a commercial site $20k per seat for a fully
integrated lifecycle support environment is a BARGAIN.
Not to mention the trendy phrases and buzzwords that come along with it
for free...
Sorry, I couldn't resist :). You wouldn't happen to be in marketing,
would you...?
--
Amanda Walker amanda@visix.com
Visix Software Inc. ...!uunet!visix!amanda
--
X Windows: Warn your friends about it.
jdudeck@polyslo.CalPoly.EDU (John R. Dudeck) (04/11/91)
In an article jls@rutabaga.Rational.COM (Jim Showalter) wrote:
>>I am proud not to be a software engineer.
>Or could it--shocking thought--be that many of the self-proclaimed non
>software engineer programmer/hackers aren't actually as good as they
>think they are? Heavens--could this explain over-budget, over-schedule
>software shot through with bugs? Hmmmmm...

Let's not forget poorly-abstracted, opaque, hard-to-maintain, fragile...
--
John Dudeck                         "Communication systems are
jdudeck@Polyslo.CalPoly.Edu          inherently complex".
ESL: 62013975 Tel: 805-545-9549            -- Ron Oliver
philbo@dhw68k.cts.com (Phil Lindsay) (04/11/91)
In article <3201:Apr705:40:4591@kramden.acf.nyu.edu>
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:
>In article <jls.670709095@rutabaga> jls@rutabaga.Rational.COM (Jim
>Showalter) writes:
>> Nope. What I AM saying is that a large percentage of those writing
>> in C are lousy software engineers,
....deleted...
>
>I am proud not to be a software engineer.
>
>---Dan

Do people actually think Computer Science has advanced enough to become
an engineering discipline? I think not. We have learned many things in
the past... "You don't build a house on sand."
--
Phil Lindsay - "Patents threaten future technology"
Internet: philbo@dhw68k.cts.com Phone: Wrk7143852311 Hm7142891201
UUCP: {spsd,zardox,felix}!dhw68k!philbo
USMAIL: 152A S. Cross Creek Rd, Orange, Ca. 92669
jls@rutabaga.Rational.COM (Jim Showalter) (04/12/91)
> On the other hand, for a commercial site $20k per seat for a fully
> integrated lifecycle support environment is a BARGAIN.
>Not to mention the trendy phrases and buzzwords that come along with it
>for free...
>Sorry, I couldn't resist :). You wouldn't happen to be in marketing,
>would you...?

Nope, but I could look at myself in the mirror in the morning if I were.
I like working where I work because our stuff really does what we say it
does... and our customers agree. We've got lots of success stories where
customers say things like "We couldn't have done it without Rational's
support environment" and "Rational saved us $117 million over the course
of the four projects", etc.

We DO sell a fully-integrated lifecycle support environment. I can't
help it if it sounds buzzwordy, but it's true. I'll happily send you
some of that dreaded marketing literature if you're interested. ;-)
--
* The opinions expressed herein are my own, except in the realm of software *
* engineering, in which case I borrowed them from incredibly smart people.  *
*                                                                           *
* Rational: cutting-edge software engineering technology and services.      *
diamond@jit345.swstokyo.dec.com (Norman Diamond) (04/12/91)
In article <1991Apr11.062250.15105@dhw68k.cts.com> philbo@dhw68k.cts.com
(Phil Lindsay) writes:
>In article <3201:Apr705:40:4591@kramden.acf.nyu.edu>
>brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:
>>In article <jls.670709095@rutabaga> jls@rutabaga.Rational.COM (Jim
>>Showalter) writes:
>>> Nope. What I AM saying is that a large percentage of those writing
>>> in C are lousy software engineers,
>>I am proud not to be a software engineer.
>Do people actually think Computer Science has advanced enough to
>become an engineering discipline? I think not. We have learned
>many things in the past..."You don't build a house on sand."

I think yes, because a number of analogous lessons HAVE been learned in
the software industry. The problem is that most of the people in this
industry refuse to listen to, and practice, the lessons.

Of course, we still have a problem with tools. Some languages have some
features, other languages have other features, and they cannot talk to
each other. The carpenter is forbidden to use both a saw and a hammer on
the same job. And no one will allow a co-operating toolset to be
developed.
--
Norman Diamond       diamond@tkov50.enet.dec.com
If this were the company's opinion, I wouldn't be allowed to post it.
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (04/12/91)
In article <jls.671247104@rutabaga> jls@rutabaga.Rational.COM (Jim
Showalter) writes:
> >I am proud not to be a software engineer.
> See, this is the really interesting thing to me: I cannot think of any other
> technical discipline in which people swagger around bragging about how they
> are not engineers.

Mathematics is a technical discipline. What do you mean to say?

Every working definition of ``engineering'' appears to exclude computer
science.

Here's a question: New York State imposes legal requirements upon anyone
working in the recognized fields of engineering. A ``professional
engineer,'' for instance, must pass a test in his field before he can
use that title. What would you put on a test for software engineers?

The existing tests for engineers include some amount of jargon, to be
sure, but they also include *problems* that relate directly to problems
in the *real world*---problems whose solutions are applied directly by
engineers every day. Most of the answers aren't obvious, even to someone
acquainted with the jargon.

So how would you test software engineers? Would you make sure they knew
the latest terminology? Or would you aim for the blindingly obvious?
``True or false: If you need code ten times, it is cheaper for the code
to be (a) rewritten each time; (b) stored in a library.''

Or would you use material from what appears to be the vast majority of
software engineering literature---theories that are neither applied by
working programmers, nor proven to help solve *real world* problems.

Maybe there is some engineering behind ``software engineering.'' I'd
love to hear what it is. If you have an example, send me e-mail, and
I'll summarize.

> Would
> anybody even HIRE a person in any other discipline that not only had no
> formal background or credentialization, but was actually PROUD of it?

Be serious. There is a huge difference between someone without formal
training or credentials and someone who is proud not to be a software
engineer.

---Dan
amanda@visix.com (Amanda Walker) (04/12/91)
In article <jls.671419885@rutabaga> jls@rutabaga.Rational.COM (Jim
Showalter) writes:

   We DO sell a fully-integrated lifecycle support environment. I can't
   help it if it sounds buzzwordy, but it's true. I'll happily send you
   some of that dreaded marketing literature if you're interested. ;-)

Well, I've asked once already, and haven't seen anything, but I'd still
be happy to look at it. Send me email if you need Visix's address or FAX
number.
--
Amanda Walker       amanda@visix.com
Visix Software Inc. ...!uunet!visix!amanda
--
X Windows: A mistake carried out to perfection.
oz@yunexus.yorku.ca (Ozan Yigit) (04/13/91)
Dan Bernstein writes: >I am proud not to be a software engineer. ... and a pot is proud not to be a kettle. Dan, who tf cares?
amanda@visix.com (Amanda Walker) (04/13/91)
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:
Every working definition of ``engineering'' appears to exclude computer
science.
Indeed. A large software project is, in my experience, more like a
book than it is like a building. Do we call novelists "prose
engineers?" Do we call movie producers "audio-visual entertainment
engineers?" No, and I don't think we should. My business card says
"software engineer," and I do what is usually called "software
engineering," but I think that it's a misnomer. I think that things
like "software author," "algorist," "software designer," "software
artist" (by analogy to "graphic artist" or "commercial artist"), or
"software architect" would be better, but currently they just confuse
people. People think they know what "software engineer" means.
Part of my problem is that, put simply, I don't think we know enough
about what software is to make it into an engineering discipline or a
trade. Software is so mutable and responsive that I can't think of
its creation and manipulation as anything except an artistic
discipline.
When I interview someone I'm interested in hiring, I'd rather see
a portfolio of their work than what certificates they've earned.
--
Amanda Walker amanda@visix.com
Visix Software Inc. ...!uunet!visix!amanda
--
"In all my life, I have prayed but one prayer: 'O Lord, make my enemies
ridiculous.' And God granted it." --Voltaire
jls@rutabaga.Rational.COM (Jim Showalter) (04/13/91)
>Every working definition of ``engineering'' appears to exclude computer
>science.

Is this something to be PROUD of? As it stands, it is certainly an
accurate reflection of the current state of affairs, but hardly a good
thing.

>What would you put on a test for software engineers?

What do they put on tests for other engineers?

>The existing tests for engineers include some amount of jargon, to be
>sure, but they also include *problems* that relate directly to problems
>in the *real world*---problems whose solutions are applied directly by
>engineers every day. Most of the answers aren't obvious, even to someone
>acquainted with the jargon.

I'm a bit confused. Is it your belief that software engineers do NOT
work on problems directly related to the real world? I thought that
satellites, dishwashers, dialysis machines, airplanes,
telecommunications equipment, laptop computers, relational databases,
finite element analysis models, and, well, actually, every OTHER thing
I've seen software used for was part of the real world. Or did you mean
to say something other than what you appear to be saying?

>Would you make sure they knew the latest terminology? Or would you aim
>for the blindingly obvious? ``True or false: If you need code ten times,
>it is cheaper for the code to be (a) rewritten each time; (b) stored in
>a library.''

I think I'd ask them to solve some problems. Probably have them rig up
an interface to some device, write some queries for a database, reverse
engineer a design from some legacy code, etc etc etc. You know--do some
software engineering. Think of it as the kind of thing lawyers have to
do to pass the bar, or doctors have to do to become doctors.

>Or would you use material from what appears to be the vast majority of
>software engineering literature---theories that are neither applied by
>working programmers, nor proven to help solve *real world* problems.

Could you please list some software engineering theories that are not
applied by programmers and/or do not help to solve real world problems?
For extra credit, could you give me an example of a real world problem?
I think we have some sort of communications problem.
--
* The opinions expressed herein are my own, except in the realm of software *
* engineering, in which case I borrowed them from incredibly smart people.  *
*                                                                           *
* Rational: cutting-edge software engineering technology and services.      *
cok@islsun.Kodak.COM (David Cok) (04/14/91)
In article <1991Apr12.201053.18348@visix.com> amanda@visix.com (Amanda
Walker) writes:
>Indeed. A large software project is, in my experience, more like a
>book than it is like a building. Do we call novelists "prose ...

A publisher friend once said to me that books were never finished, only
abandoned, meaning that they were declared complete when the author
tired of correcting and improving. Another similarity between books and
software...

David R. Cok
Eastman Kodak Company
cok@Kodak.COM
martelli@cadlab.sublink.ORG (Alex Martelli) (04/14/91)
jls@rutabaga.Rational.COM (Jim Showalter) writes:
:>I am proud not to be a software engineer.
:
:See, this is the really interesting thing to me: I cannot think of any other
:technical discipline in which people swagger around bragging about how they
:are not engineers. Can you imagine somebody swaggering around bragging that
Architecture!!! It is a typical and everyday occurrence for an architect to
brag that he is not an engineer (typically put as "not JUST an engineer!"), I
assume with the implication that the architect adds "artistic" value to the
"merely technical" role of the civil engineer.
I don't think this is the sense in which Dan is speaking, though.
--
Alex Martelli - CAD.LAB s.p.a., v. Stalingrado 53, Bologna, Italia
Email: (work:) martelli@cadlab.sublink.org, (home:) alex@am.sublink.org
Phone: (work:) ++39 (51) 371099, (home:) ++39 (51) 250434;
Fax: ++39 (51) 366964 (work only), Fidonet: 332/401.3 (home only).
brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (04/14/91)
In article <jls.671514765@rutabaga> jls@rutabaga.Rational.COM (Jim
Showalter) writes:
> >The existing tests for engineers include some amount of jargon, to be
> >sure, but they also include *problems* that relate directly to problems
> >in the *real world*---problems whose solutions are applied directly by
> >engineers every day. Most of the answers aren't obvious, even to someone
> >acquainted with the jargon.
> I'm a bit confused. Is it your belief that software engineers do NOT
> work on problems directly related to the real world?

The key phrases are ``problems whose solutions are applied directly by
engineers every day'' and ``most of the answers aren't obvious.''

> I think I'd ask them to solve some problems. Probably have them rig
> up an interface to some device, write some queries for a database,
> reverse engineer a design from some legacy code, etc etc etc. You
> know--do some software engineering.

That sure sounds like a programming test to me. You wouldn't find people
doing ``software engineering'' to produce correct answers on that test.
Are you saying that any working program is the result of software
engineering? That's an extremely broad definition.

How come you can only come up with test problems, not questions? Is
there no significant body of nontrivial ``software engineering''
techniques that you can test someone's knowledge of? The electrical
engineering tests don't ask you to build a computer.

---Dan
fmhv@minerva.inesc.pt (Fernando Manuel Vasconcelos) (04/15/91)
Please! Has anyone taken a look at the subject of this discussion?????
I don't want to be boring (I mean, I've read this sort of thing over and
over again), but the original question did interest me... Not that the
actual line of discussion isn't enlightening :-) :-)
--
Fernando Manuel Hourtiguet de Vasconcelos
INESC - Instituto de Engenharia de Sistemas e Computadores
fmhv@inesc.inesc.pt
mcsun!inesc!fmhv@uunet.uu.net
Rua Alves Redol No 9, sala 208
Tel: +351(1)545150 Ext. 216
Apartado 10105
steve@Advansoft.COM (Steve Savitzky) (04/16/91)
In article <1991Apr12.201053.18348@visix.com> amanda@visix.com (Amanda
Walker) writes:

   brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes:

      Every working definition of ``engineering'' appears to exclude
      computer science.

   Indeed. A large software project is, in my experience, more like a
   book than it is like a building. Do we call novelists "prose
   engineers?" Do we call movie producers "audio-visual entertainment
   engineers?" No, and I don't think we should. My business card says
   "software engineer," and I do what is usually called "software
   engineering," but I think that it's a misnomer.

The tax return for my at-home-on-the-side business says "author"; I
write software, prose, and songs, and find that the three activities are
more similar than different.

Perhaps we should start borrowing our terminology from the arts:
"producer" for the person who puts up the money and controls the budget,
"director" for the one with overall artistic control, "designer" for the
one who creates the look and feel of the project, and "writer" for the
ones doing the programming and the technical writing (hopefully mostly
the same people).

   Part of my problem is that, put simply, I don't think we know enough
   about what software is to make it into an engineering discipline or a
   trade. Software is so mutable and responsive that I can't think of
   its creation and manipulation as anything except an artistic
   discipline.

I think Knuth was absolutely right when he called his book "The Art of
Computer Programming". It's possible to treat software as a product of
engineering only when it's embedded in a non-interactive system. As soon
as the software has a user interface, artistic considerations take over.
--
\ --Steve Savitzky-- \ ADVANsoft Research Corp \ REAL hackers use an AXE! \
\ steve@advansoft.COM \ 4301 Great America Pkwy \ #include<disclaimer.h> \
\ arc!steve@apple.COM \ Santa Clara, CA 95954 \ 408-727-3357 \
\__ steve@arc.UUCP _________________________________________________________
schwartz@groucho.cs.psu.edu (Scott Schwartz) (04/16/91)
steve@Advansoft.COM (Steve Savitzky) writes:
Perhaps we should start borrowing our terminology from the arts...
Look at the credits on a video game sometime. They are usually phrased
as you suggest.
cole@farmhand.rtp.dg.com (Bill Cole) (04/16/91)
Steve Savitzky writes: |> Every working definition of ``engineering'' appears to exclude computer |> science. |> |> I think Knuth was absolutely right when he called his book "The Art of |> Computer Programming". It's possible to treat software as a product |> of engineering only when it's embedded in a non-interactive system. |> As soon as the software has a user interface, artistic considerations |> take over. |> I'd extend that to any user interface -- including reports. Someone will spend time (maybe hours) with the reports we generate, and 'readability' is a definite factor in how we format and produce the report. Programming is a creative endeavor at some level. We have to devise ways to overcome obstacles placed in our paths by an EE or another programmer. Many times, the solution to a problem is nothing short of the 'eureka' moment, that golden moment of insight that brings an unexpected fix to a difficult problem. If we were strictly engineers, we could go down a catalog of routines and, by cleverly sticking them together, build a program. Each programmer has a catalog of routines they've either learned or built, it's true, and you could argue that this constitutes a form of engineering catalog that EEs or CEs use to build computers or buildings. The difference is that programmers are tasked to build new components if they can't find one in their own catalog of program components. How many CEs build bridges out of self-designed components? The views expressed here are opinions formulated for and by myself, /Bill
jeff@hilbert.uucp (Jeff Freedman) (04/18/91)
In article <STEVE.91Apr15150739@diana.Advansoft.COM> steve@Advansoft.COM (Steve Savitzky) writes: > brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes: > book than it is like a building. Do we call novelists "prose > engineers?" Do we call movie producers "audio-visual entertainment > engineers?" No, and I don't think we should. My business card says Note also that we don't call a person a novelist unless he or she has actually written a book, but it seems that there are quite a few "experienced software engineers" running around who couldn't write an ounce (at 1K byte/oz.) of software. I'd like to believe that the best software being written nowadays is at companies where the programmers are considered to be authors, rather than engineers; and that they are judged by their ability to design and write code, rather than their ability to spout jargon and write memos. -- Jeff Freedman
jrj1047@ritvax.isc.rit.edu (JARRETT, JR) (04/18/91)
In article <1991Apr16.124522.16592@dg-rtp.dg.com>, cole@farmhand.rtp.dg.com (Bill Cole) writes... >|> I think Knuth was absolutely right when he called his book "The Art of >|> Computer Programming". <stuff deleted> > > If we were strictly engineers, >we could go down a catalog of routines and, by cleverly sticking them >together, build a program. Each programmer has a catalog of routines >they've either learned or built, it's true, and you could argue that >this constitutes a form of engineering catalog that EEs or CEs use to >build computers or buildings. The difference is that programmers are >tasked to build new components if they can't find one in their own catalog >of program components. How many CEs build bridges out of self-designed >components? > Thinking of software development more as art than engineering, this process seems more akin to working out different brush strokes on canvas, or mastering different styles of writing. They work, unmodified in a lot of cases, but sometimes need to change.
rick@tetrauk.UUCP (Rick Jones) (04/18/91)
In article <1991Apr16.124522.16592@dg-rtp.dg.com> cole@farmhand.rtp.dg.com (Bill Cole) writes: > > [...] Many times, the solution to a problem is nothing short of >the 'eureka' moment, that golden moment of insight that brings an >unexpected fix to a difficult problem. If we were strictly engineers, >we could go down a catalog of routines and, by cleverly sticking them >together, build a program. Each programmer has a catalog of routines >they've either learned or built, it's true, and you could argue that >this constitutes a form of engineering catalog that EEs or CEs use to >build computers or buildings. The difference is that programmers are >tasked to build new components if they can't find one in their own catalog >of program components. How many CEs build bridges out of self-designed >components? I think all fields of engineering require a measure of creativity to be successful; it's not exclusive to software engineering. Thinking of auto-engineering, there are a number of landmark innovative cars which changed the rule book - the VW beetle and Alec Issigonis' Mini are just two examples. These cars weren't made by assembling all the standard bits from the catalogue; they were the result of saying "why not do it like this instead? - I don't care if it hasn't been done before!". The _engineering_ is then applying the knowledge of the underlying science - calculating stresses, etc - so that the new components for the new design can be built. This is what pioneers do; the followers then copy the ideas, and use these new components, or more likely the component _designs_, to build the copies. Soon the new bits are part of the standard catalogue. Engineers in all disciplines can and do come up with novel designs, and if the components for what they want don't exist, they build them. Perhaps the difference is that they look for existing components first, and resort to building them only if they don't already exist. 
But more significantly, even if they do build their own components, in most cases they will use standard _designs_ to build them from. Software at all levels is a design process, _not_ a production process. So the magical notion of "reuse" is a reuse of design, which may or may not exist in the form of actual compilable source code. Which brings us full circle - the most important aspect of technical software documentation, OO or otherwise, is the description of the design principles of the module rather than the actual implementation mechanism. -- Rick Jones, Tetra Ltd. Maidenhead, Berks, UK rick@tetrauk.uucp Any fool can provide a solution - the problem is to understand the problem
bw@uecok.ecok.edu (Bill Walker CS Dept. Chairman) (04/19/91)
Re: definition of "software engineering". I had the opportunity to have lunch with several prominent computer scientists (whose names I believe most folks would recognize). The conversation can be summarized like this: Prof A: What is "software engineering"? Prof B: The study of large programs. Prof A: What is a large program? Prof C: Any program we do not understand. Prof A: By this logic, software engineering is the study of programs we do not understand. How do we escape this dilemma? Prof D: "Don't call it 'software engineering!'" Bill Walker bw@cs.ecok.edu
diamond@jit345.swstokyo.dec.com (Norman Diamond) (04/19/91)
In article <764@uecok.ECOK.EDU> bw@uecok.ecok.edu (Bill Walker CS Dept. Chairman) writes: > Prof A: What is "software engineering"? > Prof B: The study of large programs. > Prof A: What is a large program? > Prof C: Any program we do not understand. > Prof A: By this logic, software engineering > is the study of programs we do not > understand. How do we escape this > dilemma? > Prof D: "Don't call it 'software engineering!'" Wrong conclusion, I think. Engineering IS (partly) the study of situations that are incompletely understood. If a bridge will be a clone of an existing bridge (if it were possible to know that without having to do any studies to change unknown information into known information), then there would be no engineering work involved. If the conditions are slightly different from anywhere else, if study has to be done to understand the exact situation and to decide which scientific laws to apply to each part of the problem, that is engineering. -- Norman Diamond diamond@tkov50.enet.dec.com If this were the company's opinion, I wouldn't be allowed to post it.
kitchel@iuvax.cs.indiana.edu (Sid Kitchel) (04/19/91)
diamond@jit345.swstokyo.dec.com (Norman Diamond) writes: |->Wrong conclusion, I think. Engineering IS (partly) the study of |->situations that are incompletely understood. If a bridge will be |->a clone of an existing bridge (if it were possible to know that |->without having to do any studies to change unknown information |->into known information), then there would be no engineering work |->involved. If the conditions are slightly different from anywhere |->else, if study has to be done to understand the exact situation |->and to decide which scientific laws to apply to each part of the |->problem, that is engineering. Yep!! This is the Engineering School party line: engineering is the application of scientific law to practical problems. This (rather modern) definition of engineering is the basis for many REAL engineers being curious or critical of the use of "software engineer" as a title. But being argumentative and a reformed historian of science, I am forced to point out that engineering existed and succeeded long before Galileo and other Renaissance types invented the scientific study of strength of materials. Those many miles of Roman aqueduct still functioning in France and Spain do not know that the engineers that built them missed the 4 or 5 years at Purdue or MIT. Those Roman engineers succeeded without having "to decide which scientific laws to apply." Software engineers may well be succeeding before computer science grows out of its bastardized mathematics phase and also becomes a science. Software engineering may succeed without the full blessing of all academic computer scientists or institutionalized engineers. But the job it is trying to do is very tough and it may well not succeed. Only time and experience will tell, just as in science. Now back to parallelizing databases, --Sid -- Sid Kitchel...............WARNING: allergic to smileys and hearts.... Computer Science Dept. 
kitchel@cs.indiana.edu Indiana University kitchel@iubacs.BITNET Bloomington, Indiana 47405-4101........................(812)855-9226
styri@cs.hw.ac.uk (Yu No Hoo) (04/22/91)
<somebody> wrote: > Every working definition of ``engineering'' appears to exclude computer > science. I really don't know. Isn't this statement kind of ... conclusive? I don't think it's a true statement. In article <1991Apr16.124522.16592@dg-rtp.dg.com> cole@farmhand.rtp.dg.com (Bill Cole) writes: > > [...] If we were strictly engineers, >we could go down a catalog of routines and, by cleverly sticking them >together, build a program. Each programmer has a catalog of routines >they've either learned or built, it's true, and you could argue that >this constitutes a form of engineering catalog that EEs or CEs use to >build computers or buildings. The difference is that programmers are >tasked to build new components if they can't find one in their own catalog >of program components. How many CEs build bridges out of self-designed >components? I guess there are some CEs out there that would be offended by the above statement. If the last sentence were rewritten to the EE domain it would be equally untrue. Even if the statement were true it implies a very narrow definition of engineering. Maybe we need to define the words 'engineer' and 'engineering' before claiming that 'software engineering' is a contradiction. A very important part of an engineer's work is to transform plans/requirements to product in a systematic manner. I see no reason for excluding the software engineer at this point. To be able to do his/her work the engineer must be able to communicate both with fellow engineers and people from other professions. This is also true for the software engineer. However, software engineers do have a problem when it comes to standards and metrics. It's my personal opinion that producing software should be no different from producing cars. But the state of the art in software engineering is probably pre-Ford compared to the car industry. ---------------------- Haakon Styri Dept. of Comp. Sci. 
ARPA: styri@cs.hw.ac.uk Heriot-Watt University X-400: C=gb;PRMD=uk.ac;O=hw;OU=cs;S=styri Edinburgh, Scotland
jls@rutabaga.Rational.COM (Jim Showalter) (04/23/91)
>> If we were strictly engineers, >>we could go down a catalog of routines and, by cleverly sticking them >>together, build a program. Isn't that what we DO? We gather together small chunks (e.g. various library routines) and big chunks (e.g. X-Windows), and paste together a system. At least, that's what a software engineer does--I've known a lot of hackers who want to start from a blank sheet of paper each time... >>The difference is that programmers are >>tasked to build new components if they can't find one in their own catalog >>of program components. As are all other engineers. Consider four-wheel steering. It didn't exist until a few years ago. Someone had to design it. It was built out of as many low-level units as possible (e.g. universal joints), but it was still a brand-new mechanism for automobiles. I can think of examples for any engineering discipline you'd care to name. >How many CEs build bridges out of self-designed >>components? If bridges are not very creative avenues for civil engineers, it is because civil engineering is a more mature discipline than software engineering. It is NOT because there is some fundamental difference between the two. Consider this: until the 1800's, HALF of all bridges built fell down. Why? Because the principles of civil engineering were not well-understood, and so much of it was empirical and artistic: more craft than science. It is different now, but I bet for a while there were members of the old guard who protested loudly that civil engineering was not a science and never could be...just like hackers proclaim loudly now. -- * The opinions expressed herein are my own, except in the realm of software * * engineering, in which case I borrowed them from incredibly smart people. * * * * Rational: cutting-edge software engineering technology and services. *
rh@smds.UUCP (Richard Harter) (04/24/91)
In article <jls.672364339@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes: > >> If we were strictly engineers, > >>we could go down a catalog of routines and, by cleverly sticking them > >>together, build a program. > Isn't that what we DO? We gather together small chunks (e.g. various > library routines) and big chunks (e.g. X-Windows), and paste together > a system. At least, that's what a software engineer does--I've known > a lot of hackers who want to start from a blank sheet of paper each time... No, we don't really do that, except perhaps in GPSS and Simula. First let me give an example of when it is done. If you take a series of UNIX utilities and connect them via pipes you are building a composite program by pasting components together. Now what is going on here? You have a number of standard software components, each with a single input and output channel, and a single standard connector with a standard data flow protocol (the byte stream). The UNIX input-process-pipe-...-pipe-process-output paradigm allows you to create composite programs in minimal time with maximal reusability of software when it is applicable. However, it is essentially inadequate if we are looking at a bigger picture of program construction via reusability of components because of linear connectivity and the single type of data flow. What one would like is a defined set of data types, a library of software elements that operate on those data types, and a suite of connectors specific to those data types. With said facilities a graphical representation showing the elements and their connections would constitute an executable program description. A key point is that when one routine calls another, or when one routine needs to know about what another routine expects, reusability is compromised. Just speculative opinion, so don't get excited folks. -- Richard Harter, Software Maintenance and Development Systems, Inc. 
Net address: jjmhome!smds!rh Phone: 508-369-7398 US Mail: SMDS Inc., PO Box 555, Concord MA 01742 This sentence no verb. This sentence short. This signature done.
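[Editor's note] Harter's speculative design - typed data streams, a library of elements that operate on them, and connectors specific to those types - can be sketched with generator functions. This is a minimal illustration written for this digest; all names here are invented, not code from the post:

```python
# Each "component" declares, in its docstring, the type flowing through
# its input and output channels; composing them is the analogue of a
# UNIX pipeline, but each connector carries one data type, not raw bytes.

def read_words(lines):
    """Component: stream of str lines -> stream of str words."""
    for line in lines:
        for word in line.split():
            yield word

def lengths(words):
    """Component: stream of str -> stream of int."""
    for w in words:
        yield len(w)

def total(numbers):
    """Sink: stream of int -> a single int."""
    return sum(numbers)

def pipeline(source):
    """Composite program: the connection diagram as nested calls."""
    return total(lengths(read_words(source)))

if __name__ == "__main__":
    text = ["this sentence no verb", "this sentence short"]
    print(pipeline(text))  # prints 35, the total of all word lengths
```

Because each stage consumes and produces one declared type, stages can be rearranged or swapped without any stage knowing its neighbors' internals - the reuse-killing coupling Harter describes, where one routine must know what another expects, never arises.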
kambic@iccgcc.decnet.ab.com (George X. Kambic, Allen-Bradley Inc.) (04/26/91)
In article <1991Apr17.175106.5581@hilbert.uucp>, jeff@hilbert.uucp (Jeff Freedman) writes: > In article <STEVE.91Apr15150739@diana.Advansoft.COM> steve@Advansoft.COM (Steve Savitzky) writes: >> brnstnd@kramden.acf.nyu.edu (Dan Bernstein) writes: >I'd like to believe that the best software being written > nowadays is at companies where the programmers are considered to be authors, > rather than engineers; and that they are judged by their ability to design and > write code, rather than their ability to spout jargon and write memos. Novels have critics and readers with opinions. Software must unequivocally meet requirements at some point or it "don't fly". You need engineers for that. GXKambic standard disclaimer
jeff@hilbert.uucp (Jeff Freedman) (04/28/91)
In article <4365.2816ec94@iccgcc.decnet.ab.com> kambic@iccgcc.decnet.ab.com (George X. Kambic, Allen-Bradley Inc.) writes: >Novels have critics and readers with opinions. Software must unequivocally >meet requirements at some point or it "don't fly". You need engineers for >that. Much software also has critics and users with opinions. And novels do have to meet some requirements, such as decent grammar and coherency. Perhaps we're arguing from different ends of the industry. I would probably agree that the embedded software controlling an anti-lock braking system is closer to "engineering", while a fantasy role-playing game is closer to "art". -- Jeff Freedman
kambic@iccgcc.decnet.ab.com (George X. Kambic, Allen-Bradley Inc.) (04/30/91)
In article <1991Apr27.231303.14133@hilbert.uucp>, jeff@hilbert.uucp (Jeff Freedman) writes: > In article <4365.2816ec94@iccgcc.decnet.ab.com> kambic@iccgcc.decnet.ab.com (George X. Kambic, Allen-Bradley Inc.) writes: > Much software also has critics and users with opinions. And novels do have > to meet some requirements, such as decent grammar and coherency. Perhaps > we're arguing from different ends of the industry. I would probably agree > that the embedded software controlling an anti-lock braking system is closer > to "engineering", while a fantasy role-playing game is closer to "art". Maybe we are, though I don't think we're too far apart. Novels and bridges do not appear to be adequate models to explore these issues. Since I have spent a lot of time testing software, I am very concerned about its robustness and error tolerance under any circumstances. I want that sucker to work and be easily repairable in the maintenance life cycle. BUT, it must have the quality factors of usability, etc., that make customers salivate when it is demonstrated to them so they buy right then and there. Fundamentally the process is a continuum wherein each person involved in the creation must respect the opinion of others who are also involved. Each segment, marketing, sales, engineering, is fundamentally responsible for measuring their part of the process and improving it. This includes, if you will, "artistic" influences that may form the external appearance of the HMI. A word appeared recently in another context that is applicable, and that word is discipline. There are many disciplines required of us to properly manufacture software. There is no one sole source of knowledge or ideas for making the product better. Also, self discipline is required; that discipline that makes each one of us do the finest work possible, measure its quality, self criticize it, and start again. GXKambic No time to think up a stunning disclaimer
cole@farmhand.rtp.dg.com (Bill Cole) (05/03/91)
Jim Showalter replies to Bill Cole: |> >> If we were strictly engineers, |> >>we could go down a catalog of routines and, by cleverly sticking them |> >>together, build a program. |> |> Isn't that what we DO? We gather together small chunks (e.g. various |> library routines) and big chunks (e.g. X-Windows), and paste together |> a system. At least, that's what a software engineer does--I've known |> a lot of hackers who want to start from a blank sheet of paper each time... Yup, we use lots of 'off the shelf' stuff and almost all of it has lots of bugs in it. But we design and implement the interfaces ourselves. It may be that you and I write perfect software (well, I don't know about you....), but we have virtually no control of that commodity stuff. Unlike the CE who's building a bridge and knows the strength of his materials and can reject materials that are sub-standard. |> >>The difference is that programmers are |> >>tasked to build new components if they can't find one in their own catalog |> >>of program components. |> |> As are all other engineers. Consider four-wheel steering. It didn't exist |> until a few years ago. Someone had to design it. It was built out of as |> many low-level units as possible (e.g. universal joints), but it was still |> a brand-new mechanism for automobiles. I can think of examples for any |> engineering discipline you'd care to name. My point was that many 'software engineers' believe that they are the only one capable of some particular feat even though that same feat may have been accomplished hundreds of times before. The difference is in the frame of mind the engineer brings to the project. |> >How many CEs build bridges out of self-designed |> >>components? |> |> If bridges are not very creative avenues for civil engineers, it is because |> civil engineering is a more mature discipline than software engineering. It |> is NOT because there is some fundamental difference between the two. 
Consider |> this: until the 1800's, HALF of all bridges built fell down. Why? Because the |> principles of civil engineering were not well-understood, and so much of it |> was empirical and artistic: more craft than science. It is different now, but |> I bet for a while there were members of the old guard who protested loudly |> that civil engineering was not a science and never could be...just like hackers |> proclaim loudly now. And half of the bridges stayed up. We don't understand our discipline well enough to state many 'well-defined' principles. I've seen too many problems surface because the software engineer didn't deal with situations that "couldn't" happen, even though they did. /Bill We agree more than it might appear.