jlh@loral.UUCP (The Mad Merkin Hunter) (04/27/88)
I seem to have inherited a program that is, ahem, not in good shape.
For example, there was a '#define OK 0' at the top of the code, and
some of the routines actually used it.  BUT, in addition to OK, it
used 0, FALSE, and NULL interchangeably!  I couldn't believe it!

I mentioned that this was not a good thing to do and the response was
"Why not?  I know it's 0 because I set it to 0!".  I'm stumped.  To me
the answer is so obvious I can't think of anything better than
"because".  So what do I tell this person?

Oh yeah, the rest of the program.  Well, it took 2 weeks to convert it
from microcrap C 4.0 to 5.0.  Lots of little things, like version 4.0
evidently returned 0 (or NULL or FALSE or OK or whatever) when a
function didn't have an explicit 'return(0)' at the bottom, but 5.0
correctly returns random numbers.  Is hara-kiri still honorable?

Jim
--
Jim Harkins    Loral Instrumentation, San Diego
{ucbvax, ittvax!dcdwest, akgua, decvax, ihnp4}!ucsd!sdcc6!loral!jlh
friedman@uiucdcsb.cs.uiuc.edu (04/28/88)
Jim Harkins (jlh@loral.UUCP) writes:
> I seem to have inherited a program that is, ahem, not in good shape.
> For example, there was a '#define OK 0' at the top of the code, and
> some of the routines actually used it.  BUT, in addition to OK, it
> used 0, FALSE, and NULL interchangeably!  I couldn't believe it!
> I mentioned that this was not a good thing to do and the response was
> "Why not?  I know it's 0 because I set it to 0!".  I'm stumped.  To me
> the answer is so obvious I can't think of anything better than
> "because".  So what do I tell this person?

Tell the person:

(1) It's bad because it's confusing to read, due to the inconsistency.
You aren't writing just for the compiler's benefit, but for the humans
who come after you and have to read, understand, and modify the
program.

(2) It's bad because, in the rare circumstance that one wanted to
change the value of "OK", one might reasonably expect to change the
one #define and presume that would get the job done.  The inconsistent
usage would destroy that reasonable expectation.

H. George Friedman, Jr.
Department of Computer Science
University of Illinois at Urbana-Champaign
1304 West Springfield Avenue
Urbana, Illinois  61801

USENET:  ...!{pur-ee,ihnp4,convex}!uiucdcs!friedman
CSNET:   friedman@a.cs.uiuc.edu
ARPA:    friedman@a.cs.uiuc.edu
jk@hpfelg.HP.COM (John Kessenich) (04/29/88)
Analytical people often work under a fundamental axiom: given two
methods that achieve the same result, the simpler method is the better
(more correct) one.  There are lots of rational justifications for
this, but the fact that it is so fundamental can leave you speechless
when it is challenged.

So which is simpler: to use macros based on their names, or to keep
track of what they are defined as and keep checking that each use is
semantically correct?  (It is probably simpler for the programmer not
to have to simulate cpp while writing code.)

John Kessenich