speed@teklds.UUCP (05/30/84)
I think everyone in this group will get a kick out of the following letter, which appears in the Reader's Comment section of the May 31 issue of Electronics magazine.

	C is the worst

	To the editor: I could not disagree more with your article on the C
	programming language ("The power and the portability," April 19, p. 152).
	Why does "i+=2" convey more information than "i=i+2"? It saves two
	characters, but so what -- it just makes the code harder to read.

	C has other confusing operators, too. For instance, in some cases "&"
	indicates "address of"; in others, "bit-wise addition"; and in still
	others, "bit-wise intersection and increment."

	In these days of constantly plunging memory costs, why don't we stop
	trying to save every last bit and start going back to the user interface
	and the human being it serves? Let's see programs written in English or
	its well-known derivative, the Basic programming language. It's readable,
	and compilers make it run faster than interpreters can.

	C is without doubt the worst language that I have ever had to work with.
	The whole point of writing programs is twofold. First, we want to control
	the computer. Second, we want to be able to read the program back in the
	future, when we have forgotten it or when someone else wants to read it.
	We should therefore try to convey as much meaning as possible in each
	line of code instead of boiling lines down to the point of
	unintelligibility. After all, a programmer's time is more expensive than
	memory.

The letter is signed by a person from a firm in Anaheim, CA, but I will not repeat the name here in order to provide some level of protection to the 'quote' innocent 'unquote'.