crowl@cs.rochester.EDU (Lawrence Crowl) (08/20/87)
In article <997@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>I think that for too long language designers have tried to accommodate the
>vagaries of EBCDIC, specifically, that it's a character set with holes.

I think that for too long language designers have tried to accommodate the
vagaries of ASCII.  It has 'Z' < 'a', of all things!  Not only that,
'A' <= z && z <= 'z' allows non-letter characters.  The ISO Latin-n standards
place various modified letters above delete!  Clearly kludgy.

>Must every language accommodate every whim and fancy of every badly-designed
>architecture?  The answer ought to be a resounding no.

I assume this applies to character sets also.  Since you are talking about
EBCDIC, are you implying the 360 architecture was badly designed?  This claim
will need VERY good arguments to override (almost) 25 years of success.

>The general principle is quite simple: Those who choose to use badly-designed
>machines should have to bear the burden of doing so without dragging the rest
>of us down with them.

Good, let's throw out all those bank accounts stored in EBCDIC.  Let's hunt
down and destroy all those floppies with (gasp) ASCII-encoded memos.

>The six-character limit on external names (I hope it's gone from the draft
>now) is another example of not applying this philosophy.  Why does it make
>any sense for the rest of the world to have to suffer from the limitations
>of old linkers on old machines from the sixties?

Well, if we just throw out C, a sixties language with many limitations, we
have solved the problem, haven't we?  Throwing out C would also reduce the
net traffic.  After all, we have the programming language of the future
waiting for us, Ada.  (Well, it at least dates from the seventies.)

>We should certainly make an effort to accommodate all widely-used
>architectures, but not at the expense of seriously distorting the language
>for everybody else.

Is it the architecture's fault or C's fault that C needs some distorting?
Remember, very few people have to distort Lisp to put it on a new
architecture.
--
  Lawrence Crowl                716-275-8479    University of Rochester
                      crowl@cs.rochester.arpa   Computer Science Department
...!{allegra,decvax,seismo}!rochester!crowl     Rochester, New York, 14627
dhesi@bsu-cs.UUCP (Rahul Dhesi) (08/21/87)
I asked, among other things:

>>Must every language accommodate every whim and fancy of every badly-designed
>>architecture?

Lawrence Crowl responds with a lot of sarcasm that's worth ignoring, but one
clichéd misconception deserves response:

>I assume this applies to character sets also.  Since you are talking about
>EBCDIC, are you implying the 360 architecture was badly designed?  This claim
>will need VERY good arguments to override (almost) 25 years of success.

The misconception here is that a broad user base implies high quality or
elegance of design.  Instead of offering VERY good arguments, I will simply
offer three counterexamples without further comment.

1.  The 8086 family of CPUs versus the 680x0 family of CPUs
2.  The National Enquirer versus the Wall Street Journal
3.  Family Feud versus the MacNeil/Lehrer Report
--
Rahul Dhesi         UUCP:  {ihnp4,seismo}!{iuvax,pur-ee}!bsu-cs!dhesi
mjr@osiris.UUCP (Marcus J. Ranum) (08/21/87)
In article <8915@brl-adm.ARPA>, crowl@cs.rochester.EDU (Lawrence Crowl) writes:
> I assume this applies to character sets also.  Since you are talking about
> EBCDIC, are you implying the 360 architecture was badly designed?  This
> claim will need VERY good arguments to override (almost) 25 years of
> success.

Stone hammers, along with flint knives, showed more success (in years) than
EBCDIC, but nobody uses them anymore.  Only the trailing edge of technology
still supports the 360 architecture...  Arguing that your flint axe has had
"2000 years of success" is not going to change the fact that the times have
changed.  Do you also favor laser-optical card reader technology?

> If we just throw out C, a sixties language with many limitations, we have
> solved the problem, haven't we?  Throwing out C would also reduce the net
> traffic.  We have the programming language of the future waiting for us,
> Ada.  (Well, it at least dates from the seventies.)

Future?  Yeah, Ada will be with us a long time, because the DoD has said it
doesn't want anything else.  I hope you're not going to argue that Ada's
popularity is an indicator of how good it is... "Gee, Ada must be great!!
EVERYONE in the DoD uses it!!  WOW!!"  Your logic is not very good.

Throwing out 'C' is not a bad idea.  I will, especially if it gets junked up
too badly by all the people who want it to run on their machines, under
their braindead architectures.  I suppose next the FORTRAN programmers will
be asking that 'C' support a set of FORTRAN intrinsics, for compatibility...

>>We should certainly make an effort to accommodate all widely-used
>>architectures, but not at the expense of seriously distorting the language
>
> Is it the architecture's fault or C's fault that C needs some distorting?
> Very few people have to distort Lisp to put it on a new architecture.
This is due to the amazing self-distorting nature of Lisp :-)  In all
seriousness, from what I gather, there are as many versions of Lisp out
there as there are of 'C'...

--mjr();
--
If they think you're crude, go technical; if they think you're technical,
go crude.  I'm a very technical boy.  So I get as crude as possible.  These
days, though, you have to be pretty technical before you can even aspire to
crudeness...                                           -Johnny Mnemonic
henry@utzoo.UUCP (Henry Spencer) (08/23/87)
> I think that for too long language designers have tried to accommodate the
> vagaries of ASCII.  It has 'Z' < 'a', of all things!  Not only that,
> 'A' <= z && z <= 'z' allows non-letter characters.  The ISO Latin-n
> standards place various modified letters above delete!  Clearly kludgy.

Try coming up with an internationally-portable alternative.  The plain fact
is that there is *no* way to do sophisticated sorting using just the native
collating sequence, no matter what that collating sequence is.  Quite apart
from international differences in collating sequence (including non-trivial
complications like letter pairs that sort as if they were one letter), any
sophisticated sort is going to have its own requirements for treatment of
things like white space, abbreviations, etc.  Changing the collating
sequence because "it makes sorting harder" is like changing it because the
bottom bits of the sequence "0123456789abcdef" aren't the sixteen hex
digits.  It means turning the world upside down to "solve" a problem that
will have to be solved the hard way *anyway*.

> ... are you implying the 360 architecture was badly designed?  This claim
> will need VERY good arguments to override (almost) 25 years of success.

Nonsense.  Good design has only the most limited correlation with commercial
success.  IBM's strength is its marketing people, not its designers.  Are
*you* implying that the 8088 architecture was *well* designed?!?

> Remember, very few people have to distort Lisp to put it on a new
> architecture.

Yeah, because they're distorting it to suit their own ideas already -- name
three Lisp implementations that accept *exactly* the same language! :-)
--
Apollo was the doorway to the stars.  |  Henry Spencer @ U of Toronto Zoology
Next time, we should open it.         | {allegra,ihnp4,decvax,utai}!utzoo!henry
minow@decvax.UUCP (Martin Minow) (08/24/87)
Several colleagues comment on EBCDIC.  Here is my contribution:

1.  The main problem with EBCDIC is the lack of a common standard for the
    character semantics.  Several years ago, the ANSI C committee
    distributed at least three incompatible EBCDIC "standards."

2.  I don't believe that EBCDIC has been standardized by ANSI -- i.e., its
    contents are under the control of a single manufacturer.  Of course,
    this might be an advantage.  Unfortunately (as noted above), this has
    resulted in unspecified characters being added in different places.

3.  I used to think that the fact that the alphabet was discontiguous was a
    problem.  Unfortunately, ISO Latin 1 added two graphic characters
    (multiply and divide) in the middle of the "right-hand" alphabetic
    portion.

4.  Some of the strangeness of EBCDIC resulted from its being an evolution
    of an earlier (BCD) standard, and from the close connection of both
    standards to punch card codes.

5.  Before berating EBCDIC for problems in sorting, it should be noted that
    there is no commonly accepted standard for character order.  For
    example, in Swedish the letters run a..z, a-ring, a-two_dots,
    o-two_dots, while in Danish a-ring follows o-two_dots.  Any
    implementation that needs to order text alphabetically must apply
    fairly sophisticated procedures that are both language- and
    country-specific.  ANSI C attempts to solve this by adding localization
    macros and the strcoll() function.

Martin Minow
decvax!minow
artm@phred.UUCP (Disaster Master) (08/24/87)
In article <8915@brl-adm.ARPA> crowl@cs.rochester.EDU (Lawrence Crowl) writes:
>In article <997@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>>I think that for too long language designers have tried to accommodate the
>>vagaries of EBCDIC, specifically, that it's a character set with holes.
>
>I think that for too long language designers have tried to accommodate the
>vagaries of ASCII.  It has 'Z' < 'a', of all things!  Not only that,
>'A' <= z && z <= 'z' allows non-letter characters.  The ISO Latin-n
>standards place various modified letters above delete!  Clearly kludgy.

Wait a minute... by "full of holes" are we referring to the code itself, or
to its original medium?  Rahul, you're such a card :-)
................................................................................
My employers only concern themselves with the opinions they pay me for, and
this definitely wasn't one of them.
................................................................................
Art Marriott            ...uw-beaver!tikal!phred!artm
henry@utzoo.UUCP (Henry Spencer) (08/24/87)
> 2.  I don't believe that EBCDIC has been standardized by ANSI -- i.e.,
>     its contents are under the control of a single manufacturer...

There is an ANSI EBCDIC, I believe, but everyone including IBM ignores it.
The Unix "dd" command may be the only real implementation of it!
--
Apollo was the doorway to the stars.  |  Henry Spencer @ U of Toronto Zoology
Next time, we should open it.         | {allegra,ihnp4,decvax,utai}!utzoo!henry
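[For the curious, an editor's note on the dd conversion Spencer mentions:
dd carries built-in ASCII/EBCDIC translation tables behind its conv=ascii,
conv=ebcdic, and conv=ibm options (the latter two differ in a handful of
characters, which is itself a symptom of the multiple-EBCDICs problem).
A minimal round-trip sketch, assuming a dd with these POSIX conversions:]

```shell
# Translate a short ASCII string into EBCDIC and back again.
# The intermediate bytes are EBCDIC; the round trip restores ASCII.
printf 'HELLO' | dd conv=ebcdic 2>/dev/null | od -An -tx1
printf 'HELLO' | dd conv=ebcdic 2>/dev/null | dd conv=ascii 2>/dev/null
echo
```

Which EBCDIC the table encodes is exactly the question: conv=ebcdic and
conv=ibm disagree, and neither necessarily matches any given mainframe.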