tomr@dbase.A-T.COM (Tom Rombouts) (03/20/91)
I suppose this might be an old question, but I have not read this group in a while, save for 250 postings just now.  Essentially, I am seeking some analysis or critiques of C++ compared and contrasted with both C and other OOP languages such as Eiffel, Smalltalk, and possibly Modula/Oberon.  (I have one already, "Why Use C++?", presented by Paul Gross of Borland at the Software Development '91 conference in February.)  References to major publications would be preferred, although we can track down obscure things eventually.

To start things off, here are some comments by a knowledgeable friend of mine, although it is not my intent to start a long thread on the pros or cons of C++:

1. A lack of regular C tools such as lint for C++.  (This is obviously only a temporary problem.)
2. Difficulty of mixing and matching classes from different vendors/sources.
3. Great expertise required to properly build classes from scratch.
4. Certain things (he used the example of a stack class) just have too much overhead and must be done in "regular" C anyway to get adequate performance.

Again, I apologize if such references already exist in some sort of FAQ file or are often discussed.

Tom Rombouts   Torrance 'Tater   tomr@ashtate.A-T.com   V:(213)538-7108
steve@taumet.com (Stephen Clamage) (03/24/91)
tomr@dbase.A-T.COM (Tom Rombouts) writes:

>To start things off, here are some comments by a knowledgeable
>friend of mine...

I think your friend is less knowledgeable than he thinks he is.

>2. Difficulty of mixing and matching classes from different
>vendors/sources.

How is this less of a problem in C?  In some OO languages, everything is derived from the mother of all super-classes, which makes integration of new classes easier (except for name clashes).  C++ deliberately does not do this, because runtime efficiency was considered a more important goal.  (You can choose your own goals.)

>3. Great expertise required to properly build classes from scratch.

True in any language at any time.  It is even harder in languages which do not have data structuring.  Classes actually simplify the process, as you can properly modularize your design.

>4. Certain things (he used the example of a stack class) just
>have too much overhead and must be done in "regular" C anyway
>to get adequate performance.

False.  C++ typically involves less overhead to accomplish the same task than C, especially if you are using a compiler which goes directly to object code rather than via C intermediate code.  I cite as an example a project of our own of about 30,000 lines of source code which we are converting from C to C++.  We have converted all the string processing to use carefully designed string and storage-management classes, and achieved a 20% reduction in both source and object code size.  We haven't measured the runtime improvement precisely, but the code is faster.  The operations we do could be simulated in C code, but the C code would be very hard to understand, use, debug, and modify.  The C++ code is very straightforward.

I challenge your friend to write a stack class in conforming ANSI C which I cannot write at least as efficiently in C++.  (He can't win, since his program will also be a C++ program, possibly with minor adjustments.)
But I think my C++ program will be more reliable and easier to use than the C program he writes, and at least as efficient.
--
Steve Clamage, TauMetric Corp, steve@taumet.com
mikeg@c3.c3.lanl.gov (Michael P. Gerlek) (03/25/91)
In article <633@taumet.com> steve@taumet.com (Stephen Clamage) writes:

>C++ typically involves less overhead to accomplish the same task than
>C, especially if you are using a compiler which goes directly to
>object code rather than via C intermediate code.  I cite as an example
>a project of our own of about 30,000 lines of source code which we are
>converting from C to C++.  We have converted all the string processing
>to use carefully designed string and storage-management classes, and
>achieved a 20% reduction in both source and object code size.  We
>haven't measured the runtime improvement precisely, but the code is
>faster.  The operations we do could be simulated in C code, but the C
>code would be very hard to understand, use, debug, and modify.  The
>C++ code is very straightforward.

I've just started playing with C++, and keep hearing that "they say" C++ won't cut it for codes where performance issues are critical, so I'm happy to hear "they" may be wrong.  So...

Leaving aside issues of how hard the C code would be to understand, maintain, debug, etc., does anyone have any details or references on just how much more efficient C++ code is than C?  Runtime performance, source/object code size, and so on?

Do Stephen's claims really hold true using a compiler that *does* go through intermediate code?  At first blush this doesn't seem intuitive.  (A "tight" piece of C++ code has to be translated into a "tight-with-overhead" piece of C code.  What does the "overhead" consist of?)

Finally, Stephen says his string processing uses "carefully designed" management classes.  How "careful" need one be in order to exceed the source and object code size of a traditional C program?

I'm sure this is largely FAQ stuff, but any pointers would be appreciated.
--
-[mpg]   mikeg@lanl.gov   "Hell on (two) wheels."
black@blake.u.washington.edu (Jim Black) (03/25/91)
I've been using C++ for a good while now, and I like it a lot more than C.  I find I'm getting things done faster and faster in C++ as time goes on, especially compared with doing them in C.  Constructors and destructors, for one, provide so much more support for getting things done in one place, versus exploding interfaces in C and having error-prone open/close and start/end procedure calls floating all over.

As for performance, as you learn what the language is doing, I don't think there's any reason that C++ is inherently weaker/slower than C - though you can naively code powerful and inefficient constructs that look simple and do too much work ... something you couldn't do in C.

Anyway, by far the biggest criticism I would level at C++ is that it's a hard language to learn - much harder than C to become expert at.  I've heard others on the net echo this "steep learning curve" experience.  I won't go back to C, but I wouldn't underestimate this cost either.

Also I find its lack of support for type-tags awkward, especially for persistent objects (encouraging an impedance-mismatch problem between objects in memory and objects on disk); and the fact that all virtual functions must take the same base-class argument types is awkward too (instead of more specialized versions of the same arguments, for example).  Lack of exceptions is a bitch - but that's going away, I hope...  (If anyone has written some preprocessor tools to implement C++ exceptions and templates as in the ARM for the short term, till compiler support is available - please let me know!)
--
Jim Black (black@blake.u.washington.edu)
rfg@NCD.COM (Ron Guilmette) (03/25/91)
In article <MIKEG.91Mar24104126@c3.c3.lanl.gov> mikeg@c3.c3.lanl.gov (Michael P. Gerlek) writes:

>I've just started playing with C++, and keep hearing that "they say"
>C++ won't cut it for codes where performance issues are critical, so
>I'm happy to hear "they" may be wrong.  So...

Well, you know what they say: "All generalizations are false... including this one."  :-)

>Leaving aside issues of how hard the C code would be to understand,
>maintain, debug, etc., does anyone have any details or references on
>just how much more efficient C++ code is than C?  Runtime performance,
>source/object code size, and so on?

When the issue of C vs. C++ performance comes up, the reference most often cited seems to be:

	A C++ Interpreter for Scheme
	by Vincent F. Russo & Simon M. Kaplan
	In: USENIX 1988 C++ Conference Proceedings

The authors claim that they converted a system (the Scheme interpreter) from C to C++ and it actually got faster.

As one more (albeit less relevant) data point, my protoize program is written in that strange dialect which lies in the intersection of ANSI C and C++, so it can be compiled with an ANSI C compiler, with cfront (and then a C compiler), or with g++.  Once I tried to compile it all three ways, and then ran `size' on the resulting binaries, only to find that the results were virtually identical in all three cases.  That's pretty good, because it indicates that in C++ you don't really pay any price for features that you don't use.

>Do Stephen's claims really hold true using a compiler that *does* go
>through intermediate code?  At first blush this doesn't seem
>intuitive.  (A "tight" piece of C++ code has to be translated into a
>"tight-with-overhead" piece of C code.  What does the "overhead"
>consist of?)
If you declare virtual functions, you pay the price for those in terms of space, because you get space allocated to hold virtual function pointer tables; but if you *use* the virtual functions that you declare, you may be regaining that space elsewhere in your program.

>Finally, Stephen says his string processing uses "carefully designed"
>management classes.  How "careful" need one be in order to exceed the
>source and object code size of a traditional C program?

I think that you got it backwards.  If you are *sloppy*, you will "exceed" the source and object code sizes of an "equivalent" C program.
--
// Ron ("Shoot From The Hip") Guilmette
// Internet: rfg@ncd.com   uucp: ...uunet!lupine!rfg
// New motto:  If it ain't broke, try using a bigger hammer.
chip@tct.uucp (Chip Salzenberg) (03/27/91)
According to black@blake.u.washington.edu (Jim Black):

>Also I find its lack of support for type-tags awkward...

Anything a type tag can do, a virtual function can do; in fact, most implementations of type tags would simply be compiler-generated virtual functions.  A compiler-provided implementation would be of little utility, and its lack is not a serious defect in the language.

>... and the fact that all virtual functions must take the same base-class
>argument types is awkward too (instead of more specialized versions of the
>same arguments, for example).

If you want or need to change an argument in an overriding virtual function, then it is obvious that you're not simply providing a new implementation of the _same_ function.  So if this restriction appears to be a problem, blame your design.
--
Chip Salzenberg at Teltronics/TCT   <chip@tct.uucp>, <uunet!pdn!tct!chip>
"All this is conjecture of course, since I *only* post in the nude.
 Nothing comes between me and my t.b.  Nothing." -- Bill Coderre
jjb@hardy.u.washington.edu (Jim Black) (03/27/91)
In article <27EF838D.4115@tct.uucp> chip@tct.uucp (Chip Salzenberg) writes:

>According to black@blake.u.washington.edu (Jim Black):
>>Also I find its lack of support for type-tags awkward...
>
>Anything a type tag can do, a virtual function can do; in fact, most
>implementations of type tags would simply be compiler-generated virtual
>functions.  A compiler-provided implementation would be of little
>utility, and its lack is not a serious defect in the language.

You say this all the time, Chip.  Others have tried to explain to you the problems encountered with persistent types here, and you don't seem to get it.  Try to snapshot an object and bring it back to life in a different address space.  The virtual function table pointer is meaningless at that point, because it points into the wrong address space.  You want some way of knowing the type of the object - SPECIFICALLY TO RESTORE THE VIRTUAL FUNCTION TABLE POINTER!

Some current solutions to this problem, like that in the nihcl library, require every new class derived from Object to recurse to its bases to store, call its members to store, and store its own state.  It's a mess, it's inefficient, and it's too much work.  I want to just snapshot the whole "new"'d range of memory and be able to reinstantiate it at a later time - to do that I have to build type tags on top of the language, and this is more awkward than if they were provided.

>>... and the fact that all virtual functions must take the same base-class
>>argument types is awkward too (instead of more specialized versions of the
>>same arguments, for example).
>
>If you want or need to change an argument in an overriding virtual
>function, then it is obvious that you're not simply providing a new
>implementation of the _same_ function.  So if this restriction appears
>to be a problem, blame your design.

	class B            { virtual int operator<=(B&); };
	class D : public B { virtual int operator<=(D&); };  // hides, doesn't override

I dunno, call this bad design, but I don't think so.
Right now, class D must instead code:

	class D : public B { virtual int operator<=(B&); };

and typecheck the parameter to a D& (or blindly cast and damn the torpedoes).  If class D is going to typecheck the argument, we're back to implementing type tags on top of the language.

Anyway, in many cases, class D WANTS B's implementation (the virtual function table mechanism) to call D::operator<= -- i.e., D is simply derived from B, a specialization, and what worked for B's will work for D's, provided they're type-consistent.

You can use templates to achieve this today, at the cost of losing plug-and-play over some abstract base.  Sometimes that's okay, sometimes it's unacceptable.