mike@hcr.UUCP (Mike Tilson) (07/12/89)
This is quite long, and concerns pricing and licensing more than it concerns technology. Press "n" now if not interested.

Various readers of this group have commented unfavorably about AT&T's new pricing policy. As our firm has recently entered the ranks of C++ compiler vendors by selling an enhanced version of AT&T's Release 2.0 product, I have been watching this discussion with interest. The complaints can be summarized as follows:

1. The price is too high.
2. AT&T could have "owned" the C++ market with a lower price.
3. There is no good way to license a network.
4. The price change was a big surprise.
5. "The price is too high, let's all get behind GNU G++, except it's not practical for me to do this...."

I'd like to comment on each, and end with a few comments about our own C++ activity.

1. The price is too high.

(Note: I can't speak for AT&T; these are simply my opinions.) The first release of C++ was as an unsupported research product. In other words, "here it is, make of it what you will, and we'd like a little money for it." Release 2.0, on the other hand, is being treated like a product. There has been a development effort, followed by an extensive beta testing effort with multiple companies over an extended period.

For a compiler product, $20,000 for source is not actually that large a price. For comparison, commercial UNIX source is now close to $100K (e.g. for 386 UNIX source). Yes, both used to be cheaper, but how do these prices compare to what other companies charge? Most major commercial compilers don't even sell source! Anyone seen a list price for Microsoft C source code lately? What do VMS compilers cost for source, or VMS itself, for that matter? To its credit, AT&T offers considerable educational discounts for source. This is not often done by other vendors.

One thing people haven't considered in their calculations is that AT&T also offers binary sublicensing rights. A large site can get the source code it needs for porting or modification, and then distribute MUCH LOWER COST binary copies to the machines that need them. If you have 500 workstations using C++, just how many of them are involved in modifying the compiler? One hopes that the main reason for using C++ is to *USE* C++, and not to hack the compiler. Just a few source copies should suffice for most organizations that need source -- most users on most workstations can get by just fine with a binary copy, and this will reduce the price a lot. (See AT&T licensing for details of binary licensing.)

Finally, if C++ is going to be successful, then commercial companies will be selling "shrink-wrap" packaged products -- pre-built, tested, installable, with documentation, with added value, and with support. At the moment, AT&T is licensing technology. Product vendors will take the technology and build end-user products for various platforms. This is really what most users want. Only a few want to hack the compiler source (and only a few are qualified to do so with any probability of success). Most users want to take something off the shelf and have it work. The AT&T licensing policies seem to be aimed at compiler vendors and OEMs rather than end users. I believe this will benefit the end users, because it will spawn a highly competitive market. Users still have more choices than they do with most compiler products, because they too can license the base technology if they wish. It just isn't free.

After much urging from the industry, AT&T has separated its UNIX Software Operation from its computer business.
USO must now make a profit without being cross-subsidized by the AT&T computer business. (The industry wanted this because of the suspicion that AT&T might be tempted to warp software product development in a direction calculated to help sell more 3B processors.) Almost all of the cost of software is "fixed" (e.g. R&D) rather than "variable" (e.g. cost of tape, printing manuals, shipping). People tend to compare the variable cost to the price and complain, conveniently ignoring the other costs.

2. AT&T could have "owned" the C++ market with a lower price.

Several have commented that AT&T could have swept the market by pricing C++ source so low that everyone would buy it in preference to any other solution. I hope a moment's reflection will convince people that giving AT&T a hammerlock on the C++ market might not be the best thing. Also, such a pricing strategy could be considered predatory and therefore illegal.

The character of the market is about to change. Early adopters of the C++ technology had to have source -- there was no other choice. But if C++ is truly an important technology, then a variety of vendors will support the product on various platforms with various degrees of added value, extra tools, etc. This will not happen if AT&T totally owns all aspects of the market.

3. There is no good way to license a network.

This is true. You have to buy a license for each machine you wish to use the software on. I'm not sure this is such a big problem. Nobody seems surprised that you have to pay to buy each machine on the network, but somehow buying the software seems to be a big problem. I think often the complaint boils down to "I wish it were cheaper/free." It *is* an administrative pain to keep track of licenses, and I agree that site/network licensing is desirable. Speaking from a software vendor's point of view, it isn't that easy to do -- any policy has to work right for all networks and all sites. We'd like to see AT&T do something here, but better no policy than a wrong policy.

4. The price change was a big surprise.

Yes, this seems poorly managed. A lot of people expected to buy Release 2.0 for a certain amount, and now don't have the funds in the budget. The Release 2.0 features and bug fixes are substantial, and there is a pent-up demand for things like multiple inheritance. AT&T never promised anything about Release 2.0 pricing, but they could have communicated better to their customers. I think some of the complaints are a result of the fact that people may now need to wait for binary vendors such as HCR and others to ship binaries on the appropriate platforms. This will not happen immediately on all platforms of interest. However, since licensees of C++ 1.2 can upgrade every previously licensed CPU for $10,000, I'm not sure this is such a big problem. If it isn't worth the price of another workstation, perhaps there is no economic justification for using the new language features. (Will the new features save your organization a couple of staff months over the next few years, e.g. a day or two per year per developer? If not, why bother, and why is it an issue? If so, then the expenditure is justified.)

5. "The price is too high, let's all get behind FSF GNU G++, except it's not practical for me to do this...."

The ultimate low-cost option is to get your software "for free." So why does anyone ever buy a C++ compiler, when they could have G++? (RMS says "good question!" at this point. :-)) The answer is that nothing is really free.
I am aware of a case where a local firm paid thousands of dollars to a consultant in order to get FSF compilers up and running on a common workstation. Since this firm has no compiler gurus, if they ever want an upgrade or if their base OS changes, etc., they'll probably pay the price again in the future. When you buy a commercial package, you have an expectation that you can "plug it in and go." This is worth something. In the long run it may be cheaper to buy a product than to get "free" software. If you have the skills and the time, "free" software may be for you, but make sure you've analyzed all the costs.

Do people really want the cost of software to go down to zero, or near zero? If it goes exactly to zero, then the expected commercial R&D investment in creation of new software products will go to zero as well. If it goes to "near zero", then a few companies that can achieve massive market dominance can make money on volume, but nobody else can. (I'll note in passing that FSF seems to have released only packages that are clones -- albeit good-quality, value-added clones -- of existing, already successful products, and with the exception of emacs everything being cloned is the result of development and marketing efforts carried on by profit-making, software-selling organizations. Perhaps this should tell us something. By the way, while I disagree fundamentally with the FSF world view, I do admire them for putting their money where their mouth is.)

As a software vendor, we are investing in product development in the hope of receiving a return on that investment. We think our investment will produce useful software; the market will tell us if we are right. We'll only get a return on our investment if we can charge a fee based on the use (e.g. number of copies, number of users, etc.) of our software. It is not practical or reasonable for us to follow the FSF model that all revenue is consulting revenue (or donations, although if you wish you may send us some! :-)) I think users benefit from an active, competitive, and profitable software industry.

What are we doing with AT&T Release 2.0 and C++ products? Our product has already been mentioned on the net. I don't want to make any sales pitch here, so I'll confine myself to some factual statements and some corrections to a posting made by <mark@intek01.UUCP>. This information is not intended as a product announcement, but rather as an illustration by example of what you might expect various compiler vendors to do on various platforms.

HCR/C++ has been announced as being available for UNIX System V.3.2 on 386 architectures. It is based on AT&T Release 2.0. We are shipping the complete AT&T product ported to the 386, including the task library. Every copy is shipped with another HCR product called dbXtra. This is a window-oriented debugger upward compatible with dbx. The version shipped with C++ cooperates with the compiler to debug using C++ names. Availability on platforms other than 386 UNIX V.3.2 has not yet been announced. The price on this platform is US$995 for the full package (C++ compiler, libraries, documentation, and debugger). The earlier posting implied dbXtra was "extra" -- it is included. Until Aug 31 the introductory price is US$499. Shipments commence this month. Talk to a salesperson for more details, volume pricing, etc., etc.

While it would be nice to have no competition :-) I'm sure other vendors will also be active with the Release 2.0 technology.
In addition, compilers not derived from AT&T technology will continue to be announced. I think C++ is here to stay, the main language features are stabilizing, and users are about to enjoy a competitive market.

/Michael Tilson {...!hcr!mike}
/HCR Corporation
/130 Bloor Street West, 10th Floor
/Toronto, Ontario, M5S 1N5
/CANADA
/Phone: 416-922-1937
/Fax: 416-922-8397

(Note: please direct HCR/C++ enquiries to our sales staff and not to me. Thanks.)
richard@pantor.UUCP (Richard Sargent) (07/13/89)
Well, I've listened to the discussions about the AT&T pricing policy, and I've come to a simple conclusion: AT&T does NOT want to sell compilers to everyone and their cousin!

How do I figure this? Well, take economics (please :-). If you jack up the price of your product 10-fold, you can expect a significant reduction in the sales of that product. Of course, this will depend on how strong the demand really is. If it is very strong, then AT&T can make a lot of money this way (happy stockholders!).

Now, I don't believe that everyone wants to own their own compiler source (especially not when they have to pay big $ for that privilege). So, I conclude that AT&T is trying to economically restrict sales to compiler companies rather than to any and all software development companies. I'm guessing now, but I expect AT&T will sell a lot fewer than 10% of the number of copies they sold when the price was 1/10. I guess they don't feel the support costs are worth the revenue.

Comments: please, flames: pass.

Richard Sargent                    Internet: richard@pantor.UUCP
Systems Analyst                    UUCP:     uunet!pantor!richard
jacob@gore.com (Jacob Gore) (07/13/89)
/ comp.lang.c++ / mike@hcr.UUCP (Mike Tilson) / Jul 11, 1989 /
> If [cost of software] goes exactly to zero, then the expected commercial
> R&D investment in creation of new software products will go to zero
> as well. ...
> As a software vendor, we are investing in product development in the
> hope of receiving a return on that investment. We think our investment
> will produce useful software; the market will tell us if we are right.
> We'll only get a return on our investment if we can charge a fee
> based on the use (e.g. number of copies, number of users, etc.) of
> our software.

Might it be that your R&D costs are much higher than they have to be because you do not use existing Free (as in "unchained" -- I wish the word wasn't overloaded in English) software? You are not the only company that has reinvented a window-based debugger for C++, you know. I, as a user, would certainly benefit more from your improvements if you could save yourself (and me) money on "developing" the parts of the debugger that have long been hashed out.

I think most people take the current economic "common sense" for granted, without examining what would happen to the R&D costs (which, I agree, do need to be recovered, one way or another) if the R&Ders could take advantage of work already completed by others. Sure, you can't make as much money selling a gcc-based compiler as you can selling your own (or one sublicensed for a fee and then modified). But it doesn't cost you as much to develop it, either. And you benefit from continuous improvements and fixes made to the compiler by other people.

In this new bright competitive world of multiple C++ vendors that you described, do you think they will all be helping each other with fixes to cfront? I think you will see less and less of that happening.

I think the most glaring example of that is Unix itself. When a company picks up a new Unix port, or sets it up on a new machine of theirs, the first thing they do is spend human-decades on fixing dormant bugs, nonportable code, and various limitations that they don't consider proper for a commercial OS. Most of those fixes are (or can be) of a general nature, and would improve Unix code for other machines as well. But do all (or any) of these fixes find their way back to AT&T or Berkeley? Not on your life. "We invested hundreds of thousands of dollars to make these fixes... uhm, I mean, Value Additions, so let our competitors do their own work!"

So now we have all these vendors out there providing Unix, and each one ended up doing the same work that the others did. What does that do to the prices that users see? That kind of competition is supposed to benefit users?

> I think users benefit
> from an active, competitive, and profitable software industry.

So do I. But instead, we get a software industry where each wheel is reinvented hundreds of times. What a waste of time and money.

--
Jacob Gore    Jacob@Gore.Com    {nucsrl,boulder}!gore!jacob
tom@elan.elan.com (Tom Smith) (07/13/89)
Mike Tilson accurately (in my opinion, of course) justified AT&T's strategy for the notorious C++ Release 2.0 pricing. However, many of us who have been using C++ since 1.2 (or earlier), either in source form or from commercial vendors, are still to a degree being hung out to dry. Here's why:

From article <1379@hcr.UUCP>, by mike@hcr.UUCP (Mike Tilson):
> 4. The price change was a big surprise.
> Yes, this seems poorly managed. A lot of people expected to buy Release
> 2.0 for a certain amount, and now don't have the funds in the budget.
> The Release 2.0 features and bug fixes are substantial, and there is a
> pent-up demand for things like multiple inheritance. AT&T never promised
> anything about Release 2.0 pricing, but they could have communicated better
> to their customers. I think some of the complaints are a result of the
> fact that people may now need to wait for binary vendors such as HCR
> and others to ship binaries on the appropriate platforms. This will
> not happen immediately on all platforms of interest.

For around a year now, the many people complaining about the profusion of bugs in Release 1.2 have been told "Fixed in 2.0". This implies that support of 1.2 by AT&T has ceased. It would be unreasonable to expect third-party compiler companies such as Glockenspiel or Oregon Software to fix problems in their distributions of Release 1.2 only to have it made obsolete by the 2.0 release when it becomes available; however, this introduces a support hiatus of at least a full year.

When 2.0 is released (any minute now, actually), most compiler companies will be just beginning their porting efforts (commendations to HCR for some foresight here). One would expect commercial C++ releases to be generally available 6-9 months down the road, depending on the individual company's release cycle and development time. Unusual machines will take somewhat longer.

Commercial product development using an "unsupported research" compiler is not for the faint of heart. Many companies were developing using Release 1.2 in anticipation of 2.0 availability in the near future. Now it seems that the expectation of an August release was misleading; the average development group will have to wait until around the first quarter of 1990 for a product-quality C++ compiler. The availability of source at a price accessible to the general public would have gone a long way toward closing that gap.

In a different vein, also from article <1379@hcr.UUCP>, by mike@hcr.UUCP (Mike Tilson):
> 3. There is no good way to license a network.
> This is true. You have to buy a license for each machine you wish to
> use the software on. I'm not sure this is such a big problem. Nobody
> seems surprised that you have to pay to buy each machine on the
> network, but somehow buying the software seems to be a big problem.
> I think often the complaint boils down to "I wish it were cheaper/free."

This is a big problem for a company that provides its product on many platforms, and is willing to do the porting itself (and swallow the source cost). Such a company typically has a set of development machines in roughly a one-to-one ratio with the developers, and a set of testing, or "porting lab", machines for QA and release distribution. There could easily be an order of magnitude more porting machines than developers, yet the number of compilers being used is really at most the number of people using them, not the number of machines.
This type of environment is quite common, and has caused a demand in recent years for "floating licensing", whereby a company pays for as many copies of a product as will be used at one time, plus a premium for the ability to float those copies from machine to machine. Perhaps a floating source-license scheme is called for here.

As always, these opinions are endorsed by no one other than myself.

Thomas Smith
Elan Computer Group, Inc.
tom@elan.com, {ames, hplabs, uunet}!elan!tom
jima@hplsla.HP.COM (Jim Adcock) (07/14/89)
Seems like we're going to have some people using 2.0, and a lot of people using 1.2, for most of the next year at least. And some people are going to be stuck indefinitely on 1.2, because they're on an unusual machine that doesn't get vendor support. I hate to see people who have been loyal early supporters of C++ get stuck like this. And I hate to see bus bandwidth tied up indefinitely in 1.2 vs 2.0 issues. Too bad, 'cuz it seems the 2.0 version of the language is much nicer.

I continue to believe the pricing issue will be a C++ lock-out for many, many people. I believe people have come to expect software upgrade costs of maybe $100, or 20% of the original purchase cost -- not tens of thousands of dollars. I think this represents a big setback for the C++ community. I guess I would have expected this of an ObjC, or an Eiffel.
jima@hplsla.HP.COM (Jim Adcock) (07/15/89)
Depends upon whose $10,000 [U.S.] you're spending. Release 1 cost $2000. Release 2, after 30 Sept, costs $20,000. What price Release 3.0? Gee, I take it back -- all of a sudden hack-cpp-parameterization looks a lot more attractive to me! :-)
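(For anyone who hasn't run into the term: "cpp parameterization" is the old trick of faking parameterized types with preprocessor macros, since the language has no templates. A minimal sketch of the idea -- the macro and class names below are invented for illustration, not AT&T's generic.h:)

    /* Sketch of macro-based "parameterized" containers, pre-template C++.
       DECLARE_STACK and Stack_##T are illustrative names only.           */

    #define DECLARE_STACK(T)                                      \
        class Stack_##T {                                         \
            T   items[100];                                       \
            int top;                                              \
        public:                                                   \
            Stack_##T() { top = 0; }                              \
            void push(T v) { items[top++] = v; }  /* no overflow check */ \
            T    pop()     { return items[--top]; }               \
        };

    DECLARE_STACK(int)      /* expands to a class named Stack_int    */
    DECLARE_STACK(double)   /* expands to a class named Stack_double */

    int main()
    {
        Stack_int    si;
        Stack_double sd;
        si.push(42);
        sd.push(3.14);
        return (si.pop() == 42) ? 0 : 1;
    }

It works after a fashion, but every mistake comes back as an error message about the expanded text, which is part of the joke.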
nosmo@eiffel.UUCP (Vince Kraemer) (07/17/89)
In article <6590196@hplsla.HP.COM>, jima@hplsla.HP.COM (Jim Adcock) writes:
> a big setback for the C++ community. I guess I would have expected this
> of an ObjC, or an Eiffel.

Ahem! Although I cannot speak for Stepstone, I feel compelled to respond to this and educate those who share this misconception about Interactive Software Engineering and our pricing policies. Currently, we are releasing a new version of Eiffel and LOWERING the prices. Just to give an idea, the Xenix version is $495 (basic system) and $795 (developer's toolkit, with the library and all the tools).

Net whisper:
> BUT YOU GUYS ONLY SELL A BINARY VERSION OF THE COMPILER.
> WE WANT SOURCE.

As Greg Minshall points out in message 773@kinetics.UUCP, there are only two reasons that one may want a source license:

1. To port to a new platform. We, like others, are in this business to make a profit. This is not to say that we are not receptive to others doing ports to systems on which they have more experience.

2. To fix bugs in the compiler or LIBRARY. Fixing bugs in the library has been pointed out by two other people as one of the major reasons why they had to choose a source license for C++ over depending on a third-party BINARY version of the C++ translator and libC (references: Jim Hughes, message 1477@ns.network.com, and Joseph Sacco, message 52@eileen.samsung.com).

Our library comes as SOURCE with the Eiffel system. There are no complicated legal strings attached to this library of classes. There are also no run-time royalties of any kind, even if you include a copy of our run-time system (garbage collector and all) in C form. With the current change in C++ pricing, I hope that some of you will take a second look at Eiffel.

I've never had to hate hack-cpp-parameterizations,
Vince Kraemer (nosmo@eiffel.com or ..!uunet!eiffel!nosmo)
clyde@hitech.ht.oz (Clyde Smith-Stubbs) (07/18/89)
From article <110001@gore.com>, by jacob@gore.com (Jacob Gore):
> / comp.lang.c++ / mike@hcr.UUCP (Mike Tilson) / Jul 11, 1989 /
>> If [cost of software] goes exactly to zero, then the expected commercial
>> R&D investment in creation of new software products will go to zero
>> as well. ...
>> [deletions]
>> We'll only get a return on our investment if we can charge a fee
>> based on the use (e.g. number of copies, number of users, etc.) of
>> our software.
>
> Might it be that your R&D costs are much higher than they have to be
> because you do not use existing Free (as in "unchained" -- I wish the word
> wasn't overloaded in English) software?

Where do you get Free (as in "unchained") worthwhile software? FSF software is most certainly not "unchained". As I read the General Public Licence, if I incorporate anything covered by it I may not make a profit by selling that new product. True "Public Domain" software (as in where the author has relinquished the copyright) is rare -- or maybe I don't know where to look? Can anyone enlighten me?

------------------------
Clyde Smith-Stubbs
HI-TECH Software, P.O. Box 103, ALDERLEY, QLD, 4051, AUSTRALIA.
ACSnet:  clyde@hitech.ht.oz        INTERNET: clyde@hitech.ht.oz.au
PHONE:   +61 7 300 5011            UUCP:     uunet!hitech.ht.oz.au!clyde
FAX:     +61 7 300 5246
tma@osc.COM (Tim Atkins) (07/19/89)
In article <185@eiffel.UUCP> nosmo@eiffel.UUCP (Vince Kraemer) writes:
> only two reasons that one may want a source license:
>
> 1. Port to a new platform.
>    We, like others, are in this business to make a profit. This is not
>    to say that we are not receptive to others doing ports to
>    systems on which they have more experience.
>
> 2. To fix bugs in the compiler or LIBRARY.

There is, of course, a third reason: namely, that one has a love of languages, a lack of patience with established vendors, and very strong ideas on how the existing languages should be extended and/or improved.

Of Objective C, C++, and Eiffel, Eiffel comes the closest to having the features I believe good OO programming demands. My desires for Eiffel source fall more into a fourth category: admiration of the language features and curiosity as to how they are implemented. Interesting additions to Eiffel would include constructs for shared and distributed objects, including concurrency control mechanisms. Much of the existing library, in particular any Collection type that includes the notion of a cursor, would have to change to support sharing.

My opinions, of course, are strictly my own.

- Tim Atkins
jeff@aiai.uucp (Jeff Dalton) (07/20/89)
In article <1379@hcr.UUCP> mike@hcr.UUCP (Mike Tilson) writes:
>[see below]

While you've made a number of valid points, I don't agree with everything you said. In particular:

>3. There is no good way to license a network.
>This is true. You have to buy a license for each machine you wish to
>use the software on. I'm not sure this is such a big problem. Nobody
>seems surprised that you have to pay to buy each machine on the
>network, but somehow buying the software seems to be a big problem.
>I think often the complaint boils down to "I wish it were cheaper/free."

If you want to see it that way, perhaps it does. But one might also note that while one must buy two cpus to have two cpus, one does not have to buy two copies of some software in order to run it on two cpus at once. So software is significantly different from machines in this respect.

>5. "The price is too high, let's all get behind FSF GNU G++, except
>   it's not practical for me to do this...."
>The ultimate low cost option is to get your software "for free." So
>why does anyone ever buy a C++ compiler, when they could have G++?
>(RMS says "good question!" at this point. :-)) The answer is that
>nothing is really free. I am aware of a case where a local firm
>paid thousands of dollars to a consultant in order to get FSF
>compilers up and running on a common workstation. Since this firm
>has no compiler gurus, if they ever want an upgrade or if their
>base OS changes, etc., they'll probably pay the price again [...]

Well, that's just one case. It happens that g++ runs just fine on the workstations I use without any extra effort on our part. And it seems at least as likely that this will continue to be so in the future as it does that certain commercial products will survive into the same future. Not only that, getting commercial software fixed is often very difficult. G++ is not necessarily worse in this respect.

>(I'll note in passing that FSF seems to have released only
>packages that are clones -- albeit good quality value-added clones --
>of existing already successful products, and with the exception of
>emacs everything being cloned is the result of development and marketing
>efforts carried on by profit-making, software-selling organizations.

Actually, it is not the case that everything available from FSF is a clone of some commercial product. Not only that: many commercial products are "clones", and sometimes of software originally produced by universities.

>Perhaps this should tell us something.
jim@kaos.Stanford.EDU (Jim Helman) (07/21/89)
mike@hcr.UUCP writes:
> 3. There is no good way to license a network.
> This is true. You have to buy a license for each machine you
> wish to use the software on. I'm not sure this is such a big
> problem. Nobody seems surprised that you have to pay to buy
> each machine on the network, but somehow buying the software
> seems to be a big problem. I think often the complaint boils
> down to "I wish it were cheaper/free."
Whether it's good or not is debatable, but some packages, e.g.
FrameMaker, now come with a network license server which permits any
machines on a local network to run the package up to the licensed
number of copies.
I think it's a very sensible and economical way of licensing software,
since the cost for licensing a network is in proportion to the use of
the product.
How well it works and whether it will catch on remain to be seen.
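(The mechanics are simple to sketch. Roughly -- and this is an invented illustration, not FrameMaker's actual protocol -- a server on the network holds a fixed pool of license "slots"; the application checks one out at startup and returns it on exit, so only the paid-for number of copies can run at once:)

    /* Toy sketch of a concurrent-use ("floating") license counter.
       Names and protocol are invented for illustration; a real license
       server runs as a network daemon and must also cope with crashed
       clients, timeouts, and per-host bookkeeping.                     */
    #include <stdio.h>

    class LicensePool {
        int total;      /* copies the site has paid for */
        int in_use;     /* copies currently checked out */
    public:
        LicensePool(int n) { total = n; in_use = 0; }
        int checkout() {                  /* returns 1 if a copy is granted */
            if (in_use >= total) return 0;
            in_use++;
            return 1;
        }
        void release() { if (in_use > 0) in_use--; }
    };

    int main()
    {
        LicensePool pool(2);   /* site licensed for 2 concurrent users */
        printf("user1: %d\n", pool.checkout());   /* 1: granted */
        printf("user2: %d\n", pool.checkout());   /* 1: granted */
        printf("user3: %d\n", pool.checkout());   /* 0: refused until someone exits */
        pool.release();
        printf("user3, retry: %d\n", pool.checkout());  /* 1: granted */
        return 0;
    }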
Jim Helman
Department of Applied Physics P.O. Box 10494
Stanford University Stanford, CA 94309
(jim@thrush.stanford.edu) (415) 723-4940