taylord@Software.Mitel.COM (Don Taylor) (12/09/90)
What is the current thinking on software cost estimation models?

A few years ago Barry Boehm's COCOMO models seemed to be the definitive
work of the day, and I was wondering where things have gone in this arena
since then. I have a few questions:

Is COCOMO still used? The databases used to derive the models were not
huge (although they were not tiny either); has anyone systematically
validated the models against much larger databases? (I am particularly
interested in the maintenance models rather than the new-development
models.)

Are there software packages available that implement the COCOMO models?

One of the problems that I have with COCOMO and other LOC-based models
is that they are based upon a program parameter (lines of code) that is
not known until the later phases of a project. Is there a useful cost
estimation model around that is based upon an early-phase deliverable,
say the requirements specification, that has been shown to work for a
significant number of projects?

COCOMO assumes that a waterfall model of software development is used.
Although this is still widely used for large projects, it is falling
out of fashion. Are there any cost estimation models available for the
newer development models, e.g. spiral development or rapid prototyping?

As I said earlier, my main interest at this time is in the maintenance
models. It seems to me that one way to view these new development
techniques is that they consist of a very truncated new-development
phase and an earlier, longer maintenance phase. (Maintenance includes
the addition of new functionality to an existing, delivered base as
well as defect repair.) Has anyone tried to use the COCOMO maintenance
models in this fashion? Are there any alternative models around for the
maintenance phase?

Thank you in advance,

Don Taylor   (613)-592-2122 x 3007   mitel!taylord@uunet.uu.net
Mitel Corp.                          ...!uunet!mitel!taylord
350 Legget Drive, Kanata
Ontario, Canada, K2K 1X3
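[For readers unfamiliar with the model being discussed: Basic COCOMO, as
published in Boehm's Software Engineering Economics (1981), reduces to two
power laws over estimated KLOC, which is exactly the parameter Don objects
to having to guess up front. A minimal sketch, using Boehm's published
"organic mode" coefficients; the 32 KLOC input is an arbitrary example:]

```python
# Basic COCOMO (Boehm, 1981), organic-mode coefficients.
# Effort is in person-months, schedule in elapsed months.
# Note that KLOC itself must be estimated -- the weak point
# the original poster complains about.

def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Return (effort_pm, schedule_months) for an organic-mode project."""
    effort = a * kloc ** b          # person-months
    schedule = c * effort ** d      # elapsed calendar months
    return effort, schedule

effort, months = cocomo_basic(32.0)   # hypothetical 32 KLOC project
print(f"effort = {effort:.1f} PM, schedule = {months:.1f} months")
```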
alanw@ashtate (Alan Weiss) (12/13/90)
In article <5676@taylord> taylord@Software.Mitel.COM (Don Taylor) writes:
>What is the current thinking on software cost estimation models?
>
>A few years ago Barry Boehm's COCOMO models seemed to be the definitive
>work of the day, and I was wondering where things had gone in this arena
>since then. I have a few questions:
>
>Is COCOMO still used?

COCOMO, REVEC, etc. are still used in more traditional development
environments by Development/QA/Project Managers who want to verify
their instincts :-)

>Are there software packages available that implement the COCOMO models?

SLIM (Software Life Cycle Management), by Quantitative Software
Management, Inc. In 1987, when my version (2.0) was released, they were
located at:

	QSM
	1057 Waverley Way
	McLean, VA 22101
	(703) 790-0055

I make no claim that this address or phone number is still valid. In my
experience, SLIM's estimation skills varied widely with the nature of
the project (system, applications, enabling) and with YOUR estimation
skills. It fell out of favor at one firm I worked at, but this was
probably due more to the force-fed nature of the tool adaptation
process.

>One of the problems that I have with COCOMO and other LOC-based models
>is that it is based upon a program parameter (lines of code) that is not
>known until the latter phases of a project. Is there a useful cost
>estimation model around that is based upon an early phase deliverable,
>say the requirements specification, that has been shown to work for a
>significant number of projects?

I would STRONGLY suggest you look into Function Point Analysis.
When tied into your already previously done :-) Quality Function
Deployment Analysis (House of Quality), F(x) Point Analysis is MUCH
more predictive, IMHO. The problem is two-fold in adaptation:

	1. It is a LOT of up-front work, and requires very
	   detailed Functional and Design specifications.
	2. Political: it requires a LOT of up-front work :-)
	   In short, you must have management and staff buy-in
	   that this is worthwhile. The Urge to Code always
	   comes ....

>COCOMO assumes that a waterfall model of sw. development is used.
>Although this is still widely used for large projects, it is falling
>out of fashion. Are there any cost estimation models available for the
>newer development models, eg. spiral development, rapid prototyping?

It's a funny thing about the ol' Waterfall Model: like Mark Twain's,
rumors of its demise are greatly exaggerated. There is a HUGE gulf
between what the researchers (such as Dr. Barry Boehm, for whom I have
the greatest respect) are doing and what the worker-bees are doing.
When there IS a process model, it is almost invariably some
"tailored"/bastardized version of the Waterfall. When there ISN'T a
development methodology/process model, it's called "rapid
prototyping." :-(

The exception is clearly Object Oriented Systems. OOPS is a
different beast, which I am just now exploring. DOES ANYONE
HAVE EXPERIENCE IN AN OOPS ENVIRONMENT?

>Thank you in advance,
>
>Don Taylor (613)-592-2122 x 3007 mitel!taylord@uunet.uu.net
>Mitel Corp. ...!uunet!mitel!taylord
>350 Legget Drive, Kanata
>Ontario, Canada, K2K 1X3

Hope I've helped. My profession is Software Quality Assurance and
System Test, as well as Project Management. I have the scars and burn
marks to prove it!

.__________________________________________________________.
|-- Alan R. Weiss --     | These thoughts are yours for the|
|alanw@ashton            | taking, being generated by a    |
|alanw@ashtate.A-T.COM   | failed Turing Test program.     |
|!uunet!ashton!alanw     |---------------------------------|
|213-538-7584            | Anarchy works. Look at Usenet!  |
|________________________|_________________________________|
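[The Function Point Analysis being recommended above starts from an
"unadjusted" count: weighted tallies of inputs, outputs, inquiries, and
files taken from the functional specification. A minimal sketch, using
the commonly published average-complexity weights from Albrecht's method;
real FP counting also grades each item simple/average/complex and applies
a value adjustment factor, both omitted here:]

```python
# Unadjusted function point count, Albrecht-style (sketch only).
# Weights below are the standard "average complexity" figures;
# the example counts are hypothetical.

WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,       # internal logical files
    "external_interfaces": 7,   # external interface files
}

def unadjusted_fp(counts):
    """counts: mapping of item type -> number found in the spec."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

fp = unadjusted_fp({"external_inputs": 6, "external_outputs": 4,
                    "external_inquiries": 2, "internal_files": 3,
                    "external_interfaces": 1})
print(fp)  # 6*4 + 4*5 + 2*4 + 3*10 + 1*7 = 89
```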
locke@nike.paradyne.com (Richard Locke) (12/14/90)
In article <1990Dec12.215401.27859@ashtate> alanw@ashtate (Alan Weiss) writes:
>In article <5676@taylord> taylord@Software.Mitel.COM (Don Taylor) writes:
>>out of fashion. Are there any cost estimation models available for the
>>newer development models, eg. spiral development, rapid prototyping?
...
>The exception is clearly Object Oriented Systems. OOPS is a
>different beast, which I am just now exploring. DOES ANYONE
>HAVE EXPERIENCE IN AN OOPS ENVIRONMENT?

SPR's Checkpoint (tm) tool does cost and effort estimation for several
development languages, including C++ and Objective C. You can tailor
its input somewhat based on the development steps your organization
takes.

By the way, I'd very much like to get in touch with anyone who uses
Checkpoint, to ask some questions and see if it's useful.

-dick
--
Dick Locke                  AT&T Paradyne Corporation
{uunet,peora}!pdn!locke     Mail stop LG-133
Phone: (813) 530-8241       P.O. Box 2826
                            Largo, FL 34649-2826 USA
davis@mwunix.mitre.org (David Davis) (12/14/90)
COCOMO assumes a lot. Its parameters are basically drawn from anecdotal
information and are essentially quantifications of qualitative
judgements. Of course, estimating lines of code is difficult very early
in a development program. It's especially inaccurate if the project is
at all experimental or involves many new factors, i.e., language,
business area, design approach, or hardware. Also, any of these
factors, and others, can be familiar to the industry at large but new
to a particular set of project personnel... that's an experiment.

Another model that is used is Price-S. It bases its estimates on
historical data from similar projects, by category. It is much less
dependent than COCOMO on estimating lines of code for modules no one
has yet designed. However, this area is still a black art and is
subject to the normal kinds of abuse that estimates from bids and
proposals always are.
--
Dave Davis                 davis@mwunix.mitre.org
MITRE Corporation          McLean, VA
me := disclaimer.all
rlandsma@bbn.com (Rick Landsman) (12/14/90)
I am currently involved in a measurement program known as Enterprise
Software Planning, led by Capers Jones at Software Productivity
Research, Inc. in Burlington, Mass. Capers developed a follow-up
methodology, more applicable to real-time software development, known
as feature points. It basically adds the number of algorithms, with
some weighting changes, to the basic function point calculation
developed by Albrecht while at IBM.

He markets a tool called "Checkpoint" (what I consider the leading
estimation and measurement analysis tool in the industry), which is a
knowledge-based modeling system that can estimate defect potential and
removal efficiencies as well as productivity and reliability metrics,
based on function or feature points. In addition, for those who want to
ease into counting function points, it contains a "backfire" mechanism
that allows calculating function/feature points from KLOC and answers
to project description and complexity questions.

Anyone interested in seeing what the output report (100 pages) from the
tool looks like, let me know and I would be glad to send an example
along.

regards,
rick
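[The "backfire" mechanism described above rests on per-language
expansion ratios between source lines and function points. A minimal
sketch; the ratios below are illustrative figures in the spirit of
Jones's published tables (e.g. roughly 128 LOC per function point for
C) and should be treated as assumptions, not as the tool's actual
calibration:]

```python
# "Backfiring": rough conversion between KLOC and function points
# via per-language LOC-per-FP ratios. Illustrative figures only;
# lower-level languages need more lines to deliver one function point.

LOC_PER_FP = {"assembler": 320, "c": 128, "cobol": 105, "ada": 71}

def backfire_fp_from_kloc(kloc, language):
    """Estimate function points delivered by `kloc` thousand lines."""
    return kloc * 1000 / LOC_PER_FP[language]

print(round(backfire_fp_from_kloc(32.0, "c")))    # 32 KLOC of C -> 250 FP
print(round(backfire_fp_from_kloc(32.0, "ada")))  # same size in Ada -> more FP
```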
pcooper@eecs.wsu.edu (Phil Cooper - CS495) (12/15/90)
In article <1990Dec12.215401.27859@ashtate> alanw@ashtate (Alan Weiss) writes:
>
>I would STRONGLY suggest you look into Function Point Analysis.
>When tied into your already previously done :-) Quality Function
>Deployment Analysis (House of Quality), F(x) Point Analysis is MUCH
>more predictive, IMHO. The problem is two-fold in adaptation:
>
>	1. It is a LOT of up-front work, and requires very
>	   detailed Functional and Design specifications.
>	2. Political: it requires a LOT of up-front work :-)
>	   In short, you must have management and staff buy-in
>	   that this is worthwhile. The Urge to Code always
>	   comes ....
>

Ever try to do an FP analysis on the "hello, world" program? You might
be surprised to find out that it will take about 250 LOC (in 'C') to
do. At least according to the FP analysis conversion specs (from FPs to
LOC). I suppose it MUST be more accurate for larger projects, though.

Phil Cooper
HUSTON@RELAY.Prime.COM (12/15/90)
I know of another estimation model called "Monte Carlo simulation".
After you set up your tasks and dependencies and take a shot at an
estimate, you give some indication of how probable it is that your
estimate is close to reality (how much you trust it). The project is
then simulated with a bunch of different combinations of tasks
finishing late or on time, and a most probable estimate of the finish
time is produced.

I'm certainly no expert on Monte Carlo, but the above is my
understanding. I've seen it used on a number of software projects with
very good results (the project finishes pretty darn close to the
estimated date). I don't know of commercially available PM tools that
use it, but I'm not "up" on the wealth of tools, so there very well may
be some.

Steve Huston
PSI Special Systems
Prime Computer, Inc.
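[The Monte Carlo approach described above can be sketched in a few
lines: give each task a three-point (optimistic, most-likely,
pessimistic) duration, sample a triangular distribution for each task
many times, and look at the distribution of simulated finish times.
This sketch assumes purely sequential tasks with made-up durations; a
real tool would walk the dependency network and take the critical path
on each trial:]

```python
# Monte Carlo schedule simulation (sketch). Each task is a
# (optimistic, most-likely, pessimistic) duration in weeks; each trial
# samples every task from a triangular distribution and sums them.

import random

def simulate(tasks, trials=10000, seed=1):
    """Return (median, 90th-percentile) simulated total duration."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(trials)
    )
    return totals[trials // 2], totals[int(trials * 0.9)]

# Hypothetical three-task sequential project.
median, p90 = simulate([(2, 3, 6), (4, 5, 9), (1, 2, 4)])
print(f"median finish: {median:.1f} weeks, 90% confidence: {p90:.1f} weeks")
```

Note the output is a distribution rather than a single date, which is
exactly the "how much you trust it" information the single-point
estimate lacks.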
laub@Software.Mitel.COM (Boniface Lau) (12/17/90)
Anyone serious about software cost estimation will find the following
article extremely interesting to read:

	"Lessons Learned from Modeling the Dynamics of Software
	Development," Tarek K. Abdel-Hamid and Stuart E. Madnick,
	Communications of the ACM, Dec. 1989, pp. 1426-1438.