Software Project Management
Estimation
Introduction
•
A successful project is one delivered on time, within budget and with the required quality.
•
The targets are set by the project managers
•
Realistic estimates are required.
•
Effort estimates affect costs, duration and the
delivery time.
•
An estimate is not a prediction; it is a management goal.
Difficulties in Estimation
•
Difficulties arise from the complexity and invisibility of software.
•
Subjective nature of estimating: e.g. research shows that people tend to under-estimate the difficulty of small tasks and over-estimate that of large ones.
•
Political implications: different people within an organization have different objectives.
•
Changing technology: when technologies change very fast, the experience of previous projects becomes irrelevant.
Difficulties in Estimation contd.
•
Lack of homogeneity of project experience: even with no change of technology, knowledge about typical task durations may not be easily transferred from one project to another because of other differences.
Where Estimates are done
•
Strategic Planning
•
Feasibility study: confirms that the benefits of the potential system will justify the costs.
•
System specification: estimates at the design stage show that the feasibility study is still valid.
•
Evaluation of suppliers’ proposals
•
Project Planning
•
As the project proceeds, the accuracy of the estimates improves.
Problems with over- and under-estimates
•
Parkinson’s Law: “Work expands to fill the time available”, i.e. given an easy target, staff will work less hard.
•
Brooks’ Law: “Putting more people on a late job makes it later”, i.e. the effort of implementing a project goes up disproportionately with the number of staff assigned to it.
•
A big disadvantage of under-estimates is the effect on quality.
•
Zeroth law of reliability: “if a system does not have to be reliable, it can meet any other objective”.
Elements of a Sound Estimate
•
To generate a sound estimate, a project manager
must have:
– A work breakdown structure (WBS), or a list of tasks
which, if completed, will produce the final product
– An effort estimate for each task
– A list of assumptions which were necessary for making the
estimate
– Consensus among the project team that the estimate is
accurate
Assumptions Make Estimates More
Accurate
•
Team members make assumptions about the work to
be done in order to deal with incomplete information
– Any time an estimate must be based on a decision that has
not yet been made, team members can assume the
answer for the sake of the estimate
– Assumptions must be written down so that if they prove to
be incorrect and cause the estimate to be inaccurate,
everyone understands what happened
– Assumptions bring the team together very early on in the
project so they can make progress on important decisions
that will affect development
Measure of work
•
We express work size independently of effort
•
We use measures such as Source Lines of Code (SLOC) or KLOC (thousands of lines of code).
•
Other Measures such as function points will
be discussed later on.
Software productivity
•
A measure of the rate at which individual
engineers involved in software development
produce software and associated
documentation.
•
Not quality-oriented although quality assurance
is a factor in productivity assessment.
•
Essentially, we want to measure useful
functionality produced per time unit.
Productivity measures
•
Size related measures based on some output
from the software process. This may be lines
of delivered source code, object code
instructions, etc.
•
Function-related measures based on an
estimate of the functionality of the delivered
software. Function-points are the best known
of this type of measure.
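As a minimal Python sketch, both kinds of productivity measure reduce to dividing an output figure by the effort expended; the project figures below are invented purely for illustration.

# Illustrative productivity calculation; all figures are hypothetical.
delivered_loc = 24000       # size-related output: lines of delivered source code
function_points = 310       # function-related output: estimated function points
effort_person_months = 40   # total effort expended

size_productivity = delivered_loc / effort_person_months   # LOC per person-month
fp_productivity = function_points / effort_person_months   # FPs per person-month

print(round(size_productivity), "LOC per person-month")    # 600
print(round(fp_productivity, 1), "FPs per person-month")   # 7.8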
Measurement problems
•
Estimating the size of the measure (e.g. how
many function points).
•
Estimating the total number of programmer
months that have elapsed.
•
Estimating contractor productivity (e.g. of a documentation team) and incorporating this estimate in the overall estimate.
Fundamentals of software
estimating
•
Historical data: information about past projects, although differences in programming language and staff experience might exist.
•
International Software Benchmarking Standards Group (ISBSG) maintains a database with around 9,000 projects.
•
http://isbsg.org
Software Estimation Techniques
•
Algorithmic models: which use “effort drivers” representing characteristics of the target system and the implementation environment to predict effort.
•
Expert Judgement: based on the active
knowledge of staff
•
Analogy: where a similar, completed project is identified and its actual effort is used as the basis of the estimate.
•
Parkinson: where the staff effort available to do the project is taken as the basis of the “estimate”.
Software Estimation Techniques
contd.
•
Top down: where an overall estimate for the
whole project is broken down into the effort
required for component tasks.
•
Bottom-up: where component tasks are
identified and sized and these individual
estimates are aggregated.
Expert judgment
•
One or more experts in both software development and the application domain use their experience to predict software costs. The process iterates until some consensus is reached.
•
Advantages: relatively cheap estimation method. Can be accurate if experts have direct experience of similar systems.
•
Disadvantages: very inaccurate if there are no experts!
Estimation by analogy
•
The cost of a project is computed by comparing the
project to a similar project in the same application
domain
•
Advantages: May be accurate if project data available
and people/tools the same
•
Disadvantages: Impossible if no comparable project has been tackled. Needs a systematically maintained cost database.
Parkinson's Law
•
The project costs whatever resources are
available
•
Advantages: No overspend
•
Disadvantages: System is usually unfinished
Pricing to win
•
The project costs whatever the customer has to spend on
it
•
Advantages: You get the contract
•
Disadvantages: The probability that the customer gets
the system he or she wants is small. Costs do not
accurately reflect the work required.
•
How do you know what the customer has to spend?
•
Only a good strategy if you are willing to take a serious loss to get a first customer, or if delivery of a radically reduced product is a real option.
Bottom-up estimation
•
Usable when the architecture of the system is
known and components identified.
•
This can be an accurate method if the system
has been designed in detail.
•
It may underestimate the costs of system level
activities such as integration and
documentation.
Procedural code-oriented approach
•
The bottom-up approach described above works in
activities like software coding:
•
Envisage the number and type of software modules in the final system: most IS systems consist of standard operations, e.g. insert, update, amend, display, delete, print, etc.
•
Estimate the SLOC of each identified module: draw a diagram to visualize how many instructions would be required to implement each identified procedure. The estimator may look at existing programs which have a similar functional description to assist in this process (see the sketch below).
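A minimal Python sketch of this code-oriented, bottom-up approach; the module list, SLOC figures and productivity rate are assumptions made only to show the arithmetic.

# Hypothetical bottom-up estimate: sum SLOC over identified modules,
# then convert size to effort with an assumed productivity rate.
module_sloc = {
    "insert_customer": 300,
    "update_customer": 250,
    "delete_customer": 150,
    "display_customer": 200,
    "print_statement": 400,
}

total_sloc = sum(module_sloc.values())   # 1300 SLOC in total
days_per_kloc = 20                       # assumed productivity rate
effort_days = total_sloc * days_per_kloc / 1000

print("size:", total_sloc, "SLOC; effort:", effort_days, "days")   # 26.0 days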
Top-down estimation
•
Usable without knowledge of the system
architecture and the components that might
be part of the system.
•
Takes into account costs such as integration,
configuration management and
documentation.
•
Can underestimate the cost of solving difficult
low-level technical problems.
Top-down approach and
parametric models
•
Top-down approach is associated with parametric or algorithmic models.
•
Project effort relates to variables associated with characteristics of the final
system.
•
Parametric Model:
effort = (system size) x (productivity rate)
where system size might be in the form of KLOC, e.g. 3 KLOC, and the productivity rate might be 40 days per KLOC. The rate is a matter of judgement.
Bottom line: forecasting software development effort has two key components:
- assessing the amount of work needed
- assessing the rate of work at which the task can be done.
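As a minimal Python sketch of the parametric model just described, using the slide's own example figures of 3 KLOC and 40 days per KLOC:

# effort = (system size) x (productivity rate)
system_size_kloc = 3      # size driver, from the slide example
days_per_kloc = 40        # productivity rate, from the slide example

effort_days = system_size_kloc * days_per_kloc
print(effort_days, "days")   # 120 days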
Top-down approach and
parametric models contd.
•
KLOC is the size driver indicating the amount of
work to be done, while developer experience is the
productivity driver affecting the productivity rate.
•
If you have the effort expended on previous projects (work-days) and the system size in KLOC, then productivity = effort / size, or, using the statistical technique of least squares regression:
effort = constant1 + (size x constant2)
(see the sketch below).
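A minimal Python sketch of calibrating constant1 and constant2 by least squares regression; the past-project sizes and efforts below are invented for illustration.

# Fit effort = constant1 + (size x constant2) over hypothetical historical projects.
import numpy as np

size_kloc = np.array([2.0, 3.5, 5.0, 8.0, 12.0])   # sizes of past projects (assumed)
effort_days = np.array([85, 140, 200, 310, 470])   # work-days actually expended (assumed)

constant2, constant1 = np.polyfit(size_kloc, effort_days, 1)   # slope, intercept
print(f"effort = {constant1:.1f} + (size x {constant2:.1f})")

new_size = 6.0   # KLOC of the project being estimated
print(f"predicted effort: {constant1 + new_size * constant2:.0f} work-days")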
Function points
•
External Input types: Input transactions that
update internal computer files
•
External output types: are transactions that output
to the user, e.g. printed reports etc.
•
External Inquiry types: are transactions initiated by
the user but do not update the internal files.
•
Logical internal file types: are the standing files
used by the system.
Function points contd.
•
External interface file types: for output and
input that may pass to and from other
computer applications.
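A minimal Python sketch of an unadjusted function point count over the five types above. The counts are hypothetical, and the weights are the commonly quoted "average complexity" values rather than anything stated in these slides, so treat both as assumptions.

# Unadjusted FP count: weighted sum of counts of the five external types.
# Weights below are assumed "average complexity" values; counts are hypothetical.
weights = {
    "external input": 4,
    "external output": 5,
    "external inquiry": 4,
    "logical internal file": 10,
    "external interface file": 7,
}
counts = {
    "external input": 6,
    "external output": 4,
    "external inquiry": 3,
    "logical internal file": 5,
    "external interface file": 2,
}

ufp = sum(counts[t] * weights[t] for t in weights)
print("unadjusted function points:", ufp)   # 24 + 20 + 12 + 50 + 14 = 120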
COCOMO-II
•
COCOMO: Constructive Cost Model refers to a group of
models.
•
The basic model is based on the equation
effort = c x (size)^k
where effort is measured in pm (person-months consisting of units of 152 working hours), size is measured in kdsi (thousands of delivered source instructions), and c and k are constants.
COCOMO-II
•
The first step is to determine the system size in terms of kdsi; the constants c and k are available in the table.
•
The systems are classified into
– Organic
– Semi-detached
– Embedded
COCOMO-II contd.
•
Organic mode: this is typical when a relatively small team develops software in a highly familiar in-house environment, and when the system being developed is small and the interface requirements are flexible.
•
Embedded mode: this is when the product being developed has to operate within very tight constraints and changes to the system are very costly.
COCOMO-II
•
Semi-detached mode: this combines elements of the organic and the embedded modes, or has characteristics that come between the two.
•
The exponent k is greater than 1, i.e. larger projects are seen as requiring disproportionately more effort than smaller ones.
Estimation stages of COCOMO-II
•
Application composition: here the external
features of the system that the users will
experience are designed. Prototyping is normally
used.
•
Early design: the fundamental software structures
are designed.
•
Post architecture: here the software structures
undergo final construction, modification and tuning
to create a system that will perform as required.
Application Composition
•
To estimate the effort for application composition, the counting of object points is recommended.
•
Externally apparent features of the software are counted, i.e. screens and reports, rather than logical ones such as entity types.
•
This is useful when prototyping is used
Early design stage
•
FPs are recommended as the way of gauging a basic system size.
•
The FP count can be converted into a LOC equivalent by multiplying the FPs by a factor for the programming language that is to be used.
•
This model can be used to calculate an estimate of person-months:
pm = A x (size)^sf x (em1) x (em2) x … x (emn)
where pm is the effort in person-months, A is a constant (currently around 2.94), size is measured in kdsi (derived from the FPs), sf is the scale factor and em1 … emn are effort multipliers.
Early design stage contd.
•
The scale factor is derived from:
sf = B + 0.01 x ∑ (exponent driver ratings)
where B is a constant currently set at 0.91. The effect of the exponent scale factor is to increase the effort predicted for larger projects, i.e. to take account of the diseconomies of scale which make larger projects less productive.
Early design stage contd.
•
The qualities that dictate the exponent drivers used to calculate the scale factor are listed below.
•
Precedentedness (PREC): the degree to which there are precedents or similar past cases for the current project.
•
Development flexibility (FLEX): this reflects the number of different ways there are of meeting the requirements. The less flexibility there is, the higher the value of the exponent driver.
Early design stage contd.
•
Architecture/ risk resolution (RESL): this reflects the degree of
uncertainty about meeting the requirements.
•
Team cohesion (TEAM): this reflects the degree to which the team is largely dispersed (perhaps across several countries) as opposed to being a small, tightly knit team.
•
Process maturity (PMAT): the more structured and organized the way the software is produced, the lower the uncertainty and the lower the exponent driver.
•
Effort multipliers: adjust the estimate to take account of productivity factors but are not involved in economies or diseconomies of scale.
COCOMO-II scale factor value by
expert judgement
Driver Very low Low Nominal High Very High Extra High
PREC 6.20 4.96 3.72 2.48 1.24 0.00
FLEX 5.07 4.05 3.04 2.03 1.01 0.00
RESL 7.07 5.65 4.24 2.83 1.41 0.00
TEAM 5.48 4.38 3.29 2.19 1.10 0.00
PMAT 7.80 6.24 4.68 3.12 1.56 0.00
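Putting the early design formulas and the table together, here is a minimal Python sketch with all five drivers rated nominal; the 10 kdsi size and the overall effort multiplier of 1.0 are assumptions made only to show the calculation.

# sf = B + 0.01 x sum(exponent driver ratings); pm = A x size**sf x (effort multipliers)
A, B = 2.94, 0.91
nominal_ratings = {"PREC": 3.72, "FLEX": 3.04, "RESL": 4.24, "TEAM": 3.29, "PMAT": 4.68}

sf = B + 0.01 * sum(nominal_ratings.values())   # about 1.10
size_kdsi = 10.0                                # assumed size derived from FPs
effort_multipliers = 1.0                        # assumed product of em1 ... emn

pm = A * size_kdsi ** sf * effort_multipliers
print(f"sf = {sf:.3f}, effort = {pm:.1f} person-months")   # roughly 37 pm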
COCOMO81 constants
System type C K
Organic 2.4 1.05
Semi-detached 3.0 1.12
Embedded 3.6 1.20
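Using the constants above, a minimal Python sketch of the basic COCOMO equation for an assumed organic-mode system of 32 kdsi:

# effort = c x size**k, with c and k taken from the COCOMO81 constants table.
constants = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

size_kdsi = 32                 # assumed system size
c, k = constants["organic"]
effort_pm = c * size_kdsi ** k
print(f"{effort_pm:.0f} person-months")   # about 91 person-months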
Wideband Delphi
•
Wideband Delphi is a process that a team can
use to generate an estimate
– The project manager chooses an estimation team,
and gains consensus among that team on the
results
– Wideband Delphi is a repeatable estimation
process because it consists of a straightforward
set of steps that can be performed the same way
each time
The Wideband Delphi Process
•
Step 1: Choose the team
– The project manager selects the estimation team
and a moderator. The team should consist of 3 to 7
project team members.
•
The moderator should be familiar with the Delphi
process, but should not have a stake in the outcome of
the session if possible.
•
If possible, the project manager should not be the
moderator because he should ideally be part of the
estimation team.
The Wideband Delphi Process
•
Step 2: Kickoff Meeting
– The project manager must make sure that each team
member understands the Delphi process, has read the
vision and scope document and any other documentation,
and is familiar with the project background and needs.
– The team brainstorms and writes down assumptions.
– The team generates a WBS with 10-20 tasks.
– The team agrees on a unit of estimation.
The Wideband Delphi Process
•
Step 3: Individual Preparation
– Each team member independently generates a set
of preparation results.
– For each task, the team member writes down an
estimate for the effort required to complete the
task, and any additional assumptions he needed
to make in order to generate the estimate.
The Wideband Delphi Process
•
Step 4: Estimation Session
– During the estimation session, the team comes to a
consensus on the effort required for each task in the WBS.
– Each team member fills out an estimation form which
contains his estimates.
– The rest of the estimation session is divided into rounds during which each estimation team member revises her estimates based on a group discussion. Individual numbers are not discussed.
The Wideband Delphi Process
•
Step 4: Estimation Session (continued)
– The moderator collects the estimation forms and plots the sum of the effort from each form on a line.
The Wideband Delphi Process
•
Step 4: Estimation Session (continued)
– The team resolves any issues or disagreements that are brought up.
•
Individual estimate times are not discussed. These disagreements are usually
about the tasks themselves. Disagreements are often resolved by adding
assumptions.
– The estimators all revise their individual estimates. The moderator updates the plot with the new total.
The Wideband Delphi Process
•
Step 4: Estimation Session (continued):
– The moderator leads the team through several rounds of estimates to
gain consensus on the estimates. The estimation session continues until
the estimates converge or the team is unwilling to revise estimates.
•
Step 5: Assemble Tasks
– The project manager works with the team to collect the estimates from
the team members at the end of the meeting and compiles the final task
list, estimates and assumptions.
•
Step 6: Review Results
– The project manager reviews the final task list with the estimation team.
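A minimal Python sketch of how the moderator might track convergence across estimation rounds; the per-estimator totals, the number of rounds, and the 10% convergence rule are all assumptions made for illustration.

# Each inner list holds the total effort (person-days) from one estimator's form.
# Rounds continue until the spread between highest and lowest totals is small.
rounds = [
    [120, 180, 95, 160, 140],   # round 1 totals, one per estimator
    [130, 160, 120, 150, 140],  # round 2, after discussing assumptions
    [140, 150, 138, 145, 142],  # round 3
]

for n, totals in enumerate(rounds, start=1):
    mean = sum(totals) / len(totals)
    spread = max(totals) - min(totals)
    converged = spread <= 0.10 * mean   # assumed convergence rule
    print(f"round {n}: mean {mean:.0f}, spread {spread}, converged: {converged}")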
Criteria for a Good Model
•
Defined—clear what is estimated
•
Accurate
•
Objective—avoids subjective factors
•
Results understandable
•
Detailed
•
Stable—second order relationships
•
Right Scope
•
Easy to Use
•
Causal—future data not required
•
Parsimonious—everything present is important
Conclusion
•
Estimates are really management targets
•
Collect as much information about previous projects as possible
•
Use more than one method of estimating
•
Top down approaches are used at earlier stages of project planning
while bottom-up approaches become more prominent at later stages
•
Be careful about using other people’s historical productivity as a basis for your estimates, especially if it comes from other environments.
•
Seek a range of opinions
•
Document your method of doing estimates and record all your
assumptions.