
4/28/2024

EMÜ322
Validation and Verification of
Simulation Models
Banu Yuksel Ozkaya
Hacettepe University
Department of Industrial Engineering

Purpose & Overview


• The goal of the validation process is
– To produce a model that represents the true
behavior closely enough for decision making
purposes
– To increase the model’s credibility to an
acceptable level


Validation and Verification


• Verification is concerned with determining
whether the assumptions (or the model) have
been correctly translated into a computer
program.
• Purpose of verification is to assure that the model
(or the assumptions) is reflected accurately in the
computerized representation.
– Simple in concept.
– Difficult task in a large scale simulation system.

Model-Building,
Verification & Validation

Real System
  → (conceptual validation)
Conceptual Model
  1. Assumptions on system components
  2. Structural assumptions, which define the interactions between system components
  3. Input parameters and data assumptions
  → (model verification)
Operational Model (computerized representation)
  → (calibration & validation: the operational model is compared with the real system and revised)


Verification
• Verification asks the following question:
– Is the conceptual model (assumptions on system
components and system structure, parameter
values, abstractions, and simplifications) accurately
represented by the operational model (the
computerized representation)?

Verification
• Common sense suggestions for verification:
– Suggestion 1: In developing a simulation model,
write and debug the program in modules and
subprograms.
• Consider a 10,000-statement simulation model.
• It is poor practice to write the entire program before
attempting any debugging.
• Such a large program will almost certainly not execute,
and determining the location of errors will be extremely
difficult.


Verification
• Common sense suggestions for verification:
– Suggestion 1: In developing a simulation model,
write and debug the program in modules and
subprograms.
• The simulation model’s main program and a few of the
key subprograms should be written and debugged first.
• Next additional subprograms should be added and
debugged successively.

Verification
• Common sense suggestions for verification:
– Suggestion 2: In developing large simulation models, it
is advisable to have more than one person review the
program (a structured walk-through of the program).
• All members of the modeling team (e.g., systems analysts,
programmers) are assembled in a room, and each is
given a copy of a particular set of subprograms to be
debugged.
• The subprograms' developer goes through the programs
but does not proceed from one statement to the next
until everyone is convinced that the statement is correct.


Verification
• Common sense suggestions for verification:
– Suggestion 3: Run the simulation under a variety of
settings of the input parameters and check to see
that the output is reasonable (often overlooked!!).
• Simple measures of performance may be computed
exactly and used for comparison.
• For instance, the long-run average utilization with s parallel
servers is ρ = λ/(sμ).
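This check can be sketched in a few lines; the arrival rate, service rate, and server count below are illustrative values, not taken from the slides:

```python
# A rough sanity check for Suggestion 3: the long-run average utilization
# of a queue with s parallel servers should be rho = lambda / (s * mu).
def utilization(arrival_rate, service_rate, servers):
    """Long-run average utilization per server; must be < 1 for stability."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1:
        raise ValueError("unstable system: rho >= 1")
    return rho

# Illustrative numbers: 4 arrivals/min, 3 servers each serving 1.5/min.
rho = utilization(4.0, 1.5, 3)
print(round(rho, 3))  # 0.889
```

If a long simulation run reports a utilization far from this value, the model (or its input parameters) deserves a closer look.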

Verification
• Common sense suggestions for verification:
– Suggestion 4: Trace the system.
• The state of the simulated system, e.g. the contents of
the event list, state variables, certain statistical measures
are displayed just after each event occurs and are
compared with hand calculations to see if the program is
operating as intended.
• It is desirable to evaluate each program path as well as
the program’s ability to deal with extreme conditions.
• Many simulation packages provide capability to perform
trace.
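A minimal illustration of such a trace, assuming a toy single-server queue with hard-coded arrival instants and a constant service time (all values hypothetical), printing the state after each event:

```python
import heapq

# Toy next-event simulation of a single-server queue with fixed times,
# printing the simulation state after every event (a simple trace).
arrivals = [1.0, 2.0, 2.5]   # hypothetical arrival instants (minutes)
service = 2.0                # constant service time (minutes)

events = [(t, "arrival") for t in arrivals]
heapq.heapify(events)        # the event list, ordered by event time
queue, busy, trace = 0, False, []

while events:
    clock, kind = heapq.heappop(events)
    if kind == "arrival":
        if busy:
            queue += 1
        else:
            busy = True
            heapq.heappush(events, (clock + service, "departure"))
    else:  # departure
        if queue:
            queue -= 1
            heapq.heappush(events, (clock + service, "departure"))
        else:
            busy = False
    trace.append((clock, kind, queue, busy))
    print(f"t={clock:4.1f}  event={kind:9s}  in queue={queue}  server busy={busy}")
```

Each printed line can be compared with a hand calculation to confirm the event logic behaves as intended.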


Verification
• Common sense suggestions for verification:
– Suggestion 5: The model should be run, when
possible, under simplifying assumptions for which
its true characteristics are known or can be easily
computed.
• For example, an M/M/s queueing system, whose steady-state measures are known analytically.
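For the single-server case (M/M/1) the steady-state quantities are available in closed form, so a simplified run of the model can be checked against exact values. A sketch, with illustrative rates:

```python
# Closed-form steady-state measures of an M/M/1 queue (valid for lam < mu).
# A simulation run under these simplifying assumptions can be compared
# against these exact values. lam and mu below are illustrative.
lam, mu = 0.8, 1.0          # arrival rate, service rate
rho = lam / mu              # utilization
L  = rho / (1 - rho)        # expected number in system
Lq = rho ** 2 / (1 - rho)   # expected number in queue
W  = 1 / (mu - lam)         # expected time in system
Wq = rho / (mu - lam)       # expected waiting time in queue

# Little's law (L = lam * W) gives a cheap internal consistency check.
assert abs(L - lam * W) < 1e-9
assert abs(Lq - lam * Wq) < 1e-9
print(round(L, 6), round(Lq, 6), round(W, 6), round(Wq, 6))
```

Long simulation averages that drift from these values signal a programming error rather than random noise.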


Verification
• Common sense suggestions for verification:
– Suggestion 6: With some types of simulation
models, it may be helpful to observe an animation
of the simulation output.
• Where the bottleneck occurs, etc.


Verification
• Common sense suggestions for verification:
– Suggestion 7: Compute the sample mean and
sample variance for each simulation input
probability distribution, and compare with the
desired (or historical) mean and variance.
• Close agreement suggests that values are being generated
correctly from these distributions.
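A sketch of such a check for an exponential input distribution (the rate and sample size are illustrative):

```python
import random
import statistics

# Sketch of Suggestion 7: draw variates from an assumed input distribution
# and compare the sample moments with the theoretical ones. For an
# exponential(lam) distribution, mean = 1/lam and variance = 1/lam**2.
random.seed(42)                 # fixed seed so the check is reproducible
lam = 0.5                       # illustrative rate: mean 2.0, variance 4.0
sample = [random.expovariate(lam) for _ in range(100_000)]

m = statistics.fmean(sample)
v = statistics.variance(sample)
print(round(m, 2), round(v, 2))  # both should land near 2.0 and 4.0
```

A large gap between the sample and theoretical moments points to a bug in how the variates are generated or used.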


Verification
• Common sense suggestions for verification:
– Suggestion 8: Use a commercial package to reduce
the amount of programming required.
• Be careful, however, when using a simulation package,
since it may contain subtle errors.


Calibration and Validation


• Validation is the process of determining
whether a simulation model is an accurate
representation of the system, for the
particular objectives of the study.
• It is the overall process of comparing the
model and its behavior to the real system


Calibration and Validation


• Calibration is the iterative process of
comparing the model to the real system and
making adjustments.
Initial model            → compare model to reality           → revise
First revision of model  → compare revised model to reality   → revise
Second revision of model → compare second revision to reality → revise …

(each comparison is made against the real system)


Calibration and Validation


• No model is ever a perfect representation of
the system
– The modeler must weigh the possible, but not
guaranteed, increase in model accuracy versus the
increased cost of validation effort.
(like seeking a saddle point: balancing accuracy against cost)


Level of Detail
• A simulation practitioner must determine which aspects
of a complex real-world system actually need to be
incorporated into the simulation model, at what level of
detail, and which aspects can be safely ignored.
• Modeling every aspect in full detail is seldom required
for effective decision making and may result in
excessive execution time.


Level of Detail
• Guidelines to determine the level of detail:
– Define the specific issues to be investigated by the
study and measures of performance that will be
used for evaluation.
• Models are not universally valid but are designed for
specific purposes.
• The issues of interest should be stated.
• The performance measures should be specified.
• Problem formulation is usually done at an initial kickoff
meeting with people representing all key aspects of the
system being present.

Level of Detail
• Example: A dog-food manufacturer had a consulting
company build a simulation model of the
manufacturing line, which produced 1 million cans
per day at a constant rate.
– Very expensive and very slow to run a simulation model
that considers each can as a single entity.
– Cheaper and much faster to run a simulation model that
models the manufacturing process as a continuous flow.


Level of Detail
• Guidelines to determine the level of detail:
– The entity moving through the simulation model
does not always have to be the same as the entity
moving through the corresponding system.
• A large food manufacturer built a simulation model of
its manufacturing line for snack crackers. Initially, they
tried to model each cracker as a separate entity, but the
computational requirements of the model made this
approach infeasible.
• The company was forced to use a box of crackers as the
entity moving through the model.

Level of Detail
• Guidelines to determine the level of detail:
– A mistake often made by beginning modelers is to
include an excessive amount of model detail.
• Start with a moderately detailed model.
• The adequacy of a particular version of the model is
determined in part by presenting the model to subject-
matter experts and managers.


Level of Detail
• Guidelines to determine the level of detail:
– The level of detail should be consistent with the
type of data available.
• A model used to design a manufacturing system will
generally be less detailed than one used to fine-tune
an existing system, since little or no data will be
available.


Level of Detail
• Guidelines to determine the level of detail:
– In all simulation studies, time and money
constraints will be a major factor to determine the
level of detail.
– Do not have more detail in the model than is
necessary to address the issues of interest.
• Make sure, however, that the model has enough
detail to be credible.
• It may be necessary to include details that are not
required for model validity but are required for credibility.



Calibration and Validation


• No model is ever a perfect representation of
the system
– Three-step approach
• Build a model that has high face validity
• Validate model assumptions
• Compare the model input-output transformations with
the real system’s data


High Face Validity


• The model should appear reasonable to model users
and others who are knowledgeable about the
system
– Especially important when it is possible to collect data
from the system
• Ensure a high degree of realism: Potential users
should be involved in model construction.
• Sensitivity analysis can also be used to check a
model’s face validity.



Validate Model Assumptions


• General classes of model assumptions
– Structural assumptions: how the system operates
– Data assumptions: reliability of data and its statistical
analysis
• Bank example: customer queuing and service facility
in a bank
– Structural assumptions, e.g. customers waiting in one line
versus many lines, served FCFS versus priority
– Input data assumptions, e.g. inter-arrival time of
customers, service times for commercial accounts
• Verify data reliability with bank managers
• Test correlation and goodness of fit for data
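One common goodness-of-fit check is the one-sample Kolmogorov–Smirnov test; a hand-rolled sketch against a fitted exponential distribution (data are synthetic, and the 1.36/√n figure is the usual large-sample 5% critical value):

```python
import math
import random

# Hand-rolled one-sample Kolmogorov-Smirnov check of interarrival data
# against a fitted exponential distribution F(x) = 1 - exp(-x / mean).
# Data are synthetic here; with real data, note that estimating the mean
# from the same sample makes the tabulated critical value only approximate.
random.seed(7)
data = sorted(random.expovariate(1 / 1.3) for _ in range(200))
mean = sum(data) / len(data)
n = len(data)

d = 0.0  # K-S statistic: max gap between empirical and fitted CDFs
for i, x in enumerate(data):
    f = 1 - math.exp(-x / mean)
    d = max(d, (i + 1) / n - f, f - i / n)

crit = 1.36 / math.sqrt(n)  # large-sample 5% critical value
print(round(d, 3), round(crit, 3), "reject" if d > crit else "fail to reject")
```

A chi-square test can be substituted when the data are binned or discrete.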


Validate Input-Output Transformation


• Goal: Validate the model’s ability to predict future
behavior
– The only objective test of the model
– The structure of the model should be accurate enough to
make good predictions for the range of input data sets of
interest.
• One possible approach: use historical data that have
been reserved for validation purposes only
• Criteria: Use the main system responses of interest



Validate Input-Output Transformation


• Example: one drive-in window serviced by one teller;
only one or two transactions are allowed
– Data collection: 90 customers during 11 am to 1 pm.
• Observed service times (Si, i = 1, 2, …, 90)
• Observed interarrival times (Ai, i = 1, 2, …, 90)
– Data analysis led to the conclusion that:
• Interarrival times are exponentially distributed with rate λ = 45 per hour
• Service times are N(1.1, 0.2²) minutes
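This model's input-output transformation can be sketched with the Lindley recursion for the waiting time in a single-server queue. This is a simplification: the one-versus-two-transaction detail is ignored, negative normal variates are truncated at zero, and the seeds are arbitrary, so the resulting averages will not match the slides' figures exactly.

```python
import random

# Sketch of the drive-in bank model: exponential interarrival times with
# rate 45/hour (mean 60/45 minutes) and N(1.1, 0.2^2)-minute service
# times, run as independent 2-hour (120-minute) replications.
def replication(seed, horizon=120.0):
    rng = random.Random(seed)
    t = rng.expovariate(45 / 60)   # first arrival time (minutes)
    wait, delays = 0.0, []
    while t <= horizon:
        delays.append(wait)
        service = max(0.0, rng.gauss(1.1, 0.2))  # truncate negative draws
        gap = rng.expovariate(45 / 60)           # time to next arrival
        wait = max(0.0, wait + service - gap)    # Lindley recursion
        t += gap
    return sum(delays) / len(delays)             # Y: average delay (min)

averages = [replication(seed) for seed in range(6)]
print([round(y, 2) for y in averages])
```

Each call to `replication` plays the role of one statistically independent model run producing one observation of the output Y.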


Validate Input-Output Transformation


• A model was developed in close consultation with
bank management and employees
• Model assumptions were validated
• Resulting model is now viewed as a “black box”:
Model output variables
Input variables Primary interest:
Poisson arrivals Y1 (teller’s utilization)
Uncontrolled l=45/hour: X11,X12…. Y2 (average delay)
variables Service times: N(D2,0.22), Model Y3(maximum queue length)
X21,X22…. “black box” Secondary interest:
f(x,D)=Y Y4 (observed arrival rate)
D1=1 (one teller) Y5 (average service time)
Controlled
variables
D2=1.1 min Y6 (sample variance of
D3=1 (one line) service times)
30


Validate Input-Output Transformation


• Real system data are necessary for validation.
– Average delays should have been collected during the
same time period (11 am to 1 pm on the same Friday).
• Compare the average delay from the model, Y, with
the actual delay, Z:
– The observed average delay is Z = 4.3 minutes; consider
this the true mean value, μ0 = 4.3.
– When the model is run with generated random variates X1n
and X2n, Y should be close to Z.
– Six statistically independent replications of the model,
each of 2-hour duration, were run.

Validate Input-Output Transformation


• Compare the average delay from the model, Y, with
the actual system delay, Z:
– Hypothesis testing: evaluate whether the simulation and
the real system are the same (w.r.t. the output measures)
H0: E(Y) = 4.3 minutes
H1: E(Y) ≠ 4.3 minutes
• If we reject H0 (at a significance level α), the current version of the
model (stated in H0) is rejected and the modeler needs to improve
the model.
• If we fail to reject H0, there is no reason to consider the model
invalid.



Validate Input-Output Transformation

Average delay times from the simulation model: Y1, Y2, …, Y6 are iid random variables.

Replication   Average Delay
     1            2.79
     2            1.12
     3            2.24
     4            3.45
     5            3.13
     6            2.38

Validate Input-Output Transformation


• Hypothesis testing:
– Conduct the t test:
• Check the assumptions justifying a t test: the observations are
normally and independently distributed.
• Choose the level of significance (α = 0.05) and a sample size (n = 6).
• Compute the sample mean and sample standard deviation over the
n replications:

  Ȳ = (1/n) Σ Yi = 2.51

  S = [ Σ (Yi − Ȳ)² / (n − 1) ]^(1/2) = 0.82

• Compute the test statistic:

  t0 = (Ȳ − μ0) / (S/√n) = (2.51 − 4.3) / (0.82/√6) ≈ −5.3 < −t_{α/2, n−1} = −2.571

• Since t0 falls in the rejection region, H0 is rejected: this version of
the model needs improvement.
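The test can be reproduced numerically from the six tabulated replication averages (2.571 is t with 5 degrees of freedom at α/2 = 0.025; the unrounded statistic comes out near −5.32):

```python
import math

# One-sample two-sided t test of H0: E(Y) = 4.3 minutes, using the six
# replication averages from the table.
y = [2.79, 1.12, 2.24, 3.45, 3.13, 2.38]
n, mu0 = len(y), 4.3

ybar = sum(y) / n
s = math.sqrt(sum((yi - ybar) ** 2 for yi in y) / (n - 1))
t0 = (ybar - mu0) / (s / math.sqrt(n))
t_crit = 2.571  # t_{0.025, 5} from a t table

print(round(ybar, 2), round(s, 2), round(t0, 2))  # 2.52 0.82 -5.32
print("reject H0" if abs(t0) > t_crit else "fail to reject H0")  # reject H0
```

Since |t0| far exceeds the critical value, the model's predicted average delay is inconsistent with the observed 4.3 minutes and the model must be revised.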


Summary
• Model validation is essential:
– Model verification
– Calibration and validation
– Conceptual validation
• It is best to compare system data to model data using a wide
variety of techniques
• Some techniques covered (in increasing cost-to-value
ratios)
– Ensure high face validity by consulting knowledgeable persons
– Conduct simple statistical tests on assumed distributional forms
– Compare model output to system output by statistical tests

General Scheme

System
  → analysis  [validation; establish credibility]
Assumptions and data
  → programming  [verification]
Simulation program
  → make model runs
Correct results available
  → present results to management  [validation; establish credibility]
Results used in the decision-making process
