ISO 17025 Proficiency Testing Guide

This document provides information about proficiency testing (PT) and inter-laboratory comparison (ILC) requirements for laboratories. Key points include:
- PT and ILC are critical technical requirements for laboratories under ISO 17025.
- Laboratories must participate in PT exercises run by authorized organizations to test the accuracy of their results.
- ILC can be initiated by a laboratory but requires higher confidentiality and more involvement than PT.
- The z-score is used to evaluate laboratory performance on PT and ILC exercises, with criteria for satisfactory, questionable, and unsatisfactory scores.
- Examples are provided of calculating z-scores to evaluate laboratory results from PT or ILC exercises.


ISO 17025 TECHNICAL REQUIREMENTS
Clause 5.1 to 5.10

Chapter 6
CRITICAL TECHNICAL REQUIREMENTS
CRITICAL REQUIREMENTS

 PT and ILC (5.9)

 Z-Score (5.9)

 Estimation of Uncertainty (5.4.6)

 Statistical Analysis (5.9)

PROFICIENCY TESTING
And Inter-Lab Comparison
PROFICIENCY TESTING (PT)…
 Only authorized organizations are allowed to
establish a PT Exercise
 Authorization is granted by the American
Association for Laboratory Accreditation (A2LA)
 The organization has to ensure compliance
with ISO Guide 43 (meant for PT and ILC)
 Labs get registered with them for
participation in this exercise

PROFICIENCY TESTING (PT)
 PT Providers send “Blind Samples” to
registered Labs
 With a List of Tests to be performed along
with calculation of Uncertainty (in most
cases)
 Lab has to submit the results within a given
deadline
 After receiving all the results, the PT
Provider compiles them and performs a
comparison analysis (z-score)

PARTICIPATING IN PT …
 As PT participation has become compulsory
under PNAC requirements, ILC can be set aside
(for now)
 PT requires us to register with a PT Provider
(whom have you selected?)
 Every PT provider comes with its own
requirements
 What should you expect from them?

PARTICIPATING IN PT …
 The following are the general practices of a PT
Provider;
 Your Lab’s Code (Unique ID)
 A Sample (very securely sealed and wrapped)
 A brief document about the sample (with the
information you need to perform the test)
 A reporting format expressing the expected
outcomes of the activity
 Instructions about sending the results (they may
require you to return the sample, but usually only
when the sample has expired or is deemed useless
after the test has been performed)

PARTICIPATING IN PT …
 What to do after you have received the
sample?
 Inspect the sample for any defects
 Report any defects immediately, especially if
the sample is found to be in an inappropriate
condition
 Send them an acknowledgement of receipt as
soon as you get the sample (fax or email
recommended)!
 Start testing as soon as possible and mark
your calendar with the given deadline!

PARTICIPATING IN PT …
 Who should perform?
 Only the best performers are recommended
 As failure not only costs you a few dollars
but also;
 Creates an NC which needs to be investigated
 The NC can only be closed after participating in
a similar PT and ensuring that you get
“satisfactory” results
 But how to ensure a good PT participation?

PARTICIPATING IN PT
 The following steps are necessary to ensure a
successful PT Participation;
 At least one Performer should be able to
supervise (more than one person can participate
if found necessary)
 Equipment calibrated and in its best condition
 Accessories and additional utilities exclusively
available for the test (includes redundant power
backups)
 Perform as you would perform a critical test
using your best method (STM recommended)
 Pray!

INTER-LAB COMPARISON …
 Consider being a PT Provider
 You have to ensure the following as the
initiator of ILC;
 Confidentiality Controls (Why?)
 Selecting the Team to Monitor the ILC (They are
not allowed to Participate or to disclose
confidential information)
 Selecting the Test
 Selecting the type of test samples
 Selecting the method to prepare a sample
(includes storage criteria, storage facility,
transport to other labs)
INTER-LAB COMPARISON …
 You need to define unique coding schemes for
both Labs involved and sample
 You need to define Master Samples whose
results are known to the team handling ILC
 Master Sample characteristics must match those of the other samples
 Sample codes can differentiate based on their
origination for easier tracking;
 Sample from a Lahore Lab – SXLMS2547
 Sample from an Islamabad Lab – SXILM9987
 The code, however, does not need to give “clues”
about its origin

INTER-LAB COMPARISON …
 Define the packing scheme for sending the
sample
 Develop instructions for the labs involved to
perform the test on the sample (the instructions
need to be comprehensive but should not disclose
critical information about the sample's condition
and origin)
 Develop reporting format that can provide your
team the required information necessary for
ILC analysis (Z-Score, etc.)
 Get a good courier service or a good personal
delivery system
INTER-LAB COMPARISON
 ILC is a hectic activity
 In many ways it seems less expensive than PT
 But comparing time and money, you can easily
see that ILC may cost less in dollars but is
punishing in time consumption
 ILC requires more involvement of Senior
Authorities than PT
 And most of all it requires a higher level of
confidentiality from your team; sharing with
your team is cheating ;)
 Here are some examples of ILC
SOME COMPARISON EXAMPLES
[Chart] Sample B – Analysis of Test Results: EC (dS/m) reported by LAB 1 through LAB 9.
Min: 0.08, Max: 1.90, Average: 0.76, Range: 1.83
SOME COMPARISON EXAMPLES
[Chart] Sample C – Analysis of Test Results: EC (dS/m) reported by LAB 1 through LAB 9.
Min: 0.09, Max: 2.00, Average: 0.89, Range: 1.91
SOME COMPARISON EXAMPLES
[Chart] EC comparison from all samples (A, B and C) across LAB 1 through LAB 9, EC in dS/m.
CALCULATING Z-SCORE
For Inter-Lab Comparison Analysis
Z-SCORE
 What do we need this for?
 It is a simple formula that provides a criterion
for rating labs participating in ILC or PT
 You can use this in case you decide to
perform an ILC
 The z-score is defined in ISO Guide 43; this
portion of the training is based on the same guide

AS STATED IN GUIDE 43 OF ISO
 Formula for z-Score:

 z = (x – X) / s

 Where, z = the required result
 x = Result from the Lab
 X = True Value / Average / Median / Mode
 s = Standard Deviation
CRITERIA FOR Z-SCORE
 |z| ≤ 2 (Satisfactory)
 The z-score is less than or equal to 2
 2 < |z| < 3 (Questionable)
 The z-score is greater than 2 but less than 3
 |z| ≥ 3 (Unsatisfactory)
 The z-score is greater than or equal to 3
 Let's use this formula in an example for
better clarity!

Z-SCORE EXAMPLE
No.   Lab Results
1     7.02
2     7.04
3     7.03
4     7.01
5     7.03
6     7.02
7     7.03
8     7.01
9     7.04
10    7.01

True Value: 7.01      SD: 0.011738
Z-SCORE EXAMPLE
No. Lab Results Z-Score Level
1 7.02
2 7.04
3 7.03
4 7.01
5 7.03
6 7.02
7 7.03
8 7.06
9 7.04
10 7.01

Z-SCORE EXAMPLE
No. Lab Results Z-Score Level
1 7.02 0.656217959
2 7.04 1.968653877
3 7.03 1.312435918
4 7.01 0
5 7.03 1.312435918
6 7.02 0.656217959
7 7.03 1.312435918
8 7.06 3.281089794
9 7.04 1.968653877
10 7.01 0

Z-SCORE EXAMPLE
No. Lab Results Z-Score Level
1 7.02 0.656217959 Satisfactory
2 7.04 1.968653877 Satisfactory
3 7.03 1.312435918 Satisfactory
4 7.01 0 Satisfactory
5 7.03 1.312435918 Satisfactory
6 7.02 0.656217959 Satisfactory
7 7.03 1.312435918 Satisfactory
8 7.06 3.281089794 Unsatisfactory
9 7.04 1.968653877 Satisfactory
10 7.01 0 Satisfactory
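The short Python sketch below is an illustration (not part of the original training) that reproduces the z-scores and ratings in the table above. Note that these z-scores correspond to a standard deviation of about 0.0152, recomputed from the ten results listed here (where result No. 8 is 7.06), rather than the 0.011738 quoted with the earlier data set; X is taken as the true value, 7.01.

import statistics

results = [7.02, 7.04, 7.03, 7.01, 7.03, 7.02, 7.03, 7.06, 7.04, 7.01]
X = 7.01                        # true / assigned value
s = statistics.stdev(results)   # sample standard deviation, approx. 0.0152

def rate(z):
    # ISO Guide 43 criteria: |z| <= 2 satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory
    if abs(z) <= 2:
        return "Satisfactory"
    if abs(z) < 3:
        return "Questionable"
    return "Unsatisfactory"

for no, x in enumerate(results, start=1):
    z = (x - X) / s
    print(f"{no:2d}  {x:.2f}  z = {z:.3f}  {rate(z)}")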

Z-SCORE EXAMPLE
[Chart] Lab results for participants 1–10 plotted on a y-axis running from 6.98 to 7.07.
ESTIMATION OF UNCERTAINTY
ESTIMATION OF UNCERTAINTY
 A very important requirement of ISO 17025
 The estimation involves statistics and can be
done using various methods
 Unfortunately we can adopt only one, so we
will stick to that one!
 The method we’ll be using is derived from
PNAC’s training on the same topic!
 Before we perform it, we need to ensure we
are covering the Basics…

WHAT IS UNCERTAINTY? …
 Is it a disease, some irresolvable mystery?
 Can it be removed?
 Why does it need to be calculated?
 Is anything perfect?
 How do you define what is Perfect?
 Can you eliminate errors?
 Is it a type of Error?

WHAT IS UNCERTAINTY? …
 To understand uncertainty, we need to
understand “Error”
 There are generally two types of Error
 Systematic or Technical
 Random

ERROR VS. UNCERTAINTY …
 Error is the difference from the true value
 Systematic Error
 An error that can be corrected using a correction
factor
 An error due to a blunder, technical fault or
malfunction
 Random Error
 Difference from the True Value because of
influencing factors
 This is also known as Uncertainty
 But this error does not indicate a “problem”; it only
expresses a fact – perfection is never 100%
ERROR VS. UNCERTAINTY
 Error is the Difference between “Measured
Value” and the “True Value”
 In statistical terms, Error can be expressed
as either negative or positive
 Whereas, Uncertainty is the deviation from the
True Value and is always in the form of a
“Range”
 Example of Error! (An Auto-Pipette adds
+0.2 to the actual result; how do you
correct it?)
 Example of Uncertainty! (±0.02)
BASICS OF UNCERTAINTY
These definitions will help us in better understanding
PRECISION
 Precision is the numerical agreement
between two or more measurements
 The precision can be reported as a range for
a measurement (difference between the min
and max)
 It is a measure of how close together the
measurements are, not how close they are
to the correct or true value
 This is a useful measure of the performance
of a test method

ACCURACY
 Accuracy is the nearness of a measurement
to the accepted or true value
 This is a useful measure and what most
customers are interested in when they want
to know about the performance of a test
method
 In general, Accuracy matters more than Precision
 But Uncertainty requires both at the same
time!

PRECISION VS. ACCURACY
[Image slides: illustrations comparing precision and accuracy]
REFERENCE VALUE
 The value which represents the “reference” or
“intended” result
 It can be;
 True Value of the Sample (usually a standardized
sample)
 Mean (Average)
 Median (Central Value)
 Mode (most repeated Value)
 Reference Value remains the same
throughout the test

HOW IS UNCERTAINTY
CREATED?
 When you weigh a mass of 2 kg on a balance,
it reads 2.002 kg?
 What does that mean?
 Is the balance out of calibration?
 Is the Mass really 2 kg?
 Was the performer weighing it correctly?
 Was the surface of the balance clean and
without dust?
 Is this activity affected by the Lab's temperature
and humidity?

SOURCES OF UNCERTAINTY
1. Person Performing
a. Uncertainty of Repeatability
b. Uncertainty of Reproducibility
2. Method being used
3. Equipment and Accessories utilized
4. Material, Sample and Chemical
5. Environment and Atmosphere

TYPES OF UNCERTAINTY
 Type A
 When You calculate it yourself through statistical
means
 Common Uncertainties: Repeatability &
Reproducibility
 Type B
 Through Published and Recognized Source
 You use it in finding the Final Uncertainty
 Common Example: Method, Standards, Manuals

CONFIDENCE INTERVAL
 There are Three Common Levels known as “k”
 k = 1; Confidence Level 68%
 k = 2; Confidence Level 95%
 k = 3; Confidence Level 99.7%
 The Final Uncertainty will be multiplied by k
(at least 2) to ensure compliance with ISO
17025
 Uncertainty without Confidence Interval is not
acceptable
 We shall use k = 2 (95% Confidence)

STAGES OF UNCERTAINTY
 Identify Uncertainty Sources particularly
affecting the test under consideration
 Quantify the Sources
 Find Sensitivity Coefficient against each
source
 Find the Contribution Value for Combined
Uncertainty
 Calculate Combined Uncertainty
 Calculate Expanded Uncertainty by
Multiplying CU with 2 (Why 2?)
 You are done!
HOW TO PERFORM?
 To ensure that we are able to understand
this completely, we’ll discuss these steps
with the help of an example
 Keep your calculator and pencils ready, as
“you” shall be doing this manually!

UNCERTAINTY EXERCISE

EXAMPLE …
 We’ll take example of pH of Water (as
currently we have complete data associated
with its uncertainty)
 Now considering what we just discussed, can
you figure out the “Sources of Uncertainty”
relating to this test?
 Remember, we only need sources that can be
quantified!

EXAMPLE …
 Test: pH of Water
 Analysts: A & B (Repeatability &
Reproducibility)
 Buffer Solution: 7.00 (Uncertainty: ±0.02)
 pH Meter Uncertainty (Taken from
Calibration Certificate): ±0.02
 Temperature Limit: 23 ±2 °C
 Required Confidence Level: 95% (k = 2)

 Find Out Expanded Uncertainty

EXAMPLE …
 Here is some more information on the
Analysts!
S/N Analyst 1 Analyst 2
1 7 7
2 7 7.01
3 7.01 7.04
4 7.01 7
5 7.01 7.36
6 7.01 7
7 7 7
8 7.01 7.03
9 7 7
10 7.01 7.06

EXAMPLE …
 Additional Information on Buffer Solution, if
it helps that is!
 I couldn’t find any Confidence Level!

TEMPERATURE    pH of BUFFER SOLUTION    MU
20 °C          7.02                     ±0.02
25 °C          7.04                     ±0.02
30 °C          7.06                     ±0.02

EXAMPLE …
 Some information on pH Meter;
 It is clean
 It is not new but was calibrated a month ago
by an external body
 It’s calibration status was checked and found
positive before this activity
 Certificate provides confidence level of 95%
 Calibration Certificate does not bear a logo
of PNAC

EXAMPLE …
 We’ll go for TYPE-A first, i.e. Analysts as we
need to find their Uncertainty on our own
 We need to find their Standard Deviations
 Perform the following steps;
 Find their Averages …

EXAMPLE …
 Standard Deviation formula;
 SD = √( ∑ (Xi – x)² / (n – 1) ), where x is the average
 In words: subtract the average from each result
and square the difference
 Sum the squared values and divide the total by the
number of results minus one (n – 1)
 Take the square root of the resulting figure
 And you have the SD of the Analyst
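As an illustration only, here is a minimal Python sketch of the steps just described; applied to the two analysts' results it should reproduce the values worked out by hand on the following slides (about 0.005164 and 0.110955).

import math

def sample_sd(values):
    # SD = sqrt( sum((xi - mean)^2) / (n - 1) )
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

analyst_1 = [7, 7, 7.01, 7.01, 7.01, 7.01, 7, 7.01, 7, 7.01]
analyst_2 = [7, 7.01, 7.04, 7, 7.36, 7, 7, 7.03, 7, 7.06]
print(sample_sd(analyst_1))   # approx. 0.005164
print(sample_sd(analyst_2))   # approx. 0.110955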

EXAMPLE … FOR ANALYST 1
S/N Analyst 1 AX - X SQ(AX-X)
1 7
2 7
3 7.01
4 7.01
5 7.01
6 7.01
7 7
8 7.01
9 7
10 7.01
Average

EXAMPLE … FOR ANALYST 1
S/N Analyst 1 AX - X SQ(AX-X)
1 7 -0.006
2 7 -0.006
3 7.01 0.004
4 7.01 0.004
5 7.01 0.004
6 7.01 0.004
7 7 -0.006
8 7.01 0.004
9 7 -0.006
10 7.01 0.004
Average 7.006  

EXAMPLE … FOR ANALYST 1
S/N Analyst 1 AX - X SQ(AX-X)
1 7 -0.006 0.000036
2 7 -0.006 0.000036
3 7.01 0.004 0.000016
4 7.01 0.004 0.000016
5 7.01 0.004 0.000016
6 7.01 0.004 0.000016
7 7 -0.006 0.000036
8 7.01 0.004 0.000016
9 7 -0.006 0.000036
10 7.01 0.004 0.000016
Average 7.006   0.000240

EXAMPLE … FOR ANALYST 1
 Put the Values in the formula;
 SD = √( ∑ (Xi – x)² / (n – 1) )
 SD = √( 0.000240 / (10 – 1) )
 SD = 0.005163978
 What is the Confidence Level at this point?
 SD @ 68% is also called Standard Uncertainty
 Now perform the same activity for Analyst 2

EXAMPLE … FOR ANALYST 2
S/N Analyst 2 AX - X SQ(AX-X)
1 7
2 7.01
3 7.04
4 7
5 7.36
6 7
7 7
8 7.03
9 7
10 7.06
Average

EXAMPLE … FOR ANALYST 2
S/N Analyst 2 AX - X SQ(AX-X)
1 7 -0.05
2 7.01 -0.04
3 7.04 -0.01
4 7 -0.05
5 7.36 0.31
6 7 -0.05
7 7 -0.05
8 7.03 -0.02
9 7 -0.05
10 7.06 0.01
Average 7.05  

EXAMPLE … FOR ANALYST 2
S/N Analyst 2 AX - X SQ(AX-X)
1 7 -0.05 0.002500
2 7.01 -0.04 0.001600
3 7.04 -0.01 0.000100
4 7 -0.05 0.002500
5 7.36 0.31 0.096100
6 7 -0.05 0.002500
7 7 -0.05 0.002500
8 7.03 -0.02 0.000400
9 7 -0.05 0.002500
10 7.06 0.01 0.000100
Average 7.05   0.110800

EXAMPLE … FOR ANALYST 2
 Put the Values in the formula;
 SD = √( ∑ (Xi – x)² / (n – 1) )
 SD = √( 0.110800 / (10 – 1) )
 SD = 0.110955447

EXAMPLE …
 Now we have two results in the Analyst
category
 But we need only one result
 Which one to select?
 To be on the safe side, we always go for the
maximum SD!
 So whoever came up with the highest SD
becomes part of the Uncertainty Budget (what is
that?)

NOW WE GO FOR TYPE-B
 In most cases we get an uncertainty quoted
with a Confidence Level
 In this case, for the pH Meter we have ±0.02 at
a Confidence Level of 95%
 But to ensure we do everything according to
rules, we need to reduce its confidence level
to 68%
 But How?
 Simple …

EXAMPLE …
 Divide Uncertainty with 1.96
 i.e. 0.02 / 1.96
 Standard Uncertainty for pH Meter @
Confidence Level 68% is …
 0.010204

EXAMPLE …
 Continuing with another source
 Buffer Solution of pH 7.00!
 Now, there are two issues associated with it
 One is Uncertainty itself
 And the other: the temperature of the solution
influences the result (as you can see in the
buffer solution table)
 We’ll focus first on Uncertainty …

EXAMPLE …
 Now, in this case, we don't have a Confidence
Level
 In cases like these, we should never put the
same value of uncertainty into the Uncertainty
Budget
 Instead, we'll divide it by √3 (approximately
1.73)
 So, 0.02 / 1.73 will give us
 Standard Uncertainty @ Confidence Level
68%
 Standard Uncertainty = 0.01156
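As a small illustration (not taken from the slides), the snippet below performs both Type-B conversions used in this example: divide by 1.96 when the uncertainty is quoted at a 95% confidence level, and by √3 (a rectangular-distribution assumption) when no confidence level is stated.

import math

u_ph_meter = 0.02 / 1.96          # quoted at 95% confidence -> approx. 0.010204
u_buffer = 0.02 / math.sqrt(3)    # no confidence level stated -> approx. 0.01155
                                  # (the slides round sqrt(3) to 1.73, giving 0.01156)
print(u_ph_meter, u_buffer)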
EXAMPLE …
 Now in the case of Environment, we have a
temperature limit of 23 ±2 °C
 This is not an uncertainty, but in the case of
temperature and humidity limits, we can
take the limits (such as ±2) as the same
thing
 We shall not remove any confidence level
here, as there is no stated uncertainty, only an
allowable limit

EXAMPLE …
 Now we have Uncertainties associated with
all sources
 But before we move on, we need to make
sure that none of these sources are affected
by each other
 This is handled statistically through the "Sensitivity
Coefficient"
 Although it can be a difficult process, our
assumptions make it very easy to calculate
 How? And What Assumptions?

EXAMPLE …
 Let’s say
 Y = pH of Water
 Where, Y = f (X1 + X2 + X3 + X4)
 Where, X’s are the sources of uncertainty
 Now, we need to find Sensitivity Coefficient
for every source
 Statistically it is denoted as;
 For X1; C1 = dY / dX1
 So …

EXAMPLE …
 Putting Value of Y in equation
 C1 = d(X1 + X2 + X3 + X4) / dX1
 Now, if we cannot “Quantify” the effects of
other sources on X1, we shall assume them
as constant
 And Derivative of a Constant is always “0”
 So, C1 = (dX1 + 0 + 0 + 0) / dX1
 C1 = dX1 / dX1
 C1 = 1

EXAMPLE …
 Now considering Analysts and Equipment (C1
and C2), we can assume that they are not
influenced by sources that can be quantified
 So, we easily assume their answers to be “1”
 But in case of Environmental Temperature, it
will be “0”

 Why?

EXAMPLE …
 Assuming that test was performed in a
temperature controlled environment
 And considering the fact that it is the
temperature of the buffer solution that actually
influences the result, "NOT" the
environmental temperature
 We can simply consider it as a constant
during this activity
 Therefore, C4 would be …

EXAMPLE …
 Putting Value of Y in equation
 C4 = d(X1 + X2 + X3 + X4) / dX4
 So, C4 = (0 + 0 + 0 + 0) / dX4
 Why all zeroes? Because X4 itself is treated as a
constant (the environment is controlled), so even
its own term vanishes
 C4 = 0 / dX4
 C4 = 0
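To make the assumption explicit, here is a small illustrative sketch using the sympy library (an addition of ours, not mentioned in the slides): in an additive model where the environment temperature X4 is held constant and therefore does not appear, its partial derivative, and hence C4, comes out as 0, while the sources that do appear get a coefficient of 1. The buffer-solution coefficient C3 is instead estimated from the temperature table, as derived on the following slides.

import sympy as sp

X1, X2, X4 = sp.symbols('X1 X2 X4')   # analyst, pH meter, environment temperature
Y = X1 + X2                           # X4 is held constant, so it does not appear in the model
for X in (X1, X2, X4):
    print(X, sp.diff(Y, X))           # prints 1 for X1 and X2, and 0 for X4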

EXAMPLE …
 In our case the buffer solution is different,
which slightly changes our presentation of the
formula;
 Remember the Table for Buffer Solution?

TEMPERATURE    pH of BUFFER SOLUTION    MU
20 °C          7.02                     ±0.02
25 °C          7.04                     ±0.02
30 °C          7.06                     ±0.02

EXAMPLE …
 Now the temperature shift can be considered
as an additional influence “t” on buffer
solution “s”
 So, considering all other factors constant
(they’ll be zero, again);
 C3 = Δs / Δt
 Where, Delta (Δ) expresses the difference
between two end points of a single category,
i.e.

EXAMPLE …
 Δt = t2 – t1 & Δs = s2 – s1
 Δt = 25 – 20 & Δs = 7.04 – 7.02
 Putting these values into the formula;
 C3 = Δs / Δt
 C3 = (7.04 – 7.02) / (25 – 20)
 C3 = 0.02 / 5
 C3 = 0.004

EXAMPLE …
 Now we have all the required information
 We need to assemble it in a manner that it
becomes easy to access and use for future
reference and especially for our final
estimation!
 We’ll create an Uncertainty Budget!

UNCERTAINTY BUDGET
 Use the following table for each Testing
Activity (it is separate for each test);
1. Sources of Uncertainty
2. Estimate / Average / Reference Value
3. Uncertainty
4. Sensitivity Coefficient
5. Contribution in Combined Uncertainty

EXAMPLE …

Sources            Estimate /    SU /               Sensitivity    Contribution to
                   Average       Allowable Limit    Coefficient    Combined Uncertainty
Analyst            7.05          0.110955447        1
pH Meter           7             0.010204           1
Buffer Solution    7             0.01156            0.004
Environment        23            2                  0
EXAMPLE …

Sources            Estimate /    SU /               Sensitivity    Contribution to
                   Average       Allowable Limit    Coefficient    Combined Uncertainty
Analyst            7.05          0.110955447        1              0.110955447
pH Meter           7             0.010204           1              0.010204
Buffer Solution    7             0.01156            0.004          0.00004624
Environment        23            2                  0              0
EXAMPLE …
 Uc = Combined Uncertainty
 Uc = √( (Ux1)² + (Ux2)² + (Ux3)² + (Ux4)² )
 Uc = √( 0.0123111112 + 0.0001041216 +
0.0000000021 + 0.0000000000 )
 Uc = √ 0.0124152350
 Uc = 0.1114236733

 And what is the Confidence Level here?

EXAMPLE …
 So our current result is;
 +0.1114236733 or +0.1114
 At Confidence Level 68%
 But, the standard requires us to give higher
confidence
 Like 95%
 This type of Uncertainty is known as
Expanded Uncertainty
 Ue = Uc x k
 Where, “k” is Confidence Level

EXAMPLE …
 k = 1 equals 68% Confidence Level
 k = 2 equals 95% Confidence Level
 k = 3 equals 99.7% Confidence Level
 So, Ue at 95% would be …
 Ue = Uc x k
 Ue = 0.1114 x 2
 Ue = 0.2228

 So our result is;
 pH of Water = 7.00 ±0.2228 @ 95% Confidence
Level
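A minimal Python sketch (an illustration, not the official calculation sheet) that reproduces the whole budget above: each contribution is the standard uncertainty multiplied by its sensitivity coefficient, the combined uncertainty is the square root of the sum of the squared contributions, and the expanded uncertainty applies k = 2.

import math

# (source, standard uncertainty / allowable limit, sensitivity coefficient)
budget = [
    ("Analyst",         0.110955447, 1),      # Type A: the larger of the two analyst SDs
    ("pH meter",        0.010204,    1),      # 0.02 / 1.96 (quoted at 95%)
    ("Buffer solution", 0.01156,     0.004),  # 0.02 / 1.73, coefficient from the temperature table
    ("Environment",     2,           0),      # allowable limit of +/-2 degC, coefficient 0
]

u_c = math.sqrt(sum((u * c) ** 2 for _, u, c in budget))   # combined uncertainty, approx. 0.1114
u_e = 2 * u_c                                              # expanded uncertainty at k = 2, approx. 0.2228
print(f"Uc = {u_c:.4f}, Ue = {u_e:.4f} -> pH of Water = 7.00 +/- {u_e:.4f} at 95% confidence")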
UNCERTAINTY CONCLUDED
 This concludes one of the most important
chapters of this training as well as an
Important requirement of ISO 17025
 Hope you are able to perform it on your own!
 To make your life easier, we have developed
Uncertainty Calculation sheets in MS Excel
 These will definitely save a lot of your time
while you are performing the estimation exercise
 Moving on to our Last topic…

QUALITY CONTROL CHARTS
WHAT? & WHY?
 Quality Charts are tables / graphs that can
provide you with a progress review of the last 6
months or more of a system at a single
glance
 For example, you can make graphs that express
the number of Non-Conformities in the last 3 audits
 It would be like…

EXAMPLE …
[Chart] Bar graph of the number of NCs recorded in each of the last three audits.
CHARTS FOR TESTING
ACTIVITIES
 In this case you can make
charts/tables/graphs associated with your
performance
 Like, the number of tests which crossed pre-
defined limits in the last month
 Or, the number of test results going out of
limits in year 2008 compared to year 2007
 Through these analyses we can identify
"trends" in these issues
 Like …

EXAMPLE …
[Chart] Monthly test results (months 1–12) plotted together with Upper and Lower limit lines.
LIMITS
 What makes these charts effective are the
Limits
 Limits are in most cases pre-defined, like
your team knows when the sample’s results
are not “right”
 They are like the passing criteria of an exam,
e.g. anyone scoring lower than 75% in the
final exam will be facing The Wrath of
QMS.9000!
 But what if you don’t have any limits…

UNCERTAINTY CAN HELP!
 In cases when you have to check the trend of
your performance and you don’t have a pre-
defined limit
 We use Uncertainty to define our limit
 Like in the previous example of Uncertainty;
 Our Upper limit can be 7.2228 and Lower
limit can be 6.7772 (derived from 7.00
±0.2228)
 Its graph will be plotted as;

GRAPH
[Chart] Ten results plotted against the Upper (7.2228) and Lower (6.7772) limits.
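For illustration, a short hypothetical Python sketch of the same idea: results falling outside the uncertainty-derived limits (7.00 ±0.2228) are flagged so an NC can be raised. The result values here are invented for the example.

upper, lower = 7.2228, 6.7772                     # 7.00 +/- 0.2228 (expanded uncertainty)
results = [7.01, 7.05, 6.98, 7.30, 7.00, 6.75]    # hypothetical monthly results

for month, value in enumerate(results, start=1):
    status = "OK" if lower <= value <= upper else "outside limits -> record an NC"
    print(f"Result {month}: {value}  {status}")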
ADDITIONAL FUNCTIONS
 You can also make quality charts for
comparisons of internal retesting and
replicate testing activities of similar tests
 These comparisons will help you
determine regular fluctuations and their
causes
 In the end, please remember: all crossed limits
need to be recorded as "Non-
Conformities" and their causes need to be
investigated (use the CPA System)

CONCLUSION
 We hope this training has not drained the
last drop of energy out of you
 Because you are going to utilize that drop in
the Exam

 Please fill in our training evaluation form
before you take the exam

 And Good Luck!

