Data Quality Assessment Tool for USAID Implementing Partners
Fill in the details below for each IP assessed:
Name of Implementing Partner:
Service Area:
Date of the assessment:
Implementing Partner staff consulted
Names | Designation/Position | Telephone Contact
DQA Assessment team members
INSTRUCTIONS
Introduction
The purpose of this tool is to enable USAID Uganda staff, MEL Contractors, and
implementing partners (IPs) to conduct data quality assessments (DQAs). The
purpose of the DQAs is to ensure that the USAID mission/office and DO team
are aware of the:
• Strengths and weaknesses of the data, as determined by applying the
five data quality standards of Validity, Reliability, Timeliness,
Precision, and Integrity, as defined in ADS 201.
• Extent to which the data can be trusted to influence
management decisions.
The tool is divided into two sections: Section I is administered at the
IP headquarters and Section II at the site/service delivery point. The two
sections include the criteria for assessing the indicator-specific data
management system and a data validation checklist.
Steps to Follow When Conducting a DQA
A. Planning
The IPs should be informed of the impending DQA in advance and requested
to make the relevant management and MEL staff, as well as relevant
documentation, available on the assessment day.
B. Completing the tool
• Start by visiting the IP's Headquarters and/or Regional Offices; then,
together with the IP staff, visit their sites/service delivery points (SDPs).
• For each entity visited, complete the DQA tool for that level (IP or
service delivery point).
• Use the options provided to indicate the numeric value of your response to
each question. Put your answer in the space provided for the response.
• Complete one tool per indicator for each IP and/or SDP/site.
• An appreciative inquiry approach should be applied throughout the process
using a supportive supervision process.
• The emphasis should be on direct observation of documents and/or data
collection tools.
C. Feedback, documentation & follow up on findings
• Feedback should be given to IP staff based on the data quality
improvement plan template attached.
• The DQA team should prepare a detailed report using the template
attached (see Annex 1 of the DQA protocol).
• A copy of the final report that has been signed off by the respective
AOR/COR should be shared with the respective IP and as appropriate with
the relevant staff at each level assessed.
• AORs shall take lead responsibility for ensuring that follow-up visits are
conducted to assess the implementation of the suggested solutions to
identified gaps.
• USAID (i.e. the AORs) and IPs should incorporate any learning from the
assessments into program improvement.
Data Quality Assessment Tool - For USAID Implementing Partners
Fill in the details below for each of the indicators and IP assessed
Title of performance indicator:
Data source(s):
Period for which the data are being reported:
Is the Indicator a standard or custom indicator?
Date of the assessment:
Table A: Data Management System
QUESTIONS | Response options | Comments
General Questions
1. Please provide an overview
of this Activity's M&E system
and structure
2. Could you please walk us through how data for this indicator is
generated, recorded, processed, stored, and reported?
3. Is the data collected for this indicator aggregated routinely through
your established MEL systems?
Yes---------1
No----------0
4. Does your organization collect the information for this indicator
itself? Probe if the IP receives this information from other
organization(s) and ask to see the list.
Yes---------1
No----------0
A. Validity
i. What is your definition for this indicator? Ask for the PIRS, review
the definition, and compare it with the standard indicator definition
where applicable. Check if the indicator definition is operationally
correct.
Yes---------1
No----------0
ii. Is there a standard data collection tool used to collect data for this
indicator? Look at the tool and confirm that it has all the required
variables.
Yes---------1
No----------0
iii. Does the data collection tool capture all required information about
the indicator, e.g. unit of measure, disaggregation, sources, date, etc.?
Check if all the variables are filled for the period under review.
Yes---------1
No----------0
N/A-----------9
iv. How do you avoid or minimize double-counting for this indicator?
Probe for the existence of unique identifiers, electronic checks, multiple
data points for the same clients, for the same services, etc.
Yes---------1
No----------0
N/A-----------9
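The unique-identifier check above can be illustrated with a minimal Python sketch. This is not part of the DQA tool itself; the record structure and field names ("client_id", "service", "date") are illustrative assumptions about how an IP's dataset might be keyed.

```python
# Hypothetical sketch: flag records that repeat the same unique key,
# i.e. potential double-counting of the same client/service/date.
def find_duplicates(records):
    """Return records whose (client_id, service, date) key was already seen."""
    seen = set()
    duplicates = []
    for rec in records:
        key = (rec["client_id"], rec["service"], rec["date"])
        if key in seen:
            duplicates.append(rec)
        else:
            seen.add(key)
    return duplicates

records = [
    {"client_id": "UG-001", "service": "testing", "date": "2024-01-15"},
    {"client_id": "UG-002", "service": "testing", "date": "2024-01-15"},
    {"client_id": "UG-001", "service": "testing", "date": "2024-01-15"},  # repeat
]
print(find_duplicates(records))  # the one repeated record is flagged
```

An electronic check of this kind complements, but does not replace, manual review of source documents.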
v. Cross-check reported data versus data in the IP database. Is there any
discrepancy between the data at the organization's HQ database and that
reported to USAID PRS (i.e. over- or under-counting)? Verify data reported
to USAID PRS against data at the partner organization's central database.
Tick Yes if the variance/discrepancy is within +/-5%.
Yes---------1
No----------0
N/A-----------9
Comments: What is the reason for the discrepancy, if any?
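The +/-5% tolerance in the cross-check above can be computed as the absolute difference between the reported figure and the central database figure, taken as a share of the database figure. A minimal sketch (illustrative only; the example figures are invented):

```python
# Hypothetical sketch: check whether a reported value is within a
# given tolerance (default 5%) of the value in the central database.
def within_tolerance(reported, database, tolerance=0.05):
    """True when |reported - database| / |database| <= tolerance."""
    if database == 0:
        # No meaningful percentage when the database value is zero;
        # treat as within tolerance only if the reported value is also zero.
        return reported == 0
    return abs(reported - database) / abs(database) <= tolerance

print(within_tolerance(1030, 1000))  # True  (3% variance, within +/-5%)
print(within_tolerance(1100, 1000))  # False (10% variance)
```

Whatever the result, the reason for any discrepancy should still be documented in the Comments column.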
B. Reliability
i. Are the data collection processes stable and consistent over the last
six months in terms of methodology and tools?
Yes---------1
No----------0
N/A-----------9
ii. Has the data source for this indicator changed in the last 3 years?
(E.g. units, respondents, location, etc.)
Yes---------1
No----------0
N/A-----------9
iii. Are the data analysis processes stable and consistent over the last
six months? (procedure, formula)
Yes---------1
No----------0
N/A-----------9
iv. Do you have documented standard procedures for data collection for
this indicator? Request a hard copy of the reference documents.
Yes---------1
No----------0
N/A-----------9
v. Do you have documented standard procedures for data analysis for this
indicator? Request a hard copy of the reference documents.
Yes---------1
No----------0
N/A-----------9
C. Precision
i. Does this indicator have the required disaggregates? Cross-check the
PIRS versus the standard PIRS for the required disaggregation.
Yes---------1
No----------0
N/A-----------9
ii. If so, is the data collection system able to collect the required
disaggregates for this indicator? Cross-check if the data collection and
reporting tools, as well as the database, include variables for the
required disaggregation.
Yes---------1
No----------0
N/A-----------9
iii. Has the margin of error been reported along with the data, if
applicable?
Yes---------1
No----------0
N/A-----------9
iv. Is the margin of error less than the expected change being measured,
if applicable?
Yes---------1
No----------0
N/A-----------9
v. Are the same formulae applied consistently for the last 2 years, e.g.
if data from multiple sources need to be aggregated?
Yes---------1
No----------0
N/A-----------9
D. Integrity
i. Are there procedures or safeguards in place to minimize transcription
errors? (E.g. field data verification, double entry of data for large
surveys, etc.)
Yes---------1
No----------0
N/A-----------9
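Double entry, one of the safeguards mentioned above, means the same paper form is keyed in twice by independent operators and the two versions are compared field by field. A minimal sketch of such a comparison (illustrative only; the field names and values are invented):

```python
# Hypothetical sketch: compare two independent entries of the same form
# and report every field where the values disagree.
def double_entry_mismatches(entry_a, entry_b):
    """Return {field: (value_a, value_b)} for fields that differ."""
    return {
        field: (entry_a[field], entry_b[field])
        for field in entry_a
        if entry_a[field] != entry_b[field]
    }

first = {"age": 34, "sex": "F", "district": "Gulu"}
second = {"age": 43, "sex": "F", "district": "Gulu"}  # transposed digits
print(double_entry_mismatches(first, second))  # {'age': (34, 43)}
```

Mismatched fields are then resolved against the original paper record before the data is accepted.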
ii. Are there procedures or safeguards in place to minimize system errors?
(E.g. electronic checks, random checks by supervisors of data entered)
Yes---------1
No----------0
N/A-----------9
iii. Is the data stored in an adequately secure place at different levels?
Yes---------1
No----------0
N/A-----------9
iv. Is there independence in key data collection, management, and
assessment procedures? Is there a possibility of external influence on the
recording and compilation of data?
Yes---------1
No----------0
N/A-----------9
v. Are mechanisms in place to prevent unauthorized changes to the data?
Yes---------1
No----------0
N/A-----------9
E. Timeliness
i. Is the data for this indicator collected as scheduled?
Yes---------1
No----------0
N/A-----------9
ii. Is the data for this indicator analyzed as per the required time and
disaggregation?
Yes---------1
No----------0
N/A-----------9
iii. Are the data reported as soon as possible within the required
reporting timelines (e.g. quarterly, semi-annual, and annual reports)?
Yes---------1
No----------0
N/A-----------9
iv. Are the reported data the most current practically available?
Yes---------1
No----------0
N/A-----------9
v. Is data available frequently enough to inform program management
decisions?
Yes---------1
No----------0
N/A-----------9
Summary of key findings and recommendations