For instructions on using this template, please see the Notes to the Author/Template Instructions on page 17.
Notes on accessibility: This template has been tested for accessibility and works best with JAWS 11.0 or higher.
For questions about using this template, please contact CMS IT Governance
([email protected]). To request changes to the template, please submit an XLC Process Change Request (CR) (https://www.cms.gov/Research-Statistics-Data-and-Systems/CMS-Information-Technology/XLC/Downloads/XLCProcessChangeRequestCR.docx).
<Project Name/Acronym>
Test Case Specification
Version X.X
MM/DD/YYYY
Document Number: <document’s configuration item control number>
Contract Number: <current contract number of company maintaining document>
CMS XLC Table of Contents
Table of Contents
1. Introduction .............................................................................................................. 1
2. Overview ................................................................................................................... 2
3. Assumptions/Constraints/Risks ............................................................................. 3
3.1 Assumptions ..................................................................................................... 3
3.2 Constraints ........................................................................................................ 3
3.3 Risks ................................................................................................................. 3
4. Test Case Summary................................................................................................. 4
5. Test Case-To-Requirements Traceability Matrix ................................................... 5
6. Test Case Details ..................................................................................................... 6
6.1 <Test Case/Script Identifier> ............................................................................ 6
6.1.1 Test Objective ............................................................................................. 6
6.1.2 Inter-Case Dependencies ........................................................................... 6
6.1.3 Test Items ................................................................................................... 6
6.1.4 Prerequisite Conditions............................................................................... 6
6.1.5 Input Specifications..................................................................................... 7
6.1.6 Expected Test Results ................................................................................ 7
6.1.7 Pass/Fail Criteria ........................................................................................ 8
6.1.8 Test Procedure ........................................................................................... 8
6.1.9 Assumptions and Constraints ..................................................................... 9
6.2 <Test Case/Script Identifier> ............................................................................ 9
6.2.1 Test Objective ............................................................................................. 9
6.2.2 Inter-Case Dependencies ........................................................................... 9
6.2.3 Test Items ................................................................................................... 9
6.2.4 Prerequisite Conditions............................................................................. 10
6.2.5 Input Specifications................................................................................... 10
6.2.6 Expected Test Results .............................................................................. 10
6.2.7 Pass/Fail Criteria ...................................................................................... 10
6.2.8 Test Procedure ......................................................................................... 10
6.2.9 Assumptions and Constraints ................................................................... 10
Appendix A: Record of Changes ............................................................................... 11
Appendix B: Acronyms ............................................................................................... 12
Appendix C: Glossary ................................................................................................. 13
Appendix D: Referenced Documents ........................................................................ 14
Appendix E: Approvals ............................................................................................... 15
Appendix F: Additional Appendices .......................................................................... 16
TCS Version X.X ii <Project and release name>
Appendix G: Notes to the Author/Template Instructions ........................................ 17
Appendix H: XLC Template Revision History ........................................................... 18
List of Figures
No table of figures entries found.
List of Tables
Table 1 - Test Case Summary ........................................................................................ 4
Table 2 - Test Procedure Steps for Given Test Case/Script Identifier ............................. 9
Table 3 - Record of Changes ........................................................................................ 11
Table 4 - Acronyms ....................................................................................................... 12
Table 5 - Glossary ......................................................................................................... 13
Table 6 - Referenced Documents ................................................................................. 14
Table 7 - Approvals ....................................................................................................... 15
Table 8 - Test Case-To-Requirements Traceability Matrix ............................................ 16
Table 9 - XLC Template Revision History ..................................................................... 18
1. Introduction
Instructions: Provide full identifying information for the automated system, application, or
situation for which the Test Case Specification applies, including as applicable,
identification number(s), title(s)/name(s), abbreviation(s)/acronym(s), part number(s),
version number(s), and release number(s). Summarize the purpose of the document,
the scope of activities that resulted in its development, the intended audience for the
document, and expected evolution of the document. Also describe any security or
privacy considerations associated with use of the Test Case Specification.
2. Overview
Instructions: Briefly describe the purpose and context for the system or situation, and
summarize the history of its development. Include the high-level context diagram(s) for
the system and subsystems previously provided in the High-Level Technical Design
Concept/Alternatives, Requirements Document, and/or System Design Document
(SDD), updated as necessary to reflect any changes that have been made based on
more current information or understanding. If the high-level context diagram has been
updated, identify the changes that were made and why.
3. Assumptions/Constraints/Risks
3.1 Assumptions
Instructions: Describe any assumptions affecting the creation and/or execution of the
test cases/scripts in general. Assumptions made specific to an individual test case/script
are to be described in a later section corresponding with that particular test case/script.
3.2 Constraints
Instructions: Describe any limitations or constraints that have a significant impact on the
test cases/scripts in general. Constraints specific to an individual test case/script are to
be described in a later section corresponding with that particular test case/script.
Constraints may be imposed by any of the following (the list is not exhaustive):
- Hardware or software environment
- End-user environment
- Availability of resources
- Interoperability requirements
- Interface/protocol requirements
- Data repository and distribution requirements.
3.3 Risks
Instructions: Describe any risks associated with the test cases/scripts and proposed
mitigation strategies.
4. Test Case Summary
Instructions: Provide an overview of the test cases/scripts that will be executed. List
each test case/script by its project-unique identifier and title. Test cases/scripts may be
grouped by test function (e.g., user acceptance testing, Section 508 testing, system
testing, regression testing, etc.). If test case/script information is maintained in an
automated tool, this information may be exported or printed from the tool and included
as an appendix to this document that is referenced here.
Table 1 - Test Case Summary
Test Case/Script Identifier Test Case/Script Title Execution Priority
<Test Case/Script Identifier> <Test Case/Script Title> <Execution Priority>
5. Test Case-To-Requirements Traceability Matrix
Instructions: Provide a table that maps all of the requirements contained within the
Requirements Document to their corresponding test cases/scripts. Reference the
Appendices section of this document for a sample template for a Test Case-to-
Requirements Traceability Matrix. The completed traceability matrix should be inserted
here or a reference made to its inclusion as a separate appendix. If test case/script
information is maintained in an automated tool, the matrix may be exported or printed
from the tool for inclusion in this document.
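Where test case/script information is kept in an automated tool or even a simple machine-readable file, the matrix can be generated rather than maintained by hand. The following is a minimal, purely illustrative Python sketch; the requirement and test case identifiers are hypothetical placeholders, not CMS conventions:

```python
# Build a requirements-to-test-case traceability matrix from a mapping of
# test case identifiers to the requirements each one exercises.
# All identifiers below are hypothetical placeholders.

requirements = ["REQ-1.0", "REQ-1.1", "REQ-1.2", "REQ-2.0", "REQ-2.1"]

coverage = {
    "TC-01": ["REQ-1.0", "REQ-1.1"],
    "TC-02": ["REQ-1.2"],
    "TC-03": ["REQ-1.0", "REQ-2.0"],
}

# Invert the mapping: requirement -> sorted list of test cases tracing to it.
matrix = {req: sorted(tc for tc, reqs in coverage.items() if req in reqs)
          for req in requirements}

# One row per requirement; an empty row exposes an untested requirement.
for req, cases in matrix.items():
    print(f"{req}: {', '.join(cases) if cases else 'NOT COVERED'}")
```

Generating the matrix this way also makes coverage gaps visible automatically, which is the main reason the traceability matrix exists.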
6. Test Case Details
Instructions: Provide details for each test case/script identified in the Test Case
Summary section. There should be a separate detail section for each test case/script. If
test case/script information is maintained in an automated tool, the information
described in the following sub-sections should be collected for each test case/script.
This information may be printed from the tool and included as an appendix to this
document that is referenced here. The test case/script details may also be printed in a
tabular format to allow groupings of test cases/scripts with similar characteristics to
reduce the volume of reported information and make it easier to review the content.
6.1 <Test Case/Script Identifier>
Instructions: Provide a project-unique identifier and descriptive title for the test case or
test script. Identify the date, number, and version of the test case/script and any
subsequent changes to the test case/script specification. The number of the test
case/script may identify its level in relation to the level of the corresponding software to
assist in coordinating software development and test versions within configuration
management.
6.1.1 Test Objective
Instructions: Describe the purpose of the test case/script and provide a brief description.
Also, identify if the test case/script may be used by multiple test functions.
6.1.2 Inter-Case Dependencies
Instructions: List any prerequisite test cases/scripts that would create the test
environment or input data in order to run this test case/script. Also, list any post-
requisite test cases/scripts for which the running of this test case/script would create the
test environment or input data.
6.1.3 Test Items
Instructions: Describe the items or features (e.g., requirements, design specifications,
and code) to be tested by the test case/script. Keep in mind the level for which the test
case/script is written and describe the items/features accordingly. The item description
and definition can be referenced from any one of several sources, depending on the
level of the test case/script. It may be a good idea to reference the source document as
well (e.g., Requirements Document, System Design Document, User Manual,
Operations & Maintenance Manual, Installation Instructions from Version Description
Document, etc.).
6.1.4 Prerequisite Conditions
Instructions: Identify any prerequisite conditions that must be established prior to
performing the test case/script. The following considerations should be discussed, as
applicable:
- Environmental needs (e.g., hardware configurations, system software (e.g., operating systems, tools), other software applications, facilities, training);
- Stubs, drivers, flags, initial breakpoints, pointers, control parameters, or initial data to be set/reset prior to test commencement;
- Preset hardware conditions or states necessary to run the test case/script;
- Initial conditions to be used in making timing measurements;
- Conditioning of the simulated environment; and
- Other special conditions (e.g., interfaces) peculiar to the test case/script.
6.1.5 Input Specifications
Instructions: Describe all inputs required to execute the test case/script. Keep in mind
the level for which the test case/script is written and describe the inputs accordingly. Be
sure to identify all required inputs (e.g., data (values, ranges, sets), tables, human
actions, conditions (states), files (databases, control files, transaction files), and
relationships (timing)). The input can be described using text, a picture of a properly
completed screen, a file identifier, or an interface to another system. It is also
acceptable to simplify the documentation by using tables for data elements and values.
Include, as applicable, the following:
- Name, purpose, and description (e.g., range of values, accuracy) of each test input;
- Source of the test input and the method to be used for selecting the test input;
- Whether the test input is real or simulated;
- Time or event sequence of test input; and
- The manner in which the input data will be controlled to:
  - Test the item(s) with a minimum/reasonable number of data types and values.
  - Exercise the item(s) with a range of valid data types and values that test for overload, saturation, and other “worst case” effects.
  - Exercise the item(s) with invalid data types and values to test for appropriate handling of irregular inputs.
  - Permit retesting, if necessary.
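The points above on valid, "worst case," and invalid values can be captured as a small data-driven table of inputs. The Python sketch below is hypothetical: it assumes a numeric field that accepts integers 1 through 100, purely for illustration:

```python
# Data-driven input selection for a hypothetical numeric field that
# accepts integers 1..100: nominal, boundary ("worst case"), and invalid
# values, each tagged with its expected handling.

def validate_quantity(value):
    # Hypothetical validator standing in for the item under test.
    return isinstance(value, int) and 1 <= value <= 100

test_inputs = [
    (50,    True),   # nominal valid value
    (1,     True),   # lower boundary
    (100,   True),   # upper boundary
    (0,     False),  # just below range
    (101,   False),  # just above range (overload)
    ("ten", False),  # invalid data type
]

for value, should_pass in test_inputs:
    assert validate_quantity(value) == should_pass
print("all input checks passed")
```

Tabulating inputs this way documents the selection method (nominal, boundary, invalid) alongside each value and permits straightforward retesting.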
6.1.6 Expected Test Results
Instructions: Identify all expected test results for the test case/script, including both
intermediate and final results. Describe what the system should look like after the test
case/script is run by examining particular screens, reports, files, etc. Identify all outputs
required to verify the test case/script. Keep in mind the level for which the test
case/script is written and describe the outputs accordingly. Be sure to identify all outputs
(e.g., data (values, sets), tables, human actions, conditions (states), files (databases,
control files, transaction files), relationships, timing (response times, duration)). The
description of outputs can be simplified by using tables, and may even be included in
the same table as the associated input to further simplify the documentation and
improve its usefulness.
6.1.7 Pass/Fail Criteria
Instructions: Identify the criteria to be used for evaluating the intermediate and final
results of the test case/script, and determining the success or failure of the test
case/script. For each test result, the following information should be provided, as
applicable:
- The range or accuracy over which an output can vary and still be acceptable;
- Minimum number of combinations or alternatives of input and output conditions that constitute an acceptable test result;
- Maximum/minimum allowable test duration, in terms of time or number of events;
- Maximum number of interrupts, halts, or other system breaks that may occur;
- Allowable severity of processing errors;
- Conditions under which the result is inconclusive and re-testing is to be performed;
- Conditions under which the outputs are to be interpreted as indicating irregularities in input test data, in the test database/data files, or in test procedures;
- Allowable indications of the control, status, and results of the test and the readiness for the next test case/script (may be output of auxiliary test software); and
- Other criteria specific to the test case/script.
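The first criterion, an acceptable range over which an output can vary, is commonly expressed as a tolerance check. The Python sketch below is illustrative only; the expected value and tolerance are hypothetical, not prescribed:

```python
import math

# Pass/fail check for a numeric output that may vary within a tolerance.
# The expected value and tolerance are hypothetical placeholders.

expected = 98.6
tolerance = 0.5  # absolute tolerance, in the output's own units

def step_passes(actual):
    # The step passes if the measured output is within tolerance of expected.
    return math.isclose(actual, expected, abs_tol=tolerance)

print(step_passes(98.4))  # within tolerance
print(step_passes(97.0))  # outside tolerance
```

Recording the tolerance explicitly in the pass/fail criteria removes ambiguity when intermediate results drift slightly between test runs.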
6.1.8 Test Procedure
Instructions: Describe the series of individually numbered steps that are to be
completed in sequential order to execute the test procedure for the test case/script. For
convenience in document maintenance, the test procedures may be included as an
appendix and referenced in this paragraph. The appropriate level of detail in the test
procedure depends on the type of software being tested. For most software, each step
may include a logically-related series of keystrokes or other actions, as opposed to each
keystroke being a separate test procedure step. The appropriate level of detail is the
level at which it is useful to specify expected results and compare them to actual results.
The following should be provided for the test procedure, as applicable:
- Test operator actions and equipment operation required for each step, including commands, as applicable, to:
  - Initiate the test case/script and apply test inputs
  - Inspect test conditions
  - Perform interim evaluations of test results
  - Record data
  - Halt or interrupt the test case/script
  - Request diagnostic aids
  - Modify the database/data files
  - Repeat the test case if unsuccessful
  - Apply alternate modes as required by the test case/script
  - Terminate the test case/script.
- Expected result and evaluation criteria for each step.
- If the test case/script addresses multiple requirements, identification of which test procedure step(s) address which requirements.
- Actions to follow in the event of a program stop or indicated error, such as:
  - Recording of critical data from indicators for reference purposes
  - Halting or pausing time-sensitive test-support software and test apparatus
  - Collection of system and operator records of test results
- Actions to be used to reduce and analyze test results to accomplish the following:
  - Detect whether an output has been produced
  - Identify media and location of data produced by the test case/script
  - Evaluate output as a basis for continuation of test sequence
  - Evaluate test output against required output.
Table 2 - Test Procedure Steps for Given Test Case/Script Identifier
Step # Action Expected Results/Evaluation Criteria Requirement(s) Tested
<#> <Action> <Expected Results/Evaluation Criteria> <Requirement(s) Tested>
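When a test procedure is automated, each numbered step can pair its action with its expected result, mirroring the Step #/Action/Expected Results columns of the table above. The Python sketch below is hypothetical; the system under test is a stand-in function, not a real application:

```python
# Each procedure step pairs an action with its expected result, mirroring
# the Step # / Action / Expected Results columns. The "system under test"
# is a hypothetical stand-in; real steps would drive the actual application.

def system_under_test(command):
    # Stand-in for the application being tested.
    responses = {"login": "welcome", "logout": "goodbye"}
    return responses.get(command, "error")

steps = [
    (1, "login",  "welcome"),   # Step 1: initiate the test case
    (2, "logout", "goodbye"),   # Step 2: terminate the test case
]

# Execute each step in sequential order and record its pass/fail outcome.
results = []
for step_no, action, expected in steps:
    actual = system_under_test(action)
    results.append((step_no, actual == expected))

passed = all(ok for _, ok in results)
print("PASS" if passed else "FAIL", results)
```

Keeping one expected result per step makes it straightforward to identify which step failed and, via the Requirement(s) Tested column, which requirement is affected.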
6.1.9 Assumptions and Constraints
Instructions: Identify any assumptions made and constraints or limitations imposed in the description of the test case due to system or test conditions (e.g., limitations on timing, interfaces, equipment, personnel, and database/data files). If waivers or exceptions to specified limits and parameters are approved, they are to be identified and their effects and impacts upon the test case/script described.
6.2 <Test Case/Script Identifier>
6.2.1 Test Objective
6.2.2 Inter-Case Dependencies
6.2.3 Test Items
6.2.4 Prerequisite Conditions
6.2.5 Input Specifications
6.2.6 Expected Test Results
6.2.7 Pass/Fail Criteria
6.2.8 Test Procedure
6.2.9 Assumptions and Constraints
Appendix A: Record of Changes
Instructions: Provide information on how the development and distribution of the Test
Case Specification will be controlled and tracked. Use the table below to provide the
version number, the date of the version, the author/owner of the version, and a brief
description of the reason for creating the revised version.
Table 3 - Record of Changes
Version Number Date Author/Owner Description of Change
<X.X> <MM/DD/YYYY> CMS <Description of Change>
<X.X> <MM/DD/YYYY> CMS <Description of Change>
<X.X> <MM/DD/YYYY> CMS <Description of Change>
Appendix B: Acronyms
Instructions: Provide a list of acronyms and associated literal translations used within
the document. List the acronyms in alphabetical order using a tabular format as
depicted below.
Table 4 - Acronyms
Acronym Literal Translation
<Acronym> <Literal Translation>
<Acronym> <Literal Translation>
<Acronym> <Literal Translation>
Appendix C: Glossary
Instructions: Provide clear and concise definitions for terms used in this document that
may be unfamiliar to readers of the document. Terms are to be listed in alphabetical
order.
Table 5 - Glossary
Term Acronym Definition
<Term> <Acronym> <Definition>
<Term> <Acronym> <Definition>
<Term> <Acronym> <Definition>
Appendix D: Referenced Documents
Instructions: Summarize the relationship of this document to other relevant documents.
Provide identifying information for all documents used to arrive at and/or referenced
within this document (e.g., related and/or companion documents, prerequisite
documents, relevant technical documentation, etc.).
Table 6 - Referenced Documents
Document Name Document Location and/or URL Issuance Date
<Document Name> <Document Location and/or URL> <MM/DD/YYYY>
<Document Name> <Document Location and/or URL> <MM/DD/YYYY>
<Document Name> <Document Location and/or URL> <MM/DD/YYYY>
Appendix E: Approvals
The undersigned acknowledge that they have reviewed the Test Case Specification and agree
with the information presented within this document. Changes to this Test Case Specification
will be coordinated with, and approved by, the undersigned, or their designated representatives.
Instructions: List the individuals whose signatures are desired. Examples of such
individuals are Business Owner, Project Manager (if identified), and any appropriate
stakeholders. Add additional lines for signature as necessary.
Table 7 - Approvals
Document Approved By Date Approved
Name: <Name>, <Job Title> - <Company> Date
Name: <Name>, <Job Title> - <Company> Date
Name: <Name>, <Job Title> - <Company> Date
Name: <Name>, <Job Title> - <Company> Date
Appendix F: Additional Appendices
Instructions: Use appendices to facilitate ease of use and maintenance of the Test Case
Specification. Each appendix should be referenced in the main body of the document
where that information would normally have been provided. Suggested appendices
include, but are not limited to, the following:
Test Case Summary
Test Case-to-Requirements Traceability Matrix
Test Case Details
Below is an example of a test case-to-requirements traceability matrix. The table below
should be modified appropriately to reflect the actual identification and mapping of test
cases to requirements for the given system/project.
Table 8 - Test Case-To-Requirements Traceability Matrix
Requirement Test Case 01 Test Case 02 Test Case 03 Test Case 04 Test Case 05 Test Case 06
Requirement 1.0 <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability>
Requirement 1.1 <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability>
Requirement 1.2 <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability>
Requirement 2.0 <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability>
Requirement 2.1 <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability> <Identify traceability>
Appendix G: Notes to the Author/Template Instructions
This document is a template for creating a Test Case Specification for a given
investment or project. The final document should be delivered in an electronically
searchable format. The Test Case Specification should stand on its own with all
elements explained and acronyms spelled out for reader/reviewers, including reviewers
outside CMS who may not be familiar with CMS projects and investments.
This template includes instructions, boilerplate text, and fields. The developer should
note that:
Each section provides instructions or describes the intent, assumptions, and
context for content included in that section. Instructional text appears in blue
italicized font throughout this template.
Instructional text in each section should be replaced with information specific to
the particular investment.
Some text and tables are provided as boilerplate examples of wording and
formats that may be used or modified as appropriate.
When using this template, follow these steps:
1. Table captions and descriptions are to be placed left-aligned, above the table.
2. Modify any boilerplate text, as appropriate, to your specific investment.
3. Do not delete any headings. If the heading is not applicable to the investment,
enter “Not Applicable” under the heading.
4. All documents must be compliant with Section 508 requirements.
5. Figure captions and descriptions are to be placed left-aligned, below the
figure. All figures must have an associated tag providing appropriate
alternative text for Section 508 compliance.
6. Delete this “Notes to the Author/Template Instructions” page and all
instructions to the author before finalizing the initial draft of the document.
Appendix H: XLC Template Revision History
The following table records information regarding changes made to the XLC template
over time. This table is for use by the XLC Steering Committee only. To provide
information about the controlling and tracking of this artifact, please refer to the Record
of Changes section of this document.
This XLC Template Revision History pertains only to this template. Delete this XLC
Template Revision History heading and table when creating a new document based on
this template.
Table 9 - XLC Template Revision History
Version Number Date Author/Owner Description of Change
1.0 12/31/2009 ESD Deliverables Workgroup Baseline version
2.0 08/15/2014 Celia Shaunessy, XLC Steering Committee Changes made per CR 14-012
2.1 02/02/2015 Surya Potu, CMS/OEI/DPPIG Updated CMS logo
3.0 04/25/2017 CMS Updated template style sheet for Section 508 compliance; added instructional text to all blank cells in tables; added Acronym column to Table 5 - Glossary; reformatted Table 7 - Approvals in Appendix E: Approvals for Section 508 compliance; changed location of Appendix F: Additional Appendices so that it resides below Appendix E: Approvals and is no longer the last appendix in the template; added instructional text to Appendix H: XLC Template Revision History instructing authors to delete this appendix when creating a new document based on this template