Overview of TCBG LoadManager
2016
TCBG LoadManager - highlights
• Performance testing tool
• Besides T24, supports HTTP, HTTPS, TCP/IP, MQ, and any protocol accessible from Java
– The TCBG LoadManager is a performance testing tool that is
primarily used for testing T24. However, it is capable of testing
other client-server applications too.
– LoadManager can also simulate external interfaces, that is,
generate and accept the corresponding interface messages to and
from T24.
• Simulation of several users (sessions)
– The LoadManager is able to simulate several hundred users,
which is one of its most significant advantages over manual
testing. In most cases a manual test of this scale would not only
require a large staff, it would be outright impossible.
TCBG LoadManager – highlights (cont.)
• Distributed test execution (multiple agents)
– Due to the large number of user sessions and the high load, a
performance test could place an extreme load on the client
computer. To overcome this, the LoadManager is able to
distribute the client tasks among several agent computers in
a fairly convenient way.
• Capable of high-volume test
• Response time, throughput and customized metric
analysis
– Test evaluation is usually performed after the test has
finished; LoadManager supports this with several tables and
diagrams.
• Monitoring and interacting during test execution
LoadManager's approach to testing
• LoadManager considers the tested server(s) as a black box.
– It substitutes the clients and satellite systems and communicates with
the server over those connections.
– Besides initiating requests and accepting responses, LoadManager
also performs measurements (e.g. response times, transaction rates
etc.).
– Interface simulation is usually done in a "detached" manner.
[Diagram: LoadManager simulates both the Client (e.g. T24 Browser or
Desktop), which talks to the Server(s) (e.g. T24 server, Web or
Application server) via HTTP, TCP/IP etc., and a Satellite system, which
talks to the Server(s) over an interface connection (MQ, TC etc.).]
Application areas
LoadManager is used for
• On-line performance measurement
– Also: lock/deadlock detection
• Database building (upload, migration)
• Reproducing random errors
Out of scope
LoadManager is not used for
• Functional testing
– LoadManager concentrates on the performance, not the
functional correctness.
– Use TCBG TestManager for functional and regression testing.
• "Offline" testing
– Most typically, LoadManager is not used directly for COB/EOD
performance measurement. However, LoadManager can support
such tests by preparing the database / filling it with transactions.
On-line performance test
• Simulation of many concurrent T24 users
• Simulation of the T24 Desktop/Browser clients
• Simulation of interfaces
• Measurement of transactional throughput,
response times and custom metrics
• Storage and analysis of client-server
communication
• Lock detection
– Lock: two or more users attempt to access the same
exclusive resource (e.g. account number).
– Deadlock: a circular chain of users waiting for each
other.
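The lock and deadlock definitions above can be illustrated with a small sketch. This is not LoadManager code; it assumes the simplification that each simulated user waits for at most one resource at a time, so deadlock detection reduces to finding a cycle in the waits-for chain.

```java
// Sketch of deadlock detection via a waits-for relation: user A waits for
// the user holding the resource it needs, and so on. A deadlock is a
// circular chain of users waiting for each other.
import java.util.*;

public class WaitForGraph {
    private final Map<String, String> waitsFor = new HashMap<>();

    // Record that 'user' is waiting for a resource held by 'holder'.
    public void addWait(String user, String holder) {
        waitsFor.put(user, holder);
    }

    // Follow the waits-for chain; revisiting a user means a deadlock.
    public boolean isDeadlocked(String user) {
        Set<String> seen = new HashSet<>();
        String current = user;
        while (current != null) {
            if (!seen.add(current)) {
                return true; // circular chain of users waiting for each other
            }
            current = waitsFor.get(current);
        }
        return false; // the chain ends: an ordinary lock wait, not a deadlock
    }

    public static void main(String[] args) {
        WaitForGraph g = new WaitForGraph();
        g.addWait("userA", "userB"); // A waits for an account locked by B
        g.addWait("userB", "userA"); // B waits for an account locked by A
        System.out.println(g.isDeadlocked("userA")); // prints "true"

        WaitForGraph h = new WaitForGraph();
        h.addWait("userA", "userB"); // a plain lock wait, no cycle
        System.out.println(h.isDeadlocked("userA")); // prints "false"
    }
}
```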
Database building
• Database building is required for
– Examining a database that represents a future, planned,
not yet existing database
– Migration
– Preparing for COB/EOD performance measurement
• Globus/T24 browser/Desktop-like approach (in
contrast to other database building methods)
• Performance measurement data can also be provided
Reproducing random functional errors
• When a functional error
– is random,
– occurs rarely, or
– occurs only at high load,
• a LoadManager test case can repeatedly attempt the error-
prone situation
– Test in close co-operation with the T24 developers
– The test case can be fine-tuned to better identify the error
– LoadManager may provide additional information for the
analysis
Functional and performance test – a
comparison
Functional test                Performance test
May span several banking days  Typically one day
Thorough functional checks     Minimal or no functional checks
High number of test cases      Low number of test cases
One or very few users          Several users
Minimal Globus database size   Live-sized Globus database
Performance Testing with LoadManager
Methodology
Project phases
A typical performance test project using
LoadManager has the following phases:
Planning
• Automation
• Preparation
• Execution
• Analysis
• Reporting
Planning
The following are the steps of the Planning phase:
• Define the purpose of the test
• Collecting the operations
(transactions, enquiries and interfaces)
• Acquiring requirements
• Constructing the test portfolio
• Acceptance criteria
• Test plan
Define the purpose of the test
Determine what kind of test needs to be prepared and executed:
• Load test
Load the system under normal and extreme circumstances.
Verify the response times and check for database locks.
• Bandwidth test
The required network bandwidth is determined after evaluating
the test. During a bandwidth test the use of a network
emulator (e.g. WANEm) is recommended.
• Stress test
Testing the limits of the system, or testing under extreme
overload. Checking for data loss.
• Etc.
Collecting the operations
• Tested modules
• Operation types
– Transaction
Filling the fields of a T24 version, then committing (or authorising) it on Browser
or Desktop.
– Enquiry
Initiating a T24 enquiry on Browser or Desktop.
– Interface
Simulating a satellite system communicating with T24 over an interface connection.
The formal specification of the interface connections is usually required.
• Simplification
– Merge similar cases
– Ignore rare cases (or execute them manually)
– Instead of composite screens, measure the included operations
Acquiring requirements
• Measured figures
– Transaction rate
• During a performance test, operations (or operation sequences) are executed
repeatedly. The transaction rate is the frequency of these iterations.
• Target rate: the transaction rate at which LoadManager attempts to
generate the iterations.
• Actual rate: the transaction rate actually achieved, which may be lower than the target rate.
– Locks, dead-locks
– Response times
– Amount of data (number of bytes) transferred
• Measurement by LoadManager is not fully adequate (conceptually).
• Consider involving an external network measurement tool (SW+HW).
– Etc.
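The target and actual rate definitions above amount to a simple calculation. The figures below are hypothetical, used only to show the relationship:

```java
// Illustration of target vs. actual transaction rate. The actual rate is
// the number of completed iterations divided by the elapsed time.
public class RateExample {
    // Actual transaction rate in iterations per second.
    public static double actualRate(int completedIterations, double elapsedSeconds) {
        return completedIterations / elapsedSeconds;
    }

    public static void main(String[] args) {
        double target = 2.0;                    // target rate: 2 iterations/s
        double actual = actualRate(420, 300.0); // 420 iterations in 5 minutes
        System.out.printf("target=%.2f/s actual=%.2f/s%n", target, actual);
        // The actual rate (1.40/s) is below the target: the system under
        // test could not keep up with the generated load.
    }
}
```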
Acquiring requirements (continued)
• Test cycles, test durations
– Plan each test cycle
– Allocate sufficient time for each test cycle
(preparation, saving logs, short analysis)
– Consider allocating a contingency day
A lesson from one performance test: insufficient time was
allocated, and the test duration could not be extended.
• Special cases
– Measure other (non-T24) components
– Combined performance test with other teams (network,
satellite systems)
– Etc.
Constructing the test portfolio
• List of operations (transaction, enquiry, interface)
The simulation of an operation is usually implemented in a
Test Script.
A test script is a Java class that has access to various
services through the LoadManager API.
• Frequency for each operation
– Peak frequency
– Transactions per hour / daily / yearly changes
• Number of users
– Also users and operations can be grouped
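The slide above states that a test script is a Java class with access to LoadManager services through the LoadManager API. The following is a hypothetical sketch of what such a script might look like; the Session and Datapool interfaces and every method name below are invented for illustration and are not the real API:

```java
// Hypothetical test script for the FT,BANK.OT operation. The interfaces
// below are invented stand-ins for LoadManager services.
public class FtBankOt {
    interface Datapool { String next(String column); }
    interface Session {
        void openVersion(String version);
        void setField(String name, String value);
        String commit();
    }

    // Simulates one iteration: fill the fields of a T24 version, commit,
    // and run a lightweight functional check on the response.
    public String run(Session session, Datapool accounts) {
        session.openVersion("FT,BANK.OT");
        session.setField("DEBIT.ACCT.NO", accounts.next("DEBIT"));
        session.setField("CREDIT.ACCT.NO", accounts.next("CREDIT"));
        session.setField("DEBIT.AMOUNT", "100.00");
        String response = session.commit();
        // Check cheaply; it must not impact the performance measurement.
        if (!response.startsWith("FT")) {
            throw new IllegalStateException("Unexpected response: " + response);
        }
        return response;
    }

    public static void main(String[] args) {
        // Throwaway fakes, just to demonstrate one iteration.
        java.util.Iterator<String> ids = java.util.List.of("10001", "10002").iterator();
        Datapool pool = column -> ids.next();
        Session fake = new Session() {
            public void openVersion(String v) { }
            public void setField(String n, String v) { }
            public String commit() { return "FT21000123"; } // fake transaction ID
        };
        System.out.println(new FtBankOt().run(fake, pool)); // prints "FT21000123"
    }
}
```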
Test portfolio example
Description                           Name                     Script           Peak daily frequency  Changes in day
Outgoing SWIFT                        FT,BANK.OT               FtBankOt         14500                 12:30-15:00 double freq.
Internal payment                      FT,BANK.IPO              FtBankIpo        45                    -
Incoming Swift (Swift 202 interface)  -                        SwiftIn202       2050                  Half of load at bank opening
FT INAO enquiry                       ENQ BANK.FT.INAO.STATUS  EnqFtInaoStatus  600                   -
Acceptance criteria
• Acceptance criteria for the test
– Specifies for what kind of results the test itself is considered acceptable
– For example:
93% of the operations must be (functionally) successful or
90% of the transactions must be completed within 1 minute
At one performance test the acceptance criteria were not defined in
advance. This resulted in additional work for both TCBG and the Bank to
discuss and analyse the results in detail, and lengthened the testing
cycles. Ensure that the criteria are defined in advance and are
unambiguous.
• Acceptance criteria for the tested system
– Transaction rates
– Response times
– Maximal loss of data
– Etc.
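An acceptance criterion such as "93% of the operations must be (functionally) successful" can be checked mechanically. A minimal sketch, with hypothetical pass counts:

```java
// Check a percentage-based acceptance criterion against test results.
public class AcceptanceCheck {
    // True if 'passed' out of 'total' meets the required percentage.
    public static boolean meetsCriterion(int passed, int total, double requiredPercent) {
        return 100.0 * passed / total >= requiredPercent;
    }

    public static void main(String[] args) {
        // Hypothetical result: 1347 commits attempted, 1334 succeeded.
        System.out.println(meetsCriterion(1334, 1347, 93.0)); // prints "true"
    }
}
```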
Test Plan
• Test Plan is the final delivery document of the
Planning phase
• The test plan should contain:
– The purpose of the test with detailed description
– The test portfolio
Names of versions, enquiries, interfaces, and their frequency
– The acceptance criteria
– Resource requirements
Computers, access rights, software requirements etc.
– Detailed plan and timing for the subsequent test
phases
Project phases (Automation)
Planning
Automation
• Preparation
• Execution
• Analysis
• Reporting
Automation
Implementation of test scripts for each operation.
Test scripts serve two purposes:
• Test scripts for the preparation phase
– Scripts that create new records and store their IDs in
datapools
– Scripts that collect existing records and store their IDs
in datapools
• Test scenarios for the execution phase (the test itself)
– Test scenarios that are executed for measurement purposes
Automation
Datapools
• Input data in table format
• Filled up by preparation scripts
• Execution scripts iterate on them and pick test data
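The datapool concept above can be sketched as a small class: a table of rows filled by preparation scripts, then cycled through by execution scripts. This is an illustration of the idea, not the LoadManager implementation:

```java
// Minimal datapool sketch: a table of test data in row/column form.
import java.util.*;

public class Datapool {
    private final List<Map<String, String>> rows = new ArrayList<>();
    private int cursor = 0;

    // Preparation scripts append rows (e.g. the IDs of created records).
    public void addRow(Map<String, String> row) {
        rows.add(row);
    }

    // Execution scripts pick the next row, wrapping around at the end.
    public Map<String, String> next() {
        Map<String, String> row = rows.get(cursor);
        cursor = (cursor + 1) % rows.size();
        return row;
    }

    public static void main(String[] args) {
        Datapool accounts = new Datapool();
        accounts.addRow(Map.of("ACCOUNT.ID", "10001", "CUSTOMER.ID", "C-1"));
        accounts.addRow(Map.of("ACCOUNT.ID", "10002", "CUSTOMER.ID", "C-2"));
        System.out.println(accounts.next().get("ACCOUNT.ID")); // prints "10001"
        System.out.println(accounts.next().get("ACCOUNT.ID")); // prints "10002"
        System.out.println(accounts.next().get("ACCOUNT.ID")); // wraps: "10001"
    }
}
```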
Automation
Implementation has two phases:
• Implement the scripts for the operations
• Implement the test suite (a file with the .ptxml
extension)
– Group transactions into user groups
– Create transactors for the specific groups
– Define the scale and the rate of transactors
– Determine the weight of the scripts selected randomly
– Define the usage of agent computers
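One of the suite settings listed above is the weight of the scripts selected randomly. A standard way to implement weighted random selection is via cumulative weights; this is a sketch of the technique, not LoadManager's actual mechanism, and the weights are illustrative:

```java
// Weighted random selection of scripts via cumulative weights.
import java.util.*;

public class WeightedChoice {
    private final List<String> names = new ArrayList<>();
    private final List<Double> cumulative = new ArrayList<>();
    private double total = 0;
    private final Random random;

    public WeightedChoice(Random random) { this.random = random; }

    public void add(String scriptName, double weight) {
        total += weight;
        names.add(scriptName);
        cumulative.add(total);
    }

    // Pick a script with probability proportional to its weight.
    public String pick() {
        double r = random.nextDouble() * total;
        for (int i = 0; i < names.size(); i++) {
            if (r < cumulative.get(i)) return names.get(i);
        }
        return names.get(names.size() - 1);
    }

    public static void main(String[] args) {
        WeightedChoice choice = new WeightedChoice(new Random());
        choice.add("FtBankOt", 80);  // ~80% of iterations
        choice.add("FtBankIpo", 20); // ~20% of iterations
        System.out.println(choice.pick());
    }
}
```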
Test suite – an example
User Groups
• A user group is a group of users performing the same action
• For each user group, the set and weight of agent computers
can be specified too
Transactors
• Transactors are iterators with a number of adjustable
attributes
Automation – testing scripts
Testing the Test Scripts
• Add some functional checks to the scripts
– The checks have two purposes:
• Check the correct implementation of the script
• Check the correct operation of the tested system (T24)
– LoadManager also has some built-in checks
– The checks need not be exhaustive and must not
(significantly) impact the performance measurement
• Test the scripts at the end of the automation
phase
Project phases (Preparation)
Planning
Automation
Preparation
• Execution
• Analysis
• Reporting
Preparation
• Preparation means making the input data and the T24
environment ready for test execution.
– Creating new records in T24 (e.g. users, customers, accounts)
– Collecting test data from T24 (e.g. customer, account and other
IDs) and storing them in datapools
• The following should be taken into consideration:
– Take care of the order of the scripts
For example, customer creation must precede account creation;
account creation (or the collection of account IDs) must precede
teller transactions, etc.
– When the preparation phase is finished and the planned
transactions are collected, a full system backup is recommended
Preparation
• Preparation should be done
– the first time (after automation)
– each time a new T24 environment is used (or the T24
environment is updated/restored).
• Test the correctness of preparation
– Use a suite that contains each script (each kind of
operation) exactly once
– Execute this suite at the end of each preparation
phase
Project phases (Execution)
Planning
Automation
Preparation
Execution
• Analysis
• Reporting
Execution
• The tests are executed on the prepared T24 environment
– Allocate the largest possible heap size to the Java virtual machine
running the test suite
– Set the necessary logging options properly (e.g. the generation of
the performance report should usually be turned on)
– Take care of the agent computers
– Execute test scenarios according to test plan
– Carefully save logs of every execution
• Create checklist
– Actions to be done and checked before/during/after test executions
– Update the checklist continuously
Execution monitoring
During test execution the tester can
• Monitor the state and behaviour of users
• Detect occasional locks and dead-locks
• Monitor the execution progress and speed
• Alter the target rate of certain transactions
Execution monitoring: Users view
• Selected users can be suspended, resumed,
debugged or stopped.
• A snapshot can be saved from the users view at
any time
Execution monitoring: Transactor list
• The target rate of a selected transactor can be altered,
or the transactor can be gracefully stopped
Execution monitoring: Transactor Graph
• The transactor graph shows the progress of execution for
the selected transactors
Execution monitoring: Metrics
• A metric is a named measure sampled at certain times
Project phases (Analysis)
Planning
Automation
Preparation
Execution
Analysis
• Reporting
Analysis
• After the test runs, the test results are evaluated
– Check if the test is acceptable
based on the acceptance criteria
– The execution errors must be understood and categorised
– The measurement data should be collected and arranged
• Short analysis is necessary after each test cycle
Command list and diagram
• Response times in table and in chart format
Execution log
• Primarily to analyse execution errors
Project phases (Reporting)
Planning
Automation
Preparation
Execution
Analysis
Reporting
Reporting
• The Test Report is the final delivery of the test
– The scope of the test
– Measurement results
– Evaluation of the test
(whether the test is acceptable or not)
Test Report extract - example
• Sample table of the transactions and enquiries done (extract)
• Can be used to verify if the acceptance criteria for the test are
fulfilled
Type    Name              1x frequency        2x frequency
                          Count   % passed    Count   % passed
Enq     STMT.ENT.BOOK     20      100         53      98
Commit  FT,BANK.OT        1347    99          2715    93
Auth    FT,BANK.OT        1344    98          2674    91
Commit  TELLER,BANK.WITH  146     100         291     100
Enq     TT.TILL.STATUS    15      100         31      100
Commit  TELLER,BANK.DEPO  137     99          271     100
Test Report extract – example (cont.)
• Sample table of the response times: average and 90th percentile
(extract)
• Can be used to verify if the acceptance criteria for the tested
system are fulfilled
Type    Name              1x frequency            2x frequency
                          Count   Avg    90%      Count   Avg    90%
Enq     STMT.ENT.BOOK     20      0.44   0.47     53      0.46   0.62
Commit  FT,BANK.OT        1347    0.87   1.36     2715    1.10   2.03
Auth    FT,BANK.OT        1344    0.62   0.78     2674    0.93   1.65
Commit  TELLER,BANK.WITH  146     0.25   0.45     291     0.34   0.69
Enq     TT.TILL.STATUS    15      2.15   4.37     31      3.56   15.18
Commit  TELLER,BANK.DEPO  137     0.27   0.37     271     0.35   0.54
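The average and 90th percentile figures in the table above can be computed from a list of response times. The sketch below uses the nearest-rank percentile definition (the value at rank ceil(0.9 * n) in sorted order); LoadManager's exact definition is not stated in this material and may differ.

```java
// Summary statistics over response times (in seconds).
import java.util.*;

public class ResponseStats {
    public static double average(double[] times) {
        double sum = 0;
        for (double t : times) sum += t;
        return sum / times.length;
    }

    // Nearest-rank 90th percentile: the value at ceil(0.9 * n), 1-based.
    public static double percentile90(double[] times) {
        double[] sorted = times.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(0.9 * sorted.length);
        return sorted[rank - 1];
    }

    public static void main(String[] args) {
        // Ten hypothetical response times for one operation.
        double[] times = {0.21, 0.25, 0.27, 0.30, 0.33, 0.35, 0.41, 0.45, 0.52, 0.90};
        System.out.printf("avg=%.3f p90=%.2f%n", average(times), percentile90(times));
        // Note how one slow outlier (0.90) barely moves the 90th percentile
        // but does pull up the average.
    }
}
```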
Thank You
The Core Banking Group