Click and Send
Document Version: 1.0
Document Date: 2013-03-20
Author:
Product: Mantis Bug Tracker
Amendment Record
Version   Date         Actioned By   Description
1.0       2013-03-20                 First draft. Cycle 1.
Contents

1 PURPOSE, REQUIREMENTS AND DETAILED ANALYSIS EXPLANATION
  1.1 PURPOSE
  1.2 REQUIREMENTS
  1.3 DETAILED ANALYSIS EXPLANATION
2 TEST RESULTS AND ANALYSIS
  2.1 TEST ENVIRONMENT
  2.2 TEST RESULTS
    2.2.1 Scenario 1 Detailed analysis
    2.2.2 Scenario 2 Detailed analysis
  2.3 CONCLUSION
TEST INSIGHT PERFORMANCE AND LOAD TEST PLAN 2
1 PURPOSE, REQUIREMENTS AND DETAILED ANALYSIS EXPLANATION
1.1 PURPOSE
This report covers the latest run of the Scenario 1 and Scenario 2 tests described in the document
20130103_TestInsight_BenchmarkRequirements.docx.
1.2 REQUIREMENTS
All test requirements are available for consultation in the document mentioned above,
20130103_TestInsight_BenchmarkRequirements.docx, section 2 (Test Requirements).
The content of Scenarios 1 and 2 is detailed in section 4.8 of the same document.
1.3 DETAILED ANALYSIS EXPLANATION
Throughout the tests run during this second cycle, records were kept to detail, answer and match every
requirement. The analysis below addresses each requirement point for both Scenarios 1 and 2.
The following users from our customer pool were used for the tests:
Scenario 1: users bench00001 to bench00520.
Scenario 2: users bench00001 to bench01120.
2 TEST RESULTS AND ANALYSIS
2.1 TEST ENVIRONMENT
Testing was carried out using JMeter, software designed to load test functional behaviour and measure
performance; the tool was specified in the requirements.
Diagram of the test environment.
2.2 TEST RESULTS
This section describes the full details of each of the test Scenarios.
2.2.1 Scenario 1 Detailed analysis
2.2.1.1 Summary of Scenario 1
Scenario 1: the number of concurrent users to be simulated in this period is 520, divided as follows:

Time        Script 1   Script 2   Script 3
3pm - 4pm   250        0          1
4pm - 5pm   250        0          1
5pm - 6pm   500        20         1

Script 1: Login, current user raises 20 test cases, Logout.
Script 2: Login, process a 20-line import file, Logout.
Script 3: Manual navigation through the system at 2:30pm, 3:30pm, 4:30pm, 5:30pm and 6:30pm.
2.2.1.2 Results against requirements sorted by ID
2.2.1.2.1 Requirement 2.1: Customers first log into Click and Send within 5 seconds

Verify that customers first log into Click and Send within 5 seconds.    PASS

The graphs below show the time taken for each user to log into Mantis, detailed per web server for
Script 1 and Script 2.
A red line on each graph marks the 5-second requirement. The line is shown only if the response-times
line reaches or breaks the requirement; otherwise the requirement corresponds to the top of the Y axis
and the red line does not appear, even though it is listed in the legend. All blue points below the red
line meet the requirement; everything above it does not.
Against the requirement we are clearly much faster: for Script 1, logins to Mantis never took longer
than 3 seconds; for Script 2, they never took longer than 1.5 seconds.

Login to Mantis on Web Server 1 (Webx0001) for Script 1 & Script 2.
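The check behind these graphs can be sketched in a few lines. A minimal sketch, assuming illustrative timings rather than the recorded data:

```python
# Check login response times against the 5-second requirement (Req 2.1).
# The timings below are illustrative placeholders, not the recorded data.
login_times_s = {
    "Script 1": [1.2, 2.1, 2.9, 1.8],   # never longer than 3 s in the real run
    "Script 2": [0.6, 1.1, 1.4, 0.9],   # never longer than 1.5 s in the real run
}

REQUIREMENT_S = 5.0

for script, times in login_times_s.items():
    worst = max(times)
    status = "PASS" if worst <= REQUIREMENT_S else "FAIL"
    print(f"{script}: worst login {worst:.1f} s -> {status}")
```

In a real run the timings would come from the JMeter result file rather than being hard-coded.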
2.2.1.2.2 Requirement: Ability to successfully complete load testing on the application to ensure
peak time processing will not result in any system failure

Ability to successfully complete load testing on the application to ensure peak time processing
will not result in any system failure.    PASS
When running Script 1 with 500 concurrent users and Script 2 with 20 concurrent users, 50% of
shipments were missing for printing; all the missing shipments belonged to Script 1.
When running Script 1 with 480 concurrent users and Script 2 with 20 concurrent users, there was no
system failure.
Volume      Benchmark scenario   Date        Benchmark status   Errors rate                      <3 seconds rate   Requirements match
500 users   Scenario 1           1/31/2013   OK                 0%                               94%               Yes, over requirements
520 users   Scenario 1           2/01/2013   KO                 50% missing created test cases   NC                No, too many errors
2.2.1.2.3 Requirement NF 2: Users over time

Mantis to support the following profile: 80% of users create test cases between 15:00 and 18:00, of which:
25% between 15:00 and 16:00,
25% between 16:00 and 17:00,
50% between 17:00 and 18:00.    PASS

The graph below presents the 580 concurrent users of Script 1.
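The hourly split above maps directly to per-window user targets. A minimal sketch, assuming the 520 concurrent users stated in the Scenario 1 summary:

```python
# Per-hour user targets from the NF 2 profile (25% / 25% / 50%).
# total_users = 520 is an assumption taken from the Scenario 1 summary.
total_users = 520

profile = {
    "15:00-16:00": 0.25,
    "16:00-17:00": 0.25,
    "17:00-18:00": 0.50,
}

targets = {window: round(total_users * share) for window, share in profile.items()}
print(targets)  # {'15:00-16:00': 130, '16:00-17:00': 130, '17:00-18:00': 260}
```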
2.2.1.2.4 Requirement NF 3: Performance timings of the manual tests

Screen performance to be proved in a controlled environment within the Mantis data centre and not
subject to Internet connections.    PASS

All the timestamps for each action in the tables below were captured manually and correspond to
Script 3.

Records of the manual test (Script 3) over the Scenario.
Step   Action                                    Time to capture manually
1      Reach the login page                      N/A (captured in Script 1)
2      Log in                                    N/A (captured in Script 1)
3      Navigate to the Test Cases List screen    Time to load the screen.
4      Click Action -> Create test case          Time to load the Create Test Case screen.
5      Enter the test case details               N/A
6      Click Save button                         Time to save the test case.
7      Log out of the application                N/A (captured in Script 1)
Timings in seconds; the column headings show the time of day and the number of concurrent users at
that time. Steps 1, 2, 5 and 7 were not timed manually.

Step   Action                                    4h (1)   5h (252)   6h (252)   7h (520)   8h (1)
3      Navigate to the Test Cases List screen    1.213    1.413      1.462      1.619      1.375
4      Click Action -> Create test case          1.697    1.553      1.421      1.645      1.574
6      Click Save button                         3.125    3.324      3.294      3.429      3.548
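The averages behind these manual timings are easy to recompute. A minimal sketch using the values recorded in the table:

```python
# Average and worst-case timings per manual step (values from the table above).
timings_s = {
    "Navigate to the Test Cases List screen": [1.213, 1.413, 1.462, 1.619, 1.375],
    "Click Action -> Create test case":       [1.697, 1.553, 1.421, 1.645, 1.574],
    "Click Save button":                      [3.125, 3.324, 3.294, 3.429, 3.548],
}

for action, times in timings_s.items():
    avg = sum(times) / len(times)
    print(f"{action}: avg {avg:.3f} s, worst {max(times):.3f} s")
```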
2.2.1.2.5 Requirement NF 3.1: Create Test Cases Screen available within 3 seconds for 90% of users

90% of users to be able to access the Create Test Cases Screen within 3 seconds of logging in to
Click and Send.    PASS

In total there were 217,868 requests for Script 1, of which 602 took longer than 3 seconds, giving a
pass rate of 99.72%.
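The 99.72% figure follows directly from the two request counts:

```python
# Pass rate for Requirement NF 3.1: share of requests served within 3 seconds.
total_requests = 217868
slow_requests = 602          # requests that took longer than 3 seconds

pass_rate = 100 * (total_requests - slow_requests) / total_requests
print(f"{pass_rate:.2f}%")   # -> 99.72%
```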
Access to the Create Test Cases Screen page for Script 1.
2.2.1.2.6 Requirement NF 3.2 & 3.3: Fields and drop-down menus available instantaneously, within
0.5 seconds

Within a screen, any field / drop-down menu to be available instantaneously, within 0.5 seconds of
the user clicking on the field.    PASS

This was covered by the manual tests (Script 3). As shown, the minimum is 1.431s.

Part of the records of the manual test (Script 3) over the Scenario.
2.2.1.2.7 Requirement NF 3.4: Data committed for processing completed within 5 seconds

Data committed for processing to be completed within 5 seconds.    PASS

There were 20,080 requests, all of them completed within 5 seconds.

Save of the test cases for Script 1.
2.2.1.2.8 Requirement NF 3.6: Import files of 1000 items within 30 minutes

Ability for the user to import, load and process 1000 items using the import test cases data
functionality within 30 minutes.    PASS

During Scenario 1, Script 2, we imported 20 items for 10 users within 4 minutes.
2.2.1.3 Screenshots of the server monitoring tool
2.2.1.3.1 Web Servers and Database Server
2.2.1.3.1.1 Web graphs
2.2.1.3.1.2 Database graphs
2.2.2 Scenario 2 Detailed analysis
2.2.2.1 Summary of Scenario 2

Scenario 2: the number of concurrent users to be simulated in this period is 1120, divided as follows:

Time        Script 1   Script 2   Script 3
3pm - 4pm   550        0          1
4pm - 5pm   550        0          1
5pm - 6pm   1100       20         1

Script 1: Login, raise 20 test cases with the current user, Logout.
Script 2: Login, process a 20-line import file, Logout.
Script 3: Manual navigation through the system at 2:30pm, 3:30pm, 4:30pm, 5:30pm and 6:30pm.
2.2.2.2 Result of Scenario 2

The break point of the system is at about 580-600 users.

Volume      Benchmark scenario        Date        Benchmark status   Errors rate   <3 seconds rate   Requirements match
580 users   Script 1 without import   4/01/2013   OK                 0%            98%               Yes, over requirements
600 users   Script 1 without import   4/01/2013   NOK                <0.01%        97%               Yes, over requirements
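The break-point estimate amounts to finding the last passing and first failing volumes. A minimal sketch over the recorded runs:

```python
# Locate the system break point from the Scenario 2 benchmark runs.
# Each entry: (concurrent users, benchmark status, error rate in %).
runs = [
    (580, "OK", 0.0),
    (600, "NOK", 0.01),   # "<0.01%" recorded here as an upper bound
]

last_ok = max(users for users, status, _ in runs if status == "OK")
first_nok = min(users for users, status, _ in runs if status == "NOK")
print(f"Break point between {last_ok} and {first_nok} users")
```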
2.3 CONCLUSION

The system can handle up to 500 concurrent users with import; without import, the break point is
around 580 users.
END