TECHNICAL AND USER-ORIENTED TESTING
Phase I: [Link] Homepage
Security Awareness for SME



1 CONTENTS
2 Introduction
2.1 Purpose of this document
2.2 Structure of Testing
3 Compatibility Testing
3.1 Compatibility Testing Environments
3.2 Compatibility Testing Results
4 Functional Testing
4.1 Functional Testing Environments
4.2 Functional Testing Results
5 Load Testing
5.1 Load Testing Results
6 Performance Testing
6.1 Performance Testing Results
7 User Acceptance Testing
7.1 User Acceptance Testing Results
8 Appendices
8.1 Appendix A
8.2 Appendix B
8.3 Appendix C

Table 1 Mobile Environments
Table 2 Desktop Environments
Table 3 Functional Testing Environments
Table 4 Functional Testing Results



2 INTRODUCTION
2.1 PURPOSE OF THIS DOCUMENT
This document supports and records Phase I testing of our project website homepage, [Link]. The testing is based primarily on the relevant requirements set out in our previously submitted document, Technical & User-focused Requirements. Phase I covers the testing requirements following Sprint 1 and Sprint 2, which concluded with the delivery of, among other things, our functional Homepage. Phase I testing therefore concentrates on the technical and user-facing aspects of the Homepage.
2.2 STRUCTURE OF TESTING
The testing is broken down into the following sections:
Compatibility Testing
Functional Testing
Load Testing
Performance Testing
User Acceptance Testing

3 COMPATIBILITY TESTING
Compatibility testing is part of the non-functional tests to be conducted on the website. It evaluates the website's compatibility with differing computing environments, e.g. Windows versus Linux, or a mobile platform such as iOS 7 versus Android. This is an important process given the variety of environments in use today, compared to 10-15 years ago when Windows dominated. Particular emphasis must be placed on compatibility testing for the mobile environment, especially after the feedback received from the Small Firms Association, whose website receives over 3 million visits per annum, the majority now from mobile devices.
There are a number of online tools available to help complete a comprehensive testing process. For our
purposes [Link] and [Link] were utilised to cover the
majority of major computing environments, platforms and browsers.
3.1 COMPATIBILITY TESTING ENVIRONMENTS
Utilising the two online tools mentioned previously, a total of 46 computing environments were tested to cover the majority of environments used by our target users. The environments tested were:
Operating System Platform Compatibility
Apple iOS 7 Apple iPhone 5 100%
Apple iOS 6 Apple iPhone 5 85%
Apple iOS 7 Apple iPhone 5s 85%
Apple iOS 5.1 Apple iPhone 4s 100%
Apple iOS 7 Apple iPad Mini 100%
Apple iOS 7 Apple iPad 3 100%
Apple iOS 6 Apple iPad 3 100%
Android OS Ice Cream Sandwich (4.0) Amazon Kindle Fire 2 100%
Android OS Ice Cream Sandwich (4.0) Amazon Kindle Fire HD 80%
Android OS Ice Cream Sandwich (4.0) HTC One X 80%
Android OS Jelly Bean (4.2) LG Nexus 4 100%
Android OS Gingerbread (2.3) Motorola Droid Razr 80%
Android OS Gingerbread (2.3) Samsung Galaxy S II 80%
Opera Mobile Samsung Galaxy S II 80%
Table 1 Mobile Environments


Operating System Browser Compatibility
Windows XP Chrome 25.0 100%
Windows XP Internet Explorer 7.0 70%
Windows 7 Firefox 30.0 100%
Windows 7 Firefox 3.6 100%
Windows 7 Internet Explorer 8.0 70%
Windows 7 Internet Explorer 9.0 100%
Windows 7 Opera 12.16 100%
Windows 8 Firefox 18.0 100%
Windows 8 Internet Explorer 10.0 100%
Apple Mac OS X Snow Leopard Safari 5.1 100%
Apple Mac OS X Lion Firefox 19.9 100%
Apple Mac OS X Mountain Lion Chrome 26.0 100%
Linux Debian 6.0 Seamonkey 2.23 100%
Linux Debian 6.0 Opera 12.16 100%
Linux Debian 6.0 Lynx 2.8.8 100%
Linux Debian 6.0 Links 2.8 100%
Linux Debian 6.0 Midori 0.4 100%
Linux Debian 6.0 Rekonq 2.4.2 100%
Linux Debian 6.0 Firefox 29.0 100%
Linux Debian 6.0 Chrome 30.0 100%
Linux Debian 6.0 Firefox 24.0 100%
Linux Debian 6.0 Chrome 34.0 100%
Linux Debian 6.0 Dillo 3.0 100%
Linux Debian 6.0 Konqueror 4.13 100%
Linux Ubuntu 9.1 Seamonkey 2.7.2 100%
Linux Ubuntu 9.1 Rekonq 1.1 0%
Linux Ubuntu 9.1 Firefox 21.0 0%
Linux Ubuntu 9.1 Konqueror 4.9 0%
Linux Ubuntu 9.1 Luakit 1.8 100%
Linux Ubuntu 9.1 Chrome 27.0 100%
Linux Ubuntu 9.1 Firefox 8.0.1 100%
Linux Ubuntu 9.1 Firefox 9.0.1 100%
Table 2 Desktop Environments
3.2 COMPATIBILITY TESTING RESULTS
The results generated from the compatibility testing are available in Appendix A, as screenshots of the homepage rendered in each computing environment.

From examining the resulting screenshots, a number of observations were made and tasks created. Overall, the homepage performed well in the majority of environments, particularly the most common ones (Windows, Mac OS, Android and iOS) which our target users would utilise. This can be seen in the Compatibility columns of Table 1 Mobile Environments and Table 2 Desktop Environments. We decided a value of 90% would suffice as a pass mark, because the page structure requires a balance between mobile and desktop performance.
All of the above results are based on the screenshots in Appendix A. We are currently adapting our page structure to achieve a pass rate on all mobile platforms, as we consider this a key component of our target user experience.
An example can be seen in the figures below, which illustrate the compromise made to maintain a balance between mobile and desktop compatibility.

Figure 1 - Apple iOS 7, iPhone 5s problem: button appears on top of text


Figure 2 - Apple iOS 7, iPhone 5s problem fixed; compatibility now 90%
It was noticeable that the Ubuntu Linux and Windows XP environments did not perform well; a number of Linux-specific browsers are not compatible with our page structure. However, we have not prioritised this task, as we judge that the majority of our core target users will not be utilising such environments, which tend to be used by more tech-savvy individuals. We will consider adapting our structure to become more compatible with Windows XP, as we believe a number of target users still utilise this environment despite its lack of support from Microsoft.
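The 90% pass mark can be applied mechanically to the scores in Tables 1 and 2. The following is a minimal sketch, assuming the results have been transcribed into a simple mapping; the environment names and scores shown are a small illustrative subset of the tables, not the full data set.

```python
# Hedged sketch: applying the 90% compatibility pass mark to a subset
# of the Table 1 / Table 2 results (illustrative values only).
PASS_MARK = 90  # percent

results = {
    "Apple iOS 7 / iPhone 5": 100,
    "Apple iOS 6 / iPhone 5": 85,
    "Windows XP / Internet Explorer 7.0": 70,
    "Windows 7 / Firefox 30.0": 100,
}

def passed(score, pass_mark=PASS_MARK):
    """An environment passes if its compatibility score meets the pass mark."""
    return score >= pass_mark

# Environments below the threshold become open tasks for the next sprint.
failing = [env for env, score in results.items() if not passed(score)]
print(failing)
```

Running this over the full tables would reproduce the pass/fail split discussed above.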
4 FUNCTIONAL TESTING
Functional testing is designed to verify that the functions on the page behave as per the page specification. The homepage functions are primarily redirect links to a number of other site pages, e.g. to the About Us page. The testing is therefore relatively straightforward, and each result is recorded simply as pass or fail.
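Because every homepage function is a redirect judged simply pass or fail, the check can be sketched as a small harness. This is an illustrative sketch, not our actual test code: the site map and the injectable `resolve` function (which in practice would follow the link in a real browser or HTTP client) are assumptions.

```python
# Hedged sketch of a pass/fail redirect-check harness. SITE_LINKS is a
# hypothetical subset of the homepage functions; targets are illustrative.
SITE_LINKS = {
    "Home (Text)": "/index.html",
    "About (Text)": "/about.html",
    "Contact Us (Text)": "/contact.html",
}

def check_function(resolve, link_name, expected_target):
    """Pass if following the link lands on the expected page."""
    try:
        return resolve(link_name) == expected_target
    except Exception:
        return False  # a broken or missing link counts as a fail, not an error

def run_suite(resolve):
    """Return {function name: 'Pass' or 'Fail'} for every homepage link."""
    return {name: ("Pass" if check_function(resolve, name, target) else "Fail")
            for name, target in SITE_LINKS.items()}

# Example with a stub resolver standing in for a real browser session;
# the Contact Us link is deliberately broken here to show a failure.
stub = {"Home (Text)": "/index.html", "About (Text)": "/about.html",
        "Contact Us (Text)": "/missing"}
print(run_suite(stub.__getitem__))
```

In the real test run, each of the 13 environments in Table 3 would supply its own resolver.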
4.1 FUNCTIONAL TESTING ENVIRONMENTS
The following environments were utilised to perform the functional testing of the homepage.
No. Operating System Platform
1 Apple iOS 7 Apple iPhone 5
2 Apple iOS 7 Apple iPad 3
3 Android OS Ice Cream Sandwich (4.0) HTC One

4 Windows 8 Chrome 35.0
5 Windows 8 Firefox 30.0
6 Windows 8 Internet Explorer 11.0.96
7 Windows 7 Chrome 35.0
8 Windows 7 Firefox 30.0
9 Windows 7 Internet Explorer 11
10 Windows 7 Opera 22.0
11 Windows 7 Sogou
12 Apple Mac OS X Mavericks Safari 7.0.5
13 Apple Mac OS X Mavericks Chrome 35.0
Table 3 Functional Testing Environments



4.2 FUNCTIONAL TESTING RESULTS
There are 15 functions on our homepage which had to be tested. The following table outlines each
function and its result in each environment.
Function | Environments Passed | Environments Failed
CyberAware (Logo): redirect to Home | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Home (Text): redirect to Home | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
About (Text): redirect to About Us page | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Learning (Text): redirect to Learning page | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Contact Us (Text): redirect to Contact Us page | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Slider Button: navigate slider to the left | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Slider Button: navigate slider to the right | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Slider Button: Learn More auto-scrolls page to assessment section | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Perform Self-Assessment Button: redirect to Assessment page | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
UCD (Icon): redirect to UCD website | 1,2,3,4,5,6,7,8,9,10,11,12,13 | none
Facebook (Icon): redirect to Facebook account | none | 1,2,3,4,5,6,7,8,9,10,11,12,13
Twitter (Icon): redirect to Twitter account | none | 1,2,3,4,5,6,7,8,9,10,11,12,13
Google+ (Icon): redirect to Google+ account | none | 1,2,3,4,5,6,7,8,9,10,11,12,13
Table 4 Functional Testing Results
As can be seen from Table 4, the majority of the functionality is operational. The only functions not operational are the social media redirects; these will be updated when we implement our social media campaign.



5 LOAD TESTING
Load testing is an important step when creating a website, as it can highlight fundamental system failures before a site is launched for public use. Its purpose is to test the performance and capabilities of the system architecture under normal and anticipated peak load conditions, load being the number of users or requests made to a web service.
There are a number of online tools which can be utilised to perform a generic load test on a domain. The
tool utilised during this test was [Link] .
5.1 LOAD TESTING RESULTS
The main set of results can be seen in Appendix B, which is a print out of the file created by
[Link].
In summary, at a peak load of 192 TCP connections from 50 clients, page load time remained under 15 seconds, aggregated across users worldwide. While this may be considered an average performance, we must appreciate that the system architecture is a shared platform with a limited budget for expansion. We still consider this appropriate for our predicted user base.

6 PERFORMANCE TESTING
Performance testing measures how our site performs in terms of responsiveness and stability under a particular workload. It provides an overview of how each element of a page loads and its effect on the overall performance of the page.
Similar to the previous tests, there are a number of online tools to help execute performance testing on a website page. The tool utilised for this session of performance testing was [Link].
6.1 PERFORMANCE TESTING RESULTS
The main set of results can be seen in Appendix C, which is a print-out of the file created by [Link].
What can be gathered from the results in Appendix C is that performance is being affected disproportionately by a small number of elements. There are a number of JPEG/PNG images utilised on the homepage, mainly in the slider and the 3-step outline section. The average load time across all elements is approximately 1000 ms; however, three image files take 3-4 times that average, varying between 2733 ms and 3829 ms. On closer inspection, these images are larger in size and resolution than the others utilised. It was decided that these images would be edited, re-uploaded to the web page and re-tested in the next phase of testing.
7 USER ACCEPTANCE TESTING
User acceptance testing is a critical test to establish whether the page design meets the expectations of a target user. Often known as beta testing or end-user testing, it involves selecting a user acceptance team for the testing period. Once users are chosen, a set of test cases is designed for each user to perform, and any issues encountered are documented.
The user acceptance team for Phase I of our testing comprised some of the users we had previously interviewed during the user stories and prototyping stages of our project, and some who have had limited exposure to our project. We felt this was a good mix to obtain a broad range of user feedback. As our homepage is a relatively passive interface, the feedback was expected to relate primarily to fundamental UI design.
7.1 USER ACCEPTANCE TESTING RESULTS
We had a broad range of people on the user acceptance team, and hence a broad range of results. Some of the results were included previously in the Tara Whelan document but are repeated here for completeness.
The following table outlines the relevant function or element, the user feedback, and the status of that feedback.
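Feedback of this kind can also be tracked programmatically rather than only in a document table, which makes it easy to list the items still awaiting action. This is a hypothetical sketch; the record type and the two sample entries are illustrative, not our actual tracking system.

```python
# Hedged sketch: a minimal feedback log mirroring the table below,
# with a helper that lists items still in progress.
from dataclasses import dataclass

@dataclass
class Feedback:
    element: str
    comment: str
    user: str
    status: str  # "Complete" or "In progress"

log = [
    Feedback("Navigation bar", "No Home text within the navigation bar",
             "Tara Whelan", "Complete"),
    Feedback("CyberAware Logo", "Test the logo with the same font as the webpage",
             "Tara Whelan", "In progress"),
]

def open_items(log):
    """Return feedback records still awaiting action."""
    return [f for f in log if f.status != "Complete"]

print([f.element for f in open_items(log)])
```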
Function / Element | User Acceptance Feedback | User | Status
Overall Page | "I know it is something for me and that I can do nearly everything in a few steps. I know what I am going to get for my time." | Edward Lavery, small business owner (BB Nurseries) | Complete
Overall Page | Readability and contrast testing | Tara Whelan | In progress (desktop done)
Overall Page | "Solid layout, great idea to lead with the statistics and questions." | Josh Barber, small business owner ([Link]) | Complete
Overall Page | "The design is clear and straightforward, but enough use of graphics to keep it interesting." | Ronan Martin, marketing specialist (Auxilion) | Complete
CyberAware Logo | Test the logo with the same font as the webpage | Tara Whelan | In progress
CyberAware Logo | "Is a desktop not a little dated to have in a logo?" | Tony Cronin, small business owner (Cronin Designs) | Complete
Slider Frame 1 (Text) | Text colour difficult to read at mobile screen size | Edward Lavery, small business owner (BB Nurseries) | Complete (colour changed)
Navigation Bar | No Home text within the navigation bar | Tara Whelan | Complete (Home text added)
Slider Graphics | "Impressive, smooth and professional." | | Complete
Slider Graphics | "Looks interesting on initial viewing." | Alma Judge, doctor | Complete
Slider Graphics | "The intro is brilliant, very refreshing." | Tony Cronin, small business owner (Cronin Designs) | Complete
Slider Graphics | Use of text colour boxes looks like a ransom note | Tara Whelan | Complete
Perform Self-Assessment Button | Not noticeable, not effective, too small | Tara Whelan | Complete (button made bigger)
Perform Self-Assessment Button | "Doesn't look like a button." | Donald Mucci, software engineer (Auxilion) | Complete (shadow added)
Perform Self-Assessment Button | No hand click function when hovered over | Tara Whelan | Complete (hand click added)
Perform Self-Assessment Button | "On the first page, it's very right the big button Assess your security." | Donald Mucci, software engineer (Auxilion) | Complete
Slider Content & Message | Relevant to user concerns, like the questions | Eoin Brennan, small business owner (Scimet R+D) | Complete
Slider Content & Message | "What's a very good idea is the shocking histories, combined i.e. with statistics." | Jorge Sanz, software engineer (ITAlliance) | Complete
Page Fonts | Don't use more than 2 of the same family | Tara Whelan | Complete
Logos | Only UCD as advocate | Tara Whelan | In progress
Other | Get some user quotes saying how much your service helped them | Josh Barber, small business owner ([Link]) | Complete
Other | "If I have a small business I will not have a lot of time to spend, especially if my business is not very IT dependent." | Edward Lavery, small business owner (BB Nurseries) | Complete



8 APPENDICES
8.1 APPENDIX A

See slide show of images on blog, under Compatibility Testing.

8.2 APPENDIX B

See document [Link] On Demand Website Load [Link] on blog

8.3 APPENDIX C

See document [Link] On Demand Website Performance [Link] on blog
