Strategies For Testing Web Applications Client Side
by
Lawrence Thaddeus Prevatte III
Bachelor of Science
Mathematical Sciences with the Computer Science Option
The University of North Carolina at Chapel Hill
1980
A thesis
submitted to the School of Engineering
at Florida Institute of Technology
in partial fulfillment of the requirements
for the degree of
Master of Science
in
Computer Science
Melbourne, Florida
May 2003
Copyright 2003 Lawrence Thaddeus Prevatte III
All Rights Reserved
____________________________________
J. A. Whittaker, Ph.D.
Professor, Computer Sciences
____________________________________
M. Andrews, Ph.D.
Assistant Professor, Computer Sciences
____________________________________
J. C. Wheeler, Ph.D.
Professor and Head
Electrical Engineering
____________________________________
W. D. Shoaff, Ph.D.
Associate Professor and Head
Computer Sciences
Abstract
Table of Contents
Abstract................................................................................................................. iii
2.5 Test Number 4: Check Accessibility ................................................... 36
2.6 Test Number 5: Check Usability ......................................................... 38
2.7 Test Number 6: Verify Forms ............................................................. 41
2.8 Test Number 7: Enter Invalid Input ..................................................... 44
2.9 Test Number 8: Resize the Browser .................................................... 48
2.10 Test Number 9: Examine Web Page Objects ....................................... 50
2.11 Test Number 10: Check for Unauthorized Access ............................... 52
2.12 Test Number 11: Change Browser Settings ......................................... 55
References............................................................................................................ 73
List of Figures
Acknowledgments
Dr. James A. Whittaker for the great part he played in the conception,
research, and preparation of this manuscript and for his astute guidance in the thesis
development process
Dr. Carl I. Delaune for allowing me the opportunity to take his courses, for
the knowledge he imparted to me on numerous occasions, for the encouragement he
offered freely on many more, and for his advice to develop a thesis as part of my
Master's degree
The love of my life, my wonderful wife, Suzi, for enduring, encouraging, and
enabling me as I've pursued this degree for the past three years, and for serving as a
great example for going back to school after several years away from it
My parents, Rev. Thad and Martha Ann Prevatte, for the sacrifices they made
to ensure that I was afforded every opportunity to acquire a quality education, for
instilling in me the desire to do my best at anything I attempt, for helping me
appreciate the value in obtaining a sound education, and for all of the other invaluable
life lessons they taught me
My sister, Pam Land, for impressing upon me the importance of striving for,
retaining, and improving upon the essential skills of common sense and social
interaction throughout my life
My extended family, Craig Land, Nicole Land, John and Sue Fick, Brad and
Wanda Fick, John Fick, Kate and Chris Walker, Amanda Fick, Casey Fick, Brian
Fick, Kevin Fick, Betty Jo Prevatte, Mary Jo Kaczmarczyk, Andy Dresser, Pat
Gilbert, John Kurowski, and Jeff Oliver for being so understanding and supportive of
me as I undertook the pursuit of this degree
Dr. Fred Brooks, Jr., Dr. Stephen Pizer, Dr. Peter Calingaert, Dr. Don Parnas,
Dr. Karl Freund, Dr. Dave Pargas, Dr. Donald Stanat, Dr. Carlo Ghezzi, Dr. Donald
Richardson, Dr. William Shoaff, Dr. Joseph Wheeler, Carolyn Henninger, Sharon
Stites, and all of the other professors and staff at the University of North Carolina at
Chapel Hill and the Florida Institute of Technology who inspired me and helped me
appreciate the fascination of the Computer Sciences
My coworkers and friends Joe Wehlen, Lyn Picou, Cathy Emerick, Lawrence
Robinson, Mark Flom, Gary Meier, Frank Gutcher, Paul Halpin, Bob Henry, Pat
Ruddell, Steve Hauss, Dr. Mike Sklar, Dr. Alan Drysdale, Jon Hillman, Dave Ungar,
Colette Bessette, Nate Shedd, Dan Davis, Mike Shilkitus, Kevin Hoshstrasser,
Georgiann Allen, Paul Seccuro, Bruce Birch, Doug Hammond, Pam Mullenix, Sue
Waterman, and Paul Burke who encouraged me to pursue postgraduate education
My good friend Neil Scott for proving to me that it's never too late to go back
to school and for showing me the way
Chapter 1
A Fault Model to Guide
Web Software Testing
There are also considerations for Web applications that differ from those for
other software applications, or that simply do not apply to other applications. First and
foremost among these is the fact that a Web application is as much an extension of a
company's or organization's storefront or office space as it is a system of software
that performs some set of tasks. One of the most self-serving tasks a Web
application, particularly a commercial one, performs is to keep the user at the Web
site.
One could make the argument that encouraging the prolonged use of any
application may not be in the best interest of the application because it merely
increases the chance that incorrect or poor functionality will be discovered, but that's
the price for success as well as the implied challenge for any software application.
Any software application that performs poorly or malfunctions runs the risk of
driving users to turn to alternative applications.
However, nowhere is this facet of application competition of more concern
than in the world of the Web, where the user is only one click away from being gone
[1][2], either exiting from the Web altogether or navigating off of the current Web
application and onto another, possibly a competing company's Web site. [3] While this
is of most concern to commercial Web application deployers, it holds for any Web
application: if the user loses interest in it, the functions that the Web application is to
perform are superfluous.
Almost all of the users who were asked, in a survey concerning "Web rage,"
what they were most likely to do when they didn't like a company's Web site
indicated they'd move on to a competitor's site. None of the respondents planned to
spend any more time on a bad Web site, although a few said they'd try to contact the
company for help. The same Web users described frustrating Web sites as being
hard to read (due to font choices), poorly organized, or out-of-date, or as making it
difficult to find a telephone number to use to get in touch with the company. [5]
Testing Web applications in order to assure their deployers that users'
experiences with those applications will be pleasant is therefore extremely
important. Even internally deployed Web applications, such as Web-based training
or corporate sites, are useless when employees refuse to use them. So Web
applications should be tested as often and as thoroughly as time, money, and other
resources allow.
1.2 Types of Tests
Some of the more common types of tests are acceptance tests, conformance
tests, usability tests, performance tests, reliability tests, and robustness tests. [1]
Acceptance testing demonstrates, to the satisfaction of an organization, that an
application meets the organization's automation, usability, performance, reliability,
and robustness requirements. Conformance testing matches the
implementation of an application to its stated requirements. Usability testing
determines the ability of the application to meet the needs of users in performing
their tasks in an automated fashion. Performance testing measures how much time
and other system resources are required by an application to accomplish the tasks it
performs. Reliability testing stresses the application in the face of system resource
and other failures, including the ability of the application to continue to perform its
tasks over a long period of time. Robustness testing investigates the application's
ability to perform tasks in ways not originally intended and the application's potential
and capacity to be modified and scaled beyond original requirements.
Adept testers who test Web applications at this point and have achieved
mastery of discerning situations and scenarios that might cause Web applications to,
at best, perform inadequately may be referred to as "breakers." A breaker might
seem to be something out of a Web application developer's worst nightmare, but I
contend that a tester who can discover and report what any user might accidentally
stumble across while using a software application will prove even more invaluable
for a Web application's developers than for developers of traditional software
applications. This is because Web applications tend to be used more immediately
and much more often than traditional software applications due to their being hosted
on the Web. This text will also focus primarily on the areas of Web application
software functionality and security, as they are most often viewed as the two primary
foci in determining whether or not a Web application actually works correctly.
is anticipated when the subject area is as new as the testing of Web application
software.
A good tester should have gained, through experience, intuition as to the types
of input and output data and algorithmic combinations that often prove troublesome
for the type of application that is being tested. However, any deeper knowledge of
the internals and implementation of an application, particularly a Web application,
may actually prove detrimental for a tester. Tests designed to focus on particular
implementation choices often have to be redesigned or rethought as subsequent
versions of the application are released. Implementation-based tests may also suffer
from the tendency to test the commercially available development and runtime
software upon which the custom-built portions of the application rely rather than on
the custom-built portions of the application itself.
Web software, like any other software application, has particular potential
trouble spots, which will result in application or system failure if the application fails
to properly anticipate and handle trouble. To a software application, trouble
manifests itself in many ways, including bad input data, unacceptable combinations
of good input data, invalid sequencing of events, and failures in system resource
availability and response. Applications cannot avoid these types of problems. How
well a software application detects, protects itself from, and recovers from such
occurrences is what separates sane applications from pathologic ones.
For Web applications, clumsy Web page and site presentation and navigation,
insufficient handling of client-side, network, and server-side interface errors, and
fumbled data exchanges compound the opportunities to fail that software
applications of any sort already provide. Good testers have gained the
experience, possess the intuition, or both, to focus their attacks on software
applications and devise tests that will cover the areas of most concern in the operation
and performance of a software application as it attempts to perform its tasks. [6]
Good Web application testers must also become well versed in the myriad of
interfaces available to the Web application developer these days, due to the
proliferation of various Web browsers, client systems, server systems, network
systems, and client- and server-based Web software components.
are similar to other types of software applications and the features that distinguish a
Web application from other classes of software applications is a crucial step in
determining how and where to apply testing resources in order to make the best use
of a tester's time and energy. The types of interfaces and interactions with other
components in the Web application's universe, and the nature of those components
themselves, figure heavily into generating a model of the environment in which the
Web application operates.
The Fortran processor designers plowed ahead with their machine designs,
and the majority of the programming that was taught and put into practice was
procedural. Structured programming came about when many in the industry came
to the conclusion that procedural programmers were writing code willy-nilly, with
what seemed like total disregard toward readability of program flow and other
qualities that make code maintenance (repair and change) a little easier to perform.
Terms like "spaghetti code" described programs whose control flow entered and
exited sections of code with little regard for code cohesiveness.
But Web application software development also sprang up during this time of
adjustment, on the heels of the fast-growing development of the Internet and the
World Wide Web. The nature of the differences of Web application software, and
PC-and-network-based software in general, from software that existed before the
advent of the Internet and the resulting World Wide Web has been recognized as an
inevitable consequence of the proliferation of the personal computer and large-scale
inexpensive networking. In recognition of the new testing approaches that would be
necessitated by the new operational and delivery models Web application
development brought to application software, James Bach observed:
We've come to the point where the textbooks on software quality and
testing must be rewritten. Most of them still sing the praises of white-box
testing and 100% statement and branch coverage, which may
have been fine for the mainframe world in 1976, but are completely
impractical in the fast-moving desktop world of 1996. Testing can
easily become an infinite task, so testing methods must be tailored to
match the technological, economic and human factors in each
project... Modern software consists, to a large degree, of components
written by someone else. That means programmers can create more
functionality while understanding less about it than ever before. Bad
software has never been so easy to create. [10]
Creators of personal Web sites may not need to worry much about offending
Web surfers, but businesses are piling onto the Web in increasing numbers, and
appear to be becoming much more concerned and aware of the dearth of quality Web
site development that has been taking place. More and more, these days, businesses
that deploy Web sites are requiring that testers perform more stringent testing and
understand that the best testers are those who best understand how Web sites can be
brought to their knees. In the mad rush to bring Web sites online as soon as possible,
testers must understand the makeup of Web applications as well as which areas of
Web applications are at greatest risk of failure, because testing resources and time
tend to be in limited supply.
third-party components and system services, the networks over which these various
pieces of software exchange data, and the Web navigation techniques employed by
the Web application to perform its tasks.
Figure 1.1 A Web Application's Environment
On the client machine, it makes very little sense to test most of these factors
(you tend to end up testing how the Web browser reacts to system resource
deprivation rather than how the Web application itself reacts), but on the server side
these factors affect not only how the custom server-side components perform but
also the performance and reaction of the Web application's client-side software. So
although the client side of a Web application calls for a new model, testing the Web
application for its reaction to server-side operations can mostly be accomplished
using the familiar software application model applied on the server machine.
Note that testing the reaction of a Web application by invoking failures and
near failures of system resources on the client side should be done to some extent, but
a tester has to be very careful to not turn client-side testing into trying to break the
Web browser instead of the Web application. Remember that the Web browser is a
software application in its own right that runs on the client machine. Presumably, the
Web browser has been tested by its manufacturer and at any rate is not under control
of the developer of the Web application.
This is especially important because Web browsers are not consistent in the
way they handle HTML, much less in the way they handle incorrectly written
HTML. How the Web application allows the user to move around from object to
object on the Web page is important because users expect the user interface to be
consistent from application to application, even though Web applications available to
any user on the Web may have been written by developers all over the world.
Web page navigation also includes provisions for providing data input values
to and viewing data output values from the Web application via objects on the
current Web page, whose importance will be discussed in section 4. Besides HTML
forms and client-side scripts, input and output on Web pages may also be handled by
client-side plug-ins and Java.
Surfing from Web page to Web page or window or even to other points on
the current page may be provided for via active links on the current Web page, client-
side and server-side image maps, plug-ins, scripts, and Java. Even if the method of
navigation is not entirely consistent from browser to browser for the same Web page,
navigation should be as consistent as possible both within the same Web application
and between Web applications hosted under the same browser.
Web applications should also be developed with the expectation that the user
will not have the latest and greatest Web browser loaded on their system, nor the
latest client-side plug-ins (including the latest Java applet plug-in), nor will they
necessarily have enabled every Web interpretation option, such as "allow Java,"
"allow all cookies," and "allow JavaScript." And the user's loaded browser may not
support frames. Web applications should both anticipate these situations and provide
convenient and user-friendly means to handle them. A tester must consider and test
how the Web application handles many different types of client-side software
configurations.
using Microsoft's Internet Explorer.) rather than providing a convenient means for
the user to download a plug-in that will handle ActiveX.
The savvy Web application tester tends to ask the Web application questions
like: "Okay, now what are you going to do if I change some of the data you're
expecting from your server?" or "Alright, what if I drop, scramble, or slightly change
your message to your server?" A good tester does not generally ask the browser:
"What will you do if I bog down your view of the network so you time out trying to
load the new Web page the user just tried to go to via an active link on the current
Web page?" unless the tester is testing the Web browser itself.
While load testing a Web site is crucial to ensuring the Web site's success, this type
of testing tests very little of the functionality of the Web application itself. Load
testing is more for finding out how much load the Web site (its machines, systems,
and software) can bear and still allow the Web application to perform in a timely
fashion as advertised than for testing whether or not the Web application functions as
required. [14]
One common misperception about load testing is that the testing is about
what loads will break the Web site, particularly loads that place the Web site's
hardware and server-side components under stress. This type of testing is more
along the lines of stress testing, which should be done in addition to load testing, but
its purpose is to discover what the Web application does in reaction to having its
resources become unavailable and broken due to duress. [14]
Another of the most common misperceptions regarding load testing is that the
number of concurrent users equals the load on the Web site. As the number of users
of a Web application increases, especially as the number of concurrent users
increases, the load on the applications site may very well increase, but it has been
found that the two are not as tightly coupled as one would assume. [15]
Load testing is also generally performed by the Web site creators or their
testers, because load testing is usually not considered valid unless the Web site
layout, in terms of machines, software, and network components, is exactly
duplicated in the test environment. Breakers will usually test Web applications from
the perspective of the Web user on the client side of the Web site, who does not
have the ability, resources, or access to exactly duplicate the server side of the Web
application. [9][15]
In fact, Anderson claims that one of the top mistakes made in load testing
Web applications is in starting too late. He observes that while many development
projects schedule load testing after an application is complete, load testing should
begin as soon as the processes that make up the application are running, even before
the user interface has been developed. Although he advocates performing load
testing just before putting a Web site into production and load testing during
regression testing after changes have been made to a Web site, he explains that
conducting load testing during implementation is also useful for choosing between
alternatives when considering what hardware the Web site should contain. [14]
Automation can be overused if the tester is not careful. The creation of test
scripts and parameterization of some automation tools can become a development
project of its own, which can consume the time resources a tester would otherwise
have available for actual testing. [16] Another consideration regarding the use of
automated tools is that learning how to use the tools will require time, and some of the tools
available for testing Web applications can take a considerable amount of time to
learn to use. [17]
Once created to test a Web application, automated scripts can be re-used for
regression testing. [16] Indeed, testing tools have been developed that some may use
primarily for regression testing of Web applications, such as a tool developed by
Hyoung-Rae Kim, a graduate student at Florida Institute of Technology. This type of
tool parses a Web application and builds a profile that includes the pages of the Web
application and the number of, types of, and links to objects on each Web page of the
application; this profile can be compared to a parse of the Web application after changes
have been made. Analyses and comparisons of the data can be used to guide testers
to pages that appear to be more complicated and to particular areas of the Web
application that appear to have been modified. [18]
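As a rough illustration of this profile-and-compare approach, the following Python sketch tallies the tag make-up of each page in a list of URLs and reports pages whose make-up changed between two crawls. It is only a simplified approximation of such a tool; the structure, names, and URLs below are illustrative assumptions, not a description of Kim's implementation.

# Build a per-page profile of tag counts and compare two profiles.
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class TagCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        self.counts[tag] += 1

def profile(urls):
    """Return {url: Counter of tag names} for each page in the list."""
    result = {}
    for url in urls:
        parser = TagCounter()
        parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
        result[url] = parser.counts
    return result

def compare(before, after):
    """Report pages whose object make-up changed between two profiles."""
    for url in before:
        if url not in after:
            print("%s: page removed" % url)
        elif before[url] != after[url]:
            changed = sorted(t for t in set(before[url]) | set(after[url])
                             if before[url][t] != after[url][t])
            print("%s: changed tags %s" % (url, changed))

# Usage: take one profile before a site modification and one after, e.g.
#   old = profile(["http://www.example.com/", "http://www.example.com/order.htm"])
#   new = profile([...the same URLs after the change...])
#   compare(old, new)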
1.10.1 Functionality
The functional areas of particular importance to testers of Web applications
include navigation, operations (computation), controls and displays, data handling
(input and output), and error handling. These components are similar to their
counterparts in other software applications, but their implementation in the more
distributed world of Web applications can lead to some interesting, and sometimes
confounding, problems as inputs, computation, and outputs may travel back and forth
among components across the World Wide Web.
1.10.2 Security
Web applications face greater potential problems in terms of providing
security to user data and to corporate data because of their existence on the
wide-open World Wide Web. User data must be protected from other Web users as well
as from unauthorized personnel on the Web application's server-side machines.
Corporate data on the Web application's server-side machines must be protected
from unauthorized access on those machines, as with other software applications, but
it is a much more complex problem to protect this type of data from users attempting
to access the Web site's data via interfaces on the Web, which were designed to allow
widespread network access, than from applications on the same machines or from
dedicated client machines.
1.10.3 Usability
One of the most difficult aspects of Web applications to quantify, usability
covers many of the capabilities as well as content arrangement of a Web application.
While navigation layout and visibility, consistency, usefulness, and ease of use of a
Web application may first pop into someone's head when thinking about the usability
of a Web application, other factors in human-machine interface design are also
examined by experts in this field, including accessibility. For example, the ability of
people who cannot see or the ability of people who cannot hear to use Web
applications depends on the implementations chosen and standards of usability that
were followed when the Web application was created.
1.10.4 Compatibility
How the Web application interacts with the client-side machine can also turn
a user's experience of using the Web from a pleasant one into
a bad day. A Web application must function correctly in the user's Web
browser, it must not damage the user's settings in the Web browser or on the user's
machine, it must not harm the user's machine environment, it must not disable the
user's shared libraries, plug-ins, or other shared software, and it must not allow
unauthorized access to any information.
1.11 Narrowing the Research Focus
As I conducted research into how to test Web applications, the following
thoughts emerged and influenced how I narrowed the focus of my research. The first
two thoughts I tried to classify were what the components of Web applications are
and what the areas of most concern for Web applications are.
The network, which is the World Wide Web, appears to be the independent
component of a Web application. Web applications require the World Wide Web for
interaction with the user, but the Web application developer has no control over the
Web network itself. What the Web applications can do, however, is anticipate
network conditions and ensure that the problem conditions between a Web
application's client side and server side are handled by the Web application. Testing
how a Web application handles network problems, however, can also be conducted
from the client side of the Web application.
including load testing. However, such an abundance of security issues peculiar to the
server side of Web applications appears to exist that they, too, seem to deserve their
own, more detailed discussion.
1.12 Summary
Software applications need to be tested for many reasons and throughout their
life cycles, so testing manifests itself in many forms. Although Web applications,
like other software applications, should be tested as they are developed, especially at
the component level but also during integration and under load, the ready for users
phase of a Web application is where a large number of the bugs users uncover will
emerge. It is at this point that software testers highly skilled in attacking software are
most likely to discover the lurking problems in functionality, security, usability, and
compatibility that may make or break the Web application. The following chapters
will present tests that expose problems in Web applications in these areas.
Chapter 2
The User Interface
The user interface includes both the look and the feel of a Web application.
The look of a Web application includes the layout of screen objects, the colors and
patterns of text and graphics (including background graphics), the appearance of help
text, and the appearance of screen control objects. Consistency on each screen and
across the Web application contributes mightily to the overall appearance of
cohesiveness across the site. The feel of the Web application (how dynamic screen
objects react to user interactions with them and how the screens and screen objects
are presented and changed) also determines the ease and applicability of use of the
Web site for the user.
The term coined for how easy to use and how relevant the components of a
Web site's screens are for users is "usability." Usability also encompasses other
aspects of Web sites. Increasing use of Web sites by people with visual, aural, and
motor impairments has brought with it an increase in the attention to the need to
design Web sites to be more inclusive of people with varying physical abilities
(accessibility). Research into usability is well underway across the world, and has
been addressed by academic research programs, such as the E3 Lab at Florida
Institute of Technology, run by Dr. Shirley A. Becker, as well as by organizations
such as the W3C, in their Web Accessibility Initiative (http://www.w3.org/WAI/).
So, testing the user interface is generally testing the Web application's
usability. There are tools available to help in testing particular aspects of a site's
user interface, and there are also tools available which include these and other aspects
of usability when reporting test results. While the tools range from commercial
products to downloadable programs in the public domain, several sites also offer
online tools. Online tools can be used by supplying a URL on the tool's Web pages.
Some of the online tool sites report results right on the Web page, some send E-Mail
messages containing results, and others post the information on their site, sending an
E-Mail message that includes a URL to the information.
Several Web sites provide links to Web testing tools, including commercial,
fee-based, and free tools that are available online or for download. Sites like the
Software Testing Resources site, at http://www.aptest.com/resources.html#web-source,
and the Web Test Tools Web site described in section 1.8, which is part of the
Software QA and Testing Resource Center site, located at URL
http://softwareqatest.com/, provide annotated links to various tools that can be
used to help automate the tester's task of breaking Web software.
When a tester devises and applies this type of test, it's often helpful to "play
user" and try to emulate a novice user's behavior, as well as the behavior of a user with
intimate knowledge of the Web application. Often, as with any other software
program, a Web application's developer may unconsciously ignore error checking
due to the "a user would never do that" mindset.
Manufacturing facility in West Columbia, South Carolina. If a user opened or
otherwise fiddled with the transparent access door that allows the user to extract an
ice-filled cup of soda from a drink machine, the sequence of cup drop, syrup and
carbonated water pour, and ice addition often became disrupted enough so that the
machine dropped a cup after pouring the liquids or dropping the ice.
The assumption the drink machine designers made was that since it makes no
sense to extract the cup of soda until the syrup, carbonated water, and ice had been
added to the cup, no one would have any reason to open the access door until the
drink was built. However, users being users, several bored users actually fiddled
with the access door while waiting for their drink to be constructed. That drink
machine taught a lot of young engineers the danger of assumptions, especially of
the "a user would never do that" variety.
When HTML or other codes are incorrectly written, the Web browser that
interprets a Web application's Web page may not correctly render the contents of the
page to the user as the page designer intended.
means checking the Web page programming to ensure that it is correct according to a
particular version of HTML (usually specified by the Web page designer within the
Web page source code). Web page validation is one way to be reasonably assured
that a Web browser that claims to correctly render a particular version of the code the
Web page was written in will display the Web page content correctly to the user and
handle the users interactions such as mouse clicks in the way the Web page designer
intended.
Some errors in page encoding may not be so easily detectable, however, and
may only be found upon examination or parsing of the page source code. Visual
inspection can be made of the page source code, but automated tools for checking the
source code are readily available and tend to be more reliable than human visual
inspection, since visual inspection is prone to some of the same errors as human
encoding.
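As a minimal illustration of automated source checking, the following Python sketch flags obviously unbalanced tags in a page's markup. It is far weaker than a real validator (it knows nothing of HTML versions or attribute rules), and the page URL shown is only a placeholder.

# Rough tag-balance check; not a substitute for a true HTML validator.
from html.parser import HTMLParser
from urllib.request import urlopen

VOID_TAGS = {"br", "img", "hr", "input", "meta", "link", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append((tag, self.getpos()[0]))

    def handle_endtag(self, tag):
        if tag in VOID_TAGS:
            return
        if self.stack and self.stack[-1][0] == tag:
            self.stack.pop()
        else:
            self.problems.append("unexpected </%s> at source line %d" % (tag, self.getpos()[0]))

if __name__ == "__main__":
    url = "http://www.example.com/"          # placeholder page under test
    checker = TagBalanceChecker()
    checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    for tag, line in checker.stack:
        checker.problems.append("<%s> opened at source line %d but never closed" % (tag, line))
    print("\n".join(checker.problems) or "no obvious tag-balance problems")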
Running this test
An example of invalid Web page encoding is an order status page on the Dell
Computer Web site. Using a Web page code validation online service, such as the
Web Design Group's (WDG) HTML validation service, available online at the Web
address http://www.htmlhelp.com/tools/validator/, the tester enters the URL for
the Web page whose HTML encoding will be validated, then clicks
the validate button.
The result for the Dell computer order status page at the Web address
http://www.dell.com/us/en/dhs/topics/segtopic_orderstatus.htm displays the
URL of the page whose HTML encoding was checked, the date and time the Web
page was last modified, the character encoding used by the Web page, the
HTML version the Web page claims to have been encoded to (HTML 3.2), and the
errors and warnings found by the WDG HTML validator. Portions of the report are
shown in Figures 2.1 and 2.2.
Figure 2.1 WDG HTML Validation Report, Part 1
Figure 2.2 WDG HTML Validation Report, Part 2
The WDG HTML validator reports that the Web page contains errors such as
HTML tags that are not valid for this version of HTML (the SPAN tag, for
example), attributes that are not valid for this version of HTML (the TYPE
attribute on the STYLE tag on line 16), attribute values that are not valid for this
version of HTML (the NOWRAP attribute on the TD tag on line 30 cannot be
"1"), and duplicate attribute definitions for some tags (the BORDER attribute for
the TD tag on line 28). The report also indicates that some tags contain an extra
slash (/) just prior to the closing character (>) (an IMAGE tag on line 32). A
Web application tester who is familiar with different versions of HTML might
conjecture from these hints that the first problem with the way the page is coded is
that the HTML version the page claims to follow is incorrect.
Making use of another Web page coding validator service that is fee-based,
NetMechanic (which is located at http://www.netmechanic.com/), the tester
would receive a similar report, because the NetMechanic validator (HTML Toolbox)
reports on the validity of the code for various versions of HTML. There are various
Web page code checking tools and services available on the Web -- free, commercial,
and fee-based -- each filling in different parts of the puzzle.
Web page location targets (URLs that include page tags) are used to move
the user to a particular location on a Web page, especially if the page is long enough
to probably require the user to scroll to access parts of the page. Although tags to the
top and to the bottom of a page are most common, tags often allow the user to
navigate directly to a page location that is somewhere between the top and bottom of
the page. Even though the entire Web page is available for the user to view when the
page is rendered in the users browser, the page tags provide a means for direct
access to particular parts of the page. Even when accessed from another page, an
active link using a page tag may present the user with the desired portion of a page as
it is brought up in the browser.
If there is no active link to each area of a Web site, then if those areas are
unreachable by any other means, such as form returns, users will never view those
parts of the site, nor will users be able to participate in any interaction with those
areas of the site. When this situation exists, then besides the confusion users may
encounter when using the site, one would expect that electronic business sites would
probably produce less revenue than with correctly functioning active links.
When a Web application is modified, links to pages and page location tags
that disappear or change must be modified along with the creation of links to pages
and page location tags that are created as part of the modifications. Failure to
accurately reflect these changes will also result in broken active links.
tag, the browser will not move to the portion of the page that the developer intended
to be displayed. If the user is already on the page, then the browser usually shows no
movement or moves to the top of the page. If the user is on a different page, then the
browser usually brings up the desired page, but displays the top of the page rather
than the desired area.
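One way to catch dangling page location tags before users do is to compare the fragment targets a page actually defines against the fragments its links reference. The Python sketch below does this for same-page references only; the URL is a placeholder, and links that point at tags on other pages would require fetching those pages as well.

# Report same-page fragment links that have no matching location tag.
from html.parser import HTMLParser
from urllib.request import urlopen

class FragmentAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.targets = set()       # name= and id= values defined on the page
        self.references = set()    # fragments referenced by href="#..."

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if attrs.get("id"):
            self.targets.add(attrs["id"])
        if tag == "a":
            if attrs.get("name"):
                self.targets.add(attrs["name"])
            href = attrs.get("href") or ""
            if href.startswith("#") and len(href) > 1:
                self.references.add(href[1:])

if __name__ == "__main__":
    url = "http://www.example.com/longpage.htm"   # placeholder page under test
    audit = FragmentAudit()
    audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    for missing in sorted(audit.references - audit.targets):
        print("link to #%s has no matching page location tag" % missing)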
graphics have not yet completed. The alternate text is specified using the ALT attribute
of the graphical objects.
Images are often used as format control within Web pages. Such images are
usually very small, even single pixel graphics. Display or vocal rendering of these
images may or may not be appropriate, depending on how the formatting function
they serve contributes to the material the page is intended to convey. Even when a
Web page author uses an image to separate page content and it is not appropriate to
provide a textual alternative to the graphic, however, an empty alt text description is
preferred to a missing description.
Web authors should also be especially careful as to surrounding content when
specifying these tags, as even rudimentary vocalized walk-throughs of the page
content can reveal incongruous discourse when the alt text is not well thought out.
For example, if an image of a red text list bullet is coded as "red bullet," and
the preceding page content is similar to "Abraham Lincoln was slain by a ," then
accessibility aids might render the combination as "Abraham Lincoln was slain by a
red bullet."
fast enough Internet connection that the images on Web pages usually download too
quickly to allow the tester to view and read the alt text for all of the images on each
page. However, a tester can usually turn off image downloading for the browser that
is being used, which will enable the tester to examine the text without the image
downloading and replacing the text.
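Rather than relying solely on reading each page with image downloading turned off, a tester can also scan pages for images that carry no alternate text at all. A minimal Python sketch of such a scan follows; the URL is a placeholder, and an empty ALT value is deliberately accepted since, as noted above, an empty description is preferable to a missing one for purely decorative images.

# Flag IMG elements that have no ALT attribute at all.
from html.parser import HTMLParser
from urllib.request import urlopen

class AltTextAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.missing.append((self.getpos()[0], attrs.get("src", "?")))

if __name__ == "__main__":
    url = "http://www.example.com/"           # placeholder page under test
    audit = AltTextAudit()
    audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    for line, src in audit.missing:
        print("source line %d: image %s has no ALT text" % (line, src))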
There are also legal reasons for businesses and
organizations to consider accessibility when producing Web sites. Major companies and
organizations that have failed to do so have been successfully sued and have received public
complaints. The National Federation of the Blind, for example, brought and won a
lawsuit against AOL in 1999, claiming that AOL violated the Americans with
Disabilities Act because its Web site was not accessible to the blind. [20]
Rationale for this test
Web pages encoded with formatting items that primarily present the content
in a way that is visually pleasing do not always translate well to vocalization
software. Complicated mouse and keyboard interactions, such as those necessitated
by complicated image maps and pinpoint-accuracy-required mouse and link targets,
are examples of problems for both blind-assisting systems and motor-assisting
technologies. Motor-assisting technologies are used by people with severe motor
skill challenges such as arthritis and the inability to quickly and precisely move
fingers, hands, or arms. Improper and missing encoding of alt text for images, use of
tables to control content formatting, and poor descriptions of link targets, titles, and
headings, although providing sighted users with clarity, can very easily trip up blind
users' access-assisting technologies, thereby reducing their access to the Web site.
[20]
Running this test
Using an automated testing tool, such as WebSAT, provide the URL to the
Web site or to individual Web pages, depending on the tool's ability to traverse
active links. Examine the reports that result from running the tool to evaluate the
page content. Also examine the Web pages themselves to detect potential problems
with accessibility. Another tool for testing accessibility is Bobby, originally
developed at CAST (Center for Applied Special Technology), a not-for-profit
group that specializes in promoting accessibility of computer technology. The
CAST Web site is at URL http://www.cast.org/. Bobby is now owned by
Watchfire Corporation, and is available at the Bobby Web site at URL
http://bobby.watchfire.com/.
Commercial and other sites that cater to larger audiences are generally more
concerned with the load times of their pages, and even Web sites such as the heavy
graphic user sites mentioned above should follow enough of the usability guidelines
that are intended to help the user use the sites more easily and assimilate the
information the sites are intended to convey.
The usability of a Web site includes how easy navigation and interaction with
controls within and among the site's pages are to understand and use, how easy the
content is to follow and how the content is arranged on Web pages, accessibility, and
compatibility with browsers and other client-side tools. [23] Although personal taste
will invariably enter into the design of Web site navigation, content, and formatting,
attention to common guidelines is helpful in designing Web sites that are useful for
as broad an audience as possible.
measured and reported by automated tools. Just as text can be rated as more complex
depending on word usage and sentence structure, forms can be measured for relative
complexity.
The tester should navigate through the Web site to determine whether or not
navigation and site structure are consistent and easy to understand, with clearly labeled
links, consistent placement of navigation among different pages, availability of
alternate forms of navigation (text links in addition to graphical links), and inclusion
of navigation backwards as well as forwards within the site. One rule of thumb used
for navigation is that it should take at most three (3) mouse clicks to move from any portion
of the site to any other. [23]
The basic elements of usability should also be checked, some with automated
tools and some with visual inspection and interaction. These include the logical
arrangement and descriptions of form objects and Web page controls, accessibility,
correctly functioning controls and links on the pages, and sufficient explanation and
directions related to the need for any special plug-ins, browsers, browser settings, or
other helper applications.
Web page forms can be programmed so that form data can be checked for
content, and even checked for relative form data validity, all on the client (browser)
side. Usually, these checks are made when the user clicks on one of the form's
interactive buttons.
Rationale for this test
Programming a Web page form is generally not all that tricky, but as with any
type of software, programming errors can find their way in. When forms are
programmed by hand, programming errors like initiating the wrong action for a
button or incorrectly checking form data fields for validity can creep in by simply
calling the wrong function or misusing a variable. Even with forms that are
generated automatically by Web server page languages, however, button actions and
form data verification errors can occur.
Knowledge of what the form is intended to do, and what the data or
configuration of data on the form is intended for (the form's data domain), is of
paramount importance for the tester. Once the form's use and the data the form uses
are determined, the tester should have a good idea as to what data is
acceptable, what data should be unacceptable, how the navigation of the form is
supposed to work, and how to determine if any of these functions fails.
quality), there appear to be two forms on this page: one that allows a user to enter an
Email address and password, and another that allows a user to enter an order number
and verification data. From the information on the page, it appears that users who
placed orders online will possess the former set of information, and users who placed
orders by telephone will have the latter set of data available for checking the status of
their orders.
In this example, the online order information was entered and the submit
button on the "Ordered Online?" side of the page was clicked. As the screenshot
shows, the action taken by the form when the submit button on the online side is
clicked somehow incorrectly checks the contents of the order number and
verification data fields.
Examination of the page source code reveals that the two forms were
generated, not hand coded, and one likely contributor to the problem is that the script
function to check each form has the same name. One possibility is that the form
generator was not designed to generate two forms on the same Web page, yet was
used to do just that. The effect of this programming error is that a very obvious form
bug appeared when apparently valid data was entered, and that the error affected both
content validity checking and navigation.
Figure 2.3 Dell Order Status Form Bug
component of a Web site has the same potential for input data to create errors. The
types of potential error conditions that regular software has can therefore be expected
to be areas that should be checked on Web software: error messages, boundary
conditions, repetition of the same data, features that share or interact with each
other's data, initial condition problems, and value type problems. Any part of a Web
site that uses data to operate, and particularly those Web site operations that use user-
entered data, are potential places where errors can creep into the computational
portions of the site. [6]
data can result in error messages from site resource problems when the developer or
the software's environment fails to release resources properly and resources used per
instantiation of the data approach or overrun those available to the software. [6]
A lack of error indication when invalid data is entered can indicate that the
input data was not recognized as invalid data but was somehow magically
transformed without notification into acceptable data by the software, which is not an
acceptable outcome. Invalid input data should be recognized and flagged by the
software. [6]
Since the tester is entering data and using Web page controls from the user
interface, however, input data testing at the Web page level is still testing from the
user interface. Try to enter data such as very low and very high values to test
boundary conditions based on typical data types and sizes, such as the expected sizes
of integer types, floating point types and the accuracy expected of various floating
point types based on size. Enter numerical values that in collaboration with other
data will produce potential problem computations such as divide by zero. Even enter
data that is very near expected allowed minimums and maximums to see if
computations using that data might produce errors. [6]
Try to enter enough erroneous values to produce every error message that
seems possible, based on the expected input and computations involving that data.
For character strings, try to enter very long character strings, longer than one would
expect for fields such as first and last names, city and state names, and even freeform
text fields. Attempt to include characters that are not in the acceptable character set,
if possible. Attempt to choose more than the allowed number of choices for fields
that allow multiple choices. Attempt to leave fields blank, especially for data that
one would generally always assume must be filled in. [6]
Repeat input of the same data in computational areas and in form entries. Try
to update information using exactly the same data as was originally entered and
accepted. Attempt to change already entered data that might be used as part of the
lookup for other information. Enter data of the wrong type. For example, enter text
characters in numerical fields, floating point values in integer fields, and integer
values in floating point fields. [6]
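A tester can keep such probes organized by enumerating them ahead of time. The Python sketch below generates illustrative boundary and invalid values for a numeric field and for free-form text fields; the limits and patterns are assumptions, since the real ones come from each form's data domain.

# Candidate probe values for boundary and invalid-input testing.
def numeric_probe_values(minimum, maximum):
    """Inputs for a field documented as accepting minimum..maximum."""
    return [
        str(minimum - 1), str(minimum), str(minimum + 1),    # low boundary
        str(maximum - 1), str(maximum), str(maximum + 1),    # high boundary
        "0", "-1", str(2**31 - 1), str(2**31),               # integer-size edges
        "3.14159", "1e308", "NaN",                           # wrong numeric types
        "", " ", "abc", "12abc", "9" * 40,                   # blank, non-numeric, overflow
        "'; DROP TABLE orders;--",                           # characters outside the set
    ]

def string_probe_values():
    """Inputs for free-form text fields such as names and cities."""
    return ["", " ", "A", "A" * 10000, "O'Brien; Smith", "<script>alert(1)</script>"]

for value in numeric_probe_values(1, 12):   # e.g., a month field
    print(repr(value))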
Entry of input data can also be accomplished using automated test tools that
record traffic between the Web pages and the Web server. The recorded scripts can be
edited so that input data values the Web pages would have validated are changed to invalid
values before being sent to the server. While entering input data using the
Web pages tests the Web site's validation of input data at the page level, altering the
network packets sent from the Web pages to the Web server tests the Web site's
ability to handle invalid data on the server side. Most functional test tools include
this capability, including WebART, from OCLC (Online Computer Library Center,
Inc.)(http://www.oclc.org/webart/index.htm), WinRunner, from Mercury
Interactive (http://www-svca.mercuryinteractive.com/), Solex, a plug-in for
Eclipse, from Eclipse.org (http://solex.sourceforge.net/), HttpUnit, from
SourceForge.net (http://httpunit.sourceforge.net/), QA Wizard, from Seapine
Software (http://www.seapine.com/), MaxQ, from BitMechanic
(http://www.bitmechanic.com/), e-Tester, from Empirix
(http://www.empirix.com/), SilkTest, from Segue (http://www.segue.com/),
Rational Robot, from Rational (http://www.rational.com/), and SiteTools
Monitor, from softlight technologies, inc. (http://www.softlight.com/).
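In the same spirit as the record-and-edit capability these tools provide, a tester can hand-craft a request that skips the client-side checks entirely and delivers invalid data straight to the server-side handler. The Python sketch below illustrates the idea; the action URL and field names are placeholders that, in practice, would be read from the form's source.

# Post deliberately invalid form data directly to the server-side handler.
from urllib.error import HTTPError
from urllib.parse import urlencode
from urllib.request import Request, urlopen

action_url = "http://www.example.com/cgi-bin/orderstatus"   # placeholder form action
invalid_submission = {
    "email": "not-an-address",     # fails the page-level format check
    "ordernumber": "-1",           # out-of-range value the page would reject
    "verification": "x" * 5000,    # far longer than the field allows
}

request = Request(action_url, data=urlencode(invalid_submission).encode("ascii"))
try:
    with urlopen(request) as response:
        body = response.read().decode("utf-8", errors="replace")
        # A clean rejection message is the desired outcome; silent acceptance,
        # a stack trace, or a database error in the body points to a server-side gap.
        print("HTTP", response.status, "-", len(body), "bytes returned")
except HTTPError as error:
    print("HTTP", error.code, "-", error.reason)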
2.9 Test Number 8: Resize the Browser
Background for this test
Web pages can include content (images, tables, lists, and
paragraphs), formatting of the content, and background images. Viewing the pages
at one browser size, often influenced by screen size and resolution, even using
different browsers, can lead to the false assumption that the Web site's pages will
appear okay to users no matter how their machine and software are configured.
A developer can fail to account for what happens when a user visits a Web
site using a higher or lower screen size or resolution than the Web site developer, or
uses a browser that is sized larger or smaller than the browser the developer was
using when developing the Web pages. Unless the developer is either very careful or
very lucky, rendering the Web pages of a site in a size other than the developer used
to view the pages as they were being developed can produce visual results that were
completely unintended by the developer. The Web site could very easily be
rearranged into a visual mess, or the content could be very hard to follow.
2.10 Test Number 9: Examine Web Page Objects
Background for this test
The number and types of Web page objects, especially form items and
controls, in part determine the expected amount of time a user will need to spend on a
Web page to extract the information provided by the content on the page and to
interact with the Web site, entering data and operating controls. How the Web site
ranks in usability depends, in part, on the degree of user intuition and interaction
required to navigate within and among the pages of the Web site, whether entered
information is remembered and provided upon leaving a Web page and returning, the
amount of help and hints provided by the pages, and the level of familiarity with the
subject matter of the pages' content. The arrangement of the objects on the Web
site's pages should also be considered, as well as the frequency of page modifications
due to the pages being automatically generated and due to updates to manually
generated pages.
Results of this test
From reports generated by automated tools and from visual inspection of the
Web site's pages, the tester can find links that are incorrectly coded and pages with too
many form elements and other controls for most users to be expected to understand
and use. The tester should especially be on the lookout for Web pages that require
learning about, reasoning about, or remembering data not visible on the current page
or form. While extremely large forms can often be broken up and presented across
multiple pages at the site because different parts of a large form may really only be
needed for particular subsets of data, forms that are concerned with related data that
are split across multiple Web pages can also be confusing and difficult for the user to
remember how to fill out.
and empty anchors (code with no link encoded between the begin and end link tags),
empty interface controls are reported.
Security on the server side of a Web site includes the security measures put in
place by its system and network administrators:
changing default passwords for applications and services the site uses, blocking
access to machine ports that are not used by the Web site, and putting in place
protective system and network firewalls and the like. There are also safeguards that
Web site system administrators must take that prevent technically authorized
accesses between applications on the site. Some of the major companies that are
employing Web-based operations actually contract out security of their sites with
Web security services. [24] Users of Web sites are likely more concerned about the
privacy of their data and personal information than they are about any other security
the operators of a Web site might have, so client data security should also remain at
the top of the security concerns of the Web site developer.
Users have become paranoid enough about privacy and piracy on the Web
that many of them have begun to install their own firewalls and intrusion detection
software. Cookie data hijacking has become such a well-known source of hacked
information that many users have turned on alerts every time a cookie is touched, and
a number of users have blocked any type of cookie access whatsoever. If a Web site
uses cookies or encoded equivalents injudiciously, or fails to adequately safeguard
users' data from others, even on the client side, you can be sure that users will tend to
shun that Web site.
When no encryption is used, the data and user information that is transferred
between the client and server sides is out there for the world to see, and you can bet
that someone's looking for just such unsecured data at any given time. Cookies that
are created and contain unencrypted data and user information are available on the
client side machine for any third party application or Web site to peruse at their
leisure, particularly if the cookie is not destroyed immediately after it is set (which
pretty much would defeat the purpose of most cookies, after all, since many are used
to maintain state information between sessions or within a session but between Web
53
page accesses). While credit card numbers and other user data can be kept in
cookies, it is preferable to transmit them in encrypted form to the server and let the
server security protect them from other users.
Cookies are usually kept in a text file on the user's machine. This file can
be opened in a text editor. If the data in it is readable as text, it is not secure.
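A quick way to apply this check repeatedly is to scan the cookie file for patterns that look like readable personal data. The Python sketch below is a minimal example; the file name and the two patterns are assumptions for illustration, since cookie locations and formats vary by browser and platform.

    # A minimal sketch: scan an exported cookie text file for data that is
    # readable in the clear.  The path and the patterns are assumptions.
    import re

    SUSPECT_PATTERNS = {
        "possible card number":    re.compile(r"\b(?:\d[ -]?){13,16}\b"),
        "possible e-mail address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    }

    def scan_cookie_file(path):
        with open(path, "r", errors="replace") as cookie_file:
            for lineno, line in enumerate(cookie_file, 1):
                for label, pattern in SUSPECT_PATTERNS.items():
                    if pattern.search(line):
                        print(f"{path}:{lineno}: {label} stored in clear text")

    if __name__ == "__main__":
        scan_cookie_file("cookies.txt")   # hypothetical exported cookie file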
Using a network sniffer, examine the HTTP data packets between the client
and the server. Look for unencrypted user information in the packets. Also look for
URLs and file names on the server. Check (by entering the URL in the browser's
address line) that those file names and other URLs are not accessible without
verification from the client side. One good, inexpensive, and easy-to-use network
sniffer is Analyzer, from Politecnico di Torino, available at their Web site
(http://analyzer.polito.it/).
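The same idea can be partially automated: once the sniffer has turned up server-side URLs and file names, a short script can request each one directly, with no credentials, and report any that are served anyway. The following Python sketch assumes a hypothetical URL list; in practice the list would come from the sniffer's output.

    # A minimal sketch: request discovered URLs with no credentials and warn
    # about any that are returned without verification.
    from urllib.request import urlopen
    from urllib.error import HTTPError, URLError

    def check_unprotected(urls):
        for url in urls:
            try:
                response = urlopen(url, timeout=10)
                # Getting a page back without logging in suggests the resource
                # is reachable from the client side with no verification.
                print(f"WARNING: {url} returned {response.getcode()} with no credentials")
            except HTTPError as err:
                print(f"OK: {url} rejected the request ({err.code})")
            except URLError as err:
                print(f"SKIPPED: {url} unreachable ({err.reason})")

    if __name__ == "__main__":
        # Hypothetical URL harvested from sniffed traffic.
        check_unprotected(["http://www.example.com/admin/orders.csv"])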
unconventional ways; make similar use of Java, plug-ins, or downloadable
executable files; rely on font, color, images, or background graphics to convey
content; and do not communicate well to the user how these items will be used run
the risk of losing users who abandon the site for fear their data or other software may
be compromised. What can also happen is that Web pages that appeared to make
sense when rendered with the developer's browsers can turn into meaningless pages
of seemingly poorly formatted content, depending on local browser settings.
Chapter 3
The Network Interface
How the Web page on the client side handles invalid information from the
server, however, can be tested, as can invalid information being sent from the client
to the server. Because Web software should be designed so that the client-side Web
page does not generate invalid data to send to the server, and vice versa, injecting
faulty data into the HTTP packets received and transmitted between the two sides can
test how each side handles this type of situation, although this test is usually
performed from the client side. Adjusting the speed of network traffic between the
two sides, rather than simply causing packet loss due to timeouts, can also point out
potential timing-related issues.
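Dedicated fault-injection tools and proxies give the most systematic coverage, but even a small hand-rolled probe illustrates the idea. The Python sketch below opens a raw connection to the server side, sends a few deliberately malformed HTTP requests, and records the first line of each reply; the host name and the particular corruptions are assumptions chosen only as examples.

    # A minimal sketch: send malformed HTTP to the server side and observe how
    # it responds.  The host and the corrupted requests are illustrative.
    import socket

    MALFORMED_REQUESTS = [
        b"GET /index.html HTTP/9.9\r\nHost: example.com\r\n\r\n",   # bad version
        b"GET /index.html HTTP/1.1\r\nContent-Length: -1\r\n\r\n",  # bad header
        b"\x00\xff\xfeGET / HTTP/1.1\r\n\r\n",                      # binary noise
    ]

    def probe(host, port=80):
        for request in MALFORMED_REQUESTS:
            with socket.create_connection((host, port), timeout=10) as sock:
                sock.sendall(request)
                reply = sock.recv(4096)
                status_line = reply.split(b"\r\n", 1)[0]
                print(request[:30], b"->", status_line)

    if __name__ == "__main__":
        probe("www.example.com")   # hypothetical site under test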
3.2 Test Number 12: Change the Network Access
Speed
Background for this test
Users connect to the Internet at varying rates. High-speed connections are
becoming more prevalent these days than in the past. Currently, cable connections
and DSL connections serve those geographically lucky enough and affluent enough
to have them, but many users still use dial-up connections. Although even dial-up
speeds are higher than they were just a few years ago, individuals and businesses are
still connecting to the Internet at very low speeds.
It's easy to forget that the LAN speeds at which Web sites are often developed
are not the speeds at which the general population will access most sites. Often, the
tendency in Web site testing is to investigate how the Web server will handle
excessively high rates of demand rather than individual low-speed access. Yet how
satisfied the many low-speed visitors to a business's Web site are may determine how
well or how poorly the business operates.
Pages that download large amounts of data should take into account the
possibility that the data transfer and related acknowledgements may take a very long
time simply because the rate of data communication into some users' machines is
very slow. Users are more inclined to allow downloads of large amounts of data to
run in the background than they are to wait for excessively large Web pages to show
up in their browsers. There are several ways to compress large amounts of data to
reduce the time it takes to download, however, and there are even methods and online
services that will compress Web page content and images. But simply considering
the size of Web page content in the design of the pages can often make downloading
them more tolerable for users with low connection speeds. No matter how amazing
and wonderful the images, graphics, and sheer volume of content on a Web page may
be, if it is not being perused because users have abandoned the page due to
excessively slow loading time, all the fanciness and information in the world won't
make up for the loss of a significant number of potential customers.
relative speeds mean much and that download times may depend on network
proximity and routing from the Web site to the user's machine.
Using an automated test tool, enter the URL(s) of the Web site's pages to have
the tool generate reports on the size of each Web page and the estimated download
times at different connection speeds. Use tools like Canned HEAT, from FIT's
Center for Information Assurance, to degrade the network access speed and simulate
lower connection rates. Simulate user access at lower rates and test page loading,
navigating among, and interacting with pages from the Web site to verify whether or
not the Web site can handle activity at slow speeds.
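Even without a commercial tool, a first-order estimate of download times is easy to compute from the page size. The Python sketch below measures only the HTML document itself, so the connection speeds listed and the omission of images and other assets are simplifying assumptions; a fuller estimate would add every object the page references.

    # A minimal sketch: estimate how long one page takes to download at
    # several connection speeds.  Only the HTML itself is measured.
    from urllib.request import urlopen

    CONNECTION_SPEEDS_KBPS = {"28.8k modem": 28.8, "56k modem": 56.0,
                              "ISDN": 128.0, "DSL": 768.0}

    def estimated_times(url):
        size_bytes = len(urlopen(url).read())
        size_kilobits = size_bytes * 8 / 1000
        for name, kbps in CONNECTION_SPEEDS_KBPS.items():
            print(f"{url}: about {size_kilobits / kbps:.1f} s over a {name} link")

    if __name__ == "__main__":
        estimated_times("http://www.example.com/")   # hypothetical page under test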
Chapter 4
The Client-Side Components
contained a message to the user that the plug-in needed to be downloaded and
installed, requiring termination of the browser before the plug-in could be installed.
Today, however, many plug-ins are capable of being installed automatically while
the browser is running, which means that the user does not have to record a link to
the page that requires the plug-in, manually shut down the browser, manually install
the plug-in, manually restart the browser, and then surf back to the Web page that
contains the content.
plug-ins, and can possibly provide another method for the user to obtain or view the
content that requires the plug-in.
Another good check for plug-ins is to uninstall all plug-ins from the Web
browsers before bringing up the Web site. Pages that are encoded to use plug-ins
should recognize that the required plug-ins are unavailable and should provide help
to the user for loading the plug-in or for obtaining the content without it, if possible.
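Part of this check can be scripted. The Python sketch below flags embedded plug-in content that offers no textual fallback at all; it only looks for fallback text inside object elements, which is just one of the ways a page can help the user without the plug-in, and the page URL is a placeholder.

    # A minimal sketch: report plug-in elements that provide no fallback text.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class PluginFallbackChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.depth = 0                 # nesting depth inside <object>
            self.fallback_seen = False
            self.objects = 0
            self.without_fallback = 0

        def handle_starttag(self, tag, attrs):
            if tag == "object":
                if self.depth == 0:
                    self.objects += 1
                    self.fallback_seen = False
                self.depth += 1
            elif tag == "embed" and self.depth == 0:
                self.objects += 1          # <embed> has no fallback of its own
                self.without_fallback += 1

        def handle_data(self, data):
            if self.depth > 0 and data.strip():
                self.fallback_seen = True

        def handle_endtag(self, tag):
            if tag == "object" and self.depth > 0:
                self.depth -= 1
                if self.depth == 0 and not self.fallback_seen:
                    self.without_fallback += 1

    def check(url):
        checker = PluginFallbackChecker()
        checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))
        print(f"{url}: {checker.without_fallback} of {checker.objects} "
              "plug-in elements provide no fallback content")

    if __name__ == "__main__":
        check("http://www.example.com/video.html")   # hypothetical page under test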
process resources to handle the additional load imposed by this activity. Of course,
the client machine's configuration is the user's responsibility, but Web applications
should consider the burden that could be placed on the client machine when they
recommend and instruct the client to download, install, and execute software
originating from the Web site or from third parties.
Results of this test
Resource overuse and a browser's general inability to handle resource requests
are often evidenced by failures of the browser to operate properly. For instance,
many of the popular browsers such as Netscape and Internet Explorer begin
experiencing toolbar icon display problems when the Web pages they are rendering
cause resource availability to drop below the levels that allow the browsers to
function properly. The same types of problems, such as toolbar display issues and
operational problems with other software running on the client machine at the same
time as the Web session, may also be visible to the casual observer or tester.
Problems with overwritten shared libraries and DLLs may not surface until
long after the interaction with the Web site is over, as other applications on the client
machine that depend on these common code modules are run and begin exhibiting
strange behavior and crashes. One way to determine whether interaction with a Web
site causes such problems is to compare client machine configurations, including
shared code module versions, before and after the Web site has been tested. Tools
are available that can record machine configurations so that the reports can be
compared after any test of a Web page; any time interaction with a Web site shifts
responsibility for the machine's configuration from the browser to the user, the client
machine configuration should be considered suspect.
the Web browser, and compare the initial client machine configuration against the
configuration of the client machine following such operations, paying particular
attention to shared code module additions and version changes. Examine any
discrepancies as potential problems arising from interaction with the Web site.
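A simple before-and-after snapshot is often enough to catch such changes. The Python sketch below fingerprints the files in a shared-module directory by size and modification time and reports additions, removals, and changes between two snapshots; the directory shown is an assumed Windows location, and a thorough comparison would also record registry entries and version resources.

    # A minimal sketch: record and compare the shared-module files on the
    # client machine before and after a test session.
    import json, os, sys

    def snapshot(directory):
        """Map each file name to [size, modification time]."""
        result = {}
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            if os.path.isfile(path):
                info = os.stat(path)
                result[name] = [info.st_size, int(info.st_mtime)]
        return result

    def diff(before, after):
        for name in sorted(set(before) | set(after)):
            if name not in before:
                print(f"ADDED   {name}")
            elif name not in after:
                print(f"REMOVED {name}")
            elif before[name] != after[name]:
                print(f"CHANGED {name}")

    if __name__ == "__main__":
        # Usage: python configdiff.py save before.json     (before the session)
        #        python configdiff.py save after.json      (after the session)
        #        python configdiff.py diff before.json after.json
        directory = r"C:\Windows\System32"   # assumed shared-module location
        if sys.argv[1] == "save":
            json.dump(snapshot(directory), open(sys.argv[2], "w"))
        else:
            diff(json.load(open(sys.argv[2])), json.load(open(sys.argv[3])))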
Chapter 5
Server-Side Components
that can use the Web site at once is important; also of interest is whether or not the
same user can maintain multiple sessions on the Web site at once, on the same
machine as well as on multiple clients. Besides users who may be logged in on
multiple machines at the same time, user IDs may also be shared among several
different persons who may be logged in to the Web site concurrently.
Often regarded as an input factor for performance testing such as load testing,
the number of concurrent users is considered by some performance testing experts to
be a result of load testing instead. Because load testing is measured over time, some
even suggest that a better input for load testing than the number of concurrent users
at a given moment, which carries no real time measure, is the number of user sessions
started per unit of time. [14]
Testing for concurrency itself, then, involves finding out how many
concurrent user sessions the Web site can successfully service. If the site cannot
accept any more user sessions after some are in progress, or if the site cannot
maintain Web page load times within user-tolerable limits, or if the site cannot allow
a user to log in more than once at one time, then the Web site's ability to handle
concurrent users should be brought into question.
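The load-testing tools discussed below automate this, but the shape of such a test can be shown in a few lines. The Python sketch below starts a configurable number of simulated sessions at once and records how long each page fetch takes; the URL and session count are placeholders, and a realistic session would log in, navigate, and submit forms rather than fetch a single page.

    # A minimal sketch: open many simulated sessions at once and record how
    # long each page fetch takes (or whether it fails).
    import threading, time
    from urllib.request import urlopen

    RESULTS = []
    LOCK = threading.Lock()

    def session(url):
        start = time.time()
        try:
            urlopen(url, timeout=30).read()
            outcome = f"ok in {time.time() - start:.2f} s"
        except Exception as err:
            outcome = f"failed: {err}"
        with LOCK:
            RESULTS.append(outcome)

    def run(url, concurrent_sessions):
        threads = [threading.Thread(target=session, args=(url,))
                   for _ in range(concurrent_sessions)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        for outcome in RESULTS:
            print(outcome)

    if __name__ == "__main__":
        run("http://www.example.com/login", concurrent_sessions=50)   # hypothetical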
While a Web site may use databases, file services, and other server-side
applications, it may not always, for example, use a database within each session or
within a particular part of a transaction within a session. The known limits of
concurrency by each of the components the Web site makes use of, however, can be
used as part of an estimate of how well the site will scale in relation to user sessions.
One reason that determining where the actual problem lies is difficult in Web
site testing of concurrency, and of performance in general, is that besides the multiple
software components in use on one server-side machine, there may be multiple
server-side machines that handle different types of requests. The Web server may
attempt to balance load between multiple machines running the same component;
there may be separate machines devoted to different types of Web site component (a
Web server, an application server, and a database server, for example); and some
combination of specialization and load balancing may also be used.
of the Web site to handle multiple sessions in general, employ performance-testing
tools such as EasyWebLoad (http://www.easywebload.com/), Apache JMeter
(http://jakarta.apache.org/jmeter/), Loadrunner and Astra LoadTest, from
Mercury Interactive (http://www-svca.mercuryinteractive.com/products/),
http_load, from Acme Labs (http://www.acme.com/software/), e-load Expert,
from Empirix (http://www.empirix.com/), or one of the many other load and
concurrency testing tools available. To find the best tool for the particular Web site
under test, it is often helpful to characterize the site as "high complexity, low
traffic," "low complexity, high traffic," or "high complexity, high traffic" (the last of
which may prove considerably harder to test than the other two). [14]
The major difference, then, between load testing and stress testing can be
thought of this way: stress testing is load testing taken to an extreme. [27] In Web
site testing terms, stress testing means applying a high enough request level to
observe what happens to the site when it cannot keep up with the level of requests.
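In outline, such a test keeps raising the request level until responses start failing or slowing beyond an acceptable limit. The Python sketch below does this with simple bursts of simultaneous requests; the URL, the burst sizes, and the five-second and fifty-percent thresholds are all illustrative assumptions, and a commercial tool would ramp the load far more carefully.

    # A minimal sketch: raise the number of simultaneous requests until the
    # site can no longer keep up.  Thresholds and burst sizes are assumptions.
    import concurrent.futures, time
    from urllib.request import urlopen

    def one_request(url):
        start = time.time()
        try:
            urlopen(url, timeout=10).read()
            return time.time() - start
        except Exception:
            return None                        # count as a failure

    def failure_rate(url, requests):
        with concurrent.futures.ThreadPoolExecutor(max_workers=requests) as pool:
            results = list(pool.map(one_request, [url] * requests))
        failures = sum(1 for r in results if r is None or r > 5.0)
        return failures / requests

    if __name__ == "__main__":
        url = "http://www.example.com/"          # hypothetical site under test
        for level in (10, 50, 100, 200, 400):     # increasing burst sizes
            rate = failure_rate(url, level)
            print(f"{level} simultaneous requests: {rate:.0%} failed or slow")
            if rate > 0.5:
                print("The site can no longer keep up at this level.")
                break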
ones, livelihood, material possessions, or thesis defenses, or when changes (loss of
control of one's environmental resources) are involuntarily thrust upon people, stress
occurs. Although each person's strategy for handling stress may differ, each person
is responsible for choosing their own course of action and thereby how the losses and
changes they experience will alter their lives and the lives of those around them.
When a Web application, just like any other type of software application,
experiences stress, it is a result of the loss or change of something the application
depends upon for execution because the demand on the service is higher than the
service can handle. Possible losses include unavailability of a service such as a
database, a credit card verification service, a search engine, a stock market reporting
service, a Web server, an XML server, an application server, a legacy program, or
any other service the application uses, as well as any type of resource on any system
on the Web site. Resources like memory, files, and command processing can bog
down and become unavailable to a Web application. How the Web application
handles stress is what the tester is investigating.
attempt to partially recover from the loss, or it may refuse to continue performing the
work that requires that service or resource until it is restored. The other important
piece of data the tester needs to note is what sort of informative or error messages the
Web application gave the user when the loss occurred. (Of course, if the loss was not
directly detected, the resulting course of action is likely to result in informative or
error messages that do not truly reflect the original deprivation.)
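One small piece of this observation can be scripted: after a dependency has been deliberately taken away, fetch a page that needs it and look at what came back. The Python sketch below does only that; the URL and the tell-tale strings it searches for are assumptions, and a human reading the page is still the real judge of how informative the message is.

    # A minimal sketch: after deliberately stopping a back-end service, fetch a
    # page that depends on it and classify the response the user would see.
    from urllib.request import urlopen
    from urllib.error import HTTPError

    RAW_FAILURE_SIGNS = ("stack trace", "ODBC", "SQL", "exception", "500 Internal")
    FRIENDLY_SIGNS = ("temporarily unavailable", "try again later")

    def check_degraded_response(url):
        try:
            body = urlopen(url, timeout=30).read().decode("utf-8", errors="replace")
        except HTTPError as err:
            body = err.read().decode("utf-8", errors="replace")
        lowered = body.lower()
        if any(sign.lower() in lowered for sign in RAW_FAILURE_SIGNS):
            print("Raw failure details reached the user -- poor stress handling.")
        elif any(sign.lower() in lowered for sign in FRIENDLY_SIGNS):
            print("The user received an informative message about the outage.")
        else:
            print("No hint that anything is wrong -- inspect the page manually.")

    if __name__ == "__main__":
        # Run while the database or another dependency is deliberately down.
        check_degraded_response("http://www.example.com/search?q=widgets")  # hypothetical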
The tester can use a software test tool such as Holodeck, available from FIT's
Center for Information Assurance (http://www.se.fit.edu/), on a machine on the
server side of a Web site to simulate low system resources or loss of services.
Several performance-testing tools also purport to be able to perform stress testing on
Web sites. These tools include EasyWebLoad (http://www.easywebload.com/),
Loadrunner and Astra LoadTest, from Mercury Interactive (http://www-
svca.mercuryinteractive.com/products/), and e-load Expert, from Empirix
(http://www.empirix.com/).
References
1. Mike Powers, Why Test the Web? How Much Should You Test?, Testers'
Network, January 2000, Available:
http://www.veritest.com/testersnetwork/Web_testing21.asp, Last
accessed: February 20, 2002.
2. Eric Kaufman, Testing Your Web Site, Testers' Network, November 1999,
Available: http://www.veritest.com/testersnetwork/Web_testing1-1.asp,
Last accessed: February 20, 2002.
3. Daniel J. Mosley, Client-Server Software Testing on the Desktop and the Web,
Prentice Hall PTR, 2000.
4. The Word Spy, Web rage, Logophilia Website, November 15, 2000, Available:
http://www.logophilia.com/WordSpy/Webrage.asp, Last accessed: March
12, 2002.
5. Jonathan Lehrer, Web Rage: User Frustrations of the Internet, The Web Audit
Group, Available: http://www.webauditgroup.com/advice/frustrate.shtml,
Last accessed: February 20, 2002.
7. Jesper Rydén and Pär Svensson, Web Application Testing, Chalmers and Sigma
nBit AB, Gothenburg, February 2001.
12. Steven Splaine and Stefan P. Jaskiel, The Web Testing Handbook, STQE
Publishing, 2001.
13. Robert Zemeckis and Bob Gale, Back to the Future, Universal City Studios,
Inc., 1985.
14. Mark D. Anderson, The Top 13 Mistakes in Load Testing Applications, STQE
Magazine, Available: http://www.stickyminds.com, Last accessed: June 20,
2002.
15. Alberto Savoia, Trade Secrets from a Web Testing Expert, STQE Magazine,
Available: http://www.stickyminds.com, Last accessed: June 20, 2002.
16. Glenn A. Stout, Testing a Website: Best Practices, The Revere Group,
Available: http://www.reveregroup.com/exchange/articles/stout2.pdf,
April 2001, Last accessed: June 20, 2002.
19. A. J. Flavell, Use of alt Texts in imgs, Glasgow University, 1994, Available:
http://www.ppewww.ph.gla.ac.uk/~flavell/alt/alt-text.html, Last
accessed: October 20, 2002.
20. Frontend, Website Accessibility, Frontend Research, March 11, 2001, Available:
http://infocentre.frontend.com/servlet/Infocentre/Infocentre?page=arti
cle&id=74, Last accessed October 20, 2002.
21. Andrew Chak, Usability Tools: A Useful Start, New Architect, 2002,
Available:
http://www.webtechniques.com/archives/2000/08/stratrevu/, Last
accessed: October 20, 2002.
22. NIST, NIST Web Metrics -- Technical Overview, NIST Web site, 2002,
Available: http://zing.ncsl.nist.gov/WebTools/tech.html, Last accessed:
October 20, 2002.
23. Jean Kaiser, Criteria for Web Site Evaluation, CNET.com Web site, Available:
http://webdesign.about.com/library/weekly/aa071801a.htm, Last
accessed: October 20, 2002.
24. Anne Chen, Web Services Secure?, eWeek, May 27, 2002, pp. 47-50.
25. Russ Smith, Behind Closed Doors: What Every Tester Should Know About
Web Privacy, STQE Magazine, January/February 2001, Available:
http://www.stickyminds.com/getfile.asp?ot=XML&id=5064&fn=Smzr1
XDD2554filelistfilename1%2Epdf, Last accessed October 20, 2002.
27. Matt Baskett, Stress and Load Testing on the Web, Testers' Network,
Available: http://www.veritest.com/tester%27network/stressWeb1.asp,
Last accessed: February 20, 2002.
28. W3C, About the World Wide Web, W3C, January 24, 2001, Available:
http://www.w3.org/WWW/, Last accessed: March 19, 2002.
Additional Resources
3. Cem Kaner, Jack Falk, Hung Quoc Nguyen, Testing Computer Software,
Wiley, 1999.
4. Stefan Asböck, Load Testing for Confidence, Segue Software, Inc., 2000.
10. Edward Hieatt and Robert Mee, Going Faster: Testing the Web Application,
IEEE Software, March/April 2002.
11. Wing Lam, Testing E-Commerce Systems: A Practical Guide, IEEE IT Pro,
March/April 2001.
12. Steve Driscoll, Systematic Testing of WWW Applications, WebART Web Site,
Available: http://www.oclc.org/webart/paper2, Last accessed February
20, 2002.
13. Mark Cundy, Testing Mobile Applications is Different From Testing
Traditional Applications, Testers' Network, Available:
http://www.veritest.com/testers%27Network/mobilevstraditional.asp,
Last accessed February 20, 2002.
14. Filippo Ricca and Paolo Tonella, Analysis and Testing of Web Applications,
IEEE, 2001.
18. QA Labs, Inc., The Living Creature: Testing Web Applications, QA Labs,
Inc., 2000.
19. Paul Reeser and Rema Hariharan, Analytic Model of Web Servers in
Distributed Environments, WOSP 2000, ACM, 2000.
20. Marjorie Lovatt, Herding Cats: A Case Study on the Development of Internet
and Intranet Strategies within an Engineering Organization, SIGCPR 97,
ACM, 1997.
21. Hal Berghel, HTML Compliance and the Return of the Test Pattern,
Communications of the ACM, Volume 39, Number 2, February 1996.
23. Chaitanya Kallepalli and Jeff Tian, Measuring and Modeling Usage and
Reliability for Statistical Web Testing, IEEE Transactions on Software
Engineering, Volume 27, Number 11, November 2001.
24. Tarek F. Abdelzaher, Kang G. Shin, and Nina Bhatti, Performance Guarantees
for Web Server End-Systems: A Control-Theoretical Approach, IEEE
Transactions on Parallel and Distributed Systems, Volume 13, Number 1,
January 2002.
25. Andreas Zeller and Ralf Hildebrandt, Simplifying and Isolating Failure-
Inducing Input, IEEE Transactions on Software Engineering, Volume 28,
Number 2, February 2002.
26. Brian Globerman, Online Retailing Makes Its Mark, Testers' Network,
Available:
http://www.veritest.com/tester%27network/online_retailing1.asp, Last
accessed: February 20, 2002.
29. Melody Y. Ivory and Marti A. Hearst, The State of the Art in Automating
Usability Evaluation of User Interfaces, ACM Computing Surveys, Volume
33, Number 4, December 2001.