<< Viewing a Verification Point in the Comparators | About Test Case Reports >>

Playback/Environmental Differences

Differences between the recording environment and the playback environment can
generate failure indications that do not represent an actual defect in the software. This
can happen if applications or windows that were open in the recording environment
are not present in the playback environment, or vice versa.
For example, suppose you create a file using Notepad in the recording environment.
When you play back the test script, the file already exists, and the test log shows a
failure that has nothing to do with the software you are actually testing.
You should analyze these apparent failures with the appropriate Comparator to
determine whether the window that Robot could not find is an application window
that should have opened during the test script playback or an unrelated window.
Intentional Changes to an Application Build
Revisions to the application-under-test can generate failure indications in test scripts
and verification points developed using a previous build as the baseline. This is
especially true if the user interface has changed.
For example, the Window Image verification point compares a pixel-for-pixel bitmap
from the recorded baseline image file to the current version of the
application-under-test. If the user interface changes, the Window Image verification
point fails. When intentional application changes cause failures, you can easily
update the baseline file to match the new interface using the Image
Comparator. Intentional changes in other areas can be handled similarly with the
other Comparators.
For information about updating the baseline, see Using the Comparators on page 221.
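The pixel-for-pixel bitmap comparison described above can be illustrated with a minimal sketch. This is not Robot's actual implementation; the representation of an image as a two-dimensional list of pixel values, and the function name `images_match`, are assumptions made for illustration only:

```python
# Minimal sketch of a pixel-for-pixel bitmap comparison (illustrative
# only, not Robot's actual algorithm). Images are represented here as
# 2D lists of pixel values; any mismatch fails the verification.

def images_match(baseline, current):
    """Return True only if every pixel in `current` equals `baseline`."""
    if len(baseline) != len(current):
        return False  # images have different heights
    for base_row, cur_row in zip(baseline, current):
        if base_row != cur_row:
            return False  # differing width or pixel values in this row
    return True

baseline  = [[0, 0, 255], [255, 255, 255]]
unchanged = [[0, 0, 255], [255, 255, 255]]
revised   = [[0, 0, 254], [255, 255, 255]]  # one pixel differs

print(images_match(baseline, unchanged))  # True
print(images_match(baseline, revised))    # False
```

Because a single changed pixel is enough to fail the comparison, even a deliberate, cosmetic change to the user interface forces an update to the baseline image.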
Reporting Results
TestManager lets you create many different types of reports. The reporting tools that
TestManager provides are flexible enough for you to create a variety of queries and
display formats that help you determine whether your testing effort comprehensively
covers all of the requirements for your application. TestManager also provides several
default report definitions that you can use to create simple reports.
TestManager provides the following types of reports:
Test case reports. Use to track the progress of planning, implementation, and
execution of test cases.