
Introduction to the WDK Automated Test Framework

Introduction

The WDK Automated Test Framework was originally created by Documentum for internal testing of WDK components and WDK applications. Although the framework is not an officially supported EMC product, it is made available to partners and developers via the EMC Developer Network (EDN).
The WDK Automated Test Framework is available in two versions:

  • Version 1.0 is compatible with WDK 5.3 SP5 (or later service packs)
  • Version 2.0 is compatible with WDK 6.0.

The test framework includes a web-based user interface which allows you to create new tests by recording actions in your custom Webtop or WDK application. The framework is tightly integrated with WDK and runs on any supported WDK browser. It records WDK mouse selections, keyboard shortcuts, and right-click menu actions, and it supports UCF transfers for importing, viewing, or versioning documents. The user interface also allows you to play back recorded tests and verify the behavior of your application without any command-line interaction or editing of configuration files.

Setting up the Framework

The testing framework works with any Webtop or WDK application, but requires that you deploy your application as an unpacked webapp directory rather than a single WAR file. The unpacked directory is necessary because new tests are saved as XML and Java files inside the webapp directory. Log in to your application and verify that everything is working correctly before you move on.
Note:
For clarity, this article bases all examples on an environment consisting of Webtop 6.0 running on Tomcat version 5.5.
Download the correct version of the WDK Automated Test Framework from EDN; version 2.0 of the framework is required for Webtop 6.0. Unpack the test framework files into the directory that contains your webapp. The test framework only adds new files, so you should not need to overwrite any existing files. Unpacking the archive is the only step necessary to set up the test framework for testing a standard Webtop installation that does not use any custom WDK controls.
Once this archive is unpacked, you may access the WDK Automated Test Framework UI at:
https://{host}:{port}/{webapp}/component/testtool
where {webapp} is the name of the deployed Webtop or WDK application. For example, if you are testing a Webtop instance running on port 8080 of your local machine, then you would point your web browser at:
https://localhost:8080/webtop/component/testtool
Now that you have verified that the test framework has been correctly deployed, it is time to specify the parameters for recording test cases.

Configuring Recording Parameters

The WDK Automated Test Framework UI generates two artifacts for each recording: an XML configuration file and a Java class. The auto-generated Java class extends com.documentum.web.test.ComponentTestCase and defines a sequence of actions to perform. The XML configuration file is created inside the webapp directory structure under /config/testcase/ and defines values that are used to automatically navigate the WDK application and to verify expected values.
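The exact contents of both files depend on the actions you record, but the overall shape of the generated Java class looks roughly like the sketch below; the package, class name, and method are hypothetical placeholders, and only the ComponentTestCase base class comes from the framework itself.

  package com.example.webtest;

  import com.documentum.web.test.ComponentTestCase;

  // Hypothetical sketch of an auto-generated test class. The real recorder fills
  // in the sequence of recorded UI actions; the names below are placeholders.
  public class ImportAndSubjectTest extends ComponentTestCase
  {
      public void testImportAndSubject() throws Exception
      {
          // Recorded navigation, import, attribute-edit, and validation steps are
          // generated here and driven by the values stored in the companion XML
          // file under /config/testcase/.
      }
  }
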
To get started, click the [Test Recorder Launcher] link at the top of the main test framework page to access the Recorder Launcher, and then update the settings which will be used to record your new test case.

Figure 1: The Test Recorder Launcher

Figure 2: Test Recorder Configuration Settings

  1. The most important settings are the names and locations of the new XML configuration file and the associated Java class. The “classname” is the name of the new Java class and “testcaseId” is the name of the new XML configuration file (concrete example values appear after this list).
  2. The “packagename” value indicates what Java package will be declared in the new auto-generated Java class. The “path” value is a directory inside the webapp directory structure and should match the “packagename”.
  3. The “configpath” is the full directory path of the XML configuration file which will be created by the Test Recorder.
  4. The “URL” setting controls the starting URL for the test case. By default this is the root Webtop component, but you can choose a specific component, for example “/component/search”.
  5. Set “auto login” to “Yes” and enter valid “user”, “password”, and “docbase” login information. The session is logged in automatically with these credentials; they are not recorded in the test case, and you can specify a different login when you play back the test.
  6. The “clientDataFolderPath” specifies a directory on your local machine which contains files used during testing, for example a sample document to be imported.
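
As a concrete illustration, the settings below would generate a Java class and an XML configuration file in matching locations inside the webapp. Every name and path shown here is a made-up example, not a default value:

  classname            = ImportAndSubjectTest           (name of the generated Java class)
  packagename          = com.example.webtest            (package declared in the generated class)
  path                 = {webapp}/com/example/webtest   (directory for the generated .java source; mirrors the package name)
  testcaseId           = ImportAndSubjectTest           (name of the generated XML configuration file)
  configpath           = {webapp}/config/testcase       (directory where the XML configuration file is written)
  URL                  = /component/search              (starting component; leave the default to start at the Webtop root)
  clientDataFolderPath = C:\testdata                    (local folder holding files used during the test, such as a document to import)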

Now that you’ve successfully set up the testing framework, it is time to record a test.

Recording a Test

A simple Webtop test has three stages:

  1. Start Recording
  2. Perform Validation
  3. Finish Recording

The test framework Test Recorder augments the WDK interface to show the webapp being tested in the center of the web page with some special links at the top and bottom. These new links allow you to manage the recording process.

1. Start Recording

Click the [Record] button located in the bottom right corner of the Test Recorder Launcher to start recording your test. A Recorder Monitor window will be opened and you will immediately be prompted to click [OK] in order to start recording your test actions.

Figure 3: Click [OK] to start recording

For this introductory example, we will record the steps to perform an import and then modify an attribute. Follow these steps to record your test:

  • Navigate into a temporary folder and import a sample document. Note that the standard Webtop UCF client is utilized for the import.
  • Use Webtop to view the properties of the imported document, assign a new value to the subject attribute, and save the changes.

Figure 4: Import a test document

Figure 5: Modify the Subject Attribute

This sample document is now ready to be validated in the next stage of our test.

2. Perform Validation

An important part of testing is validating that the correct data is displayed to the user. To validate that the test document has the correct attribute value assigned, open the standard Webtop properties viewer and click the “Turn On Inspection Mode” link (located in the bottom right corner of the Recorder Monitor).

Figure 6: Click – Turn On Inspection Mode

Inspection mode allows you to inspect the values of any WDK component shown by the webapp and record a comparison-check of these values as part of your test. The test will fail if a different value is encountered during playback. When inspection mode is active, a red box is shown as you move the mouse pointer over elements on the properties page. Continuing our example, click the “Subject” attribute field to open a validation box and perform a comparison against the current value.

Figure 7: Select the “Subject” attribute for inspection

Figure 8: Dialog when inspecting the “Subject” attribute

You may stop inspection mode by clicking the “Turn Off Inspection Mode” link in the bottom right corner. Close the property inspector and return to the folder browser in Webtop.

Figure 9: Click – Turn Off Inspection Mode

Let’s validate that the correct version label is displayed in the document list. To do this, enable inspection mode when the document list is displayed:

Figure 10: Select the version label for validation

Figure 11: Validate the current version label

As before, click [OK] to record your validation values and click “Turn Off Inspection Mode” to end inspection mode.

3. Finish Recording

Now that we’ve performed a few validation tests, let’s clean up the content repository. Using Webtop, delete the test document you previously created and then click the “Stop Recording” link in the bottom right corner of the Recorder Monitor. This ends your test recording. Now click the [Compile] link to compile and hot-deploy the Java class associated with your new test. Close both the “Compilation” and “Recorder Monitor” windows to get back to the main test framework UI.

Figure 12: After recording is finished

Figure 13: Compile the Java class for your test

Executing the Test

Now let’s execute the test we just recorded. Note that new tests are not immediately visible for playback. To run a recently recorded test you must click the [Refresh ConfigService] button in the bottom left corner of the Test Launcher and then switch to view by “testcase”.

Figure 14: Click [Refresh ConfigService]

Figure 15: Select View By: testcase

Select the new test in the left-side tree and then click the right-arrow button to add the test to the current Test Plan. Set “auto login” to “Yes” and enter appropriate authentication information.

Figure 16: Add a test to execute

Figure 17: Enter authentication information

Click the [Launch] button to start playback. After playback finishes, the test results viewer shows success or failure for each test, along with timing information. You can click on individual tests to view more information, which is particularly helpful for finding out why a test failed.


Figure 18: Monitor playback in progress

Figure 19: Results after playback

Figure 20: Details of an executed test case

Tips & Tricks

Use the [Save State] button to record custom settings.
The settings in the Test Recorder Launcher will be reset to default values when the webapp server is restarted or your browser session ends. The [Save State] button saves the current configuration values out to a file, and you may reload these values anytime using the [Load State] button. Once you establish the recording configuration appropriate for your environment, be sure to save the state. This enables you to quickly reload it later when you want to record a new test case.
The test recorder can be finicky.
Plan out the sequence of actions before you start recording and keep your tests straightforward. You can break one large test into a series of smaller tests and play them back in sequence using a custom testsuite. It will be easier to re-record individual sections of a large test when necessary rather than the entire test.
Pay attention to which version of Webtop is supported.
For example, version 2.0 of the WDK Automated Test Framework (Build Number: 6.0.0.089) only indicates support for Webtop 6.0. We have experienced intermittent errors when using Webtop 6.0 SP1, and these errors were resolved by switching to Webtop 6.0 with no service pack.
Avoid reusing testcase names.
Although the test framework supports hot-deploy of Java classes, updating an existing test can cause Tomcat to throw an exception or otherwise fail to reload the Webtop application. You may need to restart Tomcat, and Tomcat could end up in an invalid state which requires you to manually empty the Tomcat work directory. If you simply append a new version number to any test that is re-recorded, you can avoid issues caused by hot-deployment of existing Java class files.
The recorder may include unintended comparisons.
This issue may require new tests to be manually tweaked before they will play back correctly. For example, we have seen the recorder unexpectedly check for a specific value of the r_access_date attribute. This works fine when the test is first recorded, but fails each time the test is played back. To remove these unintended validations, you must edit the XML configuration file and remove the fragments related to the fields that should not be validated (r_access_date in the above example). Because of the way the auto-generated Java classes are structured, it may even be necessary to modify the Java class in order to remove the validation step. Straightforward tests are less likely to include unintended comparisons.
Do not leave the test framework on a production system.
The test framework allows you to edit and hot-deploy Java classes without providing authentication information, which poses a security risk. Make sure you update your deployment process so that you can easily deploy without the framework files, or delete all files and directories associated with the test framework after deployment. The WDK Automated Test Framework User Guide lists which files must be deleted and includes an Ant snippet for packaging a WAR file that excludes all of the test framework files.

Advanced Usage

Once you understand how to create and play back a basic test, you may want to move on to more advanced test cases. Here is an overview of the framework’s advanced features:

Setup and TearDown

When recording a new testcase, you can designate tests to be run as a setup or teardown step. As with standard JUnit-based Java test cases, the setup is performed before executing your test and the teardown is performed after the test execution completes. This allows you to encapsulate common setup or teardown behaviors into a special test which can be reused in new tests.
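Because the framework builds on the JUnit TestCase model, the ordering is the familiar one: setup runs before the test method and teardown runs after it. The generic JUnit 3 sketch below only illustrates that ordering; it is not code produced by the test framework.

  import junit.framework.TestCase;

  // Generic JUnit 3 lifecycle illustration: setUp() runs before each test method
  // and tearDown() runs after it, which is the same ordering the test framework
  // applies to tests designated as setup or teardown steps.
  public class LifecycleExample extends TestCase
  {
      protected void setUp() throws Exception
      {
          // e.g. create the temporary folder the test will import into
      }

      public void testRecordedActions() throws Exception
      {
          // the recorded actions and validations run here
      }

      protected void tearDown() throws Exception
      {
          // e.g. delete the test document and the temporary folder
      }
  }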

Figure 21: Specify Setup and TearDown tests

Variables

You can define input and output variables which will be passed between tests at runtime. For example, you might specify a variable that holds the r_object_id of a test document so that it can be deleted by a teardown test. When sequencing tests together, the output variables from one test are automatically passed into the next test as input variables. You can specify default values which may be overridden when tests are launched in the UI. Managing variables does add complexity, but it also provides flexibility for creating advanced tests.
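The accessor methods for variables are defined by the framework and appear in the generated code, so consult that code and the Development Guide for the real names. Purely as an illustration of the flow, a sketch with placeholder accessors might look like this:

  // Conceptual sketch only: the two accessor methods are hypothetical placeholders
  // standing in for whatever mechanism the framework actually provides.
  public abstract class VariableFlowSketch
  {
      protected abstract void setOutputVariable(String name, String value);
      protected abstract String getInputVariable(String name);

      public void testImportDocument() throws Exception
      {
          // ... recorded import actions ...
          String objectId = "0900000180001234";                 // example r_object_id
          setOutputVariable("testDocumentObjectId", objectId);  // handed to the next test
      }

      public void testDeleteDocument() throws Exception         // e.g. a teardown test
      {
          String objectId = getInputVariable("testDocumentObjectId");
          // ... recorded delete actions that remove the document with this id ...
      }
  }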

Modify a Test Case

Once you have created a useful test, you may want to integrate the Java class and XML configuration file into your build process. The compiled class can be deployed inside a JAR file rather than hot-deployed from source. The com.documentum.web.test package is provided by the standard webtop.jar, so no additional libraries are required to compile the test classes. The variable names can also be cleaned up so that it is easier to reuse tests.

Custom WDK controls

If your application uses custom controls, you may need to customize the testing framework in order to record tests. The WDK Automated Test Framework Development Guide explains how to instrument the test framework to handle custom events and to inspect custom controls. If a custom control is not supported, the test recorder inserts an “unknown control” comment in the generated Java test class.

Tracing

If you are already familiar with WDK trace flags, then you may be interested to know that the test framework introduces three new flags for tracing:

  • com.documentum.web.test.Trace.TESTRECORDER

This flag traces the recording and production of a test case.

  • com.documentum.web.test.Trace.TESTCASE

This flag traces the traversal of test suites and test cases and the execution of test cases.

  • com.documentum.web.test.Trace.TESTSTEP

This flag traces the traversal of test steps and the execution of test steps.

I encourage you to visit the EMC Developer Network (EDN) for the latest version of the WDK Automated Test Framework and to access the developer forums.

Conclusion

The WDK test framework provides a robust, reproducible mechanism for performing automated end-to-end WDK testing. Because value inspections can be performed on any WDK control, you can test any functionality invoked, directly or indirectly, from the UI, including complex Documentum functionality such as lifecycle state transitions and custom TBO behaviors. Although custom WDK components require additional customization of the test framework, the many benefits of automated WDK testing should easily make up for the additional effort.
