DevOps Automation


Tips & Tricks - RPT: Pruning Recorded Domains

By Jerry Shengulette posted Mon July 24, 2023 12:12 PM



Your browser does more than you might think under the covers. While you're accessing your company's site, you might also be checking your Google accounts, updating Facebook cookies, polling for an updated nefarious site blocker list, etc. Unless you are working on a carefully isolated machine, your browser is probably multitasking on behalf of some of your other applications. 

In and of itself, this is not automatically a bad thing; it can make your later use of those other applications more efficient. However, if you are in the process of recording a test with Rational Performance Tester, you may record more than you anticipated. Sometimes this can complicate your testing unnecessarily.

This post is here to help with that.


Let us step through the RPT recording wizard (File > New > Test From Recording).

  1. Choose the test type (HTTP Test).
  2. Provide a test name.
    [Screenshot: Adding a test name]
  3. Choose a client application (Chrome).
    [Screenshot: Selecting Google Chrome]
  4. For the purposes of this exercise, assume the Google Chrome Recorder Settings are okay as-is.
    [Screenshot: Partial image of Google Chrome Recorder Settings]
  5. When the browser instantiates, recording begins.
  6. Some time passes while navigating through the application-under-test.
  7. The example test is not particularly complex. It's a Google search on a name. When the results page has fully loaded, close the browser and wait for the next dialog box.

This is the interesting part. The test was only a Google search, yet the list of recorded domains is certainly larger than the Google domains alone. Everything else is background noise from the browser.

The idea is to uncheck the boxes that are irrelevant to the application-under-test.

In this case, deselecting every domain that is not part of the search prunes the background noise.

The catch is that background noise varies by browser and can change over time, as a company adopts new software, for example.

How do you identify the background noise?

Start a recording session using File > New > Test From Recording.

Step through the wizard as demonstrated in the Introduction.

When the chosen browser instantiates, go for coffee.


The packet counter in the HTTP Recorder console will continue to increase over time, even with no destination URL provided to the browser.


After coffee, close the browser. Wrapping up the lingering open connections may take a bit longer; if it takes too long, feel free to use the blue "Stop Recording Session" button. The list of domains displayed is entirely background noise. Take a screenshot of the list and proceed with the actual recording of the application-under-test. When the domain list is presented again at the completion of your recording session, you will know which entries can be deselected.
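The comparison between the two lists can be sketched in a few lines. The domain names below are hypothetical placeholders, not RPT output; substitute the lists from your own baseline ("coffee") recording and your real application recording:

```python
# Sketch: deciding which domains to deselect in the RPT domain list.
# All domain names here are hypothetical placeholders.

# Domains captured during an idle (no destination URL) baseline recording:
baseline_noise = {
    "update.example-browser.com",
    "telemetry.example-vendor.com",
    "safebrowsing.example.net",
}

# Domains captured while recording the application-under-test:
recorded = {
    "app.example-company.com",
    "auth.example-sso.com",
    "update.example-browser.com",
    "telemetry.example-vendor.com",
}

# Keep everything that did not appear in the baseline noise:
keep = sorted(recorded - baseline_noise)
# Deselect anything that also showed up while the browser was idle:
deselect = sorted(recorded & baseline_noise)

print("Keep checked:", keep)
print("Deselect:    ", deselect)
```

Set difference keeps any domain unique to the real recording, including third-party services such as an authentication provider that the test genuinely needs.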

An interesting note here is that Google's own domains appear in the background noise, suggesting that Google isn't the best site to record for a clean example. For now, take it as a real-time illustration that it is a good idea to be familiar with the specific list of domains you wish to record.

Words of Warning

Pruning a test like this has the obvious benefits of reducing the overall size of your test and avoiding the accidental, sometimes troublesome, correlation of network traffic that was never relevant. It has another benefit that might not be as obvious.

If your requirement is to record a test and then set up a schedule to emulate 10,000 users against your application, AND you do not prune your domain list, some of these extraneous sites may not be excited to see 10,000 emulated users hitting their site at roughly the same time. They can identify your point of origin and, even though you may be doing this accidentally, they may consider it a denial-of-service attack.
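To see why, a rough back-of-the-envelope calculation helps. The numbers below are illustrative assumptions, not measurements from any actual RPT schedule:

```python
# Back-of-the-envelope sketch: traffic an unpruned test could aim at a
# third-party domain. All numbers are illustrative assumptions.

virtual_users = 10_000        # users emulated by the schedule
requests_per_iteration = 3    # stray background requests to one extraneous domain
iteration_seconds = 60        # time for one pass through the test

requests_per_second = virtual_users * requests_per_iteration / iteration_seconds
print(f"~{requests_per_second:.0f} req/s at a domain you never meant to test")
```

Even a handful of stray requests per iteration adds up to hundreds of requests per second once the schedule scales out, which is exactly the traffic pattern a third party might read as an attack.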

Having said this, it may be necessary to include some third-party domains that are used for authentication or other ancillary services. If you don't include these domains, your test will fail.

As with many things related to performance testing, familiarity with the data transmitted to/from the application-under-test and a willingness to experiment are valuable contributors to success.