Repeating exactly the same tests

Please read the tests recycling page for an alternative to the kludge described below.

Table of Contents

1. Introduction
2. Test #1
3. Test #2

1. Introduction

On rare occasions, it is necessary to run exactly the same test more than once. In general, this is not possible because some traffic variations are outside of Polygraph's control. However, it is possible to replay a very similar sequence of URLs with very similar properties.

When you use the technique described below, it is very important to keep in mind that Polygraph does not know that you are repeating a test. For example, Polygraph does not know what URLs were requested during the original test and has no special expectations for the URLs it requests in the repeated test.

You may wonder why Polygraph cannot realize that the test is being repeated. After all, you are going to use some command-line options (described below) that could give Polygraph a clue. The problem is that knowing that the test is being repeated is not sufficient. For example, to act intelligently, Polygraph would also have to know (among many other things) how long the previous test was executed for. Without such knowledge, Polygraph cannot, say, expect a hit when requesting a URL for the first time during the second test, because it has no way to tell whether that URL was requested during the first test.

To run the exact same test, you have to use special command-line options when running both the original test and the repeated one. The sections below describe the procedure.

2. Test #1

With each Polygraph process, use the "--unique_world no" command-line option. This option prevents Polygraph from generating unique URLs. Instead, Polygraph will depend on the seed of the local random number generator to generate URLs.

For each Polygraph process, specify an explicit seed for the local random number generator using the --local_rng_seed command-line option. Sequential seeds are fine, but every process must use a unique seed; no two processes may share one. The default seed is 1, so relying on the default would violate this requirement. Note that using the same seed across processes is OK with the default --unique_world setting (though it is better to use different seeds anyway).

clt1> polyclt ... --unique_world no --local_rng_seed 101
clt2> polyclt ... --unique_world no --local_rng_seed 102
...
srv1> polysrv ... --unique_world no --local_rng_seed 201
srv2> polysrv ... --unique_world no --local_rng_seed 202
...

Record the seed values you have used.

Other than the two options described above, you can run the test as usual.

3. Test #2

Run the second test exactly like you ran the first one. Use the same PGL configuration and the same command-line options (including the seed values). You can change log file names, but do not change anything else.
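
For example, if the first test was started with the commands shown in the previous section, the second run could look like the sketch below. The --config and --log options, the repeat.pg workload file, and the log file names are illustrative assumptions; only the log file names differ from the first run, while the seeds and the PGL configuration stay the same.

clt1> polyclt --config repeat.pg --log clt1-run2.log --unique_world no --local_rng_seed 101
clt2> polyclt --config repeat.pg --log clt2-run2.log --unique_world no --local_rng_seed 102
...
srv1> polysrv --config repeat.pg --log srv1-run2.log --unique_world no --local_rng_seed 201
srv2> polysrv --config repeat.pg --log srv2-run2.log --unique_world no --local_rng_seed 202
...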

This second test should yield the same URL sequence as the first test.

During the second test, you may get some "new" URLs or miss some "old" ones if the durations of the two tests differ or if there were errors.

