Recycling tests

There are two distinct ways to recycle a Polygraph test.

This page describes the common principles behind test/URL recycling. Please see the Continuing Tests and Repeating Tests pages for documentation specific to those two techniques.

1. Do not do it

In most cases, you do not want to force Polygraph to reuse URLs. By default, Polygraph uses a unique set of URLs for every run. Unique URL sets are convenient because they reduce side effects that one test might have on another. For example, one does not have to flush (empty) the cache between two tests to get the configured hit ratio, because the second test will not (by default) reuse old URLs.
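For context, the configured hit ratio mentioned above comes from the PGL workload, not from URL reuse. The sketch below is illustrative only: the object names are invented, the values are arbitrary, and a real workload needs many more fields (addresses, content mixes, phases).

    // Hypothetical content and robot definitions. The recurrence and
    // cachability fields are what shape the offered hit ratio
    // (roughly, offered DHR = recurrence * cachability).
    Content cntSimple = {
        size = exp(13KB);    // object size distribution
        cachable = 80%;      // portion of objects that may be cached
    };

    Robot R = {
        recurrence = 55%;    // portion of requests that revisit old URLs
        // other required Robot fields (origins, request rate, ...) omitted
    };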

If you read this page out of curiosity, stop now and come back only if you really need two tests to use the same URL set.

2. What is [not] reproducible

Workload features explicitly controlled by PGL objects can be reproduced with good precision. For example, two tests with the same object size distribution are likely to yield very similar object size distributions. This is default Polygraph behavior, and no special effort is required to enable it. In other words, Polygraph tests are meant to be reproducible by default. Reproducibility in this context does not imply reusing things like URL strings or request sequences; it implies reusing things like object size distributions or request interarrival distributions.
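For example, the declarations below are reproduced in the statistical sense: each run generates new objects and new URLs, but their sizes and request timing follow the same declared distributions. The names and values are illustrative, not taken from a real workload.

    Content cntExample = {
        size = exp(13KB);     // exponential sizes with a 13KB mean in every run
    };

    Robot R = {
        req_rate = 0.4/sec;   // mean request rate; exact submission times still vary
        // other required Robot fields omitted from this sketch
    };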

It is possible to force Polygraph to use the same URL set or request sequence in two tests using one of the two techniques mentioned above (continuing or repeating a test).

In general, it is not possible to repeat exact request submission times or exact network packet sizes because Polygraph does not have much control over them: kernel scheduler decisions and TCP stack behavior cannot be reproduced in a typical environment. Fortunately, such exact reproduction is usually unnecessary.