> > For now I have set hot_set_frac and hot_set_prob both at 100%, but
> > the hot_set_frac will allow me to change the number of pages that the
> > robot actually requests.
> All looks fine, though it is difficult to see bugs in code without
> running any tests. You may want to have a ramp phase between "startup"
> and "testing" to provide for a smooth transition, but you do not have
Apparently this doesn't work the way I wanted. The hot set is
shared among all robots, so I can limit every robot to request only 20 of
the 100 objects, but all robots will request the same 20 objects, which is
not what I wanted.
I think the only other option I can use to influence this is the
public_interest field, but it does not have the effect I'm looking
for either. With a public_interest of 50%, for instance, half of the objects
are shared among robots, but the other 50% are not shared at all, i.e. no
other robot will request them. That is also not what I was aiming at.
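For reference, a minimal PGL sketch of the two knobs discussed above. The field placement follows the standard Polygraph workload files (PopModel on the server side, public_interest on the Robot); the values are illustrative, not the poster's actual settings:

```
// PopModel controls how objects are picked from the working set.
PopModel popModel = {
    hot_set_frac = 20%;  // hot set covers 20% of the objects
    hot_set_prob = 100%; // every request targets the hot set
};

Server S = {
    pop_model = popModel;
    // other fields (contents, addresses, etc.) omitted
};

Robot R = {
    public_interest = 50%; // half the objects are shared among all robots
    // other fields omitted
};
```

As described above, the hot set is global, so hot_set_frac narrows which objects are requested but does not vary the selection per robot.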
Maybe I can use different content types like you suggested, but I think it's
impossible to make sure that the 50 objects I use each have a different
> > Yes, this is a problem, but I suppose I can solve this by adding a third
> > phase with a number of new objects that haven't been cached before (or
> > possibly disabling synchronization).
> Disabling synchronization is probably the easiest way out. Another
> option is to add low-request-rate robots with low recurrence.
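The suggestion quoted above could look roughly like this in PGL; the robot name and values are hypothetical, only the req_rate and recurrence fields are the point:

```
// A second, low-traffic robot type: with low recurrence, most of its
// requests go to objects that have never been requested (or cached) before.
Robot ColdRobot = {
    req_rate = 0.1/sec; // low request rate
    recurrence = 5%;    // only 5% of requests revisit previously requested objects
    // other fields omitted
};
```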
When adjusting my tests, I decreased the life length of the objects, which
solved this problem as a side effect :)
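Shortening object lifetimes in PGL is done through the content's object life cycle. A sketch along the lines of the standard workload files, with illustrative (not the poster's) values:

```
// Short-lived objects: cached copies go stale quickly, which produces
// the side effect described above.
ObjLifeCycle olcShort = {
    length = logn(30min, 10min);   // mean and deviation of object lifetime
    variance = 33%;
    with_lmt = 100%;               // objects carry a Last-Modified header
    expires = [nmt + const(0sec)];
};

Content cntShort = {
    obj_life_cycle = olcShort;
    size = exp(13KB);
};
```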
This archive was generated by hypermail 2b29 : Mon Feb 06 2006 - 12:00:22 MST