Research: what's the relationship between capacity, bytes used, and torperf?
It seems that when our network capacity goes down, torperf latency goes up, and vice versa. But it's not a perfect correlation, and sometimes the response is more damped than at other times.
Seems to me that the less capacity there is relative to actual load, the more losing further capacity will hurt the torperf results; and the more capacity there is relative to actual load, the less gaining further capacity will help them.
Is there a way to predict where the 'healthy' or 'unhealthy' parts of this relationship graph are?
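One hedged way to think about where the graph turns "unhealthy": in a simple queueing model (a toy M/M/1 sketch, not a claim about Tor's actual internals), latency grows like 1/(capacity − load), so it blows up as utilization approaches 1. That shape would reproduce the asymmetry above: the same capacity change barely matters with lots of headroom, and matters a lot near saturation. The numbers below are made up for illustration.

```python
def latency(capacity, load):
    """Mean M/M/1 sojourn time, 1/(capacity - load); infinite at or past saturation."""
    if load >= capacity:
        return float("inf")
    return 1.0 / (capacity - load)

load = 80.0  # hypothetical offered load, arbitrary units

# Losing 10 units of capacity when there's lots of headroom: small effect.
slack_hit = latency(190.0, load) - latency(200.0, load)
# Losing the same 10 units near saturation: much larger effect.
tight_hit = latency(90.0, load) - latency(100.0, load)

print(slack_hit)  # small
print(tight_hit)  # tens of times larger
```

If something like this holds even roughly, the "healthy" region is wherever measured utilization leaves enough slack that the curve is still flat.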
Step one is to try to figure out what data points we have so far, and get an intuition for what we think happened, and how healthy we think the network was, in each case.
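A minimal sketch of step one, assuming we can export two aligned time series (network capacity and torperf completion time) as plain lists; the series below are made up for illustration, and real data would come from the metrics archives.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

capacity = [100, 95, 90, 110, 120, 85, 80]      # hypothetical capacity per day
torperf  = [2.1, 2.3, 2.6, 1.9, 1.8, 3.0, 3.4]  # hypothetical seconds per fetch

r = pearson(capacity, torperf)
print(round(r, 2))  # prints -0.93: strongly negative, i.e. less capacity, slower fetches
```

A single coefficient over the whole history would hide exactly the damping effect noted above, so it's probably worth computing it over sliding windows and seeing when the correlation strengthens or weakens.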
Step two might be to design some experiments where we intentionally gain or lose capacity in a controlled way, and observe the effect on torperf results, either to confirm a hypothesis or to get more data points.