  • #7516
Closed
Issue created Nov 19, 2012 by Roger Dingledine@arma

Track down Will Scott's torperf-like scripts, make them public if needed, and do a trial deployment somewhere

Will Scott says he's

using the code to actively look at the performance impact of
proxies on web page load time.  Essentially it's a wrapper around
http://phantomjs.org/ with some aggregation and reporting added.
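
For concreteness, here is a minimal sketch (not Will Scott's actual code) of what such a wrapper could look like: a Python script that shells out to PhantomJS with its SOCKS proxy flags pointed at a local Tor client and times the page load. The loadspeed.js script name and the default Tor SOCKS port are assumptions.

```python
#!/usr/bin/env python
# Hypothetical sketch of the kind of wrapper described above: it shells out
# to PhantomJS, pointing it at Tor's SOCKS listener, and records how long a
# page takes to load. loadspeed.js stands in for a PhantomJS script that
# loads a URL and exits; it is an assumption, not the actual code.
import subprocess
import time

PHANTOMJS = "phantomjs"          # assumes phantomjs is on PATH
LOADSPEED_JS = "loadspeed.js"    # PhantomJS script that fetches a URL and exits
TOR_SOCKS = "127.0.0.1:9050"     # Tor's default SOCKS port

def measure(url):
    """Return wall-clock seconds PhantomJS needs to fetch url via Tor."""
    start = time.time()
    subprocess.check_call([
        PHANTOMJS,
        "--proxy-type=socks5",   # PhantomJS's SOCKS proxy flags
        "--proxy=" + TOR_SOCKS,
        LOADSPEED_JS,
        url,
    ])
    return time.time() - start

if __name__ == "__main__":
    print("%.2fs" % measure("https://example.com/"))
```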

He adds that there are two design questions that ought to get figured out:

1. Where the monitoring should live. I have servers I can use to get a system working at UW. At some point in a few years I'll graduate, and my experience is that things which get left behind decay pretty fast, so I'm somewhat hesitant to go that route.

2. How to get stable / meaningful measurements. We need enough aggregation across both the circuit and the destination domain to dampen individual server issues and be able to say something about Tor as a whole. Are there other factors I'm missing that aggregation + setting up a new circuit before each measurement won't be able to overcome?
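
As a rough sketch of the aggregation-plus-new-circuit idea in question 2, the following uses stem (Tor's Python controller library) to send a NEWNYM signal before each sample, then takes per-domain medians before a cross-domain median. The domain list, sample count, control port, and the measure() helper from the sketch above are all assumptions.

```python
# Hypothetical sketch: fresh circuits per measurement plus two-level
# aggregation. Assumes a Tor ControlPort on 9051 and the measure(url)
# helper from the earlier sketch; the target domains are placeholders.
import time
from statistics import median

from stem import Signal
from stem.control import Controller

DOMAINS = ["https://example.com/", "https://example.org/"]  # placeholders
SAMPLES_PER_DOMAIN = 10

def sample_all():
    timings = {url: [] for url in DOMAINS}
    with Controller.from_port(port=9051) as controller:
        controller.authenticate()
        for url in DOMAINS:
            for _ in range(SAMPLES_PER_DOMAIN):
                controller.signal(Signal.NEWNYM)  # request fresh circuits
                time.sleep(controller.get_newnym_wait())  # respect rate limit
                timings[url].append(measure(url))
    # Per-domain medians dampen individual-server issues; the median across
    # domains then says something about Tor as a whole.
    per_domain = {url: median(t) for url, t in timings.items()}
    return median(per_domain.values())
```

Note that NEWNYM only marks existing circuits dirty so that new streams get fresh circuits, and Tor rate-limits the signal, hence the get_newnym_wait() pause between samples.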