

Legacy / Trac / Issues / #1656

Status: Closed (moved)
Opened Jul 02, 2010 by Peter Eckersley @pde

More efficient ruleset checking

Currently every request from the browser incurs an overhead that is O(N), where N is the number of rules (or rulesets with match_rules).

This is not good enough, especially if we intend to include all the rules people are submitting.

O(1) lookups should be possible. One way is a dictionary of target domains, with lookups looking something like this:

If the request is for content at blah.thing.com, we look up

*.thing.com, thing.*.com, blah.thing.*

in the dictionary. For the time being, rulesets should be able to signal which domains they target with at least that level of specificity.
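The lookup scheme above can be sketched as follows. This is a hypothetical illustration, not HTTPS Everywhere code: the function and variable names (candidate_targets, lookup_rulesets, target_index) are invented for the example, and it generates one wildcard candidate per hostname label, matching the blah.thing.com example in the issue.

```python
def candidate_targets(host):
    """Generate the wildcard keys to probe for a hostname.

    For blah.thing.com this yields:
      blah.thing.com, *.thing.com, blah.*.com, blah.thing.*
    """
    labels = host.split(".")
    yield host  # exact match first
    for i in range(len(labels)):
        # Replace one label at a time with a wildcard.
        yield ".".join(labels[:i] + ["*"] + labels[i + 1:])

def lookup_rulesets(target_index, host):
    """Collect rulesets whose declared targets cover this host.

    target_index maps target pattern -> list of rulesets. Each probe is
    a single dict lookup, so the cost depends only on the number of
    labels in the hostname, not on how many rulesets are loaded.
    """
    hits = []
    for key in candidate_targets(host):
        hits.extend(target_index.get(key, []))
    return hits

# Example: a ruleset that declared *.thing.com as a target is found
# in a constant number of dict probes.
index = {"*.thing.com": ["ThingRuleset"]}
print(lookup_rulesets(index, "blah.thing.com"))
```

With this shape, per-request cost is O(labels) dict probes rather than O(N) over all rulesets.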

E.g., Google.xml targets:

google.* www.google.* google.com.* www.google.com.* google.co.* www.google.co.*
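Building the dictionary from each ruleset's declared targets is a one-time pass at load. A minimal sketch, assuming rulesets are available as (name, targets) pairs; build_target_index is a hypothetical helper name:

```python
from collections import defaultdict

def build_target_index(rulesets):
    """Index rulesets by their declared target patterns.

    rulesets: iterable of (name, targets) pairs, where targets are the
    wildcard patterns a ruleset declares (as Google.xml does above).
    Returns a dict mapping each target pattern to the rulesets that
    declared it, so request-time lookup is a plain dict probe.
    """
    index = defaultdict(list)
    for name, targets in rulesets:
        for target in targets:
            index[target].append(name)
    return index

# Example: register the Google.xml targets listed above.
index = build_target_index([
    ("Google.xml", ["google.*", "www.google.*",
                    "google.com.*", "www.google.com.*",
                    "google.co.*", "www.google.co.*"]),
])
```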

BUT, if we ever had to worry about *.google.* this wouldn't be enough...

Reference: legacy/trac#1656