  • #12909

Closed
Open
Created Aug 20, 2014 by Trac@tracbot

stem.util.str_tools.get_size_label() insufficiently precise

There is a known issue in Arm that causes AccountingMax sizes to be displayed with insufficient precision. For example, a real AccountingMax of 1850 GB will be displayed as 1 TB.

https://trac.torproject.org/projects/tor/ticket/12452

This is confusing, especially for operators worried about large bandwidth overage charges.

This appears to be an issue with stem.util.str_tools.get_size_label(), which is mostly a wrapper around stem.util.str_tools._get_label(). These functions format and return the string that is then displayed:

(the call in arm)

https://gitweb.torproject.org/arm.git/blob/ac7923e31f52d3cf51b538ddf799162d67c04ecc:/arm/graphing/bandwidth_stats.py#l504

(the called Stem function)

https://gitweb.torproject.org/stem.git/blob/6c78d9acf9606eb6a93712cfe115a40c45504c70:/stem/util/str_tools.py#l134

Here's the problem reproduced against the master branch of Stem (version 0.1.2.3, I believe), cloned on 2014-08-20. 1850000000000 is 1850 GB converted to bytes:

>>> from stem.util.str_tools import get_size_label
>>> get_size_label(1850000000000)
'1 TB'
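For reference, the truncating conversion that produces this can be reproduced with a minimal standalone sketch. This is a simplification for illustration, not stem's actual _get_label() implementation; it assumes binary (1024-based) units, which is what the linked source appears to use:

```python
def size_label(byte_count, decimal=0):
    """Format a byte count with the largest fitting unit, truncating
    (not rounding) to the requested number of decimal places."""

    units = [(1024 ** 4, 'TB'), (1024 ** 3, 'GB'),
             (1024 ** 2, 'MB'), (1024, 'KB'), (1, 'B')]

    for size, label in units:
        if byte_count >= size:
            value = byte_count / float(size)
            # truncation is what discards the .68 below, not rounding
            truncated = int(value * 10 ** decimal) / float(10 ** decimal)
            return '%.*f %s' % (decimal, truncated, label)

    return '0 B'

print(size_label(1850000000000))     # '1 TB'  -- the reported behavior
print(size_label(1850000000000, 2))  # '1.68 TB'
```

With zero decimal places the 1.68 TB value collapses to '1 TB', matching the report. If arm passed a decimal-places argument through to get_size_label() (the linked str_tools.py source suggests one exists), the displayed value would retain enough precision to be useful.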

Trac:
Username: mmcc
