  • #4673
Closed (moved) (moved)
Open
Created Dec 08, 2011 by Karsten Loesing@karsten

Partition the 67,000,000 row statusentry table in the metrics database

Thomas Termin suggested splitting the huge statusentry table in the metrics database into multiple tables to solve some of our metrics website/database performance problems. Tim Wilde chimed in, saying they already do this for large tables, partitioning down to the hour level.

There was some more discussion about splitting the whole table, which covers 3+ years of data, into 36+ month tables and adding a new month table every month. Another suggestion was to move old data into a history table of some kind using a cron-job-like stored procedure. I later saw Tim explain an approach using a year table, month tables, and so on down to hour tables, with the application deciding which table(s) to query. Basically, there was some discussion about whether to do the splitting and merging in the database or in the application.
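To make the month-table idea a bit more concrete, here is a minimal sketch using the table-inheritance partitioning mechanism described in the PostgreSQL documentation of that era (CHECK constraints on child tables plus an insert trigger on the parent). The validafter and fingerprint columns are assumptions for illustration only, not the actual statusentry schema.

```sql
-- Parent table; column names are hypothetical placeholders.
CREATE TABLE statusentry (
    validafter  TIMESTAMP WITHOUT TIME ZONE NOT NULL,
    fingerprint CHARACTER(40) NOT NULL
    -- ... remaining columns ...
);

-- One child table per month, constrained to its date range.
CREATE TABLE statusentry_2011_12 (
    CHECK (validafter >= '2011-12-01' AND validafter < '2012-01-01')
) INHERITS (statusentry);

CREATE INDEX statusentry_2011_12_validafter
    ON statusentry_2011_12 (validafter);

-- Trigger function that routes inserts on the parent to the right child.
CREATE OR REPLACE FUNCTION statusentry_insert_trigger()
RETURNS TRIGGER AS $$
BEGIN
    IF NEW.validafter >= '2011-12-01' AND NEW.validafter < '2012-01-01' THEN
        INSERT INTO statusentry_2011_12 VALUES (NEW.*);
    ELSE
        RAISE EXCEPTION 'No partition for validafter %', NEW.validafter;
    END IF;
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER insert_statusentry_trigger
    BEFORE INSERT ON statusentry
    FOR EACH ROW EXECUTE PROCEDURE statusentry_insert_trigger();
```

Adding a new month would then mean creating another child table with the matching CHECK constraint and extending the trigger, which is exactly the kind of job a monthly cron job or stored procedure could take over.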

A good next step would be to look at the PostgreSQL documentation on partitioning tables. I'm also going to look more closely at the schema and the Java code to find the places that would be affected by such a change.
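For reference, once the child tables exist, the planner can skip non-matching months when constraint exclusion is enabled. A hedged example query, again assuming the hypothetical validafter column from the sketch above:

```sql
-- Enable partition pruning via CHECK constraints.
SET constraint_exclusion = partition;

-- A query restricted to one week should only scan the December 2011
-- child table rather than all 36+ month tables.
EXPLAIN
SELECT COUNT(*)
  FROM statusentry
 WHERE validafter >= '2011-12-01'
   AND validafter <  '2011-12-08';
```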
