study trac.torproject.org archival possibilities
This is a split out of legacy/trac#30857 (moved) to discuss specifically whether and how to archive trac.torproject.org.
As mentioned in that ticket, there are a few options for how to deal with Trac, provided we have another system we want to use:
1. the golden redirect set: every migrated ticket and wiki page has a corresponding ticket/wiki page in GitLab, and a gigantic set of redirection rules makes sure they are mapped correctly (see the sketch after this list). Probably impractical, but possibly solves the maintenance problem forever.
2. read-only Trac: user creation is disabled and existing users are locked out of making any change to the site. Only a temporary or intermediate measure.
3. fossilization: Trac is turned into a static HTML site that can be mirrored like any other site. Can be a long-term solution, and a good compromise compared with a set of redirection rules that may be impossible to design completely and therefore bound to fail (because incomplete).
4. destruction: we hate the web, pretend link rot is not a problem, and just get rid of the old site, assuming everything is migrated and people will find their stuff eventually. Probably not an option.
5. redirect to the Wayback Machine: like fossilization, but delegate to the Internet Archive and hope for the best.
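For the record, option 1 would almost certainly have to be generated from the migration's own mapping data rather than written by hand. Here is a minimal sketch of what that generation could look like, assuming a hypothetical `trac_to_gitlab.csv` export (Trac ticket ID → GitLab issue URL) and an nginx map-style redirect file; the file names and URL layout are assumptions, not actual migration output.

```python
#!/usr/bin/env python3
"""Sketch for option 1: generate an nginx map file that redirects
Trac ticket URLs (/ticket/<id>) to their migrated GitLab issues.

Assumes a hypothetical trac_to_gitlab.csv produced by the migration,
with rows like:  30857,https://gitlab.torproject.org/.../issues/30857
"""
import csv

MAPPING_CSV = "trac_to_gitlab.csv"   # hypothetical migration export
OUTPUT_MAP = "trac_redirects.map"    # included from the nginx config

with open(MAPPING_CSV, newline="") as src, open(OUTPUT_MAP, "w") as dst:
    for trac_id, gitlab_url in csv.reader(src):
        # one "key value;" line per ticket, as expected inside an nginx map block
        dst.write(f"/ticket/{trac_id} {gitlab_url};\n")
```

The generated file could then be pulled in with something like `map $uri $gitlab_target { include trac_redirects.map; }` plus a `return 301 $gitlab_target;` rule. Wiki pages would need a second, much harder to automate, mapping, which is part of why this option is probably impractical.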
== Archive team work
Wearing my Archive Team hat, I was able to coordinate a first archival of the website during the summer of 2019, as documented in legacy/trac#30857 (moved). This is an attempt at doing "3. fossilization".
All those jobs end up populating the Wayback Machine at web.archive.org, but are also available as WARC files, a standard archival format for web pages.
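As an aside, WARC files can be inspected offline with standard tooling. Here is a minimal sketch using the warcio Python library; the local file name is an assumption, any of the WARC files from the jobs below would do.

```python
from warcio.archiveiterator import ArchiveIterator

# hypothetical local copy of one of the WARC files from the ArchiveBot jobs
warc_path = "trac.torproject.org.warc.gz"

with open(warc_path, "rb") as stream:
    for record in ArchiveIterator(stream):
        # "response" records hold the archived HTTP responses, i.e. the pages themselves
        if record.rec_type == "response":
            url = record.rec_headers.get_header("WARC-Target-URI")
            status = record.http_headers.get_statuscode()
            print(status, url)
```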
A first archival of all tickets up to legacy/trac#30856 (moved) has been performed here:
https://archive.fart.website/archivebot/viewer/job/5vytc
It's about 600MB of compressed HTML.
Then a full archival job of the entire site was performed here:
https://archive.fart.website/archivebot/viewer/job/bpu6j
It created about 10GB of compressed WARC files, crawled over 730,000 links (including external sites linked from Trac), downloaded 105.4GiB of data, and took over 5 days:
```
2019-06-17 01:49:02,514 - wpull.application.tasks.stats - INFO - Duration: 5 days, 7:32:55. Speed: 0.0 B/s.
2019-06-17 01:49:02,514 - wpull.application.tasks.stats - INFO - Downloaded: 732488 files, 105.4 GiB.
```
== Other statistics
Archiving the server itself means dealing with:
- ~1GB of attachments
- 4GB PostgreSQL database
The actual server uses around 25GB of disk space because of random junk here and there, but those ~5GB of attachments and database are the very minimum it can be trimmed down to. Naturally, we can keep that data forever; the problem is keeping the app running on top of it... Keeping only the data would be some incarnation of "4. destruction".