Add Git repository containing lots of large files
I have been working on a Git repository that I'm using for running integration tests of metrics code bases. That repository contains libraries (to avoid downloading them from
dist.tp.o over and over), a given start state (Tor descriptors, files written in a previous execution), and expected results (CSV files, JSON files). Here are the current file sizes for testing two metrics code bases (metrics-web and Onionoo):
||Files||Total size (MiB)||# of files||
I'm currently hosting this repository at GitHub, but I'd like to move this over to Tor's Git server at some point. The total file size and possibly the number of files are what stop me right now. But the repository really belongs on the Tor server in some form.
Do we support Git large file storage or something similar? If so, how do I use it? (I have never used it before and could try one of the tutorials on the internet, but maybe there is something I should pay special attention to first.)
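For context on what using Git LFS would look like on my side: as far as I understand, after installing the client one runs `git lfs install` once per repository and then `git lfs track` for each pattern of large files, which records the tracking rules in `.gitattributes`. A sketch of what that file would contain, assuming (hypothetically) that the descriptors and a results tarball are the files worth tracking:

```
# .gitattributes as written by `git lfs track` -- patterns below are
# hypothetical examples, not the actual repository layout
descriptors/** filter=lfs diff=lfs merge=lfs -text
expected-results.tar.gz filter=lfs diff=lfs merge=lfs -text
```

The `.gitattributes` file itself is committed normally, so everyone cloning the repository picks up the same tracking rules.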
Is the number of expected results files going to be problematic? If so, I can probably tar them up and untar them on disk when running tests. Of course, the archive would then be a single large binary file, and whenever a single contained file changes, the whole tarball changes, too. What's the preference here?
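To illustrate the tarball approach, here is a minimal sketch of packing the expected results before committing and unpacking them when tests run. The directory name and file contents are made up for the example:

```shell
# Hypothetical expected-results directory with one sample file
mkdir -p expected-results
echo "sample" > expected-results/example.csv

# Pack everything into one tarball before committing,
# replacing many small files with a single binary file
tar -czf expected-results.tar.gz expected-results

# At test time, unpack the tarball back onto disk
rm -rf expected-results
tar -xzf expected-results.tar.gz
cat expected-results/example.csv
```

One mitigation for the "whole file changes" problem would be to skip the `-z` compression (or use GNU tar's `--sort=name` and a fixed `--mtime` for reproducible archives), which at least keeps byte-identical inputs producing byte-identical tarballs.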