On 22 Sep 2022, at 21:59, Sean McBride <s...@rogue-research.com> wrote:
> Our svn repo is about 110 GB for a full checkout. Larger on the server of
> course, with all history, weighing about 142 GB.
>
> There haven't been any performance issues, it's working great.
>
> But now some users are interested in committing an additional 200 GB of
> mostly large binary files.
>
> I worry about it becoming "too big". At what point does that happen?
> Terabytes? Petabytes? 100s of GB?

In my experience it becomes too big when the underlying disk gets full. As long as your underlying disks can handle it, it works fine.

I use SVN for versioned incremental backups of files in the 0.5 GB range, and I've seen reports of others checking in multi-GB files as backups with no trouble.

The best thing to do is to physically try it: make a copy of your repo, try checking things into it, and see where your issues are.

Regards,

Graham
—