On Thu, Sep 22, 2022 at 3:59 PM Sean McBride <s...@rogue-research.com> wrote:
>
> Our svn repo is about 110 GB for a full checkout. Larger on the server of
> course, with all history, weighing about 142 GB.
>
> There haven't been any performance issues, it's working great.
>
> But now some users are interested in committing an additional 200 GB of 
> mostly large binary files.
>
> I worry about it becoming "too big".  At what point does that happen?  
> Terabytes?  Petabytes?  100s of GB?

I've never encountered a problem with "too big," but I have
encountered problems with binary file types causing an SVN client or
server to hang. I experienced it back in 2012 or 2013 on a very large
collection of repos. I tried to check out/clone and the operation
would hang about 6 or 8 hours in.

Through trial and error we discovered a developer had checked in
object files from an Xcode build, and the SVN client or server would
hang on the object files. I don't recall if it was all object files
or just a particular one. As an added twist, I think we were using
TortoiseSVN on Windows, so it may have been a bad interaction with
that client. Once we manually deleted the object files, the
check-out/clone proceeded.

I don't know if that would happen nowadays.
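
For anyone auditing a repo for the same problem today, a quick scan of
the full repository listing will turn up committed build artifacts.
Here's a minimal Python sketch; it only assumes the `svn` CLI is on
PATH, and the repository URL is a hypothetical placeholder:

    #!/usr/bin/env python3
    # Minimal sketch: list every versioned path in an SVN repository
    # and flag likely build artifacts. The URL is a placeholder.
    import subprocess

    REPO_URL = "https://svn.example.com/repos/project/trunk"  # placeholder

    # "svn list -R" prints every versioned path, one per line.
    paths = subprocess.run(
        ["svn", "list", "-R", REPO_URL],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    # File suffixes that generally should not be under version control.
    SUSPECT_SUFFIXES = (".o", ".obj", ".a", ".dylib")

    for path in paths:
        if path.endswith(SUSPECT_SUFFIXES):
            print(path)

Each hit can then be removed with "svn delete", or kept out in the
first place with svn:ignore (or the inherited svn:global-ignores
property in SVN 1.8 and later).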

Jeff
