On 03.01.2018 21:13, Dave Huang wrote:
> On 1/3/2018 13:19, Nico Kadel-Garcia wrote:
>> NTFS also has limits on the maximum number of files for a filesystem.
>
> FWIW, that limit is 2^32 - 1 files, or approximately 4 billion (see
> Table 3.12 of https://technet.microsoft.com/en-us/library/cc938432.aspx).
>
>> There are also notable performance limits on having too many files in
>> a directory.
>
> I don't know enough about NTFS internals to say whether that's the
> case with NTFS or not, but in the context of this discussion, the
> default SVN shard size is 1000 revisions, which I don't think could be
> considered "too many files in a directory".
>
> While I don't have any actual numbers, my gut feeling is that packing
> the repo isn't really needed on NTFS in terms of day-to-day
> performance of clients using the repo, or in terms of filesystem
> limitations. That said, I do it anyway because backing up the repo
> does go faster when dealing with a couple dozen large files vs. tens
> of thousands of small files.
And there's your answer to the question of day-to-day performance of the
repository: Subversion also has to open many files instead of just one when
it's reading historical revisions, so packing will definitely reduce the
number of directory lookups and file opens (the latter are notoriously slow
on Windows). Depending on usage patterns, the performance boost may be
significant.

-- Brane
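To make the "many files vs. one file" point concrete, here is a minimal
sketch of the FSFS on-disk layout being discussed, assuming the default
shard size of 1000 mentioned above; the revision number used is purely
illustrative:

```python
# Sketch of FSFS revision paths, unpacked vs. packed (default shard
# size of 1000 assumed; the revision number is illustrative).
SHARD_SIZE = 1000

def unpacked_path(rev: int) -> str:
    # Unpacked: one file per revision, grouped into shard directories.
    return f"db/revs/{rev // SHARD_SIZE}/{rev}"

def packed_path(rev: int) -> str:
    # Packed: all revisions of a full shard live in a single pack file,
    # so reading any of those 1000 revisions opens the same one file.
    return f"db/revs/{rev // SHARD_SIZE}.pack/pack"

print(unpacked_path(42500))  # -> db/revs/42/42500
print(packed_path(42500))    # -> db/revs/42.pack/pack
```

Every revision in a packed shard maps to the same file, which is why both
backups and historical reads touch far fewer directory entries after
`svnadmin pack`.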