Re: filename encodings and conversion failure

2022-12-29 Thread Branko Čibej

On 26.12.2022 22:26, Karl Berry wrote:

I certainly don't expect such fundamental behavior to change, but I
can't help but respond a little. Just ignore me :).

 All the world is not Unix
 ...
 the problem of cross-platform compatibility.
 
Of course. Which is precisely why storing filenames as bytes would, in
principle, be more portable than forcing any particular encoding, it seems
to me.

Well, my point is that this would not work everywhere. Blame it on
Subversion's insistence on cross-platform compatibility. The thing that
stores filenames as bytes is called something else. :)
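For illustration, the bytes-as-names model is easy to see on a POSIX
filesystem, where a filename is an opaque byte string and only `/` and NUL
are reserved. A minimal sketch (the filename is just an example):

```shell
# A filename is just bytes to the filesystem: create a file whose name
# is the raw UTF-8 byte sequence for "café". The kernel never validates
# the encoding; it only forbids '/' and NUL in a name component.
tmpdir=$(mktemp -d)
name=$(printf 'caf\303\251')
touch "$tmpdir/$name"
ls "$tmpdir"
rm -r "$tmpdir"
```

Whether `ls` renders that name readably depends entirely on the terminal's
locale, which is exactly the portability question being argued above.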


-- Brane

Re: Revision too big to dump?

2022-12-29 Thread Nico Kadel-Garcia
On Wed, Dec 28, 2022 at 4:56 PM Ash Rubigo  wrote:
>
> This seems to be solved. Well past the troublesome revision after about
> 10 hours and 70GB of virtual memory.

Son of a... Somebody committed disk images or a suite of DVD images?
The inability to split and discard such revisions is a long-standing
issue for a poorly handled Subversion repo.


> On 27/12/2022 22:02:53, Ash Rubigo wrote:
> > Thanks for the suggestion, Jeff.
> >
> > Increased the amount of virtual memory on Windows. Previously it was
> > allocated by the system, so not sure why it didn't keep increasing as
> > needed. Anyway, the upper limit is set to 100GB now, so we'll see if
> > that is sufficient. Currently it's using 40GB after a few hours of
> > dumping this one revision with no end in sight.
> >
> > On 27/12/2022 18:13:12, Jeffrey Walton wrote:
> >> On Tue, Dec 27, 2022 at 1:05 PM Ash Rubigo  wrote:
> >>>
> >>> I'm trying to extract projects from a large repository into their own
> >>> separate repositories.
> >>>
> >>> I understand to use `svnadmin dump`, then `svndumpfilter`, and finally
> >>> `svnadmin load`.
> >>>
> >>> Trouble is, during `svnadmin dump` one of the revisions results in an
> >>> 'out of memory' error. The revision in question is large, on the order
> >>> of 20GB, and I only have 16GB of RAM.
> >>>
> >>> I have tried incrementally dumping, but had the same issue.
> >>>
> >>> Is it really the case that if a revision is larger than the available
> >>> RAM it cannot be dumped?
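For reference, the dump/filter/load pipeline described above can be
sketched as follows. The project path `trunk/projectA` is hypothetical,
and the script builds a throwaway scratch repository (and skips itself
where Subversion is not installed), so it is safe to run as-is:

```shell
# Sketch of extracting one project: svnadmin dump -> svndumpfilter -> svnadmin load.
command -v svnadmin >/dev/null 2>&1 || exit 0   # skip where Subversion is absent
set -e
work=$(mktemp -d)
svnadmin create "$work/bigrepo"                 # stand-in for the real source repo

# 1. Dump the source repository to a dump stream. For a huge repository,
#    dump in revision ranges instead, e.g.:
#      svnadmin dump REPOS -r 0:1000 --incremental > part1.dump
svnadmin dump "$work/bigrepo" > "$work/full.dump"

# 2. Keep only the paths that belong to one project (path is hypothetical).
svndumpfilter include trunk/projectA < "$work/full.dump" > "$work/projectA.dump"

# 3. Load the filtered stream into a fresh repository.
svnadmin create "$work/projectA"
svnadmin load "$work/projectA" < "$work/projectA.dump"

rm -rf "$work"
```

Dumping in `-r` ranges with `--incremental` also lets you isolate (or work
around) a single troublesome revision rather than streaming the whole history
in one pass.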
> >>
> >> Just a thought, but it does not answer the question...
> >>
> >> Use GParted, and increase the swap file to, say, 64 GB. Then set
> >> swappiness to a low value, like 2. The low swappiness value will keep
> >> most stuff in RAM, and spill over to disk rarely. Most of the swap
> >> file will remain unused. But it should allow your dumps to proceed.
> >>
> >> Jeff
> >
>
> --
> Regards,
>
> Ash Rubigo
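For anyone following Jeff's suggestion on Linux, a file-based variant of
the same idea might look like this (GParted resizes a swap partition; a
swap file avoids repartitioning). Sizes and paths are illustrative, it
needs root and ~64 GB of free disk, and it is gated behind an environment
variable so nothing runs by accident:

```shell
# Gated sketch: set RUN_SWAP_SETUP=1 and run as root to actually apply it.
[ "${RUN_SWAP_SETUP:-0}" = "1" ] || exit 0

fallocate -l 64G /swapfile        # reserve 64 GB (illustrative size)
chmod 600 /swapfile               # swap files must not be world-readable
mkswap /swapfile                  # format it as swap
swapon /swapfile                  # enable it
sysctl -w vm.swappiness=2         # prefer RAM; spill to disk only under pressure
# To persist across reboots, add '/swapfile none swap sw 0 0' to /etc/fstab
# and 'vm.swappiness=2' to /etc/sysctl.conf.
```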