Stefan Sperling wrote:
> > After the 15000th commit, the size of the repository on disk is 5.5G
> > with the working directory size being 120M. Besides, after several
> > thousand commits to this directory SVN slows down considerably. This
> > must be some design flaw (or peculiarity if you like)
Johan Corveleyn wrote:
[dd]
> But that doesn't explain why the resulting repository is so large
> (compared to the original CVS repository). Sure, there might be memory
> usage problems in dump/load (it uses more memory than the resulting
> repository uses diskspace), but I think there is more going on.
On Thu, Jan 20, 2011 at 6:11 PM, Daniel Shahaf wrote:
> Victor Sudakov wrote on Thu, Jan 20, 2011 at 14:18:00 +0600:
>> Colleagues,
>>
>> I have finally completed a test cvs2svn conversion on an amd64 system.
>> The peak memory requirement of svnadmin during the conversion was
>> 9796M SIZE, 1880M RES. The resulting SVN repo size is 8.5G on disk.
That's not a nice result, but I think I said somewhere in this thread
that there are known memory-usage bugs in svnadmin dump/load. Which
means the fix (as opposed to 'workaround') to this issue is to have
someone (possibly you or someone you hire) look into those bugs.
With a bit of luck, this w
Victor Sudakov wrote on Thu, Jan 20, 2011 at 14:18:00 +0600:
Subject: Re: Betr.: Re: "svnadmin load" a huge file
Colleagues,
I have finally completed a test cvs2svn conversion on an amd64 system.
The peak memory requirement of svnadmin during the conversion was
9796M SIZE, 1880M RES. The resulting SVN repo size is 8.5G on disk.
"svnadmin dump --deltas" of this new SVN repo required 6692M SIZE,
2161M RES of memory.
> On 1/7/2011 7:57 AM, Victor Sudakov wrote:
> > It would be fine if the project in question did not contain almost all
> > the files in one directory. You may call the layout silly, but CVS does
> > not seem to mind. OTOH, I would have distributed the files over
> > several subdirectories, but CVS
On 01/07/2011 08:38 PM, Victor Sudakov wrote:
> Daniel Shahaf wrote:
>> I don't know cvs2svn, but it could have a --sharded-output option, so eg
>> it would produce a dumpfile per 1000 revisions, rather than one huge
>> dumpfile.
>
> cvs2svn-2.3.0_2 does not seem to have such an option:
> "cvs2svn
Kevin Grover wrote:
[dd]
> 2) Don't use '--dumpfile' on cvs2svn, let cvs2svn load it into a subversion
> repo directly.
It did not make any difference. Frankly speaking, I would be
surprised if it did.
Starting Subversion r10773 / 23520
Starting Subversion r10774 / 23520
[dd]
On Thu, Dec 30, 2010 at 19:07, Victor Sudakov wrote:
> Colleagues,
>
> I have a CVS repository sized 54M with 17751 files.
>
> "cvs2svn --dumpfile" produces a dump sized 13G. svnadmin cannot load
> this dump aborting with an out of memory condition on a FreeBSD
> 8.1-RELEASE box with 1G of RAM and
Les Mikesell wrote:
[dd]
> > Does it mean that on a 32bit OS I am stuck hopelessly? A dump/load
> > cycle will eventually fail as the repository grows beyond a certain
> > size?
>
> A 'real' svnadmin dump would let you specify revision ranges so you
> could do it incrementally but cvs2svn doesn't.
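Les's incremental-dump idea can be sketched as follows. The repository path and chunk size below are illustrative assumptions, not details from the thread; the point is that `svnadmin dump -r START:END --incremental` writes each revision range as deltas against the previous one, so no single svnadmin run has to hold the whole history:

```python
# A sketch of dumping a (possibly bloated) SVN repository in bounded
# revision ranges.  Repo path and chunk size are hypothetical.

def dump_commands(youngest, chunk=1000, repo="/srv/svn/repo"):
    """Generate svnadmin dump commands covering revisions 0..youngest."""
    cmds = []
    start = 0
    while start <= youngest:
        end = min(start + chunk - 1, youngest)
        # --incremental emits each range as deltas, not full trees,
        # so the parts can be loaded back in order into a fresh repo.
        cmds.append(
            f"svnadmin dump {repo} -r {start}:{end} --incremental "
            f"> part-{start:06d}.dump"
        )
        start = end + 1
    return cmds
```

Each resulting part would then be fed to `svnadmin load` against the new repository in order, which keeps the peak memory of any single svnadmin invocation bounded by the chunk size rather than the full history.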
On Sat, Jan 8, 2011 at 4:33 AM, Victor Sudakov wrote:
>> >I ran "svnadmin load" on a machine with 1 GB RAM and 25 GB swap (added
>> >so much swap specially for the occasion). svnadmin crashed after
>> >reaching the SIZE about 2.5 GB.
>> >
>> >Is 1 GB RAM and 25 GB swap not enough?
>>
>> If it is
Victor Sudakov wrote on Sat, Jan 08, 2011 at 01:38:00 +0600:
> Daniel Shahaf wrote:
>
> [dd]
>
> >
> > I believe there are known issues with memory usage in svnadmin. See the
> > issue tracker.
>
> Namely?
>
Search for 'svnadmin' and you should find it.
> > I don't know cvs2svn, but it could have a --sharded-output option, so eg
> > it would produce a dumpfile per 1000 revisions, rather than one huge
> > dumpfile.
Johan Corveleyn wrote:
>
> Like Stephen Connolly suggested a week ago: I think you should take a
> look at svndumptool: http://svn.borg.ch/svndumptool/
>
> I've never used it myself, but in the README.txt file, there is
> mention of a subcommand "split":
I am already trying it but it turns out n
On 1/7/2011 1:31 PM, Victor Sudakov wrote:
I don't think you are hitting some absolute limit in the software here,
just running out of RAM on your particular machine. Can you do the
conversion on a machine with more RAM?
I ran "svnadmin load" on a machine with 1 GB RAM and 25 GB swap (added
so much swap specially for the occasion). svnadmin crashed after
reaching the SIZE about 2.5 GB.
Brian Brophy wrote:
> >> I migrated a large CVS repository (25-50 GB) to SVN years ago on SVN
> >> 1.3. [dd]
Fair enough, the same pattern is still applicable. For example, in our
CVS repo what separated one "project" from another was basically a
root-level folder.
In kind, you could similarly use cvs2svn to "chunk/dump" subdirectories
at a time.
For example, if in CVS you have something like:
/Fo
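Brian's directory-at-a-time chunking could be scripted roughly like this. The project names, paths, and dump directory are hypothetical, and the exact cvs2svn flags should be checked against `cvs2svn --help` on your version; the sketch only assumes cvs2svn can convert one CVS subdirectory per invocation:

```python
# A sketch of converting one top-level CVS directory at a time, so each
# cvs2svn/svnadmin run handles a small dump.  All names are illustrative.

def per_project_dumps(cvs_root, projects, dump_dir="/var/tmp/dumps"):
    """Build one cvs2svn command per top-level CVS project directory."""
    return [
        f"cvs2svn --dumpfile={dump_dir}/{name}.dump {cvs_root}/{name}"
        for name in projects
    ]
```

Each smaller dump can then be loaded independently (into one repository or several), so the per-run memory footprint tracks the largest project rather than the whole repository.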
> On 31 Dec 2010 15:10, "Victor Sudakov" wrote:
> > Daniel Shahaf wrote:
> >> (or dive into the source and help us plug that memory leak --- compile
> >> with APR pool debugging enabled)
> >
> > I will try to do that but unfortunately I need some immediate
> > workaround :(
> >
Thanks. I was ref
Google is your friend: svndumptool
You might need to append a .py
Also if this is a _top post_ it's the phone what done it... Haven't
figured out how to control where it puts the reply
- Stephen
---
Sent from my Android phone, so random spelling mistakes, random nonsense
words and other nonsense.
I migrated a large CVS repository (25-50 GB) to SVN years ago on SVN
1.3. Our repo had many sections (projects) within it. We had to
migrate each project independently so that its team could coordinate
when they migrated to SVN. As such, I dumped each project when ready
and then svnadmin loaded each.
Daniel Shahaf wrote:
> Split the dumpfile to smaller dumpfiles
How do I do that? I have not found such an option in cvs2svn.
I don't mind writing a script if I knew the idea how to split the dump.
I haven't found any "svnadmin load" option to import part of a dump
either. man what?
> or try a newer version of svnadmin.
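The "idea how to split the dump" is roughly this: an SVN dumpfile is a linear stream of revision records, each beginning with a `Revision-number:` line, preceded by a short header that every chunk must repeat for `svnadmin load` to accept it. A minimal sketch (simplified; svndumptool's `split` subcommand does this properly, including binary-safe handling):

```python
# A sketch of splitting SVN dumpfile lines at revision boundaries.
# The chunk size is illustrative; real dumps contain binary content,
# so a production splitter must work on bytes, not text lines.

def split_dump(lines, revs_per_chunk=1000):
    """Split dumpfile lines into chunks of at most revs_per_chunk revisions.

    Returns a list of chunks, each a list of lines with the dump header
    repeated at the front.
    """
    header = []
    chunks = []
    current = None
    rev_count = 0
    for line in lines:
        if line.startswith("Revision-number:"):
            if current is None or rev_count == revs_per_chunk:
                current = list(header)   # each chunk repeats the header
                chunks.append(current)
                rev_count = 0
            rev_count += 1
        if current is None:
            header.append(line)          # still inside the dump header
        else:
            current.append(line)
    return chunks
```

The chunks must then be loaded into the same repository in order, one `svnadmin load` run per chunk, so no single run has to keep the whole 13G stream's state in memory.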
Split the dumpfile to smaller dumpfiles or try a newer version of svnadmin.
(or dive into the source and help us plug that memory leak --- compile
with APR pool debugging enabled)
Victor Sudakov wrote on Fri, Dec 31, 2010 at 09:07:32 +0600:
> Colleagues,
>
> I have a CVS repository sized 54M with 17751 files.