On Mon, Nov 05, 2007 at 10:51:24AM +0100, Guido Guenther wrote:
> On Mon, Nov 05, 2007 at 07:35:08AM +0100, Mike Hommey wrote:
> > On Sun, Nov 04, 2007 at 03:37:28PM +0100, Guido Guenther wrote:
> > > Hi Mike,
> > > On Fri, Nov 02, 2007 at 09:46:48PM +0100, Mike Hommey wrote:
> > > > Well, everything is in the subject. One of the culprits is copy_from(),
> > > > which reads/writes the whole tree once while that is far from necessary.
> > > What would you recommend? Using hardlinks? We can't untar right into the
> > > git repo since we might need to mangle/filter files. Any suggestions are
> > > very welcome.
> > 
> > Why do you need to mangle/filter files?
> To get the exact directory names, but this can probably be achieved with
> a combination of --strip and -C - I'll have a look.

Or you can simply feed git-fast-import with data from the tar stream
(you could use Python's tarfile for that).
That would definitely be the fastest, since it would save a lot of
writes (git-fast-import won't write objects that already exist in the
repo).
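
For what it's worth, here is a rough sketch of what that could look
like. The tarball name, branch and committer identity below are made
up, and this is not gbp's actual code path, just one way to drive the
fast-import stream format from tarfile:

    #!/usr/bin/env python
    # Sketch: stream a tarball straight into git-fast-import without
    # unpacking it into the working tree first.
    import subprocess
    import tarfile
    import time

    TARBALL = "foo_1.0.orig.tar.gz"    # hypothetical input tarball
    BRANCH = "refs/heads/upstream"     # hypothetical target branch

    fi = subprocess.Popen(["git", "fast-import", "--quiet"],
                          stdin=subprocess.PIPE)
    out = fi.stdin

    def emit(s):
        # git-fast-import expects a byte stream on stdin
        out.write(s.encode() if isinstance(s, str) else s)

    tar = tarfile.open(TARBALL)
    mark = 0
    files = []
    for member in tar:
        if not member.isfile():
            continue
        data = tar.extractfile(member).read()
        mark += 1
        # One blob per regular file; fast-import skips objects that
        # already exist in the repo, so unchanged files cost little.
        emit("blob\nmark :%d\ndata %d\n" % (mark, len(data)))
        emit(data)
        emit("\n")
        # Drop the leading "package-version/" component, which is what
        # --strip/-C would otherwise be needed for.
        path = member.name.split("/", 1)[-1]
        mode = "100755" if member.mode & 0o100 else "100644"
        files.append((mode, mark, path))

    now = int(time.time())
    msg = "Import %s" % TARBALL
    emit("commit %s\n" % BRANCH)
    emit("committer Import Script <import@example.com> %d +0000\n" % now)
    emit("data %d\n%s\n" % (len(msg), msg))
    # Start from an empty tree so the commit matches the tarball exactly.
    emit("deleteall\n")
    for mode, m, path in files:
        emit("M %s :%d %s\n" % (mode, m, path))
    emit("\n")

    out.close()
    fi.wait()

Obviously any filtering/mangling of file names or contents could be
done on the tarfile members before they are emitted as blobs.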

Mike


