On Sat, 16 Jul 2005, Junio C Hamano wrote:

> [EMAIL PROTECTED] (Eric W. Biederman) writes:
>
>> Junio C Hamano <[EMAIL PROTECTED]> writes:
>>
>>> - Anonymous pull from packed archives on remote sites via
>>>   non-rsync, non-ssh transport. ...
>>
>> ... but we may also end up wanting something HTTP
>> reachable.  For this we need a cgi script that will generate
>> an appropriate pack.
>
> I agree that nothing would beat a pack customized for each
> puller from the bandwidth point of view.  I like the general
> idea of git-daemon Linus did and the cgi script you suggest, but
> I wonder what the CPU/disk load implications are for the server.
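the cgi idea quoted above could be as thin as the sketch below. to be clear, this is my own illustration, not anything that exists in git: the serve_pack helper, the want/have query format, and the repository path are all invented here. the client names the tip it wants plus a tip it already has, and the server streams back a pack covering the difference.

```shell
# Illustrative only: a "custom pack per puller" CGI could boil down
# to rev-list | pack-objects.  serve_pack is a made-up helper name.
serve_pack() {
    repo=$1     # server-side repository (hypothetical path)
    want=$2     # tip the client asks for
    have=$3     # tip the client already has (may be empty)
    # rev-list --objects lists every commit, tree and blob reachable
    # from $want but not from $have; pack-objects --stdout streams a
    # pack of exactly those objects to standard output.
    git -C "$repo" rev-list --objects "$want" ${have:+^"$have"} |
        git -C "$repo" pack-objects --stdout
}

# A CGI wrapper would parse want/have out of QUERY_STRING, print a
# Content-Type header and a blank line, then call serve_pack.
```

this is exactly where the CPU/disk load worry bites, of course: every puller triggers a fresh rev-list and pack-objects run on the server.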
I think you need to nail down the various scenarios that people will be
using here.

a very common one will be people who want to set up a cron job to update
their local tree nightly; in this case, having a pre-generated pack file
with each day's updates will save a significant amount of processing
power.
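the pre-generation itself could be a one-shot cron job along these lines. again just a sketch: the make_nightly_pack name, the last-head bookkeeping file, and the output layout are assumptions of mine, not a proposal from this thread.

```shell
# Hypothetical nightly cron job: pre-build one pack holding the
# day's new objects, so cron pullers download a single static file
# instead of making the server repack for every client.
make_nightly_pack() {
    repo=$1     # repository to pack from
    out=$2      # directory the web server exports
    new=$(git -C "$repo" rev-parse HEAD)
    old=""
    [ -f "$out/last-head" ] && old=$(cat "$out/last-head")
    # rev-list --objects lists everything reachable from the new tip
    # but not the old one; pack-objects reads that list on stdin and
    # writes $out/nightly-<sha1>.pack plus its .idx.
    git -C "$repo" rev-list --objects "$new" ${old:+^"$old"} |
        git -C "$repo" pack-objects "$out/nightly" >/dev/null
    echo "$new" >"$out/last-head"   # remember today's tip for tomorrow
}
```

run from crontab once a night, the server does the pack-objects work exactly once per day instead of once per puller.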
would it make sense to have it do something along the lines of sending the
day's pack file plus a small number of individual objects (even if the pack
file will partially duplicate objects the puller already has)?
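on the duplication point: receiving objects you already have costs bandwidth but nothing else, since git skips objects it already stores. a small demonstration (apply_daily_pack is a made-up name for the client side of the scheme above):

```shell
# Illustration that partially duplicated packs are harmless:
# unpack-objects explodes a pack into loose objects and silently
# skips any object the repository already contains, so applying a
# pack that overlaps what the puller has is safe.
apply_daily_pack() {
    repo=$1     # local repository being updated
    pack=$2     # the downloaded daily pack file
    git -C "$repo" unpack-objects -q <"$pack"
}
```

applying the very same pack twice succeeds; the second run finds every object is already present and stores nothing.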
David Lang
--
There are two ways of constructing a software design. One way is to make it so
simple that there are obviously no deficiencies. And the other way is to make
it so complicated that there are no obvious deficiencies.
-- C.A.R. Hoare