Joey Hess writes ("Bug#720526: assumes I have bandwidth"):
> dgit downloads files multiple times in a typical fetch / build / push
> workflow.

I can see that this must be quite annoying.

> joey@gnu:~/tmp/alien[sid]>dgit push
> canonical suite name for unstable is sid
> downloading http://http.debian.net/debian//pool/main/a/alien/alien_8.88.dsc...
> downloading http://http.debian.net/debian//pool/main/a/alien/alien_8.88.dsc...
> 
> Surely the above redundant download can be optimised away.

Probably, yes, but the repeated downloading of tarballs and diffs is
IMO the bigger problem.
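
For the within-run duplication, remembering what has already been
fetched would probably be enough.  A rough sketch in Python (dgit
itself is Perl, so this is purely illustrative; fetch_once is a
made-up helper, not anything in dgit):

  import urllib.request

  _fetched = {}

  def fetch_once(url):
      # Download url at most once per run; later calls reuse the body.
      if url not in _fetched:
          with urllib.request.urlopen(url) as resp:
              _fetched[url] = resp.read()
      return _fetched[url]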

> So that's probably excusable. But then, dgit pull already downloaded
> these same source files, and seemingly threw them away after setting up
> the git repository.

Yes.

> Do you think it would make sense for dgit to cache these on its own,
> or will I need to throw a caching proxy in front of it to make it
> usable in bandwidth-constrained environments?

Well, personally I have it running through squid, but that has its own
annoyances.  Perhaps dgit should just leave the files that it got from
dget in the parent directory, although then you'd have to clean them
out by hand somehow, which would run the risk that you'd delete .orig
tarballs by mistake.
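
A dedicated cache directory would sidestep that hazard, since nothing
in it could be mistaken for a real .orig tarball.  Roughly, again in
illustrative Python (the cache path and helper name are assumptions
for the example, not anything dgit actually has):

  import hashlib
  import urllib.request
  from pathlib import Path

  CACHE_DIR = Path.home() / ".cache" / "dgit-downloads"  # assumed location

  def cached_fetch(url, sha256):
      # Return a local path for url, downloading only on a cache miss.
      CACHE_DIR.mkdir(parents=True, exist_ok=True)
      dest = CACHE_DIR / url.rsplit("/", 1)[-1]
      if dest.exists():
          if hashlib.sha256(dest.read_bytes()).hexdigest() == sha256:
              return dest  # verified hit, safe to reuse across runs
      with urllib.request.urlopen(url) as resp:
          dest.write_bytes(resp.read())
      return dest

Verifying against the checksums in the .dsc would make reuse safe
even after the archive moves on.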

> The caching proxy option is complicated by dgit push uploading files
> to the archive, which a later dgit push is apparently going to want
> to download back.

The archive has a lot of lag anyway :-/.

Ian.

