On Tue, Jan 08, 2013 at 11:15:18PM +0000, Karl Berry wrote:
>     download all relevant sources
>
> One of the original conceptions for gsrc was to have one huge tarball
> with all current sources that could be downloaded.  Perhaps that idea
> could be revived now that there's been a request for it :).  Brandon, wdyt?
>
> k
If one is connected to the internet well enough to download a two-gigabyte tarball, he or she could just as well run this script:

#!/bin/bash
set -x
# simple script to download (ONLY) all package files available through gsrc,
# up to date to this very minute,
# skipping packages one has already downloaded.
#
# first, install gsrc.
# then cd to the gsrc directory.
# then run this script you are currently reading.
#
# directories with stuff to download: deps gnome gnu gnustep alpha
ls -1 deps | xargs -I NAME echo make -C deps/NAME fetch | bash
ls -1 gnome | xargs -I NAME echo make -C gnome/NAME fetch | bash
ls -1 gnu | xargs -I NAME echo make -C gnu/NAME fetch | bash
ls -1 gnustep | xargs -I NAME echo make -C gnustep/NAME fetch | bash
ls -1 alpha | xargs -I NAME echo make -C alpha/NAME fetch | bash
# This script is in the public domain. It is too simple to copyright.

There are many good reasons why this is preferable to bothering to generate a giant tarball, but it would be too boring to type them out when you can probably think of them yourself.
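For what it's worth, the five ls|xargs lines collapse into one loop. This is only a sketch of the same idea, not tested against a real gsrc checkout; the category names are the ones from the script above, and it prints the make commands rather than running them:

```shell
#!/bin/sh
# Same idea as the script above, as a single loop over the categories.
list_fetch_commands() {
    for category in deps gnome gnu gnustep alpha; do
        [ -d "$category" ] || continue      # skip categories missing here
        for pkg in "$category"/*/; do
            [ -d "$pkg" ] || continue       # glob may match nothing
            echo make -C "${pkg%/}" fetch   # print, don't run
        done
    done
}
# Pipe the printed commands to a shell to actually fetch, e.g.:
#   sh this-script.sh | bash
list_fetch_commands
```

Printing first and piping to a shell keeps the same "dry run by default" flavor as the original xargs|echo|bash trick.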
