Hi,
On 12/19/24 02:27, Simon Josefsson wrote:
> Bruno Haible via Gnulib discussion list <bug-gnulib@gnu.org> writes:
>> Hi,
>>> we have been using downloads from cgit
>>> on Savannah (https://git.savannah.gnu.org/cgit/gnulib.git, for instance,
>>> https://git.savannah.gnu.org/cgit/gnulib.git/snapshot/gnulib-d271f86.tar.gz),
>>> to get specific .tar.gz files of particular revisions of Gnulib.
>> 'git' is the protocol that was designed for this purpose, and has
>> the maximum efficiency (when you use it with --depth=1). So, that
>> is the protocol that you should recommend.
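(For concreteness, the two download styles being compared look roughly like
this; the commit id is just the example above, and I'm assuming the usual
Savannah clone URL:

  # dynamically generated snapshot tarball for a pinned commit, via cgit
  wget https://git.savannah.gnu.org/cgit/gnulib.git/snapshot/gnulib-d271f86.tar.gz

  # shallow clone over the git protocol; only the current tip is transferred
  git clone --depth=1 https://git.savannah.gnu.org/git/gnulib.git

Whether an arbitrary older commit can be fetched shallowly by hash depends on
what the server is configured to allow.)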
> But 'git' is not designed for transferring a serialized copy of the
> repository, and getting anything serialized and reproducible out of git
> is difficult and inefficient. While I also believe most people should
> use 'git' to download gnulib, I would rather have people use a tarball
> snapshot from https://ftp.gnu.org/gnu/gnulib (which could be PGP-signed)
> than some dynamically generated tarball from one of Savannah's
> web-based interfaces, which could be modified at any time (even on a
> per-IP basis) and is not protected in transit beyond HTTPS.
>
> 'git archive' is the best thing I know of with serialized/reproducible-ish
> output from Git.
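For reference, a minimal sketch of that approach, reusing the example commit
from above and run from inside a clone:

  # build a tarball for one specific revision; git archive takes the file
  # timestamps from the commit itself, so the tar contents are stable
  # across runs, modulo the compression layer
  git archive --format=tar.gz --prefix=gnulib-d271f86/ \
      -o ../gnulib-d271f86.tar.gz d271f86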
Having gnulib tarballs would be quite ideal - but is it feasible for every
commit?
FWIW, I don't really like dynamically generated tarballs either. In fact...
> Could live-bootstrap start to use git cloning? Maybe we can win this
> particular example, but I suspect the question will come back again.
Without going into our technical structure - yes and no. However, we are actively moving away from dynamically
generated tarballs (to making them ourselves from Git clones), not just for gnulib, but for everything where we
consume a Git snapshot. We can get past this issue ourselves. That said, at this point gnulib effectively has a
dependency on Git in order to be used -- how problematic that is depends on how you feel about it.
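To sketch what "making them ourselves" could look like (the clone URL and
commit id are illustrative, not our actual scripts):

  # pin a clone to a known commit and build a normalized tarball from it
  git clone https://git.savannah.gnu.org/git/gnulib.git
  cd gnulib
  git archive --prefix=gnulib-d271f86/ d271f86 | gzip -n > ../gnulib-d271f86.tar.gz
  # record the checksum so the artifact can be verified later
  sha256sum ../gnulib-d271f86.tar.gz

gzip -n avoids embedding a name/timestamp, so repeated runs are more likely
to match byte-for-byte.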
Samuel (fosslinux)