On Monday 29 January 2007 14:11, Neil Bothwick wrote:
> On Mon, 29 Jan 2007 11:50:34 +0200, Alan McKinnon wrote:
> > I already use a fairly complicated solution with emerge -pvf and
> > wget in a cron on one of the fileservers, but it's getting
> > cumbersome. And I'd rather not maintain an entire gentoo install on
> > a server simply to act as a proxy. Would I be right in saying that
> > I'd have to keep the "proxy" machine up to date to avoid the
> > inevitable blockers that will happen in short order if I don't?
> >
> > I've been looking into kashani's suggestion of http-replicator,
> > this might be a good interim solution till I can come up with
> > something better suited to our needs.
>
> I was suggesting the emerge -uDNf world in combination with
> http-replicator. The first request forces http-replicator to download
> the files; all other requests for those files are then handled
> locally.

OK, that does make more sense. It's what I first thought you meant but 
then I (stupidly) thought I'd assumed wrongly...

> So if you run this on a suitable cross-section of machines 
> overnight, http-replicator's cache will be primed by the time you
> stumble bleary-eyed into the office.

That has to be the most accurate description of my typical mornings I've 
ever read anywhere... :-)
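For my own notes, that scheme would presumably look something like this on
each client -- the hostname, port, and schedule below are my assumptions,
not anything Neil specified:

```shell
# /etc/portage/make.conf -- point Portage's fetches at the box running
# http-replicator (hostname "replicator.local" is an assumption)
http_proxy="http://replicator.local:8080"
ftp_proxy="http://replicator.local:8080"

# crontab entry -- fetch-only update overnight, so the first request
# primes the replicator's cache and every later client hits it locally
0 3 * * * emerge --quiet -uDNf world
```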

> If all your machines run a similar mix of software, say KDE desktops,
> you only need to run the cron task on one of them.

Um, that's the hard part. There's KDE, Gnome, Fluxbox, e17 - just for 
WMs. All machines are ~x86, but that's where the similarities end. I 
suppose I could set up a master machine whose world file is the union of 
all the clients'. But whatever I choose, the solution does not appear to 
be simple :-(
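Thinking out loud, the "combined world" idea might be as simple as merging
the clients' world files - a rough sketch, assuming each client's
/var/lib/portage/world has already been copied to one place (the hostnames
and atoms below are made up for illustration):

```shell
# Collect each client's /var/lib/portage/world as /tmp/worlds/<host>,
# then merge them into one de-duplicated master world file.
mkdir -p /tmp/worlds
printf 'app-editors/vim\nkde-base/kde\n'     > /tmp/worlds/hostA
printf 'app-editors/vim\ngnome-base/gnome\n' > /tmp/worlds/hostB

# sort -u merges the lists and drops duplicate atoms
sort -u /tmp/worlds/* > /tmp/master-world
cat /tmp/master-world
```

The master machine would then run the nightly fetch-only emerge against
that merged file, so the cache covers every client's mix.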


alan
-- 
gentoo-user@gentoo.org mailing list