On Tue, 23-09-2003 at 22:17, Osamu Aoki wrote:
> Hi, my neibour :-)
>
> On Tue, Sep 23, 2003 at 06:02:08PM +0000, Benedict Verheyen wrote:
> > i have a server and it has a quite extensive sources.list. Now i would
> > like to make a cron job that does "apt-get update" every so often and
> > then use that cache for my connected client so i don't have to download
> > the packages list again and thus use unnecessary bandwidth.
>
> 1) Why do "apt-get update", do "apt-get update; apt-get -d upgrade"
> 2) web cache ==> use squid
> 3) Always install not by CRON.
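For reference, the cron half of that suggestion could look something like this (the schedule and file name are just an example; the point of `-d` is that apt-get only downloads the upgraded packages into its cache, so the actual install still happens by hand):

```
# /etc/cron.d/apt-fetch -- example only, adjust timing to taste.
# Every night at 04:00: refresh the package lists, then download
# (but do not install) any pending upgrades.
0 4 * * * root apt-get update && apt-get -d -y upgrade
```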
Well, the basic idea is to save bandwidth and have only one computer fetch the package lists. As other people have suggested, apt-proxy seems to be able to do that.

> But why have quite extensive sources.list for server. You should be
> running STABLE ! for server.

I know. I like living on the edge :) and it's not a production server. But I know I should run stable.

> > In the future i plan to make packages of my own scripts. Can i combine
> > those? I mean putting my packages in the cache of packages on the server
> > and thus also have them available for the pc's connected to my LAN?
>
> You can do this in many ways. But I found running squid to be most
> simple and flexible.

Another suggestion posted here was to make my own repository with dpkg-scanpackages and add it to the sources.list of the server.

Thanks all for the suggestions!

Benedict
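If squid is the route taken, the clients only need to be told to fetch through it. Assuming squid runs on the server on its default port 3128 (the hostname below is hypothetical), a line like this in /etc/apt/apt.conf on each client should be enough:

```
// /etc/apt/apt.conf on the clients -- sketch, hostname is an example.
// All apt HTTP fetches then go through the squid cache on the server,
// so package lists and .debs are only downloaded from the mirror once.
Acquire::http::Proxy "http://server.example.lan:3128/";
```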
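For the local-repository idea, a minimal sketch might look like the following (directory and package names are just examples; dpkg-scanpackages ships in the dpkg-dev package, and older versions require the override-file argument, hence the /dev/null):

```
# Put your own .deb files in a directory and generate a Packages index:
mkdir -p /var/local/debs
cp mypackage_1.0_all.deb /var/local/debs/
cd /var/local/debs
dpkg-scanpackages . /dev/null | gzip -9 > Packages.gz

# Then point sources.list at it (the trailing ./ matters):
#   deb file:/var/local/debs ./
# and run apt-get update; the packages are now installable with apt-get.
```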