Hi ;)
I would like to suggest an approach that could make updates like this
easier.
With daily-changing databases such as virus databases, blacklists or
similar, part of the old data is always erased as new data is added.
Programs such as clamav ship their own update client for this.
But
I'd like to use this for antivirus databases, blacklists, etc. - anything
that gets many updates in a short time on huge datasets.
The reasons for this are:
1. Every update would consume a lot of bandwidth, because for every update
the complete database has to be downloaded by aptitude.
2. On eve
OK, here is the specific use case:
I've got blacklists, some with over 1 million entries, so the .deb packages
are quite large.
Debdelta doesn't work well here, because the whole list would be uninstalled
and then the new list installed. For all 2 million transactions this takes a
lot of time. And I t
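The bandwidth problem could in principle be handled with a simple line-based delta instead of reshipping the full list. This is only a sketch of the idea (not an existing apt/debdelta feature): the server publishes the added and removed entries since yesterday, and the client patches its local copy.

```python
# Hypothetical delta update for a flat blacklist (one domain per line).
# Yesterday's and today's lists are modeled as sets of entries.
old = {"badsite.example", "porn.example", "spam.example"}
new = {"badsite.example", "spam.example", "tracker.example"}

# The delta to ship: only what changed, not the whole multi-MB file.
added = sorted(new - old)      # entries the client must append
removed = sorted(old - new)    # entries the client must delete

# Client side: apply the delta to reconstruct today's list exactly.
patched = (old - set(removed)) | set(added)
assert patched == new

print("added:", added)
print("removed:", removed)
```

For a 22MB list where only a small fraction changes per day, the delta would typically be a few kilobytes instead of the full file.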
> A DNSBL is the traditional solution for blacklists, why are you
> putting your blacklist in a .deb?
I meant blacklists specifically for squidGuard. Those are huge files with
domains/URLs inside, so that e.g. porn can be blocked on your network.
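For context, squidGuard reads these lists as flat domain/URL files referenced from its config; a typical fragment looks roughly like this (the paths and category name are examples):

```
dbhome /var/lib/squidguard/db

dest porn {
    domainlist porn/domains
    urllist    porn/urls
}

acl {
    default {
        pass !porn all
    }
}
```

It is exactly these `domains`/`urls` files that grow to millions of lines, which is why shipping them as full .deb payloads on every update is so expensive.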
> What if a node misses an update, then?
> And what i
>
> I don't know how big your squidguard blacklists are (its a good idea to
> include details when asking questions), but the largest one I could find
> was 20MB [...]
>
My biggest list is 22MB and gets daily updates. But because it uses a lot of
RAM when loaded, I only check all entries once per week.