You could also set up an expiration policy of sorts: delete documents once
they pass a certain age, i.e. the ones that weren't refreshed along with the
rest... but I don't know whether you can iterate over the existing IDs...
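
Something along those lines might work if every document carries a timestamp
when it is indexed. A rough sketch (the field name "last_indexed" and the Solr
URL are placeholders, not anything your setup necessarily has), using a
delete-by-query update message:

import urllib.request

SOLR_UPDATE_URL = "http://localhost:8983/solr/update?commit=true"  # placeholder

def delete_expired(max_age_days=2):
    # Delete-by-query: drop every document whose (hypothetical) date field
    # 'last_indexed' is older than the cutoff, using Solr date math.
    body = ("<delete><query>last_indexed:[* TO NOW-%dDAYS]</query></delete>"
            % max_age_days).encode("utf-8")
    req = urllib.request.Request(
        SOLR_UPDATE_URL, data=body,
        headers={"Content-Type": "text/xml; charset=utf-8"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

The catch is the one above: it only helps if the age of each document is
actually recorded in the index somewhere.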

On Wed, Oct 20, 2010 at 1:34 PM, Shawn Heisey <s...@elyograg.org> wrote:

> On 10/20/2010 9:59 AM, bbarani wrote:
>
> >> We actually use a virtual DB modelling tool to fetch the data from various
> >> sources at run time, hence we don't have any control over the source.
> >>
> >> We consolidate the data from more than one source and index the
> >> consolidated data using SOLR. We don't have any kind of update / access
> >> rights to the source data.
>>
>
> It seems likely that those who are in control of the data sources would be
> maintaining some kind of delete log, and that they should be able to make
> those logs available to you.
>
> For my index, the data comes from a MySQL database.  When a delete is done
> at the database level, a database trigger records the old information to a
> main delete log table, as well as a separate table for the search system.
>  The build system uses that separate table to run deletes every ten minutes
> and keeps it trimmed to 48 hours of delete history.
>
>
>
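
For what it's worth, the consuming side of the setup Shawn describes could be
quite small. A sketch, assuming a delete-log table called "search_delete_log"
with "doc_id" and "deleted_at" columns and a local Solr core (those names are
placeholders, not his actual schema):

import urllib.request
from xml.sax.saxutils import escape

SOLR_UPDATE_URL = "http://localhost:8983/solr/update?commit=true"  # placeholder

def purge_deleted_docs(db_conn):
    # db_conn: any DB-API 2.0 connection to the MySQL database.
    cur = db_conn.cursor()
    # The table only ever holds 48 hours of history, so re-sending ids that
    # were already removed from Solr is harmless (deletes are idempotent).
    cur.execute("SELECT doc_id FROM search_delete_log")
    ids = [row[0] for row in cur.fetchall()]
    if ids:
        # Solr accepts several <id> elements in a single delete message.
        body = ("<delete>%s</delete>"
                % "".join("<id>%s</id>" % escape(str(i)) for i in ids)
                ).encode("utf-8")
        req = urllib.request.Request(
            SOLR_UPDATE_URL, data=body,
            headers={"Content-Type": "text/xml; charset=utf-8"})
        urllib.request.urlopen(req).close()
    # Keep the log trimmed to 48 hours of delete history.
    cur.execute("DELETE FROM search_delete_log "
                "WHERE deleted_at < NOW() - INTERVAL 48 HOUR")
    db_conn.commit()

Run something like that from cron every ten minutes and you get more or less
the behaviour he describes.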


-- 
______
Ezequiel.

Http://www.ironicnet.com
