If by "corrupt index" you mean an index that's just not quite
up to date, could you do a delta import? In other words, how
do you make your Solr index reflect changes to the DB even
without a schema change? Could you extend that method
to handle your use case?

So the scenario is something like this:
1. Record the time.
2. Rebuild the index.
3. Import all changes since you recorded the original time.
4. Switch cores or replicate.
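For step 3, Solr's DataImportHandler can do this with a delta query keyed off the time of the last import. A minimal sketch of the relevant entity definition — the table name (`item`), column names (`id`, `name`, `last_modified`), and queries are assumptions, not your actual schema:

```xml
<!-- data-config.xml sketch: deltaQuery finds the primary keys of rows
     changed since the last import; deltaImportQuery re-fetches each of
     those rows. ${dataimporter.last_index_time} is maintained by DIH
     automatically in dataimport.properties. -->
<entity name="item" pk="id"
        query="SELECT id, name FROM item"
        deltaQuery="SELECT id FROM item
                    WHERE last_modified &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT id, name FROM item
                          WHERE id = '${dataimporter.delta.id}'">
</entity>
```

You would then trigger `http://host:8983/solr/A1/dataimport?command=delta-import` after the full rebuild finishes, and for step 4 swap cores via the CoreAdmin handler, e.g. `http://host:8983/solr/admin/cores?action=SWAP&core=A&other=A1` (host, port, and core names are placeholders).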

Best
Erick

2010/11/11 Robert Gründler <rob...@dubture.com>

> Hi again,
>
> we're coming closer to the rollout of our newly created Solr/Lucene based
> search, and I'm wondering
> how people handle changes to their schema on live systems.
>
> In our case, we have 3 cores (i.e. A, B, C), where the largest one takes
> about 1.5 hours for a full dataimport from the relational
> database. The index is updated in real time, through post
> insert/update/delete events in our ORM.
>
> So far, I can only think of 2 scenarios for rebuilding the index, if we
> need to update the schema after the rollout:
>
> 1. Create 3 more cores (A1, B1, C1), import the data from the database,
> and after importing, switch the application to cores A1, B1, C1.
>
> This will most likely leave the index out of date, as during the 1.5 hours
> of indexing the database might receive inserts/updates/deletes.
>
> 2. Put the live system in read-only mode and rebuild the index during
> that time. This ensures data integrity in the index, with the drawback
> that users cannot
> write to the app.
>
> Does Solr provide any built-in approaches to this problem?
>
>
> best
>
> -robert