Could someone give advice on a better way to do this?

I have an index of many merchants, and each day I delete merchant products
and re-update my database. After doing this I then re-create the entire
index and move it to production, replacing the current index.

I was thinking about updating the index in realtime with only the products
that need updating. My concern is that I might be updating 2 million products,
deleting 1 million, and inserting another 1-2 million all in one process. I
guess I could send batches of files to be sucked in and processed, but it's
just not as clean as creating a new index. Do you see an issue with
these massive updates, deletes, and inserts in Solr? The problem now is that
I might only be updating 1/2 or 1/4 of the index, so I shouldn't need to
re-create the entire index.
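
Something like this is what I had in mind for the batching, just as a rough
sketch in Python against the default XML update handler (assuming Solr at
http://localhost:8983/solr/update and the "requests" library; field names and
batch size are placeholders, not my real schema):

    from xml.sax.saxutils import escape
    import requests

    SOLR_UPDATE_URL = "http://localhost:8983/solr/update"  # adjust for your setup
    HEADERS = {"Content-Type": "text/xml"}

    def in_batches(items, size=1000):
        """Yield chunks so no single request carries millions of docs."""
        for i in range(0, len(items), size):
            yield items[i:i + size]

    def delete_batch(product_ids):
        """Send one <delete> request containing a batch of product ids."""
        body = "<delete>" + "".join(
            "<id>%s</id>" % escape(str(pid)) for pid in product_ids) + "</delete>"
        requests.post(SOLR_UPDATE_URL, data=body.encode("utf-8"),
                      headers=HEADERS).raise_for_status()

    def add_batch(products):
        """Send one <add> request; each product is a dict of field -> value."""
        docs = ""
        for p in products:
            fields = "".join('<field name="%s">%s</field>' % (k, escape(str(v)))
                             for k, v in p.items())
            docs += "<doc>%s</doc>" % fields
        requests.post(SOLR_UPDATE_URL, data=("<add>%s</add>" % docs).encode("utf-8"),
                      headers=HEADERS).raise_for_status()

    def commit():
        """Issue a single commit after all batches, not one per batch."""
        requests.post(SOLR_UPDATE_URL, data=b"<commit/>",
                      headers=HEADERS).raise_for_status()

    # Usage sketch: delete stale products, add/replace changed ones, commit once.
    # for ids in in_batches(stale_ids):
    #     delete_batch(ids)
    # for chunk in in_batches(changed_products):
    #     add_batch(chunk)
    # commit()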

How do some of you keep your index updated? I'm running it off of Windows
Server, so I haven't even looked into snappuller, etc.

Thanks,
Mike
