Hi folks,

We use Solr 5.2.1 and store our ebooks in it. The majority of the fields,
such as content, author, and publisher, never change; only the price field
changes frequently.

We let customers run full-text searches, so we indexed the content field.
Because of how often the price changes, we use the atomic update feature.
Atomic updates require all fields to be stored, including the content
field, which is about 1 MB per document and which we would prefer to only
index, not store.
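
To be concrete, this is roughly the kind of atomic update we send per price
change (a minimal SolrJ sketch; the core name "ebooks", the document id and
the field names are placeholders for illustration, not our real schema):

import java.util.Collections;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class PriceUpdate {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and core name; adjust to your own setup.
        SolrClient solr = new HttpSolrClient("http://localhost:8983/solr/ebooks");

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "book-123");                                // unique key of the existing document
        doc.addField("price", Collections.singletonMap("set", 9.99));  // atomic "set" on the price field only

        solr.add(doc);
        solr.commit();
        solr.close();
    }
}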

When we updated 100 documents this way, it took about 3 minutes. Given that
our metadata is about 1 KB per document while the content field is about
1 MB per document, each atomic update moves roughly 1,000 times more data
(and memory) than the field we actually change.
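
Just to spell out the back-of-envelope arithmetic behind that ~1,000x
figure (the sizes are the approximate ones mentioned above, not exact
measurements):

public class UpdateVolume {
    public static void main(String[] args) {
        // Approximate per-document sizes from above (assumptions, not measurements).
        long changedBytesPerDoc = 1L * 1024;           // ~1 KB of metadata that actually changes
        long rewrittenBytesPerDoc = 1L * 1024 * 1024;  // ~1 MB of stored content rewritten per atomic update
        int docs = 100;

        System.out.printf("changed: %d KB, rewritten: %d MB, ratio: ~%dx%n",
                docs * changedBytesPerDoc / 1024,
                docs * rewrittenBytesPerDoc / (1024 * 1024),
                rewrittenBytesPerDoc / changedBytesPerDoc);
    }
}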

I am almost certain we are doing something wrong.

What is the best practice for frequent updates when 99% of a given document
never changes?

Thanks in advance

-- 
Roland Szűcs
Connect with me on LinkedIn: <https://www.linkedin.com/pub/roland-sz%C5%B1cs/28/226/24/hu>
CEO, Bookandwalk.hu <https://bookandwalk.hu/>
Phone: +36 1 210 81 13
