All, thanks for the reply.
Regards,
Ganesh
On Sun, 23 Oct 2016, 7:21 pm, Yonik Seeley wrote:
> No reason to think it would be a problem. 10K documents isn't very much.
> -Yonik
>
>
> On Sun, Oct 23, 2016 at 3:14 AM, Ganesh M wrote:
> > Has anyone tried summation of a numeric field over 10k to 100k documents
> > very frequently and faced any performance issues?
The reason a node stays in recovery for a long time could be related to
https://issues.apache.org/jira/browse/SOLR-9310
On Tue, Oct 4, 2016 at 9:14 PM, Rallavagu wrote:
> Solr Cloud 5.4.1 with embedded Jetty - jdk 8
>
> Is there a way to disable incoming updates (from the leader) during startup
> until "fi…
This error is thrown when you add (or remove) docValues on an existing field
but do not reindex your data from scratch. It is a result of the field cache
having been removed from Lucene. Although you were not getting an error with
Solr 4.8, I am pretty sure that you were getting incorrect results.
Stand up a small test cluster
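For example, a schema change along these lines is the kind that requires a
full reindex afterwards (a minimal sketch, assuming the managed schema is in
use; the collection name "mycollection" and field name "amount" are
placeholders):

# Hypothetical example: "mycollection" and "amount" are placeholders;
# replace-field requires the managed schema to be enabled.
curl -X POST 'http://localhost:8983/solr/mycollection/schema' \
  -H 'Content-Type: application/json' -d '{
  "replace-field": {
    "name": "amount",
    "type": "tfloat",
    "docValues": true
  }
}'

After a replace-field like this, delete and rebuild the index (or reindex all
documents from the source); reloading the collection alone is not enough.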
No reason to think it would be a problem. 10K documents isn't very much.
-Yonik
On Sun, Oct 23, 2016 at 3:14 AM, Ganesh M wrote:
> Has anyone tried summation of a numeric field over 10k to 100k documents
> very frequently and faced any performance issues?
> Please share your experience.
>
Hi Ganesh,
In general it shouldn't be an issue if you execute sum queries every other
hour, but you may want to share your cluster configuration (Solr version,
SolrCloud?, # machines, machine configuration, index size) and load (indexing
& query load) and perform some tests (see the sketch below).
Also FYI, there is str…
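One quick way to test is simply to time the aggregation request itself (a
hedged sketch; the host, collection name "mycollection", and numeric field
"amount" are assumptions, not from the thread):

# Times a single sum() aggregation over the whole index via the JSON
# Request API (Solr 5.3+). Run it against a copy of your real index and
# under realistic indexing load to get a meaningful number.
time curl -s 'http://localhost:8983/solr/mycollection/query' \
  -H 'Content-Type: application/json' -d '{
  "query": "*:*",
  "limit": 0,
  "facet": {
    "total": "sum(amount)"
  }
}'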
Hi,
I'm using JSON Facet in Solr 5.4.0. Currently, this is what I'm getting in
the JSON output when I'm using the Aggregation Functions.
"facets":{
"count":9,
"doDate_dts":{
"buckets":[{
{
"val":"2016-08-29T00:00:00Z",
"count":3,
"sum":10.3,
Has anyone tried summation of a numeric field over 10k to 100k documents very
frequently and faced any performance issues?
Please share your experience.
On Sun, 23 Oct 2016, 12:27 am, Ganesh M <mgane...@live.in> wrote:
Hi,
We will have 10K documents for every hour. We would like to find the sum of a
numeric field across those documents.