On 5/11/2019 2:06 PM, Abhijit Pawar wrote:
"ISODate("2019-03-12T21:53:16.841Zā)ā saves the date in mongoDB as*
2019-05-09 21:53:16.841Z* which is passed to SOLR while indexing.
It then throws below error:
*java.text.ParseException: Unparseable date: "Tue Mar 12 21:53:16 UTC 2019"*
If that e
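One way around the ParseException quoted above is to convert dates to the ISO-8601 form Solr expects before indexing; the rejected string is the java.util.Date.toString() format. The helper below is a hedged sketch (the class and method names are illustrative, not from this thread):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class DateFix {
    // Solr date fields expect ISO-8601, e.g. 2019-03-12T21:53:16Z.
    // "Tue Mar 12 21:53:16 UTC 2019" is java.util.Date.toString() output.
    public static String toSolrDate(String javaDateString) throws Exception {
        SimpleDateFormat in =
            new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy", Locale.US);
        Date d = in.parse(javaDateString);
        SimpleDateFormat out =
            new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'", Locale.US);
        out.setTimeZone(TimeZone.getTimeZone("UTC"));
        return out.format(d);
    }

    public static void main(String[] args) throws Exception {
        // Prints 2019-03-12T21:53:16Z
        System.out.println(toSolrDate("Tue Mar 12 21:53:16 UTC 2019"));
    }
}
```

The real fix is upstream (send the ISO string from the indexing pipeline rather than a Date.toString() value), but a conversion like this shows the target format.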
On 5/11/2019 12:49 PM, Saurabh Sharma wrote:
Full collection is present on all 3 nodes. I have checked max docs on every
node; they were around 1.5 million on each node, with 0.9 million active
records.
*How much disk space do all the indexes take?*
-> Index size is around 2 GB per node.
*What
Hello,
"ISODate("2019-03-12T21:53:16.841Zā)ā saves the date in mongoDB as*
2019-05-09 21:53:16.841Z* which is passed to SOLR while indexing.
It then throws below error:
*java.text.ParseException: Unparseable date: "Tue Mar 12 21:53:16 UTC 2019"*
On Fri, May 10, 2019 at 9:53 PM Walter Underwoo
Hi Shawn,
I am providing the data asked for. In case anything else is required, please
let me know.
*If you get the maxDoc number from every core (index) in that Solr *
*instance, and add those numbers up, you'll get a total document count *
*for the whole node. What are those numbers?*
-This solr
Can you share the sort criteria and search query? The main strategy for
improving performance of the export handler is adding more shards. This is
different than with typical distributed search, where deep paging issues
get worse as you add more shards. With the export handler if you double the
sha
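For reference, a minimal export-handler request looks like the following (host, collection, and field names are placeholders; /export requires both sort and fl parameters, and the fields involved must have docValues enabled):

```shell
# Placeholder host and collection; /export streams the full sorted result set.
curl "http://localhost:8983/solr/mycollection/export?q=*:*&sort=id+asc&fl=id"
```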
There actually is an undocumented function called valueAt. It works both
for an array and for a matrix.
For an array:
let(echo="b", a=array(1,2,3,4,5), b=valueAt(a, 2)) should return 3.
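Assuming the same let() syntax, a combined sketch covering the matrix case might look like this (the two-index form valueAt(matrix, row, column) is my assumption based on the description above, so verify against a running node):

```
let(echo="b,c",
    a=array(1, 2, 3, 4, 5),
    b=valueAt(a, 2),
    m=matrix(array(1, 2), array(3, 4)),
    c=valueAt(m, 1, 0))
```

If the indexing is zero-based as in the array example, b should be 3 and c should be 3 (row 1, column 0 of the matrix).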
I have lots of documentation still to do.
Joel Bernstein
http://joelsolr.blogspot.com/
On Fri, M
Justin Sweeney wrote:
[Index: 10 shards, 450M docs]
> We are creating a CloudSolrStream and when we call CloudSolrStream.open()
> we see that call being slower than we had hoped. For some queries, that
> call can take 800 ms. [...]
As far as I can see in the code, CloudSolrStream.open() opens s
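For context, a typical way to open a CloudSolrStream over the export handler is sketched below (not runnable without a live SolrCloud cluster; the ZooKeeper address and collection name are placeholders). Sharing one SolrClientCache via the StreamContext avoids rebuilding HTTP clients on every open(), which is one contributor to open() latency:

```java
import java.io.IOException;
import org.apache.solr.client.solrj.io.SolrClientCache;
import org.apache.solr.client.solrj.io.Tuple;
import org.apache.solr.client.solrj.io.stream.CloudSolrStream;
import org.apache.solr.client.solrj.io.stream.StreamContext;
import org.apache.solr.common.params.ModifiableSolrParams;

public class StreamExample {
    public static void main(String[] args) throws IOException {
        String zkHost = "localhost:9983";      // placeholder ZooKeeper address
        ModifiableSolrParams params = new ModifiableSolrParams();
        params.set("q", "*:*");
        params.set("qt", "/export");           // use the export handler
        params.set("sort", "id asc");
        params.set("fl", "id");

        // One cache reused across streams, so open() does not
        // re-create clients for every shard each time.
        SolrClientCache cache = new SolrClientCache();
        StreamContext context = new StreamContext();
        context.setSolrClientCache(cache);

        CloudSolrStream stream = new CloudSolrStream(zkHost, "collection1", params);
        stream.setStreamContext(context);
        try {
            stream.open();
            Tuple tuple = stream.read();
            while (!tuple.EOF) {
                // process tuple here
                tuple = stream.read();
            }
        } finally {
            stream.close();
            cache.close();
        }
    }
}
```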
On 5/11/2019 8:05 AM, Saurabh Sharma wrote:
I have been observing a very unique pattern in our solr resource usage.
I am running a cluster with 3 nodes and RAM on each node is 12GB.
We are doing hard commits every 1 minute and soft commits every 15 seconds.
Under normal circumstances solr respons
Hi All ,
I have been observing a very unique pattern in our solr resource usage.
I am running a cluster with 3 nodes and RAM on each node is 12GB.
We are doing hard commits every 1 minute and soft commits every 15 seconds.
Under normal circumstances solr response time is ~15 ms and CPU usage of
ar
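The commit cadence described above maps to a solrconfig.xml fragment along these lines (a sketch derived from the stated 1-minute hard and 15-second soft commits, not the poster's actual config):

```xml
<!-- Hard commit every 60 s; do not open a new searcher on hard commit. -->
<autoCommit>
  <maxTime>60000</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>
<!-- Soft commit every 15 s makes newly indexed documents searchable. -->
<autoSoftCommit>
  <maxTime>15000</maxTime>
</autoSoftCommit>
```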
Hi,
We are getting ArrayIndexOutOfBoundsException from the Solr core library on
calling the CoreContainer.createAndLoad API while executing a JUnit test case
through Gradle. We can see from the logs that solr.xml is getting initialized.
Here is the code that we have in the Junit which is causing th