Has anyone ever been successful in processing 150M records using the Suggester component? If the maintainers of the component are reading, please comment.
On Tue, Jun 26, 2018 at 1:37 AM, Ratnadeep Rakshit wrote:
The site_address field has all the addresses of the United States. The idea is to build something similar to Google Places autosuggest.
Here's an example query:

curl "http://localhost/solr/addressbook/suggest?suggest.q=1054%20club&wt=json"

Response (truncated in the archive):

{
  "responseHeader": {
    "status": 0,
    "QTime": 3125,
    "par...
Can anyone from the Solr team shed some more light on this?
On Tue, Jun 12, 2018 at 8:13 PM, Ratnadeep Rakshit wrote:
I observed that the build works if the data size is below 25M records. The moment the records go beyond that, this OOM error shows up. Solr itself shows 56% usage of the 20GB space during the build. So, are there some settings I need to change to handle a larger data size?
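One setting worth double-checking is where that 20GB is actually configured, i.e. whether it is really the JVM heap. In a stock Solr 5.x install the heap is set in bin/solr.in.sh, or per start with bin/solr -m; a minimal sketch, assuming a 32GB machine where roughly 20GB can go to Solr (the path and the 20g value are illustrative):

# bin/solr.in.sh -- SOLR_HEAP sets both -Xms and -Xmx; the shipped default is only 512m
SOLR_HEAP="20g"

# or equivalently when (re)starting the node
bin/solr restart -m 20g

If the 20GB really is the JVM heap and the suggester build still dies, the dictionary for 150M addresses may simply not fit, and splitting the build across smaller cores/shards or choosing a less memory-hungry lookup implementation is the other direction to look.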
On Tue, Jun 12, 2018 at 3:17 PM, Aless... wrote: [quoted reply truncated in the archive]
Can anyone shed some light on this?
On Tue, Jun 12, 2018 at 12:32 AM, Ratnadeep Rakshit wrote:
> Here's the stack trace:
>
> ERROR - 2018-06-07 09:07:36.030; [ x:addressbook] org.apache.solr.common.SolrException; null:java.lang.RuntimeException: java.lang.OutOfMemoryError
> ... [stack trace truncated in the archive] ...
>   at ...doScope(SessionHandler.java:185)
>   at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
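To see exactly what fills the heap when this happens, the JVM can be asked to write a heap dump at the moment the OutOfMemoryError is thrown. In a stock Solr 5.x install extra JVM flags are passed through SOLR_OPTS in bin/solr.in.sh; a sketch, with an illustrative dump path:

# bin/solr.in.sh
SOLR_OPTS="$SOLR_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/solr/logs"

The resulting .hprof file can then be opened in a heap analyzer such as Eclipse MAT to check whether the suggester dictionary build is the dominant consumer.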
On Mon, Jun 11, 2018 at 11:34 PM, Christopher Schultz <ch...@christopherschultz.net> wrote:
> Ratnadeep,
>
> On 6/11/18 12:25 PM, Ratnadeep Rakshit wrote:
I am using the Solr Suggester component in Solr 5.5 with a lot of address data. My machine has 32GB of RAM in total, of which 20GB is allotted to Solr.

I have an address book core with the following vitals:
"numDocs"=153242074
"segmentCount"=34
"size"=30.29 GB
My solrconfig.xml looks something like this: [configuration truncated in the archive]
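Since the configuration itself is cut off above, here is only a generic sketch of the pieces under discussion (a SuggestComponent plus a /suggest handler over the site_address field), not the poster's actual config. The suggester name, lookup and dictionary implementations, analyzer field type, and build flags below are assumptions for illustration; only the site_address field comes from the thread:

<searchComponent name="suggest" class="solr.SuggestComponent">
  <lst name="suggester">
    <str name="name">mySuggester</str>                          <!-- placeholder name -->
    <str name="lookupImpl">AnalyzingInfixLookupFactory</str>    <!-- assumption -->
    <str name="dictionaryImpl">DocumentDictionaryFactory</str>  <!-- assumption -->
    <str name="field">site_address</str>                        <!-- field named in the thread -->
    <str name="suggestAnalyzerFieldType">text_general</str>     <!-- assumption -->
    <str name="buildOnCommit">false</str>   <!-- build explicitly via suggest.build=true -->
    <str name="buildOnStartup">false</str>
  </lst>
</searchComponent>

<requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
  <lst name="defaults">
    <str name="suggest">true</str>
    <str name="suggest.count">10</str>
    <str name="suggest.dictionary">mySuggester</str>
  </lst>
  <arr name="components">
    <str>suggest</str>
  </arr>
</requestHandler>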