/connector-j-reference-configuration-properties.html
I also included the stuff below in the dataSource settings...
: Furthermore, I have realized that the issue is with MySQL, as it's not
: processing this table when a "where" clause is applied
http://wiki.apache.org/solr/DataImportHandlerFaq#I.27m_using_DataImportHandler_with_a_MySQL_database._My_table_is_huge_and_DataImportHandler_is_going_out_of_memory._Why_doe
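For reference, the fix that FAQ describes is to set batchSize="-1" on the JDBC dataSource so that Connector/J streams rows instead of buffering the whole result set in memory. A minimal sketch of such an entry in data-config.xml (the driver, URL, and credentials here are just placeholders for your own setup):

    <!-- batchSize="-1" makes DIH call setFetchSize(Integer.MIN_VALUE),
         which tells MySQL Connector/J to stream the result set row by row -->
    <dataSource type="JdbcDataSource"
                driver="com.mysql.jdbc.Driver"
                url="jdbc:mysql://localhost:3306/mydb"
                user="db_user"
                password="db_pass"
                batchSize="-1"/>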
Note that stored=true/false is irrelevant to the raw search time.
What it _is_ relevant to is the time it takes to assemble the doc
for return, if (and only if) you return that field. I claim your search
time would be fast if you went ahead and stored the field
and specified an fl clause that did not include it.
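For example, a request whose fl leaves the heavy field out only pulls the small stored fields (the core name and field names below are assumptions, not taken from the thread):

    http://localhost:8983/solr/collection1/select?q=*:*&fl=id,title,score

The large stored values are then only read from disk for requests that explicitly ask for that field.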
Some values in the field are up to 1 MB as well.
On Wed, Jun 5, 2013 at 7:27 PM, Raheel Hasan wrote:
> Ok, thanks for the reply. The field has values of around 60 KB each.
>
> Furthermore, I have realized that the issue is with MySQL, as it's not
> processing this table when a "where" clause is applied.
Ok, thanks for the reply. The field has values of around 60 KB each.
Furthermore, I have realized that the issue is with MySQL, as it's not
processing this table when a "where" clause is applied.
Secondly, I have turned this field to *stored=false* and now */select*
is fast again.
On 6/5/2013 3:08 AM, Raheel Hasan wrote:
> Hi,
>
> I am trying to index a heavy dataset with one particular field that is
> really heavy...
>
> However, as I start, I get a memory warning and a rollback (OutOfMemoryError).
> So, I have learned that we can use the -Xmx1024m option with the java command
> to start Solr and allocate more memory to the heap.
Hi,
I am trying to index a heavy dataset with one particular field that is
really heavy...
However, as I start, I get a memory warning and a rollback (OutOfMemoryError).
So, I have learned that we can use the -Xmx1024m option with the java command
to start Solr and allocate more memory to the heap.
My questio
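As a point of reference, with the example Jetty setup that ships with Solr 4.x, raising the heap that way would look something like the following (run from the example/ directory of your install; 1024m is just the value mentioned above, size it to your own hardware):

    # start Solr with a 1 GB max heap
    java -Xmx1024m -jar start.jar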