I'm loading a CSV file into Solr. Since the CSV file contains a huge amount
of data, it's taking a very long time to load and sometimes results in an
OutOfMemoryException. Is there any way we can read the data from the
CSV file and load it into the Solr index without using "/update/csv" or
increasing the heap space?
Please suggest.
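For what it's worth, one way to stay within the existing heap and avoid the
CSV handler entirely is to parse the file on the client and send the rows to
the plain XML update handler in small batches, so that no single request is
huge. A rough sketch in Python (the Solr URL, the batch size, and the file
name big.csv are assumptions for illustration, not details from this thread):

import csv
import urllib.request
from xml.sax.saxutils import escape, quoteattr

SOLR_UPDATE_URL = "http://localhost:8983/solr/update"  # assumed Solr URL
BATCH_SIZE = 5000  # documents per request; tune so each request stays small

def post_xml(payload):
    """POST an XML update message to Solr."""
    req = urllib.request.Request(
        SOLR_UPDATE_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )
    urllib.request.urlopen(req).read()

def docs_to_xml(rows):
    """Turn a list of CSV rows (dicts) into a single <add> message."""
    parts = ["<add>"]
    for row in rows:
        parts.append("<doc>")
        for name, value in row.items():
            parts.append("<field name=%s>%s</field>"
                         % (quoteattr(name), escape(value or "")))
        parts.append("</doc>")
    parts.append("</add>")
    return "".join(parts)

def load_csv(path):
    """Stream the CSV and index it batch by batch, committing once at the end."""
    batch = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # column names come from the header row
            batch.append(row)
            if len(batch) >= BATCH_SIZE:
                post_xml(docs_to_xml(batch))
                batch = []
    if batch:
        post_xml(docs_to_xml(batch))
    post_xml("<commit/>")  # one commit at the end instead of per batch

if __name__ == "__main__":
    load_csv("big.csv")  # hypothetical input file

Because only BATCH_SIZE rows are ever held in memory at a time, neither the
client nor a single Solr request has to handle the whole file at once.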
Yonik Seeley-2 wrote:
>
> On Tue, Jul 7, 2009 at 8:41 AM, Anand Kumar
> Prabhakar wrote:
>> Is there any way we can read the data from the CSV file and load it into
>> the Solr index without using "/update/csv"
>
>
Yonik Seeley-2 wrote:
>
> On Tue, Jul 7, 2009 at 9:14 AM, Anand Kumar
> Prabhakar wrote:
>> I want to know whether there is any method to do it much faster; we have
>> overcome the OutOfMemoryException by increasing the heap space.
>
> Optimize your schema - eliminate all
Is there any way to place the CSV file to be indexed on the Solr server
itself so that the file can be indexed and searched? If so, please let me
know the location in which we have to place the file. We are looking for a
workaround that avoids the HTTP request to the Solr server, as it is taking
too much time.
> ...solr server must be used.
> """
>
> So you can put it anywhere local and give solr the full path to
> directly read it.
>
> -Yonik
> http://www.lucidimagination.com
>
>
>
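The full-path approach Yonik describes above corresponds to Solr's remote
streaming support: with enableRemoteStreaming="true" set on <requestParsers>
in solrconfig.xml, the CSV handler can be pointed at a file that already sits
on the server via the stream.file parameter, so the file body itself never
travels over HTTP. A rough sketch in Python (the Solr URL and the path
/data/books.csv are assumptions for illustration):

import urllib.parse
import urllib.request

SOLR_CSV_URL = "http://localhost:8983/solr/update/csv"  # assumed Solr URL

def index_local_csv(path_on_server):
    """Ask Solr to read a CSV file from its own filesystem; requires
    remote streaming to be enabled in solrconfig.xml."""
    params = urllib.parse.urlencode({
        "stream.file": path_on_server,  # path as seen by the Solr JVM
        "stream.contentType": "text/csv;charset=utf-8",
        "commit": "true",               # commit once the load finishes
    })
    with urllib.request.urlopen(SOLR_CSV_URL + "?" + params) as resp:
        print(resp.read().decode("utf-8"))

if __name__ == "__main__":
    index_local_csv("/data/books.csv")  # hypothetical path on the Solr machine

Only the small request itself goes over the wire; Solr reads the file
directly from its local disk, which avoids pushing the whole CSV through an
HTTP POST.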
Is there any possibility of adding multiple fields to the uniqueKey in
schema.xml (an implementation similar to a compound primary key)?
--
View this message in context:
http://www.nabble.com/Using-Multiple-fields-in-UniqueKey-tp24476088p24476088.html
Sent from the Solr - User mailing list archive at Nabble.com.
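On the compound-key question: Solr's <uniqueKey> points at a single field, so
a common workaround (not something confirmed in this thread) is to build a
composite key on the client by concatenating the component values into one
field before the document is indexed. A rough sketch in Python, with the
field names order_id, line_no, and id all hypothetical:

def add_composite_key(doc, key_fields, dest_field="id", sep="_"):
    """Concatenate several field values into the single field that
    <uniqueKey> refers to in schema.xml."""
    doc[dest_field] = sep.join(str(doc[f]) for f in key_fields)
    return doc

# Example: a document keyed by order_id + line_no (hypothetical fields).
doc = {"order_id": "A1001", "line_no": 3, "product": "widget"}
add_composite_key(doc, ["order_id", "line_no"])
print(doc["id"])  # -> "A1001_3"

The same concatenation then has to be reproduced whenever a document is
updated or deleted by key, so the separator and field order need to stay
stable.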