Maybe you can start by testing this with split -l and xargs :-) These are
standard Unix toolkit approaches, and since you already use one such tool
(curl) you may be happy to use the others too.
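For what it's worth, a rough, untested sketch of that approach (it assumes a
Unix-like shell such as Cygwin on the Windows box, an illustrative chunk size
of 500000 lines, that the CSV header/fieldnames question is handled for every
chunk, and that stream.file points at a path the Solr machine can read):

  # cut the big CSV into fixed-size pieces named chunk_aa, chunk_ab, ...
  split -l 500000 eighth.csv chunk_
  # post each piece to the CSV handler, committing as the original command does
  for f in chunk_*; do
    curl "http://localhost:8080/solr/update/csv" \
         -F "stream.file=$PWD/$f" -F "commit=true"
  done

(xargs would drive the curl calls just as well as the for loop.)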
Regards,
Alex.
Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalov
Thank you Eric
I didn't know that we could write a Java class for it; can you provide me
with some info on how to do that?
Thanks
--
View this message in context:
http://lucene.472066.n3.nabble.com/Solr-Indexing-MAX-FILE-LIMIT-tp4019952p4020407.html
Sent from the Solr - User mailing list archive at Nabble.com.
Have you considered writing a small SolrJ (or other client) program that
processes the rows in your huge file and sends them to Solr in sensible
chunks? That would give you much finer control over how the file is
processed, how many docs are sent to Solr at a time, and what to do with
errors. You coul
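A bare-bones SolrJ sketch of that idea might look something like the
following (untested; it assumes the SolrJ 4.x HttpSolrServer, and the Solr
URL, file path, column layout, field names, and batch size are all
illustrative - the naive split(",") also ignores quoted commas):

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class CsvChunkIndexer {
    public static void main(String[] args) throws Exception {
        // Illustrative URL and file path -- adjust to your setup.
        HttpSolrServer solr = new HttpSolrServer("http://localhost:8080/solr");
        BufferedReader in = new BufferedReader(new FileReader("D:/eighth.csv"));
        List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
        String line;
        while ((line = in.readLine()) != null) {
            // Naive split: fine for simple CSV, wrong for quoted/escaped commas.
            String[] cols = line.split(",");
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", cols[0]);     // assumed column -> field mapping
            doc.addField("name", cols[1]);
            batch.add(doc);
            if (batch.size() >= 10000) {     // send a manageable chunk, then reuse the list
                solr.add(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            solr.add(batch);
        }
        solr.commit();
        in.close();
        solr.shutdown();
    }
}

The batch size is the main knob: big enough to amortize the HTTP overhead,
small enough that a failed request only costs you one chunk.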
Thank you
*** I understand that the default size for an HTTP POST in Tomcat is 2 MB. Can
we change that somehow, so that I don't need to split the 10 GB CSV into 2 MB
chunks?
curl http://localhost:8080/solr/update/csv -F "stream.file=D:\eighth.csv" -F
"commit=true" -F "optimize=true" -F "encapsulate
Hi - instead of trying to make the system ingest such a large file, perhaps
you can split it into many small pieces.
-----Original message-----
> From: mitra
> Sent: Tue 13-Nov-2012 09:05
> To: solr-user@lucene.apache.org
> Subject: Solr Indexing MAX FILE LIMIT
>
> Hello Guys
>
> I'm using