Have you considered writing a small SolrJ (or other client) program that processes the rows in your huge file and sends them to Solr in sensible chunks? That would give you much finer control over how the file is processed, how many docs are sent to Solr at a time, and what to do with errors. You could even run N simultaneous programs to increase throughput... Something along the lines of the sketch below.
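A minimal sketch of that idea, assuming SolrJ 4.x, a Solr instance at http://localhost:8080/solr, and a simple two-column CSV (id,text). The file path, field names, and batch size are placeholders to adapt to your own schema and hardware:

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.ArrayList;
    import java.util.List;

    public class CsvChunkIndexer {
        public static void main(String[] args) throws Exception {
            // Hypothetical Solr URL and batch size -- tune to taste
            SolrServer server = new HttpSolrServer("http://localhost:8080/solr");
            int batchSize = 1000;
            List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();

            BufferedReader reader = new BufferedReader(new FileReader("D:\\eighth.csv"));
            String line;
            while ((line = reader.readLine()) != null) {
                // Assumes a simple "id,text" layout; adjust the parsing to your CSV
                String[] cols = line.split(",", 2);
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", cols[0]);
                doc.addField("text", cols.length > 1 ? cols[1] : "");
                batch.add(doc);

                if (batch.size() >= batchSize) {
                    // Send one chunk; this is the place to catch/log/retry on errors
                    server.add(batch);
                    batch.clear();
                }
            }
            reader.close();

            if (!batch.isEmpty()) {
                server.add(batch);
            }
            server.commit();
            server.shutdown();
        }
    }

Splitting the file by row ranges and running several copies of a program like this in parallel is one way to get the "N simultaneous programs" throughput mentioned above.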
FWIW,
Erick

On Tue, Nov 13, 2012 at 3:42 AM, mitra <mitra.re...@ornext.com> wrote:
> Thank you
>
> *** I understand that the default size for HTTP POST in Tomcat is 2 MB. Can
> we change that somehow, so that I don't need to split the 10 GB CSV into
> 2 MB chunks?
>
> curl http://localhost:8080/solr/update/csv -F "stream.file=D:\eighth.csv" -F
> "commit=true" -F "optimize=true" -F "encapsulate="" -F "keepEmpty=true"
>
> *** As I mentioned, I'm using the above command to post rather than the
> format below:
>
> curl http://localhost:8080/solr/update/csv --data-binary @eighth.csv -H
> 'Content-type:text/plain; charset=utf-8'
>
> *** My question: is the limit still applicable even when not using the above
> data-binary format?
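On the quoted question about the 2 MB limit: two separate knobs are commonly involved, and the snippets below are illustrative values only, not settings confirmed in this thread. Tomcat's HTTP Connector has a maxPostSize attribute in conf/server.xml, and Solr caps multipart uploads via multipartUploadLimitInKB in the requestDispatcher section of solrconfig.xml. Note that with stream.file Solr reads the file from the local disk path itself, so the body actually POSTed is small and these limits matter mostly for the --data-binary / multipart styles.

    <!-- Tomcat conf/server.xml: form POST limit on the HTTP connector,
         in bytes (shown here as a hypothetical 100 MB) -->
    <Connector port="8080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               maxPostSize="104857600"/>

    <!-- Solr solrconfig.xml: multipart upload cap, in KB (illustrative value) -->
    <requestDispatcher>
      <requestParsers enableRemoteStreaming="true"
                      multipartUploadLimitInKB="10485760"/>
    </requestDispatcher>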