Hi

Since our application is in PHP, we are using SolPHP to make cURL-based calls.

My concern is that each user may have 20-40 attachments that need to be indexed every day, and there are many users: we are targeting around 500-1000 users daily.

Right now we are doing something like this:

<?php
// Post a PDF to Solr's ExtractingRequestHandler and commit immediately
$ch = curl_init('http://localhost:8010/solr/update/extract?literal.id=doc2&commit=true');
curl_setopt($ch, CURLOPT_POST, 1);
// The '@' prefix tells cURL to upload the file contents as a multipart field
curl_setopt($ch, CURLOPT_POSTFIELDS, array('myfile' => '@paper.pdf'));
// Return Solr's response instead of printing it
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);
?>

We are also planning to pass other fields that need to be indexed and stored.
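If I understand the ExtractingRequestHandler correctly, such extra fields can be supplied as literal.* parameters on the same request. A minimal sketch of what we mean (the field names 'author' and 'userid' are only examples, not from our real schema):

<?php
// Extra stored/indexed fields supplied as literal.* parameters
// ('author' and 'userid' are hypothetical field names)
$url = 'http://localhost:8010/solr/update/extract'
     . '?literal.id=doc2'
     . '&literal.author=' . urlencode('Naveen')
     . '&literal.userid=42';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('myfile' => '@paper.pdf'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);
?>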


There are a couple of questions here:

1. What would be the best commit strategy? If we take all the documents in an array, iterate over them one by one firing the cURL call for each, and only commit on the last document, will that work, or do we need to commit for every document? (See the first sketch below for what we mean.)

2. We have several fields already defined in the schema, and a few of them are marked as required for an earlier use case, but for this purpose we don't want them to be required. How can we support both requirements in the same schema? (See the second sketch below.)

3. Since commits will be frequent, how can we use Solr multicore to separate write and read operations? (See the third sketch below.)
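To make question 1 concrete, this is the flow we have in mind: post every attachment without commit=true, then issue one explicit commit at the end (URLs, IDs and file names are just placeholders):

<?php
// Post each attachment WITHOUT committing...
$files = array('doc1' => 'paper1.pdf', 'doc2' => 'paper2.pdf');
foreach ($files as $id => $path) {
    $ch = curl_init("http://localhost:8010/solr/update/extract?literal.id=$id");
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('myfile' => '@' . $path));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_exec($ch);
    curl_close($ch);
}

// ...then send a single <commit/> once all documents are posted
$ch = curl_init('http://localhost:8010/solr/update');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
curl_setopt($ch, CURLOPT_POSTFIELDS, '<commit/>');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
curl_close($ch);
?>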
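For question 2, this is the kind of schema.xml declaration we mean ('author' is only an illustrative field name): it is required by the earlier use case, but Tika extraction will not supply a value for attachment documents:

<!-- schema.xml (illustrative): required for the old use case, but the
     extract handler will not supply a value for attachment documents -->
<field name="author" type="string" indexed="true" stored="true" required="true"/>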
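For question 3, what we are imagining is indexing into a "write" core while serving queries from a "read" core, then swapping them with the CoreAdmin SWAP action after a commit. The core names here are only placeholders:

<?php
// After indexing and committing into the 'write' core, swap it with the
// 'read' core so searches see the fresh index (CoreAdmin SWAP action)
$ch = curl_init('http://localhost:8010/solr/admin/cores?action=SWAP&core=read&other=write');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);
?>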

Thanks
Naveen
