The cursor-based deep paging in 4.7+ works very well and the performance on
large extracts (for us, maybe up to 100K documents) is excellent, though it
will obviously depend on the number and size of the fields you need to pull.
I wrote a Perl module to do the extractions from Solr without problems (and
DBI takes care of writing to a database).
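For anyone wanting to do the same from Python, here is a rough sketch of the
cursorMark loop I'd use -- the core URL, query and field list are placeholders
for your own setup, and it assumes the requests package is installed:

import requests

# Hypothetical core URL -- adjust for your own collection.
SOLR_URL = "http://localhost:8983/solr/collection1/select"

def extract(query="*:*", fields="id,title", rows=1000):
    cursor = "*"  # initial cursorMark
    while True:
        params = {
            "q": query,
            "fl": fields,
            "rows": rows,
            "sort": "id asc",       # cursor paging requires a sort on the uniqueKey
            "cursorMark": cursor,
            "wt": "json",
        }
        resp = requests.get(SOLR_URL, params=params).json()
        for doc in resp["response"]["docs"]:
            yield doc               # hand each document off to the DB writer
        next_cursor = resp["nextCursorMark"]
        if next_cursor == cursor:   # same mark back means the result set is exhausted
            break
        cursor = next_cursor

Each yielded document can then be inserted into MySQL with whatever DB layer
you prefer (the Python equivalent of what DBI does for us on the Perl side).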

I'm probably going to rewrite it in Python, since the final destination of
many of our extracts is Tableau, which has a Python API for creating TDEs
(Tableau data extracts).

regards

-Simon


On Fri, May 2, 2014 at 7:43 AM, Siegfried Goeschl <sgoes...@gmx.at> wrote:

> Hi Per,
>
> basically I see three options
>
> * use a lot of memory to cope with huge result sets
> * use result-set paging
> * SOLR 4.7 supports cursors (https://issues.apache.org/jira/browse/SOLR-5463)
>
> Cheers,
>
> Siegfried Goeschl
>
>
> On 02.05.14 13:32, Per Steffensen wrote:
>
>> Hi
>>
>> I want to make extracts from my Solr to MySQL. Any tools around that can
>> help me perform such a task? I find a lot about data-import from SQL
>> when googling, but nothing about export/extract. It is not all of the
>> data in Solr I need to extract. It is only documents that fulfill a
>> normal Solr query, but the number of documents fulfilling it will
>> (potentially) be huge.
>>
>> Regards, Per Steffensen
>>
>
>
