Thanks Shawn, it worked.

At present I am uploading data from my MySQL table to Solr. Since the
MySQL table has become corrupted, can I load the data back from Solr
into my MySQL table directly?

Regards,

Anuj

On Sat, 26 Feb 2022 at 13:43, Shawn Heisey <[email protected]> wrote:

> On 2/25/2022 11:36 PM, Anuj Bhargava wrote:
> > Tried the following
> > wget -O 2019.csv 'http://xx.xxx.xxx.xxx:8983/solr/data_2019/select?q=*%3A*&rows=20000&wt=csv'
>
> Also ... I have not used the /export handler, though I understand it's
> good for massive data dumps.
>
> If you try standard pagination (using the start and rows parameters)
> with a large data set, you'll find that performance will quickly
> deteriorate as the start parameter gets larger.  There is a feature
> called cursorMark that allows for efficient deep paging.
>
> https://solr.apache.org/guide/8_6/pagination-of-results.html#using-cursors
>
> Thanks,
> Shawn
>
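For anyone following the thread, the cursorMark approach Shawn describes can be sketched in shell. This is only an illustration, not code from the thread: the host (localhost:8983), core name (data_2019), uniqueKey field (id), and the use of jq for JSON parsing are all assumptions you would adjust for your own setup.

```shell
#!/bin/sh
# Sketch of cursorMark deep paging (see the pagination page linked above).
# Assumptions, not from the thread: Solr at localhost:8983, core "data_2019",
# uniqueKey field "id", jq installed.
SOLR="http://localhost:8983/solr/data_2019"

build_url() {
  # cursorMark requires a stable sort that includes the uniqueKey field.
  # Note: real cursor values returned by Solr may need URL-encoding.
  printf '%s/select?q=*:*&rows=20000&wt=json&sort=id+asc&cursorMark=%s' "$SOLR" "$1"
}

# Guarded so the sketch can be read (and its URL builder tested) without
# a running Solr instance; set RUN_FETCH=1 to actually page through.
if [ "${RUN_FETCH:-0}" = "1" ]; then
  cursor='*'   # the cursor always starts at "*"
  page=0
  while : ; do
    resp=$(wget -q -O - "$(build_url "$cursor")") || break
    printf '%s' "$resp" > "page_${page}.json"
    next=$(printf '%s' "$resp" | jq -r .nextCursorMark)
    [ "$next" = "$cursor" ] && break   # same cursor back => no more results
    cursor=$next
    page=$((page + 1))
  done
fi
```

Each page lands in its own JSON file; getting that data back into MySQL would be a separate loading step.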
