I tested dsbulk too, but it produced many errors:
"[1710949318] Error writing cancel request. This is not critical (the
request will eventually time out server-side)."
"Forcing termination of Connection[/127.0.0.1:9042-14, inFlight=1,
closed=true]. This should not happen and is likely a bug, please
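For reference, the unload invocation was roughly like the following (the host, keyspace, and table names here are placeholders, not the real ones):

    # unload the whole table to CSV files under ./export, connecting to the local node
    dsbulk unload -h 127.0.0.1 -k my_keyspace -t my_table -url ./export -header true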
The table has more than 10 million rows. I used the COPY command on this table in
a cluster with five machines and everything was OK.
I took a backup to a single machine using sstableloader.
Now I want to extract rows using the COPY command, but I can't!
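The export I am attempting in cqlsh is roughly this (keyspace, table, and file names are placeholders):

    -- dump the whole table to a CSV file with a header row
    COPY my_keyspace.my_table TO 'my_table.csv' WITH HEADER = true;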
On Mon, Sep 23, 2019 at 6:30 AM Durity, Sean R
wrote:
>
The COPY command tries to export all rows in the table, not just the ones on the
node. It will eventually time out if the table is large. It is really built for
something under 5 million rows or so. Dsbulk (from DataStax) is great for this,
if you are a customer. Otherwise, you will probably need to