> I have no idea whether you can successfully recover anything from that
> index now that it has broken the hard limit.
Theoretically, I think it's possible with some very surgical edits.
However, I've tried to do this in the past and abandoned it. The code to
split the index needs to be able to open the index in the first place, and
an index that has gone past the hard limit may not open at all.
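A first diagnostic step is usually to see whether Lucene can still read the
segment metadata at all. Here is a minimal sketch using Lucene's CheckIndex;
the path is a placeholder, and since the data lives on HDFS you would first
have to copy the index directory to local disk (never run this against the
live directory Solr is using, since CheckIndex takes the write lock):

import java.nio.file.Paths;

import org.apache.lucene.index.CheckIndex;
import org.apache.lucene.store.FSDirectory;

public class InspectIndex {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at a *copy* of the broken index.
        try (FSDirectory dir = FSDirectory.open(Paths.get("/tmp/index-copy"));
             CheckIndex checker = new CheckIndex(dir)) {
            // Walks every segment and reports per-segment doc counts,
            // which tells you how far past the limit the index went.
            CheckIndex.Status status = checker.checkIndex();
            System.out.println("clean=" + status.clean
                    + ", segments=" + status.numSegments);
        }
    }
}

If even that fails to open the segments, recovery options narrow considerably.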
On 8/7/2017 9:41 AM, Wael Kader wrote:
> I am facing an issue that is making me go crazy.
> I am running Solr, saving its data on HDFS, and I have a single-node setup
> with an index that has been running fine until today.
> I know that 2 billion documents is too much for a single node, but it has
> been running fine for my requirements.
You have hit the maximum number of docs in a single shard.
If I'm not mistaken, the only solution is to split the index into more shards
(if you are running in SolrCloud mode).
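For reference, in SolrCloud that split is done with the Collections API
SPLITSHARD action. A minimal SolrJ sketch, assuming a 6.x-era SolrJ and
placeholder names for the ZooKeeper host, collection, and shard; note that
SPLITSHARD itself has to open and read the parent shard's index, so it may
well hit the same problem on an index that is already over the limit:

import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;
import org.apache.solr.client.solrj.response.CollectionAdminResponse;

public class SplitShardSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper address for the SolrCloud cluster.
        try (CloudSolrClient client =
                new CloudSolrClient.Builder().withZkHost("zk1:2181").build()) {
            // Asks the overseer to split shard1 of "mycollection" into two
            // sub-shards; the parent shard is kept and must be removed
            // manually once the split succeeds.
            CollectionAdminResponse rsp = CollectionAdminRequest
                    .splitShard("mycollection")
                    .setShardName("shard1")
                    .process(client);
            System.out.println(rsp);
        }
    }
}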
--
Yago Riveiro
On 7 Aug 2017, 16:48 +0100, Wael Kader wrote:

> Hello,
>
> I am facing an issue that is making me go crazy.
> I am running Solr, saving its data on HDFS, and I have a single-node setup
> with an index that has been running fine until today.
Hello,

I am facing an issue that is making me go crazy.
I am running Solr, saving its data on HDFS, and I have a single-node setup
with an index that has been running fine until today.
I know that 2 billion documents is too much for a single node, but it has
been running fine for my requirements.
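For context on the hard limit discussed above: Lucene addresses documents
within an index with a signed 32-bit integer, and IndexWriter refuses to add
documents past IndexWriter.MAX_DOCS (Integer.MAX_VALUE - 128, roughly 2.147
billion) per index, which in Solr terms means per shard/core. A one-liner to
print the constant, assuming lucene-core on the classpath:

import org.apache.lucene.index.IndexWriter;

public class DocLimit {
    public static void main(String[] args) {
        // Lucene's hard per-index document cap: Integer.MAX_VALUE - 128.
        System.out.println("MAX_DOCS = " + IndexWriter.MAX_DOCS);
    }
}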