On 6/3/2014 1:47 PM, Jack Krupansky wrote:
> Anybody care to forecast when hardware will catch up with Solr and we
> can routinely look forward to newbies complaining that they indexed
> "some" data and after only 10 minutes they hit this weird 2G document
> count limit?
I would speculate that Lucene will update its index format to use 64-bit (or perhaps VInt) document identifiers before a "typical" (or test) data source presents a problem like this.

I also seriously doubt it would be a complete newbie. It's far more likely to be an experienced admin or developer, who would know instantly why it happened as soon as they saw how many documents had been indexed successfully. They might just need someone to point them at SolrCloud shards as the solution.

I've already heard of a production SolrCloud install with five billion documents, so even now it's not a purely theoretical problem.

Thanks,
Shawn
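
P.S. In case it helps to make the sharding point concrete: each SolrCloud shard is a separate Lucene index, and a single Lucene index tops out at 2^31 - 1 (about 2.14 billion) documents because its internal doc IDs are 32-bit ints. Five billion documents therefore needs at least three shards just to stay under that hard ceiling, and in practice you'd want more for headroom and query performance. A rough sketch with the Collections API (the collection name, shard count, and host here are placeholders, and you'd still need a suitable configset in ZooKeeper):

  # spread the documents over several shards so no single Lucene
  # index approaches the 2^31 - 1 per-index document ceiling
  curl "http://localhost:8983/solr/admin/collections?action=CREATE&name=bigcollection&numShards=4&replicationFactor=2"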