solr-user
Subject: Re: Help needed in breaking large index file into smaller ones
Hi Erick,

It's due to some past issues observed with joins on Solr 4, which hit OOM
when joining against large indexes after optimization/compaction. If those
are stored as smaller files, they fit into memory and the joins operate.
Why do you have a requirement that the indexes be < 4 GB? If it's
arbitrarily imposed, why bother?

Or is it a non-negotiable requirement imposed by the platform you're on?
Because just splitting the files into a smaller set won't help you if
you then start to index into it; the merge process will just produce
large segments again.
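If the real constraint is a cap on individual segment file size, the merge policy can be told not to produce merged segments above a threshold, so new indexing doesn't grow them back past 4 GB. A sketch for solrconfig.xml, assuming a Solr version that supports `mergePolicyFactory` (Solr 6.x+); note that on older versions an explicit optimize may still merge past this cap:

```xml
<!-- In solrconfig.xml: ask TieredMergePolicy not to create merged
     segments larger than ~4 GB (the value is in megabytes). -->
<indexConfig>
  <mergePolicyFactory class="org.apache.solr.index.TieredMergePolicyFactory">
    <int name="maxMergedSegmentMB">4096</int>
  </mergePolicyFactory>
</indexConfig>
```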
Can you provide more information about:

- Are you using Solr in standalone or SolrCloud mode? What version of Solr?
- Why do you want this? Lack of disk space? Uneven distribution of data on
shards?
- Do you want this data together, i.e. as part of a single collection?

You can check out the following:
Perhaps you can copy this index into two separate locations, delete the
odd-numbered docs from one copy and the even-numbered docs from the other,
and then force-merge each location to a single segment separately.

Perhaps shard splitting in SolrCloud does something like that.
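In SolrCloud, the Collections API exposes this as SPLITSHARD, which divides a shard's hash range into two sub-shards. Illustrative commands only; the collection name, shard name, and host are hypothetical:

```shell
# Split shard1 of collection "mycoll" into two sub-shards (SolrCloud only).
curl 'http://localhost:8983/solr/admin/collections?action=SPLITSHARD&collection=mycoll&shard=shard1'

# Optionally force-merge each resulting core down to a single segment:
curl 'http://localhost:8983/solr/mycoll/update?optimize=true&maxSegments=1'
```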
On Mon, Jan 9, 2017 at 1:12 PM, Narsimh
Does this really work for Lucene index files?

Thanks,
Manan Sheth

From: Moenieb Davids
Sent: Monday, January 9, 2017 7:36 PM
To: solr-user@lucene.apache.org
Subject: RE: Help needed in breaking large index file into smaller ones

Hi,

Try split on Linux or Unix:

split -l 100 originalfile.csv
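For what it's worth, `split -l` is line-oriented: it is useful for chopping up text inputs like CSV source data before indexing, but not for Lucene's binary segment files, which would just become unreadable byte chunks. A quick illustration with a hypothetical four-line CSV:

```shell
# split -l divides a text file by line count; a 4-line CSV becomes
# two 2-line pieces named part_aa and part_ab.
printf 'a,1\nb,2\nc,3\nd,4\n' > originalfile.csv
split -l 2 originalfile.csv part_
wc -l part_aa part_ab
```

Concatenating the pieces back together reproduces the original file exactly, which is precisely what would not hold in any useful sense for a split Lucene segment.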
To: solr-user@lucene.apache.org
Subject: Help needed in breaking large index file into smaller ones

Hi All,

My Solr server has a few large index files (~10 GB each). I am looking
for some help on breaking them into smaller ones (each < 4 GB) to satisfy
my application requirements. Are there any such tools available?

Appreciate your help.

Thanks,
NRC