Dear Erick,
Could you please name the problems that SolrCloud cannot tackle on its own?
Maybe I need SolrCloud + Hadoop and I am not aware of that yet.
Regards.
On Thu, Aug 7, 2014 at 7:37 PM, Erick Erickson
wrote:
If SolrCloud meets your needs, without Hadoop, then
there's no real reason to introduce the added complexity.
There are a bunch of problems that do _not_ work
well with SolrCloud over non-Hadoop file systems. For
those problems, the combination of SolrCloud and Hadoop
makes tackling them possible.
Thank you very much. But why should we go for Solr distributed with Hadoop?
There is already SolrCloud, which is quite applicable in the case of a big
index. Is there any advantage to sending indexes over MapReduce that
SolrCloud cannot provide?
Regards.
On Wed, Aug 6, 2014 at 9:09 PM, Erick Erickson wrote:
bq: Are you aware of Cloudera search? I know they provide an integrated
Hadoop ecosystem.
What Cloudera Search does via the MapReduceIndexerTool (MRIT) is create N
sub-indexes for
each shard in the M/R paradigm via EmbeddedSolrServer. Eventually, these
sub-indexes for
each shard are merged (perhap
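For reference, here is a bare-bones sketch of writing a local index through
EmbeddedSolrServer with the Solr 4.x SolrJ API. This is not the MRIT code
itself; the solr home path and core name are placeholders you would replace
with your own.

import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.core.CoreContainer;

import java.io.IOException;

public class EmbeddedIndexerSketch {
    public static void main(String[] args) throws IOException, SolrServerException {
        // Placeholder solr home containing solr.xml plus the core's conf/ directory.
        CoreContainer container = new CoreContainer("/path/to/solr/home");
        container.load();

        // Writes into a local core; MRIT does something similar inside its
        // reducers before the per-shard merge step.
        EmbeddedSolrServer server = new EmbeddedSolrServer(container, "collection1");
        try {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-1");
            doc.addField("title", "example");
            server.add(doc);
            server.commit();
        } finally {
            server.shutdown(); // also shuts down the CoreContainer in 4.x
        }
    }
}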
Dear Erick,
I remember that some time ago somebody asked what the point of modifying
Solr to use HDFS for storing indexes is. As far as I remember, somebody
told him that integrating Solr with HDFS has two advantages: 1) having
Hadoop replication and HA; 2) using the indexes and Solr documents for other purposes
Dear Erick,
Hi,
Thank you for your reply. Yes, I am aware that SolrJ is my last option. I
was thinking about raw I/O operations, so according to your reply that is
probably not applicable. What about the Lily project that Michael
mentioned? Does that count as SolrJ too? Are you aware of Cloudera Search?
I know they provide an integrated Hadoop ecosystem.
What you haven't told us is what you mean by "modify the
index outside Solr". SolrJ? Using raw Lucene? Trying to modify
things by writing your own codec? Standard Java I/O operations?
Other?
You could use SolrJ to connect to an existing Solr server and
both read and modify at will from your M/R job.
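Something like this is what I mean; it is just a sketch against the Solr 4.x
SolrJ API, and the URL, collection, and the analysis_score field are made-up
placeholders for illustration.

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrInputDocument;

import java.io.IOException;
import java.util.Collections;

public class SolrJUpdateSketch {
    public static void main(String[] args) throws SolrServerException, IOException {
        // Placeholder URL and collection name.
        HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/collection1");
        try {
            // Read a batch of documents.
            QueryResponse rsp = solr.query(new SolrQuery("*:*").setRows(100));
            for (SolrDocument found : rsp.getResults()) {
                // Atomic update: change one field without resending the whole document.
                // Requires _version_ plus the updateLog, and all fields stored.
                SolrInputDocument update = new SolrInputDocument();
                update.addField("id", found.getFieldValue("id"));
                update.addField("analysis_score",               // hypothetical field
                        Collections.singletonMap("set", 42));
                solr.add(update);
            }
            solr.commit();
        } finally {
            solr.shutdown();
        }
    }
}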
Actually I am going to do some analysis on the Solr data using MapReduce.
For this purpose it might be necessary to change some parts of the data or
add new fields from outside Solr.
On Tue, Aug 5, 2014 at 5:51 PM, Shawn Heisey wrote:
> On 8/5/2014 7:04 AM, Ali Nazemian wrote:
> > I changed solr 4.9 t
Probably the "most correct" way to modify the index would be to use the
Solr REST API to push your changes out.
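As a rough sketch (not tied to your setup; the host, core name, and the
analysis_score field below are invented for the example), pushing an atomic
update through the JSON update endpoint from plain Java could look like this:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestUpdateSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder host/core; commit=true is fine for a demo, batch commits in real jobs.
        URL url = new URL("http://localhost:8983/solr/collection1/update?commit=true");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // Atomic update: only the hypothetical analysis_score field changes on doc-1.
        String body = "[{\"id\":\"doc-1\",\"analysis_score\":{\"set\":42}}]";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}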
Another thing you might want to look at is Lily. Basically it's a way to
set up a Solr collection as an HBase replication target, so changes to your
HBase table would automatically propagate
On 8/5/2014 7:04 AM, Ali Nazemian wrote:
> I changed Solr 4.9 to write the index and data on HDFS. Now I am going to
> connect to that data from outside Solr to change some of the values. Could
> somebody please tell me how that is possible? Suppose I am using HBase
> over HDFS to do these