Solr index backup and restore of large indexes
Hi,

We are loading approximately 1 TB of index data daily. Please let me know the best procedure to back up and restore the indexes. I am using Solr 4.2.

Thanks & Regards
Sandeep A
Ext : 02618-2856
M : 0502493820

The content of this email together with any attachments, statements and opinions expressed herein contains information that is private and confidential and is intended for the named addressee(s) only. If you are not the addressee of this email you may not copy, forward, disclose or otherwise use it or any part of it in any form whatsoever. If you have received this message in error please notify postmas...@etisalat.ae by email immediately and delete the message without making any copies.
Solr 4.2 Incremental backups
Hi,

Is there any option to do incremental backups in Solr 4.2?

Thanks & Regards
Sandeep A
Ext : 02618-2856
M : 0502493820
SOLR 4.2 SolrQuery exception
I am using the below code and getting the exception while using SolrQuery.

Mar 24, 2013 3:08:07 PM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener sending requests to Searcher@795e0c2b main{StandardDirectoryReader(segments_49:524 _4v(4.2):C299313 _4x(4.2):C2953/1396 _4y(4.2):C2866/1470 _4z(4.2):C4263/2793 _50(4.2):C3554/761 _51(4.2):C1126/365 _52(4.2):C650/285 _53(4.2):C500/215 _54(4.2):C1808/1593 _55(4.2):C1593)}
Mar 24, 2013 3:08:07 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.NullPointerException
	at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:181)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:1797)
	at org.apache.solr.core.QuerySenderListener.newSearcher(QuerySenderListener.java:64)
	at org.apache.solr.core.SolrCore$5.call(SolrCore.java:1586)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:679)
Mar 24, 2013 3:08:07 PM org.apache.solr.core.SolrCore execute
INFO: [collection1] webapp=null path=null params={event=firstSearcher&q=static+firstSearcher+warming+in+solrconfig.xml&distrib=false} status=500 QTime=4
Mar 24, 2013 3:08:07 PM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener done.
Mar 24, 2013 3:08:07 PM org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener newSearcher
INFO: Loading spell index for spellchecker: default
Mar 24, 2013 3:08:07 PM org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener newSearcher
INFO: Loading spell index for spellchecker: wordbreak
Mar 24, 2013 3:08:07 PM org.apache.solr.core.SolrCore registerSearcher
INFO: [collection1] Registered new searcher Searcher@795e0c2b main{StandardDirectoryReader(segments_49:524 _4v(4.2):C299313 _4x(4.2):C2953/1396 _4y(4.2):C2866/1470 _4z(4.2):C4263/2793 _50(4.2):C3554/761 _51(4.2):C1126/365 _52(4.2):C650/285 _53(4.2):C500/215 _54(4.2):C1808/1593 _55(4.2):C1593)}
Mar 24, 2013 3:08:07 PM org.apache.solr.core.CoreContainer registerCore
INFO: registering core: collection1

server value - org.apache.solr.client.solrj.embedded.EmbeddedSolrServer@3a32ea4
query value - q=smstext%3AEMIRATES&rows=50

Mar 24, 2013 3:08:07 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.NullPointerException
	at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:181)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:1797)
	at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:150)
	at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:90)
	at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
	at SolrQueryResult.solrQuery(SolrQueryResult.java:31)
	at SolrQueryResult.main(SolrQueryResult.java:65)
Mar 24, 2013 3:08:07 PM org.apache.solr.core.SolrCore execute
INFO: [collection1] webapp=null path=/select params={q=smstext%3AEMIRATES&rows=50} status=500 QTime=0
org.apache.solr.client.solrj.SolrServerException: org.apache.solr.client.solrj.SolrServerException: java.lang.NullPointerException
	at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:223)
	at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:90)
	at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
	at SolrQueryResult.solrQuery(SolrQueryResult.java:31)
	at SolrQueryResult.main(SolrQueryResult.java:65)
Caused by: org.apache.solr.client.solrj.SolrServerException: java.lang.NullPointerException
	at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:155)
	... 4 more
Caused by: java.lang.NullPointerException
	at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:181)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:1797)
	at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:150)
	... 4 more

try {
    String SOLR_HOME = "/data/solr1/example/solr/";
    CoreContainer coreContainer = new CoreContainer(SOLR_HOME);
    CoreDescriptor discriptor = new CoreDescriptor(coreContainer, "collection1",
            new File(SOLR_HOME).getAbsolutePath());
RE: SOLR 4.2 SolrQuery exception
Hi,

I managed to resolve this issue and I am getting the results also. But this time I am getting a different exception while loading the Solr container. Here is the code:

String SOLR_HOME = "/data/solr1/example/solr/collection1";
CoreContainer coreContainer = new CoreContainer(SOLR_HOME);
CoreDescriptor discriptor = new CoreDescriptor(coreContainer, "collection1",
        new File(SOLR_HOME).getAbsolutePath());
SolrCore solrCore = coreContainer.create(discriptor);
coreContainer.register(solrCore, false);
File home = new File(SOLR_HOME);
File f = new File(home, "solr.xml");
coreContainer.load(SOLR_HOME, f);
server = new EmbeddedSolrServer(coreContainer, "collection1");
SolrQuery q = new SolrQuery();

Parameters inside solrconfig.xml: <lockType>simple</lockType>, <unlockOnStartup>true</unlockOnStartup>

WARNING: Unable to get IndexCommit on startup
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: SimpleFSLock@/data/solr1/example/solr/collection1/./data/index/write.lock
	at org.apache.lucene.store.Lock.obtain(Lock.java:84)
	at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:636)
	at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:77)
	at org.apache.solr.update.SolrIndexWriter.create(SolrIndexWriter.java:64)
	at org.apache.solr.update.DefaultSolrCoreState.createMainIndexWriter(DefaultSolrCoreState.java:192)
	at org.apache.solr.update.DefaultSolrCoreState.getIndexWriter(DefaultSolrCoreState.java:106)
	at org.apache.solr.handler.ReplicationHandler.inform(ReplicationHandler.java:904)
	at org.apache.solr.core.SolrResourceLoader.inform(SolrResourceLoader.java:592)
	at org.apache.solr.core.SolrCore.<init>(SolrCore.java:801)
	at org.apache.solr.core.SolrCore.<init>(SolrCore.java:619)
	at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:1021)
	at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
	at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
	at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:679)

-----Original Message-----
From: Sandeep Kumar Anumalla
Sent: 24 March, 2013 03:44 PM
To: solr-user@lucene.apache.org
Subject: SOLR 4.2 SolrQuery exception
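For readers hitting the same LockObtainFailedException: the code above likely initializes the core twice — once via coreContainer.create()/register() and again via coreContainer.load() — so two IndexWriters end up competing for the same write.lock; dropping one of the two initialization paths avoids it. The failure mode itself can be sketched with plain java.nio file locking (an illustration only, not Solr's actual SimpleFSLock implementation; the class and method names here are made up for the example):

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.channels.OverlappingFileLockException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class WriteLockDemo {
    // Try to take an exclusive lock on write.lock, as one IndexWriter
    // would. Returns null if the lock is already held (by another
    // process, or by this JVM via the OverlappingFileLockException path).
    static FileLock tryAcquire(Path lockFile) throws IOException {
        FileChannel ch = FileChannel.open(lockFile,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE);
        try {
            FileLock lock = ch.tryLock();
            if (lock == null) ch.close();
            return lock;
        } catch (OverlappingFileLockException alreadyHeldHere) {
            ch.close();
            return null;
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("index");
        Path lockFile = dir.resolve("write.lock");
        FileLock first = tryAcquire(lockFile);   // first core init wins
        FileLock second = tryAcquire(lockFile);  // second init loses
        System.out.println(first != null);       // true
        System.out.println(second == null);      // true: lock already held
        first.release();
    }
}
```

The second acquisition failing is exactly what the "Lock obtain timed out: SimpleFSLock@.../write.lock" message reports, except Solr retries with a timeout before giving up.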
RE: Solr index backup and restore of large indexes
Hi,

I am exploring all the possible options. We want to distribute 1 TB of traffic among 3 Solr shards (masters) and 3 corresponding Solr slaves.

- Initially I used a master/slave setup. But in this case the traffic rate on the master is very high, and because of this we are facing the below issue while replicating to the slave:

SnapPull failed
SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to download _xv0_Lucene41_0.doc completely. Downloaded 0!=5935

In this case the slave machine also has to have the same hardware and software configuration as the master; this seems to be more expensive.

- Then I decided to use multiple Solr instances on a single machine, accessed via EmbeddedSolrServer, and planned to query all these instances to get the required result. In this case there is no need for a slave machine; we just need to take the backup, and we can store it on any external hard disks. Here there are 2 issues I am facing:

1. Loading is not as fast as compared to the database.
2. How to take an incremental backup? Meaning I don't want to take a full backup every time.

Thanks
Sandeep A

-----Original Message-----
From: Joel Bernstein [mailto:joels...@gmail.com]
Sent: 28 March, 2013 04:51 AM
To: solr-user@lucene.apache.org
Subject: Re: Solr index Backup and restore of large indexes

Hi,

Are you running Solr Cloud or Master/Slave? I'm assuming with 1TB a day you're sharding.

With master/slave you can configure incremental index replication to another core. The backup core can be local on the server, on a separate server or in a separate data center.

With Solr Cloud, replicas can be set up to automatically have redundant copies of the index. These copies, though, are live copies and will handle queries. Replicating data to a separate data center is typically not done through Solr Cloud replication.

Joel

On Mon, Mar 25, 2013 at 11:43 PM, Otis Gospodnetic <otis.gospodne...@gmail.com> wrote:
> Hi,
>
> Try something like this: http://host/solr/replication?command=backup
>
> See: http://wiki.apache.org/solr/SolrReplication
>
> Otis
> --
> Solr & ElasticSearch Support
> http://sematext.com/
>
> On Thu, Mar 21, 2013 at 3:23 AM, Sandeep Kumar Anumalla wrote:
> >
> > Hi,
> >
> > We are loading approximately 1 TB of index data daily. Please let me know
> > the best procedure to take backup and restore of the indexes. I am using
> > Solr 4.2.
> >
> > Thanks & Regards
> > Sandeep A
> > Ext : 02618-2856
> > M : 0502493820

--
Joel Bernstein
Professional Services
LucidWorks
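Otis's suggestion maps to the ReplicationHandler's HTTP API: an HTTP GET to a core's /replication endpoint with command=backup snapshots the current index commit on the master. A minimal sketch of building the request URL (host, port, and core name are illustrative assumptions, not the poster's actual setup):

```java
public class ReplicationBackup {

    // Build the ReplicationHandler backup URL suggested in the thread:
    // http://host/solr/replication?command=backup
    static String backupUrl(String host, int port, String core) {
        return "http://" + host + ":" + port + "/solr/" + core
                + "/replication?command=backup";
    }

    public static void main(String[] args) {
        String url = backupUrl("localhost", 8983, "collection1");
        System.out.println(url);
        // Issuing a GET to this URL (e.g. via HttpURLConnection) triggers
        // an asynchronous snapshot into the core's data directory; it
        // needs a running Solr instance, so the request is not made here.
    }
}
```

The snapshot lands in the core's data directory as a snapshot.* folder; 4.x replication also accepts a numberToKeep parameter on the backup command to cap how many snapshots are retained, though that is from the SolrReplication wiki rather than this thread.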
RE: Solr index backup and restore of large indexes
Please update?

-----Original Message-----
From: Sandeep Kumar Anumalla
Sent: 31 March, 2013 12:08 PM
To: solr-user@lucene.apache.org
Cc: 'Joel Bernstein'
Subject: RE: Solr index Backup and restore of large indexes
RE: Solr 4.2 Incremental backups
Hi Erick,

My main point is that if I use replication I have to use a similar kind of setup (hardware, storage space) as the master, which is not cost effective. That is why I am looking at incremental backup options, so that I can keep these backups on media like external hard disks or tapes.

Moreover, when I am using replication we are facing the below issue while replicating to the slave:

SnapPull failed
SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to download _xv0_Lucene41_0.doc completely. Downloaded 0!=5935

Thanks

-----Original Message-----
From: Erick Erickson [mailto:erickerick...@gmail.com]
Sent: 25 March, 2013 07:11 PM
To: solr-user@lucene.apache.org
Subject: Re: Solr 4.2 Incremental backups

That's essentially what replication does, only backs up parts of the index that have changed. However, when segments merge, that might mean the entire index needs to be replicated.

Best
Erick

On Sun, Mar 24, 2013 at 12:08 AM, Sandeep Kumar Anumalla <sanuma...@etisalat.ae> wrote:
> Hi,
>
> Is there any option to do Incremental backups in Solr 4.2?
>
> Thanks & Regards
> Sandeep A
> Ext : 02618-2856
> M : 0502493820
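Erick's point suggests a file-level alternative for backups to cheap media: Lucene segment files are write-once, so an incremental copy only needs the files the backup directory does not yet have (after a big merge most files are new, which matches the caveat above). A rough sketch with plain java.nio, under the assumption that it runs against a quiesced index directory (e.g. right after a commit) and that an existing file name implies identical content:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class IncrementalBackup {
    // Copy index files the backup does not have yet. Segment files are
    // write-once, so a name already present is assumed unchanged; the
    // live write.lock is skipped. Mutable bookkeeping files (e.g. an
    // old-style segments.gen) would need re-copying in a fuller version.
    static int sync(Path index, Path backup) throws IOException {
        Files.createDirectories(backup);
        int copied = 0;
        try (Stream<Path> files = Files.list(index)) {
            for (Path src : (Iterable<Path>) files::iterator) {
                String name = src.getFileName().toString();
                if (name.equals("write.lock")) continue;
                Path dst = backup.resolve(name);
                if (Files.notExists(dst)) {
                    Files.copy(src, dst);
                    copied++;
                }
            }
        }
        return copied;
    }
}
```

Run nightly, this copies only the segments added since the previous run; restoring is just copying the backup directory back into data/index on a stopped instance. It does not prune files that merges made obsolete, so the backup grows until cleaned.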
RE: SnapPull failed - SOLR 4.1
SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to download _xv0_Lucene41_0.doc completely. Downloaded 0!=5935

We are continuously getting the above exception on our replication (slave) machine.

I tried the compression option and increased the bandwidth between master and slave, but I am still facing the issue.

Please let me know how to resolve this issue.

Thanks & Regards
Sandeep A
Ext : 02618-2856
M : 0502493820
RE: SnapPull failed - SOLR 4.1
I am new to Solr. I did a setup using Solr 4.1.0, with master and replication (slave) on different machines. But on the slave machine I am getting the below exception:

SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to download _8kj.fnm completely. Downloaded 0!=724

When I checked this exception, some users were suggesting upgrading Solr to 4.2.0. I have checked the change log of Solr 4.2.0 and it has a fix for this:

SOLR-4032: Files larger than an internal buffer size fail to replicate. (Mark Miller, Markus Jelsma)

So I want to upgrade Solr from 4.1 to 4.2. Please guide me how to do this.

-----Original Message-----
From: Mark Miller [mailto:markrmil...@gmail.com]
Sent: 18 March, 2013 10:25 AM
To: solr-user@lucene.apache.org
Subject: Re: SnapPull failed - SOLR 4.1

This is probably related to some replication bugs that 4.1 had - 4.2 is probably your best bet for a fix.

- Mark

On Mar 18, 2013, at 1:48 AM, Sandeep Kumar Anumalla wrote:
> SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to
> download _xv0_Lucene41_0.doc completely. Downloaded 0!=5935
>
> We are continuously getting the above exception on our replication (slave)
> machine.
>
> I tried the compression option and increased the bandwidth between master
> and slave, but I am still facing the issue.
>
> Please let me know how to resolve this issue.
>
> Thanks & Regards
> Sandeep A
> Ext : 02618-2856
> M : 0502493820
RE: SnapPull failed - SOLR 4.1
Hi Mark,

I have upgraded to Solr 4.2 and I am still getting this exception:

INFO: removing temporary index download directory files NRTCachingDirectory(org.apache.lucene.store.MMapDirectory@/data/solr-4.2.0/example/solr/collection1/data/index.20130319101506108 lockFactory=org.apache.lucene.store.SimpleFSLockFactory@47042c25; maxCacheMB=48.0 maxMergeSizeMB=4.0)
Mar 19, 2013 10:19:02 AM org.apache.solr.common.SolrException log
SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to download _1l1.fdt completely. Downloaded 0!=256950

Thanks
Sandeep A.

-----Original Message-----
From: Mark Miller [mailto:markrmil...@gmail.com]
Sent: 18 March, 2013 10:25 AM
To: solr-user@lucene.apache.org
Subject: Re: SnapPull failed - SOLR 4.1

This is probably related to some replication bugs that 4.1 had - 4.2 is probably your best bet for a fix.

- Mark

On Mar 18, 2013, at 1:48 AM, Sandeep Kumar Anumalla wrote:
> SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable to
> download _xv0_Lucene41_0.doc completely. Downloaded 0!=5935
RE: SnapPull failed - SOLR 4.1
No. Is this something to do with the commit timing on the master? Currently I am doing an explicit commit from the Java (SolrJ) program every 5 minutes after loading data into the master, and the load on the master is also huge.

Thanks
Sandeep A

-----Original Message-----
From: Mark Miller [mailto:markrmil...@gmail.com]
Sent: 19 March, 2013 07:18 PM
To: solr-user@lucene.apache.org
Subject: Re: SnapPull failed - SOLR 4.1

Any exceptions on the master?

- Mark

On Mar 19, 2013, at 2:21 AM, Sandeep Kumar Anumalla wrote:
> Hi Mark,
>
> I have upgraded to Solr 4.2 and I am still getting this exception:
>
> SEVERE: SnapPull failed :org.apache.solr.common.SolrException: Unable
> to download _1l1.fdt completely. Downloaded 0!=256950