Solr jams after all JVM thread pool threads hang in BLOCKED state
I've been running Solr 1.3 in production for a year now and never had any problem with it until two weeks ago. It now happens 6-7 times a day: all of my threads but one end up in a BLOCKED state. All of the blocked threads are waiting on the console handler monitor owned by the one RUNNABLE thread. We did not change anything on the application / server. I have monitored the thread count and there is no accumulation of threads during the periods when Solr is OK. The problem doesn't seem to be related to a high query load, since it also happens during low-load periods. Anyone got a clue what is going on?
Re: Solr jams after all JVM thread pool threads hang in BLOCKED state
Thread dump. I've got about 240 threads like this:

"http-8080-Processor222" daemon prio=10 tid=0x7fe36c010c00 nid=0x1e94 waiting for monitor entry [0x4caa6000..0x4caa6d20]
   java.lang.Thread.State: BLOCKED (on object monitor)
        at java.util.logging.StreamHandler.publish(StreamHandler.java:174)
        - waiting to lock <0x7fe37e72b340> (a java.util.logging.ConsoleHandler)
        at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:88)
        at java.util.logging.Logger.log(Logger.java:472)
        at java.util.logging.Logger.doLog(Logger.java:494)
        at java.util.logging.Logger.log(Logger.java:517)
        at java.util.logging.Logger.info(Logger.java:1036)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1212)
        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:303)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:232)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:874)
        at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
        at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
        at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
        at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689)
        at java.lang.Thread.run(Thread.java:619)

And here is the one running thread:

"http-8080-Processor156" daemon prio=10 tid=0x00df2000 nid=0x1e52 runnable [0x44521000..0x44521c20]
   java.lang.Thread.State: RUNNABLE
        at java.io.FileOutputStream.writeBytes(Native Method)
        at java.io.FileOutputStream.write(FileOutputStream.java:260)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
        - locked <0x7fe37e3abcd8> (a java.io.BufferedOutputStream)
        at java.io.PrintStream.write(PrintStream.java:430)
        - locked <0x7fe37e3abca0> (a java.io.PrintStream)
        at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:202)
        at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:272)
        at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:276)
        at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:122)
        - locked <0x7fe37e72cd90> (a java.io.OutputStreamWriter)
        at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:212)
        at java.util.logging.StreamHandler.flush(StreamHandler.java:225)
        - locked <0x7fe37e72b340> (a java.util.logging.ConsoleHandler)
        at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:89)
        at java.util.logging.Logger.log(Logger.java:472)
        at java.util.logging.Logger.doLog(Logger.java:494)
        at java.util.logging.Logger.log(Logger.java:517)
        at java.util.logging.Logger.info(Logger.java:1036)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1212)
        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:303)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:232)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:874)
        at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
        at org.apache.tomca
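The dumps show the contention point: every request thread logs an INFO line from SolrCore.execute, and all of them queue up behind the single java.util.logging ConsoleHandler lock while one thread flushes to stdout. A common remedy is to stop routing per-request INFO logging to the console. A minimal sketch of doing that programmatically at startup, assuming the stock java.util.logging setup (in practice the same thing is usually done in logging.properties, e.g. java.util.logging.ConsoleHandler.level = WARNING; the class and method names below are illustrative only):

    import java.util.logging.ConsoleHandler;
    import java.util.logging.Handler;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    // Hypothetical startup hook: raise the ConsoleHandler threshold so the
    // per-request INFO lines from SolrCore.execute() no longer funnel every
    // HTTP thread through the console lock.
    public class QuietConsoleLogging {
        public static void apply() {
            Logger root = Logger.getLogger("");            // JUL root logger
            for (Handler h : root.getHandlers()) {
                if (h instanceof ConsoleHandler) {
                    h.setLevel(Level.WARNING);             // drop INFO from the console
                }
            }
            // Alternatively, quiet only Solr's chatty request logging:
            Logger.getLogger("org.apache.solr.core.SolrCore").setLevel(Level.WARNING);
        }
    }

If the console output (catalina.out in a stock Tomcat setup) sits on a slow device, that single flush is what every other thread ends up waiting on, so moving it to a fast local file or a file handler has a similar effect.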
Delta import
I'm using the delta-import command. Here are the deltaQuery and deltaImportQuery I use:

    deltaQuery:       select uid from profil_view where last_modified > '${dataimporter.last_index_time}'
    deltaImportQuery: select * from profil_view where uid='${dataimporter.delta.uid}'

When I look at the delta-import status, I see that the total number of requests to the datasource equals the number of modifications I had. Is it possible to make only one request to the database and fetch all the modifications? Something like:

    select * from profil_view where uid in ('${dataimporter.delta.ALLuid}')
Re: Delta import
OK, I can live with the fact that Solr is going to make X requests to the database for X updates, but when I run the delta-import command with 2 rows to update, is it normal that it is really slow, around 1 document fetched per second?

Noble Paul നോബിള് नोब्ळ् wrote:
>
> not possible really,
> that may not be useful to a lot of users because there may be too many
> changed ids and the 'IN' part can be really long.
> You can raise an issue anyway
>
> On Mon, Mar 23, 2009 at 9:30 PM, AlexxelA wrote:
>>
>> I'm using the delta-import command.
>>
>> Here's the deltaQuery and deltaImportQuery i use :
>>
>> select uid from profil_view where last_modified > '${dataimporter.last_index_time}'
>> select * from profil_view where uid='${dataimporter.delta.uid}
>>
>> When i look at the delta import status i see that the total request to
>> datasource equal the number of modification i had. Is it possible to make
>> only one request to database and fetch all modification ?
>>
>> select * from profil_view where uid in ('${dataimporter.delta.ALLuid}')
>> (something like that).
>
> --
> --Noble Paul
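If the one-query-per-id pattern ever does become the limiting factor, the same effect can be had outside DIH: fetch all changed rows in one statement and push them with SolrJ. A rough sketch under that assumption, using the profil_view table from the previous message (the URL, credentials, column list and field names are placeholders, and the timestamp would come from wherever the last run is tracked):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class BatchedDeltaIndexer {
        public static void main(String[] args) throws Exception {
            SolrServer solr = new CommonsHttpSolrServer("http://localhost:8080/solr"); // placeholder URL
            Class.forName("com.mysql.jdbc.Driver");
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://dbhost/mydb", "user", "pass");                       // placeholder DSN
            Statement st = con.createStatement();
            // One round trip instead of one deltaImportQuery per changed uid:
            // pull every row modified since the last run in a single statement.
            ResultSet rs = st.executeQuery(
                    "select uid, name, status from profil_view "                       // column list is illustrative
                    + "where last_modified > '2009-03-25 15:00:00'");
            List<SolrInputDocument> docs = new ArrayList<SolrInputDocument>();
            while (rs.next()) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("uid", rs.getString("uid"));
                doc.addField("name", rs.getString("name"));
                doc.addField("status", rs.getInt("status"));
                docs.add(doc);
            }
            rs.close();
            st.close();
            con.close();
            if (!docs.isEmpty()) {
                solr.add(docs);    // send the whole batch in one update request
                solr.commit();
            }
        }
    }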
Re: Delta import
Yes, my database is remote, MySQL 5, and I'm using Connector/J 5.1.7. My index has 2 documents. When I try to do, let's say, 14 updates, it takes about 18 seconds total. Here's the resulting log of the operation:

2009-03-25 15:53:57 org.apache.solr.handler.dataimport.JdbcDataSource$1 call
INFO: Time taken for getConnection(): 411
2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
INFO: Completed ModifiedRowKey for Entity: profil rows obtained : 14
2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
INFO: Completed DeletedRowKey for Entity: profil rows obtained : 0
2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
INFO: Completed parentDeltaQuery for Entity: profil
2009-03-25 15:54:00 org.apache.solr.core.SolrDeletionPolicy onInit
INFO: SolrDeletionPolicy.onInit: commits:num=1
        commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sb,version=1237322897338,generation=1019,filenames=[_uj.frq, _uj.fdx, _uj.tii, _uj.nrm, _uj.tis, _uj.fnm, _uj.prx, segments_sb, _uj.fdt]
2009-03-25 15:54:00 org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: last commit = 1237322897338
2009-03-25 15:54:13 org.apache.solr.handler.dataimport.DocBuilder doDelta
INFO: Delta Import completed successfully   <-- BOTTLENECK
2009-03-25 15:54:13 org.apache.solr.handler.dataimport.DocBuilder commit
INFO: Full Import completed successfully
2009-03-25 15:54:13 org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start commit(optimize=true,waitFlush=false,waitSearcher=true)
2009-03-25 15:54:15 org.apache.solr.core.SolrDeletionPolicy onCommit
INFO: SolrDeletionPolicy.onCommit: commits:num=2
        commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sb,version=1237322897338,generation=1019,filenames=[_uj.frq, _uj.fdx, _uj.tii, _uj.nrm, _uj.tis, _uj.fnm, _uj.prx, segments_sb, _uj.fdt]
        commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sc,version=1237322897339,generation=1020,filenames=[_ul.prx, _ul.fnm, _ul.tii, _ul.fdt, _ul.nrm, _ul.fdx, _ul.tis, _ul.frq, segments_sc]
2009-03-25 15:54:15 org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: last commit = 1237322897339
2009-03-25 15:54:15 org.apache.solr.search.SolrIndexSearcher
INFO: Opening searc...@3da850 main

When I do a full-import it is much faster: it takes about 1 minute to index 2 documents. I tried to play a bit with the config but nothing seems to work for the moment.

What I want to do is pretty interactive: my production db has 1.2M documents and must be able to delta-import around 2k updates every 5 minutes. Is it possible to reach those kinds of numbers with the DataImportHandler?

Shalin Shekhar Mangar wrote:
>
> On Wed, Mar 25, 2009 at 2:25 AM, AlexxelA wrote:
>>
>> Ok i'm ok with the fact the solr gonna do X request to database for X
>> update.. but when i try to run the delta-import command with 2 row to
>> update is it normal that its kinda really slow ~ 1 document fetched / sec ?
>>
> Not really, I've seen 1000x faster. Try firing a few of those queries on the
> database directly. Are they slow? Is the database remote?
>
> --
> Regards,
> Shalin Shekhar Mangar.
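Shalin's suggestion is easy to check in isolation: run the deltaImportQuery a few times straight against MySQL and time it, which separates database latency from DIH/Solr overhead. A minimal JDBC sketch along those lines (connection settings and the sample uid are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class TimeDeltaQuery {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://dbhost/mydb", "user", "pass");          // placeholder DSN
            Statement st = con.createStatement();
            String sql = "select * from profil_view where uid = '12345'"; // sample uid
            for (int i = 0; i < 5; i++) {
                long start = System.currentTimeMillis();
                ResultSet rs = st.executeQuery(sql);
                while (rs.next()) { /* drain the result set */ }
                rs.close();
                System.out.println("run " + i + ": "
                        + (System.currentTimeMillis() - start) + " ms");
            }
            st.close();
            con.close();
        }
    }

If each run comes back in around a second, the time is going to MySQL rather than to DIH or Solr, which is what the follow-up below ends up confirming.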
Re: Delta import
I found the problem. I was using a MySQL view, and it seems it does not take advantage of the index I had on the last_modified field of the original table ><. Each MySQL call was taking about 1 second :| I switched back to a query with a join instead of a query against the view, and I'm now doing around 400 updates/sec instead of 1 update/sec :)

Thanks

Noble Paul നോബിള് नोब्ळ् wrote:
>
> Hi Alex, you may be able to use CachedSqlEntityProcessor. You can do
> delta-import using full-import
> http://wiki.apache.org/solr/DataImportHandlerFaq#fullimportdelta
>
> the inner entity can use a CachedSqlEntityProcessor
>
> On Thu, Mar 26, 2009 at 1:45 AM, AlexxelA wrote:
>>
>> Yes my database is remote, mysql 5 and i'm using connector/J 5.1.7. My index
>> has 2 documents. When i try to do lets say 14 updates it takes about 18
>> sec total. Here's the resulting log of the operation :
>>
>> 2009-03-25 15:53:57 org.apache.solr.handler.dataimport.JdbcDataSource$1 call
>> INFO: Time taken for getConnection(): 411
>> 2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
>> INFO: Completed ModifiedRowKey for Entity: profil rows obtained : 14
>> 2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
>> INFO: Completed DeletedRowKey for Entity: profil rows obtained : 0
>> 2009-03-25 15:53:59 org.apache.solr.handler.dataimport.DocBuilder collectDelta
>> INFO: Completed parentDeltaQuery for Entity: profil
>> 2009-03-25 15:54:00 org.apache.solr.core.SolrDeletionPolicy onInit
>> INFO: SolrDeletionPolicy.onInit: commits:num=1
>> commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sb,version=1237322897338,generation=1019,filenames=[_uj.frq, _uj.fdx, _uj.tii, _uj.nrm, _uj.tis, _uj.fnm, _uj.prx, segments_sb, _uj.fdt]
>> 2009-03-25 15:54:00 org.apache.solr.core.SolrDeletionPolicy updateCommits
>> INFO: last commit = 1237322897338
>> 2009-03-25 15:54:13 org.apache.solr.handler.dataimport.DocBuilder doDelta
>> INFO: Delta Import completed successfully   <-- BOTTLENECK
>> 2009-03-25 15:54:13 org.apache.solr.handler.dataimport.DocBuilder commit
>> INFO: Full Import completed successfully
>> 2009-03-25 15:54:13 org.apache.solr.update.DirectUpdateHandler2 commit
>> INFO: start commit(optimize=true,waitFlush=false,waitSearcher=true)
>> 2009-03-25 15:54:15 org.apache.solr.core.SolrDeletionPolicy onCommit
>> INFO: SolrDeletionPolicy.onCommit: commits:num=2
>> commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sb,version=1237322897338,generation=1019,filenames=[_uj.frq, _uj.fdx, _uj.tii, _uj.nrm, _uj.tis, _uj.fnm, _uj.prx, segments_sb, _uj.fdt]
>> commit{dir=/home/solr-tomcat/solr/data/index,segFN=segments_sc,version=1237322897339,generation=1020,filenames=[_ul.prx, _ul.fnm, _ul.tii, _ul.fdt, _ul.nrm, _ul.fdx, _ul.tis, _ul.frq, segments_sc]
>> 2009-03-25 15:54:15 org.apache.solr.core.SolrDeletionPolicy updateCommits
>> INFO: last commit = 1237322897339
>> 2009-03-25 15:54:15 org.apache.solr.search.SolrIndexSearcher
>> INFO: Opening searc...@3da850 main
>>
>> When i do a full-import it is much faster. Take about 1 min to index 2
>> documents. I tried to play a bit with the config but nothing seems to work
>> for the moment.
>>
>> What i want to do is pretty interactive, my production db has 1,2M documents
>> and must be able to delta-import around 2k update every 5min. Is it
>> possible with the dataimporthandle to reach those kinda of number ?
>>
>> Shalin Shekhar Mangar wrote:
>>>
>>> On Wed, Mar 25, 2009 at 2:25 AM, AlexxelA wrote:
>>>>
>>>> Ok i'm ok with the fact the solr gonna do X request to database for X
>>>> update.. but when i try to run the delta-import command with 2 row to
>>>> update is it normal that its kinda really slow ~ 1 document fetched / sec ?
>>>>
>>> Not really, I've seen 1000x faster. Try firing a few of those queries on the
>>> database directly. Are they slow? Is the database remote?
>>>
>>> --
>>> Regards,
>>> Shalin Shekhar Mangar.
>
> --
> --Noble Paul
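To put the earlier target in perspective: at roughly 400 updates/sec, the 2,000 updates per 5-minute window would take about 5 seconds of fetching, whereas at 1 update/sec they would have taken over 33 minutes, i.e. more than the window itself.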
Query with bitwise operation
One of the fields in my Solr documents is an integer, and I want to do bitwise operations on that field in my queries. For example, with status = 46 in my Solr document, I want to know whether bit #1 (2¹) is set, i.e. (46 & 2) > 0, or whether bit #2 or #3 (2² + 2³) is set, i.e. (46 & 12) > 0. Can we do something like this: q=status:((46&2)>0)?
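As far as I know, the standard Solr query parser in the 1.3/1.4 line has no bitwise operators, so q=status:((46&2)>0) won't parse. A common workaround is to decompose the integer into one boolean field per bit at index time and query those fields instead. A SolrJ sketch of the indexing side, where the status_bitN names (and a matching boolean dynamic field in schema.xml) are assumptions of mine:

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class BitFlagIndexer {
        /** Adds status plus one boolean field per set bit, e.g. status_bit1, status_bit2, ... */
        static SolrInputDocument toDoc(String uid, int status) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("uid", uid);
            doc.addField("status", status);
            for (int bit = 0; bit < 32; bit++) {
                if ((status & (1 << bit)) != 0) {
                    doc.addField("status_bit" + bit, true);   // only set bits need a field
                }
            }
            return doc;
        }

        public static void main(String[] args) throws Exception {
            SolrServer solr = new CommonsHttpSolrServer("http://localhost:8080/solr"); // assumed URL
            solr.add(toDoc("12345", 46));  // 46 = bits 1, 2, 3 and 5 set
            solr.commit();
            // Query side: "(46 & 2) > 0"  becomes  q=status_bit1:true
            //             "(46 & 12) > 0" becomes  q=status_bit2:true OR status_bit3:true
        }
    }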
Stored Document encoding
I'm using the DataImportHandler and my database is in "latin1". When I retrieve documents that I have indexed in Solr, they seem to have been converted to UTF-8. Is that normal? Is it possible to store them in latin1 in Solr?
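As far as I know this is expected: the JDBC driver decodes the latin1 bytes into Java strings when DIH reads them, Lucene/Solr store character data rather than the original bytes, and Solr writes its HTTP responses out as UTF-8 regardless of the source encoding. If a legacy consumer really needs latin1 bytes, the usual approach is to transcode on the client side, e.g. (the field value below is just an example):

    public class TranscodeExample {
        public static void main(String[] args) throws java.io.UnsupportedEncodingException {
            // A field value as returned by Solr: already a Java String (Unicode).
            String title = "Présentation";
            // Re-encode it as latin1 (ISO-8859-1) bytes for a legacy consumer.
            byte[] latin1 = title.getBytes("ISO-8859-1");
            System.out.println(latin1.length + " latin1 bytes");
        }
    }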