Re: Retrieving large num of docs

2009-11-27 Thread Raghuveer Kancherla
Hi Andrew,
We are running Solr through its HTTP interface from Python. From the resources
I could find, EmbeddedSolrServer is possible only if I am using Solr from a
Java program. It would be useful to understand whether a significant part of the
performance increase comes from bypassing HTTP before going down this path.

In the meantime I am trying my luck with the other suggestions. Can you
share the patch that caches Solr documents instead of Lucene documents?


On a different note, I am wondering why it takes 4-5 seconds for Solr
to return the IDs of ranked documents when it can rank the results in about
20 milliseconds. Am I missing something here?

Thanks,
Raghu



On Fri, Nov 27, 2009 at 2:15 AM, Andrey Klochkov  wrote:

> Hi
>
> We obtain ALL documents for every query; the index size is about 50k. We use
> a number of stored fields. Often the result set is several thousand docs.
>
> We performed the following things to make it faster:
>
> 1. Use EmbeddedSolrServer
> 2. Patch Solr to avoid unnecessary marshalling while using
> EmbeddedSolrServer (there's an issue in Solr JIRA)
> 3. Patch Solr to cache SolrDocument instances instead of Lucene's Document
> instances. I was going to share this patch, but then decided that our usage
> of Solr is not common and this functionality is useless in most cases
> 4. We have all documents in cache
> 5. In fact our index is stored in a data grid, not a file system. But as
> tests showed, this is not important, because the standard FSDirectory is
> faster if you have enough RAM free for OS caches.
>
> These changes improved performance very much, so in the end it is
> comparable (about 3-5 times slower) to the "proper" Solr usage
> (obtaining just the first 20 documents).
>
> To get more details on how different Solr components perform we injected
> perf4j statements into key points in the code. And a profiler was helpful
> too.
>
> Hope it helps somehow.
>
> On Thu, Nov 26, 2009 at 8:48 PM, Raghuveer Kancherla <
> raghuveer.kanche...@aplopio.com> wrote:
>
> > Hi,
> > I am using Solr1.4 for searching through half a million documents. The
> > problem is, I want to retrieve nearly 200 documents for each search
> query.
> > The query time in Solr logs is showing 0.02 seconds and I am fairly happy
> > with that. However Solr is taking a long time (4 to 5 secs) to return the
> > results (I think it is because of the number of docs I am requesting). I
> > tried returning only the id's (unique key) without any other stored
> fields,
> > but it is not helping me improve the response times (time to return the
> > id's
> > of matching documents).
> > I understand that retrieving 200 documents for each search term is
> > impractical in most scenarios, but I don't have any other option. Any
> > pointers
> > on how to improve the response times will be a great help.
> >
> > Thanks,
> >  Raghu
> >
>
>
>
> --
> Andrew Klochkov
> Senior Software Engineer,
> Grid Dynamics
>
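A minimal Python sketch of the setup described in this thread: query Solr over HTTP, restricting the field list to the unique key and asking for 200 rows. The host, port, core path, and the field name `id` are assumptions, not values from the thread:

```python
from urllib.parse import urlencode
from urllib.request import urlopen
import json

def build_query_url(base_url, q, rows=200, fl="id"):
    """Build a Solr select URL that asks for only the unique-key field."""
    params = {"q": q, "rows": rows, "fl": fl, "wt": "json"}
    return base_url.rstrip("/") + "/select?" + urlencode(params)

def fetch_ids(base_url, q, rows=200):
    """Run the query and return just the matching document ids (network call)."""
    with urlopen(build_query_url(base_url, q, rows)) as resp:
        docs = json.load(resp)["response"]["docs"]
    return [d["id"] for d in docs]

# URL construction only -- no running Solr needed to see the request shape:
url = build_query_url("http://localhost:8983/solr", "python developer")
```

Restricting `fl` to the unique key keeps the response payload small; as the thread notes, whether that actually helps depends on where the time is being spent.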


Re: Retrieving large num of docs

2009-11-27 Thread AHMET ARSLAN
> Hi Andrew,
> We are running solr using its http interface from python.
> From the resources
> I could find, EmbeddedSolrServer is possible only if I am
> using solr from a
> java program.  It will be useful to understand if a
> significant part of the
> performance increase is due to bypassing HTTP before going
> down this path.
> 
> In the mean time I am trying my luck with the other
> suggestions. Can you
> share the patch that helps cache solr documents instead of
> lucene documents?

Maybe these links can help:
http://wiki.apache.org/lucene-java/ImproveSearchingSpeed
http://wiki.apache.org/lucene-java/ImproveIndexingSpeed
http://www.lucidimagination.com/Downloads/LucidGaze-for-Solr

How often do you update your index?
Is your index optimized?
Configuring caching can also help:

http://wiki.apache.org/solr/SolrCaching
http://wiki.apache.org/solr/SolrPerformanceFactors
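The caches mentioned above are configured in the <query> section of solrconfig.xml; a sketch (the cache types are standard Solr ones, but the sizes here are illustrative, not recommendations):

```xml
<query>
  <!-- caches ordered lists of document ids for queries -->
  <queryResultCache class="solr.LRUCache"
                    size="512" initialSize="512" autowarmCount="128"/>
  <!-- caches stored fields of documents, relevant when returning many docs -->
  <documentCache class="solr.LRUCache"
                 size="512" initialSize="512" autowarmCount="0"/>
</query>
```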







restore space between words by spell checker

2009-11-27 Thread Andrey Klochkov
Hi

If a user issues a misspelled query, forgetting to place a space between
words, is it possible to fix it with a spell checker or by some other
mechanism?

For example, if we get the query "tommyhitfiger" and have the terms "tommy" and
"hitfiger" in the index, how can we fix the query?

-- 
Andrew Klochkov
Senior Software Engineer,
Grid Dynamics
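Outside of Solr's spell checker, one way to attack this is dictionary-based segmentation over the index's term set. A sketch (the term set below is a stand-in for the real index vocabulary):

```python
def split_run_on(query, terms):
    """Split a space-less query into known terms via dynamic programming.

    Returns a list of terms if the whole string can be segmented,
    otherwise None.
    """
    n = len(query)
    # best[i] holds a segmentation of query[:i], or None if none exists
    best = [None] * (n + 1)
    best[0] = []
    for i in range(1, n + 1):
        for j in range(i):
            if best[j] is not None and query[j:i] in terms:
                best[i] = best[j] + [query[j:i]]
                break  # take the first (longest-prefix-first) split found
    return best[n]

terms = {"tommy", "hitfiger", "hit", "figer"}
print(split_run_on("tommyhitfiger", terms))  # ['tommy', 'hitfiger']
```

In practice the term set could be fed from the index's term dictionary; ranking competing segmentations by term frequency would be the next refinement.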


What does this error mean?

2009-11-27 Thread Paul Tomblin
INFO: start 
commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
Nov 27, 2009 3:45:35 AM
org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {} 0 634
Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore getSearcher
WARNING: [nutch] Error opening new searcher. exceeded limit of
maxWarmingSearchers=2, try again later.
Nov 27, 2009 3:45:35 AM
org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {} 0 635
Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: Error opening new
searcher. exceeded limit of maxWarmingSearchers=2, try again later.
at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1029)
at 
org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:418)
at 
org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.jav
a:85)
at 
org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:107)
at 
org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:48)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
at 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
at 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
   at 
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at 
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
at 
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at 
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
   at 
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at 
org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
at 
org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
at 
org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
at java.lang.Thread.run(Thread.java:619)

Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore execute
INFO: [nutch] webapp=/solrChunk path=/update
params={waitSearcher=true&commit=true&wt=javabin&waitFlush=true&version=1}
status=503 QTime=634
Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: Error opening new
searcher. exceeded limit of maxWarmingSearchers=2, try again later.
at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1029)
at 
org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:418)
at 
org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
at 
org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:107)
at 
org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:48)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
at 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
at 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at 
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at 
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
at 
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at 
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at 
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at 
org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
at 
org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
at 
org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
at java.lang.Thread.run(Thread.java:619)

Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore execute
INFO: [

Solr CPU usage

2009-11-27 Thread Girish Redekar
Hi

I'm testing my Solr instance with multiple simultaneous requests. Here's my
test.

For an index of ~200K docs, I query Solr with 10 simultaneous threads. Can
someone help me explain/improve the following observations:

1) Solr doesn't seem to use all the available CPU to improve response times
(query times are good, but the time required to return documents isn't so
good). My CPU seems to be running at ~30%.
2) As expected, response time increases as the number of requested results
increases. What's surprising (and perplexing) is that Solr seems to use
*more* of the CPU when I'm requesting *fewer* docs. Consequently, its
performance in returning a larger result set is very bad.
3) To counter 1, is there a way to make two Solr instances search the
same index (so that concurrent requests are served faster)?

Any help in this regard would be very useful.

Thanks !
Girish Redekar
http://girishredekar.net
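A sketch of the kind of concurrent benchmark described above, in Python; `request_fn` is a stand-in for the actual Solr HTTP request, and the thread/request counts are illustrative:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def benchmark(request_fn, n_requests=100, n_threads=10):
    """Fire n_requests calls across n_threads and collect per-call latencies."""
    def timed_call(_):
        start = time.perf_counter()
        request_fn()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        latencies = list(pool.map(timed_call, range(n_requests)))
    return min(latencies), sum(latencies) / len(latencies), max(latencies)

# Stand-in workload instead of a real Solr request:
fastest, mean, slowest = benchmark(lambda: time.sleep(0.001), n_requests=20)
```

Comparing the latency spread at different `rows` values helps separate query time from document-retrieval and transfer time.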


Re: Solr CPU usage

2009-11-27 Thread Yonik Seeley
On Fri, Nov 27, 2009 at 9:30 AM, Girish Redekar
 wrote:
> Hi
>
> I'm testing my Solr instance with multiple simultaneous requests. Here's my
> test.
>
> For an index of ~200K docs, I query Solr with 10 simultaneous threads. Can
> someone help me explain/improve the following observations:
>
> 1) Solr doesn't seem to use all the available CPU to improve response times
> (query times are good, but the time required to return documents aren't so
> good). My CPU seems to be running at ~30%
> 2) As expected, time for response increases as the num of requested results
> increase. What's surprising (and perplexing) is that Solr seems to use
> *more* of the CPU when I'm requesting *fewer* docs. Consequently, its
> performance in returning a larger result set is very bad

This may point to another bottleneck - if the OS is low on free RAM,
it could be disk IO.
If this is on Windows, you could have contention reading the index files.
Otherwise, you may have a bottleneck in network IO.  Is the client you
are testing with on the same box?
Is this Solr 1.4?

-Yonik
http://www.lucidimagination.com


Re: Solr CPU usage

2009-11-27 Thread Girish Redekar
Yonik,
I am running both my server and client on Ubuntu machines. The client is on a
different box. The server CPU and RAM usage are both well below 50%.

Girish Redekar
http://girishredekar.net


On Fri, Nov 27, 2009 at 10:07 PM, Yonik Seeley
wrote:

> On Fri, Nov 27, 2009 at 9:30 AM, Girish Redekar
>  wrote:
> > Hi
> >
> > I'm testing my Solr instance with multiple simultaneous requests. Here's
> my
> > test.
> >
> > For an index of ~200K docs, I query Solr with 10 simultaneous threads.
> Can
> > someone help me explain/improve the following observations:
> >
> > 1) Solr doesn't seem to use all the available CPU to improve response
> times
> > (query times are good, but the time required to return documents aren't
> so
> > good). My CPU seems to be running at ~30%
> > 2) As expected, time for response increases as the num of requested
> results
> > increase. What's surprising (and perplexing) is that Solr seems to use
> > *more* of the CPU when I'm requesting *fewer* docs. Consequently, its
> > performance in returning a larger result set is very bad
>
> This may point to another bottleneck - if the OS is low on free RAM,
> it could be disk IO.
> If this is on Windows, you could have contention reading the index files.
> Otherwise, you may have a bottleneck in network IO.  Is the client you
> are testing with on the same box?
> Is this Solr 1.4?
>
> -Yonik
> http://www.lucidimagination.com
>


Re: [SolrResourceLoader] Unable to load cached class-name

2009-11-27 Thread Stuart Grimshaw
On Wed, Nov 25, 2009 at 7:43 PM, Chris Hostetter
 wrote:
> :
> : I've deployed the contents of dist/ into JBoss's lib directory for the
> : server I'm running and I've also copied the contents of lib/ into
>
> Please be specific ... what is "dist/" what is "lib/" ? ... are you
> talking about the top level dist and lib directories in a solr release,
> then those should *not* be copied into any directory for JBoss.
> everything you need to access core solr features is available in the
> solr.war -- that is all you need to run the solr application.

Yes, those were the directories I was talking about.

> the only reason to ever copy any jars around when dealing with solr is to
> load plugins (ie: your own, or things included in the contrib directory of a
> solr release) and even then they should go in the special "lib" directory
> inside your Solr home directory so they are loaded by the appropriate
> classloader -- not in the top level class loader of your servlet
> container.

Ok, I'm not sure where my particular use of Solr fits into all this.
I'm writing a log4j appender that adds each log entry to a Solr index.
It's not really a Solr plugin.

> : [SolrResourceLoader] Unable to load cached class-name :
> : org.apache.solr.search.FastLRUCache for shortname :
> : solr.FastLRUCachejava.lang.ClassNotFoundException:
> : org.apache.solr.search.FastLRUCache
>
> this is most likely because you have duplicate copies of (all of) the solr
> classes at various classloader levels -- the copies in the solr.war, and
> the copies you've put into the JBoss lib dir.  having both can cause
> problems like this because of the rules involved with
> hierarchical classloaders.

I've removed all the extra files from the lib directory as you
suggest, and now I get the following message when starting JBoss

18:54:23,083 ERROR [AbstractKernelController] Error installing to
Create: name=jboss.system:service=Logging,type=Log4jService
state=Configured mode=Manual requiredState=Create
java.lang.NoClassDefFoundError: org/apache/solr/client/solrj/SolrServerException


Re: What does this error mean?

2009-11-27 Thread Matthew Runo
It means that there were 2 warming searchers, and then a commit came in and 
caused a third to try to warm up at the same time. Do you use any warming 
queries, or have large caches?

Thanks for your time!

Matthew Runo
Software Engineer, Zappos.com
mr...@zappos.com - 702-943-7833

On Nov 27, 2009, at 5:46 AM, Paul Tomblin wrote:

> NFO: start 
> commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
> Nov 27, 2009 3:45:35 AM
> org.apache.solr.update.processor.LogUpdateProcessor finish
> INFO: {} 0 634
> Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore getSearcher
> WARNING: [nutch] Error opening new searcher. exceeded limit of
> maxWarmingSearchers=2, try again later.
> Nov 27, 2009 3:45:35 AM
> org.apache.solr.update.processor.LogUpdateProcessor finishINFO: {} 0
> 635
> Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
> SEVERE: org.apache.solr.common.SolrException: Error opening new
> searcher. exceeded limit of maxWarmingSearchers=2, try again later.
> [stack traces snipped]

Re: What does this error mean?

2009-11-27 Thread Paul Tomblin
What's a warming query, and how would I know if I'm doing one?  Does
this mean the web server restarted or something?

On Fri, Nov 27, 2009 at 3:25 PM, Matthew Runo  wrote:
> It means that there was 2 warming searchers, and then a commit came in and 
> caused a third to try to warm up at the same time. Do you use any warming 
> queries, or have large caches?
>
> Thanks for your time!
>
> Matthew Runo
> Software Engineer, Zappos.com
> mr...@zappos.com - 702-943-7833
>
> On Nov 27, 2009, at 5:46 AM, Paul Tomblin wrote:
>
>> NFO: start 
>> commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
>> Nov 27, 2009 3:45:35 AM
>> org.apache.solr.update.processor.LogUpdateProcessor finish
>> INFO: {} 0 634
>> Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore getSearcher
>> WARNING: [nutch] Error opening new searcher. exceeded limit of
>> maxWarmingSearchers=2, try again later.
>> Nov 27, 2009 3:45:35 AM
>> org.apache.solr.update.processor.LogUpdateProcessor finishINFO: {} 0
>> 635
>> Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
>> SEVERE: org.apache.solr.common.SolrException: Error opening new
>> searcher. exceeded limit of maxWarmingSearchers=2, try again later.
>> [stack traces snipped]

Re: Looking for Best Practices: Analyzers vs. UpdateRequestProcessors?

2009-11-27 Thread Andreas Kahl

On 26.11.2009 11:07, Shalin Shekhar Mangar wrote:

On Wed, Nov 25, 2009 at 9:52 PM, Andreas Kahl  wrote:

   

Hello,

are there any general criteria when to use Analyzers to implement an
indexing function and when it is better to use UpdateRequestProcessors?

The main difference I found in the documentation was that
UpdateRequestProcessors are able to manipulate several fields at once
(create, read, update, delete), while Analyzers operate on the contents of a
single field at once.


 

Analyzers can only change indexed content. If a field is marked as "stored",
then it is stored and retrieved un-modified. If you want to modify the
"stored" part as well, then only an UpdateRequestProcessor can do that. In
other words, the field's value after applying UpdateRequestProcessors is fed
into analyzers (for indexed field) and stored verbatim (for stored fields).

   

Thank you very much for your answer. That clears things up for me.

Andreas


using Xinclude with multi-core

2009-11-27 Thread Peter Wolanin
I'm trying to take advantage of the Solr 1.4 XInclude feature to
include a different XML fragment (e.g. a different analyzer chain in
schema.xml) for each core in a multi-core setup.  When the XInclude
operates on a relative path, it seems NOT to act relative to the
XML file containing the XInclude statement.  Using the jetty example, it
looks for a file in example/.

Is this a bug in the way Solr invokes XInclude?  If not, is there a
variable that contains the instanceDir that can be used?
${solr.instanceDir} or ${solr/instanceDir}

DOMUtil.substituteProperties(doc, loader.getCoreProperties());

I see that I could potentially specify solrcore.properties
(http://wiki.apache.org/solr/SolrConfigXml#System_property_substitution) in
order to determine the correct base path, but this seems overly
complicated for what the usual use case for XInclude would be.

-Peter

-- 
Peter M. Wolanin, Ph.D.
Momentum Specialist,  Acquia. Inc.
peter.wola...@acquia.com
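For concreteness, the include in question looks roughly like this in schema.xml (the fragment path and file name are illustrative, and the relative-href resolution base is exactly the open question above):

```xml
<schema name="example" version="1.2"
        xmlns:xi="http://www.w3.org/2001/XInclude">
  <!-- pull a per-core analyzer chain in from a separate file -->
  <xi:include href="conf/analyzer-chain.xml"/>
</schema>
```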


Re: $DeleteDocbyQuery in solr 1.4 is not working

2009-11-27 Thread cpmoser

Hi, I just recently (well, today actually) ran into the same issue, and a
Google search led me here.  Something in the log that clued me in to my
issue was this:

Nov 27, 2009 1:27:05 PM org.apache.solr.core.SolrDeletionPolicy onInit

I was expecting docs to be removed on commit (which normally happens after a
dataimport), so I thought something was broken.

However, when I restarted Solr, the docs that should have been deleted were
actually deleted. There is more discussion about the SolrDeletionPolicy at
http://issues.apache.org/jira/browse/SOLR-617 .  I haven't read enough
to know how to add an onCommit deletion policy to the Solr config yet, and
don't know if you're running into the same issue, but I hope this helps.
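For anyone digging into the same behavior: the deletion policy is configured inside the <mainIndex> section of solrconfig.xml. A sketch of the relevant block (the values are illustrative):

```xml
<mainIndex>
  <deletionPolicy class="solr.SolrDeletionPolicy">
    <!-- keep only the most recent commit point -->
    <str name="maxCommitsToKeep">1</str>
    <str name="maxOptimizedCommitsToKeep">0</str>
  </deletionPolicy>
</mainIndex>
```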


Mark.El wrote:
> 
> Thanks Otis... I remember that one!
> 
> It still did not remove the document! So obviously its something else
> thats
> happening.
> 
> 
> 

-- 
View this message in context: 
http://old.nabble.com/%24DeleteDocbyQuery-in-solr-1.4-is-not-working-tp26376265p26545394.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: $DeleteDocbyQuery in solr 1.4 is not working

2009-11-27 Thread Mark Ellul
Thanks I will look into it!

On Fri, Nov 27, 2009 at 11:34 PM, cpmoser  wrote:

>
> Hi I just recently (well today actually) ran into the same issue, and a
> Google search led me here.  Something in the log that clued me in to my
> issue was this:
>
> Nov 27, 2009 1:27:05 PM org.apache.solr.core.SolrDeletionPolicy onInit
>
> I was expecting docs to be removed on commit (which normally happens after
> a
> dataimport), so I thought something was broken.
>
> However, when I restarted Solr, the docs that should have been deleted were
> actually deleted. There is more discussion about the SolrDeletionPolicy
> http://issues.apache.org/jira/browse/SOLR-617 here .  I haven't read
> enough
> to know how to add an onCommit deletion policy to the Solr config yet, and
> don't know if you're running into the same issue, but hope this helps.
>
>
> Mark.El wrote:
> >
> > Thanks Otis... I remember that one!
> >
> > It still did not remove the document! So obviously its something else
> > thats
> > happening.
> >
> >
> >
>
> --
> View this message in context:
> http://old.nabble.com/%24DeleteDocbyQuery-in-solr-1.4-is-not-working-tp26376265p26545394.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>


Re: 'Connection reset' in DataImportHandler Development Console

2009-11-27 Thread aerox7

Hi Andrew,
I downloaded the latest build of Solr (1.4) and I have the same problem with
Debug Now in the DataImport dev console. Have you found a solution?

Thank you

Andrew Clegg wrote:
> 
> 
> 
> Noble Paul നോബിള്‍  नोब्ळ्-2 wrote:
>> 
>> apparently I do not see any command full-import, delta-import being
>> fired. Is that true?
>> 
> 
> It seems that way -- they're not appearing in the logs. I've tried Debug
> Now with both full and delta selected from the dropdown, no difference
> either way.
> 
> If I click the Full Import button it starts an import okay. I don't have
> to Full Import manually every time I want to debug a config change do I?
> That's not what the docs say. (A full import takes about 6 or 7 hours...)
> 
> Thanks,
> 
> Andrew.
> 

-- 
View this message in context: 
http://old.nabble.com/%27Connection-reset%27-in-DataImportHandler-Development-Console-tp25005850p26545401.html
Sent from the Solr - User mailing list archive at Nabble.com.



[Solved] Re: VelocityResponseWriter/Solritas character encoding issue

2009-11-27 Thread Sascha Szott

Hi Erik,

I've finally solved the problem. Unfortunately, the parameter 
v.contentType was not described in the Solr wiki (I've fixed that now). 
The point is, you must specify (in your solrconfig.xml)

  <str name="v.contentType">text/xml;charset=UTF-8</str>

in order to receive correctly UTF-8 encoded HTML. That's it!

Best,
Sascha

Erik Hatcher schrieb:

Sascha,

Can you give me a test document that causes an issue?  (maybe send me a 
Solr XML document in private e-mail).   I'll see what I can do once I 
can see the issue first hand.


Erik


On Nov 18, 2009, at 2:48 PM, Sascha Szott wrote:


Hi,

I've played around with Solr's VelocityResponseWriter (which is indeed 
a very useful feature for rapid prototyping). I've realized that 
Velocity uses ISO-8859-1 as default character encoding. I've changed 
this setting to UTF-8 in my velocity.properties file (inside the conf 
directory), i.e.,


  input.encoding=UTF-8
  output.encoding=UTF-8

and checked that the settings were successfully loaded.

Within the main Velocity template, browse.vm, the character encoding
is set to UTF-8 as well, i.e.,

  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>

After starting Solr (which is deployed in a Tomcat 6 server on a 
Ubuntu machine), I ran into some character encoding problems.


Due to the change of input.encoding to UTF-8, no problems occur when 
non-ASCII characters are present in the query string, e.g. German 
umlauts. But unfortunately, something is wrong with the encoding of 
characters in the HTML page that is generated by 
VelocityResponseWriter. The non-ASCII characters aren't displayed 
properly (for example, FF prints a black diamond with a white question 
mark). If I manually set the encoding to ISO-8859-1, the non-ASCII 
characters are displayed correctly. Does anybody have a clue?


Thanks in advance,
Sascha






Re: Solr CPU usage

2009-11-27 Thread Yonik Seeley
On Fri, Nov 27, 2009 at 12:09 PM, Girish Redekar
 wrote:
> Am running both my server and client on ubuntu machines. The client is on a
> different box. The server CPU and RAM are both well below 50%.

OK, then the obvious thing to try would be to move the client to the
server machine and see if you can max out the CPUs.  Should be doable
on Solr 1.4.

-Yonik
http://www.lucidimagination.com


Re: What does this error mean?

2009-11-27 Thread Otis Gospodnetic
Paul,

Warm-up queries are specified in solrconfig.xml, have a look.
Check markmail.org for plenty of emails explaining "exceeded limit of 
maxWarmingSearchers" - people have asked about it a lot recently (short answer: 
you may be committing too often?).
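For reference, the limit in question is a setting in solrconfig.xml; a sketch of the element (the value 2 is the usual default, not a recommendation to raise it -- committing less often is normally the right fix):

```xml
<!-- solrconfig.xml: cap on searchers warming concurrently in the
     background. A commit that arrives while this many searchers are
     still warming fails with the 503 error quoted below. -->
<maxWarmingSearchers>2</maxWarmingSearchers>
```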


Otis
--
Sematext is hiring -- http://sematext.com/about/jobs.html?mls
Lucene, Solr, Nutch, Katta, Hadoop, HBase, UIMA, NLP, NER, IR



- Original Message 
> From: Paul Tomblin 
> To: solr-user@lucene.apache.org
> Sent: Fri, November 27, 2009 3:40:58 PM
> Subject: Re: What does this error mean?
> 
> What's a warming query, and how would I know if I'm doing one?  Does
> this mean the web server restarted or something?
> 
> On Fri, Nov 27, 2009 at 3:25 PM, Matthew Runo wrote:
> > It means that there were 2 warming searchers, and then a commit came in and 
> caused a third to try to warm up at the same time. Do you use any warming 
> queries, or have large caches?
> >
> > Thanks for your time!
> >
> > Matthew Runo
> > Software Engineer, Zappos.com
> > mr...@zappos.com - 702-943-7833
> >
> > On Nov 27, 2009, at 5:46 AM, Paul Tomblin wrote:
> >
> >> INFO: start 
> commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
> >> Nov 27, 2009 3:45:35 AM
> >> org.apache.solr.update.processor.LogUpdateProcessor finish
> >> INFO: {} 0 634
> >> Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore getSearcher
> >> WARNING: [nutch] Error opening new searcher. exceeded limit of
> >> maxWarmingSearchers=2, try again later.
> >> Nov 27, 2009 3:45:35 AM
> >> org.apache.solr.update.processor.LogUpdateProcessor finish
> >> INFO: {} 0 635
> >> Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
> >> SEVERE: org.apache.solr.common.SolrException: Error opening new
> >> searcher. exceeded limit of maxWarmingSearchers=2, try again later.
> >>at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1029)
> >>at 
> org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:418)
> >>at 
> org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
> >>at 
> org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:107)
> >>at 
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:48)
> >>at 
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
> >>at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
> >>at 
> org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
> >>at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
> >>at 
> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
> >>at 
> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
> >>at 
> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
> >>   at 
> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
> >>at 
> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
> >>at 
> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
> >>at 
> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
> >>   at 
> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
> >>at 
> org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
> >>at 
> org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
> >>at 
> org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
> >>at java.lang.Thread.run(Thread.java:619)
> >>
> >> Nov 27, 2009 3:45:35 AM org.apache.solr.core.SolrCore execute
> >> INFO: [nutch] webapp=/solrChunk path=/update
> >> params={waitSearcher=true&commit=true&wt=javabin&waitFlush=true&version=1}
> >> status=503 QTime=634
> >> Nov 27, 2009 3:45:35 AM org.apache.solr.common.SolrException log
> >> SEVERE: org.apache.solr.common.SolrException: Error opening new
> >> searcher. exceeded limit of maxWarmingSearchers=2, try again later.
> >>at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1029)
> >>at 
> org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:418)
> >>at 
> org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
> >>at 
> org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:107)
> >>at 
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:48)
> >>at 
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
> >>at org.apache.solr.core

Re: restore space between words by spell checker

2009-11-27 Thread Otis Gospodnetic
I'm not sure if that can be easily done (other than going char by char and 
testing), because nothing indicates where the space might be, not even an 
upper-case letter.  I'd be curious to know if you find a better solution.
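The char-by-char testing Otis mentions amounts to a dictionary-driven word break. A minimal Python sketch, where a plain set stands in for the index's term dictionary (the function name and greedy longest-prefix strategy are illustrative, not any Solr API):

```python
def split_query(query, terms):
    """Recursively split a space-less query into known terms.

    Tries every prefix of the query against the term dictionary,
    longest first, and recurses on the remainder. Returns a list
    of terms, or None if no full segmentation exists.
    """
    if not query:
        return []
    for cut in range(len(query), 0, -1):
        head = query[:cut]
        if head in terms:
            rest = split_query(query[cut:], terms)
            if rest is not None:
                return [head] + rest
    return None

terms = {"tommy", "hitfiger", "hit", "figer"}
print(split_query("tommyhitfiger", terms))  # ['tommy', 'hitfiger']
```

In practice the term set would come from the field's indexed terms, and ambiguous segmentations would need ranking (e.g. by term frequency) rather than the first match.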

Otis
--
Sematext is hiring -- http://sematext.com/about/jobs.html?mls
Lucene, Solr, Nutch, Katta, Hadoop, HBase, UIMA, NLP, NER, IR



- Original Message 
> From: Andrey Klochkov 
> To: solr-user 
> Sent: Fri, November 27, 2009 6:09:08 AM
> Subject: restore space between words by spell checker
> 
> Hi
> 
> If a user issued a misspelled query, forgetting to place space between
> words, is it possible to fix it with a spell checker or by some other
> mechanism?
> 
> For example, if we get query "tommyhitfiger" and have terms "tommy" and
> "hitfiger" in the index, how to fix the query?
> 
> -- 
> Andrew Klochkov
> Senior Software Engineer,
> Grid Dynamics



Re: Maximum number of fields allowed in a Solr document

2009-11-27 Thread Otis Gospodnetic
Hi Alex,

There is no built-in limit.  The limit is going to be dictated by your hardware 
resources.  In particular, this sounds like a memory intensive app because of 
sorting on lots of different fields.  You didn't mention the size of your 
index, but that's a factor, too.  Once in a while people on the list mention 
cases with lots and lots of fields, so I'd check ML archives.
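For what it's worth, thousands of per-document integer fields would normally be covered by a single dynamic-field declaration in schema.xml rather than thousands of explicit ones; a sketch (the name pattern is illustrative, and the type assumes a sortable int type from the example schema, e.g. "sint" in the 1.4 era):

```xml
<!-- schema.xml: one declaration covers e.g. score_groupA,
     score_groupB, ... Each field sorted on still costs memory
     (a FieldCache entry per field), which is the real limit. -->
<dynamicField name="score_*" type="sint" indexed="true" stored="true"/>
```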

Otis
--
Sematext is hiring -- http://sematext.com/about/jobs.html?mls
Lucene, Solr, Nutch, Katta, Hadoop, HBase, UIMA, NLP, NER, IR



- Original Message 
> From: Alex Wang 
> To: "solr-user@lucene.apache.org" 
> Sent: Thu, November 26, 2009 12:47:36 PM
> Subject: Maximum number of fields allowed in a Solr document
> 
> Hi,
> 
> We are in the process of designing a Solr app where we might have  
> millions of documents and within each of the document, we might have  
> thousands of dynamic fields. These fields are small and only contain  
> an integer, which needs to be retrievable and sortable.
> 
> My questions are:
> 
> 1. Is there a limit on the number of fields allowed per document?
> 2. What is the performance impact for such design?
> 3. Has anyone done this before and is it a wise thing to do?
> 
> Thanks,
> 
> Alex



Re: SolrException caused by illegal character

2009-11-27 Thread Otis Gospodnetic
Could it be that your XML contains a control character (code 3, ETX)? ;)

Check the table on http://en.wikipedia.org/wiki/ASCII  

Otis
--
Sematext is hiring -- http://sematext.com/about/jobs.html?mls
Lucene, Solr, Nutch, Katta, Hadoop, HBase, UIMA, NLP, NER, IR



- Original Message 
> From: György Frivolt 
> To: solr-user 
> Sent: Thu, November 26, 2009 8:54:20 AM
> Subject: SolrException caused by illegal character
> 
> Hi,
> I upgraded to Solr 1.4 and tried to reindex the data. After a few
> thousand reindexed documents an exception is thrown; I did not see
> this with 1.3 before. Do you have any idea what caused the problem?
> Thanks.
> 
> SEVERE: org.apache.solr.common.SolrException: Illegal character
> ((CTRL-CHAR, code 3))
> at [row,col {unknown-source}]: [6495,39]
> at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:72)
> at 
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
> at 
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
> at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
> at 
> org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
> at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
> at 
> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1089)
> at 
> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:365)
> at 
> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
> at 
> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
> at 
> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
> at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
> at 
> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:211)
> at 
> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
> at 
> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
> at org.mortbay.jetty.Server.handle(Server.java:285)
> at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:502)
> at 
> org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:835)
> at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:641)
> at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:208)
> at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:378)
> at 
> org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:226)
> at 
> org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)
> Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Illegal
> character ((CTRL-CHAR, code 3))
> at [row,col {unknown-source}]: [6495,39]
> at com.ctc.wstx.sr.StreamScanner.throwInvalidSpace(StreamScanner.java:675)
> at 
> com.ctc.wstx.sr.BasicStreamReader.readTextPrimary(BasicStreamReader.java:4556)
> at 
> com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2888)
> at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1019)
> at org.apache.solr.handler.XMLLoader.readDoc(XMLLoader.java:273)
> at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:138)
> at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
> ... 22 more



ExternalFileField is broken in Solr 1.4?

2009-11-27 Thread Koji Sekiguchi
It seems that ExternalFileField doesn't work in 1.4.
In 1.4, I need to restart Solr to reflect the external_[fieldname] file.
Only a <commit/> was needed in 1.3...

Koji

-- 
http://www.rondhuit.com/en/



Re: SolrException caused by illegal character

2009-11-27 Thread György Frivolt
Thanks, I also found out I had to filter my data. Now I removed the
control chars and Solr is happy, like I am.
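Filtering before posting is straightforward; a minimal Python sketch of stripping the characters XML 1.0 forbids (tab, newline, and carriage return are legal and kept; names are illustrative):

```python
import re

# XML 1.0 forbids control characters other than tab (0x09),
# newline (0x0A) and carriage return (0x0D).
ILLEGAL_XML_CHARS = re.compile(r"[\x00-\x08\x0B\x0C\x0E-\x1F]")

def clean_for_xml(text):
    """Remove characters that make Solr's XML loader throw
    'Illegal character (CTRL-CHAR)' errors on indexing."""
    return ILLEGAL_XML_CHARS.sub("", text)

print(clean_for_xml("tommy\x03hitfiger"))  # tommyhitfiger
```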

On Sat, Nov 28, 2009 at 5:13 AM, Otis Gospodnetic
 wrote:
> Could it be that your XML contains a control character (code 3, ETX)? ;)
>
> Check the table on http://en.wikipedia.org/wiki/ASCII
>
> Otis
> --
> Sematext is hiring -- http://sematext.com/about/jobs.html?mls
> Lucene, Solr, Nutch, Katta, Hadoop, HBase, UIMA, NLP, NER, IR
>
>
>
> - Original Message 
>> From: György Frivolt 
>> To: solr-user 
>> Sent: Thu, November 26, 2009 8:54:20 AM
>> Subject: SolrException caused by illegal character
>>
>> Hi,
>>     I upgraded to Solr 1.4 and tried to reindex the data. After a few
>> thousand reindexed documents an exception is thrown; I did not see
>> this with 1.3 before. Do you have any idea what caused the problem?
>> Thanks.
>>
>> SEVERE: org.apache.solr.common.SolrException: Illegal character
>> ((CTRL-CHAR, code 3))
>> at [row,col {unknown-source}]: [6495,39]
>>     at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:72)
>>     at
>> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>>     at
>> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>>     at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
>>     at
>> org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
>>     at
>> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
>>     at
>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1089)
>>     at 
>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:365)
>>     at
>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>     at 
>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
>>     at 
>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
>>     at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
>>     at
>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:211)
>>     at
>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>     at 
>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
>>     at org.mortbay.jetty.Server.handle(Server.java:285)
>>     at 
>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:502)
>>     at
>> org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:835)
>>     at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:641)
>>     at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:208)
>>     at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:378)
>>     at
>> org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:226)
>>     at
>> org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)
>> Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Illegal
>> character ((CTRL-CHAR, code 3))
>> at [row,col {unknown-source}]: [6495,39]
>>     at 
>> com.ctc.wstx.sr.StreamScanner.throwInvalidSpace(StreamScanner.java:675)
>>     at
>> com.ctc.wstx.sr.BasicStreamReader.readTextPrimary(BasicStreamReader.java:4556)
>>     at
>> com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2888)
>>     at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1019)
>>     at org.apache.solr.handler.XMLLoader.readDoc(XMLLoader.java:273)
>>     at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:138)
>>     at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
>>     ... 22 more
>
>