Matt Weber-2 wrote:
>
> Check out the field collapsing patch:
>
> http://wiki.apache.org/solr/FieldCollapsing
> https://issues.apache.org/jira/browse/SOLR-236
>
That looks like just the ticket. Thanks for the quick response.
Peter
On Mon, Sep 28, 2009 at 2:59 AM, Jibo John wrote:
> Additionally, I get the same exception even if I declare the
> in the .
>
>
>
> true
>
>
>
That should be instead of
--
Regards,
Shalin Shekhar Mangar.
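For reference, a Solr 1.4-era merge policy declaration in solrconfig.xml might look like the sketch below. The element placement under <indexDefaults> and the exact attribute syntax are assumptions based on this thread's context, not a confirmed fix:

```xml
<!-- Sketch only: placement and syntax assumed, verify against your Solr version -->
<indexDefaults>
  <mergePolicy class="org.apache.lucene.index.LogByteSizeMergePolicy"/>
  <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler"/>
</indexDefaults>
```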
You are running a very old version of Java 6 (update 6). The latest is
update 16. You should definitely upgrade. There is a bug in Java 6
starting with update 4 that may result in a corrupted Lucene/Solr index:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6707044
https://issues.apache.org/
Mark Miller wrote:
> It's certainly happening when it's trying to switch to a new log file. It
> doesn't try and close the file otherwise. Why there is such a pause
> doing it, I dunno. Perhaps because there are tons of file descriptors
> around?
>
> Does it happen on a fresh system, or did it just s
Additionally, I get the same exception even if I declare the
in the .
class="org.apache.lucene.index.LogByteSizeMergePolicy">
true
Thanks,
-Jibo
On Sep 27, 2009, at 2:03 PM, Jibo John wrote:
Thanks for this. I've updated trunk/, rebuilt solr.war, however,
running in
It's certainly happening when it's trying to switch to a new log file. It
doesn't try and close the file otherwise. Why there is such a pause
doing it, I dunno. Perhaps because there are tons of file descriptors
around?
Does it happen on a fresh system, or did it just start after running for
a long
I originally thought it was replication but one of the servers
exhibited the same issue with polling disabled. If it is tomcat why
would it block for so long on a simple log rotation?
Additionally, we see similar things during the day. Replication often
is around the occurrence but never at
Thanks for this. I've updated trunk/, rebuilt solr.war, however,
running into another issue.
Sep 27, 2009 1:55:44 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.IllegalArgumentException
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.
Thanks Mark.
Ticket number below:
https://issues.apache.org/jira/browse/SOLR-1468
Cheers,
Tomasz
On Sun, Sep 27, 2009 at 1:46 PM, Mark Miller wrote:
> It's an unknown I believe Tomasz - I can't find anything related in JIRA
> and it still exists in trunk - please log a JIRA issue.
>
> --
> - Mar
On Mon, Sep 28, 2009 at 1:18 AM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:
> On Sat, Sep 26, 2009 at 7:13 AM, Jibo John wrote:
>
>> Hello,
>>
>> It looks like solr is not allowing me to change the default
>> MergePolicy/Scheduler classes.
>>
>> Even if I change the default MergePolic
Right... when I increased it to 12GB all the OOMs just disappeared. And all the
tests are being run on the live environment and for several hours, so it is
real enough :) As soon as I update the JVM and test the GC again I will let you
know. If you think I can run another test meanwhile just let me know.
On S
Jonathan Ariel wrote:
> Well.. it is strange that when I use the default GC I don't get any errors.
>
Not so strange - it's different code. The bug is likely in the low-pause
collector and not the serial collector.
> If I'm so close to run out of memory I should see those OOM exceptions as
> wel
The response times in a Solr request don't include the time to read
stored fields (since the response is streamed) and don't include the
time to transfer/read the response (which can be increased by a
slow/congested network link, or a slow client that doesn't read the
response immediately).
How
Fuad Efendi wrote:
> Mark,
>
>
> Nothing against orange-hat :)
>
> Nothing against GC tuning; but if SOLR needs application-specific settings
> it should be well-documented.
>
> GC-tuning: for instance, we need it for 'realtime' Online Trading
> applications. However, even Online Banking doesn't ne
On Sat, Sep 26, 2009 at 7:13 AM, Jibo John wrote:
> Hello,
>
> It looks like solr is not allowing me to change the default
> MergePolicy/Scheduler classes.
>
> Even if I change the default MergePolicy/Scheduler (LogByteSizeMergePolicy
> and ConcurrentMergeScheduler) defined in solrconfig.xml to a
Well.. it is strange that when I use the default GC I don't get any errors.
If I'm so close to running out of memory I should see those OOM exceptions as
well with the standard GC. BTW I'm faceting on around 13 fields and my total
number of unique values is around 3.
One of the fields with the bigge
Hello,
I am trying to measure why some of my queries take a long time. I am using
EmbeddedSolrServer and with logging statements before and
after the EmbeddedSolrServer.query(SolrQuery) function, I have found the
time to be around 16s. I added the debugQuery=true and the timing component
for this r
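To isolate where a slow query's time goes, one low-tech check is wall-clock timing around the query call itself, then comparing that against the QTime from debugQuery output: a large gap points at response writing rather than search. A minimal stdlib sketch, where runQuery is a hypothetical stand-in for the actual EmbeddedSolrServer.query(SolrQuery) call:

```java
public class QueryTiming {
    // Hypothetical stand-in for EmbeddedSolrServer.query(SolrQuery);
    // the sleep just simulates a query taking some time.
    static void runQuery() throws InterruptedException {
        Thread.sleep(50);
    }

    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        runQuery();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // Compare this against the QTime reported by debugQuery: QTime covers
        // search time, not stored-field reads or streaming the response out.
        System.out.println("query took " + elapsedMs + " ms");
    }
}
```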
Mark,
Nothing against orange-hat :)
Nothing against GC tuning; but if SOLR needs application-specific settings
it should be well-documented.
GC-tuning: for instance, we need it for 'realtime' Online Trading
applications. However, even Online Banking doesn't need it; primary reason - GC
must happen
If he needed double the RAM, he'd likely know by now :) The JVM likes to
throw OOM exceptions when you need more RAM. Until it does - that's an
odd path to focus on. There has been no indication he has ever seen an
OOM with his over 10 GB heap. It sounds like he has run Solr in his
environment for
>> Ok. After the server ran for more than 12 hours, the time spent on GC
>> decreased from 11% to 3.4%, but 5 hours later it crashed.
All this 'black-hat' GC tuning and 'fast' object moving (especially objects
accessed by some thread during GC-defragmentation)
- try to use multithreaded load-str
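As an illustration only (the flag values below are assumptions for a Java 6-era low-pause setup, not recommendations from this thread, and heap sizes must be tuned per deployment), a CMS-based launch line for a Solr instance might look like:

```
# Sketch: example CMS flags only; -Xms/-Xmx sized to the 12GB heap mentioned above
java -Xms12g -Xmx12g \
     -XX:+UseConcMarkSweepGC \
     -XX:+UseParNewGC \
     -XX:+PrintGCDetails -Xloggc:gc.log \
     -jar start.jar
```

Keeping GC logging on (the last flags) is what lets you verify claims like "time spent on GC decreased from 11% to 3.4%" instead of guessing.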
: As I mentioned previously, I prefer to do this with as little java
: code as possible. That's the motivation for me to take a look at solr.
I understand, but as I already said "there is no pure configuration way to
obtain the same logic you could get from a custom HitCollector"
you can get th
> Thanks, this helps.
> But our synonym file has some 16,000 sets of synonyms.
That's a lot. Can you give some examples?
> - the individual synonyms in your synonym file should be in
> a form as if they were sent through the tokenizers which
> come before the SynonymFilterFactory.
Exactly. Orde
Hi,
I have integrated Carrot2 with Solr. When I give a search input string, it gives
me the corresponding search results for the given input string.
I would like to know on what basis the search results are displayed.
For ex: There are 10 links that get displayed for a given input string. The
first l
Perhaps something like Tomcat rotating its log files nightly?
-Yonik
http://www.lucidimagination.com
On Sun, Sep 27, 2009 at 11:13 AM, Mark Miller wrote:
> Doesn't sound so random ;)
>
> Do you have anything specific going on at that time? Replication,
> something else scheduled? Pretty odd it
Doesn't sound so random ;)
Do you have anything specific going on at that time? Replication,
something else scheduled? Pretty odd it would happen at around the same
time every night unless something is set to occur then ...
Jeff Newburn wrote:
> It appears that a few seconds after midnight every
It appears that a few seconds after midnight every night our solr 1.4
instances block for about 15-30 seconds and stop serving search requests. I
have included a stack trace of the running thread and one that is
blocked by it. Please let me know of any way I can stop this or if it is a
solr i
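When the pause window is this predictable (a few seconds after midnight), capturing thread dumps across it is a standard diagnosis step. A rough sketch, assuming jstack from the matching JDK is on the PATH and <pid> is the Solr JVM's process id:

```
# Sketch: one thread dump per second for a minute spanning midnight;
# diff the dumps to see what the blocked threads are waiting on
for i in $(seq 1 60); do
  jstack <pid> >> "dump-$(date +%H%M%S).txt"
  sleep 1
done
```

If the dumps show everything queued behind a logging lock during file rotation, that supports the log-rotation theory raised elsewhere in this thread.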
It's an unknown I believe Tomasz - I can't find anything related in JIRA
and it still exists in trunk - please log a JIRA issue.
--
- Mark
http://www.lucidimagination.com
Tomasz Wróbel wrote:
> When using Solrj client and setting query params:
> queryParams.setMissing("true")
> or
> queryParam
Thanks, this helps.
But our synonym file has some 16,000 sets of synonyms.
Should the wiki warn users?
- WhitespaceTokenizerFactory with synonyms at indexing will not expand synonyms
in text "... synonym[punctuation mark] ..."
- the individual synonyms in your synonym file should be in a form
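To illustrate that point (the field type and file names below are made up), the tokenizer that runs before SynonymFilterFactory determines the token form your synonyms.txt entries must match, which is why whitespace tokenization plus trailing punctuation can prevent a match:

```xml
<!-- Sketch: hypothetical field type. The key point is that entries in
     synonyms.txt must look like tokens as emitted by the tokenizer above
     the synonym filter, here a whitespace tokenizer. -->
<fieldType name="text_syn" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt"
            ignoreCase="true" expand="true"/>
  </analyzer>
</fieldType>
```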
When using Solrj client and setting query params:
queryParams.setMissing("true")
or
queryParams.set(FacetParams.FACET_MISSING, "true")
I'm getting an exception as below:
...
Caused by: org.apache.solr.common.SolrException: parsing error
at org.apache.solr.client.solrj.impl.XMLResponseParser.pro
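For comparison while SOLR-1468 is open, the same option expressed as raw request parameters (host, core path, and field name here are placeholders) is the facet.missing parameter, which is what FacetParams.FACET_MISSING resolves to:

```
http://localhost:8983/solr/select?q=*:*&facet=true&facet.field=category&facet.missing=true
```

Issuing the query directly like this can help confirm whether the failure is in the server's response or in SolrJ's XML response parsing.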