Re: custom sorter

2012-07-20 Thread Lee Carroll
take a look at
http://wiki.apache.org/solr/QueryElevationComponent
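
A rough, untested sketch of what that could look like (using the "xyz"/"abc"
values from your example; note that elevation is keyed off the query text,
not the filter query). In solrconfig.xml:

  <searchComponent name="elevator" class="solr.QueryElevationComponent">
    <str name="queryFieldType">string</str>
    <str name="config-file">elevate.xml</str>
  </searchComponent>

with "elevator" added to the last-components of your request handler, and then
in elevate.xml:

  <elevate>
    <query text="xyz">
      <doc id="abc"/>
    </query>
  </elevate>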

On 20 July 2012 03:48, Siping Liu  wrote:

> Hi,
> I have requirements to place a document to a pre-determined  position for
> special filter query values, for instance when filter query is
> fq=(field1:"xyz") place document abc as first result (the rest of the
> result set will be ordered by sort=field2). I guess I have to plug in my
> Java code as a custom sorter. I'd appreciate it if someone can shed light
> on this (how to add custom sorter, etc.)
> TIA.
>


RE: How to setup SimpleFSDirectoryFactory

2012-07-20 Thread Uwe Schindler
Hi Bill,

MMapDirectory uses the file system cache of your operating system, which has
the following consequences: on Linux, top & free will normally report only
*little* free memory, because the O/S uses all memory not allocated by
applications to cache disk I/O (and shows it as allocated, so having 0% free
memory is just normal on Linux and also Windows). If you have other
applications, or Lucene/Solr itself, that allocate lots of heap space or
malloc() a lot, then you are reducing free physical memory and therefore the fs
cache. This also depends on your swappiness parameter (if swappiness is
higher, inactive processes are swapped out more easily; the default is 60 on
Linux), freeing more space for the FS cache - the downside is of course that
in-memory structures of Lucene and other applications may get paged out.

You will only see no paging at all if the memory allocated by all applications
plus all mmapped files fits into physical memory. But paging in/out the mmapped
Lucene index is much cheaper than using SimpleFSDirectory or NIOFSDirectory. If
you use SimpleFS or NIO and your index is not in the FS cache, it will also be
read from physical disk again, so where is the difference? Paging is actually
cheaper, as no syscalls are involved.

If you want as much as possible of your index in physical RAM, copy it to
/dev/null regularly and buy more RAM :-)

-
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

> -Original Message-
> From: Bill Bell [mailto:billnb...@gmail.com]
> Sent: Friday, July 20, 2012 5:17 AM
> Subject: Re: How to setup SimpleFSDirectoryFactory
> 
> Thanks. Are you saying that if we run low on memory, the MMapDirectory will
> stop using it? The least used memory will be removed from the OS
> automatically? I see some paging. Wouldn't paging slow down the querying?
> 
> My index is 10GB and every 8 hours we get most of it in shared memory. The
> memory is 99 percent used, and that does not leave any room for other apps.
> Other implications?
> 
> Sent from my mobile device
> 720-256-8076
> 
> On Jul 19, 2012, at 9:49 AM, "Uwe Schindler"  wrote:
> 
> > Read this, then you will see that MMapDirectory will use 0% of your Java
> > Heap space or free system RAM:
> >
> > http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
> >
> > Uwe
> >
> > -
> > Uwe Schindler
> > H.-H.-Meier-Allee 63, D-28213 Bremen
> > http://www.thetaphi.de
> > eMail: u...@thetaphi.de
> >
> >
> >> -Original Message-
> >> From: William Bell [mailto:billnb...@gmail.com]
> >> Sent: Tuesday, July 17, 2012 6:05 AM
> >> Subject: How to setup SimpleFSDirectoryFactory
> >>
> >> We all know that MMapDirectory is fastest. However we cannot always
> >> use it since you might run out of memory on large indexes, right?
> >>
> >> Here is how I got SimpleFSDirectoryFactory to work. Just set -
> >> Dsolr.directoryFactory=solr.SimpleFSDirectoryFactory.
> >>
> >> Your solrconfig.xml:
> >>
> >> <directoryFactory name="DirectoryFactory" class="${solr.directoryFactory:solr.StandardDirectoryFactory}"/>
> >>
> >> You can check it with http://localhost:8983/solr/admin/stats.jsp
> >>
> >> Notice that the default for 64-bit Windows is MMapDirectory, otherwise
> >> NIOFSDirectory (except for 32-bit Windows, which falls back to
> >> SimpleFSDirectory). It would be nicer if we just
> >> set it all up with a helper in solrconfig.xml...
> >>
> >> if (Constants.WINDOWS) {
> >>   if (MMapDirectory.UNMAP_SUPPORTED && Constants.JRE_IS_64BIT)
> >>     return new MMapDirectory(path, lockFactory);
> >>   else
> >>     return new SimpleFSDirectory(path, lockFactory);
> >> } else {
> >>   return new NIOFSDirectory(path, lockFactory);
> >> }
> >>
> >>
> >>
> >> --
> >> Bill Bell
> >> billnb...@gmail.com
> >> cell 720-256-8076
> >
> >




Solr Monitoring Tool

2012-07-20 Thread Suneel
Hi, 

I want to configure a Solr performance monitoring tool. I have searched a lot and
found some tools like "Zabbix" and "SolrGaze",
but I am not able to decide which tool is better. 

I want an integrated alerting option in the tool so that I can receive a message
when website performance goes down or in case of an exception.

Please help me, this will be very helpful for me.

Thanks




-
Regards,

Suneel Pandey
Sr. Software Developer
--
View this message in context: 
http://lucene.472066.n3.nabble.com/Solr-Monitoring-Tool-tp3996153.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning the connector in server.xml, I didn't modify anything, so all 
values are the defaults.

I have only one connector and no maxThreads is defined inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the <Connector> tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
of threads... maybe you could post some more details about the system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina  wrote:

Dear Solr User,

I don't know if it's here that my question must be posted but I'm sure some
users have already had my problem.

Actually, I do 1556 requests with 4 Http components with my program. If I do
these requests without delay (500ms)
before sending each requests I have around 10% of requests with empty
answer. If I add delay before each requests I have no empty answer.

Empty answer has HTTP 200 OK, Header OK but Body = ''

Where can I increase the limit of Tomcat/Solr requests at the same time or
how can I solve my problem.

Thanks a lot for your Help,
Bruno






Re: Solr Monitoring Tool

2012-07-20 Thread Paul Libbrecht
Suneel,

There are many monitoring tools out there.
Zabbix is one of them, it is in PHP.
I think SolrGaze as well (not sure).
I've been using HypericHQ, which is pure java, and I have been satisfied with 
it though it leaves some space for enhancement.
Other names include Nagios, also in PHP, and RRDtools/Munin (Perl).

For any of them you can define alerts based on thresholds on certain values 
(e.g. mean page load time).

If you search around, which I've done recently, you will find many many other 
tools which offer "incomplete" solutions, often more oriented to data 
collections or data views (e.g. Graphite)

paul


On 20 Jul 2012, at 11:58, Suneel wrote:

> Hi, 
> 
> I am want to configure solr performance monitoring tool i surf a lot and
> found some tool like "zabbix, SolrGaze"
> but i am not able to decide which tool is better. 
> 
> I want integrated alerting option in tools so that i can received a message
> whether website performance going down or in case of exception.
> 
> Please help me, this will be very helpful for me.
> 
> Thanks
> 
> 
> 
> 
> -
> Regards,
> 
> Suneel Pandey
> Sr. Software Developer
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-Monitoring-Tool-tp3996153.html
> Sent from the Solr - User mailing list archive at Nabble.com.



Re: NGram Indexing Basic Question

2012-07-20 Thread Erick Erickson
Try attaching &debugQuery=on to your query and look at the parsed
query. My first guess
is that your default operator is AND (or a.op in modern terms) and the
ngram with
"dl" in it is required.

Please paste the results here if that's not the cause.
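
For example (assuming a local default install), a request like
http://localhost:8983/solr/select?q=springfiedl&debugQuery=on
will show the parsed query in the debug section of the response.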

Best
Erick

On Thu, Jul 19, 2012 at 7:29 AM, Husain, Yavar  wrote:
> I have set some of my fields to be NGram Indexed. Have also set analyzer both 
> at query as well as index level.
>
> Most of the stuff works fine except for use cases where I simply interchange 
> couple of characters.
>
> For an example: "springfield" retrieves correct matches, "springfi" retrieves 
> correct matches, "ingfield" retrieves correct matches.
>
> However when i say "springfiedl" it returns 0 results. I debugged and found 
> that at query/index level I have all correct N-Grams stored. So ideally it 
> should match "springfie" (which is there both in Query NGram and Index NGram) 
> and return me the correct results.
>
> As I was busy so did not get time to look at the code for NGram. What ideally 
> happens when I use NGram at Query level? Does it split the strings into 
> N-Grams and then send each of them to Solr Server?
>
> Thanks Sahi for your help yesterday. Appreciate that.
> 


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

More details:
First (around) 50 requests are very quick and after connection down 
(very slow) and freeze sometime.


I'm trying to install a tool to see what happens.



On 20/07/2012 12:09, Bruno Mannina wrote:

Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning connector on server.xml, I don't modified something, so all 
values are default.

I have only one connector and no maxThreads are define inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the  tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
of threads... maybe you could post some more details about the system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina  wrote:

Dear Solr User,

I don't know if it's here that my question must be posted but I'm 
sure some

users have already had my problem.

Actually, I do 1556 requests with 4 Http components with my program. 
If I do

these requests without delay (500ms)
before sending each requests I have around 10% of requests with empty
answer. If I add delay before each requests I have no empty answer.

Empty answer has HTTP 200 OK, Header OK but Body = ''

Where can I increase the limit of Tomcat/Solr requests at the same 
time or

how can I solve my problem.

Thanks a lot for your Help,
Bruno










Re: Does defType overrides other settings for default request handler

2012-07-20 Thread Erick Erickson
Default operators are ignored by edismax. See the "mm" parameter here:

http://wiki.apache.org/solr/DisMaxQParserPlugin
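
For example (just a sketch; the value is illustrative), setting mm to 1 makes
edismax behave OR-like, either per request:

  q=foo bar&defType=edismax&mm=1

or in the handler defaults in solrconfig.xml:

  <lst name="defaults">
    <str name="defType">edismax</str>
    <str name="mm">1</str>
  </lst>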

Best
Erick

On Thu, Jul 19, 2012 at 8:16 AM, amitesh116  wrote:
> Hi,
>
> We have used dismax in our SOLR config with defaultOperator="OR" and
> some mm settings. Recently, we have started using defType=edismax in
> query params. With this change, we have observed significant drop in results
> count. We doubt that SOLR is using default operator="AND" and hence reducing
> the results count. Please confirm if our suspicion is correct or are we
> missing some part?
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Does-defType-overrides-other-settings-for-default-request-handler-tp3995946.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Re: Solr grouping / facet query

2012-07-20 Thread Erick Erickson
You might try two queries. The first would get your authors; the second
would use the returned authors as a filter query and search your titles,
grouped by author, then combine the two lists. I don't know how big your
corpus is, but two queries may well be fast enough.
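
Very roughly (the field and type names here are made up), something like:

  1) q=political&fq=doc_type:author&fl=author_id
  2) q=<user query>&fq=author_id:(12 34 56)&group=true&group.field=author_id

where the second query's fq is built from the author ids returned by the first.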

Best
Erick

On Thu, Jul 19, 2012 at 10:28 AM, s215903406
 wrote:
> Thanks for the reply.
>
> To clarify, the idea is to search for authors with certain specialties (eg.
> political, horror, etc.) and if they have any published titles relevant to
> the user's query, then display those titles next to the author's name.
>
> At first, I thought it would be great to have all the author's data (name,
> location, bio, titles with descriptions, etc) all in one document. Each
> title and description being a multivalued field, however, I have no idea how
> the "relevant titles" based on the user's query as described above can be
> quickly picked from within the document and displayed.
>
> The only solution I see is to have a doc per title and include the name,
> location, bio, etc in each one. As for the author's with no published
> titles, simply add their bio data to a document with no title or description
> and when I do the "grouping" check to see if the title is blank, then
> display "no titles found".
>
> This could work, though I'm concerned if having all that duplicate bio data
> will affect the relevancy of the results or speed/performance of solr?
>
> Thank you.
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-grouping-facet-query-tp3995787p3995974.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Re: Solr facet multiple constraint

2012-07-20 Thread davidbougearel
OK, a facet query sounds nice. I will try this feature and will reply to you, but I
think that's the point. Thanks a lot for the time spent :)



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Solr-facet-multiple-constraint-tp3992974p3996186.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Solr Commit not working after delete

2012-07-20 Thread Erick Erickson
There's almost nothing to go on here. Please review:
http://wiki.apache.org/solr/UsingMailingLists

Best
Erick

On Thu, Jul 19, 2012 at 1:44 PM, Rohit  wrote:
> Hi Brandan,
>
> I am not sure if get whats being suggested. Our delete worked fine, but now
> no new data is going into the system.
>
> Could you please throw some more light.
>
> Regards,
> Rohit
>
> -Original Message-
> From: Brendan Grainger [mailto:brendan.grain...@gmail.com]
> Sent: 19 July 2012 17:33
> To: solr-user@lucene.apache.org
> Subject: Re: Solr Commit not working after delete
>
> You might be running into the same issue someone else had the other day:
>
> https://issues.apache.org/jira/browse/SOLR-3432
>
>
>
> On Jul 19, 2012, at 1:23 PM, Rohit wrote:
>
>> We delete some data from solr, post which solr is not accepting any
>> commit's. What could be wrong?
>>
>>
>>
>> We don't see any error in logs or anywhere else.
>>
>>
>>
>> Regards,
>>
>> Rohit
>>
>>
>>
>
>
>


Re: Solr facet multiple constraint

2012-07-20 Thread davidbougearel
I have tried and it works !

Thanks again a lot for this dude !

Regards,
David.



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Solr-facet-multiple-constraint-tp3992974p3996189.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: NGram Indexing Basic Question

2012-07-20 Thread Husain, Yavar
Thanks Erick. Actually it was going in as a phrase query. I set the following 
filter and things are perfect



-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com] 
Sent: Friday, July 20, 2012 5:23 PM
To: solr-user@lucene.apache.org
Subject: Re: NGram Indexing Basic Question

Try attaching &debugQuery=on to your query and look at the parsed query. My 
first guess is that your default operator is AND (or a.op in modern terms) and 
the ngram with "dl" in it is required.

Please paste the results here if that's not the cause.

Best
Erick

On Thu, Jul 19, 2012 at 7:29 AM, Husain, Yavar  wrote:
> I have set some of my fields to be NGram Indexed. Have also set analyzer both 
> at query as well as index level.
>
> Most of the stuff works fine except for use cases where I simply interchange 
> couple of characters.
>
> For an example: "springfield" retrieves correct matches, "springfi" retrieves 
> correct matches, "ingfield" retrieves correct matches.
>
> However when i say "springfiedl" it returns 0 results. I debugged and found 
> that at query/index level I have all correct N-Grams stored. So ideally it 
> should match "springfie" (which is there both in Query NGram and Index NGram) 
> and return me the correct results.
>
> As I was busy so did not get time to look at the code for NGram. What ideally 
> happens when I use NGram at Query level? Does it split the strings into 
> N-Grams and then send each of them to Solr Server?
>
> Thanks Sahi for your help yesterday. Appreciate that.
> 


Re: Solr facet multiple constraint

2012-07-20 Thread Erick Erickson
NP, glad it's working for you!

On Fri, Jul 20, 2012 at 8:26 AM, davidbougearel
 wrote:
> I have tried and it works !
>
> Thanks again a lot for this dude !
>
> Regards,
> David.
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-facet-multiple-constraint-tp3992974p3996189.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Spellchecker component with multiple cores

2012-07-20 Thread sebio
Hello solr users,

I want to use the spellchecker component. All examples and tutorials I found
deal with one index. Our Solr setup has multiple cores, one per
language. The spellchecker component should be based on the different
languages in the cores.

I am unsure how to handle the setup. Does it make sense to have a
spellchecker index for each index/language? If so, how should I handle the
solrconfig.xml files? Do I have to configure the request handlers and
spellchecker components in each core's solrconfig.xml? Is it possible to
configure the dictionaries in the cores?
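
For reference, this is roughly the kind of per-core setup I am imagining in each
core's solrconfig.xml (just a sketch, not tested; the field name is an example):

  <searchComponent name="spellcheck" class="solr.SpellCheckComponent">
    <lst name="spellchecker">
      <str name="name">default</str>
      <str name="field">spell_de</str>
      <str name="spellcheckIndexDir">./spellchecker</str>
      <str name="buildOnCommit">true</str>
    </lst>
  </searchComponent>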

What is the general rule with core configurations? Do core configuration
files extend or override the global config files?

Perhaps someone could help me with that topic.

Kind regards,
Sebastian









--
View this message in context: 
http://lucene.472066.n3.nabble.com/Spellchecker-component-with-multiple-cores-tp3996216.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Solr Monitoring Tool

2012-07-20 Thread Mark Miller
Hooking up Zabbix with Solr's / Java's JMX support is very powerful.
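
As a rough sketch (the port and flags below are only examples): enable JMX in
solrconfig.xml with

  <jmx />

and start the JVM with remote JMX enabled, e.g.

  -Dcom.sun.management.jmxremote
  -Dcom.sun.management.jmxremote.port=18983
  -Dcom.sun.management.jmxremote.ssl=false
  -Dcom.sun.management.jmxremote.authenticate=false

then point Zabbix's JMX monitoring at that port.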

On Jul 20, 2012, at 5:58 AM, Suneel wrote:

> Hi, 
> 
> I am want to configure solr performance monitoring tool i surf a lot and
> found some tool like "zabbix, SolrGaze"
> but i am not able to decide which tool is better. 
> 
> I want integrated alerting option in tools so that i can received a message
> whether website performance going down or in case of exception.
> 
> Please help me, this will be very helpful for me.
> 
> Thanks
> 
> 
> 
> 
> -
> Regards,
> 
> Suneel Pandey
> Sr. Software Developer
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-Monitoring-Tool-tp3996153.html
> Sent from the Solr - User mailing list archive at Nabble.com.

- Mark Miller
lucidimagination.com













Re: UTF-8

2012-07-20 Thread Mark Miller
It varies. Last I used Tomcat (some years ago) it defaulted to the system 
default encoding and you had to use -Dfile.encoding... to get UTF-8.

Jetty currently defaults to UTF-8.
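
For Tomcat that usually means something like (sketch) adding
-Dfile.encoding=UTF-8 to JAVA_OPTS, and also setting URIEncoding="UTF-8" on the
<Connector> in server.xml so that request parameters are decoded as UTF-8.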

On Jul 17, 2012, at 11:12 PM, William Bell wrote:

> -Dfile.encoding=UTF-8... Is this usually recommended for SOLR indexes?
> 
> Or is the encoding usually just handled by the servlet container like Jetty?
> 
> -- 
> Bill Bell
> billnb...@gmail.com
> cell 720-256-8076

- Mark Miller
lucidimagination.com













Re: Solr Monitoring Tool

2012-07-20 Thread Bernd Fehling
Hi,

I started with SysUsage http://sysusage.darold.net/ which grabs all system 
activities
using unix Sar and system commands. Pretty easy and simple.

I also tried Zabbix. Very powerful, but for me too much to configure.

I now have Munin 2.0.2 installed for testing. It needs some Perl knowledge to get
it configured right with fast-cgi and dynamic pages.
Munin also has problem reporting (critical, warning, unknown) and can work
together with Nagios, if you already have Nagios in use.

Regards,
Bernd


Am 20.07.2012 11:58, schrieb Suneel:
> Hi, 
> 
> I am want to configure solr performance monitoring tool i surf a lot and
> found some tool like "zabbix, SolrGaze"
> but i am not able to decide which tool is better. 
> 
> I want integrated alerting option in tools so that i can received a message
> whether website performance going down or in case of exception.
> 
> Please help me, this will be very helpful for me.
> 
> Thanks
> 




Re: Solr Monitoring Tool

2012-07-20 Thread Ahmet Arslan

> I am want to configure solr performance monitoring tool i
> surf a lot and
> found some tool like "zabbix, SolrGaze"

You might be interested in http://sematext.com/spm/index.html



NumberFormatException while indexing TextField with LengthFilter and then copying to tfloat

2012-07-20 Thread Chantal Ackermann
Hi all,

I'm trying to index float values that are not required, input is an XML file. I 
have problems avoiding the NFE.
I'm using SOLR 3.6.



Index input:
- XML using DataImportHandler with XPathProcessor

Data:
Optional, Float, CDATA like: 2.0 or 


Original Problem:
Empty values would cause a NumberFormatException when being loaded directly 
into a "tfloat" type field.

Processing chain (to avoid NFE):
via XPath loaded into a field of type text with a trim and length filter, then 
via copyField directive into the tfloat type field

data-config.xml:


schema.xml:
...









...






Problem:
Well, yet another NFE. But this time reported on the text field 
"s_estimated_hours":

WARNUNG: Error creating document : SolrInputDocument[{id=id(1.0)={2930}, 
s_estimated_hours=s_estimated_hours(1.0)={}}]
org.apache.solr.common.SolrException: ERROR: [doc=2930] Error adding field 
's_estimated_hours'=''
at 
org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:333)
at 
org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
at 
org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:115)
at 
org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:66)
at 
org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:293)
at 
org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:723)
at 
org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:619)
at 
org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327)
at 
org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225)
at 
org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375)
at 
org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445)
at 
org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:426)
Caused by: java.lang.NumberFormatException: empty String
at 
sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:992)
at java.lang.Float.parseFloat(Float.java:422)
at org.apache.solr.schema.TrieField.createField(TrieField.java:410)
at org.apache.solr.schema.FieldType.createFields(FieldType.java:289)
at org.apache.solr.schema.SchemaField.createFields(SchemaField.java:107)
at 
org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:312)
... 11 more


It looks as if it copies the empty value - which should not make it through the
LengthFilter of "s_estimated_hours" - to the tfloat field "estimated_hours"
anyway. How can I avoid this? Or is there any other way to make the indexer
ignore the empty values when creating the tfloat fields? If it could at least
create the document and index the other values… (onError="continue" does not
help, as this is only a warning - I've tried.)


BTW: I did try with the XPath that should only select those nodes with text: 
/issues/issue/estimated_hours[text()]
The result was that no values would make it into the tfloat fields while all 
documents would be indexed without warnings or errors. (I discarded this option 
thinking that the xpath was not correctly evaluated.)


Thank you for any suggestions!
Chantal

Why does PayloadTermWeight explanation hide the scorePayload Explanation?

2012-07-20 Thread Scott Smerchek
I'm using the PayloadTermQuery and scoring documents using a custom
algorithm based on the payloads of the matching terms. The algorithm is
implemented in the custom PayloadFunction and I have added an Override for
the explain. However, the PayloadTermWeight explanation hides the details
of the payload score...

Explanation payloadExpl = new Explanation(scorer.getPayloadScore(),
"scorePayload(...)");

This is different than the way that PayloadNearSpanWeight explains the
payload. It actually asks the payload function for the explanation rather
than hiding it:

Explanation payloadExpl = function.explain(doc, scorer.payloadsSeen,
scorer.payloadScore);

This seems like a bug/oversight. Should this be a JIRA issue, or is this
intended? The fix is obviously very simple (just ask the PayloadFunction as
in the PayloadNearSpanWeight).

- Scott


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Michael Della Bitta
Hi Bruno,

It seems the version of Tomcat I was running was customized by
Canonical to have that parameter. You might try to add it in... I have
no idea what the default is.
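
If you do add it, it would look roughly like this in server.xml (the values here
are only examples):

  <Connector port="8080" protocol="HTTP/1.1"
             connectionTimeout="20000"
             URIEncoding="UTF-8"
             maxThreads="400"
             redirectPort="8443" />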

Do you have any idea how much RAM you're allocating to the Tomcat
process? It could be that something is off there.

http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina  wrote:
> More details:
> First (around) 50 requests are very quick and after connection down (very
> slow) and freeze sometime.
>
> I'm trying to install a tool to see what happens.
>
>
>
> Le 20/07/2012 12:09, Bruno Mannina a écrit :
>
>> Dear Michael,
>>
>> My system is:
>> Ubuntu 12.04
>> 8Go Ram
>> 4 cores
>>
>> Concerning connector on server.xml, I don't modified something, so all
>> values are default.
>> I have only one connector and no maxThreads are define inside.
>>
>> > connectionTimeout="2"
>> URIEncoding="UTF-8"
>> redirectPort="8443" />
>>
>>
>> Must I add a line with maxThreads=?
>>
>>
>>
>> Le 20/07/2012 03:31, Michael Della Bitta a écrit :
>>>
>>> Hi Bruno,
>>>
>>> It's usually the maxThreads attribute in the  tag in
>>> $CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
>>> of threads... maybe you could post some more details about the system
>>> you're running Solr on.
>>>
>>> Michael Della Bitta
>>>
>>> 
>>> Appinions, Inc. -- Where Influence Isn’t a Game.
>>> http://www.appinions.com
>>>
>>>
>>> On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina  wrote:

 Dear Solr User,

 I don't know if it's here that my question must be posted but I'm sure
 some
 users have already had my problem.

 Actually, I do 1556 requests with 4 Http components with my program. If
 I do
 these requests without delay (500ms)
 before sending each requests I have around 10% of requests with empty
 answer. If I add delay before each requests I have no empty answer.

 Empty answer has HTTP 200 OK, Header OK but Body = ''

 Where can I increase the limit of Tomcat/Solr requests at the same time
 or
 how can I solve my problem.

 Thanks a lot for your Help,
 Bruno
>>>
>>>
>>
>>
>>
>


RE: solr 4.0 cloud 303 error

2012-07-20 Thread John-Paul Drawneek
Hi.

Sorry for the noise.

Managed to get it working on another PC, so it must be something very local to 
the PC I am using.


From: John-Paul Drawneek
Sent: 19 July 2012 23:13
To: solr-user@lucene.apache.org
Subject: RE: solr 4.0 cloud 303 error

I did a search via both admin UI and /search

What I searched for was *:* as that was default in the search box in the admin 
ui (so expected something that was not an 303 error).

Will post url and server logs tomorrow when I am back in the office.

But i think the admin url was not anything odd.

Server logs was full of chatter between the nodes in the cloud setup.


From: Chris Hostetter [hossman_luc...@fucit.org]
Sent: 19 July 2012 23:03
To: solr-user
Subject: Re: solr 4.0 cloud 303 error

: > try to do a search - throws 303 error

Can you be specific about how exactly you did the search?

Was this from the admin UI?  what URL was in your browser location bar?
what values did you put in the form? what buttons did you click? what URL
was in your browser location bar when the error happened?

Can you post the logs from each of the servers from around the time of
this error (a few lines of context before it happened as well)?

: >> org.apache.solr.common.SolrException: Server at
: >> http://linux-vckp:8983/solr/collection1 returned non ok status:303,

that smells like something jetty *might* be returning automatically
because the client asked for...
   http://linux-vckp:8983/solr/collection1
...instead of...
   http://linux-vckp:8983/solr/collection1/
... (ie: no trailing slash) ... but i'm not sure why HttpShardHandler
would be asking for either of those URLs w/o specifying a handler.



-Hoss









Re: NumberFormatException while indexing TextField with LengthFilter and then copying to tfloat

2012-07-20 Thread Chris Hostetter

: Processing chain (to avoid NFE): via XPath loaded into a field of type 
: text with a trim and length filter, then via copyField directive into 
: the tfloat type field

The root of the problem you are seeing is that copyField directives are 
applied to the *raw* field values -- the analyzer used on your "source" 
field won't have any effect on the values given to your "dest" field.

My suggestion would be to modify the XPath expression you are using to 
pull data out of your original XML files and ignore  ""

Alternatively: there are some new UpdateProcessors available in 4.0 that 
let you easily prune field values based on various criteria (update 
porcessors happen well before copyField)...

http://lucene.apache.org/solr/api-4_0_0-ALPHA/org/apache/solr/update/processor/RemoveBlankFieldUpdateProcessorFactory.html
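
A rough sketch of such a chain in solrconfig.xml (4.0, untested):

  <updateRequestProcessorChain name="skip-blanks" default="true">
    <processor class="solr.RemoveBlankFieldUpdateProcessorFactory"/>
    <processor class="solr.LogUpdateProcessorFactory"/>
    <processor class="solr.RunUpdateProcessorFactory"/>
  </updateRequestProcessorChain>

(referenced from your update handler, or via the update.chain request parameter)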


: Problem:
: Well, yet another NFE. But this time reported on the text field 
"s_estimated_hours":

I believe this is intentional, but I can understand how it might be
confusing.

I think the point here is that since the field submitted by the client was 
named "s_estimated_hours" that's the field used in the error reported back 
to the client when something goes wrong with the copyField -- if the error 
message refered to "estimated_hours" the client may not have any idea 
why/where that field came from.

But I can certainly understand the confusion; I've opened SOLR-3657 to try
and improve on this. Ideally the error message should make it clear that
the "value" from the "source" field was copied to the "dest" field, which then
encountered the "error".

: 
: WARNUNG: Error creating document : SolrInputDocument[{id=id(1.0)={2930}, 
s_estimated_hours=s_estimated_hours(1.0)={}}]
: org.apache.solr.common.SolrException: ERROR: [doc=2930] Error adding field 
's_estimated_hours'=''
:   at 
org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:333)
...
: Caused by: java.lang.NumberFormatException: empty String
:   at 
sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:992)
:   at java.lang.Float.parseFloat(Float.java:422)
:   at org.apache.solr.schema.TrieField.createField(TrieField.java:410)
:   at org.apache.solr.schema.FieldType.createFields(FieldType.java:289)
:   at org.apache.solr.schema.SchemaField.createFields(SchemaField.java:107)
:   at 
org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:312)
:   ... 11 more


-Hoss


Re: Reg issue with indexing data from one of the sqlserver DB

2012-07-20 Thread Lakshmi Bhargavi
Thank you Michael .. 

I overlooked that. Now it's working and the data got indexed.

Regards,
Lakshmi



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Reg-issue-with-indexing-data-from-one-of-the-sqlserver-DB-tp3996078p3996303.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Reg issue with indexing data from one of the sqlserver DB

2012-07-20 Thread Michael Della Bitta
Good to hear!

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 2:56 PM, Lakshmi Bhargavi
 wrote:
> Thank you Michael ..
>
> I overlooked that . Now its working and got the data indexed.
>
> Regards,
> Lakshmi
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Reg-issue-with-indexing-data-from-one-of-the-sqlserver-DB-tp3996078p3996303.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Re: Search results not returned for a str field

2012-07-20 Thread Michael Della Bitta
Hello, Lakshmi,

The issue is the fieldType you've assigned to the fields in your
schema does not perform any analysis on the string before indexing it.
So it will only do exact matches. If you want to do matches against
portions of the field value, use one of the "text" types that come in
the default schema.
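
For example (a sketch, assuming the example schema's "text_general" type is
available in your schema), declaring the field like

  <field name="name" type="text_general" indexed="true" stored="true"/>

and reindexing would let q=iPod match on the individual words of the name.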

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 3:18 PM, Lakshmi Bhargavi
 wrote:
> Hi ,
>
> I have the following configuration
> 
>
>
> 
>   
> omitNorms="true"/>
>   
>
>  
>
>multiValued="false" required="true"/>
>multiValued="false" />
>multiValued="false" />
>multiValued="false" />
>  
>
>
>  id
>
>
>  name
>
>
>  
> 
>
> I am also attaching the solr config file
> http://lucene.472066.n3.nabble.com/file/n3996313/solrconfig.xml
> solrconfig.xml
>
> I indexed a document
>
> 
>   MA147LL/A
>   Apple 60 GB iPod with Video Playback Black
>
> 
>
> When I do a wildcard search , the results are returned
> http://localhost:8983/solr/select?q=*:*
>
>   
> - 
> - 
>   0
>   1
>   
> - 
> - 
>   MA147LL/A
>   Apple 60 GB iPod with Video Playback Black
>   
>   
>   
>
>
> but the results are not returned for specific query
> http://localhost:8983/solr/core0/select?q=iPod
>
> 
> - 
> - 
>   0
>   5
>   
>   
>   
>
> Could some one please let me know what is wrong? Also would be very helpful
> if some one can explain the significance of the defaultsearch field.
>
> Thanks,
> lakshmi
>
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Search-results-not-returned-for-a-str-field-tp3996313.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Re: Search results not returned for a str field

2012-07-20 Thread Dikchant Sahi
DefaultSearchField is the field which is queried if you don't explicitly
specify the fields to query on.

Please refer to the below link:
http://wiki.apache.org/solr/SchemaXml
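
For example (sketch), in schema.xml:

  <defaultSearchField>name</defaultSearchField>

or pass df=name (or qf=name with dismax/edismax) on the request instead.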

On Sat, Jul 21, 2012 at 12:56 AM, Michael Della Bitta <
michael.della.bi...@appinions.com> wrote:

> Hello, Lakshmi,
>
> The issue is the fieldType you've assigned to the fields in your
> schema does not perform any analysis on the string before indexing it.
> So it will only do exact matches. If you want to do matches against
> portions of the field value, use one of the "text" types that come in
> the default schema.
>
> Michael Della Bitta
>
> 
> Appinions, Inc. -- Where Influence Isn’t a Game.
> http://www.appinions.com
>
>
> On Fri, Jul 20, 2012 at 3:18 PM, Lakshmi Bhargavi
>  wrote:
> > Hi ,
> >
> > I have the following configuration
> > 
> >
> >
> > 
> >   
> > > omitNorms="true"/>
> >   
> >
> >  
> >
> >> multiValued="false" required="true"/>
> >> multiValued="false" />
> >> multiValued="false" />
> >> multiValued="false" />
> >  
> >
> >
> >  id
> >
> >
> >  name
> >
> >
> >  
> > 
> >
> > I am also attaching the solr config file
> > http://lucene.472066.n3.nabble.com/file/n3996313/solrconfig.xml
> > solrconfig.xml
> >
> > I indexed a document
> >
> > 
> >   MA147LL/A
> >   Apple 60 GB iPod with Video Playback Black
> >
> > 
> >
> > When I do a wildcard search , the results are returned
> > http://localhost:8983/solr/select?q=*:*
> >
> >   
> > - 
> > - 
> >   0
> >   1
> >   
> > - 
> > - 
> >   MA147LL/A
> >   Apple 60 GB iPod with Video Playback Black
> >   
> >   
> >   
> >
> >
> > but the results are not returned for specific query
> > http://localhost:8983/solr/core0/select?q=iPod
> >
> > 
> > - 
> > - 
> >   0
> >   5
> >   
> >   
> >   
> >
> > Could some one please let me know what is wrong? Also would be very
> helpful
> > if some one can explain the significance of the defaultsearch field.
> >
> > Thanks,
> > lakshmi
> >
> >
> >
> >
> > --
> > View this message in context:
> http://lucene.472066.n3.nabble.com/Search-results-not-returned-for-a-str-field-tp3996313.html
> > Sent from the Solr - User mailing list archive at Nabble.com.
>


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

Hi Michael,

I set Xms1024m Xmx2048

I will take a look to your link, thanks !!!

Actually, all my tests work slowly, even with 150 requests :'(


On 20/07/2012 18:17, Michael Della Bitta wrote:

Hi Bruno,

It seems the version of Tomcat I was running was customized by
Canonical to have that parameter. You might try to add it in... I have
no idea what the default is.

Do you have any idea how much RAM you're allocating to the Tomcat
process? It could be that something is off there.

http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina  wrote:

More details:
First (around) 50 requests are very quick and after connection down (very
slow) and freeze sometime.

I'm trying to install a tool to see what happens.



On 20/07/2012 12:09, Bruno Mannina wrote:


Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning connector on server.xml, I don't modified something, so all
values are default.
I have only one connector and no maxThreads are define inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the  tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
of threads... maybe you could post some more details about the system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina  wrote:

Dear Solr User,

I don't know if it's here that my question must be posted but I'm sure
some
users have already had my problem.

Actually, I do 1556 requests with 4 Http components with my program. If
I do
these requests without delay (500ms)
before sending each requests I have around 10% of requests with empty
answer. If I add delay before each requests I have no empty answer.

Empty answer has HTTP 200 OK, Header OK but Body = ''

Where can I increase the limit of Tomcat/Solr requests at the same time
or
how can I solve my problem.

Thanks a lot for your Help,
Bruno











Re: Solr Add-ons

2012-07-20 Thread Ahmet Arslan
> I would like to know whether there
> exist any add-on for semantic search.inSolr?

May be this http://siren.sindice.com/ ?


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

Hum... by using
export JAVA_OPTS="-Xms1024m -Xmx2048m -XX:MaxPermSize=512m"

it seems to be very quick, but I need to add a delay between requests
because I lose answers with HTTP 200 OK :'(


I must do more and more tests, but it's a start!

On 20/07/2012 22:40, Bruno Mannina wrote:

Hi Michael,

I set Xms1024m Xmx2048

I will take a look to your link, thanks !!!

Actually, all my tests works slowlyeven with 150 requests :'(


On 20/07/2012 18:17, Michael Della Bitta wrote:

Hi Bruno,

It seems the version of Tomcat I was running was customized by
Canonical to have that parameter. You might try to add it in... I have
no idea what the default is.

Do you have any idea how much RAM you're allocating to the Tomcat
process? It could be that something is off there.

http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina  wrote:

More details:
First (around) 50 requests are very quick and after connection down 
(very

slow) and freeze sometime.

I'm trying to install a tool to see what happens.



On 20/07/2012 12:09, Bruno Mannina wrote:


Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning connector on server.xml, I don't modified something, so all
values are default.
I have only one connector and no maxThreads are define inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the  tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running 
out

of threads... maybe you could post some more details about the system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina  
wrote:

Dear Solr User,

I don't know if it's here that my question must be posted but I'm 
sure

some
users have already had my problem.

Actually, I do 1556 requests with 4 Http components with my 
program. If

I do
these requests without delay (500ms)
before sending each requests I have around 10% of requests with 
empty

answer. If I add delay before each requests I have no empty answer.

Empty answer has HTTP 200 OK, Header OK but Body = ''

Where can I increase the limit of Tomcat/Solr requests at the 
same time

or
how can I solve my problem.

Thanks a lot for your Help,
Bruno















Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Michael Della Bitta
Hmm, are you seeing any errors in $CATALINA_HOME/logs/catalina.out
that suggest that you're running out of permgen space, or anything
else?

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 4:55 PM, Bruno Mannina  wrote:
> hum... by using
> |export JAVA_OPTS=||"-Xms1024m -Xmx2048m -XX:MaxPermSize=512m"|
>
> it seems to be very quick, but I need to add delay between each requests
> because I loose answer with http answer 200 OK :'(
>
> I must do another and another tests but It's a begin !
>
> Le 20/07/2012 22:40, Bruno Mannina a écrit :
>
>> Hi Michael,
>>
>> I set Xms1024m Xmx2048
>>
>> I will take a look to your link, thanks !!!
>>
>> Actually, all my tests works slowlyeven with 150 requests :'(
>>
>>
>> Le 20/07/2012 18:17, Michael Della Bitta a écrit :
>>>
>>> Hi Bruno,
>>>
>>> It seems the version of Tomcat I was running was customized by
>>> Canonical to have that parameter. You might try to add it in... I have
>>> no idea what the default is.
>>>
>>> Do you have any idea how much RAM you're allocating to the Tomcat
>>> process? It could be that something is off there.
>>>
>>> http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat
>>>
>>> Michael Della Bitta
>>>
>>> 
>>> Appinions, Inc. -- Where Influence Isn’t a Game.
>>> http://www.appinions.com
>>>
>>>
>>> On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina  wrote:

 More details:
 First (around) 50 requests are very quick and after connection down
 (very
 slow) and freeze sometime.

 I'm trying to install a tool to see what happens.



 Le 20/07/2012 12:09, Bruno Mannina a écrit :

> Dear Michael,
>
> My system is:
> Ubuntu 12.04
> 8Go Ram
> 4 cores
>
> Concerning connector on server.xml, I don't modified something, so all
> values are default.
> I have only one connector and no maxThreads are define inside.
>
>   connectionTimeout="2"
>  URIEncoding="UTF-8"
>  redirectPort="8443" />
>
>
> Must I add a line with maxThreads=?
>
>
>
> Le 20/07/2012 03:31, Michael Della Bitta a écrit :
>>
>> Hi Bruno,
>>
>> It's usually the maxThreads attribute in the  tag in
>> $CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
>> of threads... maybe you could post some more details about the system
>> you're running Solr on.
>>
>> Michael Della Bitta
>>
>> 
>> Appinions, Inc. -- Where Influence Isn’t a Game.
>> http://www.appinions.com
>>
>>
>> On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina 
>> wrote:
>>>
>>> Dear Solr User,
>>>
>>> I don't know if it's here that my question must be posted but I'm
>>> sure
>>> some
>>> users have already had my problem.
>>>
>>> Actually, I do 1556 requests with 4 Http components with my program.
>>> If
>>> I do
>>> these requests without delay (500ms)
>>> before sending each requests I have around 10% of requests with empty
>>> answer. If I add delay before each requests I have no empty answer.
>>>
>>> Empty answer has HTTP 200 OK, Header OK but Body = ''
>>>
>>> Where can I increase the limit of Tomcat/Solr requests at the same
>>> time
>>> or
>>> how can I solve my problem.
>>>
>>> Thanks a lot for your Help,
>>> Bruno
>>
>>
>
>
>>>
>>
>>
>>
>


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

Very strange, $CATALINA_HOME is empty?!

Help is welcome!

Another thing: in /usr/share/tomcat6/catalina.sh I added this line twice:
JAVA_OPTS="$JAVA_OPTS . -Xms1024m -Xmx2048m -XX:MaxPermSize=512m"



On 20/07/2012 23:02, Michael Della Bitta wrote:

Hmm, are you seeing any errors in $CATALINA_HOME/logs/catalina.out
that suggest that you're running out of permgen space, or anything
else?

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 4:55 PM, Bruno Mannina  wrote:

hum... by using
|export JAVA_OPTS=||"-Xms1024m -Xmx2048m -XX:MaxPermSize=512m"|

it seems to be very quick, but I need to add delay between each requests
because I loose answer with http answer 200 OK :'(

I must do another and another tests but It's a begin !

On 20/07/2012 22:40, Bruno Mannina wrote:


Hi Michael,

I set Xms1024m Xmx2048

I will take a look to your link, thanks !!!

Actually, all my tests works slowlyeven with 150 requests :'(


On 20/07/2012 18:17, Michael Della Bitta wrote:

Hi Bruno,

It seems the version of Tomcat I was running was customized by
Canonical to have that parameter. You might try to add it in... I have
no idea what the default is.

Do you have any idea how much RAM you're allocating to the Tomcat
process? It could be that something is off there.

http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina  wrote:

More details:
First (around) 50 requests are very quick and after connection down
(very
slow) and freeze sometime.

I'm trying to install a tool to see what happens.



On 20/07/2012 12:09, Bruno Mannina wrote:


Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning connector on server.xml, I don't modified something, so all
values are default.
I have only one connector and no maxThreads are define inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the  tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running out
of threads... maybe you could post some more details about the system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina 
wrote:

Dear Solr User,

I don't know if it's here that my question must be posted but I'm
sure
some
users have already had my problem.

Actually, I do 1556 requests with 4 Http components with my program.
If
I do
these requests without delay (500ms)
before sending each requests I have around 10% of requests with empty
answer. If I add delay before each requests I have no empty answer.

Empty answer has HTTP 200 OK, Header OK but Body = ''

Where can I increase the limit of Tomcat/Solr requests at the same
time
or
how can I solve my problem.

Thanks a lot for your Help,
Bruno













Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Michael Della Bitta
Sorry, if you're running the Ubuntu-provided Tomcat, your log should
be in /var/log/tomcat6/catalina.out.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 5:09 PM, Bruno Mannina  wrote:
> Very Strange $CATALINA_HOME is empty ?!!!
>
> Help is welcome !
>
> Another thing, in the /usr/share/tomcat6/catalina.sh I added twice time
> JAVA_OPTS="$JAVA_OPTS . -Xms1024m -Xmx2048m -XX:MaxPermSize=512m"
>
>
>
> Le 20/07/2012 23:02, Michael Della Bitta a écrit :
>
>> Hmm, are you seeing any errors in $CATALINA_HOME/logs/catalina.out
>> that suggest that you're running out of permgen space, or anything
>> else?
>>
>> Michael Della Bitta
>>
>> 
>> Appinions, Inc. -- Where Influence Isn’t a Game.
>> http://www.appinions.com
>>
>>
>> On Fri, Jul 20, 2012 at 4:55 PM, Bruno Mannina  wrote:
>>>
>>> hum... by using
>>> |export JAVA_OPTS=||"-Xms1024m -Xmx2048m -XX:MaxPermSize=512m"|
>>>
>>> it seems to be very quick, but I need to add delay between each requests
>>> because I loose answer with http answer 200 OK :'(
>>>
>>> I must do another and another tests but It's a begin !
>>>
>>> Le 20/07/2012 22:40, Bruno Mannina a écrit :
>>>
 Hi Michael,

 I set Xms1024m Xmx2048

 I will take a look to your link, thanks !!!

 Actually, all my tests works slowlyeven with 150 requests :'(


 Le 20/07/2012 18:17, Michael Della Bitta a écrit :
>
> Hi Bruno,
>
> It seems the version of Tomcat I was running was customized by
> Canonical to have that parameter. You might try to add it in... I have
> no idea what the default is.
>
> Do you have any idea how much RAM you're allocating to the Tomcat
> process? It could be that something is off there.
>
> http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat
>
> Michael Della Bitta
>
> 
> Appinions, Inc. -- Where Influence Isn’t a Game.
> http://www.appinions.com
>
>
> On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina 
> wrote:
>>
>> More details:
>> First (around) 50 requests are very quick and after connection down
>> (very
>> slow) and freeze sometime.
>>
>> I'm trying to install a tool to see what happens.
>>
>>
>>
>> Le 20/07/2012 12:09, Bruno Mannina a écrit :
>>
>>> Dear Michael,
>>>
>>> My system is:
>>> Ubuntu 12.04
>>> 8Go Ram
>>> 4 cores
>>>
>>> Concerning connector on server.xml, I don't modified something, so
>>> all
>>> values are default.
>>> I have only one connector and no maxThreads are define inside.
>>>
>>> >>   connectionTimeout="2"
>>>   URIEncoding="UTF-8"
>>>   redirectPort="8443" />
>>>
>>>
>>> Must I add a line with maxThreads=?
>>>
>>>
>>>
>>> Le 20/07/2012 03:31, Michael Della Bitta a écrit :

 Hi Bruno,

 It's usually the maxThreads attribute in the  tag in
 $CATALINA_HOME/conf/server.xml. But I kind of doubt you're running
 out
 of threads... maybe you could post some more details about the
 system
 you're running Solr on.

 Michael Della Bitta

 
 Appinions, Inc. -- Where Influence Isn’t a Game.
 http://www.appinions.com


 On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina 
 wrote:
>
> Dear Solr User,
>
> I don't know if it's here that my question must be posted but I'm
> sure
> some
> users have already had my problem.
>
> Actually, I do 1556 requests with 4 Http components with my
> program.
> If
> I do
> these requests without delay (500ms)
> before sending each requests I have around 10% of requests with
> empty
> answer. If I add delay before each requests I have no empty answer.
>
> Empty answer has HTTP 200 OK, Header OK but Body = ''
>
> Where can I increase the limit of Tomcat/Solr requests at the same
> time
> or
> how can I solve my problem.
>
> Thanks a lot for your Help,
> Bruno


>>>


>>
>


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

If I try to do:

cd /var/log/tomcat6

I get a permission denied?!

The tomcat6/ directory exists and it has drwxr-x--- 2 tomcat6 adm


On 20/07/2012 23:16, Michael Della Bitta wrote:

Sorry, if you're running the Ubuntu-provided Tomcat, your log should
be in /var/log/tomcat6/catalina.out.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 5:09 PM, Bruno Mannina  wrote:

Very Strange $CATALINA_HOME is empty ?!!!

Help is welcome !

Another thing, in the /usr/share/tomcat6/catalina.sh I added twice time
JAVA_OPTS="$JAVA_OPTS . -Xms1024m -Xmx2048m -XX:MaxPermSize=512m"



On 20/07/2012 23:02, Michael Della Bitta wrote:


Hmm, are you seeing any errors in $CATALINA_HOME/logs/catalina.out
that suggest that you're running out of permgen space, or anything
else?

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 4:55 PM, Bruno Mannina  wrote:

hum... by using
|export JAVA_OPTS=||"-Xms1024m -Xmx2048m -XX:MaxPermSize=512m"|

it seems to be very quick, but I need to add delay between each requests
because I loose answer with http answer 200 OK :'(

I must do another and another tests but It's a begin !

On 20/07/2012 22:40, Bruno Mannina wrote:


Hi Michael,

I set Xms1024m Xmx2048

I will take a look to your link, thanks !!!

Actually, all my tests works slowlyeven with 150 requests :'(


On 20/07/2012 18:17, Michael Della Bitta wrote:

Hi Bruno,

It seems the version of Tomcat I was running was customized by
Canonical to have that parameter. You might try to add it in... I have
no idea what the default is.

Do you have any idea how much RAM you're allocating to the Tomcat
process? It could be that something is off there.

http://wiki.razuna.com/display/ecp/Adjusting+Memory+Settings+for+Tomcat

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Fri, Jul 20, 2012 at 7:57 AM, Bruno Mannina 
wrote:

More details:
First (around) 50 requests are very quick and after connection down
(very
slow) and freeze sometime.

I'm trying to install a tool to see what happens.



On 20/07/2012 12:09, Bruno Mannina wrote:


Dear Michael,

My system is:
Ubuntu 12.04
8Go Ram
4 cores

Concerning connector on server.xml, I don't modified something, so
all
values are default.
I have only one connector and no maxThreads are define inside.




Must I add a line with maxThreads=?



On 20/07/2012 03:31, Michael Della Bitta wrote:

Hi Bruno,

It's usually the maxThreads attribute in the  tag in
$CATALINA_HOME/conf/server.xml. But I kind of doubt you're running
out
of threads... maybe you could post some more details about the
system
you're running Solr on.

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Thu, Jul 19, 2012 at 6:47 PM, Bruno Mannina 
wrote:

Dear Solr User,

I don't know if this is the right place to post my question, but I'm sure
some users have already had this problem.

Actually, I send 1556 requests with 4 HTTP components from my program. If
I send these requests without a delay (500 ms) before each request, around
10% of them come back with an empty answer. If I add a delay before each
request, I get no empty answers.

An empty answer has HTTP 200 OK and correct headers, but the body = ''.

Where can I increase the limit on simultaneous Tomcat/Solr requests, or how
else can I solve my problem?

Thanks a lot for your help,
Bruno










Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Michael Della Bitta
Bruno,

That sounds like either you need sudo permissions on your machine, or
you need help from someone who has them. Having a look at the logs in
there should be fairly revealing.

Failing that, you could always go back to Jetty. :)

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com



Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

Michael,

I'm the admin of my server; I have only 2 accounts.

If I use
sudo cd /var/log/tomcat6
I enter the password and I get the message:
sudo: cd: command not found

My account is an admin account.

I don't understand what happens, but if I do:
sudo lsof -p pid_of_tomcat | grep log

I see several log files:
catalina.out   <-- twice
catalina.2012-07-20.log
localhost.2012-07-20.log

in /var/log/tomcat6.

I can see the content of all the .log files but not catalina.out.
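
(cd is a shell builtin rather than a program, so sudo cannot run it on its own. The log can be read without changing directory; a minimal sketch, assuming the stock Ubuntu log location:

sudo ls -l /var/log/tomcat6
sudo tail -n 200 /var/log/tomcat6/catalina.out
# or follow it live while replaying the failing requests:
sudo tail -f /var/log/tomcat6/catalina.out
)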



On 20/07/2012 23:34, Michael Della Bitta wrote:

Bruno,

That sounds like either you need sudo permissions on your machine, or
you need help from someone who has them. Having a look at the logs in
there should be fairly revealing.

Failing that, you could always go back to Jetty. :)

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com










Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

On 21/07/2012 00:00, Bruno Mannina wrote:
catalina.out <-- twice

Sorry, concerning this file: I did a
sudo cat .. | more and it's OK, I can see the content


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

On 21/07/2012 00:02, Bruno Mannina wrote:

On 21/07/2012 00:00, Bruno Mannina wrote:
catalina.out <-- twice

Sorry, concerning this file: I did a
sudo cat .. | more and it's OK, I can see the content

And inside catalina.out I see all my requests, with no errors and no
missing requests.


:'( it's unbelievable


Re: SOLR 4 Alpha Out Of Mem Err

2012-07-20 Thread sausarkar
Hi Mark,

I am also facing the same issue when trying to index into SolrCloud using
DIH running on a non-leader server. The DIH server creates around 10k
threads and then hits an OOM "cannot create thread" error.

Do you know when, or in which version, this issue will be solved? I think a
workaround is to find the leader from ZooKeeper and run DIH on the
leader.

Sauvik
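
(A quick way to confirm the runaway thread count on the DIH node, assuming a JDK with jstack available and <pid> standing in for the Solr JVM's process id:

# count threads that have a Java stack in the Solr JVM
jstack <pid> | grep -c 'java.lang.Thread.State'
# or, at the OS level on Linux:
ls /proc/<pid>/task | wc -l
)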



--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOLR-4-Alpha-Out-Of-Mem-Err-tp3995033p3996378.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: How to Increase the number of connexion on Solr/Tomcat6?

2012-07-20 Thread Bruno Mannina

In catalina.out I have only these few lines:

.
INFO: Closing Searcher@1faa614 main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
15 juil. 2012 13:51:31 org.apache.catalina.loader.WebappClassLoader 
clearThreadLocalMap
GRAVE: The web application [/solr] created a ThreadLocal with key of 
type [org.apache.solr.schema.DateField.ThreadLocalDateFormat] (value 
[org.apache.solr.schema.DateField$ThreadLocalDateFormat@75a744]) and a 
value of type 
[org.apache.solr.schema.DateField.ISO8601CanonicalDateFormat] (value 
[org.apache.solr.schema.DateField$ISO8601CanonicalDateFormat@6b2ed43a]) 
but failed to remove it when the web application was stopped. This is 
very likely to create a memory leak.
15 juil. 2012 13:51:31 org.apache.catalina.loader.WebappClassLoader 
clearThreadLocalMap
GRAVE: The web application [/solr] created a ThreadLocal with key of 
type [org.apache.solr.schema.DateField.ThreadLocalDateFormat] (value 
[org.apache.solr.schema.DateField$ThreadLocalDateFormat@75a744]) and a 
value of type 
[org.apache.solr.schema.DateField.ISO8601CanonicalDateFormat] (value 
[org.apache.solr.schema.DateField$ISO8601CanonicalDateFormat@6b2ed43a]) 
but failed to remove it when the web application was stopped. This is 
very likely to create a memory leak.
15 juil. 2012 13:51:31 org.apache.catalina.loader.WebappClassLoader 
clearThreadLocalMap
GRAVE: The web application [/solr] created a ThreadLocal with key of 
type [org.apache.solr.schema.DateField.ThreadLocalDateFormat] (value 
[org.apache.solr.schema.DateField$ThreadLocalDateFormat@75a744]) and a 
value of type 
[org.apache.solr.schema.DateField.ISO8601CanonicalDateFormat] (value 
[org.apache.solr.schema.DateField$ISO8601CanonicalDateFormat@6b2ed43a]) 
but failed to remove it when the web application was stopped. This is 
very likely to create a memory leak.
15 juil. 2012 13:51:31 org.apache.catalina.loader.WebappClassLoader 
clearThreadLocalMap
GRAVE: The web application [/solr] created a ThreadLocal with key of 
type [org.apache.solr.schema.DateField.ThreadLocalDateFormat] (value 
[org.apache.solr.schema.DateField$ThreadLocalDateFormat@75a744]) and a 
value of type 
[org.apache.solr.schema.DateField.ISO8601CanonicalDateFormat] (value 
[org.apache.solr.schema.DateField$ISO8601CanonicalDateFormat@6b2ed43a]) 
but failed to remove it when the web application was stopped. This is 
very likely to create a memory leak.

15 juil. 2012 13:51:31 org.apache.coyote.http11.Http11Protocol destroy
INFO: Arrêt de Coyote HTTP/1.1 sur http-8983
15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory 
validateFile
ATTENTION: Problem with directory [/usr/share/tomcat6/server/classes], 
exists: [false], isDirectory: [false], canRead: [false]
15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory 
validateFile
ATTENTION: Problem with directory [/usr/share/tomcat6/server], exists: 
[false], isDirectory: [false], canRead: [false]
15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory 
validateFile
ATTENTION: Problem with directory [/usr/share/tomcat6/shared/classes], 
exists: [false], isDirectory: [false], canRead: [false]
15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory 
validateFile
ATTENTION: Problem with directory [/usr/share/tomcat6/shared], exists: 
[false], isDirectory: [false], canRead: [false]

15 juil. 2012 13:54:29 org.apache.coyote.http11.Http11Protocol init
INFO: Initialisation de Coyote HTTP/1.1 sur http-8983
...
...
...

On 21/07/2012 00:04, Bruno Mannina wrote:

On 21/07/2012 00:02, Bruno Mannina wrote:

On 21/07/2012 00:00, Bruno Mannina wrote:
catalina.out <-- twice

Sorry, concerning this file: I did a
sudo cat .. | more and it's OK, I can see the content

And inside catalina.out I see all my requests, with no errors and no
missing requests.


:'( it's unbelievable






Re: SOLR 4 Alpha Out Of Mem Err

2012-07-20 Thread Mark Miller
>
> Hi Mark,
> I am also facing the same issue when trying to index into SolrCloud using
> DIH running on a non-leader server. The DIH server creates around 10k
> threads and then hits an OOM "cannot create thread" error.
> Do you know when, or in which version, this issue will be solved? I think a
> workaround is to find the leader from ZooKeeper and run DIH on the
> leader.
> Sauvik


Oddly, I didn't get the above email as far as I can tell; I just found it on
markmail... odd.

I've made a JIRA issue for this:
https://issues.apache.org/jira/browse/SOLR-3658

I'm working on / testing a fix.

Yes, it affects both the concurrent HTTP server and DIH. Anything that adds
docs with a single UpdateProcessor chain rather than a new one for each doc,
as long as it's adding fast enough.


RE: Solr grouping / facet query

2012-07-20 Thread Petersen, Robert
Why not just index one title per document, each having author and specialty
fields included? Then you could search titles with a user query and also
filter/facet on the author and specialties at the same time. The author bio
and other data could be looked up on the fly from a DB if you didn't want to
store it all in each document. If the user's query is for the titles, though,
I don't really see the point of indexing authors with no titles, but you
could include them with an empty title field if you wanted them to show up
in facets, or give them a placeholder title that says 'No Titles Available'.

Just a thought
Robi
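
(A rough sketch of the kind of request this enables, assuming hypothetical field names title, specialty and author; the group parameters also cover the grouped-by-author variant Erick suggests below:

curl 'http://localhost:8983/solr/select?q=title:horror&fq=specialty:political&facet=true&facet.field=author&group=true&group.field=author&wt=json&indent=true'
)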


-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com] 
Sent: Friday, July 20, 2012 5:07 AM
To: solr-user@lucene.apache.org
Subject: Re: Solr grouping / facet query

You might try two queries. The first would get your authors; the second would
use the returned authors as a filter query and search your titles, grouped by
author; then combine the two lists. I don't know how big your corpus is, but
two queries may well be fast enough.

Best
Erick

On Thu, Jul 19, 2012 at 10:28 AM, s215903406  
wrote:
> Thanks for the reply.
>
> To clarify, the idea is to search for authors with certain specialties (e.g.
> political, horror, etc.) and, if they have any published titles
> relevant to the user's query, to display those titles next to the author's
> name.
>
> At first, I thought it would be great to have all the author's data
> (name, location, bio, titles with descriptions, etc.) in one
> document, with each title and description being a multivalued field;
> however, I have no idea how the "relevant titles" matching the user's
> query as described above can be quickly picked out of the document and
> displayed.
>
> The only solution I see is to have a doc per title and include the
> name, location, bio, etc. in each one. As for the authors with no
> published titles, simply add their bio data to a document with no
> title or description, and when I do the "grouping" check whether the
> title is blank, then display "no titles found".
>
> This could work, though I'm concerned that having all that duplicate bio
> data will affect the relevancy of the results or the speed/performance of Solr.
>
> Thank you.
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-grouping-facet-query-tp3995787
> p3995974.html Sent from the Solr - User mailing list archive at 
> Nabble.com.




SOLR 4 Alpha - distributed DIH available?

2012-07-20 Thread sausarkar
If I try to run DIH on SolrCloud it can hit any one of the servers and
start the import process, but if we try to get the import status from any
other server it reports that no import is running. Only the server that is
running the DIH gives back the correct import status. So if we run DIH
behind a load balancer we can get an incorrect import status, and we have to
pin DIH to a specific server.

So my question is: is there a distributed DIH available for SolrCloud?
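
(For reference, a minimal sketch of polling the status directly from the node that started the import, assuming the handler is registered at the usual /dataimport path and <core> stands in for the core name:

curl 'http://dih-host:8983/solr/<core>/dataimport?command=status'
)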



--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOLR-4-Alpha-distributed-DIH-available-tp3996404.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: ICUCollation throws exception

2012-07-20 Thread Robert Muir
Can you include the entire exception? This is really necessary!

On Tue, Jul 17, 2012 at 2:58 AM, Oliver Schihin
 wrote:
> Hello
>
> According to the release notes for 4.0.0-ALPHA (SOLR-2396), I replaced
> ICUCollationKeyFilterFactory with ICUCollationField in our schema. But this
> throws an exception; see the following excerpt from the log:
> 
> Jul 16, 2012 5:27:48 PM org.apache.solr.common.SolrException log
> SEVERE: null:org.apache.solr.common.SolrException: Plugin init failure for
> [schema.xml] fieldType "alphaOnlySort": Plugin init failure for [schema.xml]
> analyzer/filter: class org.apache.solr.schema.ICUCollationField
>   at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
>   at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:359)
> 
> The deprecated ICUCollationKeyFilterFactory filter works without any
> problem. This is how I set up the schema (with the deprecated filter):
> <fieldType name="alphaOnlySort" ... sortMissingLast="true" omitNorms="true">
>   <analyzer>
>     <filter class="solr.ICUCollationKeyFilterFactory"
>             locale="de@collation=phonebook"
>             strength="primary"
>      />
>   </analyzer>
> </fieldType>
>
> Do I have to replace jars in /contrib/analysis-extras/, or do you have any
> other hints about what might be wrong in my install and configuration?
>
> Thanks a lot
> Oliver
>
>
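
One thing worth checking once the full trace is available: ICUCollationField is a field type rather than an analysis filter, so (as far as I understand the new API) it is declared as the class of a <fieldType> instead of inside an <analyzer> block. A minimal sketch of that style of declaration, with made-up names and the same phonebook collation:

<fieldType name="collatedPhonebook" class="solr.ICUCollationField"
           locale="de@collation=phonebook"
           strength="primary"/>

<field name="title_sort" type="collatedPhonebook" indexed="true" stored="false"/>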



-- 
lucidimagination.com


Re: Solr Monitoring Tool

2012-07-20 Thread Lance Norskog
Also, newrelic.com has a SaaS-based Solr monitor. This and Sematext are
the least work to set up.

We use Zabbix internally in LucidWorks Cloud. We picked it for a
production site because it connects to JMX, monitors & archives,
graphs, and sends alerts. We could not find anything else that did all
of these well. The JMX interface specification for Zabbix is included
in our free LucidWorks download at
http://www.lucidimagination.com/downloads
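
(For any of the JMX-based tools to see Solr's MBeans, the JVM has to expose JMX and solrconfig.xml needs a <jmx/> element; a minimal sketch of the usual JVM flags, assuming an unauthenticated connector on a trusted network and 9010 as an arbitrary port:

JAVA_OPTS="$JAVA_OPTS \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9010 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
)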

On Fri, Jul 20, 2012 at 7:08 AM, Ahmet Arslan  wrote:
>
>> I want to configure a Solr performance monitoring tool. I searched a
>> lot and found some tools like "zabbix, SolrGaze"
>
> You might be interested in http://sematext.com/spm/index.html
>



-- 
Lance Norskog
goks...@gmail.com


Re: Importing data to Solr

2012-07-20 Thread Lance Norskog
> My data is in an enormous text file that is parsed in python,

You mean it is in Python s-expressions? I don't think there is a
parser in DIH for that.
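
(If the Python parser can just as easily emit rows, the CSV route Erick mentions below can be driven straight from the shell in one streaming request; a rough sketch, assuming a local single-node Solr and the parsed output saved as data.csv with a header row:

curl 'http://localhost:8983/solr/update/csv?commit=true' \
     -H 'Content-type: text/csv; charset=utf-8' \
     --data-binary @data.csv
)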

On Thu, Jul 19, 2012 at 9:27 AM, Erick Erickson  wrote:
> First, turn off all your soft commit stuff; that won't help in your situation.
> If you do leave autocommit on, make it a really high number
> (let's say 1,000,000 to start).
>
> You won't have to make 300M calls; you can batch, say, 1,000 docs
> into each request.
>
> DIH supports a bunch of different data sources, take a
> look at: http://wiki.apache.org/solr/DataImportHandler, the
> EntityProcessor, DataSource and the like.
>
> There is also the CSV update processor, see:
> http://wiki.apache.org/solr/UpdateCSV. It might be better to, say,
> break up your massive file into N CSV files and import those.
>
> Best
> Erick
>
> On Thu, Jul 19, 2012 at 12:04 PM, Jonatan Fournier
>  wrote:
>> Hello,
>>
>> I was wondering if there are other ways to import data into Solr than
>> posting xml/json/csv to the server URL (e.g. locally building the
>> index). Is the DataImporter only for databases?
>>
>> My data is in an enormous text file that is parsed in Python. I can get
>> clean JSON/XML out of it if I want, but the thing is that it drills
>> down to about 300 million "documents", so I don't want to execute 300
>> million HTTP POSTs in a for loop; even with relaxed soft commits etc.
>> it would take weeks or months to populate the index.
>>
>> I need to do that only once on an offline server and never add data
>> back to the index (i.e. it becomes a read-only instance).
>>
>> Is there any temporary index configuration I could use to populate the
>> server with optimal add speed, and then switch back to settings
>> optimized for a read-only instance?
>>
>> Thanks!
>>
>> --
>> jonatan



-- 
Lance Norskog
goks...@gmail.com