Score is always zero

2014-04-20 Thread Arun V C .
Score is always zero

My search query is:

http://localhost:8080/solr/nutch?indent=on&defType=dismax&version=2.2&q=potenzialit%C3%A0&fq=host:www.gruppozenit.com&debugQuery=true&cache=false

There are 6 results.
The word potenzialità appears in both the title and content of the 5th record,
so that record should be ranked 1st, right?

Am I doing something wrong?

See the debug statement.

rawquerystring: potenzialità
querystring: potenzialità
parsedquery: +DisjunctionMaxQuery((content:potenzialità^3.0 | title:potenzialità^5.0 | anchor:potenzialità)~0.01) DisjunctionMaxQuery((content:potenzialità^3.0 | title:potenzialità^5.0 | anchor:potenzialità)~0.01)
parsedquery_toString: +(content:potenzialità^3.0 | title:potenzialità^5.0 | anchor:potenzialità)~0.01 (content:potenzialità^3.0 | title:potenzialità^5.0 | anchor:potenzialità)~0.01

explain for http://www.gruppozenit.com/eng/digital_dalla_rete.html:
0.0 = (MATCH) sum of:
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 639), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 639), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=639)
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 639), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 639), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=639)
explain for http://www.gruppozenit.com/ita/digital_communication.html:
0.0 = (MATCH) sum of:
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 725), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 725), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=725)
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 725), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 725), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=725)
explain for http://www.gruppozenit.com/ita/digital_dalla_rete.html:
0.0 = (MATCH) sum of:
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 726), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 726), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=726)
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 726), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 726), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=726)
explain for http://www.gruppozenit.com/ita/digital_servizi.html:
0.0 = (MATCH) sum of:
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:potenzialità^3.0 in 728), product of:
  0.22427149 = queryWeight(content:potenzialità^3.0), product of:
3.0 = boost
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.019065639 = queryNorm
  0.0 = (MATCH) fieldWeight(content:potenzialità in 728), product of:
1.0 = tf(termFreq(content:potenzialità)=1)
3.9210415 = idf(docFreq=65, maxDocs=1225)
0.0 = fieldNorm(field=content, doc=728)
  0.0 = (MATCH) max plus 0.01 times others of:
0.0 = (MATCH) weight(content:pote

RE: Score is always zero

2014-04-20 Thread Doug Turnbull
Arun, your field norms are all suspiciously 0, which, multiplied through
the scoring calculation, causes the overall score to be 0.

Are you using anything other than the default similarity? Could you
post the relevant parts of your schema (field definition, field types,
similarity, etc.)?
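To make the arithmetic concrete, here is a rough Python sketch of the classic TF-IDF explain tree from the debug output above (a simplification; the real computation lives in Lucene's TFIDFSimilarity), showing that a fieldNorm of 0.0 zeroes the whole product:

```python
def field_weight(tf, idf, field_norm):
    # fieldWeight = tf * idf * fieldNorm -- a 0.0 fieldNorm zeroes the product
    return tf * idf * field_norm

def clause_score(tf, idf, boost, query_norm, field_norm):
    # score = queryWeight * fieldWeight, mirroring the explain tree
    query_weight = boost * idf * query_norm
    return query_weight * field_weight(tf, idf, field_norm)

IDF = 3.9210415  # idf(docFreq=65, maxDocs=1225) from the debug output

score = clause_score(tf=1.0, idf=IDF, boost=3.0,
                     query_norm=0.019065639, field_norm=0.0)
print(score)  # 0.0 -- any non-zero fieldNorm would yield a non-zero score
```

Plugging in the debug values, queryWeight comes out to 0.22427149 as in the explain, and the only zero factor anywhere is the fieldNorm.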

-Doug

Sent from my Windows Phone From: Arun V C.
Sent: ‎4/‎20/‎2014 11:09 AM
To: solr-user@lucene.apache.org
Subject: Score is always zero
[original message and debug output quoted in full; snipped]

Re: need help from hard core solr experts - out of memory error

2014-04-20 Thread Candygram For Mongo
We have tried using fetchSize and we still got the same out of memory
errors.


On Fri, Apr 18, 2014 at 9:39 PM, Shawn Heisey  wrote:

> On 4/18/2014 6:15 PM, Candygram For Mongo wrote:
> > We are getting Out Of Memory errors when we try to execute a full import
> > using the Data Import Handler.  This error originally occurred on a
> > production environment with a database containing 27 million records.
>  Heap
> > memory was configured for 6GB and the server had 32GB of physical memory.
> >  We have been able to replicate the error on a local system with 6
> million
> > records.  We set the memory heap size to 64MB to accelerate the error
> > replication.  The indexing process has been failing in different
> scenarios.
> >  We have 9 test cases documented.  In some of the test cases we increased
> > the heap size to 128MB.  In our first test case we set heap memory to
> 512MB
> > which also failed.
>
> One characteristic of a JDBC connection is that unless you tell it
> otherwise, it will try to retrieve the entire resultset into RAM before
> any results are delivered to the application.  It's not Solr doing this,
> it's JDBC.
>
> In this case, there are 27 million rows in the resultset.  It's highly
> unlikely that this much data (along with the rest of Solr's memory
> requirements) will fit in 6GB of heap.
>
> JDBC has a built-in way to deal with this.  It's called fetchSize.  By
> using the batchSize parameter on your JdbcDataSource config, you can set
> the JDBC fetchSize.  Set it to something small, between 100 and 1000,
> and you'll probably get rid of the OOM problem.
>
> http://wiki.apache.org/solr/DataImportHandler#Configuring_JdbcDataSource
>
> If you had been using MySQL, I would have recommended that you set
> batchSize to -1.  This sets fetchSize to Integer.MIN_VALUE, which tells
> the MySQL driver to stream results instead of trying to either batch
> them or return everything.  I'm pretty sure that the Oracle driver
> doesn't work this way -- you would have to modify the dataimport source
> code to use their streaming method.
>
> Thanks,
> Shawn
>
>
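For anyone following along, a hedged sketch of where batchSize would go in the DIH data-config (the driver class, URL, and credentials below are placeholders, not taken from this thread):

```xml
<!-- data-config.xml: batchSize is passed through to the JDBC fetchSize -->
<dataSource type="JdbcDataSource"
            driver="oracle.jdbc.OracleDriver"
            url="jdbc:oracle:thin:@//dbhost:1521/ORCL"
            user="solr_user" password="changeme"
            batchSize="500"/>
```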


Re: need help from hard core solr experts - out of memory error

2014-04-20 Thread Shawn Heisey
On 4/20/2014 11:12 AM, Candygram For Mongo wrote:
> We have tried using fetchSize and we still got the same out of memory
> errors.

It needs to be batchSize, not fetchSize.  I mentioned too much of the
internal details.  The fetchSize name is only important if you're
writing source code that uses JDBC.

http://wiki.apache.org/solr/DataImportHandler#Configuring_JdbcDataSource

Thanks,
Shawn



Re: need help from hard core solr experts - out of memory error

2014-04-20 Thread Mikhail Khludnev
I noticed an enormous number of commits, which plausibly triggers merges
that hit the OOM. Try disabling autocommits completely, and monitor commit
occurrences in the log.
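A sketch of what disabling autocommit for the bulk import could look like in solrconfig.xml (the values shown are illustrative; the point is commenting out the autoCommit block so the importer controls commits):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- autoCommit disabled for the duration of the full import
  <autoCommit>
    <maxDocs>10000</maxDocs>
    <maxTime>60000</maxTime>
  </autoCommit>
  -->
</updateHandler>
```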


On Sun, Apr 20, 2014 at 9:12 PM, Candygram For Mongo <
candygram.for.mo...@gmail.com> wrote:

> We have tried using fetchSize and we still got the same out of memory
> errors.
>
>
> [Shawn's earlier reply quoted in full; snipped]



-- 
Sincerely yours
Mikhail Khludnev
Principal Engineer,
Grid Dynamics


 


Sunspot SolrException: The field location_s does not support spatial filtering

2014-04-20 Thread funkdified
For reference, here is the gem (plugin) I am using:
https://github.com/sunspot/sunspot/

I am trying to run this command as part of my solr search:

with(:location).in_radius(x, y, 50, :bbox => true)

I have defined this in my model definition:

location :location do
  Sunspot::Util::Coordinates.new(latitude, longitude) if self.latitude
end

I can run this command: 

with(:location).near(x, y, :precision => 3)

However, **I can't run the in-radius search**, which is what I really need.

Here is the server log:

  SOLR Request (5.0ms)  [ path=#
parameters={data:
fq=type%3ADispenser&fq=%7B%21bbox+sfield%3Dlocation_s+pt%3D29.7601927%2C-95.3693895999+d%3D50%7D&start=0&rows=10&q=%2A%3A%2A,
method: post, params: {:wt=>:ruby}, query: wt=ruby, headers:
{"Content-Type"=>"application/x-www-form-urlencoded; charset=UTF-8"}, path:
select, uri: http://localhost:8982/solr/select?wt=ruby, open_timeout: ,
read_timeout: , retry_503: , retry_after_limit: } ]
Completed 500 Internal Server Error in 293.3ms

RSolr::Error::Http - RSolr::Error::Http - 500 Internal Server Error
Error: The field location_s does not support spatial filtering

org.apache.solr.common.SolrException: The field location_s does not
support spatial filtering
at
org.apache.solr.search.SpatialFilterQParser.parse(SpatialFilterQParser.java:86)
at org.apache.solr.search.QParser.getQuery(QParser.java:142)
at
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:114)
at
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:173)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1372)
at
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:356)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:252)

Request Data:
"fq=type%3ADispenser&fq=%7B%21bbox+sfield%3Dlocation_s+pt%3D29.7601927%2C-95.3693895999+d%3D50%7D&start=0&rows=10&q=%2A%3A%2A"
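For what it's worth, the {!bbox sfield=location_s ...} filter in that request points at a "*_s" dynamic field, which in the stock Sunspot schema is a plain string type, and strings don't support bbox/geofilt. If I recall the Sunspot API correctly, the legacy "location" type stores geohashes (which is why near with a precision works), while in_radius needs a true lat/lon field, declared with "latlon" in the model so the generated field becomes location_ll. A hedged sketch of the schema.xml pieces that would back it (field and type names are assumptions based on the stock Sunspot schema; verify against your own):

```xml
<!-- schema.xml sketch: solr.LatLonType supports bbox/geofilt; strings do not -->
<fieldType name="location" class="solr.LatLonType" subFieldSuffix="_coordinate"/>
<dynamicField name="*_ll" type="location" indexed="true" stored="true"/>
<dynamicField name="*_coordinate" type="tdouble" indexed="true" stored="false"/>
```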



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Sunspot-SolrException-The-field-location-s-does-not-support-spatial-filtering-tp4132262.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Sunspot SolrException: The field location_s does not support spatial filtering

2014-04-20 Thread funkdified
By the way, for anyone interested in seeing my schema.xml:

https://gist.github.com/funkdified/3b503314738b97ab19e8



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Sunspot-SolrException-The-field-location-s-does-not-support-spatial-filtering-tp4132262p4132263.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Score is always zero

2014-04-20 Thread Doug Turnbull
Googling around for "fieldNorm == 0" brought me to this post, which also
involved Nutch several years ago; it might be relevant to you:

http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201011.mbox/%3c201011031930.05980.markus.jel...@openindex.io%3E

My only thought is that if your fields are abnormally large, then I wonder
if it's possible you are hitting floating-point underflow
(http://en.wikipedia.org/wiki/Arithmetic_underflow). The very low-precision
8-bit floats Lucene uses to encode norms would exacerbate this.
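As a rough illustration of that precision loss, here is a simplified 8-bit quantizer in Python (a linear toy encoder, not Lucene's actual SmallFloat scheme) showing how a small enough norm collapses to exactly 0:

```python
SMALLEST = 1.0 / 256  # smallest non-zero value our toy 8-bit encoding can hold

def encode_norm(norm):
    # linear 8-bit quantization: anything below SMALLEST collapses to byte 0
    return min(int(norm / SMALLEST), 255)

def decode_norm(byte):
    return byte * SMALLEST

tiny_norm = 1e-4  # the kind of norm a very large field could produce
print(decode_norm(encode_norm(tiny_norm)))  # 0.0 -- quantized away entirely
print(decode_norm(encode_norm(0.25)))       # 0.25 -- survives the round trip
```

Once the stored byte is 0, every score that multiplies in the decoded norm comes out 0, matching the fieldNorm=0.0 lines in the debug output.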

-Doug


On Mon, Apr 21, 2014 at 12:22 AM, Arun V C.  wrote:

> Thank you very much for the reply,
>
> Please find attached schema.xml and solrconfig.xml, most it is default.
>
>
> -Original Message-
> From: Doug Turnbull [mailto:dturnb...@opensourceconnections.com]
> Sent: Sunday, April 20, 2014 9:42 PM
> To: Arun V C.; solr-user@lucene.apache.org
> Subject: RE: Score is always zero
>
> Arun, your field norms are all suspiciously 0, which, multiplied through
> the scoring calculation, causes the overall score to be 0.
>
> Are you using anything other than the default similarity? Could you post
> the relevant parts of your schema (field definition, field types,
> similarity, etc.)?
>
> -Doug
>
> Sent from my Windows Phone From: Arun V C.
> Sent: ‎4/‎20/‎2014 11:09 AM
> To: solr-user@lucene.apache.org
> Subject: Score is always zero
> Score is always zero
>
> [original message and debug output snipped]

Re: 'qt' parameter is not working in search call of SolrPhpClient

2014-04-20 Thread harshrossi
Yes, I know, but I am using the SolrPhpClient API, where by default the
search() function hits the '/select' request handler. So I used the 'qt'
parameter to point it at '/select_test', as described in this link:

Non-Default Request Handler

I have used 'qt' as described in the link, but it still goes to the
default '/select'.
Any suggestions?






--
View this message in context: 
http://lucene.472066.n3.nabble.com/qt-parameter-is-not-working-in-search-call-of-SolrPhpClient-tp4131934p4132282.html
Sent from the Solr - User mailing list archive at Nabble.com.