Return doc if one or more query keywords occur multiple times

2009-11-12 Thread gistolero
Hello,

I am using Dismax request handler for queries:

...select?q=foo bar foo2 bar2&qt=dismax&mm=2...

With parameter "mm=2" I configure that at least 2 of the optional clauses must 
match, regardless of how many clauses there are.

But now I want to change this to the following:

List all documents that have at least 2 of the optional clauses OR that have at 
least one of the query terms (e.g. foo) more than once.

Is this possible?
Thanks,
Gisto



Re: Return doc if one or more query keywords occur multiple times

2009-11-13 Thread gistolero
Anyone?

 Original Message 
> [original message quoted in full; snipped]



delta-import for XML files, Solr statistics

2008-10-24 Thread gistolero
Hello,

I have some questions about DataImportHandler and Solr statistics...


1.)
I'm using the DataImportHandler for creating my Lucene index from XML files:

###
$ cat data-config.xml 
<dataConfig>
 <dataSource type="FileDataSource" />
 <document>
  <entity name="xmlFile"
          processor="FileListEntityProcessor"
          baseDir="/tmp/files"
          fileName="myDoc_.*\.xml"
          rootEntity="false"
          dataSource="null">
   <entity name="data"
           pk="id"
           url="${xmlFile.fileAbsolutePath}"
           processor="XPathEntityProcessor"
  ...
###

No problems with this configuration - all works fine for full-imports, but...

===> What do 'rootEntity="false"' and 'dataSource="null"' mean?



2.)
The DataImportHandler documentation describes the index update process for 
SQL databases only...

My scenario:
- My application creates, deletes and modifies files in /tmp/files every 
night.
- delta-import / DataImportHandler should "mirror" _all_ these changes to my 
Lucene index (=> create, delete, update documents).

===> Is this possible with delta-import / DataImportHandler?
===> If not: Do you have any suggestions on how to do this?



3.)
My scenario:
- /tmp/files contains 682 'myDoc_.*\.xml' XML files. 
- Each XML file contains 12 XML elements (e.g. foo).
- DataImportHandler transfers only 5 of these 12 elements to the Lucene index. 


I don't understand the output from 'solr/dataimport' (=> status):

###
<response>
 ...
 <lst name="statusMessages">
  <str name="Total Requests made to DataSource">0</str>
  <str name="Total Rows Fetched">1363</str>
  <str name="Total Documents Skipped">0</str>
  <str name="Full Dump Started">2008-10-24 13:19:03</str>
  <str name="">
    Indexing completed. Added/Updated: 681 documents. Deleted 0 documents.
  </str>
  <str name="Committed">2008-10-24 13:19:05</str>
  <str name="Optimized">2008-10-24 13:19:05</str>
  <str name="Time taken">0:0:2.648</str>
 </lst>
 ...
</response>
###


===> What is "Total Rows Fetched", i.e. what is a "row" in an XML file? An 
element? Why 1363?
===> Why does the "Added/Updated" counter show 681 and not 682?



4.)
And my last questions about Solr statistics/information...

===> Is it possible to get information (number of indexed documents, stored 
values from documents etc.) from the current Lucene index?
===> The admin web interface shows 'numDocs' and 'maxDoc' in 'statistics/core'. 
Is 'numDocs' = number of indexed documents? What does 'maxDoc' mean?


Thanks a lot!
gisto


Re: delta-import for XML files, Solr statistics

2008-10-24 Thread gistolero
Thanks for your very fast response :-)


> > 2.)
> > The documentation from DataImportHandler describes the index update
> process for SQL databases only...
> >
> > My scenario:
> > - My application creates, deletes and modifies files from /tmp/files
> every night.
> > - delta-import / DataImportHandler should "mirror" _all_ this changes to
> my lucene index (=> create, delete, update documents).
> The only EntityProcessor which supports delta is SqlEntityProcessor.
> The XPathEntityProcessor has not implemented it, because we do not
> know of a consistent way of finding deltas for XML. So,
> unfortunately, no delta support for XML. But that said, you can
> implement those methods in XPathEntityProcessor. The methods are
> explained in EntityProcessor.java. If you have questions specific to
> this I can help. Probably we can contribute it back.
> >
> > ===> Is this possible with delta-import / DataImportHandler?
> > ===> If not: Do you have any suggestions on how to do this?

Ok, so at the moment I have to do a full-import to update my index. What 
happens with (user) queries while a full-import is running? Does Solr block 
these queries until the import is finished? Which configuration options 
control this behavior? 



> > My scenario:
> > - /tmp/files contains 682 'myDoc_.*\.xml' XML files.
> > - Each XML file contains 12 XML elements (e.g. foo).
> > - DataImportHandler transfers only 5 of these 12 elements to the Lucene
> index.
> >
> > [status output snipped]
> >
> > ===> Why does the "Added/Updated" counter show 681 and not 682?
> 
> Added/Updated is the number of docs. How do you know the number is not
> accurate?


/tmp/files$ ls myDoc_*.xml | wc -l
682

But "Added/Updated" shows 681. Does this mean that one file has an XML error? 
But the statistics say "Total Documents Skipped" = 0?!

 

> > 4.)
> > And my last questions about Solr statistics/information...
> >
> > ===> Is it possible to get information (number of indexed documents,
> stored values from documents etc.) from the current Lucene index?
> > ===> The admin web interface shows 'numDocs' and 'maxDoc' in
> 'statistics/core'. Is 'numDocs' = number of indexed documents? What does 
> 'maxDoc' mean?

Do you have answers to these questions too?

Bye,
Simon


Calculating peaks

2008-11-06 Thread gistolero
Hello,

How can I get ALL the matching documents back? How can I return an unlimited 
number of rows?
Yes, I have read the FAQ and I got your point, but I need Solr to calculate 
number-based peaks for my indexed data:

- For each of my documents the text ('text'), the creation time ('date') and 
other fields are saved. But for this query I'm only using 'text' and 'date'!
- If the query string is found in 'text', Solr returns only the value from 
'date' (not from 'text').
- My application uses a map to store all returned dates and to count the 
frequency of occurrence.
- The application uses the map for calculating peaks.
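The map-and-count step described above could be sketched like this (a minimal Python sketch; the field name 'date' comes from the message, the sample values and the threshold are invented for illustration):

```python
from collections import Counter

def find_peaks(dates, threshold=2):
    """Count how often each date occurs and return (date, count) pairs
    whose frequency reaches the threshold, most frequent first."""
    counts = Counter(dates)
    return [(d, n) for d, n in counts.most_common() if n >= threshold]

# Hypothetical 'date' values as they might come back, one per matching doc:
returned = [
    "2008-11-01T00:00:00Z",
    "2008-11-01T00:00:00Z",
    "2008-11-02T00:00:00Z",
    "2008-11-01T00:00:00Z",
]
print(find_peaks(returned))
```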


Two questions:

===> Is it possible to return _all_ documents, if the query response contains 
only one small field ('date')?

===> Do you have any hints for tuning such a query? I am using the 'fl=date' 
parameter. Is there a better way?


Thanks a lot
Gisto


Re: Calculating peaks

2008-11-06 Thread gistolero
Thank you, Erik. That's what I need. Sorry, I missed the 'facet' chapter.
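With faceting, the dates and their frequencies come back already counted, so the client-side map is no longer needed. A hedged Python sketch of reading such a response (the JSON shape is the usual wt=json facet output; the dates and counts here are invented):

```python
import json

# Trimmed wt=json response for a query like ...&facet=on&facet.field=date&rows=0
raw = """{
  "facet_counts": {
    "facet_fields": {
      "date": ["2008-11-01T00:00:00Z", 13, "2008-11-02T00:00:00Z", 10]
    }
  }
}"""

flat = json.loads(raw)["facet_counts"]["facet_fields"]["date"]
# Solr returns field facets as a flat [value, count, value, count, ...] list;
# zip the even and odd positions together to get a date -> count map.
counts = dict(zip(flat[0::2], flat[1::2]))
peak = max(counts, key=counts.get)
print(counts)
print(peak)
```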


 Original Message 
> Date: Thu, 6 Nov 2008 05:07:39 -0600
> From: Erik Hatcher <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Subject: Re: Calculating peaks

> Would faceting on date (&facet.field=date&facet=on) satisfy your  
> need?   It'll give you back all the dates and frequencies of them  
> within the matched results.
> 
>   Erik
> 
> On Nov 6, 2008, at 4:59 AM, [EMAIL PROTECTED] wrote:
> > [original message snipped]



Re: Calculating peaks - solrj support for facet.date?

2008-11-07 Thread gistolero
Sorry, but I have one more question. Does the Java client SolrJ support 
facet.date?

QueryResponse has the getFacetDates() method, but I don't understand how to 
set facet.date, facet.date.start, facet.date.end, and facet.date.gap for the 
query. It seems that SolrQuery doesn't provide methods for this purpose.

It would be very nice if you could post some code examples.
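As far as I can tell, SolrQuery extends ModifiableSolrParams, so these parameters can be set generically by name (e.g. query.set("facet.date", "date")) even without dedicated helper methods. The raw parameter set Solr expects would look like this; a small Python sketch with example values (start/end/gap are illustrative only):

```python
from urllib.parse import urlencode

# The facet.date.* parameters as name/value pairs; values are examples.
params = [
    ("q", "foo"),
    ("rows", "0"),
    ("facet", "true"),
    ("facet.date", "date"),
    ("facet.date.start", "2008-01-01T00:00:00Z"),
    ("facet.date.end", "2008-12-31T23:59:59Z"),
    ("facet.date.gap", "+1DAY"),
]
query_string = urlencode(params)  # note: '+' in the gap must be URL-encoded
print(query_string)
```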

Thanks again
Simon



 Original Message 
> [earlier messages quoted in full; snipped]


full-import with solrj (solr Java client)

2008-11-13 Thread gistolero
Hello,

I want to run a (DataImportHandler) full-import with SolrJ. I understand how to 
send queries with SolrQuery etc., but I don't know how to construct the 
"dataimport?command=full-import&commit=true" path. Which classes do I have to 
use? SolrRequest? It would be very nice if you could post some code examples.
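Underneath any client it is just an HTTP request against the handler path, so the URL in question can be assembled from the handler path plus two parameters. A minimal Python sketch (host, port and handler path are the usual example-setup defaults, not taken from this thread):

```python
from urllib.parse import urlencode

base = "http://localhost:8983/solr"   # example host/port
handler = "/dataimport"               # handler path as registered in solrconfig.xml
params = {"command": "full-import", "commit": "true"}

url = base + handler + "?" + urlencode(params)
print(url)
```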

Thank you
Gisto


Re: full-import with solrj (solr Java client)

2008-11-13 Thread gistolero
Erik, thanks a lot for this example.

Now, all works fine :-)



 Original Message 
> Date: Thu, 13 Nov 2008 06:53:36 -0500
> From: Erik Hatcher <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Subject: Re: full-import with solrj (solr Java client)

> On Nov 13, 2008, at 6:20 AM, [EMAIL PROTECTED] wrote:
> > I want to run a (DataImportHandler) full-import with solrj. I  
> > understand how to send queries with SolrQuery etc., but I don't know  
> > how to construct the "dataimport?command=full-import&commit=true"  
> > path. Which classes do I have to use? SolrRequest? It would be very  
> > nice if you could post some code examples.
> 
> I've added an example here:  
> 
> Basically any request handler can be easily called from SolrJ, using  
> ModifiableSolrParams, and making sure wt is set, along with the other  
> parameters.
> 
>   Erik



Using properties from core configuration in data-config.xml

2008-11-17 Thread gistolero
Hello,

is it possible to use properties from core configuration in data-config.xml?
I want to define the "baseDir" for DataImportHandler.


I tried the following configuration:


*** solr.xml ***

<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="posts-politics" instanceDir="...">
      <property name="xmlDataDir" value="..." />
    </core>
    ...
  </cores>
</solr>




*** data-config.xml ***

<dataConfig>
 <dataSource type="FileDataSource" />
 <document>
  <entity name="xmlFile"
          processor="FileListEntityProcessor"
          baseDir="${xmlDataDir}"
          fileName="id-.*\.xml"
          rootEntity="false"
          dataSource="null">
   <entity name="data"
           pk="id"
           url="${xmlFile.fileAbsolutePath}"
           processor="XPathEntityProcessor"
   ...
</dataConfig>

But this is the result:

...
Nov 17, 2008 1:50:08 PM org.apache.solr.handler.dataimport.DataImporter 
doFullImport
INFO: Starting Full Import
Nov 17, 2008 1:50:08 PM org.apache.solr.core.SolrCore execute
INFO: [posts-politics] webapp=/solr path=/dataimport 
params={optimize=true&commit=true&command=full-import&qt=/dataimport&wt=javabin&version=2.2}
 status=0 QTime=66 
Nov 17, 2008 1:50:08 PM org.apache.solr.core.SolrCore execute
INFO: [posts-politics] webapp=/solr path=/dataimport 
params={qt=/dataimport&wt=javabin&version=2.2} status=0 QTime=0 
Nov 17, 2008 1:50:08 PM org.apache.solr.update.DirectUpdateHandler2 deleteAll
INFO: [posts-politics] REMOVING ALL DOCUMENTS FROM INDEX
Nov 17, 2008 1:50:08 PM org.apache.solr.handler.dataimport.DataImporter 
doFullImport
SEVERE: Full Import failed
org.apache.solr.handler.dataimport.DataImportHandlerException: 'baseDir' should 
point to a directory Processing Document # 1
 at 
org.apache.solr.handler.dataimport.FileListEntityProcessor.init(FileListEntityProcessor.java:81)
...




I also tried to configure all dataimport settings in solrconfig.xml, but I 
don't know how to do this exactly. Among other things, I tried this format:


*** solrconfig.xml ***

...
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
 <lst name="defaults">
  <lst name="datasource">
   <str name="type">FileDataSource</str>
  </lst>
  <lst name="document">
   <lst name="entity">
    <str name="name">xmlFile</str>
    <str name="processor">FileListEntityProcessor</str>
    <str name="baseDir">${xmlDataDir}</str>
    <str name="fileName">id-.*\.xml</str>
    <str name="rootEntity">false</str>
    <str name="dataSource">null</str>
    <lst name="entity">
     <str name="name">data</str>
     <str name="pk">id</str>
     <str name="url">${xmlFile.fileAbsolutePath}</str>
     ...
</requestHandler>
...



But all my tests (with different "dataimport" formats in solrconfig.xml) failed:


...
INFO: Reusing parent classloader
Nov 17, 2008 2:18:14 PM org.apache.solr.common.SolrException log
SEVERE: Error in solrconfig.xml:org.apache.solr.common.SolrException: No system 
property or default value specified for xmlFile.fileAbsolutePath
at 
org.apache.solr.common.util.DOMUtil.substituteProperty(DOMUtil.java:311)
at 
org.apache.solr.common.util.DOMUtil.substituteProperties(DOMUtil.java:264)
...



Thanks again for your excellent support!

Gisto



Re: Using properties from core configuration in data-config.xml

2008-11-18 Thread gistolero
Very cool :-)

Both suggestions work fine! But only with Solr version 1.4:
https://issues.apache.org/jira/browse/SOLR-823

Use a nightly build (e.g. 2008-11-17 works):
http://people.apache.org/builds/lucene/solr/nightly/

See below for examples for both solutions...



((( 1 )))

> There may be one way to do this.
> 
> Add your property in the invariant section of solrconfig's
> DataImportHandler
> element. For example, add this section:
> 
> <lst name="invariants">
>   <str name="xmlDataDir">${xmlDataDir}</str>
> </lst>
> 
> Then you can use it as ${dataimporter.request.xmlDataDir} in your
> data-config to access this.



// *** solr.xml ***

<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="core1" instanceDir="core1">
      <property name="xmlDataDir" value="/home/core1" />
    </core>
    ...
  </cores>
</solr>

// *** solrconfig.xml ***

<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">./data-config.xml</str>
  </lst>
  <lst name="invariants">
    <str name="xmlDataDir">${xmlDataDir}</str>
  </lst>
</requestHandler>


// *** data-config.xml ***

<dataConfig>
  <dataSource type="FileDataSource" />
  <document>
    <entity name="xmlFile"
            processor="FileListEntityProcessor"
            baseDir="${dataimporter.request.xmlDataDir}"
            fileName="id-.*\.xml"
            rootEntity="false"
            dataSource="null">
      ...
</dataConfig>


URL for full-import:
http://localhost:8983/solr/core1/dataimport?command=full-import&commit=true




((( 2 )))


> > Nope, it is not possible as of now; the placeholders are not aware of
> > the core properties.
> > Is it possible to pass the values as request params? Request
> > parameters can be accessed.


// *** data-config.xml ***

<dataConfig>
  <dataSource type="FileDataSource" />
  <document>
    <entity name="xmlFile"
            processor="FileListEntityProcessor"
            baseDir="${dataimporter.request.xmlDataDir}"
            fileName="id-.*\.xml"
            rootEntity="false"
            dataSource="null">
      ...
</dataConfig>

URL for full-import:
http://localhost:8983/solr/core1/dataimport?command=full-import&commit=true&xmlDataDir=%2Fhome%2Fcore1
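Note the %2Fhome%2Fcore1 part of that URL: the directory has to be URL-encoded when it travels as a request parameter. A quick Python check of the encoding:

```python
from urllib.parse import quote, unquote

xml_data_dir = "/home/core1"
encoded = quote(xml_data_dir, safe="")  # safe="" also encodes '/'
print(encoded)            # the form used in the full-import URL
print(unquote(encoded))   # decodes back to the original path
```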


That's all.
Gisto



> > On Mon, Nov 17, 2008 at 7:57 PM,  <[EMAIL PROTECTED]> wrote:
> > > [original message snipped]
> >
> > --
> > --Noble Paul
> 
> -- 
> Regards,
> Shalin Shekhar Mangar.



Combine facet.date and facet.field

2009-03-02 Thread gistolero
Hello,

I'm using the facet.date function to get all matching docs per day:

q=foo&rows=0&facet=true&facet.date=date&facet.date.start=2009-01-31T00:00:00Z&facet.date.end=2009-03-01T23:59:59Z&facet.date.gap=+1DAY&fq=+size:big

2009-01-31T00:00:00Z -> 13 hits
2009-02-01T00:00:00Z -> 10 hits
...

As you see, the result is filtered by the "fq" parameter. The downside of this 
solution is that I have to run the query for every value of the field "size" 
(big, small, ...). Is there an easier way to produce the following result with 
a single query only?

big -> 2009-01-31T00:00:00Z -> 13 hits
big -> 2009-02-01T00:00:00Z -> 10 hits
...
small -> 2009-01-31T00:00:00Z -> 1 hit
small -> 2009-02-01T00:00:00Z -> 7 hits
...
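Until there is a single-query answer, the per-value workaround described above can at least be scripted; a hedged Python sketch that builds one facet.date request per size value (field name, values and date range from the message; host and core URL are examples):

```python
from urllib.parse import urlencode

sizes = ["big", "small"]                     # values of the 'size' field
base = "http://localhost:8983/solr/select"   # example core URL

def facet_query(size):
    # One facet.date request per size value, filtered with fq as above.
    params = [
        ("q", "foo"),
        ("rows", "0"),
        ("facet", "true"),
        ("facet.date", "date"),
        ("facet.date.start", "2009-01-31T00:00:00Z"),
        ("facet.date.end", "2009-03-01T23:59:59Z"),
        ("facet.date.gap", "+1DAY"),
        ("fq", "+size:" + size),
    ]
    return base + "?" + urlencode(params)

for s in sizes:
    print(s, "->", facet_query(s))
```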

Thanks
Gisto


DisMax query parser syntax for the fq parameter

2009-07-09 Thread gistolero
Hello,

I am using the dismax query parser syntax for the fq param:

.../select?qt=dismax&rows=30&q.alt=*:*&qf=content&fq={!dismax 
qf=contentKeyword^1.0 mm=0%}Foo&fq=+date:[2009-03-11T00:00:00Z TO 
2009-07-09T16:41:50Z]&fl=id,date,content


Now, I want to add one more field to the qf parameter:

...&fq={!dismax qf=titleKeyword^1.8 contentKeyword^1.0 mm=0%}Foo&...

=> Solr should return a doc if "titleKeyword" OR "contentKeyword" contains 
"Foo".


But this results in an error:

SEVERE: org.apache.solr.common.SolrException: 
org.apache.lucene.queryParser.ParseException: Expected identifier
at pos 43 str='{!dismax qf=titleKeyword^1.8 contentKeyword^1.0 mm=0%}Foo'

Is it possible to use the fq filter for multiple fields?

Thanks
Gisto


Re: DisMax query parser syntax for the fq parameter

2009-07-10 Thread gistolero
Yes, it works :-) Thanks Erik!


> > I am using the dismax query parser syntax for the fq param:
> >
> > .../select?qt=dismax&rows=30&q.alt=*:*&qf=content&fq={!dismax  
> > qf=contentKeyword^1.0 mm=0%}Foo&fq=+date:[2009-03-11T00:00:00Z TO  
> > 2009-07-09T16:41:50Z]&fl=id,date,content
> >
> >
> > Now, I want to add one more field to the qf parameter:
> >
> > ...&fq={!dismax qf=titleKeyword^1.8 contentKeyword^1.0 mm=0%}Foo&...
> 
> The issue is that qf needs to have spaces in it, but the local params  
> can't deal with that.  So what you can do is fq={!dismax qf=$fqqf  
> mm=0%}Foo&fqqf=titleKeyword^1.8+contentKeyword^1.0 - hope that works  
> (just typing that without trying it).
> 
>   Erik
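When Erik's workaround is built programmatically, the two cooperating parameters travel separately and each must be URL-encoded. A Python sketch using the values from this thread:

```python
from urllib.parse import urlencode

params = [
    # The fq uses a local-params reference ($fqqf) instead of embedded spaces:
    ("fq", "{!dismax qf=$fqqf mm=0%}Foo"),
    # The actual qf list lives in its own parameter, space-separated:
    ("fqqf", "titleKeyword^1.8 contentKeyword^1.0"),
]
encoded = urlencode(params)
print(encoded)
```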
