serialization error - BinaryResponseWriter

2013-11-12 Thread giovanni.bricc...@banzai.it

Hi,

I'm getting some errors reading boolean fields; can you give me any 
suggestions? In this example I only have four "false" fields: 
leasing=false, FiltroNovita=false, FiltroFreeShipping=false, Outlet=false.


this is the stack trace (solr 4.2.1)

java.lang.NumberFormatException: For input string: "false"
at 
java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:492)
at java.lang.Integer.valueOf(Integer.java:582)
at org.apache.solr.schema.IntField.toObject(IntField.java:89)
at org.apache.solr.schema.IntField.toObject(IntField.java:43)
at 
org.apache.solr.response.BinaryResponseWriter$Resolver.getValue(BinaryResponseWriter.java:223)
at 
org.apache.solr.response.BinaryResponseWriter$Resolver.getDoc(BinaryResponseWriter.java:186)
at 
org.apache.solr.response.BinaryResponseWriter$Resolver.writeResultsBody(BinaryResponseWriter.java:147)
at 
org.apache.solr.response.BinaryResponseWriter$Resolver.writeResults(BinaryResponseWriter.java:173)
at 
org.apache.solr.response.BinaryResponseWriter$Resolver.resolve(BinaryResponseWriter.java:86)
at 
org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:154)
at 
org.apache.solr.common.util.JavaBinCodec.writeNamedList(JavaBinCodec.java:144)
at 
org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:234)
at 
org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:149)
at 
org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:92)
at 
org.apache.solr.response.BinaryResponseWriter.write(BinaryResponseWriter.java:50)
at 
org.apache.solr.servlet.SolrDispatchFilter.writeResponse(SolrDispatchFilter.java:620)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:358)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:141)
at 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at 
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at 
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at 
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at 
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at 
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at 
org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
at 
org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at 
org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:722)



All these fields are defined as follows in schema.xml:

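(The original XML snippet did not survive the archive. As a rough sketch only, 
boolean field declarations along these lines would be typical, with the field 
names taken from the message and the field type and attributes assumed:

  <fieldType name="boolean" class="solr.BoolField" sortMissingLast="true"/>

  <field name="leasing" type="boolean" indexed="true" stored="true"/>
  <field name="FiltroNovita" type="boolean" indexed="true" stored="true"/>
  <field name="FiltroFreeShipping" type="boolean" indexed="true" stored="true"/>
  <field name="Outlet" type="boolean" indexed="true" stored="true"/>)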



10:22:26 	WARNING 	BinaryResponseWriter 	Error reading a field from 
document :SolrDocument{leasing=false, namesearch=La porta del sole, 
Url_Slug_10=/store/libri/scienza-e-tecnica/geografia-e-astronomia, 
has_image=1, Prod_Id=9788895563220, tag=[CPM_UUMEDIAUU | 
CPM_8895563220 | CPM_HHBBNCCSTDDZHH | CPM_978-8895563220], DescPromo=, 
Url_Sku_20=/libri/scienza-e-tecnica/geografia-e-astronomia/item_E043I, 
idlevel_1=[/8637/8664], NCodven=9788895563220, N1Codven=9788895563220, 
idlevel_0=[/8637], body_all=La porta del sole PGRECO pagine : 
235;formato : da 20 a 28 cm;descrizioni ausiliarie : letteratura di 
viaggio classica Geografia e Astronomia CPM_UUMEDIAUU | CPM_8895563220 | 
CPM_HHBBNCCSTDDZHH | CPM_978-8895563220 
/geografia-e-astronomia-PGRECO/d-2724579 La porta del sole geografia e 
astronomia, idlevel_2=[/8637/8664/8856], IDCat2=16384, Weight=2, 
IDCat1=16299, namerank_slug=[], cercabile=true, IDCat3=16389, 
dispo_ordinamento=50, pidlevel_1=[/8637/8664], pidlevel_0=[/8637], 
pidlevel_2=[/8637/8664/8856], 
cslugleaf=/libri/scienza-e-tecnica/geografia-e-astronomia#geografia e 
astronomia, Provenienza=252, FiltroNovita=false, 
Url_Sku_10=/shop/PGRECO/id.1FA9A6, FiltroFreeShipping=false, 
ean=9788895563220, ProdottoEtaGG=1228, ranking=18, CodArt=100375062, 
csluglevel_2=[/libri/scienza-e-tecnica/geografia-e-astronomia#geografia 
e astronomia], 
tcsluglevel_2=[/libri/scienza-e-tecnica/geografia-e-astronomia#geografia 
e astronomia], Data_Ingresso=Fri Jul 02 09:49:38 CEST 2010, 
csluglevel_1=[/libri/scienza-e-tecnica#scienza e tecnica], 
tcsluglevel_1=[/libri/scienza-e-tecnica#scienza e tecnica], 
Url_Slug_30=/promo/libri/scienza-e-tecnica/geografia-e-astronomia, 
sGenerico2=La porta del sole, Outlet=false, product_metadata=La porta 
del

Re: serialization error - BinaryResponseWriter

2013-11-13 Thread giovanni.bricc...@banzai.it
Mhhh, I run a dih full reload every night, and the source field is a 
sqlserver smallint column...


By the way I'll try cleaning the data dir of the index and reindexing

On 12/11/13 17:13, Shawn Heisey wrote:

On 11/12/2013 2:37 AM, giovanni.bricc...@banzai.it wrote:

I'm getting some errors reading boolean fields, can you give me any
suggestions? in this example I only have four "false" fields:
leasing=false, FiltroNovita=false, FiltroFreeShipping=false, Outlet=false.

this is the stack trace (solr 4.2.1)

java.lang.NumberFormatException: For input string: "false"
 at
java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)

 at java.lang.Integer.parseInt(Integer.java:492)
 at java.lang.Integer.valueOf(Integer.java:582)
 at org.apache.solr.schema.IntField.toObject(IntField.java:89)
 at org.apache.solr.schema.IntField.toObject(IntField.java:43)
 at
org.apache.solr.response.BinaryResponseWriter$Resolver.getValue(BinaryResponseWriter.java:223)

Solr stores boolean values internally as a number - 0 or 1.  That gets
changed to true/false when displaying search results.

It sounds like what you have here is quite possibly an index which
originally had text fields with the literal string "true" or "false",
and you've changed your schema so these fields are now boolean.  When
you change your schema, you have to reindex.

http://wiki.apache.org/solr/HowToReindex

Thanks,
Shawn





SolrCloud softcommit problem

2013-07-16 Thread giovanni.bricc...@banzai.it

Hi

I'm using solr version 4.3.1. I have a core with only one shard and 
three replicas, say  server1, server2 and server3.

Suppose server1 is currently the leader

if I send an update to the leader everything works fine

wget -O -  --header='Content-type: text/xml' 
--post-data='16910update="set">yy' 
'server1:8080/solr/mycore/update?softCommit=true'


Querying server1, server2 and server3 I see the right answer, always 
"yy".


If instead I send an update to a replica, say server2,

wget -O -  --header='Content-type: text/xml' 
--post-data='16910update="set">z' 
'server2:8080/solr/mycore/update?softCommit=true'
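
(The XML in the post data above was mangled by the archive; a hedged 
reconstruction of what such an atomic update typically looks like, with the 
unique-key field name "id" assumed and the field name borrowed from a later 
message in this archive:

  wget -O - --header='Content-type: text/xml' \
    --post-data='<add><doc><field name="id">16910</field><field name="namesearch" update="set">z</field></doc></add>' \
    'server2:8080/solr/mycore/update?softCommit=true')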


I see the correct value 'z' on server1 (the leader) and server3, but 
server2 continues to show the wrong value, 'y', until I send a commit.


Am I using the update API correctly?

Thanks


Giovanni





Re: A Comma / a Space in a Query argument

2013-05-06 Thread giovanni.bricc...@banzai.it

Try escaping it with a \
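
For example, applying that suggestion (a sketch, using the field and values 
from the question below):

  q=myfield:aa\,bb
  q=myfield:aa\ bb       (or quote the whole value: q=myfield:"aa bb")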


Giovanni

On 06/05/13 15:34, Peter Schütt wrote:

Hallo,

I want to use a comma as part of a query argument.

E.G.

q=myfield:aa,bb

and "aa,bb" is the value of the field.

Do I have to escape it?

And what about a space in an argument?

q=myfield:aa bb

and "aa bb" is the value of the field.

Thanks for any hint.

Ciao
   Peter Schütt





solrcloud and spellcheck rebuild

2013-06-24 Thread giovanni.bricc...@banzai.it

Hi solr users,

I'm using solr 4.2.1 and I have some questions about soft commits and 
spellcheckers.
I see the dictionary being rebuilt after each soft commit; for my application 
it is acceptable to rebuild the index once a day, so I tried to switch the 
buildOnCommit parameter to false in the SpellCheckComponent configuration. 
After pushing the configuration to zookeeper I had to restart solr to stop 
seeing spellcheck rebuilds after document updates. Is this correct? Should I 
always restart/reload cores after a configuration change?
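
(For reference, a core reload, as opposed to a full restart, can be triggered 
through the CoreAdmin API; a sketch, with host and core names as placeholders:

  wget -O - "http://$HOST:8080/solr/admin/cores?action=RELOAD&core=$CORE")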

Now, what should I do to rebuild the dictionary?

wget -O - "http://$HOST:8080/solr/$CORE/spell?spellcheck.build=true";

after sending this request I see this log line "webapp=/solr path=/spell 
params={spellcheck.build=true} hits=0 status=0 QTime=24316" only on one 
node.

Should I send the request to all solr servers individually?

Last but not least, is this the correct way to send a soft-commit update?

wget -O -  --header='Content-type: text/xml' 
--post-data='16910name="namesearch" update="set">test test test' 
'myhost:8080/solr/mycore/update?softCommit=true'

because in the logs I see the commit=true parameter appearing. How can I be 
sure that the change has only been soft-committed?


INFO: [mycore] webapp=/solr path=/update 
params={waitSearcher=true&commit=true&wt=javabin&expungeDeletes=false&commit_end_point=true&version=2&softCommit=true} 
{commit=} 0 69


This is my spellchecker configuration



  default
  solr.IndexBasedSpellChecker
  spell
  ./index/spellchecker1
  0.5
  true
  score
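
(The XML tags in the snippet above were stripped by the archive. As a rough 
sketch only, an IndexBasedSpellChecker component matching those values might 
look like the following; the parameter names are assumptions, and buildOnCommit 
is shown as false per the change described above:

  <searchComponent name="spellcheck" class="solr.SpellCheckComponent">
    <lst name="spellchecker">
      <str name="name">default</str>
      <str name="classname">solr.IndexBasedSpellChecker</str>
      <str name="field">spell</str>
      <str name="spellcheckIndexDir">./index/spellchecker1</str>
      <float name="accuracy">0.5</float>
      <bool name="buildOnCommit">false</bool>
      <str name="comparatorClass">score</str>
    </lst>
  </searchComponent>)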



 
  
  default
  true
  true
  10
  true
  10
  
  
spellcheck
  
 

and this is my update configuration



  ${solr.data.dir:}

  



Thank you



Re: Need assistance in defining search urls

2013-06-24 Thread giovanni.bricc...@banzai.it

On 24/06/13 13:26, Mysurf Mail wrote:

Now, each doc looks like this (I generated random user text in the free-text
columns in the DB):
 We have located the ship.  d1771fc0-d3c2-472d-aa33-4bf5d1b79992 
b2986a4f-9687-404c-8d45-57b073d900f7 

a99cf760-d78e-493f-a827-585d11a765f3 
ba349832-c655-4a02-a552-d5b76b45d58c 
35e86a61-eba8-49f4-95af-8915bd9561ac 
6d8eb7d9-b417-4bda-b544-16bc26ab1d85 
31453eff-be19-4193-950f-fffcea70ef9e 
08e27e4f-3d07-4ede-a01d-4fdea3f7ddb0 
79a19a3f-3f1b-486f-9a84-3fb40c41e9c7 
b34c6f78-75b1-42f1-8ec7-e03d874497df  
1.7437795 
My searches are :
(PackageName is defined as the default search field)

1. I try to search for any package that name has the word "have" or "had"
or "has"
2. I try to search for any package that consists
d1771fc0-d3c2-472d-aa33-4bf5d1b79992

Therefore I use this searches

1.
http://localhost:8983/solr/vault/select?q=*have*&fl=PackageName%2Cscore&defType=edismax&stopwords=true&lowercaseOperators=true

questions :
1.a. Even if I display all results, I don't get any results with "has"
(inflections). Why?

What is the field type of PackageName? Have you defined a stemmer? Try 
q=PackageName:have, or define a query type that specifies the fields you 
want to search on.


1.b. What is the difference between *have* and have?
The score is different.

2.
http://localhost:8983/solr/vault/select?q=*:d1771fc0-d3c2-472d-aa33-4bf5d1b79992&fl=PackageName,score&defType=edismax&stopwords=true&lowercaseOperators=true&start=0&rows=300


Try escaping the "-",

CatalogVendorPartNum:d1771fc0\-d3c2\-472d\-aa33\-4bf5d1b79992


Questions:
2.a. I get no result, even though I search on all fields (*) and it
appears in
2.b. If I want to search on more than one field, i.e. PackageName &
description, what is the best way to do it?
Define all as default?
Thanks,


Try changing solrconfig:

 edismax
 explicit
 PackageName^2 CatalogVendorPartNum^1
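
(The surrounding XML was stripped by the archive; a hedged sketch of the kind 
of handler defaults being suggested, with the handler name and tag names as 
assumptions:

  <requestHandler name="/select" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="defType">edismax</str>
      <str name="echoParams">explicit</str>
      <str name="qf">PackageName^2 CatalogVendorPartNum^1</str>
    </lst>
  </requestHandler>)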

Giovanni


tlog file questions

2013-02-18 Thread giovanni.bricc...@banzai.it

Hi

I have some questions about tlog files and how they are managed.

I'm using dih to do incremental data loading; once a day I do a full 
refresh.


these are the request parameters

/dataimport?command=full-import&commit=true
/dataimport?command=delta-import&commit=true&optimize=false

I was expecting all the old tlog files to be removed when a delta/full 
import completes, but I see that these files remain. Actually, only the 
older files get removed.

Am I using the wrong parameters? Is there a different parameter to 
trigger a hard commit?
Is there a configuration parameter to control the number of tlog 
files to keep? Unfortunately I have very little space on my disks and I 
need to double-check space consumption.
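
(For illustration, transaction-log growth is normally bounded by hard 
auto-commits in solrconfig.xml; a sketch, with the thresholds as placeholder 
values:

  <updateHandler class="solr.DirectUpdateHandler2">
    <updateLog>
      <str name="dir">${solr.ulog.dir:}</str>
    </updateLog>
    <autoCommit>
      <maxTime>60000</maxTime>
      <maxDocs>10000</maxDocs>
      <openSearcher>false</openSearcher>
    </autoCommit>
  </updateHandler>)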


I'm using solr 4

Thank you


minimum match and not matched words / term frequency in query result

2012-04-18 Thread giovanni.bricc...@banzai.it

Hi

I have a dismax query with a minimum match setting; this allows some 
terms to be missing from query results.


I would like to give feedback to the user, highlighting the words that did 
not match. It would also be interesting to show the words with a very low 
frequency.


For instance, searching for "purple pendrive" I would highlight that the 
results ignore the term "purple", because we don't have any.


Can you suggest how to approach the problem?

I was thinking about the debugQuery output, but since I will not get 
details about all the results, I will probably miss something.
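
(One rough client-side check, for illustration only, is to issue a facet.query 
per query term; the counts show which terms match few or no documents. Field, 
host and core names here are just examples:

  http://localhost:8983/solr/mycore/select?q=purple+pendrive&rows=0&facet=true&facet.query=body_all:purple&facet.query=body_all:pendrive)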


I am trying to write a new SearchComponent but I don't know how to get 
term frequency data from a ResponseBuilder object... I am new to 
solr/lucene programming.


Thanks a lot






near realtime search and dih

2012-08-09 Thread giovanni.bricc...@banzai.it
I would like to understand if  near realtime search is applicable to my 
configuration, or if I should change the way I load data.


Currently my application uses data import handler to load new documents 
every 15 minutes. This is acceptable, but  it would be interesting to 
bring online some changes within a minute.


Is it possible to configure the DIH to run in soft-commit mode? If not, 
is it possible to use the update handler to make changes with soft 
commits and still use the dih to load other changes and commit modifications?


I have a replica of this core; does soft-committed data get replicated too, 
or should I send soft updates to both servers?


I have just updated to solr4 alpha.

thanks


Giovanni


Re: near realtime search and dih

2012-08-09 Thread giovanni.bricc...@banzai.it


Thank you,

this is very interesting, I will try with solr cloud + autosoftcommit.

On 09/08/12 14:45, Tomás Fernández Löbbe wrote:

Master-Slave architectures don't get along very well with NRT. One minute
may be achieved if your index is small and you don't have many updates per
minute, but in other case, I would go with Solr Cloud and distributed
indexing (you can run DIH in one of the nodes and every document will be
indexed in both replicas at the same time).
I don't know if you can configure DIH to use soft commits, but you could
use autosoftcommit.
While I was testing DIH + Solr Cloud I had some memory issues, those were
solved with https://issues.apache.org/jira/browse/SOLR-3658 but the fix is
not in the ALPHA, you should get a more recent revision (or wait for the
BETA).
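
(For reference, autoSoftCommit is configured in solrconfig.xml along these 
lines; the one-minute value below is only an example:

  <autoSoftCommit>
    <maxTime>60000</maxTime>
  </autoSoftCommit>)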

Tomás

On Thu, Aug 9, 2012 at 7:02 AM,giovanni.bricc...@banzai.it  <
giovanni.bricc...@banzai.it> wrote:


I would like to understand if  near realtime search is applicable to my
configuration, or if I should change the way I load data.

Currently my application uses data import handler to load new documents
every 15 minutes. This is acceptable, but  it would be interesting to bring
online some changes within a minute.

Is it possible to configure the DIH to run in soft commit mode? if not, is
it possible to use the update handler to made changes with soft commit and
still use the dih to load other changes and commit modifications?

I have a replica of this core, soft commited data gets replicated too or
should I send soft updates to both the servers?

I have just updated to solr4 alpha.

thanks


Giovanni





solrcloud and facet_pivot

2012-08-10 Thread giovanni.bricc...@banzai.it
I'm taking my first steps with solr4 alpha and solr cloud, and I'm 
having trouble with the facet.pivot parameter.


Starting solr without cloud mode, I am able to use this parameter in some queries:

...facet.pivot=pidlevel_0,pidlevel_1,pidlevel_2,pidlevel_3

obtaining something like this

..."facet_pivot":{
  "pidlevel_0,pidlevel_1,pidlevel_2,pidlevel_3":[{
  "field":"pidlevel_0",
  "value":"/5769",
  "count":21,
  "pivot":[{
  "field":"pidlevel_1",
  "value":"/5769/5822",
  "count":14,
  "pivot":[{
  "field":"pidlevel_2",
  "value":"/5769/5822/6725",
  "count":8,
  "pivot":[]},
{
  "field":"pidlevel_2",
  "value":"/5769/5822/6718",
  "count":5,
  "pivot":[{
  "field":"pidlevel_3",
  "value":"/5769/5822/6718/7965",
  "count":5}]},


when I restart solr using these options

-DzkRun -DnumShards=1 -Dbootstrap_conf=true

the facet_pivot option stops working: with two or three levels 
(facet.pivot=pidlevel_0,pidlevel_1 or 
facet.pivot=pidlevel_0,pidlevel_1,pidlevel_2) no "facet_pivot" section 
is returned in the response, and with 4 levels the server never returns a 
response and CPU goes to 100%.


Has anybody had problems like this?

I'm using numShards=1 because I would like to have many replicas of the 
full core.




Thanks



multivalued field question (FieldCache error)

2012-10-01 Thread giovanni.bricc...@banzai.it

Hello,

I would like to put a multivalued field into a qt definition as an output 
field. To do this I edit the current solrconfig.xml definition and add 
the field to the fl specification.


Unexpectedly when I do the query q=*:*&qt=mytype I get the error


can not use FieldCache on multivalued field: store_slug


But if I instead run the query

http://src-eprice-dev:8080/solr/0/select/?q=*:*&qt=mytype&fl=otherfield,mymultivaluedfiled

I don't get the error

Have you got any suggestions?

I'm using solr 4 beta

solr-spec 4.0.0.2012.08.06.22.50.47
lucene-impl 4.0.0-BETA 1370099


Giovanni


Re: multivalued field question (FieldCache error)

2012-10-01 Thread giovanni.bricc...@banzai.it



I'm also using that field for a facet:


 dismax
 explicit
 1
 
   many field but not store_slug
 
 
    many field but not store_slug
   
 

..., store_slug
 
 
 

 2
 2
 *:*
default
  true
  true
  10
  true  


true
1
0
count
...
store_slug
...
false


  spellcheck




On 01/10/12 18:34, Erik Hatcher wrote:

How is your request handler defined?  Using store_slug for anything but fl?

Erik

On Oct 1, 2012, at 10:51,"giovanni.bricc...@banzai.it"  
  wrote:


Hello,

I would like to put a multivalued field into a qt definition as output field. 
to do this I edit the current solrconfig.xml definition and add the field in 
the fl specification.

Unexpectedly when I do the query q=*:*&qt=mytype I get the error


can not use FieldCache on multivalued field: store_slug


But if I instead run the query

http://src-eprice-dev:8080/solr/0/select/?q=*:*&qt=mytype&fl=otherfield,mymultivaluedfiled

I don't get the error

Have you got any suggestions?

I'm using solr 4 beta

solr-spec 4.0.0.2012.08.06.22.50.47
lucene-impl 4.0.0-BETA 1370099


Giovanni



--


 Giovanni Bricconi

Banzai Consulting
cell. 348 7283865
ufficio 02 00643839
via Gian Battista Vico 42
20132 Milano (MI)





Re: multivalued field question (FieldCache error)

2012-10-03 Thread giovanni.bricc...@banzai.it

Here is the stack trace



Oct 3, 2012 3:07:38 PM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: can not use FieldCache on 
multivalued field: store_slug
at 
org.apache.solr.schema.SchemaField.checkFieldCacheSource(SchemaField.java:174)

at org.apache.solr.schema.StrField.getValueSource(StrField.java:44)
at 
org.apache.solr.search.FunctionQParser.parseValueSource(FunctionQParser.java:376)
at 
org.apache.solr.search.FunctionQParser.parse(FunctionQParser.java:70)

at org.apache.solr.search.QParser.getQuery(QParser.java:145)
at org.apache.solr.search.ReturnFields.add(ReturnFields.java:289)
at 
org.apache.solr.search.ReturnFields.parseFieldList(ReturnFields.java:115)

at org.apache.solr.search.ReturnFields.<init>(ReturnFields.java:101)
at org.apache.solr.search.ReturnFields.<init>(ReturnFields.java:77)
at 
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:97)
at 
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:185)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)

at org.apache.solr.core.SolrCore.execute(SolrCore.java:1656)
at 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:454)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:275)
at 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at 
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at 
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at 
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at 
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at 
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at 
org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
at 
org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at 
org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)

at java.lang.Thread.run(Thread.java:662)

On 02/10/12 19:40, Chris Hostetter wrote:

: I'm also using that field for a facet:

Hmmm... that still doesn't make sense.  Faceting can use FieldCache, but
it will check if the field is multivalued to decide if/when/how to do this.

There's nothing else in your requestHandler config that would suggest why
you might get this error.

Can you please provide more details about the error you are getting -- in
particular: the complete stack trace from the server logs.  That should
help us identify the code path leading to the problem.


:
: |
: 
:  dismax
:  explicit
:  1
:  
:many field but not store_slug
:  
:  
:|many field but not store_slug|||
: 
: ..., store_slug
:  
:   
:  2
:  2
:  *:*
: default
:   true
:   true
:   10
:   true  
: true
: 1
: 0
: count
: ...
: store_slug
: ...
: false
: 
: 
:   spellcheck
: 
:
:   |
:
:
: Il 01/10/12 18:34, Erik Hatcher ha scritto:
: > How is your request handler defined?  Using store_slug for anything but fl?
: >
: > Erik
: >
: > On Oct 1, 2012, at 10:51,"giovanni.bricc...@banzai.it"
: >   wrote:
: >
: > > Hello,
: > >
: > > I would like to put a multivalued field into a qt definition as output
: > > field. to do this I edit the current solrconfig.xml definition and add the
: > > field in the fl specification.
: > >
: > > Unexpectedly when I do the query q=*:*&qt=mytype I get the error
: > >
: > > 
: > > can not use FieldCache on multivalued field: store_slug
: > > 
: > >
: > > But if I instead run the query
: > >
: > > 
http://src-eprice-dev:8080/solr/0/select/?q=*:*&qt=mytype&fl=otherfield,mymultivaluedfiled
: > >
: > > I don't get the error
: > >
: > > Have you got any suggestions?
: > >
: > > I'm using solr 4 beta
: > >
: > > solr-spec 4.0.0.2012.08.06.22.50.47
: > > lucene-impl 4.0.0-BETA 1370099
: > >
: > >
: > > Giovanni
:
:
: --
:
:
:  Giovanni Bricconi
:
: Banzai Consulting
: cell. 348 7283865
: ufficio 02 00643839
: via Gian Battista Vico 42
: 20132 Milano (MI)
:
:
:
:

-Hoss



--


 Giovanni Bricconi

Banzai Consulting
cell. 348 7283865
ufficio 02 00643839
via Gian Battista Vico 42
20132 Milano (MI)


Re: multivalued field question (FieldCache error)

2012-10-04 Thread giovanni.bricc...@banzai.it
.solr.search.ReturnFields.(ReturnFields.java:101)
: at org.apache.solr.search.ReturnFields.(ReturnFields.java:77)
: at
: 
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:97)
: at
: 
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:185)
: at
: 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
: at org.apache.solr.core.SolrCore.execute(SolrCore.java:1656)
: at
: 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:454)
: at
: 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:275)
: at
: 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
: at
: 
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
: at
: 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
: at
: 
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
: at
: org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
: at
: org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
: at
: 
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
: at
: org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
: at
: org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
: at
: 
org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
: at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
: at java.lang.Thread.run(Thread.java:662)
:
: Il 02/10/12 19:40, Chris Hostetter ha scritto:
: > : I'm also using that field for a facet:
: >
: > Hmmm... that still doesn't make sense.  faceting can use FieldCache, but
: > it will check if ht field is mutivalued to decide if/when/how to do this.
: >
: > There's nothing else in your requestHandler config that would suggest why
: > you might get this error.
: >
: > can you please provide more details about the error you are getting -- in
: > particular: the completley stack trace from the server logs.  that should
: > help us itendify the code path leading to the problem.
: >
: >
: > :
: > : |
: > : 
: > :  dismax
: > :  explicit
: > :  1
: > :  
: > :many field but not store_slug
: > :  
: > :  
: > :|many field but not store_slug|||
: > : 
: > : ..., store_slug
: > :  
: > :   
: > :  2
: > :  2
: > :  *:*
: > : default
: > :   true
: > :   true
: > :   10
: > :   true  
: > : true
: > : 1
: > : 0
: > : count
: > : ...
: > : store_slug
: > : ...
: > : false
: > : 
: > : 
: > :   spellcheck
: > : 
: > :
: > :   |
: > :
: > :
: > : Il 01/10/12 18:34, Erik Hatcher ha scritto:
: > : > How is your request handler defined?  Using store_slug for anything but
: > fl?
: > : >
: > : > Erik
: > : >
: > : > On Oct 1, 2012, at 10:51,"giovanni.bricc...@banzai.it"
: > : >   wrote:
: > : >
: > : > > Hello,
: > : > >
: > : > > I would like to put a multivalued field into a qt definition as output
: > : > > field. to do this I edit the current solrconfig.xml definition and add
: > the
: > : > > field in the fl specification.
: > : > >
: > : > > Unexpectedly when I do the query q=*:*&qt=mytype I get the error
: > : > >
: > : > > 
: > : > > can not use FieldCache on multivalued field: store_slug
: > : > > 
: > : > >
: > : > > But if I instead run the query
: > : > >
: > : > >
: > 
http://src-eprice-dev:8080/solr/0/select/?q=*:*&qt=mytype&fl=otherfield,mymultivaluedfiled
: > : > >
: > : > > I don't get the error
: > : > >
: > : > > Have you got any suggestions?
: > : > >
: > : > > I'm using solr 4 beta
: > : > >
: > : > > solr-spec 4.0.0.2012.08.06.22.50.47
: > : > > lucene-impl 4.0.0-BETA 1370099
: > : > >
: > : > >
: > : > > Giovanni
: > :
: > :
: > : --
: > :
: > :
: > :  Giovanni Bricconi
: > :
: > : Banzai Consulting
: > : cell. 348 7283865
: > : ufficio 02 00643839
: > : via Gian Battista Vico 42
: > : 20132 Milano (MI)
: > :
: > :
: > :
: > :
: >
: > -Hoss
:
:
: --
:
:
:  Giovanni Bricconi
:
: Banzai Consulting
: cell. 348 7283865
: ufficio 02 00643839
: via Gian Battista Vico 42
: 20132 Milano (MI)
:

-Hoss



--


 Giovanni Bricconi

Banzai Consulting
cell. 348 7283865
ufficio 02 00643839
via Gian Battista Vico 42
20132 Milano (MI)


test.tar.bz2
Description: BZip2 compressed data


Re: multivalued field question (FieldCache error)

2012-10-08 Thread giovanni.bricc...@banzai.it

Thank you very much!

I've put every fl field in my solrconfig on a single line with the spaces 
removed, and now the app works fine.


Giovanni

On 05/10/12 20:49, Chris Hostetter wrote:

: So extracting the attachment you will be able to track down what happens
:
: this is the query that shows the error, and below you can see the latest stack
: trace and the qt definition

Awesome -- exactly what we needed.

I've reproduced your problem, and verified that it has something to do
with the extra newlines which are confusing the parsing into not
recognizing "store_slug" as a simple field name.

The workaround is to modify the fl in your config to look like this...

  sku,store_slug

...or even like this...

 sku,  store_slug   

...and then it should work fine.
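
(In solrconfig.xml terms that would be something like the following; the tag 
name is an assumption, since the surrounding XML was stripped by the archive:

  <str name="fl">sku,store_slug</str>)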

having a newline immediately following the store_slug field name is
somehow confusing things, and making it not recognize "store_slug" as a
simple field name -- so then it tries to parse it as a function, and
since bare field names can also be used as functions that parsing works,
but then you get the error that the field can't be used as a function
since it's multivalued.

I'll try to get a fix for this into 4.0-FINAL...

https://issues.apache.org/jira/browse/SOLR-3916

-Hoss