Boosting a field with defType:dismax --> No results at all
Hi there, I want to boost a field, see below. If I add defType=dismax I don't get any results at all anymore. What am I doing wrong? Regards, Uwe

[request handler defaults from solrconfig.xml, stripped by the list archive; the recognizable parameters are defType=dismax, qf=SignalImpl.baureihe^1011 text^0.1, and the spellcheck component]
Re: Boosting a field with defType:dismax --> No results at all
Perfect, thanks a lot! That was the mistake.

From: Jack Krupansky-2 [via Lucene]
Sent: Wednesday, 16 October 2013 14:55
To: uwe72
Subject: Re: Boosting a field with defType:dismax --> No results at all

Get rid of the newlines before and after the value of the qf parameter.

-- Jack Krupansky
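For reference, a minimal sketch of what the relevant part of the request handler in solrconfig.xml might look like once the stray newlines are removed; the defType and the field boosts are taken from the thread, the handler name and everything else are illustrative assumptions:

<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="defType">dismax</str>
    <!-- the whole qf value on one line, with no leading or trailing newlines -->
    <str name="qf">SignalImpl.baureihe^1011 text^0.1</str>
  </lst>
</requestHandler>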
Re: Boosting a field with defType:dismax --> No results at all
We have just one more problem: when we search explicitly, for example *:* or partNumber:A32783627, we still don't get any results. What are we doing wrong here?
Re: Boosting a field with defType:dismax --> No results at all
Does it work like this?

defType=edismax
qf=SignalImpl.baureihe^1011 text^0.1

Another option: how about just giving the desired fields a high boost factor while adding the field to the document with SolrJ? Can that work?
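Index-time boosting from SolrJ is possible; a minimal sketch, assuming SolrJ 4.x class names (with 3.x the server class would be CommonsHttpSolrServer), an illustrative core URL and id, and a schema in which the boosted field does not have omitNorms="true" (otherwise the boost is silently dropped):

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class IndexTimeBoost {
    public static void main(String[] args) throws Exception {
        SolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "SignalImpl@4711");                 // illustrative id
        doc.addField("SignalImpl.baureihe", "BR222", 10.0f);   // field value with an index-time boost
        doc.addField("text", "some searchable text");
        server.add(doc);
        server.commit();
    }
}

Note the trade-off: index-time boosts are baked into the index, so changing them means re-indexing, while query-time qf boosts can be tuned per request.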
Prevent public access to Solr Admin Page
Hi there, how can I prevent everybody who knows the URL of our Solr admin page from accessing it? Thanks in advance! Uwe
Re: Prevent public access to Solr Admin Page
Unfortunately I didn't understand that at all. We are using Tomcat for the Solr server. How exactly can I prevent users from accessing the Solr admin page?
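One common approach when Solr runs inside Tomcat is container-managed security on the Solr webapp. A minimal sketch, assuming BASIC authentication and a role named solr-admin defined in Tomcat's tomcat-users.xml; all names here are placeholders, and this is only one of several possible setups (a fronting reverse proxy or firewall rules are alternatives):

<!-- in the Solr webapp's WEB-INF/web.xml -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Solr</web-resource-name>
    <url-pattern>/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>solr-admin</role-name>
  </auth-constraint>
</security-constraint>
<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Solr</realm-name>
</login-config>
<security-role>
  <role-name>solr-admin</role-name>
</security-role>

Be aware that protecting /* also blocks the application's own search and update requests unless they send the same credentials; a narrower url-pattern (for example only the admin paths) may be preferable.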
SOLR-4641: Schema now throws exception on illegal field parameters.
Is there a way to tell Solr that it should not check these parameters? We added our own parameters, which we load at runtime for other purposes. Thanks in advance!
Re: SOLR-4641: Schema now throws exception on illegal field parameters.
Erick, I think he didn't add the validate="false" to a field, but globally to schema.xml/solrconfig.xml (I don't remember where exactly this is defined globally).

From: Erick Erickson [via Lucene]
Sent: Thursday, 13 June 2013 00:51
To: uwe72
Subject: Re: SOLR-4641: Schema now throws exception on illegal field parameters.

bbarani: Where did you see this? I haven't seen it before, and I get an error on startup if I add validate="false" to a definition.

Thanks,
Erick

On Tue, Jun 11, 2013 at 12:33 PM, bbarani wrote:
> I think if you use validate=false in schema.xml, at field or dynamicField level, Solr will not disable validation.
>
> I think this only works in Solr 4.3 and above.
Re: SOLR-4641: Schema now throws exception on illegal field parameters.
How can I load these custom properties with SolrJ?

From: Erick Erickson [via Lucene]
Sent: Thursday, 13 June 2013 00:53
To: uwe72
Subject: Re: SOLR-4641: Schema now throws exception on illegal field parameters.

But see Steve Rowe's comments at https://issues.apache.org/jira/browse/SOLR-4641 and use custom child properties as: VALUE ... [the XML example was stripped by the list archive]

Best,
Erick
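SolrJ (as of the 4.x line) has no dedicated API for arbitrary custom schema properties, so one workable sketch is to fetch schema.xml over HTTP and read the child elements yourself. The /admin/file endpoint below assumes the default ShowFileRequestHandler is registered for the core, and the idea that custom properties appear as child elements of <field> follows the child-property suggestion above; both are assumptions for illustration:

import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class SchemaCustomProps {
    public static void main(String[] args) throws Exception {
        // Fetch the live schema.xml from the core
        URL url = new URL("http://localhost:8983/solr/core-main/admin/file?file=schema.xml&contentType=text/xml");
        Document schema = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(url.openStream());

        NodeList fields = schema.getElementsByTagName("field");
        for (int i = 0; i < fields.getLength(); i++) {
            Element field = (Element) fields.item(i);
            NodeList children = field.getChildNodes();
            for (int j = 0; j < children.getLength(); j++) {
                if (children.item(j) instanceof Element) {
                    // Custom child properties live as child elements of <field>
                    Element prop = (Element) children.item(j);
                    System.out.println(field.getAttribute("name") + " : "
                            + prop.getTagName() + " = " + prop.getTextContent());
                }
            }
        }
    }
}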
Pls help: Very long query - what to do?
my query is like this, see below. I use already POST request. i got a solr exception: org.apache.solr.client.solrj.SolrServerException: Server at http://server:7056/solr returned non ok status:400, message:Bad Request is there a way in order to prevent this? id:("ModuleImpl@20117" OR "ModuleImpl@37886" OR "ModuleImpl@9379" OR "ModuleImpl@37906" OR "ModuleImpl@19969" OR "ModuleImpl@37936" OR "ModuleImpl@115568" OR "ModuleImpl@19901" OR "ModuleImpl@115472" OR "ModuleImpl@20044" OR "ModuleImpl@25168" OR "ModuleImpl@38026" OR "ModuleImpl@115647" OR "ModuleImpl@115648" OR "ModuleImpl@115649" OR "ModuleImpl@20045" OR "ModuleImpl@25169" OR "ModuleImpl@38031" OR "ModuleImpl@115650" OR "ModuleImpl@21090" OR "ModuleImpl@38037" OR "ModuleImpl@117097" OR "ModuleImpl@21091" OR "ModuleImpl@38038" OR "ModuleImpl@117098" OR "ModuleImpl@117099" OR "ModuleImpl@19973" OR "ModuleImpl@38040" OR "ModuleImpl@115571" OR "ModuleImpl@115572" OR "ModuleImpl@115573" OR "ModuleImpl@21092" OR "ModuleImpl@38135" OR "ModuleImpl@117100" OR "ModuleImpl@21093" OR "ModuleImpl@38136" OR "ModuleImpl@117101" OR "ModuleImpl@117102" OR "ModuleImpl@19979" OR "ModuleImpl@38140" OR "ModuleImpl@115581" OR "ModuleImpl@19980" OR "ModuleImpl@38143" OR "ModuleImpl@115582" OR "ModuleImpl@115583" OR "ModuleImpl@21094" OR "ModuleImpl@38223" OR "ModuleImpl@117104" OR "ModuleImpl@117105" OR "ModuleImpl@117106" OR "ModuleImpl@117107" OR "ModuleImpl@117108" OR "ModuleImpl@21095" OR "ModuleImpl@38224" OR "ModuleImpl@117109" OR "ModuleImpl@19920" OR "ModuleImpl@25157" OR "ModuleImpl@38240" OR "ModuleImpl@115493" OR "ModuleImpl@20139" OR "ModuleImpl@38286" OR "ModuleImpl@115752" OR "ModuleImpl@21096" OR "ModuleImpl@38327" OR "ModuleImpl@117111" OR "ModuleImpl@117112" OR "ModuleImpl@117113" OR "ModuleImpl@21097" OR "ModuleImpl@38328" OR "ModuleImpl@117114" OR "ModuleImpl@19989" OR "ModuleImpl@25166" OR "ModuleImpl@38332" OR "ModuleImpl@115585" OR "ModuleImpl@115586" OR "ModuleImpl@19990" OR "ModuleImpl@38339" OR "ModuleImpl@115587" OR "ModuleImpl@115588" OR "ModuleImpl@115589" OR "ModuleImpl@115590" OR "ModuleImpl@115591" OR "ModuleImpl@115592" OR "ModuleImpl@115593" OR "ModuleImpl@115594" OR "ModuleImpl@115595" OR "ModuleImpl@19807" OR "ModuleImpl@38365" OR "ModuleImpl@115365" OR "ModuleImpl@115366" OR "ModuleImpl@19808" OR "ModuleImpl@38373" OR "ModuleImpl@115367" OR "ModuleImpl@115368" OR "ModuleImpl@115369" OR "ModuleImpl@115370" OR "ModuleImpl@115371" OR "ModuleImpl@21121" OR "ModuleImpl@38418" OR "ModuleImpl@117132" OR "ModuleImpl@117133" OR "ModuleImpl@117134" OR "ModuleImpl@732" OR "ModuleImpl@38438" OR "ModuleImpl@117115" OR "ModuleImpl@21099" OR "ModuleImpl@38440" OR "ModuleImpl@117116" OR "ModuleImpl@19929" OR "ModuleImpl@38450" OR "ModuleImpl@115501" OR "ModuleImpl@115502" OR "ModuleImpl@19810" OR "ModuleImpl@38471" OR "ModuleImpl@115372" OR "ModuleImpl@115373" OR "ModuleImpl@21124" OR "ModuleImpl@38529" OR "ModuleImpl@117135" OR "ModuleImpl@117136" OR "ModuleImpl@117137" OR "ModuleImpl@117138" OR "ModuleImpl@19931" OR "ModuleImpl@115505" OR "ModuleImpl@21074" OR "ModuleImpl@38546" OR "ModuleImpl@117077" OR "ModuleImpl@19934" OR "ModuleImpl@38548" OR "ModuleImpl@115507" OR "ModuleImpl@115508" OR "ModuleImpl@115509" OR "ModuleImpl@115510" OR "ModuleImpl@20550" OR "ModuleImpl@38607" OR "ModuleImpl@115885" OR "ModuleImpl@21127" OR "ModuleImpl@38638" OR "ModuleImpl@117139" OR "ModuleImpl@21077" OR "ModuleImpl@25182" OR "ModuleImpl@38657" OR "ModuleImpl@117078" OR "ModuleImpl@117079" OR "ModuleImpl@117080" OR "ModuleImpl@19938" OR 
"ModuleImpl@38658" OR "ModuleImpl@115516" OR "ModuleImpl@115517" OR "ModuleImpl@115518" OR "ModuleImpl@115519" OR "ModuleImpl@19864" OR "ModuleImpl@115432" OR "ModuleImpl@19769" OR "ModuleImpl@38695" OR "ModuleImpl@115320" OR "ModuleImpl@20556" OR "ModuleImpl@38720" OR "ModuleImpl@20494" OR "ModuleImpl@38736" OR "ModuleImpl@19871" OR "ModuleImpl@115438" OR "ModuleImpl@21056" OR "ModuleImpl@38771" OR "ModuleImpl@19775" OR "ModuleImpl@19776" OR "ModuleImpl@38802" OR "ModuleImpl@115330" OR "ModuleImpl@115331" OR "ModuleImpl@115332" OR "ModuleImpl@20566" OR "ModuleImpl@38835" OR "ModuleImpl@115889" OR "ModuleImpl@115890" OR "ModuleImpl@20501" OR "ModuleImpl@38846" OR "ModuleImpl@115869" OR "ModuleImpl@115870" OR "ModuleImpl@21107" OR "ModuleImpl@38859" OR "ModuleImpl@117118" OR "ModuleImpl@19879" OR "ModuleImpl@38871" OR "ModuleImpl@115444" OR "ModuleImpl@115445" OR "ModuleImpl@21058" OR "ModuleImpl@38873" OR "ModuleImpl@19823" OR "ModuleImpl@25153" OR "ModuleImpl@38896" OR "ModuleImpl@115396" OR "ModuleImpl@115397" OR "ModuleImpl@19779" OR "ModuleImpl@38904" OR "ModuleImpl@115334" OR "ModuleImpl@115335" OR "ModuleImpl@115336" OR "ModuleImpl@20574" OR "ModuleImpl@38932" OR "ModuleImpl@115892" OR "ModuleImpl@115893" OR "ModuleImpl@20504" OR "ModuleImpl@38941" OR "ModuleImpl@115871" OR "ModuleImpl@115872" OR "ModuleImpl@21083" OR "ModuleImpl@38962" OR "ModuleImpl@117081" OR "ModuleImpl@117082" OR "ModuleImpl@19884" OR "ModuleImpl@3896
Re: Pls help: Very long query - what to do?
I have that already: [the configuration snippet was stripped by the list archive]
Re: Pls help: Very long query - what to do?
Yes, it works when I increase maxBooleanClauses. But in any case I have to think about how to redesign the document structure. I have big problems with the relations between documents: a document can be changed, and then I have to update many documents which have a relation to the modified one.
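For reference, that limit lives in the <query> section of solrconfig.xml; a sketch (the value 20000 is purely illustrative — the default is 1024, and very large values trade memory and query time for the bigger clause count):

<query>
  <maxBooleanClauses>20000</maxBooleanClauses>
  <!-- other <query> settings unchanged -->
</query>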
Re: Pls help: Very long query - what to do?
My design is like this at the moment: documents in general have relations to each other. A document has an id, some attributes, and a multi-valued field "navigateTo". For example:

Document1: id1, some attributes, navigateToAllDocumentsWhenColor:red, navigateTo: id2, id3
Document2: id2, some attributes, color:red, navigateTo: id1 (backlink)
Document3: id3, some attributes, color:red, navigateTo: id1 (backlink)
Document4: id5, some attributes, color:black, navigateTo:

My first problem is that when I re-import Document3 I have to load into the cache all documents which have a relation to my document, because its color is red. Especially when the color is not red anymore, I have to update Document1 and delete the relation to Document3. Always running the queries to find out which documents I have to update, then loading and updating them, costs a lot of performance. That's why I changed the design: I don't build all relations at import time anymore. I keep some serialized hashmaps which I store and update outside of Solr; in these maps I have the information about which documents are related to a document, i.e. all the ids. But now I have the problem that this can be up to 20,000 ids, so I think it is impossible to load them with OR...OR...OR.

It is a bit complicated to explain... I am using Solr 3.6.1. I think Solr 4 has this join feature, where you can join other queries; not sure if this would fix my problem.

Regards, Uwe
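On the Solr 4 join feature mentioned above, a sketch of what such a query could look like for the example documents; the field names (id, navigateTo, color) are taken from the example, and whether the join fits the real schema and scales to this index is an assumption to be verified:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class JoinQuery {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        // Return documents whose navigateTo field points at a document matching color:red,
        // without enumerating thousands of ids in a boolean query.
        SolrQuery q = new SolrQuery("{!join from=id to=navigateTo}color:red");
        QueryResponse rsp = server.query(q);
        System.out.println("hits: " + rsp.getResults().getNumFound());
    }
}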
Re: OutOfMemoryError | While Faceting Query
You mean this: stats: entries_count : 24 entry#0 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'WiringDiagramSheetImpl.pageNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#32159051 entry#1 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'BandagierungImpl.sachnummer',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#2383166 entry#2 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'ModuleImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#23443846 entry#3 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'SingleCoreWireImpl.wireNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#24586189 entry#4 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'EinzelleitungImpl.bezeichnung',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#30808319 entry#5 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'WiringDiagramSheetImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#16987461 entry#6 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'WiringDiagramSheetImpl.pageNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#27154168 entry#7 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'BandagierungImpl.sachnummer',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#6277146 entry#8 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'ModuleImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#4860238 entry#9 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'SingleCoreWireImpl.wireNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#14545746 entry#10 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'EinzelleitungImpl.bezeichnung',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#26324419 entry#11 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_3kj.frq")'=>'WiringDiagramSheetImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#19329933 entry#12 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'WiringDiagramSheetImpl.pageNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#4187113 entry#13 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'BandagierungImpl.sachnummer',class 
org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#9180601 entry#14 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'ModuleImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#15091934 entry#15 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'SingleCoreWireImpl.wireNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#12186256 entry#16 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'EinzelleitungImpl.bezeichnung',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#31719847 entry#17 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_192.frq")'=>'WiringDiagramSheetImpl.partNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#2653949 entry#18 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'WiringDiagramSheetImpl.versionAsDate',long,org.apache.lucene.search.FieldCache.NUMERIC_UTILS_LONG_PARSER=>[J#11082048 entry#19 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'ModuleImpl.versionAsDate',long,org.apache.lucene.search.FieldCache.NUMERIC_UTILS_LONG_PARSER=>[J
Bad performance while query pdf solr documents
Hi, I am indexing PDF documents into Solr via Tika. When I run the query in the client with SolrJ, the performance is very bad: 40 seconds to load 100 documents, probably because all the content is loaded. I don't need the content. How can I tell the query not to load the content? Or are there other reasons why the performance is so bad? Regards, Uwe
Re: Bad performance while query pdf solr documents
>>> your query-time fl parameter. Does that mean "don't return" this field? Because we have many, many fields, so at the moment I use the default and all fields are loaded. I just want to tell the query not to load the "text" field. Do I do this with the fl parameter?
Re: Bad performance while query pdf solr documents
We have more than a hundred fields... I don't want to put them all into the fl parameter. Is there another way, like saying "return all fields except field X"? In any case I will change the field from stored="true" to stored="false" in the schema.
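As far as I know there is no "all fields except X" syntax for fl in Solr 3.6/4.0, so the two options really are setting stored="false" on the text field or listing the fields the client actually needs. A minimal SolrJ sketch of the latter; the core URL and field names other than id and partNumber are illustrative assumptions:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class FlExample {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        SolrQuery q = new SolrQuery("*:*");
        // Only the listed stored fields are fetched and shipped to the client;
        // the large "text" field is simply not requested.
        q.setFields("id", "partNumber", "title");
        q.setRows(100);
        QueryResponse rsp = server.query(q);
        System.out.println(rsp.getResults().size() + " docs returned");
    }
}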
SolrJ | Add a date field to ContentStreamUpdateRequest
Hi there, how can I add a date field to a PDF document?

ContentStreamUpdateRequest up = new ContentStreamUpdateRequest("/update/extract");
up.addFile(pdfFile, "application/octet-stream");
up.setParam("literal." + SolrConstants.ID, solrPDFId);

Regards, Uwe
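Additional fields for the extracted document can be passed the same way as the id, via literal.<fieldname> parameters; date values need Solr's ISO-8601 UTC format. A sketch, assuming a date field named creationDate exists in schema.xml and that the core URL and file path are placeholders:

import java.io.File;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

public class ExtractWithDate {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        ContentStreamUpdateRequest up = new ContentStreamUpdateRequest("/update/extract");
        up.addFile(new File("manual.pdf"), "application/pdf");
        up.setParam("literal.id", "pdf-4711");

        // Solr date fields expect UTC in the form yyyy-MM-dd'T'HH:mm:ss'Z'
        SimpleDateFormat iso = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
        iso.setTimeZone(TimeZone.getTimeZone("UTC"));
        up.setParam("literal.creationDate", iso.format(new Date()));

        server.request(up);
        server.commit();
    }
}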
Re: SolrJ | IOException while Indexing a PDF document with additional fields
Wasn't it the stack trace in my posting before? The behavior is the same when I use HttpSolrServer.java. Here is the console output of the Solr server:

03.01.2013 11:32:31 org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 1
03.01.2013 11:32:31 org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: [core-main] webapp=/solr path=/update params={wt=javabin&version=2} {add=[WiringDiagramSheetImpl@17171]} 0 296
03.01.2013 11:32:31 org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start commit{flags=0,_version_=0,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false}
03.01.2013 11:32:32 org.apache.solr.core.SolrDeletionPolicy onCommit
INFO: SolrDeletionPolicy.onCommit: commits:num=2
commit{dir=C:\Projects\Project_ConnectPORTAL\connect-portal\tools\solr\solr-home-4.0\core-main\data\index,segFN=segments_1,generation=1,filenames=[segments_1]
commit{dir=C:\Projects\Project_ConnectPORTAL\connect-portal\tools\solr\solr-home-4.0\core-main\data\index,segFN=segments_2,generation=2,filenames=[_0_Lucene40_0.tim, _0.fnm, _0.tvd, _0.tvf, _nrm.cfs, _0_Lucene40_0.prx, _0_Lucene40_0.tip, _0_Lucene40_0.frq, _0.tvx, _0_nrm.cfe, segments_2, _0.fdx, _0.si, _0.fdt]
03.01.2013 11:32:32 org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 2
03.01.2013 11:32:32 org.apache.solr.search.SolrIndexSearcher
INFO: Opening Searcher@7f2ea1dd main
03.01.2013 11:32:32 org.apache.solr.update.DirectUpdateHandler2 commit
INFO: end_commit_flush
03.01.2013 11:32:32 org.apache.solr.core.SolrCore registerSearcher
INFO: [core-main] Registered new searcher Searcher@7f2ea1dd main{StandardDirectoryReader(segments_2:3 _0(4.0.0.2):C1)}
03.01.2013 11:32:32 org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: [core-main] webapp=/solr path=/update params={waitSearcher=true&wt=javabin&commit=true&softCommit=false&version=2} {commit=} 0 375
SolrJ and Solr 4.0 | doc.getFieldValue() returns String instead of Date
A Lucene 4.0 document now returns a String value for a date field instead of a Date object: "2009-10-29T00:00:009Z". With Solr 3.6 we got a Date instance. Can this be set somewhere in the config? I would prefer to receive a Date instance.
SolrJ |ContentStreamUpdateRequest | Accessing parsed items without committing to solr
I have a somewhat strange use case. When I index a PDF into Solr I use ContentStreamUpdateRequest; the Lucene document then contains the parsed content of the physical PDF in the "text" field. I also need to add this parsed content to another Lucene document. Is there a way to receive/parse this content just in memory, without committing it to Lucene?
Re: SolrJ |ContentStreamUpdateRequest | Accessing parsed items without committing to solr
Yes, I don't really want to index/store the PDF document in Lucene; I just need the parsed text for other things. So you mean I can use ExtractingRequestHandler.java to retrieve it? Does anybody have a piece of code doing that? Basically I give the PDF as input and want the parsed text back (the same as what would end up in the "text" field of the stored Lucene doc).
Re: SolrJ |ContentStreamUpdateRequest | Accessing parsed items without committing to solr
OK, this seems to work:

Tika tika = new Tika();
String tokens = tika.parseToString(file);
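For completeness, a self-contained version of that approach; it assumes the Tika jars (tika-core plus tika-parsers, or tika-app) are on the classpath, and the file path is illustrative:

import java.io.File;
import org.apache.tika.Tika;

public class ExtractText {
    public static void main(String[] args) throws Exception {
        Tika tika = new Tika();
        // Returns the extracted plain text without touching the Solr index at all
        String text = tika.parseToString(new File("manual.pdf"));
        System.out.println(text);
    }
}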
Re: SolrJ |ContentStreamUpdateRequest | Accessing parsed items without committing to solr
Erik, what do you mean by this parameter? I can't find it...
SolrJ | Atomic Updates | How works exactly?
I have very big documents in the index. I want to update a multi-valued field of a document without loading the whole document. How can I do this? Is there good documentation somewhere? Regards
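A sketch of an atomic update with SolrJ, assuming Solr 4.x with the updateLog enabled, a _version_ field in the schema, and all other fields stored (Solr reconstructs the rest of the document from stored fields, so the client never loads it); the id and field values are taken from the examples earlier in this archive and are purely illustrative:

import java.util.Collections;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class AtomicUpdate {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "ModuleImpl@20117");
        // "add" appends a value to the multi-valued field; "set" would replace its contents
        doc.addField("navigateTo", Collections.singletonMap("add", "ModuleImpl@37886"));
        server.add(doc);
        server.commit();
    }
}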
java.io.IOException: Map failed :: OutOfMemory
While adding a Lucene document we got this problem. What can we do here?

Nov 12, 2012 3:25:09 PM org.apache.solr.update.DirectUpdateHandler2 commit
INFO: start commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false)
Exception in thread "Lucene Merge Thread #0" org.apache.lucene.index.MergePolicy$MergeException: java.io.IOException: Map failed
    at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:509)
    at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:482)
Caused by: java.io.IOException: Map failed
    at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:748)
    at org.apache.lucene.store.MMapDirectory$MMapIndexInput.(MMapDirectory.java:270)
    at org.apache.lucene.store.MMapDirectory.openInput(MMapDirectory.java:220)
    at org.apache.lucene.index.TermVectorsReader.(TermVectorsReader.java:87)
    at org.apache.lucene.index.SegmentCoreReaders.openDocStores(SegmentCoreReaders.java:243)
    at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:118)
    at org.apache.lucene.index.IndexWriter$ReaderPool.get(IndexWriter.java:696)
    at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4238)
    at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3908)
    at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:388)
    at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:456)
Caused by: java.lang.OutOfMemoryError: Map failed
    at sun.nio.ch.FileChannelImpl.map0(Native Method)
    at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:745)
    ... 10 more
Nov 12, 2012 3:25:11 PM org.apache.solr.core.SolrDeletionPolicy onCommit
INFO: SolrDeletionPolicy.onCommit: commits:num=2
Nov 12, 2012 5:16:41 PM org.apache.solr.update.SolrIndexWriter finalize
SEVERE: SolrIndexWriter was not closed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!
Nov 12, 2012 5:16:41 PM org.apache.solr.update.SolrIndexWriter finalize
SEVERE: SolrIndexWriter was not closed prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!
Re: java.io.IOException: Map failed :: OutOfMemory
Thanks, Erick. We are using:

export JAVA_OPTS="-XX:MaxPermSize=400m -Xmx2000m -Xms200M -Dsolr.solr.home=/home/connect/ConnectPORTAL/preview/solr-home"

We have around 5 million documents; the index size is around 50 GB. Before we add a document we delete the same id from the index, regardless of whether the doc exists or not; we use the SolrJ functionality to delete a list of ids. The error always occurs during this deletion.
Re: java.io.IOException: Map failed :: OutOfMemory
Kernel: 2.6.32.29-0.3-default #1 SMP 2011-02-25 13:36:59 +0100 x86_64 x86_64 x86_64 GNU/Linux
SUSE Linux Enterprise Server 11 SP1 (x86_64)
Physical memory: 4 GB

portadm@smtcax0033:/srv/connect/tomcat/instances/SYSTEST_Portal_01/bin> java -version
java version "1.6.0_33"
Java(TM) SE Runtime Environment (build 1.6.0_33-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.8-b03, mixed mode)
Re: java.io.IOException: Map failed :: OutOfMemory
today the same exception: INFO: [] webapp=/solr path=/update params={waitSearcher=true&commit=true&wt=javabin&waitFlush=true&version=2} status=0 QTime=1009 Nov 13, 2012 2:02:27 PM org.apache.solr.core.SolrDeletionPolicy onInit INFO: SolrDeletionPolicy.onInit: commits:num=1 commit{dir=/net/smtcax0033/connect/Portal/solr-home/data/index,segFN=segments_3gm,version=1352803609067,generation=4486,filenames=[_21c.fdt, _4mv.tis, _4mh.fnm, _1si.fdt, _4n0.fdx, _4mx.nrm, _1si.fdx, _2n0.nrm, _2n0.prx, _4mv.tii, _3ii.fnm, _4mz.tvd, _4mv.nrm, _2ie.frq, _1l9.fnm, _4my.fnm, _21c.fdx, _308.tvd, _4mz.tvf, _308.tvf, _sc.tis, _4mw.tii, _4n1.fnm, _4mv.fdt, _1o2.nrm, _1si.nrm, _4mw.fdt, _it.tvf, _4mv.fdx, _sc.tii, _4mw.tis, _4mw.fdx, _37y.tvx, _4mz.tvx, _4mh.nrm, _1si.prx, _1o2.prx, _it.tvx, _3ii.tis, _3yn.nrm, _43w.tii, _37y.tvd, _3yn.prx, _308.prx, _cv.nrm, _37y.tvf, _1b9.nrm, _3xp.frq, _43w.tis, _4mf.tvf, _4mf.tvd, _1b9.fdt, _4ag.fdt, _1b9.fdx, _4mz.frq, _4ag.fdx, _418.tvx, _4mf.tvx, _418.frq, _473.tis, _3ii.nrm, _4mx.fnm, _cv.frq, _3yn.tvd, _418.tvd, _3yn.tvf, _418.tvf, _2ie.tvf, _2ie.tvd, _sc.frq, _1b9.frq, _4ag.nrm, _37y.tii, _cv.prx, _4mx.tis, _4ag.prx, _2ie.tvx, _2n0.fdx, _4mx.tii, _4mh.prx, _4my.prx, _4mz.nrm, _4lc.prx, _2ie.nrm, _3yn.tis, _4n0.tii, _4mw.prx, _3yn.tvx, _it.fnm, _2n0.fdt, _4ag.frq, _21c.tvf, _21c.tvd, _21c.nrm, _43w.prx, _308.fdt, _4my.frq, _1si.tvx, _4n3.prx, _3yn.tii, _37y.tis, _4dj.fdt, _473.frq, _1l9.prx, _2ie.fnm, _4dj.fdx, _308.fdx, _473.tvx, _cv.fdx, _4mz.tii, _473.tii, _cv.fdt, _3xp.tii, _4lc.nrm, _2em.fnm, _it.tis, _418.fdx, _4n3.fdx, _3xp.tis, _418.fdt, _1ih.fdx, _it.tii, _4n3.fdt, _4ix.tis, _1ih.fdt, _4lc.fdt, _4ix.tii, _4mz.tis, _1b9.prx, _4n0.tis, _4lc.fdx, _473.tvd, _1ih.nrm, _2n0.frq, _473.tvf, _4mz.fdx, _sc.fdx, _it.nrm, _4mz.fdt, _4my.tvx, _4mx.tvf, _3ii.tii, _1b9.tvf, _4mx.tvd, _1b9.tvd, _418.prx, _3ii.tvx, _3xp.fnm, _4mv.tvx, _sc.fdt, _sc.prx, segments_3gm, _418.fnm, _2n0.tii, _4mf.tis, _sc.nrm, _4mf.tii, _4dj.nrm, _3ii.tvd, _1ih.frq, _3ii.tvf, _4n1.prx, _1o2.tii, _37y.frq, _2em.prx, _4n3.frq, _4ix.fdt, _473.fdt, _21c.prx, _1o2.tvx, _3xp.nrm, _473.fdx, _sc.fnm, _2n0.tis, _43w.fdt, _4mf.fnm, _4ix.fdx, _43w.fdx, _4dj.tis, _473.nrm, _4my.tvf, _4mx.tvx, _4mv.tvd, _1o2.tvd, _4my.tvd, _1o2.tvf, _4dj.tii, _4mv.frq, _1si.tvf, _4mv.tvf, _1si.tvd, _473.fnm, _4ix.frq, _cv.tvx, _4dj.tvd, _21c.tii, _473.prx, _4n1.tvx, _1ih.tvx, _1si.tis, _cv.tvf, _4ag.fnm, _1b9.tvx, _1ih.tvf, _1l9.fdx, _4lc.tii, _1ih.tvd, _4n1.fdx, _4lc.tis, _1l9.fdt, _21c.tis, _4dj.tvf, _1si.tii, _4n1.fdt, _4n0.fnm, _cv.tvd, _it.frq, _4mv.prx, _4mh.tis, _3xp.tvf, _4n0.tvf, _3xp.tvd, _4n0.tvd, _4mx.fdx, _4my.nrm, _4dj.frq, _4mx.fdt, _43w.frq, _1o2.frq, _4n0.tvx, _it.tvd, _1si.fnm, _4n3.tvx, _3xp.tvx, _4mz.prx, _4my.tis, _21c.tvx, _37y.prx, _1ih.tii, _4ix.prx, _4mh.fdt, _2n0.fnm, _4n3.tvf, _21c.fnm, _4mh.fdx, _2em.tvx, _1b9.tii, _308.frq, _4mx.prx, _37y.fdx, _3yn.fnm, _4n3.tvd, _4mh.tii, _4ag.tis, _4my.tii, _1b9.tis, _2ie.prx, _1ih.prx, _4ag.tii, _4n1.tvd, _1ih.fnm, _3ii.prx, _4ix.nrm, _4n1.tvf, _4n1.nrm, _2em.tvd, _4mv.fnm, _4mw.fnm, _37y.nrm, _it.fdx, _4mf.frq, _4n0.nrm, _3ii.frq, _it.fdt, _1o2.tis, _37y.fdt, _4dj.tvx, _4n3.fnm, _4lc.fnm, _4my.fdt, _4lc.frq, _2em.tvf, _4my.fdx, _37y.fnm, _4n0.prx, _1l9.tvd, _418.nrm, _2em.tis, _4mw.nrm, _3xp.prx, _2ie.tis, _3xp.fdx, _1l9.frq, _1l9.tvf, _4mf.nrm, _2em.tii, _4ix.fnm, _3xp.fdt, _4mh.tvd, _4mh.tvf, _2ie.tii, _1o2.fdt, _4mh.tvx, _4mf.fdt, _4n0.frq, _308.tii, _4mw.tvx, _4ag.tvx, _308.tis, _4n1.frq, _4mf.fdx, _sc.tvd, _sc.tvf, _3yn.fdt, _4mw.tvf, _4ag.tvf, _4mw.tvd, 
_3yn.fdx, _1o2.fdx, _43w.fnm, _1o2.fnm, _4ag.tvd, _1si.frq, _sc.tvx, _cv.tis, _4dj.fnm, _4mh.frq, _1ih.tis, _4lc.tvf, _2em.fdt, _4lc.tvd, _2em.frq, _4ix.tvd, _21c.frq, _3ii.fdt, _2em.fdx, _4ix.tvf, _4n1.tis, _cv.tii, _4mz.fnm, _308.tvx, _4dj.prx, _4lc.tvx, _43w.tvf, _308.fnm, _3yn.frq, _43w.tvd, _43w.nrm, _it.prx, _4mx.frq, _cv.fnm, _2n0.tvx, _1l9.tii, _4n0.fdt, _418.tis, _418.tii, _1l9.tis, _4n3.nrm, _1l9.nrm, _4mw.frq, _4mf.prx, _4ix.tvx, _1l9.tvx, _2ie.fdx, _1b9.fnm, _43w.tvx, _2n0.tvd, _4n3.tii, _2n0.tvf, _3ii.fdx, _4n1.tii, _2em.nrm, _4n3.tis, _308.nrm, _2ie.fdt] Nov 13, 2012 2:02:27 PM org.apache.solr.core.SolrDeletionPolicy updateCommits INFO: newest commit = 1352803609067 Nov 13, 2012 2:02:27 PM org.apache.solr.update.processor.LogUpdateProcessor finish INFO: {add=[SingleCoreWireImpl@3005994, SingleCoreWireImpl@3005997, SingleCoreWireImpl@3005996, SingleCoreWireImpl@3005999, SingleCoreWireImpl@3005998, SingleCoreWireImpl@3005985, SingleCoreWireImpl@3005984, SingleCoreWireImpl@3005987, ... (500 adds)]} 0 85 Nov 13, 2012 2:02:27 PM org.apache.solr.core.SolrCore execute INFO: [] webapp=/solr path=/update params={wt=javabin&version=2} status=0 QTime=85 Nov 13, 2012 2:02:27 PM org.apache.solr.update.DirectUpdateHandler2 commit INFO: start commit(optimize=false,waitFlush=true,waitSearcher=true,expungeDeletes=false) Exception in thread "Lucene Merge Thread #0" org.apache.lucene.index.MergePolicy$MergeException: java.io.
Re: java.io.IOException: Map failed :: OutOfMemory
Thanks, Andrew! In parallel I also found this thread, which talks about the same issue: http://grokbase.com/t/lucene/solr-user/117m8e9n8t/solr-3-3-exception-in-thread-lucene-merge-thread-1 We just started the importer again with the unlimited flag (ulimit -v unlimited); then we will see.
Inserting many documents and update relations
Hi there, I have a general question. We have around 5 million Lucene documents. At the beginning we have around 4000 XML files which we transform into SolrInputDocuments with SolrJ and add to the index. A document is also related to other documents, so while adding a document we have to run some queries (at least one) to find out whether related documents are already in the index, in order to create the association to the related document. The related document also has a "backlink", so we have to update the related document as well (load, update, delete and re-add). We are using Solr 3.6.1. The performance is quite slow because of these queries and the modifications of documents that already exist in the index. Are there configuration options, or anything else we can do? Thanks a lot in advance.
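One general lever for the raw indexing throughput (it does not solve the relation maintenance itself) is to send documents in batches and commit once at the end instead of per document. A minimal sketch, assuming SolrJ 4.x class names (with 3.6 the server class is CommonsHttpSolrServer); the core URL, batch size, and ids are illustrative:

import java.util.ArrayList;
import java.util.List;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class BatchImport {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/core-main");
        List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
        for (int i = 0; i < 10000; i++) {            // stand-in for iterating over the XML-derived documents
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "ModuleImpl@" + i);
            batch.add(doc);
            if (batch.size() == 500) {               // one request per 500 docs instead of per document
                server.add(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            server.add(batch);
        }
        server.commit();                              // a single commit at the end
    }
}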