Urgent: Partial Search not Working
All,

I am using apache-solr-4.0.0-ALPHA and trying to configure partial search on two fields.

The value stored in the ProdSymbl field is M1.6X0.35 9P, and I will have to get results when I search for M1.6 or X0.35 (a partial of the stored value).

I have tried using both NGramTokenizerFactory and solr.EdgeNGramFilterFactory in the field type in schema.xml, configured the two fields with that type, and added a copyField.

Please let me know if I am missing anything. This is an urgent requirement that needs to be addressed at the earliest. Please help.

Thanks in advance,

Jay
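For reference, a minimal sketch of the kind of schema.xml setup being described here; the type name, gram sizes, and copyField destination are assumptions, not the poster's actual config. A full n-gram tokenizer, applied only at index time, is what lets an interior fragment such as X0.35 match inside M1.6X0.35:

    <fieldType name="text_ngram" class="solr.TextField" positionIncrementGap="100">
      <analyzer type="index">
        <!-- Emits every substring of 2-15 chars, e.g. "M1.6X0.35" yields "M1.6" and "X0.35". -->
        <tokenizer class="solr.NGramTokenizerFactory" minGramSize="2" maxGramSize="15"/>
        <filter class="solr.LowerCaseFilterFactory"/>
      </analyzer>
      <analyzer type="query">
        <!-- The user's fragment is kept whole and matched against the indexed grams. -->
        <tokenizer class="solr.WhitespaceTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
      </analyzer>
    </fieldType>

    <field name="ProdSymbl" type="text_ngram" indexed="true" stored="true" multiValued="true"/>
    <copyField source="ProdSymbl" dest="text"/>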
Re: Urgent: Partial Search not Working
Could anyone please reply with a solution to this?

On Wed, Jul 4, 2012 at 7:18 PM, jayakeerthi s wrote:
> [quoted text of the original message trimmed]
Re: Urgent: Partial Search not Working
Hi Jack,

Many thanks for your reply. Yes, I have tried both the n-gram and edge n-gram filter factories, and still get no results. Please let me know of any alternatives.

On Thu, Jul 5, 2012 at 12:42 AM, Jack Krupansky wrote:
> You need to apply the edge n-gram filter only at index time, not at query time. So, you need to specify two analyzers for these field types, an "index" and a "query" analyzer. They should be roughly the same, but the "query" analyzer would not have the edge n-gram filter, since you are accepting the single n-gram given by the user and then matching it against the full list of n-grams that are in the index.
>
> It is unfortunate that the wiki example is misleading. Just as bad, we don't have an example in the example schema.
>
> Basically, take a "text" field type that you like from the Solr example schema and then add the edge n-gram filter to its "index" analyzer, probably as the last token filter. I would note that the edge n-gram filter will interact with the stemming filter, but there is not much you can do other than try different stemmers and experiment with whether stemming should be before or after the edge n-gram filter. I suspect that having stemming after edge n-gram may be better.
>
> -- Jack Krupansky
>
> -----Original Message----- From: jayakeerthi s
> Sent: Wednesday, July 04, 2012 1:41 PM
> To: solr-user@lucene.apache.org; solr-user-help@lucene.apache.org
> Subject: Re: Urgent: Partial Search not Working
>
> [quoted text of the earlier messages trimmed]
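To make the suggestion above concrete, a rough sketch of a field type with the edge n-gram filter applied only in the index analyzer; the type name and gram sizes are illustrative assumptions:

    <fieldType name="text_edge" class="solr.TextField" positionIncrementGap="100">
      <analyzer type="index">
        <tokenizer class="solr.WhitespaceTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
        <!-- Index-time only: each term is expanded into its leading grams. -->
        <filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="15" side="front"/>
      </analyzer>
      <analyzer type="query">
        <!-- No n-gram filter here: the user's fragment is matched as-is. -->
        <tokenizer class="solr.WhitespaceTokenizerFactory"/>
        <filter class="solr.LowerCaseFilterFactory"/>
      </analyzer>
    </fieldType>

Note that a front-anchored edge n-gram of the term M1.6X0.35 produces M1.6 but not the interior fragment X0.35; matching that kind of infix fragment would need a full n-gram approach (as in the sketch after the first message in this thread) or additional token splitting such as WordDelimiterFilterFactory.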
Urgent: Facetable but not Searchable Field
All,

We have a requirement where we need to make two fields facetable, but the values of those fields should not be searchable.

Please let me know whether this is supported in Solr, and if so, what configuration is needed in schema.xml and solrconfig.xml to achieve it.

This is kind of urgent, as we need to reply regarding the feasibility of the functionality.

Thanks in advance,

Jay
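One common pattern, offered as an assumption about the intent rather than a confirmed answer from the list: faceting in this era of Solr requires indexed="true", so the usual approach is to index the fields but keep them out of the fields that queries actually run against, for example out of the (e)dismax qf list or the default search field. The field and handler names below are hypothetical:

    <!-- schema.xml: indexed so they can be faceted on; not copied into the default search field. -->
    <field name="brand"    type="string" indexed="true" stored="false"/>
    <field name="category" type="string" indexed="true" stored="false"/>

    <!-- solrconfig.xml: brand/category are deliberately absent from qf. -->
    <requestHandler name="/select" class="solr.SearchHandler">
      <lst name="defaults">
        <str name="defType">edismax</str>
        <str name="qf">name description</str>
        <str name="facet">true</str>
        <str name="facet.field">brand</str>
        <str name="facet.field">category</str>
      </lst>
    </requestHandler>

This only keeps the fields out of default searches; an explicit fielded query such as brand:foo would still match unless blocked at the application layer.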
DIH Error in latest Nightly Builds
Hi All,

I tried indexing data and got the following error. I used the Solr nightly builds from Oct 5th and Oct 8th; the same configuration/query works in an older version (the May nightly build). The db-data-config.xml has a simple SELECT query.

SEVERE: Full Import failed
org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to execute query: select CATALOG_ID, CATALOG_NUMBER, CATALOG_NAME, SEGMENTATION_TYPE, BEGIN_OFFER_DATE, END_OFFER_DATE, FUTURE_BEGIN_DATE, FUTURE_END_DATE, ATONCE_BEGIN_DATE, ATONCE_END_DATE, REFERENCE_BEGIN_DATE, REFERENCE_END_DATE, BEGIN_SEASON, LANGUAGE, COUNTRY, SIZE_TYPE, CURRENCY, DIVISION, LIFECYCLE, PRODUCT_CD, STYLE_CD, GLOBAL_STYLE_NAME, REGION_STYLE_NAME, NEW_STYLE, SIZE_RUN, COLOR_NBR, GLOBAL_COLOR_DESC, REGION_COLOR_DESC, WIDTH, CATEGORY, SUB_CATEGORY, CATEGORY_SUMMARY, CATEGORY_CORE_FOCUS, SPORT_ACTIVITY, SPORT_ACTIVITY_SUMMARY, GENDER_AGE, GENDER_AGE_SUMMARY, SILO, SILHOUETTE, SILHOUETTE_SUMMARY, SEGMENTATION_TIER, PRIMARY_COLOR, NEW_PRODUCT, CARRYOVER_PRODUCT, WHOLESALE_AMOUNT, RETAIL_AMOUNT, CATALOG_LAST_MOD_DATE, PRODUCT_LAST_MOD_DATE, STYLE_LAST_MOD_DATE, CATALOG_ID || '-' || PRODUCT_CD as UNIQ from prodsearch_atlasatgcombine Processing Document # 1
    at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:72)
    at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:253)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:210)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:39)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.initQuery(SqlEntityProcessor.java:58)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:71)
    at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:237)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:356)
    at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:242)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:180)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:331)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:389)
    at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:370)
Caused by: java.sql.SQLException: Unsupported feature
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:269)
    at oracle.jdbc.dbaccess.DBError.throwUnsupportedFeatureSqlException(DBError.java:689)
    at oracle.jdbc.driver.OracleConnection.setHoldability(OracleConnection.java:3065)
    at org.apache.solr.handler.dataimport.JdbcDataSource$1.call(JdbcDataSource.java:191)
    at org.apache.solr.handler.dataimport.JdbcDataSource$1.call(JdbcDataSource.java:128)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getConnection(JdbcDataSource.java:363)
    at org.apache.solr.handler.dataimport.JdbcDataSource.access$300(JdbcDataSource.java:39)
    at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:240)
    ... 11 more

Oct 8, 2009 6:30:23 PM org.apache.solr.update.DirectUpdateHandler2 rollback
INFO: start rollback
Oct 8, 2009 6:30:23 PM org.apache.solr.update.DirectUpdateHandler2 rollback
INFO: end_rollback
2009-10-08 18:31:12.149::INFO: Shutdown hook executing
2009-10-08 18:31:12.149::INFO: Shutdown hook complete

Thanks and regards,
JK
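For what it's worth, the oracle.jdbc.dbaccess classes in the trace suggest a very old Oracle JDBC driver, and the newer DIH code is calling Connection.setHoldability, a JDBC 3.0 method that such drivers reject; dropping in a current ojdbc driver jar is the usual remedy, rather than changing the configuration itself. A sketch of the kind of dataSource definition involved, with placeholder connection details:

    <!-- db-data-config.xml: hypothetical values; needs a JDBC 3.0 capable Oracle driver jar
         (a current ojdbc jar) on the classpath instead of the old classes12-era driver. -->
    <dataSource type="JdbcDataSource"
                driver="oracle.jdbc.driver.OracleDriver"
                url="jdbc:oracle:thin:@dbhost:1521:SID"
                user="scott"
                password="tiger"/>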
query regarding Indexing xml files - db-data-config.xml
Hi All,

I am trying to index fields from XML files; here is the configuration I am using.

db-data-config.xml uses a FileListEntityProcessor entity (only these attribute fragments survive here; see the sketch below):

    fileName="c:\test\ipod_other.xml" recursive="true" rootEntity="false"
    dataSource="null" baseDir="${dataimporter.request.xmlDataDir}"

with an inner entity mapping the field "manu". Schema.xml has the field "manu".

The input XML file used to import the field contains a document with these values (the tags did not survive here):

    F8V7067-APL-KIT
    Belkin Mobile Power Cord for iPod w/ Dock
    Belkin
    electronics
    connector
    car power adapter, white
    4
    19.95
    1
    false

Doing a full-import, this is the response I am getting:

    0
    0
    0
    2009-05-15 11:58:00
    Indexing completed. Added/Updated: 0 documents. Deleted 0 documents.
    2009-05-15 11:58:00
    2009-05-15 11:58:00
    0:0:0.172
    This response format is experimental. It is likely to change in the future.

Am I missing anything here, or is there a required format for the input XML? Please help me resolve this.

Thanks and regards,
Jay
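A sketch consistent with the attribute fragments above; the XPathEntityProcessor details, the useSolrAddSchema shortcut (suggested later in this digest), and the literal baseDir in place of the ${dataimporter.request.xmlDataDir} request parameter are assumptions, not the poster's actual file:

    <dataConfig>
      <dataSource type="FileDataSource" encoding="UTF-8"/>
      <document>
        <!-- FileListEntityProcessor lists matching files; it has no data source of its own. -->
        <entity name="files" processor="FileListEntityProcessor"
                fileName=".*\.xml" recursive="true" rootEntity="false"
                dataSource="null" baseDir="c:\test">
          <!-- Each matched file is already in Solr's <add><doc> format, so
               useSolrAddSchema maps its <field name="..."> elements directly. -->
          <entity name="doc" processor="XPathEntityProcessor"
                  url="${files.fileAbsolutePath}"
                  useSolrAddSchema="true" stream="true"/>
        </entity>
      </document>
    </dataConfig>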
Re: query regarding Indexing xml files - db-data-config.xml
Many thanks for the reply. The complete input XML file is below; I missed including this earlier. (The tags did not survive here; the two documents contain these values.)

    F8V7067-APL-KIT
    Belkin Mobile Power Cord for iPod w/ Dock
    Belkin
    electronics
    connector
    car power adapter, white
    4
    19.95
    1
    false

    IW-02
    iPod & iPod Mini USB 2.0 Cable
    Belkin
    electronics
    connector
    car power adapter for iPod, white
    2
    11.50
    1
    false

regards,
Jay

On Fri, May 15, 2009 at 1:14 PM, Jay Hill wrote:
> If that is your complete input file, then it looks like you are missing the wrapping element around the documents. Is it possible you just forgot to include it?
>
> -Jay
>
> On Fri, May 15, 2009 at 12:53 PM, jayakeerthi s wrote:
> > [quoted text of the original message trimmed]
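For reference, the shape the reply is pointing at; a sketch only, and the field names (id, name, manu, cat, features, price, inStock) are assumptions based on the stock Solr example schema rather than anything confirmed in this thread:

    <add>
      <doc>
        <field name="id">F8V7067-APL-KIT</field>
        <field name="name">Belkin Mobile Power Cord for iPod w/ Dock</field>
        <field name="manu">Belkin</field>
        <field name="cat">electronics</field>
        <field name="cat">connector</field>
        <field name="features">car power adapter, white</field>
        <field name="price">19.95</field>
        <field name="inStock">false</field>
      </doc>
      <doc>
        <field name="id">IW-02</field>
        <field name="name">iPod &amp; iPod Mini USB 2.0 Cable</field>
        <field name="manu">Belkin</field>
        <field name="cat">electronics</field>
        <field name="cat">connector</field>
        <field name="features">car power adapter for iPod, white</field>
        <field name="price">11.50</field>
        <field name="inStock">false</field>
      </doc>
    </add>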
Re: Indexing issue in DIH - not all records are Indexed
Hi Noble,

Many thanks for the reply.

Yes, there is a uniqueKey in the schema, which is the product ID. I also tried PROD_ID, but no luck; still only one document is seen after querying *:*.

I have attached the Schema.xml used for your reference. Please advise.

Thanks and regards,
Jay

2009/5/16 Noble Paul നോബിള് नोब्ळ्:
> Check whether you have a uniqueKey in your schema. If there are duplicates, they are overwritten.
>
> On Sat, May 16, 2009 at 1:38 AM, jayakeerthi s wrote:
> > I am using Solr for our application with JBoss integration.
> >
> > I have managed to configure indexing from an Oracle DB for 22 fields. Here is the db-data-config.xml (only the dataSource attributes survive here):
> >
> >     driver="oracle.jdbc.driver.OracleDriver"
> >     url="jdbc:oracle:thin:@camatld6.***.com:1521:atlasint"
> >     user="service_product_lgd" password=""
> >
> > I have attached the Schema.xml used, and done a full-import:
> > http://localhost:8983/solr/dataimport?command=full-import
> >
> > The status response was:
> >
> >     0
> >     0
> >     C:\apache-solr-nightly\example\example-DIH\solr\db\conf\db-data-config.xml
> >     full-import
> >     idle
> >     1
> >     15
> >     0
> >     2009-05-11 11:27:02
> >     Indexing completed. Added/Updated: 15 documents. Deleted 0 documents.
> >     2009-05-11 11:27:05
> >     2009-05-11 11:27:05
> >     0:0:2.625
> >     This response format is experimental. It is likely to change in the future.
> >
> > The issue I am facing is: though the response is "Indexing completed. Added/Updated: 15 documents. Deleted 0 documents", I am able to see only one document when I query *:*, so the other 14 documents are missing. Similarly, I tried indexing 1 million records and found only 2500 docs with a *:* query.
> >
> > So could anyone please help resolve this?
> >
> > Regards,
> > Jay
>
> --
> Noble Paul | Principal Engineer | AOL | http://aol.com

[Attached Schema.xml excerpt: StandardTokenizer, StandardFilter, LowerCaseFilter, StopFilter, GermanStemFilter; PROD_ID, text]
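A minimal sketch of the uniqueKey arrangement in question; the field name comes from the message above, while the string type is an assumption:

    <!-- schema.xml: documents sharing the same PROD_ID replace each other on add. -->
    <field name="PROD_ID" type="string" indexed="true" stored="true" required="true"/>
    <uniqueKey>PROD_ID</uniqueKey>

Since documents that share a key overwrite each other, 15 rows collapsing to 1 visible document (or 1 million rows to roughly 2500) is the expected symptom of a non-unique key value. Using an analyzed "text" type for the key field can cause similar surprises, so a non-tokenized string type is the usual choice.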
Re: query regarding Indexing xml files - db-data-config.xml
Hi Noble,

Thanks for the reply. As advised, I have changed the db-data-config.xml, but the result is still "Indexing completed. Added/Updated: 0 documents. Deleted 0 documents."

When baseDir is removed, I get the error below:

INFO: last commit = 1242683454570
May 18, 2009 2:55:15 PM org.apache.solr.handler.dataimport.DataImporter doFullImport
SEVERE: Full Import failed
org.apache.solr.handler.dataimport.DataImportHandlerException: 'baseDir' is a required attribute Processing Document # 1
    at org.apache.solr.handler.dataimport.FileListEntityProcessor.init(FileListEntityProcessor.java:76)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:299)
    at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:225)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:167)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:324)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:382)
    at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:363)
May 18, 2009 2:55:15 PM org.apache.solr.update.DirectUpdateHandler2 rollback
INFO: start rollback

Please advise.

Thanks and regards,
Jay

2009/5/17 Noble Paul നോബിള് नोब्ळ्:
> Hi,
> You may not need that enclosing entity if you only wish to index one file.
>
> baseDir is not required if you give an absolute path in the fileName.
>
> There is no need to mention forEach or fields if you set useSolrAddSchema="true".
>
> On Sat, May 16, 2009 at 1:23 AM, jayakeerthi s wrote:
> > [quoted text of the original message trimmed]
>
> --
> Noble Paul | Principal Engineer | AOL | http://aol.com
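A note on the error above: 'baseDir' is demanded by FileListEntityProcessor itself, so the way to avoid baseDir entirely is to drop the outer file-list entity and point the XPathEntityProcessor directly at the single file, which seems to be what the reply is describing. A sketch of that single-file form (the entity name is an assumption; the path is the one used earlier in this thread):

    <dataConfig>
      <dataSource type="FileDataSource" encoding="UTF-8"/>
      <document>
        <!-- Single file indexed directly; the file is already in Solr <add><doc> format. -->
        <entity name="ipod" processor="XPathEntityProcessor"
                url="c:\test\ipod_other.xml"
                useSolrAddSchema="true" stream="true"/>
      </document>
    </dataConfig>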
Re: Indexing issue in DIH - not all records are Indexed
I changed the uniqueKey and it worked fine. Thank you very much, Noble.

2009/5/18 Noble Paul നോബിള് नोब्ळ्:
> The problem is that your uniqueKey may not be unique.
>
> Just remove the entry altogether.
>
> On Mon, May 18, 2009 at 10:53 PM, jayakeerthi s wrote:
> > [quoted text of the earlier messages trimmed]
>
> --
> Noble Paul | Principal Engineer | AOL | http://aol.com
Regarding Delta-Import Query in DIH
Hi All,

I understand from the details under http://wiki.apache.org/solr/DataImportHandler that delta-import expects an additional column, *last_modified*, of timestamp type in the table.

Is there any other way the same can be achieved without creating the additional *last_modified* column in the tables? Please advise.

Thanks in advance
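Strictly, DIH does not require a column literally named last_modified; the deltaQuery just needs some SQL that identifies rows changed since ${dataimporter.last_index_time}, and if no change-tracking column exists at all, the usual fallback is a scheduled full-import. A rough sketch using one of the *_LAST_MOD_DATE columns that appear in the query earlier in this digest; the column list, table, and pk handling are assumptions and may not apply here:

    <entity name="catalog" pk="UNIQ"
            query="select CATALOG_ID, PRODUCT_CD, CATALOG_NAME,
                          CATALOG_ID || '-' || PRODUCT_CD as UNIQ
                     from prodsearch_atlasatgcombine"
            deltaQuery="select CATALOG_ID || '-' || PRODUCT_CD as UNIQ
                          from prodsearch_atlasatgcombine
                         where PRODUCT_LAST_MOD_DATE >
                               to_date('${dataimporter.last_index_time}', 'YYYY-MM-DD HH24:MI:SS')"
            deltaImportQuery="select CATALOG_ID, PRODUCT_CD, CATALOG_NAME,
                                     CATALOG_ID || '-' || PRODUCT_CD as UNIQ
                                from prodsearch_atlasatgcombine
                               where CATALOG_ID || '-' || PRODUCT_CD = '${dataimporter.delta.UNIQ}'"/>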
Problem using db-data-config.xml
Hi All,

I am facing an issue fetching records from the database when I use a placeholder value such as '${prod.prod_cd}' in db-data-config.xml. It works fine if I provide the exact value of the product code, i.e. '302437-413'.

In the db-data-config.xml I am using, the child entity query ends with a condition like:

    AND p.prod_cd = '302437-413'

The issue is: if I replace *AND prod_cd = '${prod.prod_cd}' AND reg_id = '${prod_reg.reg_id}'* with the exact value '302437-413', I get results; if not, the prod_reg and prod_reg_cmrc_styl entities are not executed.

Please advise if I am missing anything in the above db-data-config.xml.

Thanks in advance.

Regards,
Jayakeerthi
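A stripped-down sketch of the nested-entity pattern being described; apart from the prod, prod_reg, and prod_reg_cmrc_styl entity names, the table and column names are assumptions. The ${...} reference in a child query must exactly match a column name emitted by the parent entity, including case; Oracle reports unquoted column names in upper case, so the references are written in upper case here, which is exactly the kind of mismatch the follow-up message below reports fixing:

    <document>
      <entity name="prod"
              query="select p.PROD_CD, p.PROD_NM from PRODUCT p">
        <entity name="prod_reg"
                query="select r.REG_ID, r.REG_NM
                         from PROD_REGION r
                        where r.PROD_CD = '${prod.PROD_CD}'">
          <entity name="prod_reg_cmrc_styl"
                  query="select s.REG_CMRC_STYL_NM
                           from PROD_REG_CMRC_STYL s
                          where s.PROD_CD = '${prod.PROD_CD}'
                            and s.REG_ID = '${prod_reg.REG_ID}'"/>
        </entity>
      </entity>
    </document>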
Re: Problem using db-data-config.xml
Many thanks, Noble. The issue was with the case of the field names. After fixing that, the response I get for the full-import command is:

    C:\apache-solr-nightly\example\example-DIH\solr\db\conf\db-data-config.xml
    abort
    busy
    0:3:55.861
    3739
    4135
    1402
    0
    2009-06-10 13:54:22
    This response format is experimental. It is likely to change in the future.

As displayed above, 3739, 4135, and 1402 differ: the number of requests made to the datasource keeps increasing, and the number of documents processed is less than the number of rows fetched. Please advise if I am missing something here. I have attached the db-data-config.xml after modifying it.

Thanks in advance,
jayakeerthi

2009/6/9 Noble Paul നോബിള് नोब्ळ्:
> Are you sure prod_cd and reg_id are emitted by the respective entities under the same names? If not, you may need to alias those fields (using "as").
>
> Keep in mind, the field names are case sensitive. Just to see what values are emitted, use debug mode or the LogTransformer.
>
> On Wed, Jun 10, 2009 at 4:55 AM, jayakeerthi s wrote:
> > [quoted original message and db-data-config.xml (Oracle thin-driver dataSource with the prod, prod_reg, and prod_reg_cmrc_styl entities) trimmed]
>
> --
> Noble Paul | Principal Engineer | AOL | http://aol.com
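A sketch of the debugging approach mentioned in the reply, assuming a parent entity named prod; LogTransformer simply logs the values each row emits, so a case or aliasing mismatch becomes visible in the log:

    <!-- Hypothetical: log each parent row so emitted column names/values can be checked. -->
    <entity name="prod"
            transformer="LogTransformer"
            logTemplate="prod row: ${prod.PROD_CD}"
            logLevel="info"
            query="select p.PROD_CD, p.PROD_NM from PRODUCT p">
      <!-- child entities as before -->
    </entity>

On the counters themselves: with nested entities, DIH typically runs the child queries once per parent row, so "requests made to the datasource" grows much faster than the number of documents, and "rows fetched" includes child-entity rows that do not become documents of their own, which is why documents processed stays below rows fetched. That pattern is usually normal for nested entities rather than an error in itself.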
Empty results after merging index via IndexMergeTool
Hi All,

I am trying to merge two indexes using IndexMergeTool. The two indexes were created using Solr 1.4 and are fine, showing results.

I used the command below, as per http://wiki.apache.org/solr/MergingSolrIndexes#head-feb9246bab59b54c0ba361d84981973976566c2a, to merge the two indexes:

    java -cp C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\lib\lucene-core-2.9-dev.jar C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\lib\lucene-misc-2.4.1.jar org.apache.lucene.misc.IndexMergeTool C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core\data C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core1\data\index C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core2\data\index

After executing the above command, I got the output:

    Merging...
    Optimizing...
    Done.

The core data folder contains the files "_0.cfs, segments.gen, segments_2". But once I check the results from the merged data, the response is zero results, no documents found.

I am using the lucene-core-2.9-dev.jar and lucene-misc-2.4.1.jar files.

Please help resolve the issue.

Thanks in advance,
Jay
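A few things worth checking here, offered as guesses rather than a confirmed diagnosis: on Windows the two jars belong in a single -cp argument joined with ";", the first argument to IndexMergeTool is the destination index directory itself (typically ...\core\data\index rather than ...\core\data), and Solr only sees externally merged segments after a core reload or restart. Mixing a 2.9-dev core jar with a 2.4.1 misc jar may also cause index-format surprises, so matching versions of both jars is safer. A sketch of the adjusted command:

    java -cp "C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\lib\lucene-core-2.9-dev.jar;C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\lib\lucene-misc-2.4.1.jar" ^
         org.apache.lucene.misc.IndexMergeTool ^
         C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core\data\index ^
         C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core1\data\index ^
         C:\jbdevstudio\jboss-eap\jboss-as\bin\Core\core2\data\index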