So just to throw the idea out there: what would happen if I shut down and created a new solrServer on reindex? We only reindex daily. Would that force a reread of all the Lucene files?
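A hedged sketch of that idea, using the Solr 1.4-era embedded API from the quoted snippet below (class and method names here are hypothetical; the empty core name mirrors the original code, and error handling is omitted):

```java
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.core.CoreContainer;

public class SolrServerHolder {
    private CoreContainer coreContainer;
    private EmbeddedSolrServer solrServer;

    // Tear down the old container and build a fresh one, so the new
    // EmbeddedSolrServer opens new readers over the Lucene files on disk.
    public synchronized void recreate() throws Exception {
        if (coreContainer != null) {
            coreContainer.shutdown(); // closes cores and releases index readers
        }
        coreContainer = new CoreContainer.Initializer().initialize();
        solrServer = new EmbeddedSolrServer(coreContainer, "");
    }
}
```

Shutting down the container should release the old searchers, so a server recreated this way would reread the index; whether it is necessary depends on why the searcher isn't being reopened in the first place.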
John

On Tue, Jun 15, 2010 at 4:47 PM, John Ament <my.repr...@gmail.com> wrote:
> Hi all
>
> I wrote a small app using SolrJ and Solr. The app has a small wrapper that
> handles the reindexing, which was written in Groovy. The Groovy script
> generates the Solr docs, and then the Java code deletes and recreates the
> data.
>
> In a singleton EJB, we do this in the post-construct phase:
>
>     CoreContainer.Initializer initializer = new CoreContainer.Initializer();
>     coreContainer = initializer.initialize();
>     solrServer = new EmbeddedSolrServer(coreContainer, "");
>
> A method that does this can be invoked over an HTTP service to force the
> reindexing:
>
>     gse.run("search_indexer.groovy", b);
>     logger.info("Solr docs size: " + solrDocs.size());
>     solrServer.deleteByQuery("*:*");
>     solrServer.add(solrDocs);
>     solrServer.commit();
>
> We've noticed that after executing this, we see appropriate log messages
> indicating that it ran, but the search indexes do not repopulate. We're
> deployed on GlassFish v3.
>
> Any thoughts?
>
> Thanks,
>
> John
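One thing worth ruling out before recreating the server: a no-arg commit may return before a new searcher is registered. A hedged sketch of the reindex step using the two-argument commit from the SolrJ API, plus a sanity check on what the searcher actually sees (variable names mirror the snippet above; the wrapping method is hypothetical):

```java
import org.apache.solr.client.solrj.SolrQuery;

public void reindex() throws Exception {
    solrServer.deleteByQuery("*:*");
    solrServer.add(solrDocs);
    // waitFlush=true, waitSearcher=true: block until the new
    // searcher is open, so subsequent queries see the new docs.
    solrServer.commit(true, true);

    // Sanity check: query through the SAME server instance that indexed.
    SolrQuery q = new SolrQuery("*:*");
    q.setRows(0);
    long found = solrServer.query(q).getResults().getNumFound();
    logger.info("Docs visible after commit: " + found);
}
```

If this count is right but searches elsewhere are still stale, the likely cause is a second server or CoreContainer reading the same index directory that never reopens its searcher.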