Hmm, I wonder whether I *am* using an SSD or spinning disk, on AWS. :) I 
guess I can try to find out.
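
For reference, on Linux you can usually check whether a block device is 
rotational from sysfs or with lsblk. This is just a sketch; the device name 
'xvda' below is only a guess for an EC2 instance, so adjust it to whatever 
device the index actually lives on:

    # prints 1 for a spinning disk, 0 for an SSD ('xvda' is a guess)
    cat /sys/block/xvda/queue/rotational

    # or list every block device with its rotational (ROTA) flag
    lsblk -d -o NAME,ROTA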

I forgot to mention, this is with Solr 5.2.1 — is that likely to make much 
difference?

-- 
David Moles
UC Curation Center
California Digital Library

On 4/7/16, 4:19 PM, "Chris Hostetter" <hossman_luc...@fucit.org> wrote:

>
>That's a strange error to get.
>
>I can't explain why LinuxFileSystem can't load LinuxNativeDispatcher, but 
>you can probably bypass the entire situation by explicitly configuring 
>ConcurrentMergeScheduler with defaults so that it doesn't try to determine 
>whether you are using an SSD or "spinning" disk...
>
>http://lucene.apache.org/core/5_5_0/core/org/apache/lucene/index/ConcurrentMergeScheduler.html
>https://cwiki.apache.org/confluence/display/solr/IndexConfig+in+SolrConfig#IndexConfiginSolrConfig-MergingIndexSegments
>
>Something like this in your indexConfig settings...
>
>    <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>      <int name="maxMergeCount">42</int>
>      <int name="maxThreadCount">7</int>
>    </mergeScheduler>
>
>...will force those specific settings, instead of trying to guess 
>defaults.
>
>I haven't tested this, but in theory you can also use something like this 
>to indicate definitively that you are using a spinning disk (or not), but 
>let it pick the appropriate default values for the merge count & threads 
>accordingly ...
>
>    <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>      <bool name="defaultMaxMergesAndThreads">true</bool>
>    </mergeScheduler>
>
>
>
>: Date: Thu, 7 Apr 2016 22:56:54 +0000
>: From: David Moles <david.mo...@ucop.edu>
>: Reply-To: solr-user@lucene.apache.org
>: To: "solr-user@lucene.apache.org" <solr-user@lucene.apache.org>
>: Subject: Solr update fails with “Could not initialize class
>:     sun.nio.fs.LinuxNativeDispatcher”
>: 
>: Hi folks,
>: 
>: New Solr user here, attempting to apply the following Solr update command via curl
>: 
>: curl 'my-solr-server:8983/solr/my-core/update?commit=true' \
>:   -H 'Content-type:application/json' -d \
>:   '[{"my_id_field":"some-id-value","my_other_field":{"set":"new-field-value"}}]'
>: 
>: I'm getting an error response with a stack trace that reduces to:
>: 
>: Caused by: java.lang.NoClassDefFoundError: Could not initialize class sun.nio.fs.LinuxNativeDispatcher
>:     at sun.nio.fs.LinuxFileSystem.getMountEntries(LinuxFileSystem.java:81)
>:     at sun.nio.fs.LinuxFileStore.findMountEntry(LinuxFileStore.java:86)
>:     at sun.nio.fs.UnixFileStore.<init>(UnixFileStore.java:65)
>:     at sun.nio.fs.LinuxFileStore.<init>(LinuxFileStore.java:44)
>:     at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:51)
>:     at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:39)
>:     at sun.nio.fs.UnixFileSystemProvider.getFileStore(UnixFileSystemProvider.java:368)
>:     at java.nio.file.Files.getFileStore(Files.java:1461)
>:     at org.apache.lucene.util.IOUtils.getFileStore(IOUtils.java:528)
>:     at org.apache.lucene.util.IOUtils.spinsLinux(IOUtils.java:483)
>:     at org.apache.lucene.util.IOUtils.spins(IOUtils.java:472)
>:     at org.apache.lucene.util.IOUtils.spins(IOUtils.java:447)
>:     at org.apache.lucene.index.ConcurrentMergeScheduler.initDynamicDefaults(ConcurrentMergeScheduler.java:371)
>:     at org.apache.lucene.index.ConcurrentMergeScheduler.merge(ConcurrentMergeScheduler.java:457)
>:     at org.apache.lucene.index.IndexWriter.maybeMerge(IndexWriter.java:1817)
>:     at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2761)
>:     at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:2866)
>:     at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:2833)
>:     at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:586)
>:     at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:95)
>:     at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:64)
>:     at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalCommit(DistributedUpdateProcessor.java:1635)
>:     at org.apache.solr.update.processor.DistributedUpdateProcessor.processCommit(DistributedUpdateProcessor.java:1612)
>:     at org.apache.solr.update.processor.LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:161)
>:     at org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:69)
>:     at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:78)
>:     at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
>:     at org.apache.solr.core.SolrCore.execute(SolrCore.java:2064)
>:     at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)
>:     at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:450)
>:     ... 22 more
>: 
>: It looks like sun.nio.fs can't find its own classes, which seems odd. Solr is running with OpenJDK 1.8.0_77 on Amazon Linux AMI release 2016.03.
>: 
>: Does anyone know what might be going on here? Is it an OpenJDK / Amazon Linux problem?
>: 
>: --
>: David Moles
>: UC Curation Center
>: California Digital Library
>: 
>: 
>: 
>
>-Hoss
>http://www.lucidworks.com/
