Hmm…I think we shipped 4.4.0 against a pre-2.2.0 Hadoop? I'd assume that is 
the problem - for instance, the Google protobuf lib probably has to be updated 
to match what's expected by 2.2.0, at the least, if I remember right.
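
If you want to try it by hand, a rough sketch - assuming the stock Jetty
example layout, that the 4.4.0 Hadoop client jars live in the webapp's
WEB-INF/lib, and that your cluster-side install is a plain Hadoop 2.2.0
tarball (paths below are placeholders) - would be something like:

   cd example/solr-webapp/webapp/WEB-INF/lib
   # remove the bundled pre-2.2.0 Hadoop client jars and the old protobuf jar
   rm hadoop-*.jar protobuf-java-*.jar
   # copy in the matching client jars from your Hadoop 2.2.0 install
   cp /path/to/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar \
      /path/to/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar \
      /path/to/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar \
      /path/to/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar .

No promises that's the complete set of jars, but the
InvalidProtocolBufferException ("missing required fields: callId, status")
in the trace below is the classic symptom of the Hadoop RPC/protobuf wire
format not matching between the client libs and a 2.2.0 NameNode.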

- Mark


On Dec 13, 2013, at 4:41 AM, javozzo <danilo.domen...@gmail.com> wrote:

> Hi,
> I'm new to Solr.
> I'm trying to store the Solr data in a distributed filesystem (Hadoop in
> this case), following this tutorial:
>    https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS
> but I have a problem when I set this property:
>  -Dsolr.data.dir=hdfs://host:port/path
> What should the configuration be?
> I put:
>    host -> the master hadoop host
>    port -> 9000
>    path -> solr
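> (so the full property I end up passing is -Dsolr.data.dir=hdfs://master:9000/solr)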
> 
> I created the folder solr on HDFS and set its permissions to 777, but I
> get this error in my log:
> 
> ERROR - 2013-12-13 09:49:13.065; org.apache.solr.core.CoreContainer; Unable
> to create core: collection1
> org.apache.solr.common.SolrException: Problem creating directory:
> hdfs://master:9000/solr
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:835)
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:629)
>       at
> org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:622)
>       at org.apache.solr.core.CoreContainer.create(CoreContainer.java:657)
>       at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:364)
>       at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:356)
>       at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>       at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>       at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:724)
> Caused by: java.lang.RuntimeException: Problem creating directory:
> hdfs://master:9000/solr
>       at 
> org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:66)
>       at
> org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
>       at
> org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
>       at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:256)
>       at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:469)
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:759)
>       ... 13 more
> Caused by: java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Message missing required
> fields: callId, status; Host Details : local host is:
> "master-VirtualBox/127.0.1.1"; destination host is: "master":9000; 
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:761)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1239)
>       at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>       at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>       at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>       at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>       at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:630)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1559)
>       at
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:811)
>       at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1345)
>       at 
> org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:61)
>       ... 18 more
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Message
> missing required fields: callId, status
>       at
> com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
>       at 
> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:946)
>       at org.apache.hadoop.ipc.Client$Connection.run(Client.java:844)
> ERROR - 2013-12-13 09:49:13.114; org.apache.solr.common.SolrException;
> null:org.apache.solr.common.SolrException: Unable to create core:
> collection1
>       at
> org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1150)
>       at org.apache.solr.core.CoreContainer.create(CoreContainer.java:666)
>       at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:364)
>       at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:356)
>       at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>       at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>       at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:724)
> Caused by: org.apache.solr.common.SolrException: Problem creating directory:
> hdfs://master:9000/solr
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:835)
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:629)
>       at
> org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:622)
>       at org.apache.solr.core.CoreContainer.create(CoreContainer.java:657)
>       ... 10 more
> Caused by: java.lang.RuntimeException: Problem creating directory:
> hdfs://master:9000/solr
>       at 
> org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:66)
>       at
> org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
>       at
> org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
>       at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:256)
>       at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:469)
>       at org.apache.solr.core.SolrCore.<init>(SolrCore.java:759)
>       ... 13 more
> Caused by: java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Message missing required
> fields: callId, status; Host Details : local host is:
> "master-VirtualBox/127.0.1.1"; destination host is: "master":9000; 
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:761)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1239)
>       at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>       at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>       at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>       at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>       at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:630)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1559)
>       at
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:811)
>       at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1345)
>       at 
> org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:61)
>       ... 18 more
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Message
> missing required fields: callId, status
>       at
> com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
>       at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
>       at 
> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:946)
>       at org.apache.hadoop.ipc.Client$Connection.run(Client.java:844)
> 
> Any ideas?
> Thanks
> Danilo
> 
