[
https://issues.apache.org/jira/browse/HADOOP-11975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14550953#comment-14550953
]
Alan Burlison commented on HADOOP-11975:
----------------------------------------
It's not the architecture that is really the problem; the detection code above
is Linux-specific and does not work on Solaris. The other issue is the
assumption that the 32/64-bitness of the compiler output is governed by the
bitness of the OS, which isn't true on Solaris: 64-bit Solaris will quite
happily run 32-bit executables, and indeed the default output mode of gcc on
Solaris is 32-bit even when the OS is 64-bit. I'll have to do something similar
to the current vile hack involving setting CMAKE_SYSTEM_PROCESSOR; if my
reading of this comment in FindJNI is correct, it seems it's the only way to
get the correct JVM libraries detected.
# Sometimes ${CMAKE_SYSTEM_PROCESSOR} is added to the list to prefer
# current value to a hardcoded list. Remove possible duplicates.
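As a rough sketch (not the actual Hadoop build logic), one way to pick the gcc flag that matches the JVM rather than the OS is to ask the JVM for its data model and select -m32/-m64 accordingly. The sun.arch.data.model property is HotSpot-specific, and the fallback to 32-bit is an assumption chosen to match gcc's default output mode on Solaris:

```shell
#!/bin/sh
# Ask the JVM for its data model (32 or 64). sun.arch.data.model is a
# HotSpot-specific property, so this assumes a HotSpot-derived JVM.
BITS=$(java -XshowSettings:properties -version 2>&1 |
       awk '/sun.arch.data.model/ {print $3}')

# Default to 32-bit when the property is unavailable (assumed fallback,
# matching gcc's default output mode on Solaris).
case "$BITS" in
    64) CFLAGS_BITS="-m64" ;;
    *)  CFLAGS_BITS="-m32" ;;
esac

echo "Passing $CFLAGS_BITS to gcc to match the JVM"
```

The same value would then need to be fed into both CFLAGS and LDFLAGS so that compilation and the final link against libjvm.so agree on ELF class.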
> Native code needs to be built to match the 32/64 bitness of the JVM
> -------------------------------------------------------------------
>
> Key: HADOOP-11975
> URL: https://issues.apache.org/jira/browse/HADOOP-11975
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: build
> Affects Versions: 2.7.0
> Environment: Solaris
> Reporter: Alan Burlison
> Assignee: Alan Burlison
>
> When building with a 64-bit JVM on Solaris the following error occurs at the
> link stage of building the native code:
> [exec] ld: fatal: file
> /usr/jdk/instances/jdk1.8.0/jre/lib/amd64/server/libjvm.so: wrong ELF class:
> ELFCLASS64
> [exec] collect2: error: ld returned 1 exit status
> [exec] make[2]: *** [target/usr/local/lib/libhadoop.so.1.0.0] Error 1
> [exec] make[1]: *** [CMakeFiles/hadoop.dir/all] Error 2
> The compilation flags in the makefiles need to state explicitly whether 32-
> or 64-bit code is to be generated, to match the JVM.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)