[
https://issues.apache.org/jira/browse/HADOOP-11064?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14126207#comment-14126207
]
Todd Lipcon commented on HADOOP-11064:
--------------------------------------
bq. If nothing else, we should really properly version them instead of making
up wackadoodle numbers. (libhadoop.so has been 1.0.0 for how many releases now?
Probably since it was introduced!)
Agreed, but afaik Java doesn't provide a way to depend on a particular .so
version. I asked about this on Quora a few years ago:
http://www.quora.com/Is-there-a-way-to-force-Java-to-load-a-particular-soversion-of-a-JNI-dependency
and unfortunately got no real answers.
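For concreteness, here's a minimal plain-Java sketch of the limitation (not Hadoop code, and the commented-out path is illustrative): {{System.loadLibrary}} takes only a base library name, so there is nowhere to express an so-version constraint, and the only way to pin an exact version is to hard-code a full path with {{System.load}}.
{code:java}
// Plain-Java sketch of the limitation, not Hadoop code.
public class SoVersionDemo {
  public static void main(String[] args) {
    // Searches java.library.path for libhadoop.so (on Linux); the API
    // offers no way to say "major version 1 only".
    System.loadLibrary("hadoop");

    // The only way to pin an exact version is an absolute path, which
    // hard-codes platform and install-layout details (illustrative path):
    // System.load("/usr/lib/hadoop/lib/native/libhadoop.so.1.0.0");
  }
}
{code}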
So, does this mean we need to always keep binary compatibility of the
internal-facing libhadoop.so? Could we change HBase to pick up the Hadoop
libraries from HADOOP_HOME instead of bundling them? It seems like it should
either (a) bundle everything, including the native code, or (b) bundle nothing
and load everything from HADOOP_HOME. What's causing the problem is the current
mix: bundled JARs calling into system-located native code.
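If we went with (b), the loading side could look roughly like this (a rough sketch, assuming HADOOP_HOME is set and the standard lib/native layout; {{loadNativeHadoop}} is a hypothetical helper, not a proposed patch):
{code:java}
import java.io.File;

public class LoadFromHadoopHome {
  static void loadNativeHadoop() {
    // Assumes the cluster's HADOOP_HOME env var points at the install root.
    String home = System.getenv("HADOOP_HOME");
    if (home == null) {
      throw new IllegalStateException("HADOOP_HOME is not set");
    }
    // mapLibraryName("hadoop") yields libhadoop.so on Linux.
    File lib = new File(home, "lib/native/" + System.mapLibraryName("hadoop"));
    // Loads the cluster's own native library, so the JNI bindings and
    // the .so always come from the same Hadoop release.
    System.load(lib.getAbsolutePath());
  }
}
{code}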
> UnsatisfiedLinkError with hadoop 2.4 JARs on hadoop-2.6 due to NativeCrc32
> method changes
> --------------------------------------------------------------------------------------
>
> Key: HADOOP-11064
> URL: https://issues.apache.org/jira/browse/HADOOP-11064
> Project: Hadoop Common
> Issue Type: Bug
> Components: native
> Affects Versions: 2.6.0
> Environment: Hadoop 2.6 cluster, trying to run code containing hadoop
> 2.4 JARs
> Reporter: Steve Loughran
> Priority: Blocker
>
> The private native method names and signatures in {{NativeCrc32}} were
> changed in HDFS-6561 ... as a result hadoop-common-2.4 JARs get
> UnsatisfiedLinkErrors when they try to perform checksums.
> This essentially stops Hadoop 2.4 applications from running on Hadoop 2.6
> unless they are rebuilt and repackaged with the hadoop-2.6 JARs.
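To make the failure mode concrete: a {{native}} method binds lazily to a JNI symbol derived from its class and method name (plus the argument signature for overloads), so a rename or signature change in libhadoop.so leaves the old declaration unresolvable at call time. A minimal sketch, with hypothetical method names rather than the actual HDFS-6561 change:
{code:java}
package org.apache.hadoop.util;

// Hypothetical stand-in for NativeCrc32; names are illustrative only.
class NativeCrc32Example {
  // A 2.4-era declaration. On first call the JVM looks up the C symbol
  //   Java_org_apache_hadoop_util_NativeCrc32Example_nativeVerify
  // in the loaded libhadoop.so. If the 2.6 library only exports a renamed
  // or re-signatured variant, the call throws UnsatisfiedLinkError.
  private static native void nativeVerify(long buffer, int length);
}
{code}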
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)