Hi, I'm trying to write compressed data into HDFS using the Java API. The application runs on a local machine with no Hadoop services running and connects to a remote HDFS cluster. My code compresses successfully with GZip and BZip2, but with LZ4 I get the error: `native lz4 library not available`.
Some information:

The output of `hadoop checknative -a` on the namenode and datanode:

```
INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /opt/dfs/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /lib64/libsnappy.so.1
lz4:     true revision:10301
bzip2:   true /lib64/libbz2.so.1
```

I also set the following variables on all nodes:

```
HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:/opt/dfs/lib/native
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dfs/lib/native
```

Is it necessary to have the Hadoop native libraries on the machine that runs the code?

Hadoop 2.8.5
OS: CentOS 7
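For context, here is a minimal sketch of the kind of client-side write involved. The cluster address, output path, and class name are my assumptions, not necessarily the actual code; the point is that the codec is instantiated inside the client JVM, so for LZ4 the native check happens on the local machine, not on the cluster:

```java
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class CompressedHdfsWrite {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed cluster address; replace with the real namenode URI.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        // The codec runs in this (client) JVM. In Hadoop 2.x, Lz4Codec has
        // no pure-Java fallback (unlike GzipCodec/BZip2Codec), so it throws
        // "native lz4 library not available" if libhadoop is not loadable here.
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        CompressionCodec codec = factory.getCodecByClassName(
                "org.apache.hadoop.io.compress.Lz4Codec");

        try (FileSystem fs = FileSystem.get(conf);
             OutputStream out = codec.createOutputStream(
                     fs.create(new Path("/tmp/data.lz4")))) {
            out.write("hello hdfs".getBytes(StandardCharsets.UTF_8));
        }
    }
}
```

Running this requires the Hadoop client jars on the classpath and a reachable cluster; launching it with `-Djava.library.path=` pointing at a local copy of `lib/native` is what the error message is asking for.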
