Hi,
Refer to the pom.xml below; change the Hadoop version to the one you need.
The build is simple: mvn clean package
It produces a jar of about 34 MB.
Usage is simple:
java -jar ${build_jar}.jar -mkdir /user/home
java -jar ${build_jar}.jar -ls /user/home
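Since the shaded jar's main class is FsShell, it accepts the usual Hadoop generic options, so you can point it at a cluster without a full Hadoop install. A sketch, assuming a NameNode reachable at hdfs://namenode:8020 (hypothetical host) and a config directory at /etc/hadoop/conf:

```shell
# Name the NameNode explicitly via the -fs generic option (host is hypothetical):
java -jar hdfs-connector-1.0-SNAPSHOT.jar -fs hdfs://namenode:8020 -mkdir -p /user/home
java -jar hdfs-connector-1.0-SNAPSHOT.jar -fs hdfs://namenode:8020 -ls /user/home

# Or put core-site.xml/hdfs-site.xml on the classpath and invoke FsShell directly:
java -cp hdfs-connector-1.0-SNAPSHOT.jar:/etc/hadoop/conf \
  org.apache.hadoop.fs.FsShell -ls /user/home
```

Without -fs or a core-site.xml on the classpath, FsShell falls back to the default file:/// filesystem, so one of the two is needed to reach HDFS.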
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.naver.c3</groupId>
    <artifactId>hdfs-connector</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.1</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <minimizeJar>false</minimizeJar>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>org.apache.hadoop.fs.FsShell</mainClass>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
-----Original Message-----
From: "F21"<[email protected]>
To: <[email protected]>;
Cc:
Sent: 2016-08-29 (월) 14:25:09
Subject: Installing just the HDFS client
Hi all,
I am currently building a HBase docker image. As part of the bootstrap
process, I need to run some `hdfs dfs` commands to create directories on
HDFS.
The whole Hadoop distribution is pretty heavy and contains things to run
namenodes, etc. I just need a copy of the dfs client for my docker
image. I have done some poking around and see that I need to include the
files in bin/, libexec/, lib/, and share/hadoop/common and share/hadoop/hdfs.
However, including the above still takes up quite a bit of space. Is
there a single JAR I can add to my image to perform operations against HDFS?
Cheers,
Francis