Hi Richard,

I am not able to decode the issue properly from this alone; it would have been better if you had shared the PR or the failure trace as well. QQ: why do you have hadoop-common as an explicit dependency? The hadoop-common classes should already be in hadoop-client-api. I quickly checked the 3.4.0 release and I think it does have them:
```
ayushsaxena@ayushsaxena client % jar tf hadoop-client-api-3.4.0.jar | grep org/apache/hadoop/fs/FileSystem.class
org/apache/hadoop/fs/FileSystem.class
```

You didn't mention which shaded classes are being reported as missing... I think Spark uses these client jars; you can use that as an example and grab pointers from here: [1] & [2]

-Ayush

[1] https://github.com/apache/spark/blob/master/pom.xml#L1361
[2] https://issues.apache.org/jira/browse/SPARK-33212

On Thu, 11 Apr 2024 at 17:09, Richard Zowalla <[email protected]> wrote:
>
> Hi all,
>
> we are using "hadoop-minicluster" in Apache Storm to test our hdfs
> integration.
>
> Recently, we were cleaning up our dependencies and I noticed that it
> fails if I add
>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-client-api</artifactId>
>     <version>${hadoop.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-client-runtime</artifactId>
>     <version>${hadoop.version}</version>
> </dependency>
>
> and have
>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-minicluster</artifactId>
>     <version>${hadoop.version}</version>
>     <scope>test</scope>
> </dependency>
>
> as a test dependency to set up a mini-cluster for testing our storm-hdfs
> integration.
>
> This fails weirdly because of missing (shaded) classes as well as a
> class ambiguity with HttpServer2.
>
> That class is present both inside "hadoop-client-api" and within
> "hadoop-common".
>
> Is this setup wrong, or should we try something different here?
>
> Regards
> Richard

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
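P.S. Hadoop also publishes a shaded minicluster artifact, hadoop-client-minicluster, which is meant to pair with hadoop-client-api and hadoop-client-runtime (this is the combination the SPARK-33212 work linked above moved to). A sketch of how the test dependency might look instead of the unshaded hadoop-minicluster (untested here; adjust for your build):

```
<!-- Sketch, not verified against Storm's build: the shaded minicluster
     avoids pulling the unshaded hadoop-common (and its HttpServer2)
     onto the test classpath alongside hadoop-client-api. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client-minicluster</artifactId>
    <version>${hadoop.version}</version>
    <scope>test</scope>
</dependency>
```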
