[
https://issues.apache.org/jira/browse/PIG-3880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13969808#comment-13969808
]
Josh Elser commented on PIG-3880:
---------------------------------
What version of Hadoop are you using, [~medined]? I recall situations on other
projects where the dependency management expected certain artifacts to be
provided by Hadoop when the user's version didn't actually provide that jar. I
believe commons-io was one of the artifacts I was bitten by, too.
That seems a plausible explanation for what you're seeing: the
jar-with-hadoop build bundles the dependencies, so you wouldn't hit the
issue even if your local Hadoop install was missing the necessary jars.
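One quick way to test this theory is to check whether the Hadoop install actually ships commons-io, and if not, put the copy Pig's ivy build fetched onto PIG_CLASSPATH. This is only a sketch: the share/hadoop/common/lib path is an assumption based on common Hadoop 2.x layouts, and the defaults mirror the paths from the report below.

```shell
# Assumed locations; adjust to your install (paths are guesses, not canonical).
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop}"
PIG_HOME="${PIG_HOME:-$HOME/pig/trunk}"

if ls "$HADOOP_HOME"/share/hadoop/common/lib/commons-io-*.jar >/dev/null 2>&1; then
  echo "commons-io is provided by Hadoop"
else
  # Fall back to the copy Pig's ant/ivy build downloaded under build/ivy/lib/Pig/
  export PIG_CLASSPATH="$PIG_HOME/build/ivy/lib/Pig/commons-io-*.jar:$PIG_CLASSPATH"
  echo "commons-io missing from Hadoop; added Pig's ivy copy to PIG_CLASSPATH"
fi
```

If the `ls` fails, that would match the NoClassDefFoundError in the report: Pig expects Hadoop to provide the jar at runtime, and the fat jar avoids the problem only because it bundles the class itself.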
> After compiling trunk, I am seeing ClassLoaderObjectInputStream
> ClassNotFoundException.
> ---------------------------------------------------------------------------------------
>
> Key: PIG-3880
> URL: https://issues.apache.org/jira/browse/PIG-3880
> Project: Pig
> Issue Type: Bug
> Components: grunt
> Affects Versions: 0.13.0
> Reporter: David Medinets
>
> I pulled trunk from subversion using the following commands:
> mkdir pig
> cd pig
> svn co http://svn.apache.org/repos/asf/pig/trunk
> cd trunk
> ant
> export PATH=$PATH:$HOME/pig/trunk/bin
> export ACCUMULO_HOME=/opt/accumulo
> export HADOOP_HOME=/opt/hadoop
> export PIG_HOME=$HOME/pig/trunk
> export PIG_CLASSPATH="$HOME/pig/trunk/build/ivy/lib/Pig/*"
> export PIG_CLASSPATH="$ACCUMULO_HOME/lib/*:$PIG_CLASSPATH"
> cd ~
> pig
> Then I ran into this error:
> java.lang.NoClassDefFoundError:
> org/apache/commons/io/input/ClassLoaderObjectInputStream
> at org.apache.pig.Main.run(Main.java:399)
> When I changed PIG_JAR to use the fat jar, I was able to run the pig command
> without getting the exception.
--
This message was sent by Atlassian JIRA
(v6.2#6252)