Thanks for the reply. Also, is there any other ApplicationMaster example besides DistributedShell?
On Fri, Dec 16, 2011 at 4:23 AM, Hitesh Shah <[email protected]> wrote:
> The shell script is invoked within the context of a container launched by
> the NodeManager. If you are creating a directory using a relative path, it
> will be created relative to the container's working directory and cleaned
> up when the container completes.
>
> If you really want to see some output, one option could be to have your
> script create some data on HDFS, or echo output to stdout, which will be
> captured in the container logs. The stdout/stderr logs generated by your
> script should be available wherever you have configured the NodeManager's
> log dirs to point to.
>
> -- Hitesh
>
> On Dec 14, 2011, at 10:52 PM, raghavendhra rahul wrote:
>
> > When we create a directory using distributed shell, any idea where it is
> > created?
> >
> > On Thu, Dec 15, 2011 at 11:57 AM, raghavendhra rahul <[email protected]> wrote:
> > How to run any script using this? When I tried it, it shows the final status as
> > failed.
> >
> > On Thu, Dec 15, 2011 at 11:48 AM, raghavendhra rahul <[email protected]> wrote:
> > Thanks for the help. I made the mistake of creating symlinks within
> > modules. Now everything is fine.
> >
> > On Thu, Dec 15, 2011 at 11:18 AM, raghavendhra rahul <[email protected]> wrote:
> > Should I link the hadoop-yarn-applications-distributedshell-0.23.0.jar
> > also? Without linking this jar it throws the same error.
> > If linked, it shows:
> >     at org.apache.hadoop.util.RunJar.main(RunJar.java:130)
> > Caused by: java.util.zip.ZipException: error in opening zip file
> >     at java.util.zip.ZipFile.open(Native Method)
> >     at java.util.zip.ZipFile.<init>(ZipFile.java:131)
> >     at java.util.jar.JarFile.<init>(JarFile.java:150)
> >     at java.util.jar.JarFile.<init>(JarFile.java:87)
> >     at org.apache.hadoop.util.RunJar.main(RunJar.java:128)
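Hitesh's point about relative paths can be simulated locally without a cluster. The sketch below is only illustrative: a temp directory stands in for the container's working directory, and `rm -rf` stands in for the NodeManager's cleanup; the actual container paths on a real cluster are different.

```shell
#!/bin/sh
# Simulate why a directory made with a relative path "disappears":
# it is created under the container's working directory, which the
# NodeManager deletes when the container completes.
workdir=$(mktemp -d)      # stand-in for the container work dir
cd "$workdir"
mkdir -p mydir            # relative path -> lands under workdir
ls -d mydir               # prints: mydir
cd /
rm -rf "$workdir"         # NodeManager-style cleanup on completion
if [ -d "$workdir/mydir" ]; then
  echo "still there"
else
  echo "cleaned up"       # prints: cleaned up
fi
```

To persist results past container cleanup, write to HDFS instead (e.g. `hadoop fs -put result.txt /some/hdfs/path`), or rely on stdout/stderr, which are captured into the container logs under the NodeManager's configured log dirs.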
