Hi!

I was trying to implement a Hadoop/Spark audit tool, but I ran into a problem: I can't get the input file location and file name. I can get the username, IP address, time, and user command from hdfs-audit.log. But when I submit a MapReduce job, I can't see the input file location in either the Hadoop logs or the Hadoop ResourceManager. Does Hadoop have an API or log that contains this information, perhaps enabled through some configuration? If so, what should I configure?
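For reference, this is roughly how I pull the fields I mentioned out of hdfs-audit.log. It is a minimal sketch that assumes the stock FSNamesystem audit log4j layout (space-separated key=value pairs); the sample line and path are made up for illustration:

```python
import re

# Hypothetical sample line in the default hdfs-audit.log key=value format.
LINE = ("2024-01-01 12:00:00,000 INFO FSNamesystem.audit: allowed=true "
        "ugi=alice (auth:SIMPLE) ip=/10.0.0.1 cmd=open "
        "src=/data/input/part-0000 dst=null perm=null")

# Match key=value pairs; the ugi value may include a parenthesized
# "(auth:...)" suffix, so try that longer form first.
PAIR = re.compile(r"(\w+)=((?:\S+ \(auth:\w+\))|\S+)")

def parse_audit(line):
    """Return a dict of the key=value fields in one audit-log line."""
    return dict(PAIR.findall(line))

fields = parse_audit(LINE)
print(fields["cmd"], fields["src"])  # → open /data/input/part-0000
```

This gives me user, IP, time, and command per HDFS operation, but it only covers individual file accesses; I still can't tie those paths back to the MapReduce job that read them.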

Thanks.
