[ https://issues.apache.org/jira/browse/HADOOP-7596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13096546#comment-13096546 ]

Bruno Mahé commented on HADOOP-7596:
------------------------------------

{quote}
+       if start-stop-daemon --start --quiet --oknodo --pidfile ${HADOOP_PID_DIR}/hadoop-${IDENT_USER}-datanode.pid -c ${DN_USER} -x ${HADOOP_PREFIX}/sbin/hadoop-daemon.sh -- --config ${HADOOP_CONF_DIR} start datanode; then
{quote}

Out of curiosity, what is the rationale for having both ${IDENT_USER} and 
${DN_USER}?



{quote}
+    daemon --user hdfs ${HADOOP_PREFIX}/sbin/hadoop-daemon.sh --config "${HADOOP_CONF_DIR}" start datanode
{quote}

The hdfs user is still hardcoded in several locations.
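One way to avoid the hardcoding (a sketch, not part of the patch; DN_USER and the /etc/default/hadoop sourcing are assumed names) is to default the user through an overridable variable:

```shell
# Hypothetical init-script fragment: default the datanode user to "hdfs",
# but let packagers or admins override it, e.g. from /etc/default/hadoop.
# DN_USER is an assumed variable name, not something from the patch.
[ -r /etc/default/hadoop ] && . /etc/default/hadoop
DN_USER="${DN_USER:-hdfs}"
echo "datanode will run as: ${DN_USER}"
```

Each `daemon --user hdfs ...` call would then become `daemon --user "${DN_USER}" ...`.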



{quote}
${HADOOP_PREFIX}/sbin/hadoop-daemon.sh
{quote}

is used in many locations. It may be worth factoring it out into a variable.
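For instance (a hedged sketch; HADOOP_DAEMON_SH is an assumed variable name), the path could be defined once near the top of the script and reused everywhere:

```shell
# Hypothetical: factor the repeated hadoop-daemon.sh path into one variable.
HADOOP_PREFIX="${HADOOP_PREFIX:-/usr}"
HADOOP_DAEMON_SH="${HADOOP_PREFIX}/sbin/hadoop-daemon.sh"
# Later invocations would then read, e.g.:
#   daemon --user hdfs "${HADOOP_DAEMON_SH}" --config "${HADOOP_CONF_DIR}" start datanode
echo "${HADOOP_DAEMON_SH}"
```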



{quote}
-chown $HADOOP_IDENT_STRING $HADOOP_LOG_DIR 
+touch $HADOOP_LOG_DIR/.hadoop_test > /dev/null 2>&1
+TEST_LOG_DIR=$?
+if [ "${TEST_LOG_DIR}" = "0" ]; then
+  rm -f $HADOOP_LOG_DIR/.hadoop_test
+else
+  chown $HADOOP_IDENT_STRING $HADOOP_LOG_DIR 
+fi
{quote}

Isn't this behaviour dangerous if a user customizes this directory, for 
instance setting it (by mistake or knowingly) to /var/log?
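A more conservative approach (a sketch only; DEFAULT_LOG_DIR is an assumed name, and the temp directory here just makes the fragment runnable without touching /var/log) would be to chown only the directory the package itself owns, and leave any user-supplied path alone:

```shell
# Hedged sketch: refuse to take ownership unless HADOOP_LOG_DIR is still
# the packaged default, so a custom path like /var/log is never chowned.
DEFAULT_LOG_DIR="$(mktemp -d)"   # stand-in for the packaged default dir
HADOOP_LOG_DIR="${HADOOP_LOG_DIR:-$DEFAULT_LOG_DIR}"
if [ "${HADOOP_LOG_DIR}" = "${DEFAULT_LOG_DIR}" ]; then
  # Safe: this is the directory the package created itself.
  chown "$(id -un)" "${HADOOP_LOG_DIR}"
  echo "chowned default log dir"
else
  echo "custom log dir: leaving ownership alone"
fi
rmdir "${DEFAULT_LOG_DIR}"
```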




Also, in some places I see the LSB log_* functions being used, while in 
others I see "echo -n". That is out of scope for this ticket, though.



> Enable jsvc to work with Hadoop RPM package
> -------------------------------------------
>
>                 Key: HADOOP-7596
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7596
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.20.204.0
>         Environment: Java 6, RedHat EL 5.6
>            Reporter: Eric Yang
>            Assignee: Eric Yang
>             Fix For: 0.20.205.0
>
>         Attachments: HADOOP-7596-2.patch, HADOOP-7596-3.patch, 
> HADOOP-7596.patch
>
>
> For a secure Hadoop 0.20.2xx cluster, the datanode can only run with a 32-bit JVM 
> because Hadoop only packages a 32-bit jsvc.  The build process should download 
> the proper jsvc version based on the build architecture.  In addition, the shell 
> script should be enhanced to locate the Hadoop jar files in the proper location.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira