[ https://issues.apache.org/jira/browse/HADOOP-7596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13096549#comment-13096549 ]

Eric Yang commented on HADOOP-7596:
-----------------------------------

bq. Out of curiosity, what is the rationale of having a ${IDENT_USER} and ${DN_USER}?

For a secure datanode, the script should be started as the root user and then drop 
privileges to HADOOP_SECURE_DN_USER.  In that secure deployment context, 
IDENT_USER=hdfs and DN_USER=root.
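
The privilege drop above is driven by environment variables. A minimal, illustrative hadoop-env.sh excerpt for such a deployment might look like this (the directory values are assumptions about a typical packaged layout, not taken from the patch):

```shell
# Illustrative hadoop-env.sh fragment for a secure datanode (Hadoop 0.20.2xx).
# jsvc is launched as root (DN_USER=root) and drops privileges to this user:
export HADOOP_SECURE_DN_USER=hdfs
# Logs and pid files belong to the operational identity (IDENT_USER=hdfs);
# the paths below are assumed packaging defaults, adjust for your install:
export HADOOP_LOG_DIR=/var/log/hadoop/hdfs
export HADOOP_PID_DIR=/var/run/hadoop/hdfs
```

With this in place, the datanode start command must be run as root so jsvc can bind the privileged ports before switching to the hdfs user.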

bq. hdfs user is still hardcoded in several locations

For a non-secure cluster, the datanode process is hard-coded to run as the hdfs user.

bq. Isn't it a dangerous behaviour if a user customizes this directory? (and by 
mistake or knowingly sets it for instance to /var/log)

The chmod code is legacy code.  The test on the log dir is there to make it less 
destructive while maintaining backward compatibility.

> Enable jsvc to work with Hadoop RPM package
> -------------------------------------------
>
>                 Key: HADOOP-7596
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7596
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.20.204.0
>         Environment: Java 6, RedHat EL 5.6
>            Reporter: Eric Yang
>            Assignee: Eric Yang
>             Fix For: 0.20.205.0
>
>         Attachments: HADOOP-7596-2.patch, HADOOP-7596-3.patch, 
> HADOOP-7596.patch
>
>
> For a secure Hadoop 0.20.2xx cluster, the datanode can only run with a 32-bit jvm 
> because Hadoop only packages a 32-bit jsvc.  The build process should download the 
> proper jsvc version based on the build architecture.  In addition, the shell 
> script should be enhanced to locate the hadoop jar files in the proper location.
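
The architecture-based selection described in the issue could be sketched roughly as follows; the arch mapping and the artifact name are illustrative assumptions, not the actual build logic from the patch:

```shell
#!/bin/sh
# Sketch: choose a jsvc artifact matching the build machine's architecture.
# Mapping and artifact naming are assumptions for illustration only.
case "$(uname -m)" in
  x86_64) JSVC_ARCH=amd64 ;;   # 64-bit build hosts get the 64-bit jsvc
  *)      JSVC_ARCH=i386  ;;   # fall back to the 32-bit binary otherwise
esac
echo "would fetch commons-daemon jsvc for ${JSVC_ARCH}"
```

A real build would plug ${JSVC_ARCH} into the download URL for the commons-daemon jsvc tarball instead of echoing it.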

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
