[
https://issues.apache.org/jira/browse/HADOOP-7596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13096557#comment-13096557
]
Eric Yang commented on HADOOP-7596:
-----------------------------------
bq. Does this mean I'll have to adduser hdfs to run a non-secure cluster? If so
I'd much rather not.
Ravi, the hdfs user is added as part of the rpm/deb package installation. This
is common behavior for users created by rpm packages: the headless user is
preconfigured as part of the package installation process.
bq. I would be expecting an identical behavior in both init scripts and use
DN_USER/IDENT_USER for both cases
Bruno, Debian systems use a .pid file to check for process aliveness. The pid
filename generated by hadoop-daemon.sh can differ between the starting user and
the effective running user, which is why this fix applies only to the Debian
family. RedHat uses /var/lock/subsys to track process aliveness, so there is no
discrepancy between the pid file name and the user that started the process,
because the lock file has a fixed name.
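The two aliveness checks described above can be sketched roughly as follows. This is a minimal illustration, not the actual init-script code from the patch; the file paths are assumptions for the example:

```shell
#!/bin/sh
# Debian-style check: aliveness is inferred from a pid file. If the pid file
# name embeds the user that started the daemon, it can diverge from the
# effective user after jsvc drops privileges (e.g. root -> hdfs).
PIDFILE="/var/run/hadoop/hadoop-hdfs-datanode.pid"   # assumed path
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "datanode is running (pid file check)"
fi

# RedHat-style check: a fixed-name lock file under /var/lock/subsys, which
# does not depend on which user started the process.
LOCKFILE="/var/lock/subsys/hadoop-datanode"          # assumed path
if [ -f "$LOCKFILE" ]; then
    echo "datanode is running (subsys lock check)"
fi
```

The `kill -0` form sends no signal; it only tests whether the pid exists and is signalable, which is why the pid-file name mismatch matters on Debian but the fixed-name lock does not on RedHat.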
> Enable jsvc to work with Hadoop RPM package
> -------------------------------------------
>
> Key: HADOOP-7596
> URL: https://issues.apache.org/jira/browse/HADOOP-7596
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Affects Versions: 0.20.204.0
> Environment: Java 6, RedHat EL 5.6
> Reporter: Eric Yang
> Assignee: Eric Yang
> Fix For: 0.20.205.0
>
> Attachments: HADOOP-7596-2.patch, HADOOP-7596-3.patch,
> HADOOP-7596.patch
>
>
> For a secure Hadoop 0.20.2xx cluster, the datanode can only run with a
> 32-bit JVM because Hadoop only packages 32-bit jsvc. The build process
> should download the proper jsvc version based on the build architecture. In
> addition, the shell script should be enhanced to locate Hadoop jar files in
> the proper location.
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira