[ 
https://issues.apache.org/jira/browse/HADOOP-12718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15602149#comment-15602149
 ] 

John Zhuge commented on HADOOP-12718:
-------------------------------------

[[email protected]], thanks a lot for catching the mistake.

Filed HADOOP-13751, "Contract test for FS shell commands".

What do you think of placing the permission check into {{FileUtil#list}}?
{code}
  public static String[] list(File dir) throws IOException {
    if (!canRead(dir)) {
      throw new AccessControlException("Permission denied for dir: " +
          dir.toString());
    }
    String[] fileNames = dir.list();
    if (fileNames == null) {
      throw new IOException("Invalid directory or I/O error occurred for dir: "
          + dir.toString());
    }
    return fileNames;
  }
{code}
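For context on why the explicit {{canRead}} check helps: {{java.io.File#list}} signals an unreadable directory by returning null, which is indistinguishable from a nonexistent or otherwise broken directory. A minimal plain-JDK sketch (class name and temp-dir prefix are made up for the demo; {{setReadable(false)}} may be a no-op when running as root):
{code}
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class NullListDemo {
  public static void main(String[] args) throws IOException {
    File dir = Files.createTempDirectory("nulllist").toFile();
    boolean revoked = dir.setReadable(false); // may silently fail as root
    String[] names = dir.list();              // null for an unreadable dir, no exception
    // Only when read permission was actually revoked do we expect null back.
    System.out.println(!revoked || names == null);
    dir.setReadable(true);
    dir.delete();
  }
}
{code}
On POSIX systems this prints {{true}} either way; the point is that without a separate permission check, a caller cannot tell "Permission denied" apart from "No such file or directory", which is exactly the confusion this issue describes.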
Currently {{FileUtil#list}} is only called by:
{code}
        hadoop-common  (1 usage found)
            org.apache.hadoop.fs  (1 usage found)
                RawLocalFileSystem  (1 usage found)
                    listStatus(Path)  (1 usage found)
                        474: String[] names = FileUtil.list(localf);
        hadoop-hdfs  (3 usages found)
            org.apache.hadoop.hdfs.server.datanode  (2 usages found)
                BlockPoolSliceStorage  (1 usage found)
                    cleanupDetachDir(File)  (1 usage found)
                        518: if (FileUtil.list(detachDir).length != 0) {
                DataStorage  (1 usage found)
                    cleanupDetachDir(File)  (1 usage found)
                        910: if (FileUtil.list(detachDir).length != 0) {
            org.apache.hadoop.hdfs.server.datanode.fsdataset.impl  (1 usage found)
                FsVolumeImpl  (1 usage found)
                    isBPDirEmpty(String)  (1 usage found)
                        1035: if (rbwDir.exists() && FileUtil.list(rbwDir).length != 0) {
{code}
They all seem fine with the change, but I am a little reluctant to widen the scope by 
modifying a static utility function.
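
To make the intended behavior concrete, here is a self-contained analogue of the proposed helper using only the JDK ({{java.security.AccessControlException}} stands in for Hadoop's {{org.apache.hadoop.security.AccessControlException}}; the class name and temp-dir prefix are made up for the demo):
{code}
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.security.AccessControlException;

public class FileUtilListDemo {
  // Plain-JDK analogue of the proposed FileUtil#list.
  static String[] list(File dir) throws IOException {
    if (!dir.canRead()) {
      throw new AccessControlException("Permission denied for dir: " + dir);
    }
    String[] fileNames = dir.list();
    if (fileNames == null) {
      throw new IOException("Invalid directory or I/O error occurred for dir: " + dir);
    }
    return fileNames;
  }

  public static void main(String[] args) throws IOException {
    File dir = Files.createTempDirectory("fileutil").toFile();
    boolean revoked = dir.setReadable(false); // may be a no-op when running as root
    String outcome;
    try {
      list(dir);
      outcome = revoked ? "unexpected success" : "skipped (read not revoked)";
    } catch (AccessControlException e) {
      outcome = "AccessControlException"; // the informative failure mode
    } finally {
      dir.setReadable(true);
      dir.delete();
    }
    System.out.println(outcome);
  }
}
{code}
Since the callers listed above only compare {{list(...).length}} against zero, the new exception only changes what they report when the directory is genuinely unreadable, where the message becomes "Permission denied" instead of the misleading generic one.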

> Incorrect error message by fs -put local dir without permission
> ---------------------------------------------------------------
>
>                 Key: HADOOP-12718
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12718
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: John Zhuge
>            Assignee: John Zhuge
>              Labels: supportability
>         Attachments: HADOOP-12718.001.patch, HADOOP-12718.002.patch, 
> HADOOP-12718.003.patch, HADOOP-12718.004.patch, HADOOP-12718.005.patch, 
> TestFsShellCopyPermission-output.001.txt, 
> TestFsShellCopyPermission-output.002.txt, TestFsShellCopyPermission.001.patch
>
>
> When the user doesn't have access permission to the local directory, the 
> "hadoop fs -put" command prints a confusing error message "No such file or 
> directory".
> {noformat}
> $ whoami
> systest
> $ cd /home/systest
> $ ls -ld .
> drwx------. 4 systest systest 4096 Jan 13 14:21 .
> $ mkdir d1
> $ sudo -u hdfs hadoop fs -put d1 /tmp
> put: `d1': No such file or directory
> {noformat}
> It will be more informative if the message is:
> {noformat}
> put: d1 (Permission denied)
> {noformat}
> If the source is a local file, the error message is ok:
> {noformat}
> put: f1 (Permission denied)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
