[ https://issues.apache.org/jira/browse/HADOOP-8615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

thomastechs updated HADOOP-8615:
--------------------------------

    Release Note:   (was: This patch contains the fix as follows. As the bug 
report says, the EOFException thrown in DecompressorStream does not provide any 
information about the file on which decompression fails. I have added an 
overloaded constructor and the necessary methods, which take the file name as 
an additional parameter. When the user passes the file name through this 
constructor, it is included in any EOFException thrown. So I believe test 
cases may not be necessary; I was able to test locally by forcing an 
EOFException and verifying the new message: 
"java.io.EOFException: Unexpected end of input stream in the file = filename"
)
    
> EOFException in DecompressorStream.java needs to be more verbose
> ----------------------------------------------------------------
>
>                 Key: HADOOP-8615
>                 URL: https://issues.apache.org/jira/browse/HADOOP-8615
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: io
>    Affects Versions: 0.20.2
>            Reporter: Jeff Lord
>              Labels: patch
>         Attachments: HADOOP-8615-release-0.20.2.patch
>
>
> In ./src/core/org/apache/hadoop/io/compress/DecompressorStream.java
> The following exception should at least report the file it was reading 
> when the error occurred:
>   protected void getCompressedData() throws IOException {
>     checkStream();
>     int n = in.read(buffer, 0, buffer.length);
>     if (n == -1) {
>       throw new EOFException("Unexpected end of input stream");
>     }
>     ...
>   }
> This would help greatly to debug bad/corrupt files.
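
The approach described in the release note could be sketched as follows. This is a minimal, self-contained illustration, not the actual patch: the class and constructor names are assumptions, and the real DecompressorStream has additional state and methods omitted here.

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical sketch of the patch's idea: carry an optional file name so
// the EOFException can say which file was truncated or corrupt.
class FileAwareStream {
    private final InputStream in;
    private final String fileName; // null when the caller supplies none
    private final byte[] buffer = new byte[4096];

    FileAwareStream(InputStream in) {
        this(in, null);
    }

    // Overloaded constructor taking the file name, as the patch describes
    // (the signature here is an assumption for illustration).
    FileAwareStream(InputStream in, String fileName) {
        this.in = in;
        this.fileName = fileName;
    }

    int getCompressedData() throws IOException {
        int n = in.read(buffer, 0, buffer.length);
        if (n == -1) {
            // Include the file name in the message when one was provided.
            throw new EOFException(fileName == null
                ? "Unexpected end of input stream"
                : "Unexpected end of input stream in the file = " + fileName);
        }
        return n;
    }
}

public class Demo {
    public static void main(String[] args) throws IOException {
        // An empty stream forces the EOF path immediately.
        FileAwareStream s = new FileAwareStream(
            new ByteArrayInputStream(new byte[0]), "part-00000.gz");
        try {
            s.getCompressedData();
        } catch (EOFException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With the file name supplied, the caught exception message identifies the offending input, which is the debugging aid the report asks for.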

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
