[
https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
sathish updated HADOOP-9114:
----------------------------
Attachment: HADOOP-9114-001.patch
> After defining dfs.checksum.type as NULL, writing a file and calling hflush will
> throw java.lang.ArrayIndexOutOfBoundsException
> ----------------------------------------------------------------------------------------------------------------------------
>
> Key: HADOOP-9114
> URL: https://issues.apache.org/jira/browse/HADOOP-9114
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 2.0.1-alpha
> Reporter: liuyang
> Priority: Minor
> Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch
>
>
> When I tested the configuration parameter dfs.checksum.type, whose value can
> be defined as NULL, CRC32C, or CRC32: writes succeed when the value is CRC32C
> or CRC32, but the client throws java.lang.ArrayIndexOutOfBoundsException
> when the value is configured as NULL.
--
This message was sent by Atlassian JIRA
(v6.1#6144)