[ https://issues.apache.org/jira/browse/HADOOP-18950?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17943597#comment-17943597 ]

ASF GitHub Bot commented on HADOOP-18950:
-----------------------------------------

steveloughran commented on PR #4854:
URL: https://github.com/apache/hadoop/pull/4854#issuecomment-2797124945

   > If we need to support users who want to do their own Avro serialization of Hadoop classes, then I think we should abandon this PR. I think it would be far easier to just upgrade the actual Avro jars that Hadoop uses and give up on shading it.
   
   We now have a shaded Avro jar, which we use internally, right?
   
   We still have that problem in branch-3.4, though: the 1.9.2 dependency in serialization.
   
   I'm thinking: we pull all public classes related to non-shaded Avro into a hadoop-serialization module which is not bundled into Hadoop distros; it's there if you want it, but not by default.
   




> upgrade avro to 1.11.3 due to CVE
> ---------------------------------
>
>                 Key: HADOOP-18950
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18950
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: common
>            Reporter: Xuze Yang
>            Priority: Major
>              Labels: pull-request-available
>
> [https://nvd.nist.gov/vuln/detail/CVE-2023-39410]
> When deserializing untrusted or corrupted data, it is possible for a reader 
> to consume memory beyond the allowed constraints and thus lead to out of 
> memory on the system. This issue affects Java applications using Apache Avro 
> Java SDK up to and including 1.11.2. Users should update to apache-avro 
> version 1.11.3 which addresses this issue.
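
For downstream Java projects that pick up Avro transitively from Hadoop, one way to follow the advisory's advice is to pin the fixed release in their own build. A minimal Maven sketch (assuming a consumer pom.xml using the standard, non-shaded Avro artifact; adjust if you rely on a shaded relocation instead):

```xml
<!-- Force the CVE-2023-39410 fix onto the transitive Avro dependency.
     These are the standard Apache Avro coordinates; the enclosing
     project pom is hypothetical. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.11.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

`dependencyManagement` overrides the version chosen by Maven's nearest-wins transitive resolution without adding a direct dependency; verify the effective version with `mvn dependency:tree`.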



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
