singhpk234 commented on code in PR #13191:
URL: https://github.com/apache/iceberg/pull/13191#discussion_r2151100252


##########
core/src/main/java/org/apache/iceberg/rest/HTTPClient.java:
##########
@@ -94,6 +96,7 @@ public class HTTPClient extends BaseHTTPClient {
   private final ObjectMapper mapper;
   private final AuthSession authSession;
   private final boolean isRootClient;
+  private final ConcurrentMap<Class<?>, ObjectReader> objectReaderCache = Maps.newConcurrentMap();
 

Review Comment:
   I agree. I was in the same boat: I expected Jackson to cache the reader by type, but it turns out it doesn't. What Jackson does cache is:
   - deserializers, serializers, type resolvers, etc.; these are cached inside `ObjectMapper` for performance.
   
   This means the actual deserialization logic is reused and shared across threads and calls, making repeated deserialization of the same type efficient.
   
   So I proactively added a cache for the overall reader, so that per call I only work with a cheap copy of it carrying the new injectable values. I don't expect the cache to add memory pressure, since we have very few parsers.
   Happy to remove it if you feel strongly about it!
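   For illustration, here is a minimal, self-contained sketch of the pattern being discussed. `Reader` below is a hypothetical stand-in for Jackson's `ObjectReader` (not the actual Jackson API): it is built once per type via `computeIfAbsent`, and each call then works with a cheap per-request copy carrying its own injectable values, analogous to `ObjectReader.with(InjectableValues)`.
   
   ```java
   import java.util.Map;
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.ConcurrentMap;
   import java.util.concurrent.atomic.AtomicInteger;
   
   // Stand-in sketch: "Reader" is a hypothetical placeholder for ObjectReader,
   // expensive to build once per type but cheap to copy per call.
   class ObjectReaderCacheSketch {
     static final AtomicInteger builds = new AtomicInteger();
   
     static final class Reader {
       final Class<?> type;
       final Map<String, Object> injectables;
   
       Reader(Class<?> type, Map<String, Object> injectables) {
         this.type = type;
         this.injectables = injectables;
       }
   
       // Analogous to ObjectReader.with(InjectableValues): a cheap copy that
       // shares the cached base but carries request-scoped values.
       Reader withInjectables(Map<String, Object> values) {
         return new Reader(type, values);
       }
     }
   
     private final ConcurrentMap<Class<?>, Reader> cache = new ConcurrentHashMap<>();
   
     Reader readerFor(Class<?> type) {
       // computeIfAbsent guarantees the expensive build runs once per type,
       // even under concurrent callers.
       return cache.computeIfAbsent(type, t -> {
         builds.incrementAndGet();
         return new Reader(t, Map.of());
       });
     }
   }
   ```
   
   Since the cache is keyed by response class and the client parses only a handful of REST response types, the map stays small for the lifetime of the client.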






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

