Edarke commented on PR #13581:
URL: https://github.com/apache/lucene/pull/13581#issuecomment-2266355062

   > > @msokolov Sorry to interject, but the hash function `Objects.hash(level, node) % NUM_LOCKS` seems prone to lock contention if workers have power-of-2 offsets. Would it be worthwhile to avoid the `Integer[]` allocation and smear the hash instead?
   > 
   > Thanks for the interjection! I'm sorry, but I don't really understand. Which `Integer[]` allocation did you mean?
   
   Ah, sorry, I was referring to the varargs invocation of `Objects.hash(Object...)`, which implicitly boxes the primitive ints into an `Integer[]`. Not sure if that's a concern, but `Objects.hash` also produces low entropy in the lower bits, which is ordinarily fine because collections like `HashMap` smear hash codes before computing array indexes. The line in question:
   
   ```java
   int lockid = Objects.hash(level, node) % NUM_LOCKS;
   ```
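
   Something along these lines is what I had in mind (just a sketch, not a patch -- the helper name and the exact mixing step are illustrative): combine the two ints without the varargs boxing, smear the result the way `HashMap` does, and use `Math.floorMod` so a negative hash can't turn into a negative index.

   ```java
   // Sketch only: assumes level and node are ints and numLocks is the NUM_LOCKS constant above.
   private static int lockIndex(int level, int node, int numLocks) {
     int h = 31 * level + node;         // combine the two ints without any Integer[] boxing
     h ^= (h >>> 16);                   // HashMap-style smear: fold the high bits into the low bits
     return Math.floorMod(h, numLocks); // floorMod keeps the index non-negative even when h is negative
   }
   ```

   If `NUM_LOCKS` is a power of two, the modulus could also be replaced by `h & (NUM_LOCKS - 1)`, which sidesteps the sign issue entirely.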

