> However, when setting `OMP_NUM_THREADS=1` the model inference time is the same, 
> so it seems to be a problem with multiple threads.

Could there be a thread-related limitation in your PyTorch script?





---
[Visit Topic](https://discuss.tvm.ai/t/performance-of-same-op-and-workload-in-different-model-varies-differently/7766/2) to respond.
