In VTA, is it possible to run two inference tasks concurrently using Python's multithreading? I tried this, but the two tasks ended up executing serially.
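For reference, here is a minimal sketch of the kind of setup I am attempting. The library paths, input name, and shapes are placeholders, and the VTA/RPC board setup is omitted; the point is simply that each model runs in its own Python thread:

```python
import threading
import numpy as np
import tvm
from tvm.contrib import graph_executor

def run_model(lib_path, input_name, input_shape):
    # Placeholder: assumes the model was already compiled for the VTA target
    # and exported as a shared library; RPC/board programming is omitted here.
    dev = tvm.cpu(0)
    lib = tvm.runtime.load_module(lib_path)
    module = graph_executor.GraphModule(lib["default"](dev))
    data = np.random.uniform(size=input_shape).astype("float32")
    module.set_input(input_name, data)
    module.run()  # expectation: the two threads overlap here, but they run serially
    return module.get_output(0)

t1 = threading.Thread(target=run_model, args=("model_a.so", "data", (1, 3, 224, 224)))
t2 = threading.Thread(target=run_model, args=("model_b.so", "data", (1, 3, 224, 224)))
t1.start()
t2.start()
t1.join()
t2.join()
```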

---
[Visit Topic](https://discuss.tvm.apache.org/t/is-it-possible-to-run-two-inference-models-concurrently-in-vta/12910/1) to respond.