I agree. I had a lot of fun using it for Longformer, and I would love to see it 
used for more NLP applications. 

> (if it just worked slightly better)

I chatted with the TVM folks, @jknight, @jwfromm, @antinucleon, a few weeks ago, 
and they mentioned they would consider providing better support for our use 
case. Things like a pip-installable runtime and better tutorials would make TVM 
much easier to use. Better fp16 support and GEMM performance closer to 
PyTorch's are also important for making it practical.
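As a rough illustration of the kind of baseline a TVM GEMM schedule would be measured against, here is a minimal timing sketch. It uses NumPy's BLAS-backed `matmul` as a stand-in for a framework baseline like PyTorch's; the sizes and iteration counts are arbitrary choices for illustration, not numbers from this thread.

```python
import time
import numpy as np

def bench_gemm(n: int, iters: int = 10) -> float:
    """Time an n x n fp32 matmul and return achieved GFLOP/s.

    A minimal sketch: NumPy dispatches to the installed BLAS, which
    plays the role of the framework baseline being compared against.
    """
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    a @ b  # warm-up, so BLAS thread pools are initialized before timing
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    elapsed = (time.perf_counter() - start) / iters
    # One n x n GEMM performs 2 * n^3 floating-point operations.
    return 2.0 * n**3 / elapsed / 1e9

if __name__ == "__main__":
    for n in (256, 512, 1024):
        print(f"n={n}: {bench_gemm(n):.1f} GFLOP/s")
```

Running the same shapes through a compiled TVM kernel and comparing the GFLOP/s numbers is the basic shape of the comparison discussed in this topic.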





---
[Visit Topic](https://discuss.tvm.ai/t/competitive-gemm-matmul-example/5478/4) 
to respond.
