Re: [apache/tvm] RFC: initial stab at TorchScript fallback (#7401)

2022-09-28 Thread Thomas Viehmann
The fundamental problem is that (pre-compiled) PyTorch Python modules use the pre-C++11 string ABI, presumably to better blend into the Python ecosystem. TVM does not, so it needs to link to LibTorch with the "new" C++11 string ABI, and these two versions clash. One option is to use self-compiled PyTorch …
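To make the clash concrete, a minimal sketch (assuming a Linux/libstdc++ setup) of checking which string ABI an installed PyTorch wheel was built with, so a build linking LibTorch can pass the matching flag. `torch.compiled_with_cxx11_abi()` is an existing PyTorch helper; the rest is illustrative:

```python
# Minimal sketch: query the string ABI of the installed PyTorch so a
# build linking LibTorch can set the matching libstdc++ flag.
import torch

if torch.compiled_with_cxx11_abi():
    # LibTorch uses the "new" C++11 string ABI (std::__cxx11::basic_string).
    flag = "-D_GLIBCXX_USE_CXX11_ABI=1"
else:
    # Typical for pip wheels: the pre-C++11 string ABI.
    flag = "-D_GLIBCXX_USE_CXX11_ABI=0"

print("compile TVM's libtorch bridge with:", flag)
```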

Re: [apache/tvm] RFC: initial stab at TorchScript fallback (#7401)

2022-03-09 Thread Thomas Viehmann
Thank you @masahi !

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2022-03-08 Thread Thomas Viehmann
Are you on the TVM Discord or somewhere similar where we could quickly discuss this?

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2022-03-07 Thread Thomas Viehmann
Hi, so I finally rebased this and it all compiles and runs one test against current PyTorch master, so I think I'm back in business with this PR (unless it has been obsoleted, but from what I understand, the bridge is in the other direction).

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2022-01-21 Thread Thomas Viehmann
M. Ruberry of the PyTorch team re-landed the dlpack.h update in PyTorch. If this still holds next week, it'll be exciting to bring this up to date. :)

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2022-01-10 Thread Thomas Viehmann
So I thought I could wait it out, but I'll look into working around the version discrepancy in the next few weeks.

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2022-01-09 Thread Thomas Viehmann
@masahi So I had hoped to get the dlpack header version in PyTorch bumped (see the linked bug), but Facebook has internal uses that make it insist on the old one. I wonder if we could work around it by providing a "dlpack-compat" header that defines the names for the types / fields? Or I could try …
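For reference, the interop path where the two dlpack.h versions must agree on the DLTensor struct layout is the capsule exchange; a minimal round-trip using existing APIs (`torch.utils.dlpack.to_dlpack`, `tvm.nd.from_dlpack`):

```python
# The handoff that requires PyTorch's and TVM's dlpack headers to agree
# on the DLTensor struct layout and the type/device enum values.
import torch
import tvm
from torch.utils.dlpack import to_dlpack

t = torch.arange(4, dtype=torch.float32)
arr = tvm.nd.from_dlpack(to_dlpack(t))   # zero-copy view of the torch tensor
print(arr)
```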

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-11-18 Thread Thomas Viehmann
Just a quick note that when I tried to revive this back in the summer, it got a bit stalled around pytorch/pytorch#65047.

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-09-15 Thread Thomas Viehmann
So I have been mulling over the best granularity / approach. Currently I'm taking TorchScript functions / graphs as the unit I'm working with. An alternative could be to move to the PyTorch operator level (so one aten::... call), which would seem more natural in Relay, but then one would …
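To illustrate the two candidate granularities with plain torch.jit (nothing PR-specific): the current unit is the whole scripted graph; the alternative would treat each node inside it as its own fallback call:

```python
import torch

@torch.jit.script
def f(x: torch.Tensor) -> torch.Tensor:
    return torch.relu(x) + 1.0

# Graph granularity: the whole TorchScript graph is one fallback unit.
print(f.graph)

# Operator granularity: each node would become a separate fallback call.
for node in f.graph.nodes():
    print(node.kind())   # prim::Constant, aten::relu, aten::add, ...
```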

Re: [apache/tvm-rfcs] [RFC]PyTorchTVM (#25)

2021-08-27 Thread Thomas Viehmann
I wonder whether this would make the torch fallback op (https://github.com/apache/tvm/pull/7401) more or less useful (it would depend on what you (plan to) do with unsupported ops). I am still pondering whether to close it or dust it off. I should note that, as far as I know, NVIDIA has a TensorRT …

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-05-31 Thread Thomas Viehmann
> I would really appreciate getting at least your fix to solve this issue merged into upstream. Maybe in a separate PR, as this is not really related to the TorchScript use case.

I'm all for it, but I wouldn't know how to add tests in lieu of something using it. If you or @masahi have any opinions …

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-02-05 Thread Thomas Viehmann
Yeah, the general idea is to use this as the fallback. I can add the fallback generation here in the PR if that is better. Also, I added a bit of a pro/con discussion regarding single op vs. program on the forum thread; if you have opinions, I'd be very grateful if you could chime in.

Re: [apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-02-04 Thread Thomas Viehmann
> I'm curious how it integrates with the PyTorch frontend. Do we convert every op not supported to relay.torchop, run the BYOC flow to get TorchScript subgraphs, and send them to libtorch?

Sounds interesting! This is how I'd like it to work out. I've been thinking about what the best "level" is, and while …
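A hedged sketch of what routing an unconverted op through the frontend could look like: `custom_convert_map` is a real `from_pytorch` parameter, but the pass-through converter below merely stands in for one that would emit the fallback op, since the PR's exact API isn't shown in this thread:

```python
import torch
from tvm import relay

class M(torch.nn.Module):
    def forward(self, x):
        return torch.erf(x)

scripted = torch.jit.script(M())

def _erf_fallback(inputs, input_types):
    # Stand-in: a real converter would emit the torchop call from this PR;
    # here we just pass the input through so the sketch runs.
    return inputs[0]

mod, params = relay.frontend.from_pytorch(
    scripted,
    [("x", ((4,), "float32"))],
    custom_convert_map={"aten::erf": _erf_fallback},
)
print(mod)
```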

[apache/tvm] WIP/RFC: initial stab at TorchScript fallback (#7401)

2021-02-03 Thread Thomas Viehmann
This patch adds support for calling TorchScript. This can be used as a fallback for when torch operators are not yet implemented, or if one wants to incorporate bespoke PyTorch custom ops into TVM with ease. It adds
- a new relay `torchop` that takes a variable number of inputs and executes a provided …
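As a usage sketch only: the thread doesn't show how the `torchop` call is constructed, so the final line below is a hypothetical shape of the API and is kept commented out; the serialization part uses real torch.jit APIs:

```python
import io
import torch
from tvm import relay

@torch.jit.script
def bespoke(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    return x * torch.sigmoid(y)      # stand-in for a bespoke custom op

buf = io.BytesIO()
torch.jit.save(bespoke, buf)         # serialized TorchScript to hand to the op

x = relay.var("x", shape=(4,), dtype="float32")
y = relay.var("y", shape=(4,), dtype="float32")
# Hypothetical: a variadic call node carrying the serialized script.
# out = relay.op.torchop(x, y, script=buf.getvalue())
```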

Re: [apache/incubator-tvm] ROCm changed name of library and removed the old one in ROCm 3.7 release. (#6345)

2020-08-26 Thread Thomas Viehmann
Seems good to me. If we are giving up on pre-3.3 compat, I should also remove the code object v3 workaround I introduced in the spring in favour of 3.5+. (I'll send a PR.)