Merged #5421 into master.
Thanks @u99127 @tom-gall
[quote="gmagogsfm, post:1, topic:6473"]
dynamic shape
[/quote]
Nice proposal. One food for thought would be whether we can have a hybrid TF/HLO combo, which is made possible under the unified IR infra. The idea is that we could run a cut during conversion, making MLIR function as a possib…
Thanks for the nice RFC.
Happy to see folks other than us also paying attention to the MLIR-as-a-bridge design to integrate TVM as a backend for TensorFlow (or maybe more than TensorFlow ^-^).
Inside Alibaba, we are also working on related things.
To be more specific, for static-shape JIT…
Thanks for the quick review.
> It would be great if we can avoid the hack into `with_same_user`. One
> alternative would be to still pass in the `PYTEST_ADDOPTS` env variable from
> the docker env (for development purposes) but source the setup-pytest-env
> within each of the scripts.
# RFC for Relay MLIR Frontend
**Authors: Yanan Cao, Yong Wu**
**Contributors: Yida Wang, Haichen Shen, Yao Wang**
## Summary
We propose a solution that can give TVM/Relay top-notch model/op coverage for
TensorFlow with affordable effort.
## Motivation
TensorFlow, as the most dominant machine learning framework…
It would be great if we can avoid the hack into `with_same_user`. One
alternative would be to still pass in the `PYTEST_ADDOPTS` env variable from
the docker env (for development purposes) but source the setup-pytest-env
within each of the scripts.
This also makes the intention of the CI scripts…
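For concreteness, a minimal sketch of what that alternative might look like. The file paths (`tests/scripts/setup-pytest-env.sh`, `tests/scripts/task_python_unittest.sh`) and the exact options are assumptions for illustration, not the actual patch; the point is only that each task script sources the shared env itself and any `PYTEST_ADDOPTS` exported by the caller (e.g. via the docker env) is preserved.

```bash
# --- tests/scripts/setup-pytest-env.sh (assumed content, sourced by every task script) ---
# Keep any PYTEST_ADDOPTS the caller already exported (e.g. from the docker
# env during development) and prepend the defaults CI wants everywhere.
export PYTEST_ADDOPTS="-v ${PYTEST_ADDOPTS:-}"
export TVM_PATH="$(pwd)"
export PYTHONPATH="${TVM_PATH}/python:${PYTHONPATH:-}"

# --- tests/scripts/task_python_unittest.sh (assumed content) ---
# The task script sources the shared env itself instead of relying on
# with_same_user to inject it.
set -euo pipefail
source tests/scripts/setup-pytest-env.sh
python3 -m pytest tests/python/unittest
```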
I don't like my current hack of overloading `with_same_user` for sourcing this
global environment, but it seemed like the simplest hack and it worked in my
environment. Obviously I don't have CUDA testing in my CI or in my regular test
environment, so this isn't fully tested.
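For context, roughly what the hack being described might look like; this is a guess for illustration, not the actual change in the PR. The `with_same_user` wrapper sources the global pytest environment before running the wrapped command, so every script launched through it picks it up implicitly.

```bash
# docker/with_same_user (tail end, assumed shape for illustration only)
# ... set up a container user matching the host UID/GID ...
source tests/scripts/setup-pytest-env.sh   # the "overloading": global pytest env for everything
exec "$@"                                  # run the wrapped build/test command
```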
In many places, having a global pytest flag is useful. For me, with the build
and test of TVM, I would like to be able to globally pass in pytest options as
part of development or CI flows, where one would like to regularly measure
other things, including pytest coverage data…
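As a usage example of that kind of global flag: pytest picks up `PYTEST_ADDOPTS` from the environment and prepends it to every invocation, so a single run can be instrumented without touching the scripts. The task-script names below are assumptions, and `--cov` additionally requires the pytest-cov plugin.

```bash
# Collect coverage for a whole CI task script in one run
# (pytest-cov must be installed for --cov to be recognized).
PYTEST_ADDOPTS="--cov=tvm --cov-report=xml" ./tests/scripts/task_python_unittest.sh

# Or, during development, stop at the first failure and show local variables.
PYTEST_ADDOPTS="-x --showlocals" ./tests/scripts/task_python_unittest.sh
```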