Due to the restrictions of the OpenGL shading language (you cannot use shared
memory, and each invocation can only write to a single predefined output
location), we cannot get kernels as performant as those written in CUDA or OpenCL.
In this case, well-optimized WebAssembly will probably outperform WebGL
(although I am sometimes asked about web acceleration on mobile, where the
scenario is often an H5 app). For compatibility with most mobile phones, those
apps may even be limited to WebGL1 (not WebGL2). How should we handle this
situation?
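One practical way to handle the WebGL1-only phones is to feature-detect the best available context at runtime and fall back from WebGL2 to WebGL1, and finally to a pure-WASM CPU path. A minimal sketch (the function name `pickGLBackend` and the returned shape are hypothetical; the context names are the standard ones):

```javascript
// Pick the best available GPU context, falling back WebGL2 -> WebGL1 -> none.
// `canvas` is anything exposing getContext(name), e.g. an HTMLCanvasElement.
function pickGLBackend(canvas) {
  const gl2 = canvas.getContext("webgl2");
  if (gl2) return { backend: "webgl2", ctx: gl2 };
  // Older mobile browsers only expose WebGL1, sometimes under a prefixed name.
  const gl1 = canvas.getContext("webgl") ||
              canvas.getContext("experimental-webgl");
  if (gl1) return { backend: "webgl1", ctx: gl1 };
  // No GPU path at all: fall back to an optimized WASM build on the CPU.
  return { backend: "wasm", ctx: null };
}
```

An app would run this once at startup and then load whichever shader set (or the WASM-only build) matches the reported backend.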
---
[quote="mbrookhart, post:24, topic:5833"]
V2 will include graph partitioning and verification to better support bring
your own code gen, starting in on those passes now.
[/quote]
I'm very interested in this. Using composite, I have to generate many slightly
different patterns (`with_bias`, `wit
I'm interested in this topic. Having worked on frontend stuff for some time,
now I'm looking to revive my backend-fu :) Also I wanted to learn about
graphics API, this seems a good opportunity.
I also hope that
`WebGPU + WASM = win`
Although GPU support in WASM doesn't seem official yet.
I think the PR is ready for review as V1 of the pattern language, it contains
some documentation, testing, and the language itself, the matcher, and
pattern-based expression rewriting.
V2 will include graph partitioning and verification to better support bring
your own code gen, starting in on those passes now.
The TVM stack has a preliminary OpenGL backend that translates some of the
compute code into OpenGL shaders. However, due to the limitation of GLSL in
terms of its compute shader capability, we do not have the flexibility of the
other programming models such as OpenCL/CUDA when targeting OpenGL.
Hi,
I find your post very interesting; I would have expected the frontend
conversion into Relay to be quite mature and stable.
Therefore I would like to gather more info on your POV.
Would you mind elaborating on these points?
1. What models were you trying to convert?
2. From what framework were