[Apache TVM Discuss] [Development] Pass sparse tensor to tvm.build

2021-02-10 Thread Teng HUANG via Apache TVM Discuss
Yes, thanks for the hint from this post.

[Apache TVM Discuss] [Development] Pass sparse tensor to tvm.build

2021-02-10 Thread Wheest via Apache TVM Discuss
Looks like it's solved, but ping me if you have other issues with sparse stuff. I'm not as well versed as some other developers, but I have been working on it on and off for the past couple of months.

[Apache TVM Discuss] [Development] Pass sparse tensor to tvm.build

2021-02-10 Thread Teng HUANG via Apache TVM Discuss
Hi, I am struggling with sparse cmm usage. Could you please take a look at this post? https://discuss.tvm.apache.org/t/error-ndarray-object-has-no-attribute-a/9107 Thanks!

[Apache TVM Discuss] [Development] Pass sparse tensor to tvm.build

2020-09-07 Thread Wheest via Apache TVM Discuss
Useful resource, thanks. I ended up fixing the approach in my 2nd example (using three `ndarray`s). Basically, one can only pass sparse tensors that have the same sparsity pattern as the placeholders you pass, not just the same level of sparsity. Thus, when constructing the placeholder tensors, the sparsity pattern of the weights you intend to run with already has to be fixed.
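Concretely, the placeholder shapes end up being taken from the CSR arrays of the specific weight matrix you plan to run with, so the pattern is baked in at that point. A small sketch of that step (scipy only for the CSR form; sizes and names are illustrative, not my actual code):

```python
import numpy as np
import scipy.sparse as sp
from tvm import te

n, k = 16, 72
w_np = np.random.rand(n, k).astype("float32")
w_np[w_np < 0.9] = 0.0              # roughly 90% sparse weight, for illustration
w_sp = sp.csr_matrix(w_np)

# The placeholder shapes come from *this* matrix's CSR arrays, so (as noted
# above) the function built against them is tied to the pattern fixed here,
# not merely to a target density.
w_data = te.placeholder(w_sp.data.shape, dtype="float32", name="w_data")
w_indices = te.placeholder(w_sp.indices.shape, dtype="int32", name="w_indices")
w_indptr = te.placeholder(w_sp.indptr.shape, dtype="int32", name="w_indptr")
```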

[Apache TVM Discuss] [Development] Pass sparse tensor to tvm.build

2020-09-07 Thread jfm via Apache TVM Discuss
Another workaround is to get rid of the sparse placeholder completely. Instead, use three standard tensors, one for each of the arrays that make up the sparse tensor (i.e. data, indices and index pointers). Then feed those (plus the X matrix) to `topi.nn.sparse_dense` and things seem to work. There's a working implementation of this approach.
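Roughly, a minimal end-to-end sketch of this workaround (not from the original post; sizes are arbitrary, scipy is only used to generate the CSR arrays, and it assumes a TVM version where `topi.nn.sparse_dense` takes the data/indices/indptr tensors directly):

```python
import numpy as np
import scipy.sparse as sp
import tvm
from tvm import te, topi

# Illustrative sizes: X is (M, K) dense, W is (N, K) sparse CSR, Y = X * W^T
M, N, K = 8, 16, 32
W_sp = sp.random(N, K, density=0.25, format="csr", dtype="float32")

X = te.placeholder((M, K), name="X", dtype="float32")
# Three *standard* placeholders, one per CSR array of the weight
W_data = te.placeholder(W_sp.data.shape, name="W_data", dtype="float32")
W_indices = te.placeholder(W_sp.indices.shape, name="W_indices", dtype="int32")
W_indptr = te.placeholder(W_sp.indptr.shape, name="W_indptr", dtype="int32")

Y = topi.nn.sparse_dense(X, W_data, W_indices, W_indptr)   # (M, N)
s = te.create_schedule(Y.op)
f = tvm.build(s, [X, W_data, W_indices, W_indptr, Y], target="llvm")

dev = tvm.cpu()
x_np = np.random.rand(M, K).astype("float32")
y_nd = tvm.nd.array(np.zeros((M, N), dtype="float32"), dev)
f(tvm.nd.array(x_np, dev),
  tvm.nd.array(W_sp.data.astype("float32"), dev),
  tvm.nd.array(W_sp.indices.astype("int32"), dev),
  tvm.nd.array(W_sp.indptr.astype("int32"), dev),
  y_nd)

# Check against a dense matmul with the same weight
np.testing.assert_allclose(y_nd.asnumpy(), x_np @ W_sp.toarray().T, rtol=1e-4)
```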

[TVM Discuss] [Development] Pass sparse tensor to tvm.build

2020-08-28 Thread Wheest via TVM Discuss
I have also tried bypassing this issue by passing the three tensor objects inside a sparse array.

```python
from tvm.contrib import sparse

# create placeholder tensors
...

n = out_c
k = kdim_h * kdim_w * in_c
sparse_weights = sparse.placeholder((n, k), nonzeros=(1 - sparsity) * n * k, name='W')
```
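For reference, the CSR placeholder is itself a thin bundle of three ordinary tensors, which is presumably what would have to reach `tvm.build` (a small sketch, assuming `tvm.contrib.sparse.placeholder`; sizes are illustrative, not my actual code):

```python
from tvm.contrib import sparse

n, k = 64, 72
sparsity = 0.9
W = sparse.placeholder((n, k), nonzeros=int((1 - sparsity) * n * k),
                       name="W", dtype="float32")

# The CSR placeholder wraps three ordinary te tensors; these (rather than W
# itself) are what could be listed in tvm.build's argument list, e.g.
#   tvm.build(s, [data, W.data, W.indices, W.indptr, conv], target="llvm")
print(W.data.shape, W.indices.shape, W.indptr.shape)
```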

[TVM Discuss] [Development] Pass sparse tensor to tvm.build

2020-08-27 Thread Wheest via TVM Discuss
From the [discussion about running sparse CNNs](https://discuss.tvm.ai/t/running-a-cnn-using-sparsity-convertor/7267/11), I have implemented prototypes of a dense NCHW GEMM convolution, and what I think is a working CSR NCHW GEMM convolution. I will share the code once it's a bit more mature.
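In the meantime, the rough shape of the idea in plain numpy/scipy (not the prototype itself; it ignores padding, stride and layout details, and all names are illustrative): lower the NCHW convolution to a GEMM via im2col, then hold the (out_c, in_c * kh * kw) weight matrix in CSR.

```python
import numpy as np
import scipy.sparse as sp

def conv2d_nchw_gemm_csr(x, w_csr, kh, kw):
    """x: (N, C, H, W) input; w_csr: (out_c, C*kh*kw) CSR weight matrix."""
    n, c, h, w = x.shape
    out_c = w_csr.shape[0]
    oh, ow = h - kh + 1, w - kw + 1              # stride 1, no padding
    # im2col: each output position becomes a column of length C*kh*kw
    cols = np.empty((c * kh * kw, n * oh * ow), dtype=x.dtype)
    idx = 0
    for b in range(n):
        for i in range(oh):
            for j in range(ow):
                cols[:, idx] = x[b, :, i:i + kh, j:j + kw].ravel()
                idx += 1
    out = w_csr @ cols                           # sparse x dense GEMM
    return out.reshape(out_c, n, oh, ow).transpose(1, 0, 2, 3)

# 3x3 kernel, 8 input channels, 4 output channels, ~90% sparse weights
x = np.random.rand(1, 8, 16, 16).astype("float32")
w = sp.random(4, 8 * 3 * 3, density=0.1, format="csr", dtype="float32")
print(conv2d_nchw_gemm_csr(x, w, 3, 3).shape)    # (1, 4, 14, 14)
```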