[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-12 Thread ANSHUMAN TRIPATHY via TVM Discuss
@tqchen: Any thoughts on the above point?

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread ANSHUMAN TRIPATHY via TVM Discuss
Thank you, I will start a new thread about that. About the original post in this thread, I do have one small concern. Whenever we provide `export_library` a path, it always has to be accompanied by the correct extension name, like below: `mod.export_library("xx.so")`. As we are coming up with a Pack…
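For illustration, a minimal sketch of the suffix-sensitive behaviour being discussed, assuming the standard `tvm.build`/`export_library` flow of that time; the operator, file names, and target string below are illustrative rather than taken from the thread:

```python
import tvm
from tvm import te

# A tiny module just to have something to export
# (the operator, names, and target string are illustrative).
n = te.var("n")
A = te.placeholder((n,), name="A")
B = te.compute((n,), lambda i: A[i] + 1.0, name="B")
s = te.create_schedule(B.op)
mod = tvm.build(s, [A, B], target="llvm", name="add_one")

# The file suffix tells export_library how to package the result:
#   ".so"  -> compile the objects and link a shared library
#   ".tar" -> only archive the object files
mod.export_library("add_one.so")
mod.export_library("add_one.tar")

# The shared-library form can be loaded back directly on the host.
loaded = tvm.runtime.load_module("add_one.so")
```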

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread tqchen via TVM Discuss
Please start another discuss thread for new questions (on weight serialization). The current proposal does have a `package_params` option that packages the weights.
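The post only names the `package_params` option; the snippet below is a hypothetical sketch of how such a flag might look in the proposed flow, not a confirmed signature:

```python
import tvm
from tvm import relay
from tvm.relay import testing

# Hypothetical sketch: only the option name `package_params` comes from
# the post above; its exact placement and default are assumptions about
# the proposal, not documented TVM API.
mod, params = testing.mlp.get_workload(batch_size=1)
factory = relay.build(mod, target="llvm", params=params)
factory.export_library("net.so", package_params=True)

# The weights then travel inside net.so, so deployment needs only the
# single shared library plus the TVM runtime.
loaded = tvm.runtime.load_module("net.so")
```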

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread ANSHUMAN TRIPATHY via TVM Discuss
Thanks! Agreed, we can utilize rodata for that case; maybe that is for another thread of discussion. Would you please help me with the basic question I raised? What I am trying to figure out here, from the user perspective, is the standard way to save and reuse weights. As in the current thre…

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread tqchen via TVM Discuss
Note that the parameters have to be loaded into DRAM, so there is no place where we could do a partial weight load. For memory-limited scenarios like embedded devices, we would certainly need to go for a different solution, for example directly storing the weights in the rodata section to remove the ne…
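As a rough sketch of that idea (the helper name, symbol name, and layout are illustrative, not TVM API): a parameter blob emitted as a `const` C array will normally be placed by the C compiler in the read-only data section, so it can live in flash or be mapped read-only instead of being copied into the heap.

```python
import numpy as np

def emit_rodata_c(name: str, blob: bytes) -> str:
    """Render a byte blob as a const C array; being const, the compiler
    will normally place it in .rodata, so the runtime can reference it
    in place instead of copying it onto the heap."""
    body = ",".join(str(b) for b in blob)
    return (
        f"const unsigned char {name}[{len(blob)}] = {{{body}}};\n"
        f"const unsigned long {name}_len = {len(blob)};\n"
    )

# Illustrative use: serialize one weight tensor and emit it as C source.
weight = np.random.rand(16, 16).astype("float32")
with open("params_blob.c", "w") as f:
    f.write(emit_rodata_c("__tvm_param_blob", weight.tobytes()))
```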

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread ANSHUMAN TRIPATHY via TVM Discuss
@tqchen: Thank you very much for your enlightening response! I agree it will introduce an additional layer, but it may bring a performance benefit as well, even when the store is only simple objects, with FlatBuffers or, more precisely, FlexBuffers used. I was thinking of a scenario wh…

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread tqchen via TVM Discuss
It would be helpful to ask why and why not when introducing new dependencies; see some of the examples in the design decisions above. FlatBuffers could be useful when we need to serialize a complicated set of objects, but it also introduces an additional layer of abstraction. Given that we are on…

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-05-11 Thread ANSHUMAN TRIPATHY via TVM Discuss
Hi All, I was wondering whether we can use [FlatBuffers](https://google.github.io/flatbuffers/) for serializing params. That way we can customize the framework to suit us, since it is open source, and it will be target agnostic. I am working on a prototype currently; however, I wanted…
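As a rough sketch of what that could look like, assuming the pip `flatbuffers` package and its schema-less `flexbuffers` module (the `Dumps`/`Loads` helpers and the name/dtype/shape/data layout are assumptions for illustration, not part of the post or of TVM):

```python
import numpy as np
from flatbuffers import flexbuffers  # assumption: schema-less FlexBuffers helpers

def save_params(params):
    """Serialize a {name: numpy array} dict into one FlexBuffers blob."""
    obj = {
        name: {"dtype": str(a.dtype), "shape": list(a.shape), "data": a.tobytes()}
        for name, a in params.items()
    }
    return flexbuffers.Dumps(obj)

def load_params(blob):
    """Reverse of save_params: rebuild the {name: numpy array} dict."""
    obj = flexbuffers.Loads(blob)
    return {
        name: np.frombuffer(bytes(rec["data"]), dtype=rec["dtype"]).reshape(list(rec["shape"]))
        for name, rec in obj.items()
    }

params = {"fc1_weight": np.random.rand(4, 4).astype("float32")}
restored = load_params(save_params(params))
assert np.allclose(params["fc1_weight"], restored["fc1_weight"])
```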

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Zhao Wu via TVM Discuss
OK, makes sense. If all agree, we could improve our fallback way to put the TVM blob in the rodata section.

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Ramana Radhakrishnan via TVM Discuss
I wasn't proposing that as a solution; that is one of the options. I'm merely stating that this is still a problem that will hit others, most notably anyone using the C backend. Ramana

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Zhao Wu via TVM Discuss
I think I should clarify your question. Do you mean we should generate a .rodata section for `unsigned char __tvm_data_blob[]`?

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Ramana Radhakrishnan via TVM Discuss
So, the problem hasn't been fixed: there is a "solution" depending on the presence of an LLVM target. Ramana

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Zhao Wu via TVM Discuss
When we don't have LLVM, we will fall back to our original way (calling the compiler to generate it).

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-15 Thread Ramana Radhakrishnan via TVM Discuss
This won't work by default for the C backend, where we don't necessarily rely on the presence of LLVM. Or are we saying that there needs to be an LLVM solution for the backend just to produce this constant data object? So we do need a general solution. Ramana

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-12 Thread Windclarion via TVM Discuss
I got it. Thanks, FrozenGene.

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-12 Thread Zhao Wu via TVM Discuss
CUDA could also use this, because CUDA's target host is LLVM. The example I showed is in fact a CUDA target, so you can see `NVIDIA NNVM Compiler` in the constant string.

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-12 Thread Windclarion via TVM Discuss
Good solution! Thanks, FrozenGene! But if we use LLVM, LLVM-family targets can take advantage of this solution; I'm not sure whether other targets such as CUDA can use it.

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-12 Thread Zhao Wu via TVM Discuss
Thanks for responding. Finally, we don't use this special hack; we generate this directly using LLVM IR, and LLVM will put it into the `rodata` section correctly, like in this test: [screenshots from the test omitted]
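The same idea can be sketched from Python with llvmlite (an assumption chosen for illustration; TVM's actual codegen does this in its C++ LLVM backend): a constant i8 array global is what ends up in the read-only data section of the emitted object.

```python
# Assumption: llvmlite is used here only to illustrate the IR shape;
# TVM emits the equivalent IR from its C++ LLVM codegen.
import llvmlite.ir as ir

blob = b"TVM_BLOB_SIG" + bytes(16)               # placeholder payload
module = ir.Module(name="devc")
arr_ty = ir.ArrayType(ir.IntType(8), len(blob))
gv = ir.GlobalVariable(module, arr_ty, name="__tvm_dev_mblob")
gv.initializer = ir.Constant(arr_ty, bytearray(blob))
gv.global_constant = True                        # constant global -> .rodata
print(module)                                    # textual LLVM IR with the blob
```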

[TVM Discuss] [Development/RFC] [DISCUSS] Module based Model Runtime Interface

2020-04-12 Thread Windclarion via TVM Discuss
`const unsigned char __tvm_dev_mblob[46788038] = {"TVM_BLOB_SIG"};` is maybe not enough, because 46788038 bytes is too big for many embedded systems, so I have to place `__tvm_dev_mblob` in a special section, for example a rodata section. So I mean I need to declare `__tvm_dev_mblob` as `const unsigned char`…