I guess the module.so can be seen as a hash map, with the function name as the key and the function implementation as the value. The following code can be found in `graph_runtime_codegen.cc`:

```
LoweredOutput Codegen(relay::Function func) {
  auto pf = GetPackedFunc("relay.backend.GraphPlanMemory");
  storage_device_map_ = (*pf)(func);
  // First we convert all the parameters into input nodes.
  for (auto param : func->params) {
    auto node_ptr = GraphInputNode::make_node_ptr(param->name_hint(), GraphAttrs());
    var_map_[param.get()] = AddNode(node_ptr, param);
  }
  heads_ = VisitExpr(func->body);
  std::ostringstream os;
  dmlc::JSONWriter writer(&os);
  GetJSON(&writer);
  LoweredOutput ret;
  ret.graph_json = os.str();
  ret.params = params_;
  for (auto& kv : lowered_funcs_) {
    if (ret.lowered_funcs.count(kv.first) == 0) {
      ret.lowered_funcs.Set(kv.first, IRModule::Empty());
    }
    auto& mod = ret.lowered_funcs[kv.first];
    mod->Update(kv.second);
    ret.lowered_funcs.Set(kv.first, mod);
  }
  ret.external_mods = compile_engine_->LowerExternalFunctions();
  return ret;
}
```

```
struct LoweredOutput {
  std::string graph_json;
  Map<std::string, IRModule> lowered_funcs;
  Array<tvm::runtime::Module> external_mods;
  std::unordered_map<std::string, tvm::runtime::NDArray> params;
};
```

And the `IRModule` shown above has something to do with parallelism.

---

[Visit Topic](https://discuss.tvm.ai/t/execution-order-of-operators-at-runtime-in-tvm/6572/9) to respond.