```c++
Module GraphExecutorFactory::ExecutorCreate(const std::vector<Device>& devs) {
  auto exec = make_object<GraphExecutor>();
  exec->Init(this->graph_json_, this->imports_[0], devs, PackedFunc());
  // set params
  SetParams(exec.get(), this->params_);
  return Module(exec);
}
```
In the code block above, `params_` is defined as
`std::unordered_map<std::string, tvm::runtime::NDArray>` and holds a copy
of the parameters deserialized from the DSO model. After the graph executor
is created, this copy still resides in memory, which wastes a significant
amount of memory when the model is large.
Take `resnet101` as an example: its parameters are about 170.49 MB.
The picture below shows the reported memory map of a process running
inference with `resnet101`; the RSS of the anonymous mappings is about 350 MB.
![image](https://user-images.githubusercontent.com/18597737/191883103-02ab8e30-966f-4ee7-95fe-f3c3bd3b4a9d.png)
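For reference, a simple way to watch the process RSS while experimenting (an illustrative Linux-only helper, not part of this PR; the screenshots above come from the process memory map) is to read the `VmRSS` field from `/proc/self/status`:

```c++
#include <fstream>
#include <iostream>
#include <string>

// Return the current resident set size of this process in kB, or -1 if the
// field cannot be found (e.g. on a non-Linux system).
long CurrentRssKb() {
  std::ifstream status("/proc/self/status");
  std::string line;
  while (std::getline(status, line)) {
    if (line.rfind("VmRSS:", 0) == 0) {
      // Line looks like "VmRSS:   356124 kB"; parse the number after the tag.
      return std::stol(line.substr(6));
    }
  }
  return -1;
}

int main() {
  std::cout << "RSS: " << CurrentRssKb() << " kB\n";
  return 0;
}
```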

We can release `this->params_` with `this->params_.clear()` once `SetParams`
has copied the parameters into the executor's own storage. After the release,
the RSS decreases to about 180 MB.
![image](https://user-images.githubusercontent.com/18597737/191882947-59b2a1d7-f381-4c52-819f-3437bbe16041.png)
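A minimal sketch of what the change could look like, assuming the temporary map is simply cleared right after `SetParams` (the actual patch to `graph_executor_factory.cc` may differ in its details):

```c++
Module GraphExecutorFactory::ExecutorCreate(const std::vector<Device>& devs) {
  auto exec = make_object<GraphExecutor>();
  exec->Init(this->graph_json_, this->imports_[0], devs, PackedFunc());
  // set params
  SetParams(exec.get(), this->params_);
  // The executor now holds its own copy of the parameter data, so the
  // factory's temporary copy can be dropped to reclaim memory.
  this->params_.clear();
  return Module(exec);
}
```

One trade-off to keep in mind: the factory can be asked to create more than one executor from the same module, and once the cached parameters are cleared, later executors would no longer receive them, so the release may need to be made opt-in.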

-- Commit Summary --

  * [Runtime] Release temp param buffer after creating graph executor.

-- File Changes --

    M src/runtime/graph_executor/graph_executor_factory.cc (2)

-- Patch Links --

https://github.com/apache/tvm/pull/12881.patch
https://github.com/apache/tvm/pull/12881.diff
