dutZ1855 opened a new issue, #18608:
URL: https://github.com/apache/tvm/issues/18608

   ### Expected behavior
   
   TVM should be able to import and compile an ONNX model containing a `Resize` node with a **non-4D** input tensor (e.g. 3D), matching the behavior of ONNX Runtime (and of OpenVINO, which can also run this model).
   
   Per the ONNX `Resize` operator spec, `Resize` supports resizing **N-D** 
tensors (not restricted to 4D):
   
   - Spec: https://onnx.ai/onnx/operators/onnx__Resize.html
   
   ### Actual behavior
   
   For the following model,
   
   <img width="145" height="284" alt="Image" src="https://github.com/user-attachments/assets/49af8147-d1f0-429d-b2f2-3e5f65fa71af" />
   
   <img width="493" height="681" alt="Image" src="https://github.com/user-attachments/assets/00e596f3-c025-45e7-877c-1d2acbbe6177" />
   
   When importing the attached model with TVM Relax 
(`tvm.relax.frontend.onnx.from_onnx`), TVM fails during ONNX conversion with:
   
   > `AssertionError: Only resize2d is currently supported.`
   
   The failure comes from TVM's ONNX `Resize` converter (`_impl_v18` in `onnx_frontend.py`), which asserts `ndims == 4`.
   
   ```
   Error converting operator Resize, with inputs: [x, None, None, sizes]
   Traceback (most recent call last):
     File "DLCompilers/bug/tvm/resize_only_resize2d/run_repro.py", line 66, in <module>
       test(onnx_model)
       ^^^^^^^^^^^^^^^^
     File "DLCompilers/bug/tvm/resize_only_resize2d/run_repro.py", line 51, in test
       tvm_model = from_onnx(model, opset=18, keep_params_in_input=True)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4260, in from_onnx
       return g.from_onnx(graph, opset)
              ^^^^^^^^^^^^^^^^^^^^^^^^^
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3890, in from_onnx
       self._construct_nodes(graph)
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4071, in _construct_nodes
       raise err
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4066, in _construct_nodes
       op = self._convert_operator(op_name, inputs, attr, self.opset)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4166, in _convert_operator
       sym = op_function(self.bb, inputs, attrs, [self._nodes, self._params])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 2221, in _impl_v18
       assert ndims == 4, "Only resize2d is currently supported."
              ^^^^^^^^^^
   AssertionError: Only resize2d is currently supported.
   ```
   
   ONNX Runtime can execute the same model successfully.
   
   ```
   ORT run finished
   [[[0.1257]]]
   ```
   
   ### Environment
   
   Operating System: Ubuntu 22.04.4 LTS
   TVM version: 0.23.0dev
   PyTorch version: 2.9.1
   onnxruntime version: 1.23.2
   onnx version: 1.20.0
   Python version: 3.11.14
   
   ### Steps to reproduce
   
   Download the model and run the following code to obtain the results.
   
   [model.zip](https://github.com/user-attachments/files/24328158/model.zip)
   
   ```python
   import os
   import pickle
   import sys
   from pathlib import Path
   
   import onnx
   import onnxruntime
   from onnx import ModelProto
   
   # Make the local TVM checkout importable before importing tvm.
   _REPO_ROOT = Path(__file__).resolve().parents[3]
   _TVM_PYTHON = _REPO_ROOT / "tvm" / "python"
   _TVM_BUILD = _REPO_ROOT / "tvm" / "build"
   if _TVM_PYTHON.exists():
       sys.path.insert(0, _TVM_PYTHON.as_posix())
   os.environ.setdefault("TVM_LIBRARY_PATH", _TVM_BUILD.as_posix())
   
   import tvm
   from tvm import relax
   from tvm.relax.frontend.onnx import from_onnx
   
   
   def test(model: ModelProto) -> None:
       model.ir_version = 8
       model.opset_import[0].version = 18
   
       with open("oracle.pkl", "rb") as fp:
           inputs = pickle.load(fp)
   
       # Sanity check: ONNX Runtime executes the model successfully.
       try:
           ort_session = onnxruntime.InferenceSession(
               model.SerializeToString(), providers=["CPUExecutionProvider"]
           )
           feed = inputs.get("input", inputs) if isinstance(inputs, dict) else inputs
           ort_output = ort_session.run(None, feed)
           print("ORT run finished")
           for tensor in ort_output:
               print(tensor)
               print("-" * 40)
       except Exception as e:
           print(f"This model cannot be executed by onnxruntime! ({e})")
           sys.exit(1)
   
       # TVM fails here: "AssertionError: Only resize2d is currently supported."
       tvm_model = from_onnx(model, opset=18, keep_params_in_input=True)
       tvm_model = relax.transform.DecomposeOpsForInference()(tvm_model)
       tvm_model = relax.transform.LegalizeOps()(tvm_model)
   
       tvm_model, params = relax.frontend.detach_params(tvm_model)
       with tvm.transform.PassContext(opt_level=3):
           ex = tvm.compile(tvm_model, target="llvm")
   
   
   if __name__ == "__main__":
       onnx_model = onnx.load("model.onnx")
       test(onnx_model)
   ```
   
   ### Triage
   
   * needs-triage
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

