gemini-code-assist[bot] commented on code in PR #258:
URL: https://github.com/apache/tvm-ffi/pull/258#discussion_r2514673572


##########
addons/torch_c_dlpack_ext/build_backend.py:
##########
@@ -75,10 +75,15 @@ def build_wheel(
             )
         else:
             extra_args = []
-            if torch.version.cuda is not None:
-                extra_args.append("--build-with-cuda")
-            elif torch.version.hip is not None:
-                extra_args.append("--build-with-rocm")
+            # First use "torch.cuda.is_available()" to check whether a GPU environment
+            # is available. Then determine the GPU type.
+            if torch.cuda.is_available():
+                if torch.version.cuda is not None:
+                    extra_args.append("--build-with-cuda")
+                elif torch.version.hip is not None:
+                    extra_args.append("--build-with-rocm")
+                else:
+                    raise ValueError("Cannot determine whether to build with CUDA or ROCm.")

Review Comment:
   ![medium](https://www.gstatic.com/codereviewagent/medium-priority.svg)
   
   This logic to detect the available GPU backend (CUDA/ROCm) is duplicated in 
a few places:
   * `addons/torch_c_dlpack_ext/build_backend.py` (lines 80-86)
   * `python/tvm_ffi/_optional_torch_c_dlpack.py` (lines 72-78)
   * `tests/python/test_optional_torch_c_dlpack.py` (lines 49-55)
   
   To improve maintainability and avoid future inconsistencies, it would be best to extract this logic into a shared helper function that determines and returns the GPU type (`'cuda'`, `'rocm'`, or `None`) and is called from each of these locations. This would centralize the detection logic and make the code easier to read and maintain.
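   A minimal sketch of such a helper, assuming a hypothetical name `detect_gpu_backend` (the name, location, and the `torch_mod` parameter are illustrative, not part of the PR; the parameter exists only so the sketch can be exercised without a GPU):

   ```python
   def detect_gpu_backend(torch_mod=None):
       """Return the available GPU backend: "cuda", "rocm", or None.

       ``torch_mod`` defaults to the real ``torch`` module; it is a
       parameter here only so this sketch is testable without a GPU.
       """
       if torch_mod is None:
           import torch as torch_mod  # deferred import, mirroring the build-time usage

       # No GPU visible at all: neither backend applies.
       if not torch_mod.cuda.is_available():
           return None
       # torch.version.cuda / torch.version.hip identify the build flavor.
       if torch_mod.version.cuda is not None:
           return "cuda"
       if torch_mod.version.hip is not None:
           return "rocm"
       raise ValueError("Cannot determine whether to build with CUDA or ROCm.")
   ```

   Each of the three call sites could then map the returned value to its own flag (e.g. `--build-with-cuda` / `--build-with-rocm`) instead of repeating the detection chain.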



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

