TensorRTExecutionProvider error during session initialization #22199
Labels
ep:TensorRT
issues related to TensorRT execution provider
stale
issues that have not been addressed in a while; categorized by a bot
Describe the issue
I'm exporting an nnUNetV2 model from torch to onnx format using torch.dynamo_export. I can run the model with onnxruntime on the CPUExecutionProvider as well as the CUDAExecutionProvider. However, when using the TensorRTExecutionProvider I get the following error while initializing my session:
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime_src/onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc:2185 SubGraphCollection_t onnxruntime::TensorrtExecutionProvider::GetSupportedList(SubGraphCollection_t, int, int, const onnxruntime::GraphViewer&, bool*) const graph_build.Resolve().IsOK() was false.
I've found some issues describing this error, and people mentioned it could come from TensorRT not handling dynamic shapes. I therefore exported my model with fixed shapes, but that did not solve the issue.
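The fixed-shape export I tried looks roughly like this (a sketch only: TinyNet is a dummy stand-in for the nnUNetV2 network, and the input shape is illustrative):

```python
import torch

class TinyNet(torch.nn.Module):  # placeholder for the nnUNetV2 network
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x)

def export_fixed_shape(path: str = "model_fixed_shapes.onnx") -> None:
    model = TinyNet().eval()
    # One concrete example input, so the exported graph has static shapes
    # (no dynamic axes).
    example = torch.randn(1, 1, 8, 8, 8)
    onnx_program = torch.onnx.dynamo_export(model, example)
    onnx_program.save(path)
```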
Moreover, when running the model through the polygraphy library, there seems to be no problem:
polygraphy run checkpoint_final_torch2.4.1+cu121_onnx1.16.1.onnx --trt
I get the following
Many thanks for your help!
To reproduce
Ubuntu 22.04
Cuda 12.2
Cudnn 9.4.0
TensorRT 10.2.0
onnxruntime-gpu 1.19.2
onnx 1.16.1 (for model export)
torch 2.4.1 (for model export)
Urgency
No response
Platform
Linux
OS Version
Ubuntu 22.04
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.19.0
ONNX Runtime API
Python
Architecture
X64
Execution Provider
TensorRT
Execution Provider Library Version
TensorRT 10.2