
TensorRTExecutionProvider error during session initialization #22199

Open
cabinader opened this issue Sep 24, 2024 · 3 comments
Labels
ep:TensorRT (issues related to TensorRT execution provider)
stale (issues that have not been addressed in a while; categorized by a bot)

Comments

@cabinader

Describe the issue

I'm exporting an nnUNetV2 model from torch to ONNX format using torch.onnx.dynamo_export. I can run the model with onnxruntime on the CPUExecutionProvider as well as the CUDAExecutionProvider. However, when using the TensorRTExecutionProvider, I get the following error while initializing my session:

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime_src/onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc:2185 SubGraphCollection_t onnxruntime::TensorrtExecutionProvider::GetSupportedList(SubGraphCollection_t, int, int, const onnxruntime::GraphViewer&, bool*) const graph_build.Resolve().IsOK() was false.
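For reference, the failing session is set up roughly like this; the model path, engine-cache options, and fallback order are illustrative, not the exact code from my script:

```python
# Illustrative sketch of the session setup that triggers the error.
# The provider options and model path are assumptions, not my exact code.
providers = [
    ("TensorRTExecutionProvider", {
        "trt_engine_cache_enable": True,   # cache built engines between runs
        "trt_engine_cache_path": "./trt_cache",
    }),
    "CUDAExecutionProvider",   # fallback for subgraphs TensorRT rejects
    "CPUExecutionProvider",
]

# import onnxruntime as ort
# ort.set_default_logger_severity(0)  # verbose logs show which subgraph fails
# sess = ort.InferenceSession("checkpoint_final.onnx", providers=providers)
```

With CUDAExecutionProvider or CPUExecutionProvider alone the same session initializes fine; the exception only appears once TensorRTExecutionProvider is first in the list.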

I've found some issues describing this error where people suggested it might come from TensorRT not handling dynamic shapes. I therefore exported my model with fixed shapes, but it did not solve the issue.
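For reference, a fixed-shape export via the classic exporter would look roughly like the following; the 5D input shape is an assumed placeholder (nnUNetV2 patch sizes come from the plans file), and `model` stands for the loaded network:

```python
# Illustrative fixed-shape export; the input shape below is an assumption,
# not the actual nnUNetV2 patch size.
static_shape = (1, 1, 128, 128, 128)  # batch, channels, depth, height, width

# import torch
# dummy = torch.randn(*static_shape)
# torch.onnx.export(model, dummy, "model_static.onnx",
#                   opset_version=17,
#                   input_names=["input"], output_names=["output"])
# # Omitting dynamic_axes makes every dimension static.
```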

Moreover, when using the polygraphy library there seems to be no problem:

polygraphy run checkpoint_final_torch2.4.1+cu121_onnx1.16.1.onnx --trt

I get the following:

[screenshot: polygraphy output]

Many thanks for your help!

To reproduce

Ubuntu 22.04
Cuda 12.2
Cudnn 9.4.0
TensorRT 10.2.0
onnxruntime-gpu 1.19.2
onnx 1.16.1 (for model export)
torch 2.4.1 (for model export)

Urgency

No response

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.19.0

ONNX Runtime API

Python

Architecture

X64

Execution Provider

TensorRT

Execution Provider Library Version

TensorRT 10.2

@snnn snnn added the ep:TensorRT issues related to TensorRT execution provider label Sep 24, 2024
@chilo-ms
Contributor

Hi, could you share the model so that we can reproduce it on our side?

@cabinader
Author

cabinader commented Oct 4, 2024

Sorry for the late reply. I can't share the original model for confidentiality reasons. However, I could reproduce the issue on the exact same architecture with a publicly available model. Here's a link to download it: https://we.tl/t-1xrZOLot4p


github-actions bot commented Nov 3, 2024

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Nov 3, 2024
3 participants