Failed to get (TensorRT) engine for RT-DETR #560

Open
prashant-dn opened this issue Aug 14, 2024 · 1 comment

@prashant-dn

Hi,

I have an RT-DETR-L model trained with Ultralytics, so I used the ultralytics-rtdetr export script to produce the ONNX file. Since I want a dynamic batch size, I ran

python3 rtdetr_export.py -w best.pt -s 384 640 --simplify --dynamic 

This completes normally. However, when I try to build the engine from this ONNX file with

/usr/src/tensorrt/bin/trtexec --onnx=best.onnx \
--minShapes=input:1x3x384x640 \
--optShapes=input:2x3x384x640 \
--maxShapes=input:40x3x384x640 \
--fp16 \
--saveEngine=engines/best.engine

I get the following error:

[08/14/2024-01:28:47] [W] [TRT] onnx2trt_utils.cpp:364: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[08/14/2024-01:28:47] [W] [TRT] onnx2trt_utils.cpp:390: One or more weights outside the range of INT32 was clamped
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:720: While parsing node number 2 [Pad -> "/0/model.0/Pad_output_0"]:
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:721: --- Begin node ---
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:722: input: "/0/model.0/stem1/act/Relu_output_0"
input: "/0/model.0/Reshape_1_output_0"
input: ""
output: "/0/model.0/Pad_output_0"
name: "/0/model.0/Pad"
op_type: "Pad"
attribute {
  name: "mode"
  s: "constant"
  type: STRING
}

[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:723: --- End node ---
[08/14/2024-01:28:47] [E] [TRT] ModelImporter.cpp:726: ERROR: builtin_op_importers.cpp:2990 In function importPad:
[8] Assertion failed: inputs.at(2).is_weights() && "The input constant_value is required to be an initializer."
[08/14/2024-01:28:47] [E] Failed to parse onnx file
[08/14/2024-01:28:47] [I] Finish parsing network model
[08/14/2024-01:28:47] [E] Parsing model failed
[08/14/2024-01:28:47] [E] Engine creation failed
[08/14/2024-01:28:47] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8001] # /usr/src/tensorrt/bin/trtexec --onnx=best.onnx --minShapes=input:1x3x384x640 --optShapes=input:2x3x384x640 --maxShapes=input:40x3x384x640 --fp16 --saveEngine=engines/best.engine
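
From the assertion it looks like the parser requires the Pad node's third input (constant_value) to be a graph initializer, but in my export it arrives as a runtime tensor. Constant-folding the ONNX before running trtexec might work around this; here is a minimal sketch, assuming onnx and onnx-graphsurgeon (with onnxruntime) are installed:

import onnx
import onnx_graphsurgeon as gs

# Load the exported model and fold constant subgraphs so that values such as
# the Pad node's constant_value become initializers instead of runtime tensors.
graph = gs.import_onnx(onnx.load("best.onnx"))
graph.fold_constants()
graph.cleanup()

# Save the folded model and point trtexec at it instead of best.onnx.
onnx.save(gs.export_onnx(graph), "best_folded.onnx")

I haven't confirmed that this resolves the Pad import on TensorRT 8.0.1, so treat it as a guess rather than a verified fix.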
@marcoslucianops
Owner

Does it work with the official weights?
