Error when running Deepstream and Tensorrt 10.x #563
As far as I know, DeepStream doesn't support other versions of TensorRT, only the version it's compiled against.
Can we support TensorRT 10.x, or, as you said above, only the version it's compiled against? Another question: I get

ERROR: [TRT]: 1: [stdArchiveReader.cpp::StdArchiveReaderInitCommon::46] Error Code 1: Serialization (Serialization assertion stdVersionRead == serializationVersion failed.Version tag does not match. Note: Current Version: 236, Serialized Engine Version: 239)
Could not parse the ONNX model

I see the current version is 236; how can I update this version to 237, 239, or something matching another TensorRT version? Thanks @marcoslucianops. I see version 7.0
TRT 10.3 is the default for DeepStream 7.1. I will add support this week. The TRT version for DeepStream depends on the NVIDIA release, so we should use exactly the same CUDA/TRT they use for each DeepStream version.
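The version tag in the error above cannot be edited; it is written into the engine file when TensorRT serializes it, and deserialization asserts it matches the runtime's own tag. The only fix is to rebuild the engine with the same TensorRT version that will load it. A minimal sketch of that check, with illustrative names (not TensorRT's actual API):

```python
# Hypothetical sketch: a serialized engine is tied to the TensorRT version
# that built it. Deserialization compares the tag stored in the engine file
# against the runtime's own tag and refuses on mismatch, which is what
# "stdVersionRead == serializationVersion failed" reports.
def can_deserialize(engine_tag: int, runtime_tag: int) -> bool:
    # Tags are opaque integers bumped per TensorRT release; only an exact
    # match is accepted.
    return engine_tag == runtime_tag

# Matches the error in the question: engine serialized with tag 239,
# runtime expects 236 -> the engine must be rebuilt, not patched.
assert can_deserialize(236, 236)
assert not can_deserialize(239, 236)
```

In practice this means regenerating the `.engine` file (e.g. with trtexec, or by letting DeepStream rebuild it from the ONNX) inside the exact container or environment that runs the pipeline.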
I want to upgrade to TensorRT 10.x so that INT64 is not automatically cast to INT32.
But after upgrading to TensorRT 10.x, the ONNX model can no longer be automatically converted to an engine.
I tried exporting the engine from ONNX with trtexec in the same environment I built for running DeepStream:
/usr/src/tensorrt/bin/trtexec --onnx=yolov8_warehouse_7_class.onnx --saveEngine=yolov8_warehouse_7_class.engine
But this error occurred:
ERROR: [TRT]: 4: [runtime.cpp::deserializeCudaEngineEx::113] Error Code 4: Internal Error (Cannot deserialize engine with lean runtime since IRuntime::getEngineHostCodeAllowed() is false.)
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1533 Deserialize engine failed from file: /ds_app/models/infer/engine/yolov8_warehouse_7_class.engine
ERROR: [TRT]: ModelImporter.cpp:949: While parsing node number 0 [Conv -> "/0/model.0/conv/Conv_output_0"]:
ERROR: [TRT]: ModelImporter.cpp:950: --- Begin node ---
input: "input"
input: "0.model.0.conv.weight"
input: "0.model.0.conv.bias"
output: "/0/model.0/conv/Conv_output_0"
name: "/0/model.0/conv/Conv"
op_type: "Conv"
attribute {
name: "dilations"
ints: 1
ints: 1
type: INTS
}
attribute {
name: "group"
i: 1
type: INT
}
attribute {
name: "kernel_shape"
ints: 3
ints: 3
type: INTS
}
attribute {
name: "pads"
ints: 1
ints: 1
ints: 1
ints: 1
type: INTS
}
attribute {
name: "strides"
ints: 2
ints: 2
type: INTS
}
ERROR: [TRT]: ModelImporter.cpp:951: --- End node ---
ERROR: [TRT]: ModelImporter.cpp:954: ERROR: onnxOpImporters.cpp:775 In function importConv:
[8] Assertion failed: (nbSpatialDims == kernelWeights.shape.nbDims - 2): The number of spatial dimensions and the kernel shape doesn't match up for the Conv operator. Number of spatial dimensions = 5, number of kernel dimensions = 4.
Could not parse the ONNX model
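The failing assertion in importConv compares the number of spatial dimensions implied by the input against the kernel weight rank. A simplified sketch of that consistency check (an assumption distilled from the assertion text, not TensorRT's actual code):

```python
# Simplified sketch of the importConv assertion
# (nbSpatialDims == kernelWeights.shape.nbDims - 2).
# A Conv kernel is laid out as [out_channels, in_channels, *spatial], so a
# rank-4 kernel encodes a 2-D convolution; the input tensor must supply the
# same number of spatial dims after its batch and channel dims.
def conv_shapes_consistent(nb_spatial_dims: int, kernel_rank: int) -> bool:
    return nb_spatial_dims == kernel_rank - 2

assert conv_shapes_consistent(2, 4)      # NCHW input with OIHW kernel: valid 2-D Conv
assert not conv_shapes_consistent(5, 4)  # the reported failure: 5 spatial dims vs rank-4 kernel
```

Since the node dump above is an ordinary 2-D Conv (3x3 kernel, rank-4 weights), a spatial-dimension count of 5 suggests the parser misread the input rank, which typically points to an ONNX export or parser-version mismatch rather than a genuinely malformed layer.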