When I load the model using the TensorRT model file generated by TF-TRT, the console displays the following messages:
2022-07-13 16:42:54.914735: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)
2022-07-13 16:42:54.915035: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger INVALID_STATE: std::exception
2022-07-13 16:42:54.915065: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger INVALID_CONFIG: Deserialize the cuda engine failed.
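For context, the model is loaded in the usual way for a TF-TRT SavedModel; a minimal sketch of the loading step (the path and input shape are placeholders, not my exact values) looks like this, and the errors above appear when the signature is invoked:

```python
import tensorflow as tf

# Placeholder path to the TF-TRT converted SavedModel.
saved_model_dir = "/path/to/trt_saved_model"
loaded = tf.saved_model.load(saved_model_dir)

# Fetch the default serving signature and run one inference.
infer = loaded.signatures["serving_default"]
input_name = list(infer.structured_input_signature[1].keys())[0]
dummy = tf.zeros([1, 224, 224, 3], dtype=tf.float32)  # placeholder shape
outputs = infer(**{input_name: dummy})
print({k: v.shape for k, v in outputs.items()})
```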
My system environment is as follows:
CentOS = 7.9.2009
Python = 3.7.6
TensorFlow = 2.4
NVIDIA Driver = 495.29.05
CUDA = 11.0
cuDNN = 8.0
TensorRT = 7.2.1
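Since the "Version tag does not match" error usually points to a mismatch between the TensorRT version TensorFlow was built against and the one installed at runtime, the versions can be cross-checked with something like the snippet below. Note that _pywrap_py_utils is a private TensorFlow module, so this is only an assumption that the helper is available in this build:

```python
# Sketch of a diagnostic: compare the TensorRT version TensorFlow was
# compiled against with the one loaded at runtime. _pywrap_py_utils is a
# private module, not a supported public API.
from tensorflow.python.compiler.tf2tensorrt import _pywrap_py_utils as trt_utils

print("linked TensorRT:", trt_utils.get_linked_tensorrt_version())
print("loaded TensorRT:", trt_utils.get_loaded_tensorrt_version())
```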
Regarding the TensorRT model file: I generated it with tf.experimental.tensorrt.Converter(). When I checked the assets folder under the save path, it contained many similarly named trt-serialized-engine.TRTEngineOp files, which were not present when I previously saved the model with tf.saved_model.save().
I am not sure whether my TensorRT model file was generated correctly. Could you provide a way to verify this?
Below is the code of my conversion process:
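The snippet below is a minimal sketch of the tf.experimental.tensorrt.Converter workflow described above; the paths, precision mode, and input shape are placeholders rather than the exact values I used:

```python
import numpy as np
import tensorflow as tf

input_saved_model_dir = "/path/to/original_saved_model"   # placeholder
output_saved_model_dir = "/path/to/trt_saved_model"       # placeholder

# Convert the SavedModel with TF-TRT (FP16 is only an example).
params = tf.experimental.tensorrt.ConversionParams(precision_mode="FP16")
converter = tf.experimental.tensorrt.Converter(
    input_saved_model_dir=input_saved_model_dir,
    conversion_params=params)
converter.convert()

# Optionally pre-build the TensorRT engines with representative inputs;
# this step produces the trt-serialized-engine.TRTEngineOp assets.
def input_fn():
    yield (np.zeros((1, 224, 224, 3), dtype=np.float32),)  # placeholder shape

converter.build(input_fn=input_fn)
converter.save(output_saved_model_dir)
```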