How to use TensorRT in trained model #45
Then converting the input to the following format would be suitable:
You're right. You need to change the model's input (and any other code using dictionaries) so that the model forwards pure tensors.
Thanks for your reply. I will try to deal with it.
On my machine, the average inference time is approximately 10 ms per scenario.
How are the operators in torch_geometric converted to ONNX?
PyG and ONNX don't work very well together, especially with functions like torch
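One common friction point is that PyG's aggregation is built on scatter-style ops, which the ONNX exporter has not always handled well. A frequently used workaround is to express a sum aggregation with plain `torch.Tensor.index_add_`, which the tracer can export. This is a sketch; `scatter_sum` is a hypothetical helper name, not a PyG API:

```python
import torch


def scatter_sum(src: torch.Tensor, index: torch.Tensor, dim_size: int) -> torch.Tensor:
    """Sum rows of `src` into `dim_size` buckets selected by `index`.

    A plain-torch stand-in for a scatter/sum aggregation, written only
    with ops the ONNX exporter can trace.
    """
    out = torch.zeros(dim_size, src.size(-1), dtype=src.dtype, device=src.device)
    return out.index_add_(0, index, src)


src = torch.tensor([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
index = torch.tensor([0, 1, 0])
print(scatter_sum(src, index, 2))  # rows 0 and 2 land in bucket 0: [[6., 8.], [3., 4.]]
```

Whether this is needed depends on the opset version you export with; newer opsets support more of the scatter family natively.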
Hi @SunHaoOne, have you produced the ONNX model? Could you please share some ideas on the TensorRT process?
I encountered some issues while rewriting this code:
@xiaowuge1201 Hi, have you found a solution? I hit the same problem with dense computation: inference is very slow with the ONNX CPU version, and my GPU memory is not enough to run the ONNX GPU version.
I am going to use TensorRT to accelerate my inference step.
However, there are several issues: for example, the input data is a dict, so the model cannot be converted to ONNX directly.
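For reference, once a pure-tensor ONNX file has been produced, one typical route to a TensorRT engine is the `trtexec` tool that ships with TensorRT. This is only a sketch: the filenames are placeholders, and `--fp16` is optional.

```shell
# Build a serialized TensorRT engine from the exported ONNX model.
# "model.onnx" / "model.engine" are placeholder filenames.
trtexec --onnx=model.onnx \
        --saveEngine=model.engine \
        --fp16   # optional: enable FP16 kernels for extra speed
```

The resulting engine file can then be deserialized with the TensorRT runtime API for inference.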