onnx export #59

Open
GallonDeng opened this issue May 18, 2024 · 4 comments

Comments

@GallonDeng

Great work! Since RADIO can be used to get SOTA feature representations, it could be very useful for tasks such as image retrieval or video retrieval. Also, is there an example of ONNX model export for RADIO, so that it could be easily integrated into the downstream tasks mentioned above?

@mranzinger
Collaborator

We have an overly complicated example of doing an ONNX export across the suite of models we analyzed: https://github.com/NVlabs/RADIO/blob/main/examples/count_params.py (just uncomment the code around the ONNX export). Have you run into problems using it? I think with opset=17 it should hopefully be pretty straightforward.
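
For reference, here is a minimal sketch of such an export, assuming the torch.hub entry point described in the repo README; the version string, input resolution, output names, and file name below are illustrative assumptions rather than values taken from count_params.py:

```python
import torch

# Load a RADIO model through torch.hub (the version string is an assumption;
# see the repo README for the currently supported names).
model = torch.hub.load('NVlabs/RADIO', 'radio_model',
                       version='e-radio_v2', progress=True)
model.eval()

# RADIO expects image tensors scaled to [0, 1]; 512x512 is just an example.
# E-RADIO variants may additionally need a fixed window size configured for
# the chosen resolution (see the repo README).
dummy = torch.rand(1, 3, 512, 512)

torch.onnx.export(
    model,
    dummy,
    'eradio.onnx',
    export_params=True,
    opset_version=17,                      # as suggested above
    input_names=['input'],
    output_names=['summary', 'features'],  # RADIO returns (summary, spatial features)
    dynamic_axes={'input': {0: 'batch'}},  # keep H/W fixed, batch dynamic
)
```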

@GallonDeng
Author

Thanks a lot @mranzinger; I tried count_params.py and successfully exported an ONNX model for E-RADIO2. Two questions:

1. What is the minimum TensorRT version needed to convert the ONNX model to a TRT engine? It fails with TensorRT 8.4.
2. The exported default E-RADIO2 ONNX model is still too large to visualize with Netron. How can I use a smaller version of E-RADIO2, given that the codebase registers many E-RADIO2 variants with different configurations?

@mranzinger
Collaborator

Good questions. @gheinrich is much more of an expert on E-RADIO + TensorRT things. Greg, could you shed some light on this?

@gheinrich
Collaborator

Hello, I used TensorRT 9.0 to build the engine for E-RADIO2. If you're looking for a smaller version of E-RADIO, you can use eradio_xxxtiny, though it might not be much easier to visualize: it is not shallower than the full-size version, only slimmer.
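
For anyone else hitting the TensorRT 8.4 failure, here is a minimal sketch of building an engine from the exported ONNX file with the TensorRT 9.x Python API. The file names and the 4 GiB workspace limit are illustrative assumptions, not values from this repo; `trtexec --onnx=... --saveEngine=...` is the command-line equivalent.

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# Parse the ONNX file produced by the export step above.
with open('eradio.onnx', 'rb') as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError('Failed to parse the ONNX model')

config = builder.create_builder_config()
# 4 GiB workspace is an arbitrary example value.
config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 4 << 30)

# Build and serialize the engine.
engine = builder.build_serialized_network(network, config)
with open('eradio.engine', 'wb') as f:
    f.write(engine)
```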
