Replies: 1 comment
- Not currently.
- We are using the Java wrapper for ONNX Runtime in an environment where we would like some guard rails on how much native memory ONNX Runtime uses. Is there any way to calculate, from the ONNX model itself, how much native memory it will need to allocate, or perhaps some expansion-factor guideline based on the input model's size?
Thanks.
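Since there is no built-in way to compute this from the model, one pragmatic fallback is a rough budget derived from the on-disk model size: loaded weights occupy at least roughly the file size, and activations plus allocator overhead add a model-dependent multiple on top. The sketch below is a hypothetical rule of thumb, not an official ONNX Runtime guideline; the `expansionFactor` is an assumption that must be calibrated by measuring real native memory use (e.g. RSS) for your specific model and batch sizes.

```java
public class NativeMemoryEstimate {

    // Hypothetical rule of thumb (an assumption, not an ONNX Runtime
    // guarantee): native memory ~= on-disk model size * expansion factor,
    // where the factor covers activations and allocator arena overhead.
    static long estimateBytes(long modelFileBytes, double expansionFactor) {
        return (long) (modelFileBytes * expansionFactor);
    }

    public static void main(String[] args) {
        long modelBytes = 100L * 1024 * 1024; // e.g. a 100 MiB model file
        // A factor around 2-3x is a plausible starting point, but it varies
        // widely with architecture and batch size; validate empirically.
        System.out.println(estimateBytes(modelBytes, 2.5)); // prints 262144000
    }
}
```

In practice you would pass `new java.io.File(modelPath).length()` as `modelFileBytes`, then compare the estimate against observed process RSS after a few warm-up inferences and adjust the factor accordingly.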