This repository has been archived by the owner on Aug 30, 2018. It is now read-only.

Reshape Input for Dynamic Batch Size #70

Open
akurniawan opened this issue Nov 17, 2017 · 2 comments

@akurniawan

Based on this tutorial http://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html, we need to specify the batch_size while exporting the model from PyTorch to ONNX. In some cases we need a dynamic batch_size for inference. Do you have any advice on how we can do this?

@ezyang
Collaborator

ezyang commented Nov 17, 2017

This is a PyTorch export limitation. One idea we had for relaxing this restriction was the following: you export the model twice with two different batch sizes. Then, by looking at which dimensions changed between the two exports, you generalize your ONNX model accordingly, replacing those dimensions with symbolic ones. This code doesn't exist yet, but it seems reasonably likely to work. One potential difficulty is that symbolic batch sizes, while supported in the ONNX spec, are not very well tested.
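The dimension-comparison step described above can be sketched roughly as follows. This is a minimal illustration of the idea, not existing tooling; `generalize_dims` is a hypothetical helper, and the shapes are assumed examples from two exports of the same model.

```python
def generalize_dims(shape_a, shape_b, symbol="batch"):
    """Compare the shape of the same tensor taken from two exports done
    with different batch sizes. Dimensions that differ between the two
    exports are assumed to be the batch dimension and are replaced with
    a symbolic name; matching dimensions are kept as concrete ints."""
    if len(shape_a) != len(shape_b):
        raise ValueError("rank changed between exports; cannot generalize")
    return [symbol if a != b else a for a, b in zip(shape_a, shape_b)]

# Hypothetical input shapes from exporting with batch_size=1 and batch_size=4:
print(generalize_dims((1, 3, 224, 224), (4, 3, 224, 224)))
```

Applying this to every tensor shape in the exported graph would yield a model whose batch dimension is symbolic rather than fixed, which is what the ONNX spec's symbolic dimensions are meant to express.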

@akurniawan
Author

Sorry, I still don't really understand what you mean by "export the model twice with two different batch sizes". I need to create two different protos with different batch sizes? But each of those would still have a static batch size, right? Would you mind elaborating?
