How to handle dynamic output tensors with tflite.runForMultipleInputsOutputs #962

Open
hello-fri-end opened this issue Jan 24, 2024 · 0 comments


I'm working with a text-to-speech model, so I cannot predict the size of the output tensors beforehand. According to the documentation, "some models have dynamic outputs, where the shape of output tensors can vary depending on the input. There's no straightforward way of handling this with the existing Java inference API, but planned extensions will make this possible." What is the non-straightforward way of handling it? For tflite.run, the documentation says you can pass null as the output, but the same doesn't work with runForMultipleInputsOutputs.

P.S.: I cannot simply use a buffer large enough to handle all possible inputs, because I have split my model into three tflite files and need to pass the output of one model as the input of the next, so it has to be exactly the size that the next model expects.
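One workaround that may apply here (a sketch, not a confirmed fix): after resizing the inputs to their actual shapes and calling `Interpreter.allocateTensors()`, shape propagation can often resolve the output shapes, which you can then read via `Interpreter.getOutputTensor(i).shape()` and use to allocate exact-size buffers before calling `runForMultipleInputsOutputs`. The helper below computes that exact buffer size; the TFLite-specific calls are shown only as comments, since they require the library on the classpath, and whether the shapes resolve before `run()` depends on the model (data-dependent output shapes may only be known afterwards).

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class OutputBuffers {
    // Total byte size of a tensor with the given (fully resolved) shape.
    static int numBytes(int[] shape, int bytesPerElement) {
        int elements = 1;
        for (int dim : shape) {
            elements *= dim;
        }
        return elements * bytesPerElement;
    }

    // Direct, native-order buffer sized exactly for the tensor,
    // as the TFLite Java API expects for ByteBuffer outputs.
    static ByteBuffer outputBufferFor(int[] shape, int bytesPerElement) {
        return ByteBuffer.allocateDirect(numBytes(shape, bytesPerElement))
                         .order(ByteOrder.nativeOrder());
    }

    // Sketch of the intended flow with a TFLite Interpreter (assumption:
    // shape propagation resolves output shapes after allocateTensors()):
    //
    //   interpreter.resizeInput(0, actualInputShape);
    //   interpreter.allocateTensors();
    //   int[] outShape = interpreter.getOutputTensor(0).shape();
    //   ByteBuffer out = outputBufferFor(outShape, 4);  // 4 bytes for float32
    //   Map<Integer, Object> outputs = new HashMap<>();
    //   outputs.put(0, out);
    //   interpreter.runForMultipleInputsOutputs(inputs, outputs);
}
```

Because the buffer is sized from the resolved shape rather than a worst-case bound, it can be handed directly to the next model in the chain. If the output shape is only known after inference, re-querying `getOutputTensor(i).shape()` after `run()` and copying the data out is the fallback.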
