I'm working with a text-to-speech model, so I cannot predict the size of the output tensors beforehand. According to the documentation, "some models have dynamic outputs, where the shape of output tensors can vary depending on the input. There's no straightforward way of handling this with the existing Java inference API, but planned extensions will make this possible." What is the non-straightforward way of handling it? For `tflite.run`, it says here that you can pass `null` as the output, but the same doesn't work with `runMultipleInputsOutputs`.
P.S.: I cannot simply use a buffer large enough to handle all possible inputs to the model, because I have split my model into three .tflite files and need to pass the output of one model as the input to the next, so the buffer has to be exactly the size the next model expects.
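For what it's worth, one workaround I've seen suggested (a sketch, not confirmed to cover truly dynamic outputs): when the output shape is a pure function of the input shape, which is the case for many sequence-style graphs, you can resize the input, let `allocateTensors()` propagate shapes through the graph, and then size the output buffer from the propagated output tensor shape before running. All variable names here are illustrative, and this assumes a TFLite version whose Java `Interpreter` exposes `resizeInput`, `allocateTensors`, and `getOutputTensor`:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.Tensor;

public class DynamicShapeSketch {
  // Resize input 0 to the actual sequence length, propagate shapes,
  // then allocate an exactly-sized output buffer and run.
  static ByteBuffer runExactSize(Interpreter interpreter, ByteBuffer input, int seqLen) {
    interpreter.resizeInput(0, new int[] {1, seqLen});
    interpreter.allocateTensors();  // propagates shapes through the graph

    // After allocateTensors(), the output tensor reports the shape
    // inferred for this particular input size.
    Tensor out = interpreter.getOutputTensor(0);
    ByteBuffer outBuf = ByteBuffer.allocateDirect(out.numBytes())
                                  .order(ByteOrder.nativeOrder());

    interpreter.run(input, outBuf);
    outBuf.rewind();
    return outBuf;  // exactly sized, so it can be fed to the next model
  }
}
```

If the output shape depends on the input *values* rather than just the input shape (as it may for TTS duration prediction), this won't help, and the shape would only be known after inference.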