Ragged Tensor as an output from Tensorflow serving #2222
Comments
@bajaj6, Apologies for the late reply. Can you try serving the same model on TF Serving and let us know if you face any issues? Please share the whole error stack trace so we can debug the issue on our end. Thank you!
Response:
Searching for this error:
@bajaj6, Let us keep this issue as a feature request for supporting ragged tensors in TensorFlow Serving. Thank you for reporting this issue.
@singhniraj08 Sure, thanks.
@bajaj6, Currently, ragged tensors are not supported in TensorFlow Serving. One way to fix this is to remove the ragged tensor from the model. I will keep this issue open as a feature request for ragged tensor support. Thank you!
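A minimal sketch of that workaround, assuming a hypothetical DenseOutputModel wrapper (illustrative only, not the reporter's actual model): expose the ragged result as ordinary dense tensors so every output in the serving signature has a standard dtype.

import tensorflow as tf

# Hypothetical workaround sketch: return dense tensors (padded values, or
# the ragged tensor's flat values plus row lengths) instead of a RaggedTensor.
class DenseOutputModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.int64)])
    def serve(self, row_lengths):
        values = tf.range(tf.reduce_sum(row_lengths))
        ragged = tf.RaggedTensor.from_row_lengths(values, row_lengths)
        return {
            # Option 1: pad the ragged rows into a regular dense tensor.
            "padded_output": ragged.to_tensor(default_value=0),
            # Option 2: return the components so the client can rebuild the ragged tensor.
            "flat_values": ragged.flat_values,
            "row_lengths": ragged.row_lengths(),
        }

model = DenseOutputModel()
tf.saved_model.save(
    model,
    "/Users/user/ragged_tensor/my_model/1",
    signatures={"serving_default": model.serve},
)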
Bug Report
System information
Describe the problem
We use TensorFlow Serving to serve models in production. We have a use case where the output of the model is a ragged tensor.
To see whether TensorFlow Serving supports a ragged tensor as an output, we created the toy example below.
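The exact toy example code was not preserved in this report; the following is a minimal sketch, using a hypothetical RaggedOutputModel module and the export path from the reproduction steps below, of a SavedModel whose serving signature returns a tf.RaggedTensor.

import tensorflow as tf

# Hypothetical stand-in for the toy example: a module whose serving
# signature returns a tf.RaggedTensor built from the requested row lengths.
class RaggedOutputModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.int64)])
    def serve(self, row_lengths):
        values = tf.range(tf.reduce_sum(row_lengths))
        ragged = tf.RaggedTensor.from_row_lengths(values, row_lengths)
        return {"ragged_output": ragged}

model = RaggedOutputModel()
tf.saved_model.save(
    model,
    "/Users/user/ragged_tensor/my_model/1",
    signatures={"serving_default": model.serve},
)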
We save the model to a local disk and then load it via TensorFlow Serving. I used saved_model_cli to inspect the model signatures.
The model output has datatype DT_INVALID, so I suspect TensorFlow Serving will fail to load this model.
Exact Steps to Reproduce
1- Run the above Python code to save the model to the local disk
2- Run the following saved_model_cli command to print the model signatures
saved_model_cli show --dir /Users/user/ragged_tensor/my_model/1 --all
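The signature outputs can also be inspected from Python instead of the CLI; a small sketch, assuming the same export path as above:

import tensorflow as tf

# Load the exported SavedModel and print the structure of the serving
# signature's outputs.
loaded = tf.saved_model.load("/Users/user/ragged_tensor/my_model/1")
serving_fn = loaded.signatures["serving_default"]
print(serving_fn.structured_outputs)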
I also posted this on Stack Overflow here but haven't received any response yet.
Thanks