-
Hey, it is definitely possible. That is also true for PyTorch. Having said that, I am currently working on a TensorFlow refactor, which will showcase the usage of TNNs with TensorFlow in a very similar way to how it works for PyTorch now. Stay tuned!
-
Thanks for your quick reply, Wilhem. I'm using the new TF2 notebook and the optimization has been a lot easier, thanks. However, my problem with saving the model persists. I added the `@tf.keras.saving.register_keras_serializable` decorator above the TNNModel class, since it was mandatory, but when I load the model I get the following error:

"ValueError: You cannot build your model by calling build if your layers do not support float type inputs. Instead, in order to instantiate and build your model, call your model on real tensor data (of the correct dtype). The actual error from call is: Exception encountered when calling layer 'rnn_2' (type RNN). TNNCell.get_initial_state() got multiple values for argument 'inputs' Call arguments received by layer 'rnn_2' (type RNN):"

Any comments? Thanks again.
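For context, the "got multiple values for argument 'inputs'" TypeError usually means the custom cell's `get_initial_state` signature does not match what Keras' `RNN` wrapper passes in. Below is a minimal, hedged sketch (TNNCell itself is not shown in this thread, so `MinimalCell` is a hypothetical stand-in) of a custom cell that is both serializable and uses the signature Keras expects:

```python
import tensorflow as tf

# Registration makes the class reconstructible by load_model;
# the package name "Demo" is arbitrary.
@tf.keras.saving.register_keras_serializable(package="Demo")
class MinimalCell(tf.keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.state_size = units  # required by the RNN wrapper

    def build(self, input_shape):
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform", name="kernel")
        self.recurrent_kernel = self.add_weight(
            shape=(self.units, self.units),
            initializer="orthogonal", name="recurrent_kernel")

    def call(self, inputs, states):
        prev = states[0]
        h = tf.tanh(tf.matmul(inputs, self.kernel)
                    + tf.matmul(prev, self.recurrent_kernel))
        return h, [h]

    # Keras invokes this with keyword arguments; a signature that
    # takes extra positional parameters triggers the
    # "multiple values for argument 'inputs'" TypeError above.
    def get_initial_state(self, inputs=None, batch_size=None, dtype=None):
        if batch_size is None and inputs is not None:
            batch_size = tf.shape(inputs)[0]
        return [tf.zeros((batch_size, self.units),
                         dtype=dtype or self.compute_dtype)]

    # Needed so save()/load_model() can rebuild the cell with
    # the same constructor arguments.
    def get_config(self):
        config = super().get_config()
        config.update({"units": self.units})
        return config
```

With a cell shaped like this, `tf.keras.layers.RNN(MinimalCell(...))` can be built, saved, and reloaded; whether TNNCell needs exactly this signature depends on the Keras version in use.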
-
#5 adds examples for model saving.
-
Hi, and congratulations on this great work. My question: is it possible to save the TensorFlow model to a file, so that it can be loaded and used for real-time inference?
Thanks