I ran into this error while running training with the backend set to torch (which I have installed). Here is a small test script I ran that reproduces the result. If I have misused the Keras API, or missed some configuration required for Keras 3+, please let me know. Thanks!
Output with backend set to torch:
Output with backend set to tensorflow:
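For illustration, a minimal script along these lines exercises the same backend difference. This is a hypothetical sketch, not the original reproduction; it assumes the attribute involved is `model.distribute_strategy`, which Keras 3 only defines under the TensorFlow backend (see the reply below):

```python
# Hypothetical reproduction (not the original script from the question).
# Assumption: the attribute in question is `model.distribute_strategy`,
# which only the TensorFlow-backend trainer provides.
import os

os.environ["KERAS_BACKEND"] = "torch"  # change to "tensorflow" to compare

import numpy as np
import keras

# Tiny model trained on random data.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)

# Works with the TensorFlow backend; raises AttributeError with torch.
print(model.distribute_strategy)
```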
Replies: 1 comment
"Distribution strategy" is a TensorFlow concept, so this attribute only exists with the TensorFlow backend. To do distributed training with the torch backend, see this guide: https://keras.io/guides/distributed_training_with_torch/