I think some preprocessing layers in Keras 2.9 were refactored, like the KPL layers in KerasCV, onto the new base class, which acquires a within-batch randomization policy via vectorized_map (the transformation is randomized for every single element in the batch).
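For readers landing here without that context, here is a minimal sketch of the within-batch randomization pattern (illustrative only, not KerasCV's actual implementation; the layer name, op choice, and parameter range are mine):

```python
import tensorflow as tf

class RandomBrightnessPerElement(tf.keras.layers.Layer):
    """Toy layer: draws a different random delta for every image in the
    batch and applies it per element via tf.vectorized_map."""

    def __init__(self, max_delta=0.2, **kwargs):
        super().__init__(**kwargs)
        self.max_delta = max_delta

    def _augment_single(self, image):
        # A fresh random parameter per element, not per batch.
        delta = tf.random.uniform([], -self.max_delta, self.max_delta)
        return tf.image.adjust_brightness(image, delta)

    def call(self, images):
        # vectorized_map tries to vectorize _augment_single across the batch;
        # any op inside without a pfor converter forces a while_loop fallback.
        return tf.vectorized_map(self._augment_single, images)

images = tf.random.uniform([8, 32, 32, 3])
out = RandomBrightnessPerElement()(images)
```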
Other than missing converters, I think the problem is often that native ops don't support "a batch" of different parameters: tensorflow/tensorflow#55639
As already shown with #291, on nightly we have the new signature that will expose the root cause of the fallback.
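If I understand correctly, this refers to the warn flag on tf.vectorized_map; a rough sketch of how the fallback becomes visible (the tf.py_function body is just a stand-in for any op without a pfor converter):

```python
import tensorflow as tf

def per_element_fn(x):
    # tf.py_function has no pfor converter, so vectorized_map cannot
    # vectorize this call and must fall back to a while_loop.
    y = tf.py_function(lambda t: t * 2.0, [x], Tout=tf.float32)
    y.set_shape(x.shape)  # restore the shape py_function erases
    return y

elems = tf.random.uniform([8, 4])

# With warn=True the fallback is no longer silent: a warning names
# the offending op and the reason for the fallback.
out = tf.vectorized_map(per_element_fn, elems,
                        fallback_to_while_loop=True, warn=True)
```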
If you really want to benchmark the performance gap, here and in the refactored Keras preprocessing layers, I suppose what you actually want to test is batch augmentation (with a fixed parameter) vs. within-batch augmentation, as in the sketch below.
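Something along these lines, as a rough benchmark sketch (the brightness op, batch size, and iteration count are arbitrary choices of mine):

```python
import timeit
import tensorflow as tf

images = tf.random.uniform([64, 224, 224, 3])

@tf.function
def batch_augment(imgs):
    # One random parameter shared by the whole batch: a single vectorized op.
    delta = tf.random.uniform([], -0.2, 0.2)
    return tf.image.adjust_brightness(imgs, delta)

@tf.function
def within_batch_augment(imgs):
    # A different random parameter for every element in the batch.
    def one(img):
        delta = tf.random.uniform([], -0.2, 0.2)
        return tf.image.adjust_brightness(img, delta)
    return tf.vectorized_map(one, imgs)

batch_augment(images)          # trace once before timing
within_batch_augment(images)

print("batch:       ", timeit.timeit(lambda: batch_augment(images), number=100))
print("within-batch:", timeit.timeit(lambda: within_batch_augment(images), number=100))
```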
P.S. Just to keep the history consistent for anyone landing on this ticket directly: we started the discussion with @qlzh727 three months ago in #146.
tensorflow/tensorflow#56242