
How to resize (downsample) 5D samples in keras? #16260

Closed
innat opened this issue Mar 17, 2022 · 19 comments
Labels: keras-team-review-pending (Pending review by a Keras team member.), type:bug/performance

Comments

innat commented Mar 17, 2022

(Reposting from here)


Currently, for 5D data (batch_size, h, w, depth, channel), tf.keras.backend.resize_volumes or UpSampling3D can be used for upsampling. For example, I can do

a  = tf.ones(shape=(1, 100, 100, 64, 1))

tf.keras.backend.resize_volumes(
       a, depth_factor=2, 
       height_factor=2, 
       width_factor=2, 
       data_format="channels_last").shape
TensorShape([1, 200, 200, 128, 1])

These factor values must be integers (https://github.com/keras-team/keras/blob/master/keras/backend.py#L3441-L3444). In that case, how can I downsample the input sample, i.e.

a  = tf.ones(shape=(1, 100, 100, 64, 1))

tf.keras.backend.resize_volumes(
       a, depth_factor=0.5, 
       height_factor=0.5, 
       width_factor=0.5,
       data_format="channels_last").shape

TypeError: 'float' object cannot be interpreted as an integer

# EXPECTED
TensorShape([1, 50, 50, 32, 1])

And here (https://stackoverflow.com/q/57341504/9215780) is another scenario where the factor needs to be fractional.


(PS: downsampling or upsampling can both be done properly in the same manner with scipy.ndimage.zoom.)
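
For reference, a minimal sketch of the fractional resize done with scipy.ndimage.zoom (the per-axis zoom factors and order=1 are choices for illustration, not taken from the thread):

import numpy as np
from scipy import ndimage

# 5D volume: (batch, h, w, depth, channel)
a = np.ones((1, 100, 100, 64, 1), dtype=np.float32)

# Per-axis zoom: keep batch and channel at 1, halve the spatial axes.
a_small = ndimage.zoom(a, zoom=(1, 0.5, 0.5, 0.5, 1), order=1)
print(a_small.shape)  # (1, 50, 50, 32, 1)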

bhack (Contributor) commented Mar 17, 2022

The big question here is about Keras vs Tensorflow ownership.

Could this be achieved by composing op calls already available in TF? If not, it is a high-level API request for Keras, but it would require another ticket/PR in TF.

Also, more generally, I don't know what the current plan/scope is for the tf.keras.backend namespace from a user perspective (probably @qlzh727 @fchollet could help on this point).

innat (Author) commented Mar 17, 2022

It might be a tricky or specialized case. Normally, in volumetric CT samples, the spatial information mostly appears in the middle of the depth range.

In most cases, the beginning and ending slices are completely black, which makes them useless. So, in order to reduce the depth of the volume, we need to resample from the middle range of slices to keep the most informative ones. But I'm not sure whether this is a general scenario for all volumetric data.

Now, in TF, if we want to downsample volumetric data and keep a specific middle range along the depth axis, we can do the following, though I'm not sure it's optimal.

# (input data: 1, 50, 50, 20, 4)
# (desired output: 1, 25, 25, 10, 4)
a = tf.ones(shape=(1, 50, 50, 20, 4))
a.shape # TensorShape([1, 50, 50, 20, 4])

a2 = tf.reshape(a, [-1, 50, 50, 20*4])
a2.shape # TensorShape([1, 50, 50, 80])

a3 = tf.image.resize(a2, [25, 25])
a3.shape # TensorShape([1, 25, 25, 80])

a4 = tf.reshape(a3, [-1, 25, 25, 20, 4])
a4.shape # TensorShape([1, 25, 25, 20, 4])

# Here, pick some middle slices along the depth axis,
# assuming they are the most informative ones.
# This may not generalize to all data!
a5 = a4[...,  5:15, :]
a5.shape # TensorShape([1, 25, 25, 10, 4])
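
For comparison, here is a rough sketch (the helper name resize_volume is made up, not an existing API) that interpolates along depth as well, instead of slicing, by applying tf.image.resize twice: once over (h, w) and once over depth. It assumes static input shapes.

import tensorflow as tf

def resize_volume(x, out_h, out_w, out_d):
    # x: (batch, h, w, depth, channel) with static shapes.
    b, h, w, d, c = x.shape
    # Pass 1: resize height and width, folding depth into the channel axis.
    x = tf.reshape(x, [-1, h, w, d * c])
    x = tf.image.resize(x, [out_h, out_w])
    x = tf.reshape(x, [-1, out_h, out_w, d, c])
    # Pass 2: resize depth. Move depth into a spatial position and resize again
    # (the second target dimension stays at out_h, i.e. it is left unchanged).
    x = tf.transpose(x, [0, 3, 1, 2, 4])              # (b, d, out_h, out_w, c)
    x = tf.reshape(x, [-1, d, out_h, out_w * c])
    x = tf.image.resize(x, [out_d, out_h])
    x = tf.reshape(x, [-1, out_d, out_h, out_w, c])
    return tf.transpose(x, [0, 2, 3, 1, 4])           # (b, out_h, out_w, out_d, c)

a = tf.ones((1, 50, 50, 20, 4))
print(resize_volume(a, 25, 25, 10).shape)  # (1, 25, 25, 10, 4)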

bhack (Contributor) commented Mar 17, 2022

a5 = a4[..., 5:15, :]

Is this pseudocode? I think it will not work with a Tensor.

qlzh727 (Member) commented Mar 17, 2022

Agreed with @bhack. I think this kind of implementation should be pushed to the ops level (rather than implemented on the Keras side in Python) to achieve the best performance. We aren't going to add more APIs to keras.backend, since it was historically a layer between Keras and different backends, and TF is the only backend now.

innat (Author) commented Mar 17, 2022

@bhack It worked in eager mode. Maybe it will fail in graph mode; I'm not sure, I didn't try it in actual training.

tf.config.run_functions_eagerly(False)
tf.compat.v1.disable_eager_execution()

a5 = a4[...,  5:15, :] # OK
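
(A quick sanity check, just a sketch, suggests that basic read-only strided slicing also works when traced inside a tf.function, i.e. it is not limited to eager mode:)

import tensorflow as tf

@tf.function
def middle_slices(x):
    # Basic strided slicing lowers to tf.strided_slice, so it traces fine.
    return x[..., 5:15, :]

print(middle_slices(tf.ones((1, 25, 25, 20, 4))).shape)  # (1, 25, 25, 10, 4)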

innat (Author) commented Mar 17, 2022

@qlzh727 With the current TF ops, is it possible to do this operation effectively and safely? Or should I open a ticket in TF?

bhack (Contributor) commented Mar 17, 2022

@bhack It worked in eager mode. Maybe it will fail in graph mode; I'm not sure, I didn't try it in actual training.

tf.config.run_functions_eagerly(False)
tf.compat.v1.disable_eager_execution()

a5 = a4[...,  5:15, :] # OK

Oh sorry, I assumed that you needed to do a slice assignment later.

bhack (Contributor) commented Mar 17, 2022

It really depends on what you want to do. If the interpolation in tf.image.resize only needs to work on the dimensions it already supports, it could be OK.
Do we have cases where we need to interpolate across multiple dimensions?

innat (Author) commented Mar 18, 2022

Do we have cases where we need to interpolate across multiple dimensions?

Not sure if there are many use cases for this. For medical cases, I've seen that people often adopt a resampling approach over all relevant dimensions (h, w, depth).

bhack (Contributor) commented Mar 18, 2022

Probably an interface like:

https://www.mathworks.com/help/images/ref/imresize3.html
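
(Purely for illustration, a hypothetical imresize3-style helper; the name imresize3_like and its signature are made up, and it is backed by scipy.ndimage.zoom here just to show the intended semantics of resizing either to an explicit output size or by a fractional scale:)

import numpy as np
from scipy import ndimage

def imresize3_like(volume, output_size=None, scale=None, order=1):
    # Resize a 3D volume (h, w, d) either to an explicit output_size
    # or by a (possibly fractional, possibly scalar) scale factor.
    # Note: ndimage.zoom rounds the output shape, so odd sizes may be off by one.
    if output_size is not None:
        scale = [o / s for o, s in zip(output_size, volume.shape)]
    elif np.isscalar(scale):
        scale = [scale] * volume.ndim
    return ndimage.zoom(volume, zoom=scale, order=order)

v = np.random.rand(100, 100, 64).astype(np.float32)
print(imresize3_like(v, output_size=(50, 50, 32)).shape)  # (50, 50, 32)
print(imresize3_like(v, scale=0.5).shape)                 # (50, 50, 32)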

sachinprasadhs added the keras-team-review-pending (Pending review by a Keras team member.) label Mar 24, 2022
LukeWood (Contributor) commented

So if I understand correctly, we will NOT be supporting this on the Keras side? If so, we can close this issue. @qlzh727, is that correct?

innat (Author) commented Mar 24, 2022

@qlzh727 With the current TF ops, is it possible to do this operation effectively and safely? Or should I open a ticket in TF?

@qlzh727 Should I close this here and reopen it in TensorFlow?

bhack (Contributor) commented Mar 24, 2022

@innat It has not been scheduled for 3 years at tensorflow/tensorflow#33951.
I suppose a community-contributed PR would be required even if we move this to TF.

But even with a community PR, the real problem is that the namespace is currently orphaned (and it is not such an isolated case, see tensorflow/community#412).

LukeWood (Contributor) commented

Yeah, @innat, let's re-open this as an issue against TensorFlow or in the TF community.


innat (Author) commented Apr 24, 2022

@qlzh727 @LukeWood Can it be moved to the tensorflow repo?

LukeWood (Contributor) commented

Hey @innat! Unfortunately, I do not work much on TensorFlow API design, so I am unsure.

innat (Author) commented Apr 25, 2022

Actually, I just want you guys to move this issue to the tensorflow repo; I think it can be done.

But I can't see any option on my side to move the issue to another repository. It may be visible to the TF/Keras teams.

bhack (Contributor) commented Apr 25, 2022

Actually, I just want you guys to move this issue to the tensorflow repo; I think it can be done.

But I can't see any option on my side to move the issue to another repository. It may be visible to the TF/Keras teams.

It cannot be done between Keras and TF, as at some point there was a decision to have two independent GitHub orgs (keras-team and tensorflow).

https://docs.github.com/en/issues/tracking-your-work-with-issues/transferring-an-issue-to-another-repository
