[WIP] Ports generative networks #7196
Conversation
I'm encountering TorchScript errors, which I will document as I implement each network:
Hi @marksgraham, yes it's fine to skip the TorchScript test.
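For reference, skipping could look something like this in the unit tests (a minimal sketch: the test class name and the stand-in network are hypothetical, not MONAI's actual test code):

```python
import unittest

import torch
import torch.nn as nn


class TestGenerativeNetworkScript(unittest.TestCase):  # hypothetical test class
    @unittest.skip("TorchScript conversion deferred until the port is complete")
    def test_script(self):
        net = nn.Linear(4, 4)  # stand-in for the ported generative network
        scripted = torch.jit.script(net)
        out = scripted(torch.randn(2, 4))
        self.assertEqual(out.shape, (2, 4))


if __name__ == "__main__":
    unittest.main()
```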
```python
def __init__(self, spatial_dims: int, in_channels: int) -> None:
    super().__init__()
    self.pad = (0, 1) * spatial_dims
```
Is this pad value being hardcoded for a specific reason? To make this class more general, perhaps we could add a parameter or `pad_kwargs`? What do you think?
This is the padding value that allows downsampling by a factor of 2 at each level, so that the output shape of the network matches the input shape (as long as the input shape is even). I can't think of a use case for changing the padding here, specifically. I suppose we might want to find a way to support users supplying non-even inputs, but I'm not even sure MONAI's base AutoEncoder network supports that right now.
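To make the behaviour concrete, here is a minimal sketch of how that hardcoded pad pairs with a stride-2 convolution (the class name, channel counts, and shapes are illustrative, not the actual PR code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Downsample(nn.Module):  # illustrative name
    def __init__(self, spatial_dims: int, in_channels: int) -> None:
        super().__init__()
        self.pad = (0, 1) * spatial_dims
        conv_cls = nn.Conv2d if spatial_dims == 2 else nn.Conv3d
        self.conv = conv_cls(in_channels, in_channels, kernel_size=3, stride=2, padding=0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # F.pad reads pairs starting from the last dim, so (0, 1) * spatial_dims
        # pads one element onto the end of every spatial dimension: an even
        # size N becomes N + 1, and the kernel-3/stride-2 conv then gives
        # floor((N + 1 - 3) / 2) + 1 = N / 2.
        return self.conv(F.pad(x, self.pad))


x = torch.randn(1, 8, 16, 16)
print(Downsample(spatial_dims=2, in_channels=8)(x).shape)  # torch.Size([1, 8, 8, 8])
```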
We have a padding argument for the AutoEncoder which accepts any of the values compatible with the PyTorch Conv classes. By default it is set to a value that ensures stride-1 convolutions produce outputs with the same shape as the inputs, but it can be set to other things.
The padding you have here is specifically for convolutions with an odd kernel size (e.g. 3x3 or 3x3x3)? Instead of hardcoding a padding value, just don't set `padding` in `Convolution` and let that class handle how to pad for downsampling.
I'm not sure you need this class anyway, since the downsample you want can be done with `Convolution` with `stride=2, kernel_size=3, conv_only=True` (see the sketch below).
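A quick sketch of that suggestion (channel counts and shapes are illustrative; note that MONAI's `Convolution` block takes `strides`, plural, rather than `stride`):

```python
import torch
from monai.networks.blocks import Convolution

# conv_only=True drops the norm/activation layers, and leaving `padding` unset
# lets the block pick a suitable padding for the odd kernel itself.
down = Convolution(
    spatial_dims=2,
    in_channels=8,
    out_channels=8,
    strides=2,
    kernel_size=3,
    conv_only=True,
)
print(down(torch.randn(1, 8, 16, 16)).shape)  # torch.Size([1, 8, 8, 8])
```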
Thanks for reviewing - good point on the blocks, I'll see how many of them can be combined. I've still got a few more networks to add too; I'm just adding bits to the PR as I go along.
Would it be OK to defer merging some of the duplicate blocks (e.g. Attention, Resblock) until the rest of MONAI Generative has been ported? Until it is fully ported we will be supporting the MONAI Gen repo too, and keeping the code here tightly coupled to that codebase will make it much easier to push any bugfixes etc. from there to here. Once the port is done, MONAI Gen will be archived and users will be redirected here, and then it will be easier to devote our efforts to integrating the ported code and MONAI Core more tightly through better use of shared blocks, etc.
Yes, that should be fine. We can also create another ticket for refactoring.
Duplicate code/functionality isn't good and doesn't look good either. I get your reasoning for pushing off a refactor, so I would suggest that we mark redundant definitions as private (prefix `_` to their names) and write in their docstrings that they shouldn't be used and will be factored out later. What we don't want is users relying on these types directly and then they disappear, or we replace them with shim types we have to maintain.
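A sketch of that convention (the block name and body are placeholders, not the actual ported code):

```python
import torch
import torch.nn as nn


class _ResBlock(nn.Module):  # leading underscore marks the duplicate as private
    """Residual block duplicated from MONAI Generative during the port.

    .. warning::
        Internal use only: this duplicate will be merged with the shared
        MONAI Core blocks once the port is complete, and it may be renamed
        or removed without a deprecation period. Do not rely on it directly.
    """

    def __init__(self, channels: int) -> None:
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.conv(x)
```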
The ticket for the refactoring is up here.
Hi @marksgraham, thanks for your hard work on this PR.
Yes, that seems reasonable to me! I'll split this into a PR for each network over the coming week :)
Partially fixes #6676.
Description
This PR ports the generative networks from the MONAI Generative repository into MONAI Core.
Types of changes
- Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- Documentation checked by running the `make html` command in the `docs/` folder.