
Padding in Tensorflow vs Pytorch #17

Open
zanussbaum opened this issue Oct 10, 2020 · 2 comments

Comments

@zanussbaum

Hey, I had a quick question regarding padding. I don't have a ton of experience with PyTorch, but as far as I can tell from this, the layer adds `padding` zeros to each side of every spatial axis of the input.

However, in your implementation you use padding='same', whereas the author of the original PyTorch implementation uses padding=1. I'm under the impression that to make the padding match, we would need to specify it manually in TensorFlow.

Am I missing something here, or is TensorFlow taking care of this under the hood in a way I'm not understanding?
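For a rough sanity check, the two padding schemes can be compared with plain shape arithmetic (a sketch, not code from this repo; the formulas follow the standard convolution output-size rules). For a 3x3 kernel with stride 1, TensorFlow's padding='same' adds exactly the same 2 zeros per axis as PyTorch's padding=1; with stride 2 the output sizes can still match, but 'same' may pad asymmetrically, so the values shift:

```python
import math

def out_size_same(n, s):
    """TensorFlow padding='same': output size depends only on input size and stride."""
    return math.ceil(n / s)

def out_size_explicit(n, k, s, p):
    """PyTorch-style symmetric padding p: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

def same_total_pad(n, k, s):
    """Total zeros 'same' adds along one axis (any extra zero goes on the right)."""
    return max((math.ceil(n / s) - 1) * s + k - n, 0)

# 3x3 kernel, stride 1: 'same' pads 2 zeros total (1 per side), exactly
# like PyTorch padding=1, so outputs are identical.
assert same_total_pad(32, 3, 1) == 2
assert out_size_same(32, 1) == out_size_explicit(32, 3, 1, 1) == 32

# 3x3 kernel, stride 2: 'same' pads only 1 zero (on the right), while
# padding=1 pads 2 (one per side). Output sizes agree here, but the
# padded positions differ, so the computed values would not match.
assert same_total_pad(32, 3, 2) == 1
assert out_size_same(32, 2) == out_size_explicit(32, 3, 2, 1) == 16
```

So for the stride-1 3x3 convolutions that make up most of a wide ResNet the two are equivalent; any discrepancy would be confined to the strided downsampling convolutions.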

@asmith26
Owner

Hi @zanussbaum,

Thanks for the info. It's been a while since I wrote this, and I'm not familiar with PyTorch syntax, so I'm not sure whether there is a discrepancy. I remember that when I originally wrote this I followed the original Lua code: https://github.com/szagoruyko/wide-residual-networks/blob/master/models/wide-resnet.lua.

If you think there is a discrepancy and can improve on the results the model currently obtains, I'll happily accept pull requests.

Thanks again! :)

@zanussbaum
Author

Thanks, I'll try it and see what's going on. If there's no discrepancy, at least I'll hopefully understand everything a little better :)
