Codifying logs, and code to recreate them #2

Open
ardila opened this issue Oct 9, 2013 · 2 comments

Comments
ardila (Member) commented Oct 9, 2013

I'm having some trouble getting the logs into the repo right now because of some git issues. I'll commit them as soon as possible.

The MNIST logs are in here (inside figure 2 or 3):

http://cs.nyu.edu/~wanli/dropc/dropnn-exp.tar.gz

The CIFAR logs are in here (look for files matching run##_log.txt):

http://cs.nyu.edu/~wanli/dropc/cifar10-9_32.tar.gz
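If those tarballs are still up, something like this sketch could fetch and extract them, then list the CIFAR run logs. The layout inside the archives is an assumption on my part until the logs land in the repo:

```python
import tarfile
import urllib.request
from pathlib import Path

URLS = [
    "http://cs.nyu.edu/~wanli/dropc/dropnn-exp.tar.gz",
    "http://cs.nyu.edu/~wanli/dropc/cifar10-9_32.tar.gz",
]

def fetch_and_extract(url, dest="logs"):
    """Download one tarball (if not already cached) and unpack it under dest/."""
    dest = Path(dest)
    dest.mkdir(exist_ok=True)
    archive = dest / url.rsplit("/", 1)[-1]
    if not archive.exists():
        urllib.request.urlretrieve(url, archive)
    with tarfile.open(archive) as tar:
        tar.extractall(dest)

for url in URLS:
    fetch_and_extract(url)

# The CIFAR logs should match run##_log.txt somewhere in the extracted tree.
for log in sorted(Path("logs").rglob("run*_log.txt")):
    print(log)
```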

ardila (Member, Author) commented Oct 9, 2013

@dicarlolab/daseibert It looks like the MNIST training procedure was run only on a few fully connected layers, _on features extracted by a model trained on CIFAR_!!
Besides this being very interesting, I'm pretty sure it means that:
a) we should not test the MNIST dataset or its procedures, and
b) the extra computation time freed up by skipping it should be spent testing the different parameters/procedures that have been used for CIFAR.
We should codify Alex's procedure (the cuda-convnet methodology):
https://code.google.com/p/cuda-convnet/wiki/Methodology

And any others we can find. If we can figure out even an approximation of the methodology used for ImageNet, we should use that.
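As a starting point, here is a rough codification of my reading of that wiki page: train at a fixed learning rate until validation error stops improving, then divide the rate by 10, and stop once further annealing no longer helps. The `train_epoch` and `validation_error` callables are hypothetical stand-ins for whatever harness we end up with:

```python
def train_with_annealing(train_epoch, validation_error,
                         base_lr=0.01, drop_factor=10.0,
                         max_drops=2, patience=3):
    """Anneal-on-plateau loop, roughly following the cuda-convnet wiki."""
    lr = base_lr
    drops = 0
    best_err = float("inf")
    epochs_since_best = 0
    while True:
        train_epoch(lr)           # one pass over the training set at lr
        err = validation_error()  # current validation error
        if err < best_err:
            best_err = err
            epochs_since_best = 0
        else:
            epochs_since_best += 1
        if epochs_since_best >= patience:
            if drops >= max_drops:
                break             # annealing no longer helps; stop
            lr /= drop_factor     # drop the learning rate 10x
            drops += 1
            epochs_since_best = 0
    return best_err
```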

ardila (Member, Author) commented Mar 12, 2014

@yamins81 Do you have any code for automatically adjusting the learning rates?
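In case not, a minimal reduce-on-plateau scheduler along these lines might be a reasonable default. It is the same plateau idea as the annealing sketch above, packaged as a stateful object you call once per epoch (names and defaults are my own):

```python
class AutoLR:
    """Divide the learning rate by `factor` whenever validation error
    has not improved for `patience` consecutive epochs."""

    def __init__(self, lr=0.01, factor=10.0, patience=3, min_lr=1e-6):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.min_lr = min_lr
        self.best = float("inf")
        self.stale = 0

    def step(self, val_err):
        """Record this epoch's validation error; return the LR to use next."""
        if val_err < self.best:
            self.best = val_err
            self.stale = 0
        else:
            self.stale += 1
            if self.stale >= self.patience:
                self.lr = max(self.lr / self.factor, self.min_lr)
                self.stale = 0
        return self.lr
```

Usage would just be `lr = sched.step(current_val_err)` at the end of each epoch.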

ardila closed this as completed Mar 12, 2014
ardila reopened this Mar 12, 2014