Get Started with Neural Networks #12

Open
TarunTomar122 opened this issue Oct 1, 2020 · 10 comments

@TarunTomar122
Member

This issue is for later stages of development. Once we have implemented all the basic algorithms, we need to build our own neural network from our module without any library other than NumPy and Matplotlib.
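For illustration, here is a minimal sketch of what a NumPy-only network layer could look like; all names here are hypothetical, not from this codebase:

```python
import numpy as np

# Hypothetical sketch of a single dense layer with plain-NumPy
# forward and backward passes; nothing here is actual MLlib code.
class Dense:
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * 0.01  # small random weights
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache the input for backprop
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.1):
        grad_W = self.x.T @ grad_out    # dL/dW
        grad_b = grad_out.sum(axis=0)   # dL/db
        grad_x = grad_out @ self.W.T    # dL/dx, passed to the layer below
        self.W -= lr * grad_W           # plain gradient-descent update
        self.b -= lr * grad_b
        return grad_x
```

Stacking such layers with activations and a loss is enough for a small feed-forward network, and Matplotlib can then plot the training loss.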

rohansingh9001 added the nobody-assigned label on Oct 2, 2020
@kwanit1142
Collaborator

How deep should the neural network be? Can any precise information, such as the number of layers, be provided for this?

@TarunTomar122
Member Author

@kwanit1142 As I already mentioned, this issue is for later stages of development. We haven't decided much about this right now!

@q-viper
Contributor

q-viper commented Oct 4, 2020

I wrote a NN library using only NumPy a few months ago. Would something like this work?

@rohansingh9001
Collaborator

@q-viper I found your repository quite interesting and fun. We do want something like that in our project.

I was about to start work on an auto-differentiator for our package, but it seems you have already implemented some version of it in your repository.

I am checking your repository on my side, but please let me know if we could somehow merge our two libraries, given that we have already implemented optimizers like RMSProp, SGD, Adam, etc.

Can we use our implementations of these optimization algorithms (and the rest of our library) with your code with a feasible amount of adjustment?

Your cooperation will help save a lot of time.
Thanks for your time and interest!
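For what it's worth, here is a rough sketch of the kind of interface that would make such a merge feasible; the names are made up, and neither codebase necessarily looks like this:

```python
import numpy as np

# Hypothetical glue: if both libraries agree that an optimizer maps
# (parameter, gradient) -> updated parameter, the existing SGD/RMSProp/
# Adam implementations could be reused by the network layers directly.
class SGD:
    def __init__(self, lr=0.01):
        self.lr = lr

    def update(self, param, grad):
        return param - self.lr * grad

class Dense:
    def __init__(self, n_in, n_out, optimizer):
        self.W = np.random.randn(n_in, n_out) * 0.01
        self.optimizer = optimizer      # injected, not hard-coded

    def step(self, grad_W):
        # The layer delegates the update rule to whichever optimizer
        # instance it was constructed with.
        self.W = self.optimizer.update(self.W, grad_W)
```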

@q-viper
Contributor

q-viper commented Oct 4, 2020

Sure, why not? As long as I get credit; I am also looking for improvements.

@rohansingh9001
Collaborator

@q-viper That sounds great! I went through your repo and I understand your code now.

Since we want to include portions of your neural network implementation in our codebase, we could go two ways. Either you make PRs to this repo, adding your code after understanding the repository and utilising what it has already implemented.

Or you could give me permission to use your code, and I would add it to this repository through a sequence of PRs. Of course, due credit will be given to you for any code I use. Some code cleanup is needed, and integrating the two codebases would be my job.

For credit, could you please tell us how you would like to be acknowledged: by your GitHub username or your real name, and whether we should mention your repository, etc.?

As soon as I merge the first PR, I will add a CONTRIBUTORS.md file containing details of all contributors. I will mention you in the PR so that you can see it.

If you have any other questions, feel free to join the newly made Gitter channel and ask there.

@q-viper
Contributor

q-viper commented Oct 6, 2020

@rohansingh9001 Can I make a PR to this repo using my code?

rohansingh9001 added the WOC and hard labels and removed the nobody-assigned label on Dec 2, 2020
@parva-jain
Contributor

Could you explain the status of this issue? Is there any update?

@rohansingh9001
Collaborator

@parva-jain If you want to, you can work on this issue. We want a separate neural network implementation, but it should use our activations, gradients, and loss functions. Currently, the neural network implementation is placed in the MLlib directory, in the quark subdirectory. You might want to have a look at it.

However, since we do not currently have a working AutoGrad or gradient methods for the activations, it is very hard to implement neural networks. Issues #62 and #64 should be finished before we continue with this one.
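To illustrate why those issues block this one: backpropagation needs each activation to expose its derivative. A made-up example, not the actual MLlib API:

```python
import numpy as np

# Backprop applies the chain rule dL/dz = dL/da * da/dz at every
# activation, so each activation needs a gradient method like this.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)

z = np.array([0.5, -1.2, 2.0])
upstream = np.array([0.1, -0.3, 0.2])   # dL/da from the layer above
local = upstream * sigmoid_grad(z)      # dL/dz, passed further down
```

Without gradient methods (or AutoGrad) for every activation and loss, this step cannot be written generically.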

@kwanit1142
Collaborator

As #62 and #64 are solved, @thisis-nkul, you can start from here. Also, anyone else who wants to work on this is welcome too.
