Get Started with Neural Networks #12
Comments
How deep should the neural network be? Can any precise information about the number of layers be provided here?
@kwanit1142 As I already mentioned, this issue is for later stages of development. We haven't decided much about this right now!
I wrote a NN library using only NumPy a few months ago. Will something like this work?
@q-viper I found your repository quite interesting and fun. We do want something like that in our project. I was about to start work on an auto-differentiator for our package, but it seems you have already implemented some version of one in your repository. I am checking your repository on my side, but please let me know if we could somehow merge our two libraries, in the sense that we have already implemented optimizers like RMSProp, SGD, Adam, etc. Can we use our implementations of these optimization algorithms (and the rest of our library) with the rest of your code with a feasible amount of adjustment? Your cooperation would help save a lot of time.
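For context, the optimizer interface being discussed could look something like the sketch below: a stochastic gradient descent update with optional momentum, written with only NumPy. The class name, constructor parameters, and `update` signature are my own assumptions for illustration, not the actual API of either library.

```python
import numpy as np

class SGD:
    """Minimal sketch of SGD with optional momentum (hypothetical interface)."""

    def __init__(self, learning_rate=0.01, momentum=0.0):
        self.learning_rate = learning_rate
        self.momentum = momentum
        self.velocity = None  # lazily initialised to match the weight shape

    def update(self, weights, gradient):
        if self.velocity is None:
            self.velocity = np.zeros_like(weights)
        # v <- momentum * v - lr * grad; w <- w + v
        self.velocity = self.momentum * self.velocity - self.learning_rate * gradient
        return weights + self.velocity
```

Adam and RMSProp would follow the same `update(weights, gradient)` shape, carrying their own per-parameter running statistics instead of a single velocity buffer.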
Sure, why not? As long as I get credit; and I am looking for improvements.
@q-viper That sounds great! I went through your repo and I understand your code now. Since we want to include portions of your implementation of neural networks in our codebase, we could go two ways. Either you make PRs to this repo, adding your code after understanding this repository and utilising what it has already implemented; or you give me permission to use your code, and I add it to this repository through a sequence of PRs. Of course, due credit will be given to you for the code that I use. Some code cleanup is needed, and integrating the two codebases will be my job. For credit, could you please provide the information you would like included: would you like to be referred to by your GitHub username or your real name, and do we need to mention the repository, etc.? As soon as I merge the first PR, I will add a CONTRIBUTORS.md file containing details of all contributors. I will mention you in the PR so that you can see it. If you want to ask anything else, feel free to join the newly made Gitter channel.
@roshansingh9001 Can I make PRs to this repo using my code?
Can you explain this issue? Is there any update?
@parva-jain If you want to, you can work on this issue. We want a separate neural network implementation, but it should use our activations, gradients, and loss functions. Currently, the implementation of neural networks is placed in the MLlib directory, in the quark subdirectory; you might have a look at it. However, since we do not yet have a working AutoGrad or gradient methods for the activations, it is very hard to implement neural networks. Issues #62 and #64 should be finished before we continue with this issue.
This issue is for later stages of development. Once we have implemented all the basic algorithms, we will need to build our own neural network from our module, without any library other than NumPy and Matplotlib.
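As a rough sketch of what "a neural network with only NumPy" means in practice, here is a single-hidden-layer network trained on toy XOR data with a hand-written forward and backward pass. The layer sizes, learning rate, and dataset are illustrative assumptions, not part of the project.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy XOR dataset (illustrative assumption, not project data).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    h = sigmoid(X @ W1 + b1)       # hidden layer
    return h, sigmoid(h @ W2 + b2)  # output layer

# 2 -> 8 -> 1 architecture; sizes chosen arbitrarily for the sketch.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros((1, 1))
lr = 1.0

_, out = forward(W1, b1, W2, b2)
initial_loss = float(np.mean((out - y) ** 2))

for _ in range(10000):
    h, out = forward(W1, b1, W2, b2)
    # MSE loss; gradients derived by hand through each sigmoid.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

_, out = forward(W1, b1, W2, b2)
final_loss = float(np.mean((out - y) ** 2))
```

The hand-derived `d_out` and `d_h` terms are precisely what a working AutoGrad would compute automatically, which is why that piece is blocking this issue.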