Why the Hype around GANs?

In the summer of 2020, a reading project on Generative Adversarial Networks was undertaken as part of WnCC's Seasons of Code initiative. Since the mentors of this project are WnCC members themselves, they decided to open-source the project timeline for the benefit of anyone out there who is just starting out with Deep Learning.

The only prerequisite for this course is basic proficiency in Python. With that said, let's begin!

Week 1 | Getting Started

  • To get a basic understanding of what a Neural Network is, watch this excellent playlist by 3Blue1Brown - Neural Networks.

  • Now, to build your own Neural Network, try completing this short course by Andrew Ng - Neural Networks and Deep Learning (a tiny from-scratch sketch follows this list). You can opt for financial aid if you need to.

  • It is sometimes overwhelming to visualise how a neural network improves its performance over time. This website will let you do exactly that - Neural Network Playground.
    P.S. - You might come across new terms here. Instead of just overlooking them, try finding out what they mean. You could google them or just visit our Wiki page on Deep Learning.

  • Exhausted by all the math? Here's an article to get you motivated - Applications of GANs.
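
To make the ideas from the course above concrete, here is the sketch mentioned earlier: a one-hidden-layer network trained by hand with gradient descent. Everything in it - the NumPy-only approach, the toy XOR-style data, the layer sizes, and the learning rate - is an illustrative assumption, not something prescribed by the course.

```python
# A minimal sketch of a one-hidden-layer network trained with plain gradient descent,
# using only NumPy. Architecture (2 inputs, 3 hidden units, 1 output) and data are toy choices.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the four points of the XOR problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Randomly initialised weights and zero biases.
W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))
W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))

lr = 1.0  # learning rate (a hyperparameter you will meet again in Week 2-3)

for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    y_hat = sigmoid(h @ W2 + b2)   # predictions

    # Mean squared error loss
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (chain rule, written out by hand)
    d_yhat = 2 * (y_hat - y) / len(X)
    d_z2 = d_yhat * y_hat * (1 - y_hat)
    d_W2, d_b2 = h.T @ d_z2, d_z2.sum(axis=0, keepdims=True)
    d_h = d_z2 @ W2.T
    d_z1 = d_h * h * (1 - h)
    d_W1, d_b1 = X.T @ d_z1, d_z1.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print("final loss:", loss, "predictions:", y_hat.ravel().round(2))
```

Depending on the random seed it may need more steps to fit XOR exactly; watching the loss fall as the weights update is the point of the exercise.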

Week 2-3 | Learning Pytorch

  • Libraries like PyTorch and TensorFlow make implementing neural nets a breeze. PyTorch's 60 Minute Blitz will help you get started (a minimal training-loop sketch follows this list). It's recommended that you type out the code yourself rather than just reading along.

  • By now you should have a clear understanding of what a neural network is. It is time to tinker with it to decrease training time and improve accuracy. Do this course on Hyperparameter Tuning to learn more. You can skip the TensorFlow part if you wish, since you are already working with PyTorch.

  • You can now work through further PyTorch tutorials; the course above will help you understand these examples better. Make your own Google Colab notebooks and tinker around - trying out various hyperparameter values is important for practical learning.
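
The training-loop sketch mentioned above looks like this. The model, the synthetic data, and every hyperparameter value are assumptions made for illustration; what carries over to everything else in this plan is the structure - model, loss, optimiser, then the loop of forward pass, backward pass, and update.

```python
# A minimal PyTorch training-loop sketch on synthetic regression data (y = 3x + noise).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic data.
X = torch.randn(256, 1)
y = 3 * X + 0.1 * torch.randn(256, 1)

model = nn.Sequential(
    nn.Linear(1, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)

# Hyperparameters worth tinkering with (learning rate, batch size, epochs).
lr, batch_size, epochs = 1e-2, 32, 20

loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=lr)

for epoch in range(epochs):
    perm = torch.randperm(len(X))
    for i in range(0, len(X), batch_size):
        idx = perm[i:i + batch_size]
        pred = model(X[idx])          # forward pass
        loss = loss_fn(pred, y[idx])  # compute loss
        optimizer.zero_grad()         # clear old gradients
        loss.backward()               # backpropagate
        optimizer.step()              # update parameters
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

Changing lr, batch_size, or the hidden width and watching how the loss reacts is exactly the kind of experimentation the Hyperparameter Tuning course is about.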

Week 4 | Attempting a Kaggle Challenge

  • The MNIST dataset is a large database of handwritten digits. PyTorch has a tutorial for training your NN on MNIST (a loading-and-training sketch follows this list). You can leave the CNN part for now.

  • Kaggle is a community of data scientists where you can find a vast variety of datasets and competitions to hone your skills. Try attempting this Kaggle Challenge to get started - Digit Recognizer.
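
For reference, here is the sketch of loading MNIST with torchvision and training a plain fully connected classifier on it. The model, the hyperparameters, and the data directory are illustrative assumptions; the Kaggle challenge provides its own data files, so the loading step there will differ.

```python
# A sketch of training a fully connected classifier on MNIST via torchvision.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.ToTensor()  # images become [0, 1] tensors of shape (1, 28, 28)
train_set = datasets.MNIST(root="data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Flatten(),        # (1, 28, 28) -> 784
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),  # one logit per digit class
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(3):
    for images, labels in train_loader:
        logits = model(images)
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```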

Week 5 | CNNs

  • Convolutional Neural Networks have been revolutionary for image processing. Read either of these articles to get an understanding of how they work -

  • CIFAR-10 is an established computer-vision dataset. Attempt a related challenge on Kaggle - Object Recognition.

  • Try implementing CNN models for classification problems on your own (a small CNN sketch follows this list). This article will guide you on how to Create your own dataset.
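
As a starting point for the bullet above, here is a minimal CNN for 32x32 RGB images, the shape of CIFAR-10 inputs. The layer sizes are illustrative assumptions; the pattern of convolution, activation, and pooling followed by a fully connected classifier is the part worth internalising.

```python
# A sketch of a small CNN for 32x32 RGB images (CIFAR-10-shaped input).
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # (3, 32, 32) -> (16, 32, 32)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> (16, 16, 16)
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> (32, 16, 16)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> (32, 8, 8)
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Quick shape check on a random batch.
model = SmallCNN()
print(model(torch.randn(4, 3, 32, 32)).shape)  # torch.Size([4, 10])
```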

Week 6 | GANs

  • At last, we will now start with GANs. In case you have never read a research paper before, here is a guide to get you started - How to Read a Research Paper.

  • It might be overwhelming to read this paper, but it is strongly recommended that you do - GANs. Its central objective is written out after this list.

  • It is okay if you do not understand all of it. These articles might come in handy -
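
For orientation while reading: the heart of the paper is a single two-player objective, in which the discriminator D tries to tell real data from generated samples while the generator G tries to fool it. In the paper's notation,

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]$$

D is trained to push both terms up (classify real and fake correctly), while G is trained to push the second term down, i.e. to make D(G(z)) close to 1.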

Week 7 | Implementing GANs

  • Now that you understand how a GAN works, try implementing one on a simple dataset (a minimal sketch follows this list). You can refer to the code given here - PyTorch GANs. You can leave DCGANs for now.

  • Also read this article for some Tips to make GANs work.
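
The sketch referred to above: a complete, if tiny, GAN that learns to sample from a 1-D Gaussian. The network sizes, the target distribution, and all hyperparameters are illustrative assumptions; real image GANs follow the same alternating loop of a discriminator step and a generator step.

```python
# A minimal GAN sketch on 1-D toy data (learning to sample from N(4, 1.25)).
import torch
import torch.nn as nn

torch.manual_seed(0)
latent_dim = 8

# Generator: noise z -> fake sample; Discriminator: sample -> probability of "real".
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = 4 + 1.25 * torch.randn(64, 1)  # samples from the target distribution
    z = torch.randn(64, latent_dim)
    fake = G(z)

    # --- Train the discriminator: real -> 1, fake -> 0 ---
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # --- Train the generator: make D label fakes as real (non-saturating loss) ---
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

print("generated mean/std:", fake.mean().item(), fake.std().item())
```

Two details worth noticing: the fake batch is detached during the discriminator step so its gradients do not flow into G, and the generator uses the common non-saturating loss (label fakes as real) rather than directly minimising log(1 - D(G(z))).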

Week 8 | Tinkering Around

Conclusion

We hope this plan helps you get a better understanding of "the most interesting idea in the last 10 years in Machine Learning", as Yann LeCun described GANs. If you discover more efficient resources along your learning path, we would be more than happy to incorporate them here - just create a pull request on this repository.


Created with ❤️ by WnCC