A repository for all my work done for Harvard's CS50AI course

CS50AI

This repository contains all the projects I completed during CS50AI. I learned an incredible amount: not only my programming skills but also my general problem-solving skills improved drastically. This was by far the most challenging course I have taken, covering the most complex topics I have ever delved into.

The course began with search algorithms such as minimax, and I learned how to use stack and queue frontiers of nodes to explore a state space (depth-first and breadth-first search).
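The frontier-and-node idea can be shown in a minimal sketch (the `Node` class and graph here are my own illustration, not the course's distribution code). Swapping the queue for a stack turns breadth-first search into depth-first search:

```python
from collections import deque

class Node:
    """A node tracks a state plus the parent node that generated it."""
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent

def breadth_first_search(start, goal, neighbors):
    """Explore states with a queue frontier; a stack frontier would give DFS."""
    frontier = deque([Node(start)])
    explored = set()
    while frontier:
        node = frontier.popleft()  # FIFO removal makes this breadth-first
        if node.state == goal:
            path = []
            while node.parent is not None:  # walk parents back to the start
                path.append(node.state)
                node = node.parent
            return [start] + path[::-1]
        explored.add(node.state)
        for nxt in neighbors(node.state):
            if nxt not in explored and all(n.state != nxt for n in frontier):
                frontier.append(Node(nxt, node))
    return None

# Tiny graph standing in for a maze
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(breadth_first_search("A", "D", lambda s: graph[s]))  # ['A', 'B', 'D']
```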

I learned how to represent propositional logic in Python using the course's logic library. I also learned about other ways of storing "knowledge" within objects, and just how beneficial and powerful doing so can be.
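The core idea behind that library is model checking: a query is entailed if it holds in every model where the knowledge base holds. A minimal sketch of that idea, using my own names rather than the course library's API:

```python
from itertools import product

def model_check(knowledge, query, symbols):
    """Entailment: the query must be true in every model where knowledge is true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if knowledge(model) and not query(model):
            return False  # found a model where knowledge holds but query fails
    return True

# Knowledge: rain implies wet, and it is raining. Query: is the ground wet?
symbols = ["rain", "wet"]
kb = lambda m: (not m["rain"] or m["wet"]) and m["rain"]
query = lambda m: m["wet"]
print(model_check(kb, query, symbols))  # True
```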

I then learned about optimization and how to improve upon basic search when solving constraint satisfaction problems. Here I implemented a backtracking search optimized both by heuristics and by maintaining arc consistency. This algorithm, Maintaining Arc-Consistency (MAC), uses a helper algorithm called AC-3 to enforce arc consistency across all the arcs in the problem.
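AC-3 repeatedly prunes domain values that have no supporting value across an arc. A minimal sketch on a toy two-variable problem (the variable names and "must differ" constraint are my own illustration):

```python
from collections import deque

def revise(domains, x, y, constraint):
    """Delete values of x with no consistent partner value in y's domain."""
    revised = False
    for vx in list(domains[x]):
        if not any(constraint(vx, vy) for vy in domains[y]):
            domains[x].remove(vx)
            revised = True
    return revised

def ac3(domains, arcs, constraint):
    """Process arcs until all are consistent; False means some domain emptied."""
    queue = deque(arcs)
    while queue:
        x, y = queue.popleft()
        if revise(domains, x, y, constraint):
            if not domains[x]:
                return False
            # x's domain shrank, so arcs pointing at x must be rechecked
            for z, w in arcs:
                if w == x and z != y:
                    queue.append((z, w))
    return True

# Two variables that must differ, as with overlapping crossword words
domains = {"X": {1}, "Y": {1, 2}}
arcs = [("X", "Y"), ("Y", "X")]
print(ac3(domains, arcs, lambda a, b: a != b), domains)  # True {'X': {1}, 'Y': {2}}
```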

Probability and reasoning under uncertainty were other hugely important concepts in this course. I learned to compute joint probabilities and, from them, conditional probabilities. More generally, I learned that exactness is often not the goal: being able to predict outcomes, or estimate their approximate probabilities, can be even more valuable.
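Deriving a conditional distribution from a joint table is just selecting the matching rows and renormalizing. A small sketch with an invented rain/traffic table:

```python
# Joint distribution P(rain, traffic) as a flat table (values are illustrative)
joint = {
    ("rain", "traffic"): 0.3,
    ("rain", "no traffic"): 0.1,
    ("no rain", "traffic"): 0.2,
    ("no rain", "no traffic"): 0.4,
}

def conditional(joint, traffic_value):
    """P(rain | traffic) = P(rain, traffic) / P(traffic), from the joint table."""
    p_traffic = sum(p for (r, t), p in joint.items() if t == traffic_value)
    return {r: p / p_traffic for (r, t), p in joint.items() if t == traffic_value}

print(conditional(joint, "traffic"))  # {'rain': 0.6, 'no rain': 0.4}
```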

Next, I learned about machine learning and various ways to train a model on a dataset. I learned about "rewarding" and "punishing" an agent based on the states that result from its actions. Specifically, I used Q-learning to "teach" the computer which decisions are more valuable, and instead of always making greedy decisions, I used the epsilon-greedy algorithm to introduce some exploration into the learning process.
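The two pieces, the Q-value update and epsilon-greedy action selection, fit in a short sketch (states, actions, and hyperparameter values here are my own placeholders):

```python
import random

def best_action(q, state, actions):
    """Pick the action with the highest learned Q-value (0 if unseen)."""
    return max(actions, key=lambda a: q.get((state, a), 0.0))

def epsilon_greedy(q, state, actions, epsilon=0.1):
    """With probability epsilon explore randomly; otherwise exploit."""
    if random.random() < epsilon:
        return random.choice(actions)
    return best_action(q, state, actions)

def q_update(q, state, action, reward, next_state, actions, alpha=0.5, gamma=0.9):
    """Q(s,a) <- Q(s,a) + alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))."""
    old = q.get((state, action), 0.0)
    future = max(q.get((next_state, a), 0.0) for a in actions)
    q[(state, action)] = old + alpha * (reward + gamma * future - old)

q = {}
q_update(q, "s0", "right", 1.0, "s1", ["left", "right"])
print(q[("s0", "right")])  # 0.5
```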

Neural networks are likely the biggest buzzword in AI, so it was very cool to learn how they actually work. I built my own model and learned the basics of TensorFlow, and found it very interesting to experiment with different convolutional layers, filters, and kernel sizes to see how they affected image-classification accuracy.
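What a convolutional filter does can be seen without any framework: slide a small kernel over the image and sum the element-wise products. A pure-Python sketch (the image and edge-detecting kernel are invented for illustration; a real layer would use TensorFlow and learn the kernel weights):

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image (valid padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Sum of element-wise products of the kernel and the patch under it
            row.append(sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            ))
        out.append(row)
    return out

# A vertical-edge kernel on a tiny image whose right half is bright
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
kernel = [[-1, 1]]  # responds where brightness jumps left-to-right
print(convolve2d(image, kernel))  # [[0, 9, 0], [0, 9, 0], [0, 9, 0]]
```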

The course ended with natural language processing. Here I was able to play around with the BERT language model and test its ability to predict masked words. I also learned about the NLTK library and built a very simple parser based on nonterminal rules I created.
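The nonterminal-rules idea can be sketched without NLTK: a grammar maps each nonterminal to its possible expansions, and a recognizer tries every way of splitting the sentence across a rule's symbols. The tiny grammar below is my own toy example, not the course's ruleset, and this brute-force recognizer is far less efficient than NLTK's chart parser:

```python
# Toy context-free grammar: nonterminal -> list of possible expansions
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["saw"], ["slept"]],
}

def derives(symbol, words):
    """True if `symbol` can expand into exactly this word sequence."""
    if symbol not in GRAMMAR:            # terminal: must match a single word
        return list(words) == [symbol]
    return any(matches(prod, words) for prod in GRAMMAR[symbol])

def matches(production, words):
    """Try every split of `words` across the production's symbols."""
    if not production:
        return not words
    first, rest = production[0], production[1:]
    return any(
        derives(first, words[:i]) and matches(rest, words[i:])
        for i in range(len(words) + 1)
    )

print(derives("S", "the cat saw the dog".split()))  # True
print(derives("S", "cat the saw".split()))          # False
```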
