Chatbot-using-BERT

🤖BERT (Bidirectional Encoder Representations from Transformers) is a state-of-the-art natural language processing (NLP) model developed by Google. It revolutionized the field by introducing a bidirectional approach to language understanding: unlike traditional models that read text left-to-right or right-to-left, BERT considers the full context of a word by looking at both its left and right context in every layer of the model. BERT is pre-trained with two key objectives: Masked Language Modeling (MLM) and Next Sentence Prediction (NSP). Model used: BERT base model (uncased) from Hugging Face.
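The MLM objective mentioned above can be seen in action with a minimal sketch (not this repo's actual code) using the Hugging Face `fill-mask` pipeline and the same `bert-base-uncased` checkpoint:

```python
# Minimal sketch of BERT's Masked Language Modeling: the model uses
# context on BOTH sides of [MASK] to predict the hidden token.
from transformers import pipeline

# downloads the bert-base-uncased checkpoint on first run
unmasker = pipeline("fill-mask", model="bert-base-uncased")

predictions = unmasker("The chatbot answers questions using [MASK] learning.")
for p in predictions[:3]:
    # each prediction carries the filled-in token and its probability
    print(p["token_str"], round(p["score"], 3))
```

The example sentence is illustrative only; the chatbot itself would fine-tune or build on top of this pre-trained model rather than use the raw fill-mask head.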

🤖Libraries used:
- Data preprocessing: pandas, json
- EDA & visualization: matplotlib, seaborn, wordcloud
- Machine learning: scikit-learn, TensorFlow, Keras, PyTorch, NLTK, Hugging Face Transformers (transformer networks learn context by attending over entire input sequences)
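As a hedged sketch of the preprocessing/EDA step these libraries imply (the repo's actual data file and schema are unknown), a typical intents-style JSON can be flattened with pandas for inspection:

```python
# Hypothetical intents data; the real project's JSON layout may differ.
import json
import pandas as pd

intents = json.loads('''
[{"tag": "greeting", "patterns": ["hi there", "hello"]},
 {"tag": "goodbye",  "patterns": ["bye", "see you later"]}]
''')

# flatten into one row per (tag, pattern) pair for easy EDA
df = pd.DataFrame(
    [{"tag": i["tag"], "pattern": p} for i in intents for p in i["patterns"]]
)
print(df["tag"].value_counts())
```

From a frame like this, class balance can be checked with `value_counts()` and the pattern text fed to `wordcloud` or plotted with matplotlib/seaborn.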

🤖Also used Weights & Biases, a machine learning experiment tracking and visualization platform. It helped me keep track of experiments, log relevant metrics, and visualize results in a centralized, user-friendly dashboard.

🤖During the project I got the opportunity to dig deep into the field of artificial intelligence. It fascinates me how it works and motivates me to keep working on and contributing to the field.🚀 Regards.
