The goal of this project is to build a semantic search engine using Sentence-BERT (S-BERT), after first implementing S-BERT and evaluating it on the STS benchmark.
This is done as part of the second lab exercise in the course ID2223 Scalable Machine Learning and Deep Learning at KTH.
So far we have implemented only the regression objective from the S-BERT paper and evaluated it on the STS benchmark, using the pre-trained bert-base-uncased BERT model from Hugging Face.
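To give an idea of what the regression objective looks like, here is a minimal sketch: two sentences are encoded with the same BERT model, their token embeddings are mean-pooled into sentence embeddings, and the cosine similarity between the embeddings is regressed onto the (rescaled) STS gold score with an MSE loss. This is an illustrative sketch assuming PyTorch and Hugging Face `transformers`; the helper names (`mean_pool`, `encode`) and the toy batch are hypothetical and not taken from our actual code.

```python
# Sketch of the S-BERT regression objective: cosine similarity + MSE loss.
# Helper names and the toy batch below are illustrative only.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def mean_pool(token_embeddings, attention_mask):
    # Average the token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).float()
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

def encode(sentences):
    # Tokenize a batch of sentences and mean-pool the last hidden states.
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    out = bert(**batch)
    return mean_pool(out.last_hidden_state, batch["attention_mask"])

# One training step on a toy STS-style batch (gold scores in [0, 5], scaled to [0, 1]).
optimizer = torch.optim.AdamW(bert.parameters(), lr=2e-5)
sent_a = ["A man is playing a guitar.", "A dog runs in the park."]
sent_b = ["Someone plays guitar.", "A cat sleeps on the couch."]
gold = torch.tensor([4.2, 0.5]) / 5.0

emb_a, emb_b = encode(sent_a), encode(sent_b)
pred = F.cosine_similarity(emb_a, emb_b)   # predicted similarity per pair
loss = F.mse_loss(pred, gold)              # regression objective
loss.backward()
optimizer.step()
```

For evaluation, the STS benchmark is typically scored by the Spearman correlation between the predicted cosine similarities and the gold similarity scores on the test split.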