
Siamese Neural Networks

TensorFlow implementation of Koch et al. (2015).

Siamese Neural Network Architecture

Instructions

  • Set Up Virtual Environment (Optional but Recommended)
python -m venv venv
source venv/bin/activate 
  • Clone the repository
git clone https://github.com/thlurte/SiameseNet-tensorflow.git
  • Change into the directory of the repository
cd SiameseNet-tensorflow
  • Install dependencies
pip install -r requirements.txt
  • To Train the model
python main.py --data_dir <path-to-dataset>

Architecture

The model consists of a sequence of convolutional layers, each of which uses a single channel with filters of varying size and a fixed stride of 1.

Activation

The network applies a ReLU activation function to the output feature maps, optionally followed by max-pooling with a filter size and stride of 2.
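One convolutional twin can be sketched as below. The filter counts and sizes (64@10×10, 128@7×7, 128@4×4, 256@4×4, a 4096-unit sigmoid embedding, 105×105 grayscale input) are taken from Koch et al. (2015); this repository's exact hyperparameters may differ.

```python
import tensorflow as tf

def build_twin(input_shape=(105, 105, 1)):
    """One twin of the Siamese network: stacked conv layers with
    stride 1 and ReLU, each followed by 2x2 max-pooling, ending in
    a sigmoid embedding layer (sizes assumed from Koch et al. 2015)."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(64, 10, strides=1, activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=2, strides=2),
        tf.keras.layers.Conv2D(128, 7, strides=1, activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=2, strides=2),
        tf.keras.layers.Conv2D(128, 4, strides=1, activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=2, strides=2),
        tf.keras.layers.Conv2D(256, 4, strides=1, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(4096, activation="sigmoid"),
    ])
```

The same twin is applied to both inputs of a pair, so the two embeddings share all weights.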

Loss Function

Binary cross-entropy is used as the loss function: the target is 1 for same-class pairs and 0 for different-class pairs.
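For reference, this is the quantity TensorFlow's `binary_crossentropy` computes, written out in plain NumPy (the clipping constant is an implementation detail to avoid `log(0)`):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """BCE: -mean(y*log(p) + (1-y)*log(1-p)). y_true is 1 for
    same-class pairs and 0 otherwise; y_pred is the network's
    similarity score in (0, 1)."""
    p = np.clip(y_pred, eps, 1 - eps)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))
```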

Optimizer

The Adam optimizer is used to minimize the loss.
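A hedged sketch of the training head: following Koch et al. (2015), the two twin embeddings are combined with a component-wise L1 distance feeding a single sigmoid unit, and the model is compiled with Adam and binary cross-entropy. The embedding size and learning rate here are assumptions, not values taken from this repository.

```python
import tensorflow as tf

# Hypothetical 4096-d twin embeddings (size assumed from the paper).
emb_a = tf.keras.layers.Input(shape=(4096,))
emb_b = tf.keras.layers.Input(shape=(4096,))

# Component-wise L1 distance between the two embeddings, then a
# sigmoid unit producing a same-class probability.
l1 = tf.keras.layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([emb_a, emb_b])
score = tf.keras.layers.Dense(1, activation="sigmoid")(l1)

head = tf.keras.Model(inputs=[emb_a, emb_b], outputs=score)
head.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),  # assumed LR
    loss="binary_crossentropy",
)
```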

Weight Initialization

All convolutional-layer weights are initialized from a zero-mean normal distribution with standard deviation $10^{-2}$. Biases are also initialized from a normal distribution, but with mean 0.5 and standard deviation $10^{-2}$.
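In Keras this initialization scheme can be expressed with `RandomNormal` initializers; the layer shape below is illustrative, not this repository's exact configuration:

```python
import tensorflow as tf

# Kernel weights drawn from N(0, 1e-2), biases from N(0.5, 1e-2),
# matching the initialization described above.
conv = tf.keras.layers.Conv2D(
    64, 10, strides=1, activation="relu",
    kernel_initializer=tf.keras.initializers.RandomNormal(mean=0.0, stddev=1e-2),
    bias_initializer=tf.keras.initializers.RandomNormal(mean=0.5, stddev=1e-2),
)
```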

References

Koch, G., Zemel, R., & Salakhutdinov, R. (2015). Siamese Neural Networks for One-shot Image Recognition. ICML Deep Learning Workshop.
