Commit
Avoid positiveSigmoid becoming negative
brs96 committed Dec 8, 2023
1 parent cb8fbe2 commit 7c9b38b
Showing 1 changed file with 2 additions and 2 deletions.
@@ -270,10 +270,10 @@ private void trainSample(long center, long context, boolean positive) {
 
     //When |affinity| > 40, positiveSigmoid = 1. Double precision is not enough.
     //Make sure negativeSigmoid can never be 0 to avoid infinity loss.
-    double positiveSigmoid = Sigmoid.sigmoid(affinity) - EPSILON;
+    double positiveSigmoid = Sigmoid.sigmoid(affinity);
     double negativeSigmoid = 1 - positiveSigmoid;
 
-    lossSum -= positive ? Math.log(positiveSigmoid) : Math.log(negativeSigmoid);
+    lossSum -= positive ? Math.log(positiveSigmoid+EPSILON) : Math.log(negativeSigmoid+EPSILON);
 
     float gradient = positive ? (float) -negativeSigmoid : (float) positiveSigmoid;
     // we are doing gradient descent, so we go in the negative direction of the gradient here
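The rationale behind the change can be sketched as follows. The old code subtracted EPSILON from the sigmoid so that negativeSigmoid could never be exactly 0, but for strongly negative affinity the sigmoid itself is smaller than EPSILON, so the subtraction drives positiveSigmoid negative and Math.log then returns NaN. The new code leaves the sigmoid untouched and adds EPSILON inside each log instead. This is a minimal standalone sketch: the real Sigmoid.sigmoid and EPSILON live elsewhere in the repository, and the value 1e-10 used here is an assumption for illustration only.

```java
public class SigmoidLossSketch {
    static final double EPSILON = 1e-10; // assumed value; the repository defines its own

    // Plain logistic sigmoid, standing in for Sigmoid.sigmoid
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Old approach: clamp the sigmoid itself. For affinity = -41 the sigmoid is
    // ~1.6e-18, so subtracting EPSILON makes positiveSigmoid negative, and
    // Math.log of a negative number is NaN.
    static double oldPositiveLoss(double affinity) {
        double positiveSigmoid = sigmoid(affinity) - EPSILON;
        return -Math.log(positiveSigmoid);
    }

    // New approach: leave the sigmoid alone and add EPSILON inside the log,
    // which keeps the log argument strictly positive in every case.
    static double newPositiveLoss(double affinity) {
        double positiveSigmoid = sigmoid(affinity);
        return -Math.log(positiveSigmoid + EPSILON);
    }

    public static void main(String[] args) {
        // For |affinity| > 40, double precision rounds sigmoid(affinity) to
        // exactly 1.0, so 1 - sigmoid(affinity) is exactly 0.0 and its log is
        // -Infinity: this is the "infinity loss" the comment in the diff means.
        System.out.println(sigmoid(41.0) == 1.0);          // true
        System.out.println(Math.log(1.0 - sigmoid(41.0))); // -Infinity

        System.out.println(oldPositiveLoss(-41.0)); // NaN: log of a negative number
        System.out.println(newPositiveLoss(-41.0)); // finite, roughly 23.03
    }
}
```

Adding EPSILON inside the log is the safer of the two clamping strategies because it biases the loss only near the saturated extremes, rather than shifting the probability estimate itself for every sample.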
