In the OCGNN model, why is the radius updated only in the warm-up training epochs? #90
-
I understand that the OCGNN model aims to jointly learn the network parameters "w" and minimize the volume of the data-description hypersphere, which is characterized by a radius "r" and a center "c", as explained in the paper "One-Class Graph Neural Networks for Anomaly Detection in Attributed Networks". That means OCGNN learns "w", "r", and "c", not only "w". I wonder why the PyGOD scripts update the radius only in the warm-up training epochs; in that setting the radius is learned from only the first few epochs (the default is 2). I also found that the authors calculate the hypersphere center before the first epoch, as the average of all data points. In contrast, PyGOD starts the hypersphere center from zeros and calculates it after getting the (distance, loss, score) of the first epoch. Thanks in advance for your clarification.
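For context, this is my understanding of the objective from the paper (I am paraphrasing, so the notation is mine rather than the paper's):

$$
\min_{W,\,r}\; r^{2} \;+\; \frac{1}{\beta N}\sum_{v \in \mathcal{V}} \max\!\left(0,\ \lVert g(\mathbf{X}, \mathbf{A}; W)_v - \mathbf{c} \rVert^{2} - r^{2}\right) \;+\; \frac{\lambda}{2}\sum_{\ell} \lVert W^{(\ell)} \rVert_F^{2}
$$

where $g$ is the GNN encoder, $N$ is the number of training nodes, $\beta$ controls the fraction of nodes allowed to fall outside the sphere, and $\lambda$ is the weight-decay coefficient. This is what made me expect $r$ (and possibly $\mathbf{c}$) to be optimized throughout training rather than only during warm-up.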
-
In the official implementation, `r` and `c` are not learnable and are determined by this line and this line. According to the author, this gives better stability. In our implementation, we instead use a small number of warm-up epochs to update (not learn by gradient descent) `r` and `c`. For the initialization of `c`, we use the mean of `emb` in the warm-up epochs (this line). Feel free to modify the model (e.g., increase the number of warm-up epochs) if it helps.