
Commit a8e204f
Merge branch 'devel' into devel-atomic_weight
ChiahsinChu authored Dec 18, 2024
2 parents 9d0d7d6 + 104fc36 commit a8e204f
Showing 1 changed file with 1 addition and 1 deletion.
deepmd/pt/train/training.py (1 addition, 1 deletion)
@@ -579,7 +579,7 @@ def warm_up_linear(step, warmup_steps):
         # author: iProzd
         if self.opt_type == "Adam":
             self.optimizer = torch.optim.Adam(
-                self.wrapper.parameters(), lr=self.lr_exp.start_lr
+                self.wrapper.parameters(), lr=self.lr_exp.start_lr, fused=True
             )
             if optimizer_state_dict is not None and self.restart_training:
                 self.optimizer.load_state_dict(optimizer_state_dict)
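
The one-line change passes fused=True to torch.optim.Adam, so the Adam update runs in fused device kernels instead of a per-parameter loop. The sketch below shows the flag in plain PyTorch, outside the deepmd-kit Trainer; the toy Linear model, the start_lr value, and the CPU fallback are illustrative assumptions, not part of the commit.

# Minimal sketch, assuming a CUDA-capable PyTorch build: enabling the fused
# Adam implementation. The toy model and start_lr stand in for self.wrapper
# and self.lr_exp.start_lr from the deepmd-kit Trainer.
import torch

model = torch.nn.Linear(16, 16)
start_lr = 1e-3

if torch.cuda.is_available():
    model = model.cuda()
    # fused=True batches the Adam arithmetic into fused device kernels,
    # reducing per-parameter kernel-launch overhead; it requires
    # floating-point parameters on a supported device (typically CUDA).
    optimizer = torch.optim.Adam(model.parameters(), lr=start_lr, fused=True)
else:
    # Older PyTorch releases reject fused=True for CPU tensors, so fall back
    # to the default implementation when no GPU is available.
    optimizer = torch.optim.Adam(model.parameters(), lr=start_lr)

# One illustrative optimization step.
device = next(model.parameters()).device
loss = model(torch.randn(4, 16, device=device)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()

As in the diff, the constructor arguments self.wrapper.parameters() and lr=self.lr_exp.start_lr are unchanged; only the fused flag is added, and any existing optimizer state dict is still loaded afterwards when training restarts.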
