Commit ab2ee34
Make test_keyed standalone and remove HPC dependency (pytorch#1540)
Summary:
Pull Request resolved: pytorch#1540

We remove the dependency on `hpc.optimizers.optimizer_modules` by enhancing the `DummyOptimizerModule` in `test_keyed.py` to implement its own `state_dict` and `load_state_dict` methods. This makes the tests standalone so they can be properly open-sourced.

Thanks to joshuadeng for catching the error!

Reviewed By: henrylhtsang

Differential Revision: D51599854

fbshipit-source-id: cf77ffe994b946acda9abed5fbedb8e0a8a4dd42
Michael Shi authored and facebook-github-bot committed Nov 28, 2023
1 parent d355caf commit ab2ee34
Showing 1 changed file with 7 additions and 3 deletions.
torchrec/optim/tests/test_keyed.py (7 additions, 3 deletions)
@@ -12,7 +12,6 @@
 
 import torch
 import torch.distributed as dist
-from hpc.optimizers.optimizer_modules import OptimizerModule
 from torch.autograd import Variable
 from torch.distributed._shard import sharded_tensor, sharding_spec
 from torchrec.optim.keyed import (
@@ -24,14 +23,19 @@
 from torchrec.test_utils import get_free_port
 
 
-class DummyOptimizerModule(OptimizerModule):
+class DummyOptimizerModule:
     def __init__(
         self,
         tensor: torch.Tensor,
     ) -> None:
-        super(DummyOptimizerModule, self).__init__()
         self.tensor = tensor
 
+    def state_dict(self) -> Dict[str, Any]:
+        return {"tensor": self.tensor}
+
+    def load_state_dict(self, state_dict: Dict[str, Any]) -> None:
+        self.tensor.detach().copy_(state_dict["tensor"])
+
 
 class TestKeyedOptimizer(unittest.TestCase):
     def _assert_state_dict_equals(
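With the base class gone, `DummyOptimizerModule` only needs to satisfy the `state_dict`/`load_state_dict` duck-typing contract that the keyed-optimizer tests rely on. A minimal, torch-free sketch of that same contract (the class name and list-backed state here are illustrative, not part of the commit):

```python
from typing import Any, Dict, List


class DummyStateModule:
    """Mimics the state_dict/load_state_dict protocol without inheriting
    from any optimizer base class (same idea as the commit's DummyOptimizerModule)."""

    def __init__(self, values: List[int]) -> None:
        self.values = list(values)

    def state_dict(self) -> Dict[str, Any]:
        # Export state as a plain dict, keyed by field name.
        return {"values": list(self.values)}

    def load_state_dict(self, state_dict: Dict[str, Any]) -> None:
        # Copy the saved state back in place, as copy_() does for tensors.
        self.values[:] = state_dict["values"]


a = DummyStateModule([0, 0, 0])
b = DummyStateModule([1, 2, 3])
a.load_state_dict(b.state_dict())
print(a.values)  # → [1, 2, 3]
```

Because the contract is purely structural, any object exposing these two methods can round-trip its state through a checkpoint dict, no shared base class required.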
