Hi, in the file "rlkit/torch/networks.py" you have "import rlkit.torch.transformer as transformer", but it seems the repo doesn't contain rlkit.torch.transformer.
The transformer module was reserved for the follow-up paper (https://arxiv.org/pdf/2102.10774.pdf), which improves the robustness of FOCAL against sparse rewards and distribution shift. To reproduce FOCAL, you can ignore that part of the code.
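If you just want the import error to go away without touching the rest of the file, one possible workaround is a guarded import. This is a minimal sketch, assuming the transformer module is only used by the follow-up work and not on FOCAL's code path:

```python
# In rlkit/torch/networks.py: make the missing module optional so that
# FOCAL experiments can run without rlkit/torch/transformer.py present.
try:
    import rlkit.torch.transformer as transformer
except ImportError:
    # Transformer-based networks from the follow-up paper will be
    # unavailable; plain FOCAL does not need them.
    transformer = None
```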
If you are interested in applying attention/contrastive learning to FOCAL, check out the aforementioned paper (https://arxiv.org/pdf/2102.10774.pdf) and a more recent follow-up paper (https://arxiv.org/pdf/2206.10442.pdf). We will have some updates on the paper and code soon. You are welcome to ask questions, too. @hkx888