SAC log_alpha different from paper #171

Open
cvigoe opened this issue Jan 12, 2023 · 1 comment
Comments

cvigoe commented Jan 12, 2023

Thanks for the great work on this project! The issue is illustrated in the following line in the SAC trainer:

https://github.com/rail-berkeley/rlkit/blob/master/rlkit/torch/sac/sac.py#L161

In the SAC trainer, log_alpha appears to be trained as if it were alpha itself, and it is then exponentiated when computing the soft policy and critic losses. Initially I assumed log_alpha was used for numerical-stability reasons, but the alpha loss is not adjusted to work in log space. So in effect log_alpha is playing the role of alpha, and the soft policy and critic objectives then use exp(alpha) in place of alpha.
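
To make the comparison concrete, here is a minimal standalone sketch of the two variants as I read them (the variable names mirror rlkit's sac.py, but the values are made up for illustration):

```python
import torch

# Learnable log(alpha); initialised to 0 so alpha starts at 1.
log_alpha = torch.zeros(1, requires_grad=True)
log_pi = torch.tensor([-1.2, -0.7, -2.3])   # example policy log-probabilities for a batch
target_entropy = -1.0                       # e.g. -dim(action space)

# Variant in the linked rlkit line: the loss is linear in log_alpha, i.e. log_alpha
# is playing the role that alpha plays in the paper's J(alpha).
alpha_loss_rlkit = -(log_alpha * (log_pi + target_entropy).detach()).mean()

# Variant matching the paper's objective J(alpha) = E[-alpha * (log_pi + target_entropy)],
# reparameterised through exp() so alpha stays positive.
alpha_loss_paper = -(log_alpha.exp() * (log_pi + target_entropy).detach()).mean()

# In both cases, alpha = exp(log_alpha) is what multiplies the entropy term
# in the actor and critic losses.
alpha = log_alpha.exp().detach()
```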

I have not seen any discussion of this choice in the literature. If my understanding is correct, I think this technically breaks the soft policy iteration and dual gradient descent theory in the original SAC papers (specifically the follow-up paper that introduced the automatic entropy-tuning version of SAC).
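
To spell out why the updates differ (my own reading, writing c for the batch mean of log_pi + target_entropy): differentiating with respect to log_alpha, the rlkit loss gives d/d(log_alpha) [ -log_alpha * c ] = -c, whereas the paper's objective reparameterised through exp() gives d/d(log_alpha) [ -exp(log_alpha) * c ] = -alpha * c. The two updates on log_alpha therefore differ by a factor of alpha, so the rlkit variant is not plain dual gradient descent on alpha.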

It would be great to get some clarification on why this choice was made or, at the very least, a comment/note saying that it deviates from the original (automatic entropy tuning) version of SAC.

Shapeno commented Feb 29, 2024

This problem has been bothering me for a long time! It doesn't break the training run, but the automatic alpha tuning doesn't work as intended and effectively degenerates into alpha = 1.
