In your posterior, you use the stochastic state of the prior. But in RSSM, they only use the deterministic state and the observation embedding. Since the prior's stochastic state is just a function of the deterministic state, it won't have extra information to condition on. And using the stochastic state sample might hurt the posterior computation because of the sampling noise.
I am checking in case there is some other deeper reason to use it.
Hi @sai-prasanna,
you're right that there is a subtle difference with the original RSSM.
However, I would not expect any major differences as the information to condition upon is contained in the deterministic state, as you pointed out.
The stochastic state might either be helpful (it is a noisier estimate of the state) or be ignored by the network if it doesn't contain any useful information (e.g. if you concatenate random noise to the inputs of a network, the network quickly learns to ignore it).
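To make the difference concrete, here is a minimal sketch (not the repository's actual code) of the two posterior parameterizations being discussed. The shapes, the single-layer stand-in for the posterior network, and all variable names are assumptions for illustration only; a real RSSM would use learned MLPs and output a full distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_net(x, w, b):
    # Single linear layer standing in for the learned posterior network.
    return np.tanh(x @ w + b)

h_t = rng.normal(size=8)           # deterministic (GRU) state
obs_emb = rng.normal(size=8)       # observation embedding
prior_sample = rng.normal(size=4)  # stochastic sample from the prior

# Standard RSSM posterior: condition on (h_t, obs_emb) only.
x_standard = np.concatenate([h_t, obs_emb])               # 16 dims

# Variant discussed above: also concatenate the prior's stochastic
# sample, which adds sampling noise but no new information, since the
# prior sample is itself computed from h_t.
x_variant = np.concatenate([h_t, prior_sample, obs_emb])  # 20 dims

w_std = rng.normal(size=(16, 4))
w_var = rng.normal(size=(20, 4))
b = np.zeros(4)

post_mean_standard = posterior_net(x_standard, w_std, b)
post_mean_variant = posterior_net(x_variant, w_var, b)
```

Either way the network outputs a posterior of the same dimensionality; the variant simply receives extra (redundant, noisy) inputs that training can learn to down-weight.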
Referenced code: contrastive-aif/world_model.py, line 129 (commit 980e386)