
Results: reconstruction vs. compactness -- KL vs. dim plot #68

Open · 3 tasks

donovanr opened this issue Jul 7, 2020 · 0 comments
donovanr commented Jul 7, 2020

Issue summary

As beta changes, plot per-dimension KL vs. sorted dimension index and show what happens (presumably the curves become more peaked).

Details

Another feature of the reconstruction vs. compactness trade-off explored by tuning beta should, I think, be reflected in the amount of variance captured by each of the (ordered) latent space dimensions. At some point I saw a plot of variance vs. dimension that showed a pretty strong fall-off over the first few dimensions, flattening out to some kind of noise baseline at around dimension 40. I think that as beta is increased, the fall-offs in these curves should get steeper and steeper, since the model is pushed to concentrate information in a sparser latent space. It would be good to include these curves in the figure, or at the very least in the supplement, and comment on the utility of tuning beta for the purposes of dimensionality reduction. A rough plotting sketch is below.
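As a starting point, here is a minimal sketch of the plot. It assumes we already have, for each trained beta, arrays of posterior means and log-variances from running the encoder over a held-out set (the `encodings` dict below is hypothetical, as is the `per_dim_kl` helper); the per-dimension KL is the closed-form Gaussian KL against the N(0, I) prior, averaged over samples, then sorted so curves are comparable across models:

```python
import numpy as np
import matplotlib.pyplot as plt

def per_dim_kl(mu, logvar):
    """Mean KL per latent dimension for a diagonal-Gaussian posterior
    q(z|x) = N(mu, exp(logvar)) against a N(0, I) prior.

    mu, logvar: arrays of shape (n_samples, n_latent_dims).
    Returns an array of shape (n_latent_dims,).
    """
    kl = 0.5 * (mu**2 + np.exp(logvar) - logvar - 1.0)
    return kl.mean(axis=0)

# encodings: hypothetical dict mapping beta -> (mu, logvar) arrays,
# collected by running each trained model's encoder over a held-out set.
fig, ax = plt.subplots()
for beta, (mu, logvar) in sorted(encodings.items()):
    kl = per_dim_kl(mu, logvar)
    # Sort dimensions by descending KL so the fall-off shape is comparable.
    ax.plot(np.sort(kl)[::-1], label=f"beta = {beta}")
ax.set_xlabel("latent dimension (sorted by KL)")
ax.set_ylabel("mean KL (nats)")
ax.legend()
plt.show()
```

If the hypothesis above holds, higher-beta curves should drop to the noise baseline over fewer dimensions.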

TODO

  • code
  • manuscript text
  • figure
@donovanr donovanr self-assigned this Jul 7, 2020
@donovanr donovanr changed the title reconstruction vs. compactness -- KL vs. dim plot Results: reconstruction vs. compactness -- KL vs. dim plot Jul 8, 2020