
Parallel training of VAE and AAE? #170

Open
microbiomix opened this issue May 3, 2023 · 2 comments

Comments

@microbiomix

Hello,

Would it make sense to allow the VAE and AAE to be trained in parallel on request? They currently run one after the other, but our server with two GPUs could handle them in parallel. Any thoughts?
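
For illustration, a minimal sketch of the idea, assuming two CUDA devices are visible; the training functions are placeholders, not avamb's actual API:

```python
# Rough sketch only (not avamb's real code): launch two hypothetical
# training functions concurrently, one pinned to each GPU.
import multiprocessing as mp

def train_vae(device: str) -> None:
    # Placeholder: the real VAE training loop would move its model
    # and batches to `device` (e.g. "cuda:0") before training.
    print(f"training VAE on {device}")

def train_aae(device: str) -> None:
    # Placeholder: likewise for the AAE on the second GPU.
    print(f"training AAE on {device}")

if __name__ == "__main__":
    # CUDA generally requires the "spawn" start method in subprocesses.
    ctx = mp.get_context("spawn")
    jobs = [
        ctx.Process(target=train_vae, args=("cuda:0",)),
        ctx.Process(target=train_aae, args=("cuda:1",)),
    ]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
```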

Best,
Mani

@simonrasmu
Collaborator

Hi Mani, very good point. I agree that they could, in principle, run in parallel.

Best
Simon

@microbiomix
Author

The other option is to run the VAE and AAE models separately but in parallel, e.g. via Snakemake, and dereplicate downstream. If the behavior of the combined aae-vae run is not expected to differ from running them separately (aside from non-reproducibility due to random seeds etc.), perhaps that keeps things simple for the avamb code itself? If so, we can close this issue or reduce its priority to -Infinity.
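
For concreteness, a minimal Snakefile sketch of that workflow; the shell commands and output paths are placeholders, not the actual vamb/avamb CLI:

```python
# Sketch of a Snakemake workflow: the VAE and AAE runs are independent
# rules, so Snakemake can schedule them in parallel; dereplication
# depends on both. All shell commands are placeholders.
rule all:
    input: "derep/.done"

rule vae:
    output: touch("vae_out/.done")
    shell: "run_vae --outdir vae_out"   # placeholder command

rule aae:
    output: touch("aae_out/.done")
    shell: "run_aae --outdir aae_out"   # placeholder command

rule dereplicate:
    input: "vae_out/.done", "aae_out/.done"
    output: touch("derep/.done")
    shell: "derep_bins vae_out aae_out derep"  # placeholder command
```

Invoking it with `snakemake --cores 2` would then let the vae and aae rules execute concurrently before dereplicating.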
