
Parallel Training #12
Open
elrodrigues opened this issue Sep 18, 2023 · 1 comment
Assignees: elrodrigues
Labels: enhancement (New feature or request)

Comments

@elrodrigues (Collaborator)
Extend trainer/runner/environment from #11 for parallel training.

elrodrigues added the enhancement (New feature or request) label on Sep 18, 2023
elrodrigues self-assigned this on Sep 18, 2023
@elrodrigues (Collaborator, Author) commented Sep 18, 2023

On second thought, parallel training may be achieved without #11 by wrapping our current environment in a 'pool' wrapper.

This pool would have a manager, or a cron job triggered by time or episode count, that periodically soft-syncs the worker models into a 'master' model to rapidly accumulate experience, assuming all jobs are normalized identically. The master model would then be distributed back to the env-threads in the pool for further (distributed) training.
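A minimal sketch of what that pool manager could look like, assuming PyTorch-style models with `state_dict()`/`load_state_dict()`. The names `soft_sync`, `tau`, `sync_every`, and `train_one_episode` are hypothetical and not part of the current codebase; in practice each worker would run in its own thread/process, whereas this sketch loops over them serially to show only the sync logic.

```python
import copy
import torch


def soft_sync(master, workers, tau=0.5):
    """Blend the averaged worker weights into the master (soft sync),
    then distribute the updated master weights back to every worker."""
    with torch.no_grad():
        for name, p_master in master.state_dict().items():
            # Average this parameter/buffer across all workers in the pool.
            worker_mean = torch.stack(
                [w.state_dict()[name].float() for w in workers]
            ).mean(dim=0)
            # Soft update: master <- (1 - tau) * master + tau * worker_mean
            p_master.copy_((1.0 - tau) * p_master.float() + tau * worker_mean)
    for w in workers:
        w.load_state_dict(master.state_dict())


def run_pool(make_model, num_workers=4, episodes=1000, sync_every=10, tau=0.5):
    """Pool manager: each worker trains on its own environment copy; every
    `sync_every` episodes the manager soft-syncs workers into the master."""
    master = make_model()
    workers = [copy.deepcopy(master) for _ in range(num_workers)]
    for episode in range(1, episodes + 1):
        for w in workers:
            train_one_episode(w)  # hypothetical per-worker rollout + update
        if episode % sync_every == 0:
            soft_sync(master, workers, tau)
    return master
```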
