
Implement multi-GPU LB #5007

Merged: 2 commits merged into espressomd:python on Nov 5, 2024
Conversation

@jngrad (Member) commented on Oct 30, 2024

Description of changes:

  • implement GPU PackInfo classes for inter-GPU ghost-layer communication (see the usage sketch after this list)
    • weak scaling efficiency is 75% at 16 GPUs on the Ant cluster
  • replace manual patches with automatic code patching
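
A minimal usage sketch of how a multi-GPU LB simulation might be driven from the Python interface once this change is in place. The class name `LBFluidWalberlaGPU`, its parameters (`agrid`, `density`, `kinematic_viscosity`, `tau`) and the `system.lb` attachment follow the ESPResSo development-branch API and are assumptions here, not something introduced or confirmed by this PR; the domain decomposition itself is handled by MPI, with one GPU per rank.

```python
# Hypothetical sketch: multi-GPU LB run, one MPI rank per GPU.
# Launch with e.g.:  mpiexec -n 4 python multi_gpu_lb.py
# Class/parameter names are taken from the development-branch Python API
# and should be treated as assumptions, not the exact interface of this PR.

import espressomd
import espressomd.lb

system = espressomd.System(box_l=[64., 64., 64.])
system.time_step = 0.01
system.cell_system.skin = 0.4

# Each MPI rank owns one subdomain; ghost-layer exchange between GPU
# subdomains is what the new GPU PackInfo classes handle on the waLBerla side.
lbf = espressomd.lb.LBFluidWalberlaGPU(
    agrid=1., density=1., kinematic_viscosity=1., tau=0.01)
system.lb = lbf

system.integrator.run(100)
```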

@jngrad jngrad added Core Improvement waLBerla Issues regarding waLBerla integration labels Oct 30, 2024
@jngrad jngrad added this to the ESPResSo 4.3.0 milestone Oct 30, 2024
@jngrad jngrad marked this pull request as ready for review October 30, 2024 23:40
@jngrad jngrad requested a review from RudolfWeeber October 30, 2024 23:41
@jngrad jngrad added the automerge Merge with kodiak label Nov 5, 2024
@kodiakhq kodiakhq bot merged commit 8ca8f10 into espressomd:python Nov 5, 2024 (10 checks passed)
@jngrad jngrad deleted the multigpu branch November 5, 2024 17:07
Labels
automerge (Merge with kodiak), Core, Improvement, waLBerla (Issues regarding waLBerla integration)
2 participants