
Porting to PArrays v0.3 #114

Merged · merged 56 commits into master from partitioned_arrays_v0.3 on Aug 15, 2023
Conversation

@amartinhuertas (Member) commented Jun 30, 2023

Work in progress ...

  • Documentation.
  • Bump GridapDistributed version to v0.3.
  • Fix PeriodicBoundaryConditions test. See issue “Issue with Periodic BCs after porting to PArrays 0.3” #119.
  • Run all tests with MPI.
  • Mock test for RedistributeGlue. We can perhaps manufacture a simple test from the packages that currently leverage RedistributeGlue, copying the data fields of the structures in a hard-coded way.
  • Add RedistributeTools.jl from GridapSolvers (actually from GridapP4est) and create mock tests
  • Had to add the functions fetch_ghost_values_cache and fetch_ghost_values! to GridapDistributed because exchange! in PArrays no longer provides the previous functionality. These functions could be avoided by using PVectors all the way through; which approach is more appropriate has to be decided on a case-by-case basis. UPDATE: we decided not to use PVector all the way through but to keep these caches (see the sketch after this list).
  • What else?
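
Regarding the fetch_ghost_values! point above, here is a minimal sketch of what the "PVector all the way through" alternative looks like under the PartitionedArrays v0.3 API. It uses only PArrays names (uniform_partition, pzeros, own_values, consistent!); it is not GridapDistributed's actual implementation, and the toy partition and values are made up for illustration:

```julia
using PartitionedArrays

# Sequential mock of np parts (the same code runs under MPI
# by distributing the ranks instead).
np = 3
ranks = LinearIndices((np,))
row_partition = uniform_partition(ranks, 12)
v = pzeros(row_partition)

# Each part writes its own entries ...
map(own_values(v), ranks) do own, rank
  own .= rank
end

# ... and ghost entries are then fetched from their owners.
# In v0.3, consistent! returns a task that must be waited on.
consistent!(v) |> wait
```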

Left for another future PR:

  • Add FESpaces.jl (and associated tools) from GridapP4est and create mock tests

@amartinhuertas amartinhuertas marked this pull request as draft June 30, 2023 05:18
@amartinhuertas (Member, Author) commented:

For the record, the error with the periodic BCs is here: https://github.com/gridap/GridapDistributed.jl/actions/runs/5419584873/jobs/9852831016#step:7:16

@codecov-commenter commented Jul 2, 2023

Codecov Report

Merging #114 (e693b85) into master (2ed5353) will not change coverage.
The diff coverage is 0.00%.


@@           Coverage Diff           @@
##           master    #114    +/-   ##
=======================================
  Coverage    0.00%   0.00%            
=======================================
  Files          10      11     +1     
  Lines        1694    1982   +288     
=======================================
- Misses       1694    1982   +288     
Files Changed                   Coverage          Δ
src/Adaptivity.jl               0.00%  <0.00%>   (ø)
src/Algebra.jl                  0.00%  <0.00%>   (ø)
src/CellData.jl                 0.00%  <0.00%>   (ø)
src/DivConformingFESpaces.jl    0.00%  <0.00%>   (ø)
src/FESpaces.jl                 0.00%  <0.00%>   (ø)
src/Geometry.jl                 0.00%  <0.00%>   (ø)
src/MultiField.jl               0.00%  <0.00%>   (ø)
src/TransientFESpaces.jl        0.00%  <0.00%>   (ø)
src/Visualization.jl            0.00%  <0.00%>   (ø)


@amartinhuertas (Member, Author) commented:

> Run all tests with MPI.

For the record, I ran all the MPI tests on my local machine and, as expected, only the periodic BCs test failed.
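For reference, a sketch of how such MPI test drivers can be launched locally via MPI.jl. The driver path test/mpi/runtests.jl is an assumed placeholder, not necessarily this repository's actual entry point:

```julia
using MPI

# MPI.mpiexec() returns the launcher command of the MPI installation
# used by MPI.jl (v0.20 API); run a test driver on 4 ranks with it.
run(`$(MPI.mpiexec()) -n 4 $(Base.julia_cmd()) --project=. test/mpi/runtests.jl`)
```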

JordiManyer and others added 27 commits August 7, 2023 09:37. Among them:

  • This allows access to fine and coarse comms from within the redistribution routines. Got rid of the GridapDistributed void structures which were created for that same purpose.
  • Bugfix: periodic bcs not working
  • Removed Manifests
@amartinhuertas amartinhuertas marked this pull request as ready for review August 15, 2023 04:45
@amartinhuertas amartinhuertas merged commit 100f959 into master Aug 15, 2023
6 checks passed
@JordiManyer JordiManyer deleted the partitioned_arrays_v0.3 branch September 7, 2023 02:00