
Commit

fix #19, update ex10
jmorice91 committed Nov 22, 2024
1 parent 7334107 commit 1875b7b
Showing 4 changed files with 85 additions and 56 deletions.
49 changes: 31 additions & 18 deletions README.md
@@ -379,42 +379,55 @@ of processes used. You can easily check if the files are the same by running:

### Ex10. Post-processing the data in python

\attention
You need a version of %PDI built with the \ref pycall_plugin "Pycall plugin" to do this exercise.

In this exercise, you will once again modify only the YAML file and use Python
to post-process the data in situ before writing it to HDF5.
Here, you will write the square root of the raw data to HDF5 instead of the
data itself.

* Examine the YAML file and compile the code.

* Load the \ref pycall_plugin "Pycall plugin" and enable this plugin when the `loop` event is triggered.

Some variables of the Python script inside `ex10.yml` are not defined.
The `with` directive of this plugin lets you specify input variables (parameters) to pass to Python as a set of "$-expressions".
These parameters can be given in multiple blocks.

* Add a `with` block with the missing parameter to let the Python code process
the data exposed in `main_field` for the `loop` event.

* Use the `exec` keyword of the \ref pycall_plugin "Pycall plugin" and uncomment the Python script.
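
A minimal sketch of what the resulting `pycall` section might look like, placed alongside the existing Decl'HDF5 configuration, is given below. `iter_id: $ii` is the parameter already present in `ex10.yml`; the `TODO` comment marks the missing "$-expression" that you still have to add:
```yaml
pycall:
  on_event:
    loop:
      with: { iter_id: $ii }   # TODO: add the missing "$-expression" passing the data exposed as main_field
      exec: |
        import numpy as np
        if 0 < iter_id < 4:
            transformed_field = np.sqrt(source_field[1:-1,1:-1])
            pdi.expose('transformed_field', transformed_field, pdi.OUT)
```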

Notice that the Decl'HDF5 configuration was simplified: no memory selection is
applied, and the `when` condition disappeared because it is now handled in the Python script:
```python
if 0 < iter_id < 4:
    transformed_field = np.sqrt(source_field[1:-1,1:-1])
    pdi.expose('transformed_field', transformed_field, pdi.OUT)
```
The last line of the Python script exposes the transformed field to %PDI; this data is thus known to %PDI within this call.

* Modify the Decl'HDF5 configuration to write the new data `transformed_field` exposed from Python.

\attention
The dataset name is, however, specified explicitly now because it no longer matches
the %PDI variable name: you will instead write a new variable exposed from Python.
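
For reference, the solution file `solutions/ex10.yml` included in this commit writes the Python-exposed variable along these lines, with the dataset name given explicitly and the dataset selection kept unchanged:
```yaml
write:
  transformed_field:        # the data exposed from Python
    dataset: main_field     # name of the dataset in ex10.h5
    dataset_selection:
      size: [1, '$dsize[0]-2', '$dsize[1]-2']
      start: ['$ii-1', '$pcoord[0]*($dsize[0]-2)', '$pcoord[1]*($dsize[1]-2)']
```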

The `pycall` section has been added to load the
\ref pycall_plugin "Pycall plugin".
It executes the provided code when the "loop" event is triggered.
The `with` section specifies the variables (parameters) to pass to Python as a
set of "$-expressions".
The provided code again exposes its result to %PDI and multiple blocks can be
chained this way.

* Add the missing parameter to the `with` block to let the Python code process
the data exposed in `main_field`.

* Modify the Decl'HDF5 configuration to write the new data exposed from Python.

You should be able to match the expected output described in `ex10.h5dump`. You can easily check if the files are the same by running:
```bash
diff ex10.h5dump <(h5dump ex10*.h5)
```
To see your `h5` file in a readable format, you can check the section [Comparison with the `h5dump` command](#h5comparison).

\warning
If you relaunch the executable, remember to delete your old `ex10.h5` file first; otherwise the data will not be updated.

\attention
In a more realistic setup, one would typically not write much code in the YAML
file directly, but would instead call functions specified in a `.py` file on the side.

## What next ?

52 changes: 31 additions & 21 deletions ex10.c
@@ -29,7 +29,6 @@
#include <time.h>

#include <paraconf.h>
// load the PDI header
#include <pdi.h>

/// size of the local data as [HEIGHT, WIDTH] including ghosts & boundary constants
@@ -44,13 +43,32 @@ int pcoord[2];
/// the alpha coefficient used in the computation
double alpha;

double L=1.0;                            // edge length of the global (physical) domain
double source1[4]={0.4, 0.4, 0.2, 100};  // first heat source: y position, x position, radius, value
double source2[4]={0.7, 0.8, 0.1, 200};  // second heat source: y position, x position, radius, value

/** Initialize the data to 0 everywhere, then set the cells inside the two circular sources (source1, source2) to their respective values
* \param[out] dat the local data to initialize
*/
void init(double dat[dsize[0]][dsize[1]])
{
for (int yy=0; yy<dsize[0]; ++yy) for (int xx=0; xx<dsize[1]; ++xx) dat[yy][xx] = 0;
if ( pcoord[1] == 0 ) for (int yy=0; yy<dsize[0]; ++yy) dat[yy][0] = 1000000;
double dy = L / ((dsize[0]-2) *psize[0]) ;
double dx = L / ((dsize[1]-2) *psize[1]) ;

double cpos_x,cpos_y;
for(int yy=0; yy<dsize[0];++yy) {
cpos_y=(yy+pcoord[0]*(dsize[0]-2))*dy-0.5*dy;
for(int xx=0; xx<dsize[1];++xx) {
cpos_x=(xx+pcoord[1]*(dsize[1]-2))*dx-0.5*dx;
if((cpos_y-source1[0])*(cpos_y-source1[0]) + (cpos_x-source1[1])*(cpos_x-source1[1]) <= source1[2]*source1[2]) {
dat[yy][xx] = source1[3];
}
if((cpos_y-source2[0])*(cpos_y-source2[0]) + (cpos_x-source2[1])*(cpos_x-source2[1]) <= source2[2]*source2[2]) {
dat[yy][xx] = source2[3];
}
}
}
}

/** Compute the values at the next time-step based on the values at the current time-step
@@ -60,21 +78,15 @@ void init(double dat[dsize[0]][dsize[1]])
void iter(double cur[dsize[0]][dsize[1]], double next[dsize[0]][dsize[1]])
{
int xx, yy;
for (xx=0; xx<dsize[1]; ++xx) next[0][xx] = cur[0][xx];
for (yy=1; yy<dsize[0]-1; ++yy) {
next[yy][0] = cur[yy][0];
for (xx=1; xx<dsize[1]-1; ++xx) {
next[yy][xx] =
(1.-4.*alpha) * cur[yy][xx]
+ alpha * ( cur[yy][xx-1]
+ cur[yy][xx+1]
+ cur[yy-1][xx]
+ cur[yy+1][xx]
);
next[yy][xx] = (1.-4.*alpha) * cur[yy][xx]
+alpha * ( cur[yy][xx-1]
+ cur[yy][xx+1]
+ cur[yy-1][xx]
+ cur[yy+1][xx]);
}
next[yy][dsize[1]-1] = cur[yy][dsize[1]-1];
}
for (xx=0; xx<dsize[1]; ++xx) next[dsize[0]-1][xx] = cur[dsize[0]-1][xx];
}

/** Exchanges ghost values with neighbours
@@ -87,7 +99,7 @@ void exchange(MPI_Comm cart_comm, double cur[dsize[0]][dsize[1]])
int rank_source, rank_dest;
static MPI_Datatype column, row;
static int initialized = 0;

if ( !initialized ) {
MPI_Type_vector(dsize[0]-2, 1, dsize[1], MPI_DOUBLE, &column);
MPI_Type_commit(&column);
@@ -104,8 +116,8 @@ void exchange(MPI_Comm cart_comm, double cur[dsize[0]][dsize[1]])

// send up
MPI_Cart_shift(cart_comm, 0, -1, &rank_source, &rank_dest);
MPI_Sendrecv(&cur[1][1], 1, row, rank_dest, 100, // send column after ghost
&cur[dsize[0]-1][1], 1, row, rank_source, 100, // receive last column (ghost)
MPI_Sendrecv(&cur[1][1], 1, row, rank_dest, 100, // send row after ghost
&cur[dsize[0]-1][1], 1, row, rank_source, 100, // receive last row (ghost)
cart_comm, &status);

// send to the right
@@ -162,7 +174,7 @@ int main( int argc, char* argv[] )
dsize[1] = global_size[1]/psize[1] + 2;

// create a 2D Cartesian MPI communicator & get our coordinate (rank) in it
int cart_period[2] = { 0, 0 };
int cart_period[2] = { 1, 1 };
MPI_Comm cart_comm; MPI_Cart_create(main_comm, 2, psize, cart_period, 1, &cart_comm);
MPI_Cart_coords(cart_comm, pcoord_1d, 2, pcoord);

@@ -178,11 +190,9 @@ int main( int argc, char* argv[] )
int ii=0;

// share useful configuration bits with PDI
PDI_expose("ii", &ii, PDI_OUT);
PDI_expose("pcoord", pcoord, PDI_OUT);
PDI_expose("dsize", dsize, PDI_OUT);
PDI_expose("psize", psize, PDI_OUT);
PDI_expose("main_field", cur, PDI_OUT);

// the main loop
for (; ii<10; ++ii) {
@@ -197,14 +207,14 @@ int main( int argc, char* argv[] )

// exchange data with the neighbours
exchange(cart_comm, next);

// swap the current and next values
double (*tmp)[dsize[1]] = cur; cur = next; next = tmp;
}
// finally share the main field as well as the loop counter after the loop
PDI_multi_expose("finalization",
"main_field", cur, PDI_OUT,
"ii", &ii, PDI_OUT,
"main_field", cur, PDI_OUT,
NULL);

// finalize PDI
31 changes: 17 additions & 14 deletions ex10.yml
@@ -21,17 +21,20 @@ pdi:
datasets:
main_field: { type: array, subtype: double, size: [ 3, '$psize[0]*($dsize[0]-2)', '$psize[1]*($dsize[1]-2)' ] }
write:
TODO: # the name of the PDI data to write
dataset: main_field
dataset_selection:
size: [1, '$dsize[0]-2', '$dsize[1]-2']
start: ['$ii-1', '$pcoord[0]*($dsize[0]-2)', '$pcoord[1]*($dsize[1]-2)']
pycall:
on_event:
loop:
with: { iter_id: $ii, ... }
exec: |
import numpy as np
if 0 < iter_id < 4:
transformed_field = np.sqrt(source_field[1:-1,1:-1])
pdi.expose('transformed_field', transformed_field, pdi.OUT)
#*** the name of the PDI variable to write.
#...
dataset: main_field # name of the dataset in "ex10.h5"
dataset_selection:
size: [1, '$dsize[0]-2', '$dsize[1]-2']
start: ['$ii-1', '$pcoord[0]*($dsize[0]-2)', '$pcoord[1]*($dsize[1]-2)']
#*** load the pycall plugin and enable it for the `loop` event.
#...
#*** specify the input parameters (variables) to pass to Python as a set of "$-expressions" (the expressions must be defined in this .yml file)
#...
#*** add the `exec` keyword of the pycall plugin and uncomment the following Python script
#...
# import numpy as np
# if 0 < iter_id < 4: # iter_id: time iteration
# transformed_field = np.sqrt(source_field[1:-1,1:-1])
# pdi.expose('transformed_field', transformed_field, pdi.OUT)
## Comment: the last line exposes the transformed field to PDI; hence, the data is known to PDI within this call.
9 changes: 6 additions & 3 deletions solutions/ex10.yml
@@ -18,14 +18,16 @@ pdi:
decl_hdf5:
file: ex10.h5
communicator: $MPI_COMM_WORLD
datasets:
datasets: # list the names of the datasets in the file ex10.h5
main_field: { type: array, subtype: double, size: [ 3, '$psize[0]*($dsize[0]-2)', '$psize[1]*($dsize[1]-2)' ] }
write:
#*** the name of the PDI data to write
transformed_field:
dataset: main_field
dataset: main_field # name of the dataset in the h5 file (defined under the `datasets` directive) that we want to fill
dataset_selection:
size: [1, '$dsize[0]-2', '$dsize[1]-2']
start: ['$ii-1', '$pcoord[0]*($dsize[0]-2)', '$pcoord[1]*($dsize[1]-2)']
#*** load the pycall plugin
pycall:
on_event:
loop:
@@ -34,4 +36,5 @@ pdi:
import numpy as np
if 0 < iter_id < 4:
    transformed_field = np.sqrt(source_field[1:-1,1:-1])
    pdi.expose('transformed_field', transformed_field, pdi.OUT)
# The last line exposes the transformed field to PDI; hence, the data is known to PDI within this call.

