HOWTO define a testcase
COOLFluiD comes with a large collection of testcases, which can be found inside the plugins folders, typically under the physical model plugins (e.g. LinearAdv, Burgers, NavierStokes, MHD). Each testcase is defined by a few input files, listed below; a minimal CFcase sketch follows the list.
- CFcase file: conventionally has the extension `.CFcase` and provides all the user-defined settings and parameters for configuring a simulation.
- Interactive file: conventionally has the extension `.inter` and contains a few parameters (in the same format as in the CFcase file) which can be modified on-the-fly by the end-user while the simulation is running. The interactive file is optional.
- CFmesh file: conventionally has the extension `.CFmesh` and provides the mesh, with or without a solution, in a native COOLFluiD format which supports parallel I/O. These files can be used to start a simulation from scratch or to restart from a previous solution.
When meshes are available in formats other than CFmesh (e.g. as written by mesh generators), COOLFluiD relies on converter plugins to translate them into the native format. For the moment, the following formats are supported (an example showing how a conversion is requested from a CFcase file follows the list):
- Gmsh (`.msh`, `.SP`): mesh only;
- ANSYS GAMBIT neutral (`.neu`): mesh only;
- a few legacy formats (e.g. THOR, DPL, FAST) are also available;
- TECPLOT (`.plt` for the full domain, `.surf.plt` for the boundaries): mesh with or without solution. NOTE: an additional `.allsurf.plt` file is needed if the boundary node precision in the `.plt` file does not match the `.surf.plt` file.
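As a sketch of how a conversion is typically requested, the mesh reader in the CFcase file can be instructed to convert an external mesh on the fly. The option and converter names below (e.g. `Gambit2CFmesh`) are illustrative assumptions and should be verified against an existing testcase that reads the same format.

```
# Illustrative setup: read an ANSYS GAMBIT neutral file and convert it to CFmesh on the fly.
# The CFmesh file name is derived from the original mesh (here a .neu file with the same base name).
Simulator.SubSystem.MeshCreator = CFmeshFileReader
Simulator.SubSystem.CFmeshFileReader.Data.FileName = cylinder.CFmesh
Simulator.SubSystem.CFmeshFileReader.convertFrom = Gambit2CFmesh
```

Once the converted CFmesh file has been written, the conversion line can usually be dropped and subsequent runs can start directly from the native CFmesh file.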
Implementing a new converter by looking at the existing examples is rather straightforward, so other formats can easily be supported.