Singularity compatibility update #2079
base: dev
Conversation
Leaving this as a draft for now, as I want the change to be verified OK for Singularity 3.8/3.10 before merging to dev (and a bit more time for me to verify it at NYU, too, though it should be fine since I developed it there).
This change would not work for NU, since their bind dir is '/projects' and pwd can be any working dir.
I don't understand how binding the sif path fixes the unavailability of assets in the container; could you please explain? Also, I thought the working directory for all idmtools-platform-slurm simulations is the simulation directory, which is $pwd in the case of my proposed change. How does a user override this behavior?
I am not sure if you are using an emodpy task; my assumption is based on using an emodpy task.
@@ -23,7 +23,7 @@ do
 {% if simulation.task.command.cmd.startswith('singularity') %}
 {{simulation.task.command.cmd}}
 {% else %}
-singularity exec {{simulation.task.sif_path}} {{simulation.task.command.cmd}}
+singularity exec -B $(pwd) -B $(pwd)/../Assets --pwd $(pwd) {{simulation.task.sif_path}} {{simulation.task.command.cmd}}
@ckirkman-IDM, here are some comments for your reference:
- -B $(pwd) is not necessary, as Singularity binds the current working directory by default
- --pwd $(pwd) is not necessary, as the script already runs under $(pwd) and Singularity uses the current working directory by default
- for -B $(pwd)/../Assets
SlurmPlatform supports all kinds of tasks (not just EMODTask), so in certain cases there may be no Assets folder (remark: we may have one with the current code due to a code defect). You may consider the following instead:
-B $(pwd)/../ <your sif path>
- Workaround
This PR only solves NYU's one special case; as Sharon pointed out with the NU case, it doesn't solve the general binding cases. Instead of fixing a special case with a PR, you may use the same workaround as in Sharon's case:
task.sif_path = '-B $(pwd)/../Assets <your sif path>'
or simply
task.sif_path = '-B $(pwd)/../ <your sif path>'
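To make the mechanism explicit: the _run.sh template substitutes task.sif_path verbatim after singularity exec, so the bind option rides along with the sif path. A sketch of the rendered command (the sif path and simulation command below are placeholders, not real values):
# Sketch only: how the workaround renders once the template substitutes task.sif_path.
# <your sif path> and <simulation command> are placeholders.
singularity exec -B $(pwd)/../Assets <your sif path> <simulation command>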
- My proposal for fixing the general binding issue is as follows:
Default: idmtools binds the experiment folder in the _run.sh file
singularity exec -B {{exp_folder}} {{simulation.task.sif_path}}
Then, unlike the workaround above, we keep the binding separate from sif_path.
Approach 1:
Since we already do
task.sif_path = '<your sif path>'
we may add one more variable for binding, like
task.sig_bind = '<Singularity bindings>'
Notes
singularity exec takes many other optional parameters in addition to --bind. If needed, other options can be built into
task.sig_bind = '<user-path-binding>'
For example,
(a) with the experiment folder bound by default, it solves Clark's case automatically
(b) for Sharon's case, we can do
task.sif_path = '/projects/b1139/dtk_run_rocky_py39.sif'
task.sig_bind = '--bind /projects'
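A sketch of how the _run.sh template line might look under Approach 1 (sig_bind does not exist yet; this only illustrates the proposal):
# Hypothetical Approach 1 template line: experiment folder bound by default,
# plus whatever extra bindings the user put into the proposed task.sig_bind.
singularity exec -B {{exp_folder}} {{simulation.task.sig_bind}} {{simulation.task.sif_path}} {{simulation.task.command.cmd}}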
Approach 2:
Instead of adding variables to the task, i.e.
task.sif_path = '<user-sif-path>'
we can introduce two input parameters to suite.run(...) and experiment.run(...):
- SIF_PATH
- SIG_BIND
Let's take suite as an example (the same applies to experiment):
suite.run(..., SIF_PATH='<user-sif-path>')
If the user needs extra path bindings, we can do
suite.run(..., SIF_PATH='<user-sif-path>', SIG_BIND='<user-path-binding>')
Notes
singularity exec takes many other optional parameters in addition to --bind. If needed, other options can be built into
SIG_BIND='<user-path-binding>'
For example,
(a) with the experiment folder bound by default, it solves Clark's case automatically
(b) for Sharon's case, we can do
suite.run(..., SIF_PATH='/projects/b1139/dtk_run_rocky_py39.sif', SIG_BIND='--bind /projects')
Remark: with Approach 2, we need to consider backward compatibility.
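Under Approach 2, the final command for Sharon's case would render to roughly the following (a sketch only; SIF_PATH and SIG_BIND are proposed parameters, and the experiment folder and simulation command are placeholders):
# Hypothetical rendering for Sharon's case under Approach 2.
# <experiment folder> and <simulation command> are placeholders.
singularity exec -B <experiment folder> --bind /projects /projects/b1139/dtk_run_rocky_py39.sif <simulation command>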
Thanks for the reply, working on digesting it all :)
This update to how we use 'singularity exec' works at NYU for Singularity versions 3.1, 3.7, and 3.9. I suspect it will work for 3.8 and 3.10 (our dev cluster and NU), but this would need to be verified.