Description
This is typically what we see when a job fails:
ERROR: [susie_twas_1]: [0]:
Failed to execute Rscript /home/aw3600/.sos/97de343d7da3f0ce/susie_twas_1_0_ff982a12.R
exitcode=1, workdir=/home/mambauser/data
---------------------------------------------------------------------------
[susie_twas]: Exits with 1 pending step (susie_twas_2)
The line Rscript /home/aw3600/.sos/97de343d7da3f0ce/susie_twas_1_0_ff982a12.R is how we track the error and try to reproduce it for debugging. However, long story short, we are working with cloud computing, where /home/aw3600/.sos/ is a path inside a VM that gets destroyed after the command ends. Although it is possible to copy the entire .sos folder to a permanent AWS S3 bucket before the VM dies, syncing the whole folder is non-trivial ... all we care about is this one file, susie_twas_1_0_ff982a12.R.
I think this was brought up in a conversation once, but I don't believe we have an option for it yet -- can we specify something on the sos run interface to make these temporary scripts get saved to a given folder? I like the behavior of:
R: ..., stdout=, stderr=
which writes the stderr and stdout to where I want them. I wonder if we can add something like:
R: ..., stdout=, stderr=, debug_script="/path/to/debug/folder"
and only keep the scripts in /path/to/debug/folder when there is an issue -- and change the Failed to execute message to point to this script?
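To make the requested semantics concrete, here is a minimal, self-contained Python sketch of the behavior I have in mind (this is an illustration, not SoS internals; run_with_debug and its arguments are hypothetical names, and the demo uses the Python interpreter as a stand-in for Rscript so it can run anywhere):

```python
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

def run_with_debug(interpreter, script, debug_dir):
    """Run `interpreter script`; only on failure, copy the script to debug_dir
    and report the preserved path instead of the temporary one."""
    result = subprocess.run([interpreter, str(script)])
    if result.returncode != 0:
        debug_dir = Path(debug_dir)
        debug_dir.mkdir(parents=True, exist_ok=True)
        saved = debug_dir / Path(script).name
        shutil.copy2(script, saved)
        print(f"Failed to execute {interpreter} {saved} "
              f"(exitcode={result.returncode})")
    return result.returncode

# Demo: a generated script that always fails
with tempfile.TemporaryDirectory() as tmp:
    script = Path(tmp) / "susie_twas_1_0_ff982a12.py"
    script.write_text("import sys; sys.exit(1)\n")
    debug = Path(tmp) / "debug"
    rc = run_with_debug(sys.executable, script, debug)
    saved_exists = (debug / script.name).exists()
```

The key point is that nothing extra is written on success; the script is only preserved (and the error message re-pointed) when the step fails, so the debug folder stays small and is easy to sync to S3.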