Change the destination of where failed script is written to #1530

@gaow

This is typically what we see when a job fails:

```
ERROR: [susie_twas_1]: [0]:
Failed to execute Rscript /home/aw3600/.sos/97de343d7da3f0ce/susie_twas_1_0_ff982a12.R
exitcode=1, workdir=/home/mambauser/data
---------------------------------------------------------------------------
[susie_twas]: Exits with 1 pending step (susie_twas_2)
```

The line `Rscript /home/aw3600/.sos/97de343d7da3f0ce/susie_twas_1_0_ff982a12.R` is how we track down and try to reproduce the error for debugging. However, long story short, we are working on cloud computing where `/home/aw3600/.sos/` is a path inside a VM that is destroyed after the command ends. Although it is possible to copy the entire `.sos` folder to a permanent AWS S3 bucket before the VM dies, syncing the whole folder is non-trivial ... all we care about is this one file, `susie_twas_1_0_ff982a12.R`.

I think this was brought up in a conversation once, but I don't remember whether we have an option for it yet -- can we specify something on the `sos run` interface so that these temporary scripts are saved to a given folder? I like the behavior of:

```
R: ..., stdout=, stderr=
```

which writes the stderr and stdout to where I want them. I wonder if we can add something like:

```
R: ..., stdout=, stderr=, debug_script="/path/to/debug/folder"
```

which would keep the script in `/path/to/debug/folder` only when there is an issue -- and change the `Failed to execute` message to point to this preserved script?
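As a stopgap until such an option exists, the behavior I have in mind could be approximated by a small wrapper around the interpreter call: run the script, and on a nonzero exit code copy it into a persistent debug folder and report the preserved path instead of the ephemeral one. This is only an illustrative sketch, not SoS's actual executor; the function name `run_with_debug_copy` and its arguments are hypothetical.

```python
import shutil
import subprocess
from pathlib import Path


def run_with_debug_copy(interpreter, script_path, debug_dir):
    """Run `interpreter script_path`; if it fails, copy the script into
    debug_dir (which survives the VM teardown, e.g. a mounted S3 path)
    and raise an error that points at the preserved copy."""
    result = subprocess.run([interpreter, str(script_path)])
    if result.returncode != 0:
        debug_dir = Path(debug_dir)
        debug_dir.mkdir(parents=True, exist_ok=True)
        saved = debug_dir / Path(script_path).name
        shutil.copy2(script_path, saved)
        raise RuntimeError(
            f"Failed to execute {interpreter} {saved} "
            f"(exitcode={result.returncode})"
        )
    return result.returncode
```

On success nothing is copied, so the debug folder only accumulates the scripts that actually failed, which matches the "only keep the scripts when there is an issue" behavior requested above.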
