A common pattern on my team is to start every repo notebook with `%pip install -r ../../../requirements.txt`. A minor annoyance when using the VS Code extension's Run As Workflow On Databricks command is that this doesn't work out of the box. I suspect it's because a new notebook gets materialized to run the workflow, and that notebook isn't necessarily in the same path as the one I'm developing in.
Could we add a step to the preamble that does `os.chdir("/absolute/path/to/synced_notebook_dir")`? That way, when I do Run As Workflow On Databricks, not only am I running the code, but I'm also running it as if I were working in the notebook's remote directory, so relative paths resolve the same way.