Conversation
naglepuff
left a comment
This looks good to me. The only slightly clunky thing is that if a user wants to get to the main job via the job list, it could be difficult depending on how fast the child jobs are spawning. If they spawn too fast, it could be hard to click into the main job. This is alleviated by the fact that starting the import shows a notification the user can click to go right to the main job.
This was something David and I were talking about the other day; the same thing happens with Slicer CLI batch jobs. You can also use the type filter to quickly filter for type "DIVE Batch Process Import", which shows only those jobs. This would only be used by a system admin, so even if it is a little cumbersome I think that's okay.
This updates the importing of S3 assets to create a master job that spawns other jobs for the conversion.
Before this, if you had 1000 videos it would immediately spawn 1000 jobs. Now jobs are spawned and processed one at a time to make job management easier. If you cancel the main job, it stops adding more jobs.
- Adds a `dive_batch_postprocess` task that searches through a folder and continually spawns subsequent jobs for all data that is 'MarkedForPostProcess'.
- Updates `event.py` for asset store importing so it kicks off this main job instead of thousands of smaller jobs.

DIVE-20250918-S3BatchImport_small.mp4
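
The control flow described above can be sketched roughly as the loop below. This is an illustrative sketch only, not the actual DIVE/Girder code: the helper names (`find_marked_items`, `spawn_child_job`, `is_cancelled`) are hypothetical stand-ins for the real folder query, job spawning, and cancellation check.

```python
def run_batch_postprocess(find_marked_items, spawn_child_job, is_cancelled):
    """Process 'MarkedForPostProcess' items one at a time.

    Stops when no marked items remain, or as soon as the main job
    is cancelled (so no further child jobs are added).
    """
    processed = 0
    while not is_cancelled():
        items = find_marked_items()   # e.g. items still flagged for post-process
        if not items:
            break                     # nothing left: the batch is done
        spawn_child_job(items[0])     # spawn exactly one child job, then re-check
        processed += 1
    return processed
```

The key design point is that the parent re-queries and re-checks cancellation between every child job, which is what keeps 1000 videos from turning into 1000 simultaneous jobs.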