Expand the regression test annotations. #2501
rwest merged 11 commits into ReactionMechanismGenerator:main
Conversation
Codecov Report
@@           Coverage Diff           @@
##             main    #2501   +/-   ##
=======================================
  Coverage   48.54%   48.54%
=======================================
  Files         110      110
  Lines       30812    30812
  Branches     8054     8054
=======================================
  Hits        14959    14959
  Misses      14330    14330
  Partials     1523     1523
I think this annotation idea is not going to work for pull requests from forked repositories, only for pull requests made from branches of the official repository. Check the "Maximum access for pull requests from public forked repositories" column here. Options seem to be...
Here is a workaround: https://github.com/nyurik/auto_pr_comments_from_forks. This summary is also helpful: https://github.blog/2020-08-03-github-actions-improvements-for-fork-and-pull-request-workflows/
I will start a separate PR to implement this workaround, to keep this one in scope.
I have an idea I'm going to try |
The regression-comparison step now captures its STDOUT and STDERR, and these are used to create the annotation on the pull request. I tried the mathiasvr/[email protected] workflow, but it doesn't let us use a bash login shell, and may have other vulnerabilities, so I reverted to plain bash and redirecting the outputs to files.
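The capture-to-files approach can be sketched roughly like this (the function and file names here are hypothetical stand-ins, not the actual workflow contents):

```shell
#!/bin/bash -l
# Hedged sketch: run a stand-in for the regression-comparison step and
# capture its STDOUT and STDERR into files, instead of letting them
# scroll by (and get buried) in the CI log.
run_comparison() {
  # stand-in for the real regression comparison script
  echo "Regression test summary: all cases passed"
  echo "warning: one observable differs slightly" >&2
}
run_comparison > summary.txt 2> errors.txt
cat summary.txt
```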
To help debugging, the output that should become the PR annotation is now duplicated into the summary of the job run. Might be a good idea anyway.
Looks bad with the new system of putting the full summary there.
Blank lines needed to make subsequent blocks render as markdown.
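A sketch of why the blank lines matter when appending to the job summary (file names are illustrative; in the real workflow the destination is the runner-provided `$GITHUB_STEP_SUMMARY` file):

```shell
# Hedged sketch: append the captured summary to the job summary file,
# padded with blank lines so GitHub's markdown renderer treats the
# appended content as its own block. SUMMARY_DEST stands in for the
# "$GITHUB_STEP_SUMMARY" path that the Actions runner provides.
SUMMARY_DEST=step_summary.md
echo "### Regression test results" > summary.txt   # stand-in content
{
  echo ""            # blank line so the heading starts a fresh block
  cat summary.txt
  echo ""            # trailing blank line before whatever follows
} >> "$SUMMARY_DEST"
cat "$SUMMARY_DEST"
```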
Just trying to get the first line of a damn file.
We put the pull request number at the start of the summary file, so the other workflow knows which PR to put the comment on.
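The write side of that might look something like this (the PR number and file names are illustrative; the real workflow takes the number from the event context):

```shell
# Hedged sketch: prepend the pull request number to the summary file so
# the follow-up workflow knows which PR the comment belongs on.
PR_NUMBER=2501   # illustrative; in CI this comes from the event context
{
  echo "$PR_NUMBER"
  echo "Regression tests passed."  # stand-in for the real summary body
} > summary.txt
head -n 1 summary.txt              # first line: the PR number
```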
Sorry for pushing a couple of commits to master. If we undo that (reset the head to before I did it and force-push), those commits will become part of this pull request, which can then be merged. Anyway, I got it working (with a bunch of testing done in the relative privacy of my own fork).
(When I was debugging, I had a special workflow to test it more quickly)
JacksonBurns left a comment

Nice and succinct, well done. LGTM
Dammit. Is
The "Continuous Integration" workflow runs on a push to master (e.g. merging a PR) and on a cron job. Both of these will trigger the Annotation workflow, but neither will have uploaded a regression test summary file, so the Annotation workflow will fail.
Motivation or Problem
The results of the regression tests were largely buried in the logs of the continuous-integration workflow, rather than being nicely summarized in an annotation on the pull request.
Description of Changes
The summaries and comparisons that are generated by the CI running the regression tests are captured, and used to make the annotation.
The summary is stored as a text file, with the pull request number stored on the first line. This is then uploaded as an artifact. A new workflow runs whenever the CI workflow completes, then downloads the artifact, extracts the summary, and posts a comment. This complicated runaround is needed because workflows run by pull requests from forks do not have permission to write comments directly.
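A rough sketch of the read side in that second workflow, assuming the artifact has already been downloaded (file names and content are hypothetical):

```shell
# Hedged sketch: split the downloaded summary artifact into the PR
# number (first line) and the comment body (everything after it).
printf '2501\nRegression tests passed.\n' > summary.txt  # simulated artifact
PR_NUMBER=$(head -n 1 summary.txt)        # which PR to comment on
tail -n +2 summary.txt > comment_body.md  # the comment text itself
echo "Posting comment on PR #$PR_NUMBER"
```

The actual posting step would hand the PR number and body file to the GitHub API; because this workflow is triggered by `workflow_run` on the base repository, it runs with a token that is permitted to write comments.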
Testing
For the markdown formatting, I have run it a bunch and checked that things render correctly. (See some of the run logs, where the output is duplicated into the workflow summary.)
For the artifact and annotation scripts, I debugged and tested them on my own fork. They'll only function properly once the annotation.yml workflow ends up on the main branch - but I tested it there :)