Our new conformance.py script is awesome, but one thing I miss from the old workflow is that there's no longer any indication when diagnostics at the same location have their messages or error codes changed. That can be a useful signal of improvement or regression -- sometimes it looks like an assertion is "passing", but on closer inspection it turns out that we're actually emitting a diagnostic on that line for entirely the wrong reasons. I think it would be great to list these in the conformance.py comment, as an additional drop-down table after the existing "false positives added"/"true positives added" drop-down tables.
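For concreteness, the core check could look something like this -- a minimal sketch only, since the `Diagnostic` shape and the dict-of-location inputs here are assumptions on my part, not conformance.py's actual data model:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Diagnostic:
    """Hypothetical stand-in for however conformance.py models a diagnostic."""
    code: str
    message: str


def changed_diagnostics(old, new):
    """Given two mappings of location -> Diagnostic (e.g. keyed by
    (file, line)), return the locations present in BOTH runs where the
    error code or message differs -- i.e. same line, different reason."""
    return {
        loc: (old[loc], new[loc])
        for loc in old.keys() & new.keys()
        if old[loc] != new[loc]
    }
```

Locations that only appear in one of the two runs would still be covered by the existing "added"/"removed" tables; this only surfaces the same-location, different-diagnostic cases.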
Cc. @WillDuke if you fancy taking a look 😃 (but no pressure, of course!)