Whitespace token modernization - a* lexers - regarding #1905 #1914
Conversation
Further playing with the automation lexer wasn't very fruitful... I tried to detect the labels correctly for the following snippet:

```
#p::
Run, https://www.pygments.org/
return

^t::
Run, calc.exe
return

:*:pyg::pyg
::pygmentize::pygmentize
```

This snippet results in differing tokens as well. Anyhow, I am unable to find a quick solution. Rethinking this lexer would require some time... I'd rather drop the respective commit, stashing it for later. Maybe I'll have a better understanding of Pygments' code by then - or someone else will address this...
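The "differing tokens" mentioned above can be inspected directly with Pygments' lexing API. A minimal sketch (assuming Pygments is installed; this snippet is illustrative and not part of the PR):

```python
from pygments import lex
from pygments.lexers import AutohotkeyLexer

# A cut-down version of the snippet from the comment above.
code = (
    "#p::\n"
    "Run, https://www.pygments.org/\n"
    "return\n"
    ":*:pyg::pyg\n"
)

# lex() yields (token_type, value) pairs; diffing these pair lists
# before and after a lexer change shows exactly which tokens differ.
tokens = list(lex(code, AutohotkeyLexer()))
for token_type, value in tokens:
    print(token_type, repr(value))

# Pygments guarantees that concatenating the values reproduces the input.
assert "".join(value for _, value in tokens) == code
```

Comparing such token dumps from the old and new lexer is also how the test suite's `--update-goldens` workflow spots regressions like the one described here.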
Sounds like a plan. I'd rather have this noted as an open issue than introduce this kind of change during a cleanup -- thanks for investigating this!
Merged, thanks a lot!
This PR is a chunk of the effort (#1905) to insert the Whitespace token wherever it applies.
The automation lexer changes also include a minor fix for multiline comments (making the multiline-comment-content regex greedy).
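As background on the greediness fix, here is a standalone sketch of how greedy vs. non-greedy quantifiers change what a multiline-comment pattern consumes (a generic `/* ... */` example, not the actual lexer rule from the PR):

```python
import re

text = "/* first */ code /* second */"

# A greedy ".*" runs to the LAST "*/", swallowing the code in between...
greedy = re.search(r"/\*.*\*/", text, re.DOTALL).group()
# ...while a non-greedy ".*?" stops at the FIRST "*/".
lazy = re.search(r"/\*.*?\*/", text, re.DOTALL).group()

print(greedy)  # /* first */ code /* second */
print(lazy)    # /* first */
```

Which variant is correct depends on how the lexer rule delimits the comment terminator; the snippet only illustrates the behavioral difference the commit message refers to.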