tests: added benchmark for periph_gpio driver #7478
haukepetersen wants to merge 1 commit into RIOT-OS:master
Conversation
Nice idea, both for this test/bench and in general, i.e. I'm also in favour of having a dedicated `bench/` folder. However, IMHO the output should be adapted to be (machine) readable, e.g. CSV with columns like:
[edit/add]: or some JSON-style data format to allow for meta info too, like number of runs, test name, etc.
IMHO it does not really need to be directly machine readable (as in JSON or CSV); a common format used across the different applications would suffice. I would expect the app to be fired from some kind of pexpect script or similar in any case, and the conversion from simply applying regexes to JSON/CSV/whatever can be done in Python then...
I agree that JSON might be a bit overkill, but a simple CSV format would make parsing/conversion using Python easier, too. At least the current output is a bit chatty for my taste (i.e., printing both the complete runtime and the per-call time does not provide any additional info). Further, every unnecessary char in output strings consumes precious memory on very constrained platforms such as Arduino (i.e., Uno, Duemilanove).
I think a somewhat chatty output is actually helpful, as I see this app not only for automated test runs, but also for manual runs while optimizing code. And I don't think the parsing part is an issue here: the regex to parse the proposed output in Python is dead simple... Memory-wise, the difference is also not that big, and the applications themselves are quite simple and therefore don't need much memory in the first place, so the advantage of nicely readable lines prevails here.
+1 Maybe we write a central function, like:
... and centrally write in whatever format works best?
IMO "test" can be used for "benchmark test" as well, so for now we can stick to tests/. Prefixing with "bench_" goes a long way.
I just thought about rowing back on the dedicated benchmark application and rather extending the gpio test application with a …
I had some ideas this morning on how to optimize some of our GPIO driver implementations. To test these ideas, I created this simple benchmark application, which lets me see the impact of my changes. This seems useful in general, so I'm sharing it stand-alone.
I would think this kind of application would make sense for many parts of RIOT, probably not only periph drivers but also other modules. So maybe we can develop some kind of unified output format, which would allow us to read the output from many different benchmark applications into some kind of generic tool for e.g. regression testing and performance tracking.
I am however not sure if this kind of application should reside in the tests/ folder. I tend to think that a dedicated bench/ folder would make sense here. Any opinions on this?