tests: added benchmark for periph_gpio driver #7478

Closed
haukepetersen wants to merge 1 commit into RIOT-OS:master from haukepetersen:add_bench_periphgpio

Conversation

@haukepetersen
Contributor

I had some ideas this morning on how to optimize some of our GPIO driver implementations. For testing those ideas, I created this simple benchmark application, which allows me to see the impact of my changes. This seemed generally useful, so I am sharing it here stand-alone.

I would think this kind of application makes sense for many parts of RIOT, probably not only periph drivers but also other modules. So maybe we can develop some kind of unified output format, which would allow us to feed the output from many different benchmark applications into some kind of generic tool for e.g. regression testing and performance tracking.

I am, however, not sure if this kind of application should reside in the tests/ folder. I tend to think that a dedicated bench/ folder would make sense here. Any opinions on this?

@haukepetersen added the Area: drivers, Discussion: RFC, and Area: tests labels on Aug 16, 2017
@haukepetersen added this to the Release 2017.10 milestone on Aug 16, 2017
@smlng
Member

smlng commented Aug 16, 2017

Nice idea, both for this test/bench and in general; I'm also in favour of having a dedicated bench(mark) directory apart from tests/.

However, IMHO the output should be adapted to be (machine) readable, e.g. CSV with columns like:

func-name metric value
gpio_set runtime_us 1375004
gpio_read runtime_us 2062504

[edit/add]: or some JSON-style data format to allow for meta info too, like number of runs, test name, etc.
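A hypothetical sketch of such a JSON-style record, folding in the example numbers from the CSV above (all field names are illustrative, not an agreed format):

```json
{
  "test": "bench_periph_gpio",
  "runs": 1000000,
  "results": [
    { "func": "gpio_set",  "metric": "runtime_us", "value": 1375004 },
    { "func": "gpio_read", "metric": "runtime_us", "value": 2062504 }
  ]
}
```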

@haukepetersen
Contributor Author

IMHO it does not really need to be directly machine readable (as in JSON or CSV); a common format used across the different applications would suffice. I would expect the app to be driven by some kind of pexpect script or similar in any case, and the conversion to JSON/CSV/whatever, by simply applying regexes, can then be done in Python...

@smlng
Member

smlng commented Aug 16, 2017

I agree that JSON might be a bit overkill, but a simple CSV format would make parsing/conversion using Python easier, too.

At least the current output is a bit chatty for my taste (i.e., having both the complete runtime and the per-call time does not provide any additional info). Further, every unnecessary char in output strings consumes precious memory on very constrained platforms such as Arduino (i.e., uno, duemilanove).

@haukepetersen
Contributor Author

At least the current output is a bit chatty for my taste (i.e., having both the complete runtime and the per-call time does not provide any additional info). Further, every unnecessary char in output strings consumes precious memory on very constrained platforms such as Arduino (i.e., uno, duemilanove).

I think a slightly chatty output is actually helpful, as I see this app not only for automated test runs but also for manual runs while optimizing code. And I think the parsing part is not an issue here: the regex to parse the proposed output in Python is dead simple...

Memory-wise, I also think the difference is not that big, and the applications themselves are quite simple and therefore don't need much memory in the first place, so the advantage of nicely readable lines prevails here.

@kaspar030
Contributor

I agree that JSON might be a bit overkill, but a simple CSV format would make parsing/conversion using Python easier, too.

+1

Maybe we write a central function, like:

void bench_put_result(const char *benchmark, const char *metric, const char *value)

... and centrally write in whatever format works best?

@kaspar030
Contributor

I am, however, not sure if this kind of application should reside in the tests/ folder. I tend to think that a dedicated bench/ folder would make sense here. Any opinions on this?

IMO "test" can be used for "benchmark test", so for now, IMO we can stick to tests/. Prefixing with "bench_" goes a long way.

@haukepetersen
Contributor Author

I just thought about rowing back on the dedicated benchmark application and instead extending the gpio test application with a bench shell command (as was done some time ago for the periph_spi test application). Automatic execution of the benchmark is still possible using a pexpect script. And it would mean that the benchmark runs on a variable device definition (-> GPIO_PIN(x,y) is set at runtime), which further prevents the compiler from doing certain optimizations it could do on a constant pin definition...

@haukepetersen
Contributor Author

Opened a new PR in #8040 implementing the same code as shell command in the existing test. I like it much better, so if no one objects, I'd like to close this PR in favor of #8040.

@PeterKietzmann
Member

I'd also prefer to go with #8040 and close this one. @smlng d'accord?

@PeterKietzmann
Member

As I see it: three maintainers for #8040 over #7478 -> reason enough to close this one
