We have a wonderful mechanism (--with-pystats) for quantifying the success of many of our optimizations. However, there is currently quite a bit of friction involved in collecting that data. I think that the situation can be improved without too much difficulty.
My current wishlist regarding pyperformance runs built --with-pystats:

- A --fast option.
I also know that @markshannon has expressed a desire to have pyperformance (or maybe pyperf?) turn stats gathering on and off using 3.12's new sys utilities before and after running each benchmark, so that we're gathering stats on just the benchmarks themselves and ignoring the external benchmarking machinery. A rough sketch of what that could look like is below.
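For reference, here is a minimal sketch of that per-benchmark toggling, assuming the private sys._stats_on() / sys._stats_off() hooks that --with-pystats builds of 3.12 expose (these are private, only exist in pystats builds, and may change):

```python
import sys

def run_with_stats(benchmark, *args, **kwargs):
    """Collect pystats for a single benchmark, excluding harness overhead.

    Assumes a CPython 3.12+ interpreter built --with-pystats; the sys._stats_*
    hooks do not exist in regular builds, hence the getattr guards.
    """
    stats_on = getattr(sys, "_stats_on", None)
    stats_off = getattr(sys, "_stats_off", None)
    if stats_on is None or stats_off is None:
        # Not a pystats build: just run the benchmark normally.
        return benchmark(*args, **kwargs)
    stats_on()   # start counting events
    try:
        return benchmark(*args, **kwargs)
    finally:
        stats_off()  # stop counting before harness code runs again
```

The harness would presumably also clear accumulated stats once at startup (there is a sys._stats_clear() in pystats builds) so that interpreter startup and harness imports don't pollute the numbers.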
CC @mdboom
Individual tasks to get there: