path: root/benchtests/scripts
Age | Commit message | Author
2014-06-11 | Validate bench.out against a JSON schema | Siddhesh Poyarekar
This patch adds a JSON schema for the benchmark output file and also adds a script that validates the generated output against the schema.
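A minimal sketch of what such a validation step could look like, assuming the Python jsonschema package is available; the schema file name and the helper below are illustrative, not the actual script added by the patch:

    #!/usr/bin/env python
    # Illustrative validator sketch; the schema file name is an assumption.
    import json
    import sys

    import jsonschema


    def validate_bench(output_file, schema_file):
        """Validate a benchmark output file against a JSON schema."""
        with open(schema_file) as f:
            schema = json.load(f)
        with open(output_file) as f:
            output = json.load(f)
        # Raises jsonschema.exceptions.ValidationError if the output
        # does not conform to the schema.
        jsonschema.validate(output, schema)


    if __name__ == '__main__':
        # Usage: validate.py bench.out benchout.schema.json
        validate_bench(sys.argv[1], sys.argv[2])
        print('Validation successful')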
2014-05-26 | benchtests: Add new directive for benchmark initialization hook | Siddhesh Poyarekar
Add a new 'init' directive that specifies the name of the function to call to do function-specific initialization. This is useful for benchmarks that need to do a one-time initialization before the functions are executed.
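As an illustration only (the '##' directive syntax and the function name bench_foo_init are assumptions for this sketch, not taken from the patch), an inputs file using the new directive might look like:

    ## args: double
    ## ret: double
    ## init: bench_foo_init
    0.75
    2.5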
2014-03-29 | Detailed benchmark outputs for functions | Siddhesh Poyarekar
This patch adds an option to get detailed benchmark output for functions. Invoking the benchmarks with 'make DETAILED=1 bench' causes each benchmark program to record a mean execution time for each input it works on. This gives a more comprehensive picture of a function's performance than the single overall mean figure.
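A sketch of how the per-input means might be consumed, assuming bench.out is read as JSON; the field names 'functions', 'timings' and 'mean' are assumptions used for illustration, not a documented format:

    #!/usr/bin/env python
    # Sketch only: the field names below are assumptions for illustration.
    import json
    import sys

    with open(sys.argv[1]) as f:        # e.g. bench.out
        bench = json.load(f)

    for func, attrs in bench.get('functions', {}).items():
        timings = attrs.get('timings', [])   # per-input means (DETAILED=1)
        if timings:
            print('%s: mean %s, fastest input %s, slowest input %s'
                  % (func, attrs.get('mean'), min(timings), max(timings)))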
2014-03-29 | Make bench.out in json format | Siddhesh Poyarekar
This patch changes the output format of the main benchmark output file (bench.out) to an extensible format. I chose JSON over XML because, in addition to being extensible, it is not too verbose and has good support in Python. The significant functional change is that the timing type is now recorded as a JSON attribute rather than embedded in a string; a separate program prints out a JSON snippet stating the type of timing used (hp_timing or clock_gettime). The mean timing has also changed from iterations per unit of time to actual time per iteration.
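The exact attribute and function names are not spelled out above, so the snippet below is only a hedged illustration of the shape described: a top-level attribute recording the timing type (hp_timing or clock_gettime), and per-function records whose mean is a time per iteration. All names and values here other than the timing types are assumptions:

    {
      "timing_type": "hp_timing",
      "functions": {
        "acos": {
          "duration": 1.2e9,
          "iterations": 1000000,
          "mean": 1200
        }
      }
    }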
2014-03-24 | benchtests: Move bench.py to benchtests/scripts/ | Siddhesh Poyarekar
It makes much more sense to have all benchmarking-related scripts in a single place away from everything else.