Execute and document benchmarks reproducibly.

Project description

ReBench - Execute and Document Benchmarks Reproducibly

ReBench is a tool to run and document benchmarks. Its current focus is on benchmarking virtual machines, but it can also be used to benchmark other kinds of applications and programs.

To facilitate the documentation of benchmarks, ReBench uses a text-based configuration format. The configuration files capture all aspects of a benchmark: they describe which binary was used, which parameters were given to the benchmarks, and the number of iterations used to obtain statistically reliable results.
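To give an impression of what such a configuration might look like, here is a hypothetical sketch in YAML. The key names below (`experiments`, `benchmark_suites`, `executors`, and so on) are loosely modeled on later ReBench versions and are illustrative only; the exact schema of the 0.7.x release may differ.

```yaml
# Hypothetical ReBench-style configuration sketch (not the exact 0.7.x schema)
experiments:
    Example:
        suites:
            - MySuite
        executions:
            - MyVM

benchmark_suites:
    MySuite:
        gauge_adapter: Time          # how measurements are read from the output
        command: "Harness %(benchmark)s "
        iterations: 10               # repetitions for statistically reliable results
        benchmarks:
            - Bounce
            - Mandelbrot

executors:
    MyVM:
        path: bin                    # directory containing the binary
        executable: my-vm            # the binary that was benchmarked
```

Because the whole setup lives in one text file, the file itself serves as the documentation of the benchmark run.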

Thus, the documentation contains all benchmark-specific information needed to reproduce a benchmark run. However, it captures neither the full system configuration nor the build settings of the benchmarked binary. This information can be included as comments, but it is not captured automatically.

The data of all benchmark runs is recorded in a data file, which allows aborted benchmark runs to be continued at a later time.

The data can be exported, for instance as CSV, or visualized with box plots.
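As a sketch of what working with such a CSV export could look like, the following standalone Python snippet groups runtimes by benchmark and reports mean and standard deviation. The column names (`benchmark`, `iteration`, `runtime_ms`) and the inline sample data are assumptions for illustration, not ReBench's actual data-file layout.

```python
import csv
import io
import statistics

# Hypothetical CSV export with one row per measurement;
# the actual column layout of ReBench's data files may differ.
data = """benchmark,iteration,runtime_ms
Bounce,1,102.4
Bounce,2,98.7
Bounce,3,101.1
Mandelbrot,1,250.3
Mandelbrot,2,249.8
Mandelbrot,3,251.0
"""

def summarize(csv_text):
    """Group runtimes by benchmark and compute (mean, sample stdev)."""
    runtimes = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        runtimes.setdefault(row["benchmark"], []).append(float(row["runtime_ms"]))
    return {
        name: (statistics.mean(times), statistics.stdev(times))
        for name, times in runtimes.items()
    }

for name, (mean, stdev) in summarize(data).items():
    print(f"{name}: {mean:.1f} ms \u00b1 {stdev:.1f} ms")
```

Summaries like these are the raw material for the box plots mentioned above.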

Current Build Status

BuildStatus

Credits

Even though we do not share code with JavaStats, it was a strong inspiration for the creation of ReBench.

Furthermore, our thanks go to Travis CI for their services.

Download files


Source Distribution

ReBench-0.7.3.tar.gz (30.1 kB)
