
[T.1.1] Add Support for Custom Data Sources and Provenance Tracking #175

Open
1 of 5 tasks
smarr opened this issue Dec 4, 2021 · 0 comments
Labels
  • help-wanted: We'd like help with this issue, and are happy to provide guidance
  • Proposal: Work proposed as part of a (rejected) funding application

Comments

@smarr
Owner

smarr commented Dec 4, 2021

It would be useful to be able to track custom metrics, for instance:

  • hardware counters
  • memory usage
  • specific parts of a program's execution
  • any kind of custom metrics
  • single value metrics for builds, run jobs (overall time, binary size, ...)

ReBench can already capture such data from a single data source. However, it does not yet support reading data from different input sources, and it does not yet have a mechanism to record single values from the "build" step, which are not necessarily associated with a specific run.

Having different data sources/files is pretty common, especially when using tools to measure metrics.
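To make the intent concrete, here is a rough sketch of what such a configuration could look like. This is purely hypothetical: `data_sources` and `build_metrics` do not exist in ReBench today, and the file names, formats, and commands are made up. Only `benchmark_suites`, `gauge_adapter`, and `command` correspond to existing configuration keys.

```yaml
# Hypothetical sketch only: illustrates the kind of configuration this issue
# asks for; the data_sources and build_metrics keys do not exist in ReBench.
benchmark_suites:
  ExampleSuite:
    gauge_adapter: RebenchLog        # existing mechanism: parse run output
    command: "harness %(benchmark)s"
    data_sources:                    # proposed: additional per-run inputs
      - file: "perf-counters.csv"    # e.g. written by an external tool
        format: csv
        metrics: [instructions, cache-misses]
      - file: "memory.json"
        format: json
        metrics: [max-rss]
    build_metrics:                   # proposed: single values per build,
      - name: binary-size            # not tied to a specific run
        command: "stat -c %s ./harness"
```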

The challenge is to keep simple things simple, while making complex customization require less effort than writing a custom tool. This is key to making ReBench useful to a wider community.

The same approach needs to apply to the tracking of custom provenance information, too, for instance GPU driver versions, processor temperature, and CPU microcode revisions.
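Again purely as a hypothetical sketch, provenance capture could perhaps be configured as a set of named shell commands whose output is stored alongside the results. The `provenance` key below does not exist, and the commands are just examples of how such values can be read on Linux:

```yaml
# Hypothetical sketch only: a possible way to configure provenance capture;
# the provenance key is proposed here and does not exist in ReBench today.
experiments:
  Example:
    suites: [ExampleSuite]
    provenance:
      gpu-driver: "nvidia-smi --query-gpu=driver_version --format=csv,noheader"
      microcode:  "grep -m1 microcode /proc/cpuinfo"
      cpu-temp:   "cat /sys/class/thermal/thermal_zone0/temp"
```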

smarr added the help-wanted and Proposal labels on Dec 4, 2021
smarr moved this to WP1. Improving Benchmarking Accuracy and Result Contextualization in ReBench for a wider community on Mar 19, 2023
smarr moved this to 📋 Backlog in ReBench Summer 2024 on Jul 4, 2023