
Add perf workflow #727

Merged

Conversation

@lukaszstolarczuk (Contributor) commented on Sep 12, 2024

Add initial workflow for performance measurements.

Preview run for the main branch: https://github.com/oneapi-src/unified-memory-framework/actions/runs/10996891676/job/30531119551
Note: the manual workflow trigger can only be tested after this workflow is merged.
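For context, a performance workflow of this kind might look roughly like the sketch below. This is a hypothetical illustration, not the actual contents of the PR's performance.yml: the job names, runner labels, and CMake options (e.g. UMF_BUILD_BENCHMARKS) are assumptions; only the final ctest invocation mirrors the one visible in the preview-run log.

```yaml
# Hypothetical sketch of a minimal performance workflow.
# Names, labels, and build options are illustrative assumptions.
name: Performance

on:
  workflow_dispatch:
    inputs:
      pr_no:
        description: 'PR number to benchmark (empty = current branch)'
        required: false

permissions:
  contents: read

jobs:
  perf:
    name: Run benchmarks
    # Assumed self-hosted runner with a stable hardware configuration
    runs-on: [self-hosted, performance]
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Configure
        run: cmake -B build -DCMAKE_BUILD_TYPE=Release -DUMF_BUILD_BENCHMARKS=ON

      - name: Build
        run: cmake --build build -j $(nproc)

      - name: Run benchmarks
        # Pin to one NUMA node to reduce run-to-run variance;
        # this matches the command visible in the preview-run log.
        run: numactl -N 1 ctest -V --test-dir build/benchmark -C Release
```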

@lukaszstolarczuk lukaszstolarczuk force-pushed the add-perf-workflow branch 2 times, most recently from a4bac5c to 1be0923 Compare September 12, 2024 16:31
@lukaszstolarczuk lukaszstolarczuk marked this pull request as ready for review September 12, 2024 16:31
@lukaszstolarczuk lukaszstolarczuk requested a review from a team as a code owner September 12, 2024 16:31
@kswiecicki (Contributor) commented:

How are the measurements meant to be viewed when benchmarks are run on main and the comment is skipped?
The Run benchmarks section from your preview looks like this:

Run numactl -N 1 ctest -V --test-dir benchmark -C Release
Internal ctest changing into directory: /home/test-user/shared-actions-runner/_work/unified-memory-framework/unified-memory-framework/benchmark
UpdateCTestConfiguration  from :/home/test-user/shared-actions-runner/_work/unified-memory-framework/unified-memory-framework/benchmark/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/test-user/shared-actions-runner/_work/unified-memory-framework/unified-memory-framework/benchmark/DartConfiguration.tcl
No tests were found!!!
Test project /home/test-user/shared-actions-runner/_work/unified-memory-framework/unified-memory-framework/benchmark
Constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end

@lukaszstolarczuk lukaszstolarczuk force-pushed the add-perf-workflow branch 2 times, most recently from 09b0059 to ed398ec Compare September 23, 2024 15:02
@lukaszstolarczuk (Contributor, Author) replied:

Thanks @kswiecicki, fixed the bug with no tests being found (I was in the wrong directory).

As for results from benchmarks run on main: as we can see in UR, this scenario is rarely exercised; we mostly want results for a specific PR, and of course the logs remain available in CI.

Once we have a final version of the benchmarks, we can tweak this workflow to print more detailed results; for now we only run a basic benchmark to verify that the workflow itself works properly.
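In GitHub Actions, skipping the PR comment for runs against main is typically expressed with an `if:` guard on the comment step. The fragment below is a sketch of that pattern, not the actual step from this workflow: the step name, the `pr_no` input, and the comment body are assumptions; `actions/github-script` is a standard action for calling the GitHub REST API from a workflow.

```yaml
      - name: Post results as PR comment
        # Hypothetical guard: only comment when a PR number was supplied;
        # plain runs on main just leave the results in the job log.
        if: ${{ github.event.inputs.pr_no != '' }}
        uses: actions/github-script@v7
        with:
          script: |
            await github.rest.issues.createComment({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: Number('${{ github.event.inputs.pr_no }}'),
              body: 'Benchmark results: see the workflow run logs.',
            });
```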

Review comments on .github/workflows/performance.yml (all resolved).
@lukaszstolarczuk lukaszstolarczuk merged commit 36beed3 into oneapi-src:main Sep 30, 2024
72 checks passed
@lukaszstolarczuk lukaszstolarczuk deleted the add-perf-workflow branch September 30, 2024 18:15
4 participants