ci: use get_charms_build_with_cache.yaml workflow and use charm artefacts for individual tests #645
base: main
Conversation
* ci: use get_charms_build_with_cache.yaml workflow This workflow enables the usage of canonical/charmed-kubeflow-workflows/.github/workflows/get_charms_build_with_cache.yaml for this CI, meaning that the integration tests and publish jobs are now able to download charm artefacts and use them consistently throughout the various operations of this CI. Fixes #641
* ci, tests: pass charm artefacts to deploy and test charms This commit enables the "--charm-path" option to pass .charm artefacts to be deployed and tested at an individual level. This change enables the option to pass pre-built charms to the tests to avoid building them in the same test run. It also ensures compatibility with the build_charm.yaml reusable workflow (from canonical/data-platform-workflows). Fixes: #639
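As a hedged sketch of the flow described above (job names, the artifact name, and the tox environment below are illustrative assumptions, not taken from this PR), an integration-test job could download the pre-built artifact and hand it to the tests via `--charm-path`:

```yaml
# Hypothetical integration-test job: downloads the pre-built .charm artifact
# produced by the build job and passes it to the test suite, so the tests
# do not rebuild the charm themselves. All names here are illustrative.
integration-test:
  needs: [get-paths-and-build]
  runs-on: ubuntu-22.04
  steps:
    - uses: actions/checkout@v4
    - name: Download packed charm artifact
      uses: actions/download-artifact@v4
      with:
        name: my-charm-packed   # assumed artifact name
        path: ./artifacts
    - name: Run integration tests against the pre-built charm
      run: |
        # --charm-path skips the in-test charmcraft build
        tox -e integration -- --charm-path=./artifacts/my-charm_ubuntu-22.04-amd64.charm
```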
get-paths-and-build:
  name: Get charm paths and build with cache
  uses: canonical/charmed-kubeflow-workflows/.github/workflows/get_charms_build_with_cache.yaml@main
I'm concerned about how we will version this workflow, and whether it will be supported by data platform team as they are the ones that maintain the workflow reused in https://github.com/canonical/charmed-kubeflow-workflows/blob/main/.github/workflows/get_charms_build_with_cache.yaml, pinging @carlcsaposs-canonical
I would strongly recommend against wrapping the data-platform-workflows inside of other reusable workflows, unless there's a compelling use case for it across multiple repositories—and if there is such a compelling use case, please contact us first so we can look at upstreaming it to data-platform-workflows instead
few reasons:
- pinning a version (of the workflow) for several repos in one place can cause a lot of headaches
- there are several cases where you may need to have different versions of the workflow (in a short to medium timespan) across repos
- for example, external changes to CI (new version of build tool released, change to runner image, etc.) often require changes to data-platform-workflows, some of which are breaking. If you need to urgently release a new version of a charm, it may require you to update that repo to the new data-platform-workflows version to get the fix for the external change. If instead of just needing to update that repo, you need to do a coordinated update of all your repos (since the version is pinned in one place), it could delay the urgent release (potentially by days), since the breaking change on dpw could break another repo that is unrelated to the urgent release—and further updates to dpw might be required for that unrelated repo
- we've seen several examples of this already across the data platform repos & we would've been much worse off if all data platform repos could only be on one version of the workflows—lmk if it'd be helpful to dig up some of these examples
- this is also why the workflows do not pin a specific version of charmcraft, lxd, etc. but allow it to be specified per-repository as an input—if you're going to wrap the workflow, you should probably also wrap all of these inputs (in the past, we've run into overlapping issues that required different lxd versions across repos at one time or ci would've broken)
- also pinning a version (of the workflow) for multiple commits/branches of one repo in one place can cause a lot of headaches
- if you ever have more than one branch or need to backport a fix to a stable release (e.g. make a patch to stable without including everything in edge), it's likely that you'll need to use an old version of data-platform-workflows when working on an old branch/commit—if that old commit pins to main, you won't be able to go back and re-build an old charm—so you significantly increase the risk of patches since you can't minimize the changes from the old build
- data-platform-workflows provides both reusable workflows and python dependencies (namely pytest plugins). you're not using pytest plugins right now, but if you plan to in the future (e.g. for allure dashboards to track ci stability/speed), the pytest plugins and workflows should be versioned & updated together. (data-platform-workflows only supports using one version for all components in a repo, so if the workflow is v24.0.2, the python plugin needs to be v24.0.2 or there may be compatibility issues)
- if your python dependencies are versioned per-repo and your workflows are versioned in all repos, it's likely that these will get out of sync. for that reason, we recommend versioning them together—for example, using renovate config that ensures the workflows & python deps get updated in the same PR https://github.com/canonical/data-platform-workflows?tab=readme-ov-file#version
- this deviates significantly from our approach on data platform repos, where we call the reusable workflow directly and version it per repository
- if you have a good reason to deviate from this, I don't want to discourage you. however, it seems like this extra abstraction might cost more in the difficulty it adds than the benefit it provides
- some benefits of standardization that would be lost:
- using the same approach for versioning/updates allows knowledge sharing/transfer, especially in troubleshooting common issues
- we are able to share more tooling (e.g. parts of renovate configs) and developing future tooling is easier
- making changes to data-platform-workflows becomes more difficult—for example, for some of the breaking changes to data-platform-workflows (e.g. the charmcraft 3 migration), I make an effort to open PRs across all our repos to keep everyone up-to-date. if the reusable workflows are called directly, I'd be happy to extend that to the kubeflow repos as well, but an extra layer of abstraction would make that much more difficult, so I wouldn't be able to do that
what I'd recommend instead:
move the contents of get_charms_build_with_cache.yaml into this workflow so that build_charm.yaml is called directly & versioned in this repo
to keep data-platform-workflows up-to-date, this renovate config is recommended (which groups workflows & python updates from dpw into the same PR): https://github.com/canonical/data-platform-workflows?tab=readme-ov-file#version (also available in a preset here: https://github.com/canonical/data-platform/tree/main/renovate_presets)
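Concretely, the recommendation above could look something like the following sketch (the tag and input names are illustrative assumptions; check the data-platform-workflows documentation for the exact interface):

```yaml
# Sketch: call build_charm.yaml directly and pin it per-repository,
# rather than behind a wrapper that pins to main. The tag and input
# names below are assumptions, not taken from this PR.
build:
  name: Build charm
  uses: canonical/data-platform-workflows/.github/workflows/build_charm.yaml@v24.0.2
  with:
    path-to-charm-directory: ./charms/my-charm  # assumed input name
    cache: true                                 # assumed input name
```

With this layout, renovate can recognize the `@v24.0.2` ref as a data-platform-workflows version and keep it in sync with the python dependencies, as the linked config describes.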
I'm concerned about how we will version this workflow
Hey @NohaIhab, this workflow only wraps an action for getting charm paths and the build_charm.yaml reusable workflow, so I don't think a versioning system other than pointing to main is necessary. Were you thinking of something else? Is there a concern with it, knowing this information?
Thanks @carlcsaposs-canonical for your reply and insight, a couple comments:
I would strongly recommend against wrapping the data-platform-workflows inside of other reusable workflows, unless there's a compelling use case for it across multiple repositories
The charmed-kubeflow reusable workflow contains an action that gets charm paths and then passes them to the build_charm.yaml workflow; this way we avoid some boilerplate in multiple repositories.
pinning a version (of the workflow) for several repos in one place can cause a lot of headaches
also pinning a version (of the workflow) for multiple commits/branches of one repo in one place can cause a lot of headaches
I think this can be resolved by exposing some workflow options, starting with both the tool versions and the version of the workflow itself.
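A minimal sketch of what exposing tool-version options on the wrapper might look like (the input names and defaults here are assumptions; note also that whether the wrapped workflow's own ref can be parameterized depends on GitHub Actions' rules for `uses:`):

```yaml
# Hypothetical workflow_call interface for the wrapper, surfacing tool
# versions so individual repos can override them per-repository.
# Input names and defaults are illustrative, not taken from this PR.
on:
  workflow_call:
    inputs:
      charmcraft-snap-channel:
        description: Snap channel of the charmcraft used to build charms
        type: string
        default: latest/stable
      lxd-snap-channel:
        description: Snap channel of the lxd used by charmcraft
        type: string
        default: latest/stable
```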
data-platform-workflows provides both reusable workflows and python dependencies (namely pytest plugins). you're not using pytest plugins right now, but if you plan to in the future (e.g. for allure dashboards to track ci stability/speed), the pytest plugins and workflows should be versioned & updated together. (data-platform-workflows only supports using one version for all components in a repo, so if the workflow is v24.0.2, the python plugin needs to be v24.0.2 or there may be compatibility issues)
That's a good point. I think it can still be solved by exposing the workflow version in the wrapper, no? That way if a repository uses the pytest plugin, we'll always have to pin the version and update it in the same PR according to the update suggestions.
some benefits of standardization that would be lost:
using the same approach for versioning/updates allows knowledge sharing/transfer, especially in troubleshooting common issues
Do you think this will be lost even if we expose the version of the wf?
making changes to data-platform-worfklows becomes more difficult—for example, for some of the breaking changes to data-platform-workflows (e.g. the charmcraft 3 migration), I make an effort to open PRs across all our repos to keep everyone up-to-date—if the reusable workflows are called directly, I'd be happy to extend that to the kubeflow repos as well, but if there's an extra layer of abstraction it would make that much more difficult so I wouldn't be able to do that
For this one I'm not really able to see what the blocker would be. I have in mind #643, which changes the charms' metadata directly and makes very minimal changes to the workflows. In fact, it does not even touch the build_charm.yaml workflow, so I wonder what sort of changes there could be that would make this task any different.
In the end, the wrapper just saves us the get-charm-paths boilerplate. If we think the benefits of using it are small compared to the way the data platform team is doing it, we can change it, but I cannot find anything that is very troublesome. Some of the benefits of keeping the wrapper:
- Keep the version centralised, so one update for the 30+ repositories that we have
- Avoid the boilerplate of getting charm paths
- Any fix we have to make will happen in one repo rather than in 30+, unless we need to pin a specific version (e.g. charmcraft, lxd)—BUT we've rarely seen this; usually, if we have to pin a version in one repo, it has to be replicated in ALL repos.
thank you @DnPlas for the detailed evaluation, I'm in favour of exposing the dpw workflow version as an input to our reusable workflow. I'd definitely like to avoid boilerplate code, but it's important that we do not make it difficult for the dpw maintainers to include us in their beta testing/updates. Ideally, we would not need a wrapper at all: if dpw can be optimized for multi-charm repos, then we can use it out of the box without needing a layer with our own workflow, but I'm not sure if dpw can accommodate this. What do you think @carlcsaposs-canonical?
- Keep the version centralised, so one update for the 30+ repositories that we have
I don't think you want this—this has the issues mentioned above around multiple repos, around multiple branches/commits, and incompatibility with dpw python dependencies
discussing the idea of "exposing some workflow options, for starters both the tools versions and the version of the workflow itself":
what happens when a breaking change is made to a dpw workflow option? (e.g. the artifact-name to artifact-prefix switch) is your wrapper workflow going to support both versions or just one? if both, how are you going to handle inputs? if one, are you okay with breaking builds that use the wrapper on older branches & older commits (i.e. making it not possible to release a patch with minimal changes)?
I think it can still be solved by exposing the workflow version in the wrapper, no?
this solves some issues but not all. see above ^. also renovate would no longer be able to detect the version as a data-platform-workflows version, so you need to maintain custom config for that
For this one I'm not really able to see what the blocker would be. I have in mind #643
if it would be helpful, I can show you the tooling involved behind https://chat.canonical.com/canonical/pl/n3sok33oo3ratpystd61yeqg5c (part automated, part manual). it only makes sense because there's quite a bit of consistency between repos—if y'all have a wrapper workflow around dpw workflows, updating the automated/semi-manual parts of the process to support that would be a lot of effort & would end up not being worth the effort
how often are you adding new charms to your repos? if it's not more than once a week, imo it's not worth adding an abstraction instead of hardcoding the charm paths
from my experience with ci/cd and github workflows, the desire to avoid boilerplate here seems very penny wise, pound foolish—especially considering things like minimal changes for emergency patches/hotfixes, which I imagine y'all will need in the future (we're expecting to need significant tooling, processes, and knowledge for this on the data platform repos)
if you're dead set on not hardcoding the paths, I'd encourage you to put the path stuff into a reusable component, and call that component and build_charm.yaml directly in your charm repos—but even this is not worth it imo, unless you're adding new charms more than once a week
…ation (#648)
* use python plugin and add licenses to charms
* use charmcraft channel 3.x/edge
* use branch for quality checks workflow with charmcraft 3.x/edge
* update charmcraft channel to latest/candidate
…ows (#651)
* ci: remove wrapper around get charm paths and build with cache workflows This commit removes the extra abstraction of the get-charm-paths action and build_charm.yaml reusable workflow to avoid potential incompatibilities and sync issues with the data platform workflows.
* update workflows for dpw v29
* fix: use correct artifact name
* nit: address comments for consistency with data repos
Co-authored-by: NohaIhab <noha.ihab@canonical.com>
Use single (cached) build for integration tests & release
* Fix hardcoded path for `platforms` syntax
* Remove workflow dispatch from release.yaml (not needed)
* Add description for outputs
* Rename get charm paths job and channel output
* Pin quality checks back to main due to merging canonical/charmed-kubeflow-workflows#95
* Use stage instead of prime in charmcraft files part (#658)
Co-authored-by: Daniela Plascencia <daniela.plascencia@canonical.com>
uses: canonical/data-platform-workflows/.github/workflows/release_charm.yaml@v29.0.0
with:
  charmcraft-snap-channel: latest/candidate  # TODO: remove after charmcraft 3.3 stable release
  channel: ${{ inputs.destination_channel || needs.ci-tests.outputs.channel }}
Suggested change:
- channel: ${{ inputs.destination_channel || needs.ci-tests.outputs.channel }}
+ channel: ${{ needs.ci-tests.outputs.channel }}
follow-up to #657 (comment)
This PR merges the `KF-6684-refactor-ci-with-build` branch into `main`. These are the changes:
This workflow enables the usage of canonical/charmed-kubeflow-workflows/.github/workflows/get_charms_build_with_cache.yaml for this CI, meaning that the integration tests and publish jobs are now able to download charm artefacts and use them consistently throughout the various operations of this CI.
Fixes #641
This commit enables the "--charm-path" option to pass .charm artefacts
to be deployed and tested at an individual level.
This change enables the option to pass pre-built charms to the tests
to avoid building them in the same test run. It also ensures compatibility
with the build_charm.yaml reusable workflow (from canonical/data-platform-workflows).
Fixes: #639