I attempted to use the Artifacts Plugin for my CI/CD pipeline but encountered several issues that suggest it might not be suitable for my use case. Below is an overview of my workflow and the challenges faced:
**Workflow:**

1. **Setup Phase**
   - Create a `setup.tgz` archive with the bundle dependencies.
   - Upload `setup.tgz` to artifact storage.
2. **Build Phase**
   - Download `setup.tgz` in a subsequent build job.
   - Produce a `build.tgz` archive containing only files with specific extensions (`*.xctestrun`, `*.app`, `*.xctest`).
   - Upload `build.tgz`.
3. **Test Phase**
   - Download both `setup.tgz` and `build.tgz` in a test job.
   - Produce a `test.tgz` archive and upload it (a sketch of the full pipeline follows this list).
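For context, here is a minimal sketch of the pipeline. The script paths, archive contents, and directory names are illustrative, not our exact configuration:

```yaml
steps:
  - label: "Setup"
    key: setup
    command: |
      # Illustrative: install bundle dependencies, then archive them manually
      bundle install --path vendor/bundle
      tar -czf setup.tgz vendor/bundle
    plugins:
      - artifacts#v1.9.3:
          upload: setup.tgz

  - label: "Build"
    key: build
    depends_on: setup
    command: |
      # The plugin downloads setup.tgz before the command runs
      tar -xzf setup.tgz
      .buildkite/scripts/build.sh           # illustrative script path
      # Archive only the build products the test phase needs
      # (*.xctestrun, *.app, *.xctest)
      tar -czf build.tgz Build/Products     # illustrative directory
    plugins:
      - artifacts#v1.9.3:
          download: setup.tgz
          upload: build.tgz

  - label: "Test (unit)"
    depends_on: build
    command: |
      tar -xzf setup.tgz
      tar -xzf build.tgz
      .buildkite/scripts/test.sh --type unit
      tar -czf test.tgz test-results        # illustrative output directory
    plugins:
      - artifacts#v1.9.3:
          download:
            - setup.tgz
            - build.tgz
          upload: test.tgz
```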
**Challenges:**

**Shared `compressed` option:**

- The `compressed` option appears to be shared between the `upload` and `download` steps, preventing the use of distinct archive names.
- Using the same `compressed` value (e.g., `compressed: artifacts.tgz`) causes an error during the test job, because the agent cannot distinguish between the different artifacts that already exist in storage (see the sketch below).
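Concretely, the test job would need something like the following, but a single plugin instance only takes one `compressed` name (paths are illustrative):

```yaml
# Test job: needs to download two differently named archives and upload a
# third, but `compressed` is a single value shared by upload and download.
plugins:
  - artifacts#v1.9.3:
      download:
        - "vendor/bundle/**/*"      # came from setup.tgz
        - "Build/Products/**/*"     # came from build.tgz
      upload: "test-results/**/*"
      compressed: artifacts.tgz     # one name for all three, so it clashes
                                    # with archives already in storage
```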
**Manual Compression Issues:**

Currently, I have to compress and decompress the artifacts manually. When compressing manually, the `upload` option does not inherit the value of environment variables defined within the step, so the variable has to be set in the global (pipeline-level) environment or hardcoded. For example:
```yaml
env:
  BUNDLE_ARTIFACTS_COMPRESSED: setup.tgz
  BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED: build.tgz

unit_test: &unit_test
  label: "Test (unit)"
  command: ".buildkite/scripts/test.sh --type unit"
  env:
    TEST_ARTIFACTS_COMPRESSED: test.tgz
  plugins:
    - artifacts#v1.9.3:
        upload: $TEST_ARTIFACTS_COMPRESSED # This value will not be received by the plugin
        download:
          - $BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED # These values will be received by the plugin
          - $BUNDLE_ARTIFACTS_COMPRESSED
```
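Hoisting the variable to the pipeline-level `env` block does work, which is the workaround I'm using for now, but it defeats the point of step-scoped values:

```yaml
env:
  BUNDLE_ARTIFACTS_COMPRESSED: setup.tgz
  BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED: build.tgz
  TEST_ARTIFACTS_COMPRESSED: test.tgz   # hoisted to global scope as a workaround

unit_test: &unit_test
  label: "Test (unit)"
  command: ".buildkite/scripts/test.sh --type unit"
  plugins:
    - artifacts#v1.9.3:
        upload: $TEST_ARTIFACTS_COMPRESSED   # now resolved correctly
        download:
          - $BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED
          - $BUNDLE_ARTIFACTS_COMPRESSED
```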
**Performance Concerns:**

Compressing large files (e.g., `.app` and `.xctest` bundles) takes approximately 5-7 minutes for an input directory of ~17 GiB, which seems inefficient. I'm currently using `tar -czf <archive> [files...]` for the compression; curious to know if there is a more optimal way to perform it.
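For reference, the current step looks like the first command below; the variants after it are what I've been considering, assuming a parallel compressor (`pigz` or `zstd`) is installed on the agents, which is not the case on a stock macOS runner:

```yaml
steps:
  - label: "Build"
    command: |
      # Current: tar with built-in gzip, which is single-threaded
      tar -czf build.tgz Build/Products

      # Variant A: pipe tar through pigz (parallel gzip, same .tgz format)
      tar -cf - Build/Products | pigz > build.tgz

      # Variant B: zstd using all cores; usually much faster than gzip at a
      # comparable ratio, but produces a .tar.zst instead of a .tgz
      tar -cf - Build/Products | zstd -T0 -o build.tar.zst
```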
**Request for Assistance:**

- Is there a way to distinguish between upload and download artifacts when using the `compressed` option? (A hypothetical sketch of what I'd like to write follows this list.)
- Can the artifact upload plugin inherit environment variables defined within the step instead of requiring global scope?
- Any tips on optimizing the compression of large directories (~17 GiB) to improve efficiency?
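To make the first question concrete, this is the kind of configuration I would ideally like to write. The per-direction `compressed` keys are hypothetical syntax, not options the plugin supports today:

```yaml
# Hypothetical syntax -- not supported by the plugin today
plugins:
  - artifacts#v1.9.3:
      download:
        - path: "vendor/bundle/**/*"
          compressed: setup.tgz       # hypothetical per-download archive name
        - path: "Build/Products/**/*"
          compressed: build.tgz       # hypothetical
      upload:
        path: "test-results/**/*"
        compressed: test.tgz          # hypothetical upload-specific name
```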