Commit 48291f1

include extraction in the retrying for submodule download
The code to download larger submodules previously used retries around the `curl` invocation to handle network failures, but recent build failures showed that failures can also occur during extraction, for example when a response is terminated early. This commit moves the retry outwards, wrapping the whole download-and-extraction function in the retrying code. This means that if the extraction fails, the tarball will be re-downloaded.
1 parent 0442fba commit 48291f1
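The `retry` helper referenced in the diff is defined elsewhere in the CI scripts and is not part of this commit. A minimal sketch of how such a wrapper behaves, with a hypothetical `flaky` command that fails once and then succeeds, to illustrate why wrapping the whole function now also re-runs the download when extraction fails:

```shell
#!/bin/sh
# Sketch of a retry wrapper (assumed behavior; the real definition in the
# CI scripts may differ): run the given command, retrying on failure.
retry() {
    attempts=3
    n=0
    while true; do
        if "$@"; then
            return 0
        fi
        n=$((n + 1))
        if [ "$n" -ge "$attempts" ]; then
            echo "command failed after $attempts attempts" >&2
            return 1
        fi
        echo "command failed, retrying..." >&2
        sleep 1
    done
}

# Hypothetical stand-in for fetch_github_commit_archive: fails on the
# first invocation (simulating a truncated download breaking extraction)
# and succeeds on the second.
flaky() {
    if [ ! -f /tmp/retry_demo_marker ]; then
        touch /tmp/retry_demo_marker
        return 1
    fi
    rm -f /tmp/retry_demo_marker
    return 0
}

# Because retry wraps the whole function, the failed first attempt is
# followed by a complete re-run, not just a re-run of curl.
retry flaky && echo "succeeded"
```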

File tree

1 file changed: +5 -3 lines

src/ci/scripts/checkout-submodules.sh (+5 -3)
@@ -23,15 +23,17 @@ fi
 function fetch_github_commit_archive {
     local module=$1
     local cached="download-${module//\//-}.tar.gz"
-    retry sh -c "rm -f $cached && \
-        curl -f -sSL -o $cached $2"
+    rm -f "${cached}"
+    rm -rf "${module}"
+    curl -f -sSL -o "${cached}" "$2"
     mkdir $module
     touch "$module/.git"
     # On Windows, the default behavior is to emulate symlinks by copying
     # files. However, that ends up being order-dependent while extracting,
     # which can cause a failure if the symlink comes first. This env var
     # causes tar to use real symlinks instead, which are allowed to dangle.
     export MSYS=winsymlinks:nativestrict
+    mkdir -p "${module}"
     tar -C $module --strip-components=1 -xf $cached
     rm $cached
 }
@@ -50,7 +52,7 @@ for i in ${!modules[@]}; do
         git rm $module
         url=${urls[$i]}
         url=${url/\.git/}
-        fetch_github_commit_archive $module "$url/archive/$commit.tar.gz" &
+        retry fetch_github_commit_archive $module "$url/archive/$commit.tar.gz" &
         bg_pids[${i}]=$!
         continue
     else
