chore(CI): improve datadog reporting #58267

Merged
merged 2 commits on Nov 23, 2023
Changes from 1 commit
19 changes: 11 additions & 8 deletions .github/workflows/build_and_deploy.yml
@@ -301,12 +301,12 @@ jobs:

- name: 'check build cache status'
id: check-did-build
run: if [[ ! -z $(ls packages/next-swc/native) ]]; then echo "DID_BUILD=yup" >> $GITHUB_OUTPUT; fi
run: if [[ ! -z $(ls packages/next-swc/native) ]]; then echo "DID_BUILD=true" >> $GITHUB_OUTPUT; fi

# Trying to upload metrics for the Turbopack to datadog's CI pipeline execution
# Try to upload metrics for Turbopack to datadog's CI pipeline execution
- name: 'Collect turbopack build metrics'
id: check-turbopack-bytesize
if: ${{ steps.check-did-build.outputs.DID_BUILD == 'yup' }}
if: ${{ steps.check-did-build.outputs.DID_BUILD == 'true' }}
continue-on-error: true
run: |
mkdir -p ./turbopack-bin-size
@@ -322,7 +322,7 @@
done

- name: Upload turbopack bytesize artifact
if: ${{ steps.check-did-build.outputs.DID_BUILD == 'yup' }}
if: ${{ steps.check-did-build.outputs.DID_BUILD == 'true' }}
uses: actions/upload-artifact@v3
with:
name: turbopack-bytesize
@@ -492,23 +492,26 @@ jobs:
NEXT_SKIP_NATIVE_POSTINSTALL: 1

upload_turbopack_bytesize:
name: Upload Turbopack Bytesize trace to Datadog
name: Upload Turbopack Bytesize metrics to Datadog
runs-on: ubuntu-latest
needs: [build-native]
env:
DATADOG_API_KEY: ${{ secrets.DATA_DOG_API_KEY }}
steps:
- name: Collect bytesize traces
- name: Collect bytesize metrics
uses: actions/download-artifact@v3
with:
name: turbopack-bytesize
path: turbopack-bin-size

- name: Upload to Datadog
run: |
ls -al turbopack-bin-size
npm install -g @datadog/datadog-ci@2.14.0 @aws-sdk/property-provider@3

for filename in turbopack-bin-size/*; do
export BYTESIZE+=" --metrics $(cat $filename)"
done

echo "Reporting $BYTESIZE"
datadog-ci metric --no-fail --level pipeline $BYTESIZE

npx @datadog/datadog-ci@2.23.1 metric --no-fail --level pipeline $BYTESIZE
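
For orientation, here is a minimal local sketch of what the rewritten "Upload to Datadog" step boils down to. The flags themselves (--metrics, --no-fail, --level pipeline) come straight from the workflow above; the assumption is that each file under turbopack-bin-size/ holds a single "key:value" metric string, and the example value in the comment is made up.

```bash
#!/usr/bin/env bash
# Sketch of the "Upload to Datadog" step, runnable locally.
# Assumption: each file in turbopack-bin-size/ holds one "key:value" metric,
# e.g. "turbopack.bytesize.next-swc.linux-x64-gnu:52428800" (value made up).
set -euo pipefail

BYTESIZE=""
for filename in turbopack-bin-size/*; do
  # Accumulate one --metrics flag per collected file
  BYTESIZE+=" --metrics $(cat "$filename")"
done

echo "Reporting $BYTESIZE"
# --no-fail keeps the job green even if the metric submission fails;
# DATADOG_API_KEY must be set in the environment, as it is in the workflow.
npx @datadog/datadog-ci@2.23.1 metric --no-fail --level pipeline $BYTESIZE
```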
29 changes: 23 additions & 6 deletions .github/workflows/build_and_test.yml
@@ -22,7 +22,7 @@ env:
# canary next-swc binaries in the monorepo
NEXT_SKIP_NATIVE_POSTINSTALL: 1
DATADOG_API_KEY: ${{ secrets.DATA_DOG_API_KEY }}
DATADOG_TRACE_NEXTJS_TEST: 'true'
NEXT_JUNIT_TEST_REPORT: 'true'
DD_ENV: 'ci'
TEST_TIMINGS_TOKEN: ${{ secrets.TEST_TIMINGS_TOKEN }}
NEXT_TEST_JOB: 1
@@ -240,11 +240,28 @@ jobs:
'test-turbopack-dev',
'test-turbopack-integration',
]
uses: ./.github/workflows/build_reusable.yml
with:
skipForDocsOnly: 'yes'
uploadTestTrace: 'yes'
secrets: inherit
if: always()
runs-on: ubuntu-latest
name: report test results to datadog
steps:
- name: Download test report artifacts
id: download-test-reports
uses: actions/download-artifact@v3
with:
name: test-reports
path: test

- name: Upload test report to datadog
run: |
if [ -d ./test/test-junit-report ]; then
# Add a `test.type` tag to distinguish between turbopack and next.js runs
DD_ENV=ci npx @datadog/datadog-ci@2.23.1 junit upload --tags test.type:nextjs --service nextjs ./test/test-junit-report
fi

if [ -d ./test/turbopack-test-junit-report ]; then
# Add a `test.type` tag to distinguish between turbopack and next.js runs
DD_ENV=ci npx @datadog/datadog-ci@2.23.1 junit upload --tags test.type:turbopack --service nextjs ./test/turbopack-test-junit-report
fi

[Review comment from the PR author] is there a reason to merge the reports before uploading?

tests-pass:
needs:
34 changes: 5 additions & 29 deletions .github/workflows/build_reusable.yml
@@ -49,10 +49,6 @@ on:
required: false
description: 'if swc artifact needs uploading'
type: string
uploadTestTrace:
required: false
description: 'if test trace needs uploading'
type: string
rustCacheKey:
required: false
description: 'rustCacheKey to cache shared target assets'
@@ -73,7 +69,7 @@ env:
# canary next-swc binaries in the monorepo
NEXT_SKIP_NATIVE_POSTINSTALL: 1
DATADOG_API_KEY: ${{ secrets.DATA_DOG_API_KEY }}
DATADOG_TRACE_NEXTJS_TEST: 'true'
NEXT_JUNIT_TEST_REPORT: 'true'
DD_ENV: 'ci'
TEST_TIMINGS_TOKEN: ${{ secrets.TEST_TIMINGS_TOKEN }}
NEXT_TEST_JOB: 1
@@ -187,6 +183,7 @@ jobs:
with:
name: turbo run summary
path: .turbo/runs
if-no-files-found: ignore

- name: Upload bundle analyzer artifacts
uses: actions/upload-artifact@v3
@@ -195,33 +192,12 @@
name: webpack bundle analysis stats
path: packages/next/dist/compiled/next-server/report.*.html

- name: Upload test reports artifact
- name: Upload test report artifacts
uses: actions/upload-artifact@v3
if: ${{ inputs.afterBuild }}
if: ${{ inputs.afterBuild && always() }}
with:
name: Test trace reports
name: test-reports
path: |
test/test-junit-report
test/turbopack-test-junit-report
if-no-files-found: ignore

- name: Download test reports artifact
id: download-test-reports
uses: actions/download-artifact@v3
continue-on-error: true
if: ${{ inputs.uploadTestTrace == 'yes' && (inputs.skipForDocsOnly != 'yes' || steps.docs-change.outputs.DOCS_CHANGE == 'nope') }}
with:
name: Test trace reports
path: test

- name: Upload test trace to datadog
if: ${{ inputs.uploadTestTrace == 'yes' && (inputs.skipForDocsOnly != 'yes' || steps.docs-change.outputs.DOCS_CHANGE == 'nope') }}
continue-on-error: true
run: |
ls -al ./test
npm install -g junit-report-merger@6.0.2 @datadog/datadog-ci@2.14.0 @aws-sdk/property-provider@3
jrm ./nextjs-test-result-junit.xml "test/test-junit-report/**/*.xml"
jrm ./turbopack-test-result-junit.xml "test/turbopack-test-junit-report/**/*.xml"
# Put a separate tag for the tests with turbopack to distinguish between same test names
DD_ENV=ci datadog-ci junit upload --tags test.type:nextjs --service nextjs ./nextjs-test-result-junit.xml
DD_ENV=ci datadog-ci junit upload --tags test.type:turbopack --service nextjs ./turbopack-test-result-junit.xml
41 changes: 14 additions & 27 deletions .github/workflows/nextjs-integration-test.yml
@@ -30,7 +30,7 @@ env:
NEXT_TELEMETRY_DISABLED: 1
TEST_CONCURRENCY: 6
DATADOG_API_KEY: ${{ secrets.DATA_DOG_API_KEY }}
DATADOG_TRACE_NEXTJS_TEST: 'true'
NEXT_JUNIT_TEST_REPORT: 'true'
DD_ENV: 'ci'
# Turbopack specific customization for the test runner
TURBOPACK: 1
@@ -96,10 +96,10 @@ jobs:
# marker to parse log output, do not delete / change.
NEXT_INTEGRATION_TEST: true

- name: Upload test reports artifact
- name: Upload test report artifacts
uses: actions/upload-artifact@v3
with:
name: Test trace reports
name: test-reports
path: |
test/turbopack-test-junit-report

@@ -140,10 +140,10 @@ jobs:
env:
NEXT_INTEGRATION_TEST: true

- name: Upload test reports artifact
- name: Upload test report artifacts
uses: actions/upload-artifact@v3
with:
name: Test trace reports
name: test-reports
path: |
test/turbopack-test-junit-report

@@ -183,9 +183,9 @@ jobs:
passed-test-path-list.json
slack-payload.json

upload_test_trace:
upload_test_report:
needs: [test-dev, test-integration]
name: Upload test trace to datadog
name: Upload test report to datadog
runs-on:
- 'self-hosted'
- 'linux'
@@ -194,27 +194,14 @@

if: always()
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Download test reports artifact
- name: Download test report artifacts
id: download-test-reports
uses: actions/download-artifact@v3
with:
name: Test trace reports
path: test
- name: Upload test trace to datadog
name: test-reports
path: test/reports

- name: Upload to datadog
run: |
npm install -g @datadog/datadog-ci@2.18.1
ls -al ./test/*.xml
# We'll tag this to the Turbopack service, not the next.js
DD_ENV=ci datadog-ci junit upload --tags test.type:turbopack.daily --service Turbopack test

# For the debugging purpose, upload the test trace to github artifact.
# So we can manually analyze test reports.
- name: Upload test reports artifact
uses: actions/upload-artifact@v3
with:
name: Merged test trace reports
path: |
nextjs-test-result-junit.xml
./**/*.xml
# We'll tag this to the "Turbopack" datadog service, not "nextjs"
DD_ENV=ci npx @datadog/datadog-ci@2.23.1 junit upload --tags test.type:turbopack.daily --service Turbopack ./test/report
17 changes: 8 additions & 9 deletions .github/workflows/test_e2e_deploy.yml
@@ -50,23 +50,22 @@ jobs:
- run: RESET_VC_PROJECT=true node scripts/reset-vercel-project.mjs
name: Reset test project

- run: docker run --rm -v $(pwd):/work mcr.microsoft.com/playwright:v1.35.1-jammy /bin/bash -c "cd /work && NODE_VERSION=${{ env.NODE_LTS_VERSION }} ./scripts/setup-node.sh && corepack enable > /dev/null && DATADOG_TRACE_NEXTJS_TEST=TRUE DATADOG_API_KEY=${DATADOG_API_KEY} DD_ENV=ci VERCEL_TEST_TOKEN=${{ secrets.VERCEL_TEST_TOKEN }} VERCEL_TEST_TEAM=vtest314-next-e2e-tests NEXT_TEST_JOB=1 NEXT_TEST_MODE=deploy TEST_TIMINGS_TOKEN=${{ secrets.TEST_TIMINGS_TOKEN }} NEXT_TEST_CONTINUE_ON_ERROR=1 xvfb-run node run-tests.js --type e2e --timings -g ${{ matrix.group }}/2 -c 1 >> /proc/1/fd/1"
- run: docker run --rm -v $(pwd):/work mcr.microsoft.com/playwright:v1.35.1-jammy /bin/bash -c "cd /work && NODE_VERSION=${{ env.NODE_LTS_VERSION }} ./scripts/setup-node.sh && corepack enable > /dev/null && NEXT_JUNIT_TEST_REPORT=true DATADOG_API_KEY=${DATADOG_API_KEY} DD_ENV=ci VERCEL_TEST_TOKEN=${{ secrets.VERCEL_TEST_TOKEN }} VERCEL_TEST_TEAM=vtest314-next-e2e-tests NEXT_TEST_JOB=1 NEXT_TEST_MODE=deploy TEST_TIMINGS_TOKEN=${{ secrets.TEST_TIMINGS_TOKEN }} NEXT_TEST_CONTINUE_ON_ERROR=1 xvfb-run node run-tests.js --type e2e --timings -g ${{ matrix.group }}/2 -c 1 >> /proc/1/fd/1"
name: Run test/e2e (deploy)

- name: Upload test trace
- name: Upload test report
if: always()
uses: actions/upload-artifact@v3
with:
name: test-trace
name: test-reports
if-no-files-found: ignore
retention-days: 2
path: |
test/traces
test/test-junit-report

- name: Upload test trace to datadog
- name: Upload test report to datadog
continue-on-error: true
run: |
ls -al ./test
npm install -g junit-report-merger@6.0.2 @datadog/datadog-ci@2.14.0 @aws-sdk/property-provider@3
jrm ./nextjs-test-result-junit.xml "test/test-junit-report/**/*.xml"
DD_ENV=ci datadog-ci junit upload --tags test.type:nextjs_deploy_e2e --service nextjs ./nextjs-test-result-junit.xml
ls -al ./test/*junit

DD_ENV=ci npx @datadog/datadog-ci@2.23.1 junit upload --tags test.type:nextjs_deploy_e2e --service nextjs ./test/test-junit-report
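
Since the deploy-test step above is a single long one-liner, here is the same invocation reflowed for readability as a sketch for running it outside of GitHub Actions. NODE_LTS_VERSION, DATADOG_API_KEY, VERCEL_TEST_TOKEN, TEST_TIMINGS_TOKEN and GROUP stand in for the workflow's ${{ ... }} expressions and secrets, and the `>> /proc/1/fd/1` log redirection is dropped; the only flag this PR changes is NEXT_JUNIT_TEST_REPORT=true (replacing DATADOG_TRACE_NEXTJS_TEST=TRUE).

```bash
# Sketch only: set NODE_LTS_VERSION, DATADOG_API_KEY, VERCEL_TEST_TOKEN,
# TEST_TIMINGS_TOKEN and GROUP yourself before running.
docker run --rm -v "$(pwd):/work" mcr.microsoft.com/playwright:v1.35.1-jammy /bin/bash -c "\
  cd /work && \
  NODE_VERSION=${NODE_LTS_VERSION} ./scripts/setup-node.sh && \
  corepack enable > /dev/null && \
  NEXT_JUNIT_TEST_REPORT=true DATADOG_API_KEY=${DATADOG_API_KEY} DD_ENV=ci \
  VERCEL_TEST_TOKEN=${VERCEL_TEST_TOKEN} VERCEL_TEST_TEAM=vtest314-next-e2e-tests \
  NEXT_TEST_JOB=1 NEXT_TEST_MODE=deploy TEST_TIMINGS_TOKEN=${TEST_TIMINGS_TOKEN} \
  NEXT_TEST_CONTINUE_ON_ERROR=1 \
  xvfb-run node run-tests.js --type e2e --timings -g ${GROUP}/2 -c 1"
```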
13 changes: 7 additions & 6 deletions jest.config.js
@@ -22,14 +22,14 @@ const customJestConfig = {
},
}

// Check if the environment variable is set to enable test trace,
// Check if the environment variable is set to enable test report,
// Insert a reporter to generate a junit report to upload.
// This won't count for the retry to avoid duplicated test being reported twice
// - which means our test trace will report test results for the flaky test as failed without retry.
const shouldEnableTestTrace =
process.env.DATADOG_API_KEY && process.env.DATADOG_TRACE_NEXTJS_TEST
//
// This won't count retries to avoid tests being reported twice.
// Our test report will report test results for flaky tests as failed without retry.
const enableTestReport = !!process.env.NEXT_JUNIT_TEST_REPORT

if (shouldEnableTestTrace) {
if (enableTestReport) {
if (!customJestConfig.reporters) {
customJestConfig.reporters = ['default']
}
@@ -45,6 +45,7 @@ if (shouldEnableTestTrace) {
reportTestSuiteErrors: 'true',
uniqueOutputName: 'true',
outputName: 'nextjs-test-junit',
addFileAttribute: 'true',
},
])
}
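
To make the jest.config.js change easier to read in one piece, here is a condensed sketch of the file after this PR. Only the lines visible in the diff are reproduced; the reporter name ('jest-junit'), the options hidden in the collapsed hunk, and the final export are assumptions based on a typical jest-junit setup.

```js
// jest.config.js (condensed sketch, not the full file)
const customJestConfig = {
  /* ...existing Next.js jest configuration... */
}

// New flag introduced by this PR: emit a junit report when set.
const enableTestReport = !!process.env.NEXT_JUNIT_TEST_REPORT

if (enableTestReport) {
  if (!customJestConfig.reporters) {
    customJestConfig.reporters = ['default']
  }
  // Assumption: the pushed reporter is jest-junit; only the options shown in
  // the diff are listed here, the collapsed hunk sets a few more.
  customJestConfig.reporters.push([
    'jest-junit',
    {
      reportTestSuiteErrors: 'true',
      uniqueOutputName: 'true',
      outputName: 'nextjs-test-junit',
      // Added in this PR: include the source file attribute on each testcase.
      addFileAttribute: 'true',
    },
  ])
}

module.exports = customJestConfig // assumption: exported as in a standard jest config
```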