Bump org.apache.parquet:parquet-avro from 1.13.1 to 1.14.3 #1211

Open

wants to merge 2 commits into master from dependabot/maven/org.apache.parquet-parquet-avro-1.14.3

6baa6e6 Merge branch 'master' into dependabot/maven/org.apache.parquet-parque…

Google Cloud Build / fhir-data-pipes-pr (cloud-build-fhir) failed Nov 15, 2024 in 23m 1s

Summary

Build Information

Trigger fhir-data-pipes-pr
Build 5f0605a7-ec38-4103-8151-75f62225b668
Start 2024-11-15T11:42:03-08:00
Duration 22m12.099s
Status FAILURE

Steps

Step Status Duration
Launch HAPI Source Server SUCCESS 30.813s
Launch Sink Server Search SUCCESS 28.773s
Launch Sink Server JDBC SUCCESS 28.832s
Wait for the initial Servers Start SUCCESS 1m4.594s
Compile Bunsen and Pipeline SUCCESS 6m26.752s
Build Uploader Image SUCCESS 29.105s
Run Uploader Unit Tests SUCCESS 2.285s
Build E2E Image SUCCESS 2m21.718s
Upload to HAPI SUCCESS 1m31.178s
Build Pipeline Images SUCCESS 21.537s
Run Batch Pipeline in FHIR-search mode with HAPI source SUCCESS 57.64s
Run E2E Test for FHIR-search mode with HAPI source SUCCESS 10.012s
Run Batch Pipeline for JDBC mode with HAPI source SUCCESS 55.545s
Run E2E Test for JDBC mode with HAPI source SUCCESS 8.544s
Run Batch Pipeline for BULK_EXPORT mode with HAPI source SUCCESS 4m15.149s
Run E2E Test for BULK_EXPORT mode with HAPI source SUCCESS 7.611s
Turn down FHIR Sink Server Search SUCCESS 4.33s
Turn down FHIR Sink Server JDBC SUCCESS 6.118s
Create views database SUCCESS 732ms
Launch HAPI FHIR Sink Server Controller SUCCESS 4.138s
Bring up controller and Spark containers SUCCESS 12m55.044s
Run E2E Test for Dockerized Controller and Spark Thriftserver FAILURE 53.599s
Bring down controller and Spark containers QUEUED 33.052s
Turn down HAPI Source Server QUEUED 3m11.792s
Turn down FHIR Sink Server Controller for e2e tests QUEUED 51.261s
Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS SUCCESS 7.098s
Wait for Servers Start SUCCESS 3m55.883s
Launch Streaming Pipeline SUCCESS 1m0.644s
Run E2E Test for STREAMING, using OpenMRS Source SUCCESS 0s
Upload to OpenMRS SUCCESS 0s
Run Batch Pipeline FHIR-search mode with OpenMRS source CANCELLED 0s
Run E2E Test for FHIR-search mode with OpenMRS source QUEUED 0s
Run Batch Pipeline for JDBC mode with OpenMRS source QUEUED 0s
Run E2E Test for JDBC mode with OpenMRS source QUEUED 0s
Test Indicators QUEUED 0s
Turn down Webserver and HAPI Server QUEUED 0s

Details

starting build "5f0605a7-ec38-4103-8151-75f62225b668"

FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
Initialized empty Git repository in /workspace/.git/
From https://github.com/google/fhir-data-pipes
 * branch            6baa6e6c2c5c4145216b407bd754a23d753efb4d -> FETCH_HEAD
Updating files: 100% (978/978), done.
HEAD is now at 6baa6e6 Merge branch 'master' into dependabot/maven/org.apache.parquet-parquet-avro-1.14.3
BUILD
Starting Step #0 - "Launch HAPI Source Server"
Starting Step #7 - "Build E2E Image"
Starting Step #1 - "Launch Sink Server Search"
Starting Step #2 - "Launch Sink Server JDBC"
Starting Step #4 - "Compile Bunsen and Pipeline"
Starting Step #5 - "Build Uploader Image"
Step #0 - "Launch HAPI Source Server": Pulling image: docker/compose
Step #1 - "Launch Sink Server Search": Pulling image: docker/compose
Step #4 - "Compile Bunsen and Pipeline": Pulling image: maven:3.8.5-openjdk-17
Step #2 - "Launch Sink Server JDBC": Pulling image: docker/compose
Step #5 - "Build Uploader Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #7 - "Build E2E Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #1 - "Launch Sink Server Search": Using default tag: latest
Step #0 - "Launch HAPI Source Server": Using default tag: latest
Step #2 - "Launch Sink Server JDBC": Using default tag: latest
Step #5 - "Build Uploader Image": Sending build context to Docker daemon  1.466MB

Step #5 - "Build Uploader Image": Step 1/10 : FROM python:3.7-slim
Step #7 - "Build E2E Image": Sending build context to Docker daemon  66.43MB

Step #7 - "Build E2E Image": Step 1/14 : FROM maven:3.8.7-eclipse-temurin-17-focal
Step #0 - "Launch HAPI Source Server": latest: Pulling from docker/compose
Step #0 - "Launch HAPI Source Server": aad63a933944: Pulling fs layer
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Waiting
Step #1 - "Launch Sink Server Search": latest: Pulling from docker/compose
Step #1 - "Launch Sink Server Search": aad63a933944: Pulling fs layer
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Pulling fs layer
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Waiting
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Download complete
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Download complete
Step #4 - "Compile Bunsen and Pipeline": 3.8.5-openjdk-17: Pulling from library/maven
Step #7 - "Build E2E Image": 3.8.7-eclipse-temurin-17-focal: Pulling from library/maven
Step #0 - "Launch HAPI Source Server": aad63a933944: Verifying Checksum
Step #0 - "Launch HAPI Source Server": aad63a933944: Download complete
Step #1 - "Launch Sink Server Search": aad63a933944: Verifying Checksum
Step #1 - "Launch Sink Server Search": aad63a933944: Download complete
Step #2 - "Launch Sink Server JDBC": latest: Pulling from docker/compose
Step #2 - "Launch Sink Server JDBC": aad63a933944: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Waiting
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Download complete
Step #5 - "Build Uploader Image": 3.7-slim: Pulling from library/python
Step #0 - "Launch HAPI Source Server": aad63a933944: Pull complete
Step #2 - "Launch Sink Server JDBC": aad63a933944: Pull complete
Step #1 - "Launch Sink Server Search": aad63a933944: Pull complete
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Pull complete
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Pull complete
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pull complete
Step #7 - "Build E2E Image": 7608715873ec: Pulling fs layer
Step #7 - "Build E2E Image": 64a0b7566174: Pulling fs layer
Step #7 - "Build E2E Image": 414e25888ba9: Pulling fs layer
Step #7 - "Build E2E Image": fa1796814410: Pulling fs layer
Step #7 - "Build E2E Image": dc3ab4515b24: Pulling fs layer
Step #7 - "Build E2E Image": 495d1ae42cb9: Pulling fs layer
Step #7 - "Build E2E Image": 64a0b7566174: Waiting
Step #7 - "Build E2E Image": 66b6d86e5b33: Pulling fs layer
Step #7 - "Build E2E Image": 90062ecd5dec: Pulling fs layer
Step #7 - "Build E2E Image": 414e25888ba9: Waiting
Step #7 - "Build E2E Image": fa1796814410: Waiting
Step #7 - "Build E2E Image": dc3ab4515b24: Waiting
Step #7 - "Build E2E Image": 495d1ae42cb9: Waiting
Step #7 - "Build E2E Image": 90062ecd5dec: Waiting
Step #7 - "Build E2E Image": 66b6d86e5b33: Waiting
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Waiting
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Waiting
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Waiting
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Waiting
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Waiting
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Waiting
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Waiting
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Download complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Download complete
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Verifying Checksum
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Download complete
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Verifying Checksum
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Download complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Download complete
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Verifying Checksum
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Download complete
Step #5 - "Build Uploader Image": a803e7c4b030: Pulling fs layer
Step #5 - "Build Uploader Image": bf3336e84c8e: Pulling fs layer
Step #5 - "Build Uploader Image": 8973eb85275f: Pulling fs layer
Step #5 - "Build Uploader Image": f9afc3cc0135: Pulling fs layer
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pulling fs layer
Step #5 - "Build Uploader Image": f9afc3cc0135: Waiting
Step #5 - "Build Uploader Image": a803e7c4b030: Waiting
Step #5 - "Build Uploader Image": bf3336e84c8e: Waiting
Step #5 - "Build Uploader Image": 8973eb85275f: Waiting
Step #5 - "Build Uploader Image": 39312d8b4ab7: Waiting
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pull complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pull complete
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pull complete
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pull complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pull complete
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pull complete
Step #0 - "Launch HAPI Source Server": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #2 - "Launch Sink Server JDBC": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #1 - "Launch Sink Server Search": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #2 - "Launch Sink Server JDBC": Status: Downloaded newer image for docker/compose:latest
Step #0 - "Launch HAPI Source Server": Status: Downloaded newer image for docker/compose:latest
Step #1 - "Launch Sink Server Search": Status: Image is up to date for docker/compose:latest
Step #1 - "Launch Sink Server Search": docker.io/docker/compose:latest
Step #0 - "Launch HAPI Source Server": docker.io/docker/compose:latest
Step #2 - "Launch Sink Server JDBC": docker.io/docker/compose:latest
Step #7 - "Build E2E Image": 64a0b7566174: Verifying Checksum
Step #7 - "Build E2E Image": 64a0b7566174: Download complete
Step #7 - "Build E2E Image": 7608715873ec: Verifying Checksum
Step #7 - "Build E2E Image": 7608715873ec: Download complete
Step #7 - "Build E2E Image": fa1796814410: Download complete
Step #7 - "Build E2E Image": dc3ab4515b24: Verifying Checksum
Step #7 - "Build E2E Image": dc3ab4515b24: Download complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Verifying Checksum
Step #7 - "Build E2E Image": 66b6d86e5b33: Download complete
Step #7 - "Build E2E Image": 7608715873ec: Pull complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Verifying Checksum
Step #7 - "Build E2E Image": 495d1ae42cb9: Download complete
Step #7 - "Build E2E Image": 414e25888ba9: Verifying Checksum
Step #7 - "Build E2E Image": 414e25888ba9: Download complete
Step #7 - "Build E2E Image": 90062ecd5dec: Verifying Checksum
Step #7 - "Build E2E Image": 90062ecd5dec: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Download complete
Step #7 - "Build E2E Image": 64a0b7566174: Pull complete
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Download complete
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Download complete
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pull complete
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Download complete
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pull complete
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Download complete
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Download complete
Step #5 - "Build Uploader Image": a803e7c4b030: Verifying Checksum
Step #5 - "Build Uploader Image": a803e7c4b030: Download complete
Step #5 - "Build Uploader Image": bf3336e84c8e: Verifying Checksum
Step #5 - "Build Uploader Image": bf3336e84c8e: Download complete
Step #2 - "Launch Sink Server JDBC": Creating volume "sink-server-jdbc_hapi-data" with default driver
Step #5 - "Build Uploader Image": 8973eb85275f: Verifying Checksum
Step #5 - "Build Uploader Image": 8973eb85275f: Download complete
Step #7 - "Build E2E Image": 414e25888ba9: Pull complete
Step #7 - "Build E2E Image": fa1796814410: Pull complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Download complete
Step #0 - "Launch HAPI Source Server": Creating network "hapi-compose_default" with the default driver
Step #1 - "Launch Sink Server Search": Creating volume "sink-server-search_hapi-data" with default driver
Step #5 - "Build Uploader Image": 39312d8b4ab7: Verifying Checksum
Step #5 - "Build Uploader Image": 39312d8b4ab7: Download complete
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-fhir-db" with default driver
Step #2 - "Launch Sink Server JDBC": Pulling sink-server (hapiproject/hapi:latest)...
Step #5 - "Build Uploader Image": a803e7c4b030: Pull complete
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-server" with default driver
Step #5 - "Build Uploader Image": bf3336e84c8e: Pull complete
Step #1 - "Launch Sink Server Search": Pulling sink-server (hapiproject/hapi:latest)...
Step #7 - "Build E2E Image": dc3ab4515b24: Pull complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Pull complete
Step #5 - "Build Uploader Image": 8973eb85275f: Pull complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Pull complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Pull complete
Step #7 - "Build E2E Image": 90062ecd5dec: Pull complete
Step #7 - "Build E2E Image": Digest: sha256:ad4b34f02e52164df83182a2a05074b5288d6e6bcc2dfa0ce3d6fa43ec8b557f
Step #7 - "Build E2E Image": Status: Downloaded newer image for maven:3.8.7-eclipse-temurin-17-focal
Step #7 - "Build E2E Image":  ---> 896b49b4d0b7
Step #7 - "Build E2E Image": Step 2/14 : RUN apt-get update && apt-get install -y jq  python3.8 python3-pip
Step #0 - "Launch HAPI Source Server": Pulling db (postgres:)...
Step #2 - "Launch Sink Server JDBC": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pull complete
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pull complete
Step #1 - "Launch Sink Server Search": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": Digest: sha256:b53f496ca43e5af6994f8e316cf03af31050bf7944e0e4a308ad86c001cf028b
Step #5 - "Build Uploader Image": Status: Downloaded newer image for python:3.7-slim
Step #5 - "Build Uploader Image":  ---> a255ffcb469f
Step #5 - "Build Uploader Image": Step 2/10 : WORKDIR /uploader
Step #0 - "Launch HAPI Source Server": latest: Pulling from library/postgres
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pull complete
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pull complete
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pull complete
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pull complete
Step #7 - "Build E2E Image":  ---> Running in 6954d3a440c4
Step #5 - "Build Uploader Image":  ---> Running in 04e47dd81655
Step #4 - "Compile Bunsen and Pipeline": Digest: sha256:3a9c30b3af6278a8ae0007d3a3bf00fff80ec3ed7ae4eb9bfa1772853101549b
Step #4 - "Compile Bunsen and Pipeline": Status: Downloaded newer image for maven:3.8.5-openjdk-17
Step #4 - "Compile Bunsen and Pipeline": docker.io/library/maven:3.8.5-openjdk-17
Step #5 - "Build Uploader Image": Removing intermediate container 04e47dd81655
Step #5 - "Build Uploader Image":  ---> d144d0a443e7
Step #5 - "Build Uploader Image": Step 3/10 : COPY  ./ ./
Step #5 - "Build Uploader Image":  ---> 26df0c76b838
Step #5 - "Build Uploader Image": Step 4/10 : RUN pip install -r requirements.txt
Step #5 - "Build Uploader Image":  ---> Running in d53ddf92cb9b
Step #7 - "Build E2E Image": Get:1 http://archive.ubuntu.com/ubuntu focal InRelease [265 kB]
Step #7 - "Build E2E Image": Get:2 http://security.ubuntu.com/ubuntu focal-security InRelease [128 kB]
Step #7 - "Build E2E Image": Get:3 http://archive.ubuntu.com/ubuntu focal-updates InRelease [128 kB]
Step #7 - "Build E2E Image": Get:4 http://archive.ubuntu.com/ubuntu focal-backports InRelease [128 kB]
Step #7 - "Build E2E Image": Get:5 http://archive.ubuntu.com/ubuntu focal/restricted amd64 Packages [33.4 kB]
Step #7 - "Build E2E Image": Get:6 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 Packages [177 kB]
Step #7 - "Build E2E Image": Get:7 http://archive.ubuntu.com/ubuntu focal/universe amd64 Packages [11.3 MB]
Step #7 - "Build E2E Image": Get:8 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [1,276 kB]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Error stacktraces are turned on.
Step #4 - "Compile Bunsen and Pipeline": [INFO] Scanning for projects...
Step #7 - "Build E2E Image": Get:9 http://archive.ubuntu.com/ubuntu focal/main amd64 Packages [1,275 kB]
Step #7 - "Build E2E Image": Get:10 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [4,570 kB]
Step #7 - "Build E2E Image": Get:11 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [4,289 kB]
Step #7 - "Build E2E Image": Get:12 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [4,107 kB]
Step #7 - "Build E2E Image": Get:13 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [33.5 kB]
Step #7 - "Build E2E Image": Get:14 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1,566 kB]
Step #7 - "Build E2E Image": Get:15 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [55.2 kB]
Step #7 - "Build E2E Image": Get:16 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [28.6 kB]
Step #7 - "Build E2E Image": Get:17 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [4,137 kB]
Step #7 - "Build E2E Image": Get:18 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [30.9 kB]
Step #5 - "Build Uploader Image": Collecting google-auth
Step #5 - "Build Uploader Image":   Downloading google_auth-2.36.0-py2.py3-none-any.whl (209 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 209.5/209.5 kB 5.7 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting mock
Step #5 - "Build Uploader Image":   Downloading mock-5.1.0-py3-none-any.whl (30 kB)
Step #5 - "Build Uploader Image": Collecting requests
Step #5 - "Build Uploader Image":   Downloading requests-2.31.0-py3-none-any.whl (62 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.6/62.6 kB 8.0 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting cachetools<6.0,>=2.0.0
Step #5 - "Build Uploader Image":   Downloading cachetools-5.5.0-py3-none-any.whl (9.5 kB)
Step #5 - "Build Uploader Image": Collecting pyasn1-modules>=0.2.1
Step #5 - "Build Uploader Image":   Downloading pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.3/181.3 kB 12.3 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting rsa<5,>=3.1.4
Step #5 - "Build Uploader Image":   Downloading rsa-4.9-py3-none-any.whl (34 kB)
Step #5 - "Build Uploader Image": Collecting charset-normalizer<4,>=2
Step #5 - "Build Uploader Image":   Downloading charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (138 kB)
Step #4 - "Compile Bunsen and Pipeline": [INFO] ------------------------------------------------------------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Reactor Build Order:
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] root                                                               [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Parent                                                      [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Extension Structure Definitions                                    [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core R4                                                     [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core Stu3                                                   [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Avro                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] FHIR Analytics                                                     [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] common                                                             [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] batch                                                              [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] streaming                                                          [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] controller                                                         [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] coverage                                                           [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] Using the MultiThreadedBuilder implementation with a thread count of 32
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 138.3/138.3 kB 9.5 MB/s eta 0:00:00
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] -------------------< com.google.fhir.analytics:root >-------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building root 0.2.7-SNAPSHOT                                      [1/13]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #7 - "Build E2E Image": Fetched 33.6 MB in 4s (8,134 kB/s)
Step #5 - "Build Uploader Image": Collecting certifi>=2017.4.17
Step #5 - "Build Uploader Image":   Downloading certifi-2024.8.30-py3-none-any.whl (167 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 167.3/167.3 kB 10.1 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting urllib3<3,>=1.21.1
Step #5 - "Build Uploader Image":   Downloading urllib3-2.0.7-py3-none-any.whl (124 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 124.2/124.2 kB 9.2 MB/s eta 0:00:00
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ root ---
Step #5 - "Build Uploader Image": Collecting idna<4,>=2.5
Step #5 - "Build Uploader Image":   Downloading idna-3.10-py3-none-any.whl (70 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 kB 11.4 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Collecting pyasn1<0.6.0,>=0.4.6
Step #5 - "Build Uploader Image":   Downloading pyasn1-0.5.1-py2.py3-none-any.whl (84 kB)
Step #5 - "Build Uploader Image":      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 84.9/84.9 kB 13.7 MB/s eta 0:00:00
Step #5 - "Build Uploader Image": Installing collected packages: urllib3, pyasn1, mock, idna, charset-normalizer, certifi, cachetools, rsa, requests, pyasn1-modules, google-auth
Step #4 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/target/jacoco.exec
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- maven-install-plugin:2.4:install (default-install) @ root ---
Step #7 - "Build E2E Image": Reading package lists...
Step #5 - "Build Uploader Image": Successfully installed cachetools-5.5.0 certifi-2024.8.30 charset-normalizer-3.4.0 google-auth-2.36.0 idna-3.10 mock-5.1.0 pyasn1-0.5.1 pyasn1-modules-0.3.0 requests-2.31.0 rsa-4.9 urllib3-2.0.7
Step #5 - "Build Uploader Image": �[91mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Step #4 - "Compile Bunsen and Pipeline": [INFO] Installing /workspace/pom.xml to /root/.m2/repository/com/google/fhir/analytics/root/0.2.7-SNAPSHOT/root-0.2.7-SNAPSHOT.pom
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] ------------------< com.cerner.bunsen:bunsen-parent >-------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building Bunsen Parent 0.5.14-SNAPSHOT                            [2/13]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] ----------------< com.google.fhir.analytics:pipelines >-----------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building FHIR Analytics 0.2.7-SNAPSHOT                            [3/13]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #5 - "Build Uploader Image": �[0m�[91m
Step #5 - "Build Uploader Image": [notice] A new release of pip is available: 23.0.1 -> 24.0
Step #5 - "Build Uploader Image": [notice] To update, run: pip install --upgrade pip
Step #7 - "Build E2E Image": Reading package lists...
Step #7 - "Build E2E Image": Building dependency tree...
Step #7 - "Build E2E Image": Reading state information...
Step #7 - "Build E2E Image": The following additional packages will be installed:
Step #7 - "Build E2E Image":   build-essential cpp cpp-9 dirmngr dpkg-dev fakeroot file g++ g++-9 gcc
Step #7 - "Build E2E Image":   gcc-10-base gcc-9 gcc-9-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent
Step #7 - "Build E2E Image":   gpg-wks-client gpg-wks-server gpgconf gpgsm libalgorithm-diff-perl
Step #7 - "Build E2E Image":   libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan5 libassuan0
Step #7 - "Build E2E Image":   libatomic1 libc-dev-bin libc6 libc6-dev libcc1-0 libcrypt-dev libdpkg-perl
Step #7 - "Build E2E Image":   libexpat1 libexpat1-dev libfakeroot libfile-fcntllock-perl libgcc-9-dev
Step #7 - "Build E2E Image":   libgcc-s1 libgomp1 libisl22 libitm1 libjq1 libksba8 liblocale-gettext-perl
Step #7 - "Build E2E Image":   liblsan0 libmagic-mgc libmagic1 libmpc3 libmpdec2 libmpfr6 libnpth0 libonig5
Step #7 - "Build E2E Image":   libpython3-dev libpython3-stdlib libpython3.8 libpython3.8-dev
Step #7 - "Build E2E Image":   libpython3.8-minimal libpython3.8-stdlib libquadmath0 libreadline8
Step #7 - "Build E2E Image":   libstdc++-9-dev libstdc++6 libtsan0 libubsan1 linux-libc-dev make manpages
Step #7 - "Build E2E Image":   manpages-dev mime-support pinentry-curses python-pip-whl python3 python3-dev
Step #7 - "Build E2E Image":   python3-distutils python3-lib2to3 python3-minimal python3-pkg-resources
Step #7 - "Build E2E Image":   python3-setuptools python3-wheel python3.8-dev python3.8-minimal
Step #7 - "Build E2E Image":   readline-common xz-utils zlib1g-dev
Step #7 - "Build E2E Image": Suggested packages:
Step #7 - "Build E2E Image":   cpp-doc gcc-9-locales dbus-user-session libpam-systemd pinentry-gnome3 tor
Step #7 - "Build E2E Image":   debian-keyring g++-multilib g++-9-multilib gcc-9-doc gcc-multilib autoconf
Step #7 - "Build E2E Image":   automake libtool flex bison gdb gcc-doc gcc-9-multilib parcimonie xloadimage
Step #7 - "Build E2E Image":   scdaemon glibc-doc bzr libstdc++-9-doc make-doc man-browser pinentry-doc
Step #7 - "Build E2E Image":   python3-doc python3-tk python3-venv python-setuptools-doc python3.8-venv
Step #7 - "Build E2E Image":   python3.8-doc binfmt-support readline-doc
Step #7 - "Build E2E Image": The following NEW packages will be installed:
Step #7 - "Build E2E Image":   build-essential cpp cpp-9 dirmngr dpkg-dev fakeroot file g++ g++-9 gcc gcc-9
Step #7 - "Build E2E Image":   gcc-9-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent gpg-wks-client
Step #7 - "Build E2E Image":   gpg-wks-server gpgconf gpgsm jq libalgorithm-diff-perl
Step #7 - "Build E2E Image":   libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan5 libassuan0
Step #7 - "Build E2E Image":   libatomic1 libc-dev-bin libc6-dev libcc1-0 libcrypt-dev libdpkg-perl
Step #7 - "Build E2E Image":   libexpat1-dev libfakeroot libfile-fcntllock-perl libgcc-9-dev libgomp1
Step #7 - "Build E2E Image":   libisl22 libitm1 libjq1 libksba8 liblocale-gettext-perl liblsan0
Step #7 - "Build E2E Image":   libmagic-mgc libmagic1 libmpc3 libmpdec2 libmpfr6 libnpth0 libonig5
Step #7 - "Build E2E Image":   libpython3-dev libpython3-stdlib libpython3.8 libpython3.8-dev
Step #7 - "Build E2E Image":   libpython3.8-minimal libpython3.8-stdlib libquadmath0 libreadline8
Step #7 - "Build E2E Image":   libstdc++-9-dev libtsan0 libubsan1 linux-libc-dev make manpages manpages-dev
Step #7 - "Build E2E Image":   mime-support pinentry-curses python-pip-whl python3 python3-dev
Step #7 - "Build E2E Image":   python3-distutils python3-lib2to3 python3-minimal python3-pip
Step #7 - "Build E2E Image":   python3-pkg-resources python3-setuptools python3-wheel python3.8
Step #7 - "Build E2E Image":   python3.8-dev python3.8-minimal readline-common xz-utils zlib1g-dev
Step #7 - "Build E2E Image": The following packages will be upgraded:
Step #7 - "Build E2E Image":   gcc-10-base libc6 libexpat1 libgcc-s1 libstdc++6
Step #7 - "Build E2E Image": 5 upgraded, 84 newly installed, 0 to remove and 79 not upgraded.
Step #7 - "Build E2E Image": Need to get 65.4 MB of archives.
Step #7 - "Build E2E Image": After this operation, 263 MB of additional disk space will be used.
Step #7 - "Build E2E Image": Get:1 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 gcc-10-base amd64 10.5.0-1ubuntu1~20.04 [20.8 kB]
Step #1 - "Launch Sink Ser
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-build-gh-logs/log-5f0605a7-ec38-4103-8151-75f62225b668.txt.]
...
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:08.781 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16310
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:08.789 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4050
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:08.897 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5800
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:08.908 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14550
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.026 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7550
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.098 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2300
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.185 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11050
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/workspace/e2e-tests/controller-spark/dwh/controller_DWH_TIMESTAMP_2024_11_15T20_03_51_364262761Z/Encounter/ConvertResourceFn_Encounter_output-parquet-th-449-ts-1731701045500-r-66542.parquet is not a Parquet file (too small length: 0)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.431 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=620
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.449 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9310
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.450 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12800
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.519 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16320
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.600 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4060
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.635 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5810
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.678 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14560
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.790 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7560
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.810 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2310
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:09.991 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11060
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.114 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=630
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.172 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12810
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.185 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9320
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.257 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16330
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.353 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4070
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.368 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5820
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/workspace/e2e-tests/controller-spark/dwh/controller_DWH_TIMESTAMP_2024_11_15T20_03_51_364262761Z/Observation/ConvertResourceFn_Observation_output-parquet-th-1516-ts-1731701048762-r-864001.parquet is not a Parquet file (too small length: 0)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.398 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14570
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.480 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2320
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.523 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7570
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.689 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11070
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.838 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9330
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.839 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12820
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.850 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=640
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:10.963 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16340
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.060 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4080
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.090 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14580
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.094 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5830
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.158 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2330
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.284 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7580
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.457 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11080
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.536 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12830
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.553 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9340
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.610 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=650
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.761 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16350
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.813 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5840
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.853 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4090
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.892 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14590
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:11.914 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2340
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.050 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7590
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.140 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11090
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.313 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9350
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.336 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12840
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.445 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=660
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.553 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16360
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.588 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5850
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.627 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14600
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.627 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4100
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.677 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2350
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.794 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7600
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:12.973 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11100
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.073 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9360
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.133 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12850
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.260 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=670
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.310 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16370
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.361 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=4110
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.372 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=5860
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.399 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=14610
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.412 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=2360
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.499 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=7610
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.681 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=11110
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": SUCCESSE2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patients: 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounters: 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observations: 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patient flat rows: 0
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounter flat rows: 0
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observation flat rows: 0
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Mismatch in count of records
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Actual total patients: , expected total: 79
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Actual total encounters: , expected total: 4006
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observations: , expected total: 17279
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Actual total materialized view patients: 0, expected total: 106
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Actual total materialized view encounters: 0, expected total: 4006
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Actual total materialized view observations: 0, expected total: 17279
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.751 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=9370
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.887 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=12860
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:13.983 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=680
Finished Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver"
ERROR
ERROR: build step 21 "us-docker.pkg.dev/cloud-build-fhir/fhir-analytics/e2e-tests/controller-spark:6baa6e6" failed: step exited with non-zero status: 2
Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source": 20:04:14.045 [direct-runner-worker] INFO  c.g.fhir.analytics.FetchResources com.google.fhir.analytics.FetchResources$SearchFn.processElement:137 - Fetching 10 resources for state Observation_main; URL= http://openmrs:8080/openmrs/ws/fhir2/R4?_getpages=6f29cbb6-6c39-40ca-a69c-e25f50dee8f7&_getpagesoffset=16380
Finished Step #30 - "Run Batch Pipeline FHIR-search mode with OpenMRS source"
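For context, the _getpages/_getpagesoffset URLs logged by FetchResources are the server-assigned paging links of a FHIR search set. The sketch below shows how such a paged result can be walked with the HAPI FHIR client; the base URL and page size are taken from the log for illustration, but the sequential loop is a simplification — the interleaved offsets above suggest the pipeline distributes page fetches across Beam workers instead of iterating one page at a time.

```java
// Minimal sketch of paging through a FHIR search result set with the HAPI FHIR client.
// The base URL and page size mirror the log above; everything else is illustrative.
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.instance.model.api.IBaseBundle;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.Observation;

public class PagedObservationFetch {

  public static void main(String[] args) {
    FhirContext ctx = FhirContext.forR4();
    IGenericClient client =
        ctx.newRestfulGenericClient("http://openmrs:8080/openmrs/ws/fhir2/R4");

    // Initial search; the server returns the first page plus "next" links that
    // encode _getpages and _getpagesoffset, as seen in the log output.
    Bundle page =
        client
            .search()
            .forResource(Observation.class)
            .count(10)
            .returnBundle(Bundle.class)
            .execute();

    int fetched = page.getEntry().size();
    while (page.getLink(IBaseBundle.LINK_NEXT) != null) {
      // Follow the server-provided "next" link to retrieve the following page.
      page = client.loadPage().next(page).execute();
      fetched += page.getEntry().size();
    }
    System.out.println("Fetched " + fetched + " Observation resources");
  }
}
```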
Step #2 - "Launch Sink Server JDBC": �[1A�[2K
Creating sink-server-jdbc ... �[32mdone�[0m
�[1B
Step #0 - "Launch HAPI Source Server": �[2A�[2K
Creating hapi-fhir-db ... �[32mdone�[0m
�[2B�[1A�[2K
Creating hapi-server  ... �[32mdone�[0m
�[1B
Step #1 - "Launch Sink Server Search": �[1A�[2K
Creating sink-server-search ... �[32mdone�[0m
�[1B
Step #19 - "Launch HAPI FHIR Sink Server Controller": �[1A�[2K
Creating sink-server-controller ... �[32mdone�[0m
�[1B
Step #20 - "Bring up controller and Spark containers": �[1A�[2K
Creating pipeline-controller ... �[32mdone�[0m
�[1B
Step #25 - "Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS": �[1A�[2K
Creating openmrs                 ... �[32mdone�[0m
�[1B

Build Log: https://storage.cloud.google.com/cloud-build-gh-logs/log-5f0605a7-ec38-4103-8151-75f62225b668.txt